jarfil@beehaw.org:

> No pictures of kids for example

Meaning, an AI blind to kids.

Keep in mind that training data is required for both recognition and generation (a toy sketch after the list below illustrates the recognition side). Legislating that, to an AI, kids must be "It doesn't look like anything to me" leads to things like:

  • Cars that don't stop for "It doesn't look like anything to me"
  • Spam filters that don't stop porn, or gore, or both, involving "It doesn't look like anything to me"
  • Photo storage that erases "empty" photos, because "It doesn't look like anything to me"
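
To make the recognition side concrete, here's a toy sketch (assumed for illustration, not from the thread; scikit-learn, the random features, and the label names are all placeholders): a classifier can only ever predict labels that appeared in its training data, so a filter trained with no pictures of kids is blind to them by construction.

```python
# Toy illustration (hypothetical data and labels): a model can only
# predict classes it was trained on.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X_train = rng.random((200, 4))                    # stand-in image features
y_train = rng.choice(["safe", "gore"], size=200)  # "kid" never appears in training

clf = DecisionTreeClassifier().fit(X_train, y_train)
# Predictions are always drawn from {"safe", "gore"} -- never "kid".
print(set(clf.predict(rng.random((50, 4)))))
```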

> For porn-specific AIs, don't allow users to upload custom images

Not sure how you think AIs work, but anyone can train a LoRA on their own laptop; no "uploading" anywhere is required.
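
For illustration, here's a minimal sketch of what local LoRA training looks like, assuming the Hugging Face `peft` and `transformers` libraries (the base model, rank, and target modules are placeholder choices, not a recipe):

```python
# Minimal local LoRA fine-tuning sketch (hypothetical hyperparameters).
# Everything here runs on the user's own machine; nothing is uploaded.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")  # any locally cached model
lora = LoraConfig(
    r=8,                        # low-rank adapter dimension
    lora_alpha=16,              # adapter scaling factor
    target_modules=["c_attn"],  # attention projections in GPT-2
    lora_dropout=0.05,
)
model = get_peft_model(base, lora)
model.print_trainable_parameters()  # only the small adapter weights train
# ...then fine-tune on a local dataset with an ordinary training loop.
```

The point is just that the adapter weights are tiny and the whole loop runs offline, so an upload ban on a hosted service doesn't touch this path at all.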

> Companies clearly can't be trusted to put in safeguards for themselves, so I guess it is time for legislation.

Cool, and I agree with that. I just think that example is horrific (for starters, it would make Lemmy's anti-CSAM filter illegal, since it's trained on pictures of kids).

Got any other proposals?