this post was submitted on 20 Nov 2023
1512 points (98.5% liked)

Technology

[–] smooth_tea@lemmy.world -1 points 11 months ago (3 children)

I really find this a bit alarmist and exaggerated. Consider the motive and the alternative. Do you really think companies like that have any option other than to deal with this material?

[–] Floshie@lemmy.blahaj.zone 2 points 11 months ago

Consider the impact on human psychology. Not everyone has the stomach to read or even look at this material. And even those who appear to cope are still scarred inside.

Maybe there is no alternative for now, but don't do that to people on such low paychecks. Consider the backgrounds of the people who may take on these tasks, not to live, but merely to survive. I would have preferred to wait ten years rather than offload these horrifying tasks onto those people.

I'm sure there are lots of people in jail for creating, sharing, or even profiting from this content. Could they do that work? Then again, even though it bothers me less than forcing it on people who have no other way to make a living, it is still an idea I find ethically very questionable.

[–] barsoap@lemm.ee 1 points 11 months ago

Very much yes, police authorities have CSAM databases. If what you want to do with the material really is above board and sensible, they'll let you access it.

I don't doubt that anything OpenAI could do with that material can be above board, but sensible is another question: any model that can detect something can be used to train a model that can generate it. As such, those models are kept under lock and key just like their training sets, by the (social) media platforms that have a use for these things and the resources to run them, under the watchful eye of the authorities. Think faceboogle. OpenAI could, in principle, try to get into the business of training such models themselves and selling them to companies at that scale, but I don't really see that making sense from a business POV, either.
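The detect-implies-generate point can be illustrated with a toy sketch: if a frozen detector assigns a score to samples, that score can be used as a training signal, and gradient ascent on it pulls a generator's output toward whatever the detector fires on. Everything here (the detector shape, the target value 3.0, the learning rate) is made up purely for illustration, not taken from any real system:

```python
def detector(x):
    # Fixed, frozen "classifier": score peaks when x is near the
    # hypothetical target value 3.0 and falls off with distance.
    return 1.0 / (1.0 + (x - 3.0) ** 2)

def train_generator(steps=500, lr=0.1, eps=1e-4):
    theta = 0.0  # generator parameter: the generator simply emits theta
    for _ in range(steps):
        # Finite-difference estimate of d(score)/d(theta).
        grad = (detector(theta + eps) - detector(theta - eps)) / (2 * eps)
        # Gradient ascent: nudge the generator toward higher detector score.
        theta += lr * grad
    return theta

theta = train_generator()
print(theta)  # ends up close to 3.0, the detector's peak
```

The generator never sees the target directly; the detector's score alone is enough to steer it there, which is why a reliable detector and its training data are treated as sensitive.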