this post was submitted on 17 Feb 2024
1058 points (98.8% liked)

Technology
[–] General_Effort@lemmy.world 5 points 8 months ago (1 children)

They say it’s $60 million on an annualized basis. I wonder who’d pay that, given that you can probably scrape it for free.

Maybe it’s the AI Act in the EU; that could make free scraping legally risky. The US is seeing a lot of rent-seeker PR too, of course, which might cause some companies to hedge their bets.

Maybe some people haven’t realized it yet, but limiting fair use doesn’t just benefit the traditional media corporations; it also benefits the likes of Reddit, Facebook, Apple, etc. Making “robots.txt” legally binding would only benefit the tech companies.
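For context, robots.txt is the opt-out convention at issue here: a plain-text file at the site root listing crawler user agents and the paths they may not fetch. Today compliance is voluntary. A minimal sketch (GPTBot and CCBot are real AI-related crawler user agents; the rest is illustrative):

```text
# Served at https://example.com/robots.txt
# Block known AI training crawlers (honored voluntarily, not legally enforced)
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# Everyone else may crawl normally
User-agent: *
Disallow:
```

Making this file legally binding would turn a politeness convention into a compliance regime that only well-lawyered incumbents can navigate at scale.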

[–] FaceDeer@kbin.social -1 points 8 months ago (1 children)

This is the most frustrating thing: so many people are arguing against their own interests in their efforts to “lock down” their content and prevent AIs from training on it. In this very thread I’ve been accused of being pro-giant-company, when I’m quite the opposite. The harder we make it to train AI, the stronger the advantage the existing giant companies have in this field.