this post was submitted on 26 Aug 2024
342 points (96.7% liked)

[–] ekky@sopuli.xyz 51 points 2 months ago (3 children)

So now LLM makers actually have to sanitize their datasets? The horror....

[–] silence7 18 points 2 months ago (1 children)

I don't think that's tractable.

[–] ekky@sopuli.xyz 17 points 2 months ago

Oh no, it's very difficult, especially on the scale of LLMs.

That said, the rest of us (those with any amount of respect for ourselves, our craft, and our fellow humans) have been sourcing our data carefully since long before neural networks, for example by asking the relevant authority for it (e.g. asking the postal service for images of handwritten addresses).

Is this slow and cumbersome? Oh yes. But it delays the need for over-restrictive laws, just as it did with RC craft before drones became widespread. And by extension, it let those who couldn't source the material they needed through conventional means, or small new startups with no idea what they were doing, skirt the gray area and still end up with a small and hopefully usable dataset.

And then someone had the grand idea to not only scour and scavenge the whole internet with wild abandon, but to boast about it too. So now everyone gets punished.

Lastly: don't get me wrong, laws are good (duh), but less restrictive or incomplete laws can be fine as long as everyone respects each other. I'm curious to see what the future brings in this regard, but I hate the idea that those who brought this change about will likely be the only ones to go free.

[–] FiskFisk33@startrek.website 5 points 2 months ago (1 children)

That first L stands for Large. Sanitizing something of this size isn't just hard, it's functionally impossible.

[–] ekky@sopuli.xyz 4 points 2 months ago

You don't have to sanitize the weights; you have to sanitize the data you use to get the weights. Those are two very different things, and while I agree that sanitizing an LLM after training is close to impossible, sanitizing the data you feed it is much, much easier.
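To illustrate the distinction, here is a minimal sketch of filtering a raw corpus *before* training rather than trying to scrub a trained model. The predicates, record fields, and source labels are hypothetical placeholders; real pipelines use trained classifiers and provenance metadata, not string matching.

```python
# Hypothetical sketch: sanitize the training data, not the weights.
# Field names ("text", "source") and heuristics are illustrative only.

def looks_machine_generated(text: str) -> bool:
    # Placeholder heuristic; real filters use classifiers or provenance signals.
    return "as an ai language model" in text.lower()

def has_known_provenance(record: dict) -> bool:
    # e.g. data obtained with permission from the relevant authority
    return record.get("source") in {"licensed", "public_domain", "consented"}

def sanitize(corpus: list[dict]) -> list[dict]:
    # Keep only records with traceable sourcing that don't look AI-generated.
    return [
        r for r in corpus
        if has_known_provenance(r) and not looks_machine_generated(r["text"])
    ]

corpus = [
    {"text": "Handwritten address scans shared by the postal service.",
     "source": "consented"},
    {"text": "As an AI language model, I cannot help with that.",
     "source": "scraped"},
]
print(len(sanitize(corpus)))  # 1
```

The point is simply that every filtering decision happens on individual records before training, where each one can be inspected, rather than on billions of opaque weights afterwards.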

[–] leftzero@lemmynsfw.com 1 points 2 months ago

They can't.

They went public too fast, chasing quick profits, and now the well is too poisoned to train new models on up-to-date information.