this post was submitted on 06 Dec 2023
253 points (95.7% liked)
Technology
FWIW, I work in the field and agree with this. LLMs in their current state are not so dangerous that they can't be released to the public. Generative image and video models are a much bigger threat, and those largely came out of open source.
If we really want to clutch pearls, it's NVIDIA that is propping open this Pandora's box by putting the capability in irresponsible hands.
Are the cards really powerful enough for so much fuss?
Definitely. An A100 system runs around $10k, which is expensive but within reach, and you need two of them to run a 70B-parameter model. Possibly one, if you are clever about it.
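The "two A100s, possibly one" claim comes down to simple weight-memory arithmetic. A rough sketch (illustrative only; the `model_vram_gb` helper is my own, and real serving adds KV-cache and activation overhead on top of the weights):

```python
# Back-of-the-envelope VRAM estimate for hosting a 70B-parameter model.
# Assumes 1 GB = 1e9 bytes and counts weight memory only.

def model_vram_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GB for a model of n_params parameters."""
    return n_params * bytes_per_param / 1e9

PARAMS = 70e9  # 70B parameters

fp16_gb = model_vram_gb(PARAMS, 2.0)  # 16-bit weights: 140 GB -> two 80 GB A100s
int4_gb = model_vram_gb(PARAMS, 0.5)  # 4-bit quantized: 35 GB -> one A100

print(f"fp16: {fp16_gb:.0f} GB, int4: {int4_gb:.0f} GB")
```

This is where "clever about it" comes in: quantizing the weights to 4 bits drops the footprint from roughly 140 GB to roughly 35 GB, which fits on a single 80 GB card with room to spare.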
And you can still do plenty of damage with a $1,000 consumer-grade GPU. Most deepfake videos are trained on exactly that kind of hardware.