this post was submitted on 28 Mar 2024
you are viewing a single comment's thread
view the rest of the comments
This is the best summary I could come up with:
Several big businesses have published source code that incorporates a software package previously hallucinated by generative AI.
Not only that, but someone, having spotted this recurring hallucination, turned the made-up dependency into a real package, which developers then downloaded and installed thousands of times as a result of the AI's bad advice, we've learned.
The researcher, Lanyado, created huggingface-cli in December after seeing it repeatedly hallucinated by generative AI; by February this year, Alibaba was referring to it in GraphTranslator's README instructions rather than the real Hugging Face CLI tool (which actually ships as part of the huggingface_hub package).
Last year, through security firm Vulcan Cyber, Lanyado published research detailing how one might pose a coding question to an AI model like ChatGPT and receive an answer that recommends the use of a software library, package, or framework that doesn't exist.
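To make that probing concrete, here is a rough sketch of the kind of experiment Lanyado describes: ask a chat model a coding question and pull out whatever it tells you to `pip install`. It assumes the OpenAI Python SDK (openai >= 1.0) with an API key in the environment; the model name, prompt, and regex are placeholders, not details from the article.

```python
# Rough sketch: ask a chat model a coding question and collect the package
# names it recommends installing. Assumes openai >= 1.0 and OPENAI_API_KEY
# set in the environment; model name and prompt are illustrative only.
import re
from openai import OpenAI

client = OpenAI()

question = "How do I upload a model to the Hugging Face Hub from the command line?"
resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": question}],
)
answer = resp.choices[0].message.content or ""

# Naive extraction of suggested package names from `pip install ...` snippets.
suggested = set(re.findall(r"pip install\s+([A-Za-z0-9_.\-\[\]]+)", answer))
print("Packages the model recommends installing:", suggested or "none found")
```

Run the same question across many prompts and models and you get a list of recommended names, some of which, as the research found, do not exist at all.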
The willingness of AI models to confidently cite non-existent court cases is now well known and has caused no small amount of embarrassment among attorneys unaware of this tendency.
As Lanyado noted previously, a miscreant might use an AI-invented name for a malicious package uploaded to some repository in the hope others might download the malware.
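The obvious defence is to check a suggested name before blindly installing it. Below is a minimal sketch of such a check against the public PyPI JSON API (https://pypi.org/pypi/<name>/json); the helper name is made up, and "not found or very new" is only a heuristic, not proof of malice.

```python
# Hypothetical helper: sanity-check an AI-suggested package name against PyPI
# before running `pip install <name>`. A name that does not exist, or that
# appeared only very recently, deserves a closer look.
import json
import urllib.error
import urllib.request

def pypi_check(name: str) -> None:
    url = f"https://pypi.org/pypi/{name}/json"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            data = json.load(resp)
    except urllib.error.HTTPError as err:
        if err.code == 404:
            print(f"{name}: not on PyPI at all -- possibly a hallucinated name")
            return
        raise
    # Oldest upload time across all releases, as a rough "how new is this" signal.
    upload_times = [
        f["upload_time"]
        for files in data["releases"].values()
        for f in files
    ]
    first_seen = min(upload_times) if upload_times else "unknown"
    print(f"{name}: exists, first release uploaded {first_seen}")

if __name__ == "__main__":
    # e.g. the package name at the centre of the article
    pypi_check("huggingface-cli")
```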
The original article contains 1,143 words, the summary contains 190 words. Saved 83%. I'm a bot and I'm open source!
so basically, given AI free rein, it'll take about two weeks before it all goes to complete shit: unreadable code, completely garbage software. Just an utter disaster waiting to happen. Cool.