this post was submitted on 21 Nov 2023
995 points (97.9% liked)
Technology
Quite the opposite. People who understand how LLMs work know their limitations. AI in general is incapable of deduction and creativity. It simply is not able to produce something new from existing knowledge. Sure, it can generate any number of outputs through transformations of the input data. But it cannot create.
If you think developers, engineers, architects and others are going to lose their jobs, you are severely mistaken. Even for content writers it's a temporary setback, because AI-generated content is limited, and as soon as the quality of the human input feeding that same AI starts dropping, so will the AI's output.
It's a tool that you have to babysit, at least for the foreseeable future. In general it's a bad idea for a human to supervise a machine, because over time we grow complacent about its results, and that's when mistakes happen. When it comes to writing content, the biggest problems are inaccuracies or the odd typo. Your comparison to CAD software is not a good one, since CAD doesn't produce anything on its own. It's software assisting a human, not generating content. Imagine the horror of CAD software auto-generating bridges: it would only be a matter of time before someone skipped double-checking what was generated. And I am fully aware there are AI-generated structural parts and tests, but that's a part of the design process where the results have to be checked by a human again.
I do think AI has a place and purpose, but it's not going to cost people their jobs, only help them do those jobs more efficiently. It's great at assisting people, not replacing them. If there's a manager out there who thinks AI can replace a human, then I can show you a bad manager who doesn't understand what AI is. In the future we might arrive at a point where AI is good enough to do some jobs humans find too repetitive or dangerous. But we are far from that.
Also, an LLM is not something I'd call AI, or at least not intelligent. LLMs are specialized neural networks, trained on human input, whose sole purpose is predicting the next word or sentence given what's entered as input. Glorified, overly complicated auto-complete. There's no intelligence involved.
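To make the "auto-complete" point concrete: here's a deliberately crude sketch of next-word prediction using bigram counts. This is a toy illustration of the prediction objective, not how a transformer actually works internally (real LLMs learn contextual representations over huge corpora, they don't just count word pairs):

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count how often each word follows another -- a toy stand-in
    for the next-token statistics an LLM learns at vastly larger scale."""
    counts = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, word):
    """Return the most frequently observed next word, or None if unseen."""
    if word not in counts:
        return None
    return counts[word].most_common(1)[0][0]

model = train_bigrams("the cat sat on the mat the cat ran")
print(predict_next(model, "the"))  # "cat" follows "the" twice, "mat" once -> cat
```

The model can only recombine patterns it has already seen; ask it about a word outside its training data and it has nothing to say, which is the "no creation, only transformation" limitation in miniature.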
That's not exactly how I view the outcome of introducing new tools, but that will have to be the agree-to-disagree part. In my opinion, tools remove tedious tasks entirely or make them easier, giving you more time to focus on what matters.