Some of it is overinflated marketing, but for organizations trying to cut costs it could have a significant effect on a lot of their employees.
AI doesn't need to be good. It just needs to be cheaper and good enough.
Most people assume AI will do all the work of a job. Maybe it will someday, but in my experience today it can do about 80% of the work with only 20% human effort put in. So no, it's not doing 100% of the work, it's doing 80%, but it does that 80% in seconds for what used to take me hours or days.
That is a huge improvement over no AI use at all.
Improvement for whom?
it can do about 80% of the work with only 20% human effort put in. So no, it's not doing 100% of the work, it's doing 80%,
I think that's the calculation most organizations will make. If AI can do 80% of a job, they can fire 80-90% of their employees in that task, and use the remainder as AI wranglers.
That's a pretty significant workforce reduction, and it means the folks who remain employed spend less of their time doing what they trained for, and more time in an IT/management role.
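To make the calculation above concrete, here's a back-of-envelope sketch. All the numbers (headcount, salary, tooling cost, retained fraction) are illustrative assumptions, not figures from the thread; only the "keep 10–20% as AI wranglers" logic comes from the comment itself.

```python
# Back-of-envelope: labor cost if AI does 80% of the work and the employer
# keeps only a small fraction of staff to supervise it. Numbers are illustrative.
headcount = 100            # employees before AI (assumed)
salary = 60_000            # average annual salary (assumed)
retained_fraction = 0.15   # keep 10-20% of staff as "AI wranglers"
ai_cost_per_seat = 2_400   # annual AI tooling cost per retained worker (assumed)

before = headcount * salary
retained = int(headcount * retained_fraction)
after = retained * salary + retained * ai_cost_per_seat
savings = before - after

print(f"Annual labor cost before: ${before:,}")
print(f"Annual cost after (with {retained} wranglers): ${after:,}")
print(f"Savings: ${savings:,} ({savings / before:.0%})")
```

Even with AI tooling costs added in, the savings dominate, which is exactly why organizations will make this trade regardless of whether the output quality matches what the original workforce produced.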
Yeah, I mostly mean the AGI nonsense. There are jobs where AI is helpful, though in my opinion it's worth pointing out that not all of the gain is purely a benefit of AI.
I'd argue this isn't unpopular to anyone who knows that "AI" is just pattern matching, marketed to people who don't understand tech.
People should actively be skeptical.
It's unsustainable right now because hardware and software are not aligned (yet). Software is currently outpacing hardware, but there are loads of companies working on specialist chips that will deal with both the computing power problem and the energy consumption problem by the sheer factor of optimization benefits.
Plus, software optimizations are also well under way, and models are constantly being fine-tuned to run and train better with less.
I doubt how good the results could be. I agree that a 10–100x improvement is feasible by optimizing the hardware. But hardware in general still needs to improve, and the speed of light is an impenetrable barrier standing in the way.
And more complete AI systems would require hundreds of thousands of times the computation power. Really, this has the same issue as Bitcoin.
I think the specialized hardware for this task will be better than you expect. It’s like using a sledgehammer to carve something. Pretty soon a chisel will be given to the computer and it will be able to do its job much easier.
I doubt it, since the GPU was already not a bad tool for this job. The generality of GPGPU helped a lot here.
Classic Gartner hype cycle:
We're in the Peak of Inflated Expectations phase.
https://upload.wikimedia.org/wikipedia/commons/thumb/9/94/Gartner_Hype_Cycle.svg/1200px-Gartner_Hype_Cycle.svg.png
Meanwhile...
https://www.theregister.com/2023/10/11/github_ai_copilot_microsoft/
[...] while Microsoft charges $10 a month for the service, the software giant is losing $20 a month per user on average and heavier users are costing the company as much as $80 [...]
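The arithmetic implied by those figures is worth spelling out: if the price is $10/month and the average loss is $20/month, serving the average user costs about $30/month, and the heaviest users cost around $90. A quick sanity check (using only the numbers quoted above):

```python
# Implied per-user serving cost from The Register's reported figures.
price = 10           # monthly Copilot subscription price
avg_loss = 20        # reported average monthly loss per user
heavy_loss = 80      # reported monthly loss on the heaviest users

avg_cost = price + avg_loss      # average cost to serve one user
heavy_cost = price + heavy_loss  # cost to serve a heavy user
break_even_multiple = avg_cost / price

print(avg_cost, heavy_cost, break_even_multiple)  # 30 90 3.0
```

In other words, on these numbers the subscription would need to cost roughly 3x its current price just to break even on the average user.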
Mmm hmmm.
This could be one form of "course correction"; few people are going to care to participate if they're forced to pay what it actually costs.
I suspect this is all part of the long term plan; provide the service at a reduced fee so people gain reliance on the tech, then increase the cost over time. We see this happen everywhere.
Current AI isn't in any meaningful sense "intelligent". It's all smoke, mirrors, horses, and ponies put out on a fancy performance designed to transfer money from the public purse (directly or indirectly) into the pockets of sociopathic billionaires.
The "current gen AI" is the key here. How sustainable it is depends on how quickly it can grow and improve. Technology is growing much faster than in the past. I remember getting a dictation program in 1998. I had to spend 2 hours talking to it so it could learn my voice. Even after all that, it still only had about a 25% success rate in properly transcribing my text. In 2015 I bought my first smart watch. The first voice transcription I made from it was 100% correct with absolutely no learning of my voice at all.
I believe that the LLM will quickly give way to a different type of AI. There may be several different approaches to AI before something really takes hold and changes the game.
Operating an AI takes huge computing power.
For now. There are already plans to accelerate some specific machine learning workloads on next generations of low powered mobile chips. Think ChatGPT on a smartphone.
For other use-cases, you don't even need to wait. Google Coral can do object recognition for your security camera feed, using minuscule amount of power, compared to a GPU.
This is definitely true, but keep in mind that there is a limit to how far you can optimize a chip. Eventually we could have everything running on ASICs, but electronics do have a maximum speed that we may not be far from reaching.