this post was submitted on 10 Apr 2024
113 points (91.2% liked)

Technology

This is the official technology community of Lemmy.ml for all news related to creation and use of technology, and to facilitate civil, meaningful discussion around it.

The promise of AI, for corporations and investors, is that companies can increase profits and productivity by slashing their reliance upon a skilled human workforce. But as this story and many others show, AI is just today’s buzzword for “outsourcing,” and it comes with the same problems that have plagued outsourced companies and workforces for decades.

[–] magic_lobster_party@kbin.run 1 points 7 months ago (1 children)

I’m talking about the entire process from design to product. OK, maybe those useless chatbot “help staff” might be actual LLMs, but that Amazon grocery store used as an example in the article was just Indian labor all along.

As soon as you want to solve a very specific problem using AI, it can quickly get time-consuming and expensive to develop the product. Maybe that off-the-shelf AI model isn’t good enough for your particular problem? Maybe it only gives 75% accuracy when you really need 95% to be competitive in the market. In that case you need to compare different models, figure out whether there’s any trick to boost the accuracy, try out different training strategies, etc.
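The "compare different models against an accuracy target" step could be sketched roughly like this. Everything here is hypothetical: the "models" are stand-in prediction functions, and the validation set is a toy; in practice these would be real classifiers and real labeled data.

```python
def accuracy(predict, samples):
    """Fraction of labeled (features, label) samples predicted correctly."""
    correct = sum(1 for features, label in samples if predict(features) == label)
    return correct / len(samples)

# Tiny labeled validation set: (feature, expected_label) pairs.
validation = [(0, "a"), (1, "b"), (2, "a"), (3, "b")]

# Two stand-in "models": one always guesses "a", one keys off parity.
candidates = {
    "baseline": lambda x: "a",
    "parity":   lambda x: "a" if x % 2 == 0 else "b",
}

TARGET = 0.95  # the accuracy needed to be competitive

for name, model in candidates.items():
    acc = accuracy(model, validation)
    verdict = "meets target" if acc >= TARGET else "needs more work"
    print(f"{name}: {acc:.0%} -> {verdict}")
```

The point of the sketch is only that the comparison itself is cheap; the expensive part is everything behind each candidate (training runs, tricks to boost accuracy, labeling more data).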

And once the model has 95% accuracy on your own labeled data, it might turn out to be completely worthless in the field, because the data you collected isn’t representative of reality.
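That failure mode (great on your labeled data, useless in the field) is just distribution shift, and it shows up the moment you evaluate on data you didn’t collect yourself. A minimal toy illustration, with a made-up "model" and made-up datasets:

```python
def accuracy(predict, samples):
    """Fraction of labeled (input, label) samples predicted correctly."""
    correct = sum(1 for x, y in samples if predict(x) == y)
    return correct / len(samples)

# Hypothetical learned rule: "small inputs are class 'a'".
model = lambda x: "a" if x < 10 else "b"

# The data you labeled happens to follow that rule exactly...
lab_data = [(1, "a"), (2, "a"), (11, "b"), (12, "b")]
# ...but the data out in the field does not.
field_data = [(3, "b"), (4, "b"), (15, "a"), (20, "b")]

print(f"lab accuracy:   {accuracy(model, lab_data):.0%}")    # 100%
print(f"field accuracy: {accuracy(model, field_data):.0%}")  # 25%
```

Same model, same metric; only the data changed. That gap is exactly why the 95% number on your own labels can be meaningless.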

At that point you might just try to figure out how to offload the work to someone else. I’ve even heard of self-driving car companies that did exactly that.

[–] FaceDeer@fedia.io 1 points 7 months ago

Going back to my original comment:

Sure, but the fact that not all “AI” is really AI doesn’t mean it isn’t real.

The fact that Amazon was faking it in this one instance doesn’t poof all the actual AI out of existence. There are plenty of off-the-shelf AI models that are good enough for various particular problems; companies can go ahead and use those. You said it yourself: the chatbot “help staff” might be actual LLMs.

At that point you might just try to figure out how to offload the work to someone else.

As I said, most companies using AI will likely be hiring professional AI service providers for it. That’s where those hundreds of billions of dollars I mentioned above are going, and where all the PhDs spending years on R&D are working.