this post was submitted on 27 Oct 2023
301 points (96.6% liked)

Technology


This is a most excellent place for technology news and articles.


[–] nevemsenki@lemmy.world 100 points 1 year ago (6 children)

Can't wait for this AI bubble to fizzle. It's the blockchain insanity all over again.

[–] danielbln@lemmy.world 54 points 1 year ago (2 children)

Maybe, but gen AI produces actually useful, tractable results. That's already heaps more than crypto, which is just techno gambling.

[–] nevemsenki@lemmy.world 35 points 1 year ago (4 children)

As long as LLM AI models are prone to hallucinating and there is no way to audit how they derive results (e.g., to verify accuracy), relying on them will have roadblocks/limitations. Once they solve this issue though, that will be a whole different story, I agree. As for other AIs such as image or video generation, I don't have enough experience to tell...

[–] danielbln@lemmy.world 19 points 1 year ago (4 children)

Hallucinations can be heavily reduced today by grounding the LLM in verified source material. People use naked LLMs as knowledge databases, and that is indeed prone to hallucination. Provide them with verified data from the side, however, and they are very, very good at sticking to the truth. I know, because we deploy these with clients to great effect.
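Roughly, the pattern looks like this; a minimal sketch assuming the OpenAI Python client, where the model name and the retrieve_verified_docs() lookup are placeholders for whatever vetted source you actually query:

```python
# Sketch of grounding: feed the model verified passages and tell it to answer
# only from them. Assumes the OpenAI Python client (v1+); the model name and
# retrieve_verified_docs() are placeholders, not a specific deployment.
from openai import OpenAI

client = OpenAI()

def retrieve_verified_docs(question: str) -> list[str]:
    # Placeholder: return passages from a source you already trust
    # (your own docs database, a search index, etc.).
    return ["Passage 1 ...", "Passage 2 ..."]

def grounded_answer(question: str) -> str:
    context = "\n\n".join(retrieve_verified_docs(question))
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "Answer only from the provided context. "
                        "If the context does not contain the answer, say you don't know."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
        temperature=0,
    )
    return response.choices[0].message.content
```

The refusal instruction does most of the work: with trusted context in the prompt and permission to say "I don't know", the model stops filling gaps with confident guesses.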

Image, music, video models are making great strides and are already part of various pipelines, all the way up to the big boy tools like Photoshop (generative fill, for example).

The tech is being incorporated at a large scale by a lot of companies, from SME to megacorp. I don't see it going away any time soon, even if it doesn't improve from here on out (which it undoubtedly will).

[–] BastingChemina 2 points 1 year ago (1 children)

The issue is that from time to time they still confidently hallucinate, and there is no way to detect whether they are right or not.

[–] rambaroo@lemmy.world 8 points 1 year ago (1 children)

Hallucinations aren't the only issue with LLMs; they also have a limited amount of context they can recall, and that problem won't go away.
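To make the limit concrete, here's a rough sketch assuming the tiktoken tokenizer; the 8,000-token budget is an arbitrary example, not any specific model's window:

```python
# Sketch of the context limit: anything past the window never reaches the model.
# Assumes the tiktoken package; the 8,000-token budget is an arbitrary example.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")
CONTEXT_BUDGET = 8_000  # example budget, not a specific model's limit

def fits_in_context(messages: list[str]) -> bool:
    total = sum(len(enc.encode(m)) for m in messages)
    return total <= CONTEXT_BUDGET

history = ["a long transcript chunk ..."] * 500
if not fits_in_context(history):
    # Crude truncation; real systems summarize or retrieve selectively,
    # but either way the model only ever "recalls" what fits in the window.
    history = history[-50:]
```

Windows keep getting bigger, but there is always a hard cap.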

[–] JDubbleu@programming.dev 24 points 1 year ago (1 children)

It's not quite blockchain. It is incredibly useful in a broad range of applications, and has genuinely changed how millions of people work. Sure it's not the magic bullet wall street thinks it is, but my work has been improved immensely through the use of generative AI. Especially with uniquely challenging software problems and niche questions.

I think it'll be similar to VR. Extremely useful and interesting, but over-hyped and not going to penetrate our lives as much as most people think.

[–] danielbln@lemmy.world 9 points 1 year ago (1 children)

My mom never used VR, but she happily talks to GPT-4. From that perspective I think mindshare in the broader population will be significantly higher than VR (even if it doesn't live up to the VC/Wall Street hype machine).

[–] R00bot@lemmy.blahaj.zone 5 points 1 year ago (5 children)

If you had to buy an expensive headset to use ChatGPT, she wouldn't use it either.

[–] lolcatnip@reddthat.com 4 points 1 year ago (1 children)

If she had wheels she'd be a wagon.

[–] R00bot@lemmy.blahaj.zone 2 points 1 year ago

I'm just saying the accessibility of AI doesn't necessarily mean it has more utility. Just that it's more accessible.

[–] FaceDeer@kbin.social 14 points 1 year ago (1 children)

I'm still waiting for the electricity bubble to fizzle.

[–] TonyTonyChopper@mander.xyz 3 points 1 year ago

pretty sure it'll take nuclear war to pop that one

[–] flop_leash_973@lemmy.world 41 points 1 year ago (2 children)

They did the same thing with “blockchain”, “NFTs”, and other largely vaporware crypto junk last year.

They are just riding the hype wave hoping to cash out before the bottom falls out of it like always.

[–] whoisearth@lemmy.ca 4 points 1 year ago (1 children)

You forgot RPA. That one died down real quick after its hype two years ago, lol

[–] sik0fewl@kbin.social 2 points 1 year ago
[–] themurphy@lemmy.world 2 points 1 year ago (1 children)

Do you honestly think AI is just a farce like NFT?

[–] flop_leash_973@lemmy.world 5 points 1 year ago

Not in total. But I also don't think it is the kingmaker these investors are making it out to be. Just like crypto, it is a tool that can enhance and improve things when applied to the right problems in the right ways. It is not the magic easy button that the investor snake, currently eating its own tail in speculation, would have everyone believe.

[–] crazyminner@lemmy.ml 19 points 1 year ago (2 children)

Those rich fuckers are putting a lot of bets on AI. They're keeping us busy, overworked and struggling to pay to exist, while they perfect our replacement so that they can be rid of us.

[–] atrielienz@lemmy.world 2 points 1 year ago (2 children)

Getting rid of us doesn't make sense. We circulate the money. They need us to generate the things that we then buy. Without that they'd need to actually spend money and they won't.

[–] Smacks@lemmy.world 15 points 1 year ago (2 children)

AI is another one of those tech fads that'll fade away, but not like blockchain and NFTs. There's money to be made because people are excited, but its use cases are much more complicated.

AI is a fantastic tool for creators, including programmers with Copilot. But it isn't a full-blown replacement for workers quite yet. A lot of capitalists are really excited to slash their workforce in half, sure, but they're utterly ignoring the true potential of AI. It's a tool, not a replacement (yet).

[–] hubobes@sh.itjust.works 3 points 1 year ago (1 children)

Those capitalists also do not understand that if these tools can replace workers, everyone can, through FOSS projects, own these tools and have them work for themselves.

[–] AngryCommieKender@lemmy.world 3 points 1 year ago (2 children)

If they really wanted to slash their workforce, middle management could have been automated more than a decade ago. They don't want to fire their friends and their kids.

[–] CoderKat@lemm.ee 11 points 1 year ago (4 children)

I do think there's some use for AI in its current form (especially AI art as a tool for developing other works, like movies and video games), but I find it bizarre just how much investors value the current form of AI.

As cool as I find AI art, I'm not yet sure about its commercial viability, given the serious legal issues it's facing. So why do investors, who are supposed to care about commercial viability, value it so much?

And for generative text, I have an even more negative stance. My understanding is that the cost to train and run those AIs is ludicrous. Sure, some companies will use it to make blog spam articles or replace their basic support staff with it, but is that really gonna make it profitable?

And I emphasized "current form" because the current AI is basically just predictive text. It's severely limited, and this is extremely evident if you ask it even basic math problems. It's not capable of actual intelligence, which is what has me very skeptical of it in the long term. Maybe these companies will come up with a new, better form of AI. Or maybe they won't. But it doesn't seem like "just increase the size of the model" is sustainable, nor will it, frankly, get closer to strong(ish?) AI.

[–] Kage520@lemmy.world 5 points 1 year ago

I haven't used this, but think about all the narrators losing their jobs because it can now be done with the click of a button. https://customers.microsoft.com/en-AU/story/1646266241611394912-project-gutenberg-nonprofit-azure-synapse-analytics-azure-ai-services

That's a lot of people not on the payroll anymore. No health insurance costs, no vacations. Just using the software.

Think of all the analytics jobs that AI can replace. You ever spend a day or two making a spreadsheet do whatever you need it to? That's probably a lot of people's jobs. AI can make those people more efficient (as long as a human checks it later), so companies can fire most of the team. That's a lot more people off the payroll.

And there are companies working on general AI. That will replace... so many jobs.

[–] SCB@lemmy.world 3 points 1 year ago* (last edited 1 year ago)

And I emphasized “current form” because the current AI is basically just predictive text.

This is a program I use daily at work. It costs me like $250/year on my budget - literally less than one hotel stay for a work trip. I spent more on food last trip than this will cost my company.

https://www.synthesia.io/

It's a big step away from "predictive text." This is the AI revolution in action. There are dozens of products you don't know about shaking up professions you barely ever think about.

I don't have to build a Content Gen team because of this software, probably ever.

My buddy, meanwhile, is on a team building an "AI" for a major property insurance company to help them sift data. Small changes, incrementally, permeating through the system. That's strong adoption and worth investment.

[–] joel_feila@lemmy.world 3 points 1 year ago

To replace human labor. That's what it is about.

[–] xenoclast@lemmy.world 2 points 1 year ago* (last edited 1 year ago)

Bandwagoning doesn't require thinking or logic. It's FOMO capitalism.

There's also no association between the product and its value. It's perceived value only.

[–] kandoh@reddthat.com 11 points 1 year ago (2 children)

AI is great, I had gotten tired of Shutterstock and needed something to replace stock images.

Not sure it's good for much else though.

[–] matthewc@lemmy.world 12 points 1 year ago (2 children)

We are in the infancy of generative AI. For you it has already replaced an entire sector of the workforce: artists. For others it has replaced them wholesale. For others it just assists. Hollywood was trying to legally own actors' voices and likenesses to replace them.

This technology is not standing still. It will be great at a lot of things in the future. It could be next month. It could be next year. It could be in a decade. Whenever it arrives for your job it will be cheaper than you. There will be no going backward on this technology.

[–] Semi-Hemi-Demigod@kbin.social 3 points 1 year ago (10 children)

I totally agree that we're just scratching the surface of what AI can do. But I don't think it's what Wall Street thinks it is. It's not too terribly difficult to spin up an LLM, which means it's going to be difficult to set up chokepoints to extract rent.
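For a sense of how low that barrier is, here's a minimal sketch using Hugging Face transformers; gpt2 is just a tiny stand-in for whatever open-weights model you'd actually run:

```python
# Sketch of "spinning up an LLM" locally from open weights.
# Assumes the transformers package; gpt2 is a tiny placeholder model,
# swap in any larger open-weights instruct model your hardware can hold.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("Open-weights models are hard to gatekeep because", max_new_tokens=40)
print(result[0]["generated_text"])
```

Nothing in that flow goes through a gatekeeper, which is why chokepoints are hard to build.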

Though I bet they'll get the government's help with that by regulating AI for "safety." The big guys won't have a problem but anyone else will have illegal programs running.

[–] c0mbatbag3l@lemmy.world 7 points 1 year ago (2 children)

I work in IT; you want a list? It'd probably be quicker to tell you what it's not doing/replacing.

[–] whoisearth@lemmy.ca 9 points 1 year ago (2 children)

No longer do I have to learn regex! For that alone, AI is worth it.

load more comments (2 replies)
[–] kandoh@reddthat.com 1 points 1 year ago

If you can explain it in a way a graphic designer would understand, I would genuinely be interested.

[–] perviouslyiner@lemm.ee 3 points 1 year ago* (last edited 1 year ago)

Doesn't generative AI need a whole other layer of technology to become reliable?

The AI needs to control some domain-specific model (like a poser skeleton for pictures of humans) that enforces the rules for how each modelled concept can actually behave, instead of trying to guess the output directly.
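As a rough illustration of that layering (the joint limits and the propose_pose() stub below are invented purely for the example): the generator only proposes structured parameters, and a hand-written domain model checks them before anything gets rendered.

```python
# Sketch of the layered idea: the generator proposes structured parameters,
# a domain-specific model enforces the rules before anything is rendered.
# The joint limits and propose_pose() are invented purely for illustration.
from dataclasses import dataclass

JOINT_LIMITS = {"elbow": (0.0, 150.0), "knee": (0.0, 140.0)}  # degrees, illustrative

@dataclass
class Pose:
    angles: dict[str, float]

    def validate(self) -> list[str]:
        errors = []
        for joint, angle in self.angles.items():
            lo, hi = JOINT_LIMITS.get(joint, (0.0, 180.0))
            if not lo <= angle <= hi:
                errors.append(f"{joint} at {angle} deg is outside [{lo}, {hi}]")
        return errors

def propose_pose() -> Pose:
    # Stand-in for whatever the generative model emits; note it proposes
    # parameters for the skeleton, not raw pixels.
    return Pose(angles={"elbow": 200.0, "knee": 90.0})

pose = propose_pose()
problems = pose.validate()
if problems:
    print("rejected before rendering:", problems)  # impossible poses never become images
```

That checking layer is exactly the "whole other layer of technology" the question is pointing at.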
