this post was submitted on 17 Jul 2024
122 points (88.1% liked)

Showerthoughts


A "Showerthought" is a simple term for the thoughts that pop into your head while you're doing everyday things like taking a shower, driving, or just daydreaming. The best ones are thoughts that many people can relate to, finding something funny or interesting in ordinary things.


Any tool can be a hammer if you use it wrong enough.

A good hammer is designed to be a hammer and only used like a hammer.

If you have a fancy new hammer, everything looks like a nail.

[–] 0laura@lemmy.world 11 points 3 months ago (4 children)

You not liking it doesn't make it any less AI. I don't remember that many people complaining when we called the code controlling video game characters AI.

[–] Carighan@lemmy.world 4 points 3 months ago (1 children)

Or called our mobile phones "cell phones", despite not being organic. Tsk.

[–] DmMacniel@feddit.org 1 points 3 months ago

Except cell phones or cellular phones refer to the structure a mobile network is built on: a mesh of cell towers.

[–] DmMacniel@feddit.org 1 points 3 months ago (1 children)

Pretty sure that they were and still are called bots though, at least in the context of first-person shooters.

[–] 0laura@lemmy.world 2 points 3 months ago (1 children)

Look at the NBT tags for bats, for example: the NoAI flag. AI there means artificial intelligence.

https://www.digminecraft.com/data_tags/bat.php

[–] DmMacniel@feddit.org 0 points 3 months ago* (last edited 3 months ago) (1 children)

Next thing you're gonna say is that boids are AI too...

Just because Mojang decided to name that flag NoAI doesn't mean the game uses AI to govern the bat's behavior.

[–] 0laura@lemmy.world 1 points 3 months ago

Descriptivism advocates when AI smhingmyheads

[–] kescusay@lemmy.world -2 points 3 months ago* (last edited 3 months ago) (3 children)

Software developer here.

It's not actually AI. A large language model is essentially autocomplete on steroids. Very useful in some contexts, but it doesn't "learn" the way a neural network can. When you're feeding corrections into, say, ChatGPT, you're making small, temporary, cached adjustments to its data model, but you're not actually teaching it anything, because by its nature, it can't learn.
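The "autocomplete on steroids" framing can be made concrete with a toy next-word predictor. This is purely an illustration, not ChatGPT's actual architecture: a bigram counter stands in for a transformer, but the core task is the same, predict the next token given the ones before it.

```python
from collections import Counter, defaultdict

# Toy sketch: "training" is just counting which word follows which.
# A real LLM does the same next-token prediction with a learned neural
# network over a vast corpus; this is the idea at miniature scale.
corpus = "the cat sat on the mat and the cat slept".split()

follow = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follow[prev][nxt] += 1

def complete(word, steps=3):
    """Greedy 'autocomplete': repeatedly pick the most frequent next word."""
    out = [word]
    for _ in range(steps):
        if word not in follow:
            break
        word = follow[word].most_common(1)[0][0]
        out.append(word)
    return " ".join(out)
```

Calling `complete("the")` walks the most common continuations seen in the tiny corpus; scale the corpus and the model up by many orders of magnitude and you get the "on steroids" part.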

I'm not trying to diss LLMs, by the way. Like I said, they can be very useful in some contexts. I use Copilot to assist with coding, for example. Don't want to write a bunch of boilerplate code? Copilot is excellent for speeding that process up.

[–] celliern@lemmy.world 7 points 3 months ago (1 children)

LLMs are part of AI, which is a fairly large research domain of math/CS that includes machine learning, among others. God, even linear regression can be classified as AI: that term is reeeally broad.
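To illustrate just how wide the umbrella is, here is ordinary least-squares regression in plain Python, no libraries, which textbooks happily file under machine learning and hence under AI:

```python
# Least-squares linear regression: a model whose parameters are
# estimated from data -- about the simplest thing that still falls
# under the (very broad) AI umbrella.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx  # (slope, intercept)

xs = [0, 1, 2, 3]
ys = [1, 3, 5, 7]                  # data generated by y = 2x + 1
slope, intercept = fit_line(xs, ys)
```

Nobody would market fifteen lines of arithmetic as "an AI", but under the textbook definition it qualifies, which is exactly the point about the breadth of the term.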

[–] kescusay@lemmy.world -1 points 3 months ago (2 children)

I mean, I guess the way people use the term "AI" these days, sure, but we're really beating all specificity out of the term.

[–] celliern@lemmy.world 2 points 3 months ago

That's a research domain that contains statistical methods and knowledge modeling, among others. It's not new, but the fact that it's marketed like this everywhere is new.

AI is really not a specific term. You may mean general AI, and I suspect that's what you're referring to when you say AI?

[–] 0laura@lemmy.world 1 points 3 months ago

It's always been this broad, and that's a good thing. If you want to talk about AGI, then say AGI.

[–] 0laura@lemmy.world 2 points 3 months ago

I know that they're "autocomplete on steroids" and what that means; I just don't see how that makes it any less AI. I'm not saying that LLMs have the magic sauce needed to be considered truly "intelligent", I'm saying that AI doesn't need any magic sauce to be AI. The code controlling bats in Minecraft is called AI, and no one complained about that.

[–] Zos_Kia@lemmynsfw.com 0 points 3 months ago

Very useful in some contexts, but it doesn’t “learn” the way a neural network can. When you’re feeding corrections into, say, ChatGPT, you’re making small, temporary, cached adjustments to its data model, but you’re not actually teaching it anything, because by its nature, it can’t learn.

But that's true of all (most?) neural networks? Are you saying neural networks are not AI and that they can't learn?

NNs don't retrain while they are being used: they are trained once, and after that they cannot learn new behaviour or correct existing behaviour. If you want to make them better, you need to run them a bunch of times, collect and annotate good/bad runs, then re-train them from scratch (or fine-tune them) with this new data. Just like LLMs, because LLMs are neural networks.
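The train-once-then-frozen point can be sketched like this. It's a hypothetical single-weight model in NumPy, not any particular product: the weights only move inside the training loop, and inference afterwards is strictly read-only, no matter what the user feeds it.

```python
import numpy as np

# Training phase: gradient descent on MSE moves the weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
y = 3.0 * X[:, 0] + 1.0            # target function: y = 3x + 1

w, b = 0.0, 0.0
for _ in range(500):
    pred = w * X[:, 0] + b
    grad_w = 2 * np.mean((pred - y) * X[:, 0])
    grad_b = 2 * np.mean(pred - y)
    w -= 0.1 * grad_w
    b -= 0.1 * grad_b

# Inference phase: the deployed model only reads the weights.
def model(x):
    return w * x + b

before = (w, b)
_ = model(5.0)                     # using the model...
_ = model(-2.0)                    # ...changes nothing
assert (w, b) == before            # weights are frozen after training
```

Anything that looks like the model "learning" during a chat session has to come from somewhere else (context fed back into the input, or a later fine-tuning run), not from these frozen weights.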