[–] Omega_Haxors@lemmy.ml 0 points 8 months ago* (last edited 8 months ago) (1 children)

It's like you didn't even read what I posted. Why do I even bother? Sophists literally don't care about facts.

[–] UraniumBlazer@lemm.ee 2 points 8 months ago (1 children)

Yes, I read what you posted and answered accordingly; I just didn't spend enough time dumbing it down. So let me dumb it down further.

Your main objection was the simplicity of LLMs' goal: predicting the next word. Somehow, this simplistic goal supposedly makes the system stupid.
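
To be concrete about what "predicting the next word" means mechanically, here's a minimal toy sketch (not any real LLM's code; the vocabulary and scores are made up for illustration): the model assigns a score to every token in its vocabulary, softmax turns those scores into probabilities, and decoding just picks from that distribution.

```python
import math

# Hypothetical tiny vocabulary and made-up model scores (logits), purely illustrative.
vocab = ["the", "cat", "sat", "on", "mat"]
logits = [1.2, 0.3, 2.5, 0.1, 1.8]

# Softmax: turn raw scores into a probability distribution over the vocabulary.
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# Greedy decoding: the "simplistic goal" amounts to picking the highest-probability token.
next_token = vocab[probs.index(max(probs))]
print(next_token)  # -> "sat"
```

The point of contention is whether a system trained on that objective can be intelligent, not whether the objective itself is complicated.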

In my reply, I first said that self-awareness occurs naturally as a system becomes more and more intelligent, and I explained why. I then went on to explain how a simplistic terminal goal has nothing to do with actual intelligence. Hence, no matter how stupid or simple a terminal goal is, if an intelligent system is challenged enough and given enough resources, it will develop sentience at some point.

[–] Omega_Haxors@lemmy.ml 0 points 8 months ago

Exactly. I literally said none of that shit; you're just projecting your own shitty views onto me and asking me to defend them.