this post was submitted on 18 Oct 2023
25 points (96.3% liked)

Futurology

top 9 comments
[–] sabreW4K3@lemmy.tf 6 points 11 months ago (1 children)

What was it focused on before?

[–] BaylorSwift3@futurology.today 7 points 11 months ago* (last edited 11 months ago) (2 children)

Just AI. The distinction is that an AGI (Artificial General Intelligence) is a theoretical superintelligence capable of any intellectual task, including writing code to improve itself.

[–] Mkengine@feddit.de 2 points 11 months ago

Be careful with the terminology: artificial superintelligence is something different from artificial general intelligence.

[–] sabreW4K3@lemmy.tf 1 points 11 months ago
[–] thantik@lemmy.world 6 points 11 months ago (2 children)

Honestly I think we're going to have to pit AIs against one another in another web of complexity in order to reach AGI. You're going to need 50 different AIs all deliberating, contradicting, and refining the output, with each one controlling some aspect of the result. That's far more processing than is practical at this point in time. LLMs right now are largely just word predictors, but we also see things like diffusion models being able to create images, etc. These are distinct problem types, and I think if we develop a separate AI for each generalized problem type, we can combine them in a way that predictably outputs something we can classify as 'general intelligence'.
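Something like this toy loop is the shape I have in mind. To be clear, it's just a sketch: the `Agent` stand-ins and the `deliberate` helper are made-up placeholders, not real models.

```python
# Toy sketch of the "many specialized AIs refining each other's output" idea.
# The agents are plain Python stand-ins, not real models; the point is only
# the deliberation loop: propose -> critique -> revise until every critic agrees.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Agent:
    name: str
    act: Callable[[str], str]        # produces or revises a draft
    accept: Callable[[str], bool]    # votes on whether the draft is good enough


def deliberate(task: str, proposer: Agent, critics: List[Agent], max_rounds: int = 5) -> str:
    draft = proposer.act(task)
    for _ in range(max_rounds):
        objections = [c.name for c in critics if not c.accept(draft)]
        if not objections:
            return draft  # every critic signed off
        # fold the objections back into the next revision
        draft = proposer.act(f"{task} [revise; objections from: {', '.join(objections)}] {draft}")
    return draft


if __name__ == "__main__":
    proposer = Agent("writer", act=lambda t: f"answer({len(t)} chars of context)", accept=lambda d: True)
    critics = [
        Agent("fact-checker", act=str, accept=lambda d: "answer" in d),
        Agent("style-checker", act=str, accept=lambda d: len(d) < 200),
    ]
    print(deliberate("explain AGI in one line", proposer, critics))
```

Swap the lambdas for calls to actual models and you get the "50 AIs arguing" picture, just scaled down.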

[–] BaylorSwift3@futurology.today 2 points 11 months ago* (last edited 11 months ago) (1 children)

> I think we’re going to have to pit AIs against one another in another web of complexity in order to reach AGI.

I don't know if it's going to be the route to AGI, but what you are describing is already happening.

There's Microsoft's AutoGen framework, and OpenAI say that next month they too will have AI agents for ChatGPT.
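For a flavor of what that looks like, AutoGen's basic two-agent setup is roughly this (a sketch from memory of the pyautogen quick-start, so treat the exact arguments as assumptions; the model name, API key, and task are placeholders):

```python
# Rough sketch of AutoGen's two-agent pattern (pyautogen ~0.2). The model name,
# API key, and task message below are placeholders, not a tested configuration.
import autogen

config_list = [{"model": "gpt-4", "api_key": "YOUR_KEY_HERE"}]  # placeholder credentials

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config={"config_list": config_list},
)
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",                      # fully automated back-and-forth
    code_execution_config={"work_dir": "coding"},  # the proxy runs code the assistant writes
)

# The two agents message each other until the task is done (or they give up).
user_proxy.initiate_chat(assistant, message="Write and test a function that reverses a string.")
```

The user proxy executing the assistant's code is basically the 'agents checking each other's work' part.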

[–] drekly@lemmy.world 4 points 11 months ago

I just want an offline one that works on my PC without API costs. But then I have an old GPU, so that's a dream.

[–] kakes@sh.itjust.works 2 points 11 months ago (1 children)

I think we need to shift our paradigm one or two more times before we can start seriously talking about AGI. Current transformer models are impressive, but they're much better suited to modeling language than to what I would call "cognition".
I think we're close, but I don't think we'll get there just by scaling up or incrementally improving current technology.

[–] thantik@lemmy.world 1 points 11 months ago

Hell, honestly -- LLMs are smarter than half of the people I know already.