this post was submitted on 15 Dec 2023
357 points (84.0% liked)

Technology

 

A neuromorphic supercomputer called DeepSouth will be capable of 228 trillion synaptic operations per second, which is on par with the estimated number of operations in the human brain

Edit: updated link, no paywall

[–] ArbitraryValue@sh.itjust.works 342 points 11 months ago* (last edited 11 months ago) (4 children)

A better title would be "Supercomputer that could conceivably simulate entire human brain, based on a rough estimate of what it would take to do that if we had any idea how to do that, will switch on in 2024".
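For context, the "on par with the human brain" figure does come out of rough arithmetic along these lines. Here is a back-of-envelope sketch using commonly cited ballpark values for neuron count, synapses per neuron, and mean firing rate; none of these numbers come from the article, and the result is only order-of-magnitude:

```python
# Back-of-envelope estimate of synaptic operations per second in a human brain.
# All three inputs are rough, commonly cited ballpark figures, not measurements.
neurons = 86e9             # ~86 billion neurons
synapses_per_neuron = 1e4  # ~10,000 synapses per neuron (order of magnitude)
mean_firing_rate_hz = 0.3  # average firing rate; estimates span roughly 0.1-1 Hz

total_synapses = neurons * synapses_per_neuron
synaptic_ops_per_s = total_synapses * mean_firing_rate_hz

print(f"total synapses: {total_synapses:.1e}")      # ~8.6e14
print(f"synaptic ops/s: {synaptic_ops_per_s:.1e}")  # ~2.6e14, a few hundred trillion
```

That lands in the same few-hundred-trillion range as the 228T figure, which is exactly the kind of rough estimate being described here.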

[–] gibmiser@lemmy.world 95 points 11 months ago (1 children)

For real. I'm reading the title all wondering how the fuck they mapped all the neuron connections and... nope, the real innovative part of the story is clickbait

[–] neuropean@kbin.social 47 points 11 months ago (1 children)

That’s only counting connections. The brain learns by making new connections, through complex location- and timing-dependent inputs from other neurons. It’s way more complex than the number of connections, and if neuroscientists are still studying the building blocks, we don’t have much hope of recreating it.
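To make "timing-dependent" concrete, below is a minimal sketch of spike-timing-dependent plasticity (STDP), a textbook learning rule in which the order and spacing of spikes, not the mere existence of a connection, decides whether a synapse strengthens or weakens. The parameters are illustrative, and this is not a claim about what DeepSouth implements.

```python
import math

# Toy spike-timing-dependent plasticity (STDP) rule: the sign and size of the
# weight change depend on the relative timing of pre- and post-synaptic spikes,
# not just on whether a connection exists. Parameters are illustrative only.
A_PLUS, A_MINUS = 0.01, 0.012   # potentiation / depression amplitudes
TAU_MS = 20.0                   # decay time constant in milliseconds

def stdp_dw(t_pre_ms: float, t_post_ms: float) -> float:
    """Weight change for a single pre/post spike pair."""
    dt = t_post_ms - t_pre_ms
    if dt > 0:   # pre fires before post -> strengthen the synapse (LTP)
        return A_PLUS * math.exp(-dt / TAU_MS)
    else:        # post fires before pre -> weaken the synapse (LTD)
        return -A_MINUS * math.exp(dt / TAU_MS)

# Flipping a 5 ms timing difference flips strengthening into weakening:
print(stdp_dw(10.0, 15.0))  # ~ +0.0078 (potentiation)
print(stdp_dw(15.0, 10.0))  # ~ -0.0093 (depression)
```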

[–] IHeartBadCode@kbin.social 32 points 11 months ago (1 children)

This also ignores that the brain is not wholly an electrical system. There are all kinds of chemical receptors within the brain that alter all kinds of neurological function. Kind of the reason why drugs are a thing. On small scales we have a pretty good idea how these work, at least for the receptors that we're aware of. On larger scales it's mostly guessing at this point. The brain has a knack for doing more than the sum of its parts on a pretty regular basis.
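As a cartoon of that point, the toy function below scales an electrically computed weight update by a single "dopamine" value; the function and numbers are invented for illustration, and real receptor dynamics are vastly richer.

```python
# Cartoon of neuromodulation: the same electrically computed weight update is
# scaled (or gated, or reversed) by a chemical signal, here a single "dopamine"
# scalar. Made up for illustration; real receptor dynamics are far richer.
def modulated_dw(base_dw: float, dopamine: float) -> float:
    return base_dw * dopamine

print(modulated_dw(0.008, 1.0))   # baseline update:            0.008
print(modulated_dw(0.008, 0.0))   # chemically gated off:       0.0
print(modulated_dw(0.008, -0.5))  # sign flipped by the signal: -0.004
```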

[–] 0ops@lemm.ee 1 points 11 months ago* (last edited 11 months ago)

Not to mention the scale and nature of the "dataset" that our brains were trained on. Millions of years of instinct encoded in DNA, plus a few years gathering data from dozens of senses 24/7 (including chemical receptors, like you said) and in turn manipulating our bodies, interacting with the environment, and observing the results. We've been doing all of this since embryo.

We can't just feed a model raw image and text data and expect its intelligence to be comparable to ours. However you quantify intelligence/consciousness/whatever, the text/image model's thought processes will be alien to ours, which makes sense because its "environment" is nothing like ours: just text and image input and output.

[–] Geek_King@lemmy.world 19 points 11 months ago

I get so tired of these half-truth spun news article headlines. Thank you for bringing it back down to reality.

[–] Warl0k3@lemmy.world 15 points 11 months ago

Four grad students out there hand-entering NXML rows while squinting at AI-enhanced SEM images should be able to get all 228T done by... next quarter, right?

This is setting aside that bus capacity is the bottleneck vs. compute power and they have yet to demonstrate bus performance of a full 228T connections/second with implicit timing which, to my knowledge, has never been demonstrated in a system a tiny fraction of this size. Though that's not to say it's impossible, but while this machine is incredibly powerful the comparison to human brains is predictably inaccurate...