[–] JohnDClay@sh.itjust.works 12 points 15 hours ago (13 children)

And it's hard to tell what the difference is. Apple's 'built from the ground up for AI' chips just have more RAM. What's the difference with these CPUs? Do they just have more onboard graphics processing that can also be used for matrix multiplication?

[–] synapse1278@lemmy.world 3 points 13 hours ago (4 children)

Basically yes. They come with an NPU (Neural Processing Unit), which is hardware acceleration for matrix multiplications. It can't do graphics. Slap whatever NPU into the chip, boom: AI laptop!
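
For a rough idea of what that acceleration targets: a single fully-connected layer of a neural net is essentially one matrix multiply plus a bias add, and that matmul is the operation an NPU is built to run fast (usually in low precision). A minimal NumPy sketch, with made-up sizes for illustration:

```python
import numpy as np

# One dense (fully-connected) layer: y = x @ W + b
# The matrix multiply here is the workload an NPU is designed to accelerate,
# typically in int8/fp16 rather than fp32.
batch, in_features, out_features = 32, 512, 256  # arbitrary example sizes

x = np.random.randn(batch, in_features).astype(np.float32)         # input activations
W = np.random.randn(in_features, out_features).astype(np.float32)  # learned weights
b = np.random.randn(out_features).astype(np.float32)               # learned bias

y = x @ W + b      # the matmul an accelerator offloads
print(y.shape)     # (32, 256)
```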

[–] JohnDClay@sh.itjust.works 2 points 13 hours ago (3 children)

Matrix multiplication is also largely what graphics cards do, so I wonder how the NPUs are different.

[–] synapse1278@lemmy.world 1 points 12 hours ago (1 children)

Modern graphics cards pack a lot of functionality: shading units, ray tracing, video encoding/decoding. An NPU is just the part needed to accelerate neural nets.

[–] JohnDClay@sh.itjust.works 2 points 12 hours ago (1 children)

But you can accelerate neural nets better with a GPU, right? They've got a lot more parallel matrix multiplication compute than any NPU you can slap on a CPU.
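
To see the gap being pointed at here, a quick timing sketch comparing the same matrix multiply on CPU and on a discrete GPU (this assumes PyTorch and a CUDA-capable card; the matrix size is arbitrary):

```python
import time
import torch

N = 4096  # arbitrary matrix size for the comparison
a = torch.randn(N, N)
b = torch.randn(N, N)

# CPU matmul
t0 = time.perf_counter()
c_cpu = a @ b
print(f"CPU: {time.perf_counter() - t0:.3f} s")

# GPU matmul, only if a CUDA device is present
if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()           # make the timing fair
    t0 = time.perf_counter()
    c_gpu = a_gpu @ b_gpu
    torch.cuda.synchronize()           # wait for the kernel to finish
    print(f"GPU: {time.perf_counter() - t0:.3f} s")
```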

[–] synapse1278@lemmy.world 1 points 9 hours ago

It all depends on the GPU. If it's something integrated into the CPU, it probably won't do better; if it's a $2,000 dedicated GPU with 48GB of VRAM, it will be very powerful for neural net computing. NPUs are most often implemented as small, low-power, embedded solutions. Their goal is not to compete with data centers or workstations, it's to enable some basic "AI" features on portable devices, e.g. a "smart" camera with object recognition to give you alerts.
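
As a hypothetical example of that kind of small on-device workload: ONNX Runtime can target an NPU through a vendor execution provider (the QNN provider on Qualcomm hardware, for instance), falling back to the CPU if it isn't available. The model file and input name below are placeholders, not a real shipped model:

```python
import numpy as np
import onnxruntime as ort

# Hypothetical object-detection model; "detector.onnx" and the input name
# "images" are placeholders for illustration only.
session = ort.InferenceSession(
    "detector.onnx",
    providers=["QNNExecutionProvider", "CPUExecutionProvider"],  # NPU first, CPU fallback
)

frame = np.random.rand(1, 3, 320, 320).astype(np.float32)  # stand-in camera frame
outputs = session.run(None, {"images": frame})             # runs on the NPU if available
print([o.shape for o in outputs])
```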
