this post was submitted on 31 Jul 2023
18 points (87.5% liked)

Apple


I found this on reddit, which I reluctantly cite here [1]. Anyway, the comments and findings there were about as vague as Apple's claim to beat the Nvidia RTX 3090 with that fancy chart.

Regardless, all of Apple's current lineup, including the MacBooks, Mac mini, and Mac Studio with the Max chip, comes with a 16-core Neural Engine, and the Ultra comes with a 32-core Neural Engine.

What does it actually do, beyond the vague marketing claims? As far as I can tell, it's only accessible to Apple's proprietary apps: Finder, FaceTime, Final Cut Pro...

And judging from the schematic diagram of the Apple M-series SoC, the Neural Engine takes up a significant amount of die area.

Do PyTorch and other ML frameworks actually utilize those 16/32 cores?

[1] https://www.reddit.com/r/apple/comments/122iqf4/everything_we_actually_know_about_the_apple/
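For what it's worth, PyTorch's Apple-silicon acceleration today is the Metal-backed `mps` device, which runs on the GPU rather than the Neural Engine. A quick sketch of checking what's reachable, assuming a reasonably recent PyTorch build:

```python
# Check which accelerator PyTorch can actually reach on this machine.
# The "mps" backend uses the GPU through Metal; it does not touch the ANE.
import torch

if torch.backends.mps.is_available():
    device = torch.device("mps")  # Apple-silicon GPU via Metal
else:
    device = torch.device("cpu")  # fallback on other hardware / OSes

x = torch.rand(2, 3, device=device)
print(device, x.shape)
```

Whatever this prints, note that no PyTorch device string maps to the Neural Engine itself.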

top 4 comments
[–] gdbjr@lemmy.world 13 points 1 year ago* (last edited 1 year ago)

Tasks the Apple Neural Engine Takes Responsibility For

It’s time to dive into just what sort of jobs the Neural Engine takes care of. As previously mentioned, every time you use Face ID to unlock your iPhone or iPad, your device uses the Neural Engine. When you send an animated Memoji message, the Neural Engine is interpreting your facial expressions.

That’s just the beginning, though. Cupertino also employs its Neural Engine to help Siri better understand your voice. In the Photos app, when you search for images of a dog, your iPhone does so with ML (hence the Neural Engine).

Initially, the Neural Engine was off-limits to third-party developers. It couldn’t be used outside of Apple’s own software. In 2017, though, Cupertino released the Core ML API to developers with iOS 11. That’s when things got interesting.

The CoreML API allowed developers to start taking advantage of the Neural Engine. Today, developers can use CoreML to analyze video or classify images and sounds. It’s even able to analyze and classify objects, actions and drawings.

https://www.macobserver.com/tips/deep-dive/what-is-apple-neural-engine/

[–] riodoro1@lemmy.world 5 points 1 year ago

This thread claims the Neural Engine works on models with the frameworks being private (fuck you for that, Apple), but there are solutions to play with it outside of Xcode.

[–] Teal@lemm.ee 5 points 1 year ago

From what I’ve read, it’s mainly for AI and machine-learning jobs. Some common uses are specific photo-editing tasks like AI noise reduction (or other AI-based editing); video encoding also makes use of the Neural Engine.

Aside from that I’m really not sure.

[–] AgentCorgi@lemmy.world 2 points 1 year ago

PyTorch can’t use the ANE directly; its Apple-silicon backend (`mps`) runs on the GPU via Metal.