It can and it will. That is one of the uses of "NPUs" I'm most excited about.
Basically you can run a (potentially open-source) small LLM on the phone using whatever context the keyboard has access to (at a minimum, what you've typed so far) and have the keyboard generate the next token(s).
Since this is computationally intensive, the model has to be small and you need dedicated hardware to run it efficiently; otherwise you would need a 500W GPU like the big players use. Locally you can do it at around 0.5W. Of course, adjust your expectations accordingly.
I don't know of any project doing it right now, but I imagine Microsoft will integrate it into SwiftKey soon, with open-source projects to follow.
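To give a rough idea of what such a keyboard prediction loop could look like, here's a minimal sketch using the llama-cpp-python bindings. The model file name, context size, and token budget are all assumptions for illustration, not taken from any real keyboard implementation.

```python
# Hypothetical sketch: on-device next-word suggestion with a small local LLM.
# Assumes llama-cpp-python and a small quantized GGUF model; names are illustrative.
from llama_cpp import Llama

# A small quantized model is the kind of thing a phone NPU/CPU could plausibly handle.
llm = Llama(model_path="tiny-model-q4.gguf", n_ctx=256, verbose=False)

def suggest_next_words(typed_text: str, n_words: int = 3) -> str:
    """Return a short continuation of whatever the user has typed so far."""
    out = llm(
        typed_text,
        max_tokens=8,      # keep latency low: only a handful of tokens
        temperature=0.0,   # deterministic, autocorrect-style suggestions
        stop=["\n"],
    )
    continuation = out["choices"][0]["text"]
    return " ".join(continuation.split()[:n_words])

print(suggest_next_words("See you at the"))
```

A real keyboard would of course run this in the background and surface the words as tap-to-accept suggestions rather than printing them.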
What are NPUs?
Neural Processing Unit. Basically an AI processor inside the chip, alongside your CPU and GPU.
Interesting concept, thanks for explaining
I think you hugely overestimate what it takes to complete and correct a few words. Maybe you would want some sort of accelerator for fine-tuning, but 1. you probably don't even need fine-tuning, and 2. you could just run that on the CPU while your device is charging. For inference, modern CPUs are more than powerful enough.
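As a rough back-of-envelope check (the model size and CPU throughput below are illustrative assumptions, not measurements):

```python
# Back-of-envelope: can a phone CPU keep up with a tiny completion model?
# All numbers are illustrative assumptions, not benchmarks.
params = 100e6                  # a ~100M-parameter "keyboard-sized" model
flops_per_token = 2 * params    # roughly 2 FLOPs per parameter per generated token
cpu_flops = 20e9                # assume ~20 GFLOP/s sustained on a mobile CPU

tokens_per_second = cpu_flops / flops_per_token
print(f"~{tokens_per_second:.0f} tokens/s")  # ~100 tokens/s on these assumptions
```

On those assumptions a CPU keeps up comfortably for a few suggested words, even before any NPU offload.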
Yeah, modern ARM CPUs can run at 3GHz and play PS4-level games, but I don't want my phone to become a hand warmer every time I want to type a quick email...
And of course, I'm not talking about correcting "fuck" to "duck", I'm talking about ChatGPT-level prediction. Or Llama 2, or Gemini Nano, or whatever...