this post was submitted on 23 Jan 2024
41 points (95.6% liked)

Privacy

I've been playing with both the Thumb and the Unexpected keyboards. I like 'em both, but man, I have to admit I'd like them more if they had that top bar that predicts what you might type next. Is that just a no-go from a privacy perspective? Can that functionality be local?

(I also wouldn't mind a good voice typing feature)

[–] Tja@programming.dev 6 points 10 months ago* (last edited 10 months ago) (2 children)

It can and it will. That is one of the uses of "NPUs" I'm most excited about.

Basically, you can run a (potentially open-source) small LLM on the phone using whatever context the keyboard has access to (at a minimum, what you've typed so far) and have the keyboard generate the next token(s).
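The loop being described — take the keyboard's context, return a few candidate next words for the suggestion bar — can be sketched with a toy bigram predictor. This is deliberately not an LLM (a real implementation would run a small quantized model), but the interface a suggestion bar needs is the same; all names here are illustrative:

```python
from collections import Counter, defaultdict

class BigramPredictor:
    """Toy stand-in for an on-device language model: predicts the
    next word from the previous word using bigram counts."""

    def __init__(self):
        self.counts = defaultdict(Counter)

    def train(self, text):
        # Count which word follows which in the training text.
        words = text.lower().split()
        for prev, nxt in zip(words, words[1:]):
            self.counts[prev][nxt] += 1

    def suggest(self, context, k=3):
        """Return up to k candidate next words for the suggestion bar,
        most frequent first. Empty context yields no suggestions."""
        parts = context.lower().split()
        if not parts:
            return []
        return [w for w, _ in self.counts[parts[-1]].most_common(k)]

predictor = BigramPredictor()
predictor.train("see you later see you soon see you later today")
print(predictor.suggest("I will see you"))  # → ['later', 'soon']
```

A real on-device setup would swap `BigramPredictor` for a few-hundred-megabyte quantized causal LM and run the same `suggest()` call on the NPU, which is exactly why the whole thing can stay local.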

Since this is computationally intensive, the model has to be small and you need dedicated hardware to optimize it; otherwise you'd need a 500W GPU like the big players. You can do it for 0.5W locally. Of course, adjust your expectations accordingly.

I don't know of any project doing it right now, but I imagine Microsoft will integrate it into SwiftKey soon, with open-source projects to follow.

[–] technomad 5 points 10 months ago (1 children)

What's an NPU?
[–] MrFunnyMoustache@lemmy.ml 4 points 10 months ago (1 children)

Neural Processing Unit. Basically an AI processor inside the chip, alongside your CPU and GPU.

[–] technomad 4 points 10 months ago

Interesting concept, thanks for explaining

[–] kevincox@lemmy.ml 1 points 10 months ago* (last edited 10 months ago) (1 children)

I think you hugely overestimate what it takes to complete and correct a few words. Maybe you'd want some sort of accelerator for fine-tuning, but 1) you probably don't even need fine-tuning, and 2) you can probably just run it on the CPU while your device is charging. For inference, modern CPUs are more than powerful enough.
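For the "complete a few words" part of the job, the point is easy to see: even naive pure-Python prefix completion over a sorted vocabulary is a couple of binary-search steps per keystroke, which any phone CPU handles instantly. A minimal sketch (the vocabulary and function names are made up for illustration):

```python
import bisect

def make_completer(vocab):
    """Return a completion function over a sorted vocabulary.
    Each lookup is O(log n) via binary search, then a short scan."""
    words = sorted(vocab)

    def complete(prefix, k=3):
        # Jump to the first word >= prefix, then collect matches.
        i = bisect.bisect_left(words, prefix)
        out = []
        while i < len(words) and words[i].startswith(prefix) and len(out) < k:
            out.append(words[i])
            i += 1
        return out

    return complete

complete = make_completer(["privacy", "private", "predict", "prediction", "phone"])
print(complete("pri"))  # → ['privacy', 'private']
```

LLM inference is obviously heavier than this, but it makes the baseline concrete: the classic autocomplete half of the feature costs essentially nothing on a CPU.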

[–] Tja@programming.dev 1 points 10 months ago* (last edited 10 months ago)

Yeah, modern ARM CPUs can run at 3 GHz and play PS4-level games, but I don't want my phone to become a handwarmer every time I want to type a quick email...

And of course, I'm not talking about correcting "fuck" to "duck"; I'm talking about ChatGPT-level prediction. Or Llama 2, or Gemini Nano, or whatever...