this post was submitted on 06 Jul 2024
107 points (100.0% liked)


As mentioned in the comments, plain-text keys aren't bad; they are necessary. You have to have at least one plain-text key available in order to be able to use encryption at all.

all 49 comments
[–] riskable@programming.dev 89 points 4 months ago* (last edited 4 months ago) (4 children)

This is an "it's turtles all the way down!" problem. An application has to be able to store its encryption keys somewhere. You can encrypt your encryption keys, but then where do you store that key? Ultimately any application will need access to the plaintext key in order to function.
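To make the recursion concrete, here's a minimal key-wrapping sketch in TypeScript (purely illustrative, not any particular app's real code): the data key gets encrypted with a key-encryption key (KEK), but the KEK itself still has to be available in plaintext to something.

```typescript
import { randomBytes, createCipheriv, createDecipheriv } from "node:crypto";

// The data key encrypts the user's data; the KEK encrypts the data key.
// But the app still needs the KEK in plaintext: turtles all the way down.
const kek = randomBytes(32);     // in practice: from an HSM, a keychain, or disk
const dataKey = randomBytes(32);

function wrapKey(key: Buffer, kek: Buffer): Buffer {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", kek, iv);
  const wrapped = Buffer.concat([cipher.update(key), cipher.final()]);
  // Store iv + auth tag + wrapped key; none of it is usable without the KEK.
  return Buffer.concat([iv, cipher.getAuthTag(), wrapped]);
}

function unwrapKey(blob: Buffer, kek: Buffer): Buffer {
  const iv = blob.subarray(0, 12);
  const tag = blob.subarray(12, 28);
  const decipher = createDecipheriv("aes-256-gcm", kek, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(blob.subarray(28)), decipher.final()]);
}
```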

On servers the best practice is to store the encryption keys somewhere that isn't on the server itself, such as a networked Hardware Security Module (HSM), but literally any location that isn't physically on/in the server itself is good enough. Some Raspberry Pi attached to the network in the corner of the data center would be nearly as good, because the attack you're protecting against with this kind of encryption is someone walking out of the data center with your server (and then decrypting the data).
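As a toy sketch of that setup (the endpoint and format here are entirely hypothetical), the server could fetch its key-encryption key from that separate box at startup, so the key never lives on the server's own disk:

```typescript
// Toy sketch with a hypothetical endpoint: the KEK lives on a separate machine
// (a Pi in the corner, or an HSM front end), so an attacker who walks off with
// this server gets only ciphertext.
async function fetchKek(): Promise<Buffer> {
  const resp = await fetch("http://10.0.0.42:8200/kek"); // hypothetical key service
  if (!resp.ok) throw new Error("key service unreachable");
  return Buffer.from(await resp.text(), "hex");
}
```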

With a device like a phone you can't use a networked HSM since your phone will be carried around with you everywhere. You could store your encryption keys out on the Internet somewhere but that actually increases the attack surface. As such, the encryption keys get stored on the phone itself.

Phone OSes include tools like encrypted storage locations for things like encryption keys, but realistically they're no more secure than storing the keys as plaintext in the application's app-specific storage (which is encrypted on Android by default; not sure about iOS). Only that app and the OS itself have access to that storage location, so it's basically exactly the same as the special "secure" storage features... except easier to use and less likely to be targeted, exploited, and ultimately compromised because, again, it's a smaller attack surface.

If an attacker gets physical access to your device, you must assume they'll have access to everything on it unless the data is encrypted and the key for that isn't on the phone itself (e.g. it uses a hash generated from your thumbprint or your PIN). In that case your effective encryption key is your thumb(s) and/or PIN, because the Signal app's encryption keys are already encrypted on the filesystem.

Going full circle: You can always further encrypt something or add an extra step to accessing encrypted data but that just adds inconvenience and doesn't really buy you any more security (realistically). It's turtles all the way down.

[–] remington@beehaw.org 13 points 4 months ago (1 children)

Very good "Explain Like I'm Five" there!

[–] sic_1@feddit.de 9 points 4 months ago

Signal seems to be the least compromising messenger app out there, with its privacy policy and open-source code base. It's only natural that it's a frequent victim of FUD.

[–] tatterdemalion@programming.dev 13 points 4 months ago* (last edited 4 months ago) (1 children)

This reminds me of the apparent gnome-keyring security hole. It's mentioned in the first section of the arch wiki entry: https://wiki.archlinux.org/title/GNOME/Keyring

Any application can read the keyring entries of other apps, so it's pretty trivial to mount a targeted attack on someone's account if you can get them to run an executable on their machine.

[–] AVincentInSpace@pawb.social 10 points 4 months ago* (last edited 4 months ago) (1 children)

Exactly. Discord token stealers have been around for ages but no one gives Discord flak about leaving that secret unencrypted.

[–] princessnorah@lemmy.blahaj.zone 4 points 4 months ago

Discord aren't marketing themselves on secure or encrypted messaging. Signal is.


[–] lud@lemm.ee 2 points 4 months ago (1 children)

Don't phones have something comparable to a TPM?

[–] Ava@beehaw.org 2 points 3 months ago

The referenced article is about their desktop application.

[–] limitedduck@awful.systems 56 points 4 months ago (2 children)

I kind of agree that this may be a little overblown. Exploiting this requires device and filesystem access, so if you can get the keys, you can already get a lot more.

[–] ericjmorey@programming.dev 12 points 4 months ago

A secure enclave can already be accessed by the time someone can access the Signal encryption keys, so there's no extra security in putting the encryption keys in the secure enclave.

[–] solarvector@lemmy.zip 17 points 4 months ago

Also not a surprise because, as the article notes, it's been known and discussed since at least 2018.

[–] bjoern_tantau@swg-empire.de 15 points 4 months ago (2 children)

How else should the keys be stored?

[–] Pechente@feddit.org 8 points 4 months ago (1 children)

There are system-specific encryption methods, like Keychain Services on iOS, for storing exactly this kind of sensitive information.

[–] ericjmorey@programming.dev 10 points 4 months ago (1 children)

How would that provide additional security in the particular circumstance of someone having access to the Signal encryption keys on someone's phone?

[–] hedgehog@ttrpg.network 5 points 4 months ago

This particular scenario involves the MacOS desktop app, not the phone app. The link is showing just an image for me - I think it’s supposed to be to https://stackdiary.com/signal-under-fire-for-storing-encryption-keys-in-plaintext/

That said, let’s compare how it works on the phone to how it could work on MacOS and how it actually works on MacOS. In each scenario, we’ll suppose you installed an app that has hidden malware - we’ll call it X (just as a placeholder name) - and compare how much data that app has access to. Access to session data allows the app to spoof your client and send + receive messages.

On the phone, your data is sandboxed. X cannot access your Signal messages or session data. ✅ Signal may also encrypt the data and store an encryption key in the database, but this wouldn’t improve security except in very specific circumstances (basically it would mean that if exploits were being used to access your data, you’d need more exploits if the key were in the keychain). Downside: On iOS at least, you also don’t have access to this data.

On MacOS, it could be implemented using sandboxed data. Then, X would not be able to access your Signal messages or spoof your session unless you explicitly allowed it to (it could request access to it and you would be shown a modal). ✅ Downside: the UX to upload attachments is worse.

It could also be implemented by storing the encryption key in the keychain instead of in plaintext on disk. Then, X would not be able to access your Signal messages and session data. It might be able to request access - I’m not sure. As a user, you can access the keychain but you have to re-authenticate. ✅ Downside: None.

It’s actually implemented by storing the encryption key in plaintext, co-located with the encrypted database file. X can access your messages and session data. ❌

Is it foolproof? No, of course not. But it’s an easy step that would probably take an hour of dev time to refactor. They’re even already storing a key, just not one that’s used for this. And this has been a known issue that they’ve refused to fix for several years. Because of their hostile behavior towards forks, the FOSS community also cannot distribute a hardened version that fixes this issue.
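For what it's worth, Electron (which Signal Desktop is built on) already exposes the keychain option via its safeStorage API. A rough sketch of what that could look like (file name and key format are my own illustration, not Signal's actual layout):

```typescript
import { safeStorage } from "electron";
import { randomBytes } from "node:crypto";
import { readFileSync, writeFileSync } from "node:fs";

const KEY_FILE = "db-key.encrypted"; // hypothetical path, not Signal's real layout

// Generate the database key once and persist it OS-encrypted, not in plaintext.
function createDatabaseKey(): Buffer {
  if (!safeStorage.isEncryptionAvailable()) {
    // safeStorage only works in the main process after the app is ready
    throw new Error("OS keystore unavailable");
  }
  const dbKey = randomBytes(32);
  // encryptString goes through the macOS Keychain (DPAPI on Windows), so the
  // bytes written to disk are useless without the user's OS login context.
  writeFileSync(KEY_FILE, safeStorage.encryptString(dbKey.toString("hex")));
  return dbKey;
}

function loadDatabaseKey(): Buffer {
  return Buffer.from(safeStorage.decryptString(readFileSync(KEY_FILE)), "hex");
}
```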

[–] scott@lem.free.as 4 points 4 months ago (1 children)

In the device's secure enclave (e.g. TPM).

[–] onlinepersona@programming.dev 13 points 4 months ago* (last edited 4 months ago) (1 children)

How does that help when somebody has access to the phone via your PIN or password?


[–] chris@l.roofo.cc 4 points 4 months ago (1 children)

If I'm not mistaken, you can save keys in these chips so that they cannot be extracted. You can only use the key to encrypt/decrypt/sign/verify by asking the chip to perform these operations with your key.
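Conceptually, the contract looks something like this hypothetical interface (real access goes through stacks like PKCS#11 or tpm2-tools, not this API):

```typescript
// Hypothetical interface, for illustration only: the private key is generated
// inside the chip and never crosses this boundary; callers only hold a handle.
type KeyHandle = { id: string };

interface SecureElement {
  generateKey(alg: "ECDSA-P256" | "RSA-2048"): KeyHandle; // key is born in-chip
  sign(handle: KeyHandle, data: Uint8Array): Uint8Array;  // chip signs internally
  decrypt(handle: KeyHandle, ciphertext: Uint8Array): Uint8Array;
  // Deliberately no exportKey(): non-extractability is the whole point.
}
```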

[–] onlinepersona@programming.dev 5 points 4 months ago (2 children)

That sounds only marginally better. Access to the phone still means you can create a backup containing the key, so TPM wouldn't help much.


[–] lud@lemm.ee 3 points 4 months ago (1 children)

No, why would a backup contain non-exportable information? One of the reasons to use TPM to begin with is that sensitive information can't leave it.

[–] onlinepersona@programming.dev 1 points 3 months ago (1 children)

How do you restore a backup on another phone without the keys?


[–] lud@lemm.ee 2 points 3 months ago

You would probably use a recovery key that exists exclusively elsewhere, such as on paper in a vault, like BitLocker does.

I have no idea if Signal uses a TPM or not, but generally keys in a TPM are non-exportable, which is a very good thing and IMO the primary reason to use a TPM at all.

[–] scott@lem.free.as 1 points 4 months ago (1 children)

One would hope the backup is encrypted.

[–] onlinepersona@programming.dev 3 points 4 months ago* (last edited 4 months ago)

It is. A password is generated that you have to write down. It must've been a compromise: they knew most people would just pick a shitty password if one weren't generated for them, and it would end up on a piece of paper or in some digital form anyway.


[–] thingsiplay@beehaw.org 12 points 4 months ago (2 children)

After your edit, the post points only to an image, no longer to the source link. Please add the link back, at least into the body.

[–] thingsiplay@beehaw.org 7 points 4 months ago (1 children)

Isn't Signal Open Source? If so, why is it a surprise then?

[–] Haui@discuss.tchncs.de 19 points 4 months ago* (last edited 4 months ago) (1 children)

It's not a surprise, but it hasn't really received much attention even though people have reported it.

https://github.com/signalapp/Signal-Desktop/issues/5751

Also, Signal as a service is not really open source because it is not self-hostable. The server backend is proprietary, AFAIK.

[–] jet@hackertalks.com 15 points 4 months ago (1 children)

The back end is open source, but sometimes they've lagged years behind in releasing the source code. Other developers have stood up copies of the Signal network (Session, for example).

You can self-host your own Signal, but it's not federated, so you'd have nobody to talk to.

[–] Haui@discuss.tchncs.de 5 points 4 months ago (1 children)

So effectively it's not FOSS.

[–] jet@hackertalks.com 11 points 4 months ago* (last edited 4 months ago) (2 children)

It's absolutely FOSS. It is not, however, federated. But that is not a requirement for being free and open source software.

Think of it like this: Linux is free and open source software, even if I don't give you a shell on my computer.

You can use the code however you want, in any project you want.

[–] furikuri@programming.dev 3 points 4 months ago* (last edited 3 months ago) (1 children)

The back end is open source, but sometimes they've lagged years behind releasing the source code.

I think this is the more worrying part if true. The backend is licensed under the AGPL, so this would technically be a ~~violation~~ of their terms

  13. Remote Network Interaction; Use with the GNU General Public License.

Notwithstanding any other provision of this License, if you modify the Program, your modified version must prominently offer all users interacting with it remotely through a computer network (if your version supports such interaction) an opportunity to receive the Corresponding Source of your version by providing access to the Corresponding Source from a network server at no charge, through some standard or customary means of facilitating copying of software

Edit: For anyone else reading I looked into it a bit more and looks like the issue came to a head around 3 years ago, with this comment being made after a year of missing source code. The public repo has been pretty active since then, so the issue seems to be resolved

[–] hedgehog@ttrpg.network 1 points 4 months ago (1 children)

It isn’t, because their business practices violate the four FOSS essential freedoms:

  1. The freedom to run the program for any purpose
  2. The freedom to study and modify the program
  3. The freedom to redistribute copies of the original or modified program
  4. The freedom to distribute modified versions of the program

Specifically, freedom 4 is violated, because you are not permitted to distribute a modified version of the program that connects to the Signal servers (even if all your modified version does is to remove Google Play Services or something similar).

[–] jet@hackertalks.com 4 points 4 months ago

Molly.im

The license does not prevent number four from happening; they just ask people not to do it.

[–] ericjmorey@programming.dev 4 points 4 months ago

FYI: Submitting an image in the Lemmy "create post" submission form overrides the URL field. I'm not sure if anyone has submitted a bug about this.

[–] onlinepersona@programming.dev 4 points 4 months ago

Source? That's just an image.


[–] brie@beehaw.org 3 points 4 months ago

Restricting access to files within a single user account is why sandboxing is useful. In theory it limits the scope of a vulnerability in an app to only the files that app can read (unless there is a sandbox escape). Android instead prevents apps from accessing other apps' files by having each app run as a separate user.

One way to keep the encryption keys encrypted at rest is to require the login password (or another password) to open the app, and use it to encrypt the keys. That said, if an adversary can read Signal's data, they can almost certainly just replace Signal with a password-stealing version.
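A sketch of that password approach (parameters illustrative, not any app's real scheme), using a memory-hard KDF so the bytes on disk are useless without the password:

```typescript
import { scryptSync, randomBytes, createCipheriv } from "node:crypto";

// Illustrative only: derive a wrapping key from the user's password, then
// encrypt the database key with it. Only salt, iv, tag, and the wrapped key
// hit the disk; without the password they're useless (modulo weak passwords).
function wrapWithPassword(dbKey: Buffer, password: string) {
  const salt = randomBytes(16);
  const kek = scryptSync(password, salt, 32); // scrypt slows dictionary attacks
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", kek, iv);
  const wrapped = Buffer.concat([cipher.update(dbKey), cipher.final()]);
  return { salt, iv, tag: cipher.getAuthTag(), wrapped };
}
```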

[–] i_am_not_a_robot@discuss.tchncs.de 3 points 4 months ago (1 children)

The link is broken, but this is apparently an issue with Signal Desktop, not regular Signal. The proposed solution does not work on Windows: https://www.electronjs.org/docs/latest/api/safe-storage

[…] content is protected from other users on the same machine, but not from other apps running in the same userspace.

It's unfortunately about the best you can do on Windows.

[–] Recant@beehaw.org 1 points 4 months ago

Thanks for the heads up on the broken link! I fixed it.

[–] eveninghere@beehaw.org 2 points 3 months ago (1 children)

by any process on the system

This IS bad. BTW, they could ask the user to type the password rather than saving it in plaintext. I can't believe comments on this thread defend Signal...

[–] Recant@beehaw.org 1 points 3 months ago (1 children)

But can you trust that a user will pick a difficult-to-break password? They'll likely pick something simple to remember, but that is not a good password.

Then we are just back to essentially having a plaintext password, because if the attacker has a good dictionary, it will be easy to crack.

[–] eveninghere@beehaw.org 1 points 3 months ago

I can agree, but I MYSELF will pick a strong PW. So they better just fucking encrypt the thing, fucking please for the love of god.