this post was submitted on 09 Aug 2024
89 points (96.8% liked)

[–] catloaf@lemm.ee 50 points 3 months ago

Castellucci, whose pronouns are they/them, acquired this remarkable control after gaining access to the administrative account for GivEnergy, the UK-based energy management provider who supplied the systems. In addition to the control over an estimated 60,000 installed systems, the admin account—which amounts to root control of the company's cloud-connected products—also made it possible for them to enumerate names, email addresses, usernames, phone numbers, and addresses of all other GivEnergy customers (something the researcher didn't actually do).

tl;dr: a hacker (the good kind) exploited a weak encryption key to gain access to the vendor's management system. Because you too were probably wondering how key length and power generation could possibly be related.
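As a toy illustration of why a too-short RSA modulus is worthless (all numbers below are made up for the demo, not from the actual GivEnergy key), Pollard's rho factors a small semiprime instantly. Real attacks on short keys use heavier machinery like the number field sieve, but the principle is the same: once the modulus is factored, the private key falls out.

```python
import math
import random

def pollard_rho(n: int) -> int:
    """Find a nontrivial factor of n using Pollard's rho (Floyd cycle detection)."""
    if n % 2 == 0:
        return 2
    while True:
        x = random.randrange(2, n)
        y, c, d = x, random.randrange(1, n), 1
        while d == 1:
            x = (x * x + c) % n       # tortoise: one step
            y = (y * y + c) % n       # hare: two steps
            y = (y * y + c) % n
            d = math.gcd(abs(x - y), n)
        if d != n:                    # d == n means the cycle failed; retry
            return d

# Toy "RSA" modulus built from two known primes -- trivially breakable,
# just as a short real-world modulus is breakable with modern tools.
p, q = 2147483647, 4294967311
n = p * q
f = pollard_rho(n)
print(sorted([f, n // f]))  # recovers the two prime factors
```

A 2048-bit modulus, by contrast, is far beyond any known factoring method, which is the whole point of minimum key-length recommendations.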

[–] BertramDitore@lemmy.world 26 points 3 months ago

Wow, props to Castellucci for being a stand up person and not using their discovery to control or mess with tens of thousands of people’s power supply. And props to GivEnergy for not turning around and suing them after they reported finding the issue.

This could have gone badly in either direction, but we lucked out that Castellucci seems to be an excellent and conscientious citizen.

[–] LodeMike@lemmy.today 19 points 3 months ago (2 children)

How in the fuck do you even coax software into using a key like that? Did someone just say "yeah just use the smallest size possible, that'll be okay" and then just like not care?

[–] GreenEngineering3475@lemmy.world 21 points 3 months ago* (last edited 3 months ago) (2 children)

From the article:

In an email, a GivEnergy representative reinforced Castellucci’s assessment, writing:

In this case, the problematic encryption approach was picked up via a 3rd party library many years ago, when we were a tiny startup company with only 2, fairly junior software developers & limited experience. Their assumption at the time was that because this encryption was available within the library, it was safe to use. This approach was passed through the intervening years and this part of the codebase was not changed significantly since implementation (so hadn't passed through the review of the more experienced team we now have in place).
[–] sugar_in_your_tea@sh.itjust.works 15 points 3 months ago (1 children)

So, it sounds like they don't have regular security audits, because that's something that would absolutely get flagged by any halfway competent sec team.

[–] WhatAmLemmy@lemmy.world 4 points 3 months ago

No need for audits. It's only critical infrastructure embedded into tens of thousands of homes, lol.

[–] Telorand@reddthat.com 10 points 3 months ago

Yet another reminder that trust should be earned.

[–] umami_wasbi@lemmy.ml 9 points 3 months ago* (last edited 3 months ago) (1 children)

Because cryptography is specialized knowledge. Most curricula don't even include cryptography as a core topic in a Computer Science degree. Have a look at MIT's computer science curriculum: cryptography is tucked into the elective Fundamentals of Computer Security (6.1600). That's also why we now say DevSecOps instead of just DevOps. It simply boils down to this: teaching and learning cryptography is hard. It's still too much to expect a typical dev to implement cryptography correctly, even with a good library. Most don't know that compression and encryption don't mix well. Nor do they understand the importance of randomness, or that you must never use the same nonce twice. They don't even know that built-in string comparison (==) can't be used to verify password hashes, because it can leak timing information. Crypto library devs who understand crypto add big scary warnings, and someone still messes something up.

Still, I strongly support academics adding basic cryptography to the curriculum: common algorithms, key lengths, future threats, and how fast the security landscape moves, for the sake of the future of cybersecurity.
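The timing-attack point above is one of the few cases where the stdlib already has the right tool. A minimal sketch (the `verify_token` helper and the values are made up for illustration):

```python
import hashlib
import hmac
import os

def verify_token(stored_hash: bytes, presented: bytes) -> bool:
    """Compare a presented secret against a stored hash in constant time.

    hmac.compare_digest avoids the early-exit behaviour of ``==``, which
    leaks how many leading bytes matched via response timing.
    """
    presented_hash = hashlib.sha256(presented).digest()
    return hmac.compare_digest(stored_hash, presented_hash)

secret = os.urandom(16)
stored = hashlib.sha256(secret).digest()
print(verify_token(stored, secret))    # True
print(verify_token(stored, b"wrong"))  # False
```

Ordinary `==` on bytes returns as soon as it hits the first mismatching byte, so an attacker who can measure response times can guess a secret byte by byte; `compare_digest` always examines the full input.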

[–] sugar_in_your_tea@sh.itjust.works 6 points 3 months ago (1 children)

Eh, I disagree. Cryptography really isn't something your average software engineer needs to know about, as long as they understand that you should never roll your own crypto. If you teach it in school, most students will forget the details and potentially just remember some now-insecure details from their classes.

Instead, we should be pushing for more frequent security audits. Any halfway decent security audit would catch this, and probably a bunch of other issues they have as well. Expect that from any org with revenue above some level.

[–] umami_wasbi@lemmy.ml 5 points 3 months ago* (last edited 3 months ago) (1 children)

At least have a few lessons so they remember not to roll their own crypto, and to respect those scary warnings. That needs to be engraved in their minds.

I agree a security audit would catch this, but that's after the fact. There's a need for a more preventative solution.

[–] sugar_in_your_tea@sh.itjust.works 3 points 3 months ago* (last edited 3 months ago) (1 children)

Security audits should be preventative. Have them before any significant change in infrastructure is released, and have them periodically as a backup.

I had a cryptography and security class in college (I took the elective), and honestly, we didn't cover all that much that's actually relevant to the industry, and everything that was relevant was quickly outdated. That's not going to be a solution, we need a greater appreciation for security audits.

[–] umami_wasbi@lemmy.ml 1 points 3 months ago* (last edited 3 months ago)

At least teaching the concept of "don't ever do it yourself" won't hurt, and won't get outdated anytime soon.

However, relying on "just use the library" alone can hurt security in the long term, because it puts the whole burden on the library dev to maintain a foolproof design. They can burn out, quit, and leave a big vulnerability behind, since most devs won't touch code again while it's still "working."

Cybersecurity is very important in today's digital landscape, and cryptography is one of its pillars. I believe it's essential for devs to learn the core principles of cryptography.

Again, audits are nice, and you can apply them at various points, but they're not a silver bullet. An audit is just a tool; it can't replace proper education. People are often ignorant. An audit can generate as many warnings as it likes, but it's people who need to take corrective action, and they can ignore the warnings, or be pressured to. Unless the audit is part of a compliance certification process where failure could put the company out of business, most managers will say, "Why would I care? That costs more."

[–] shortwavesurfer@lemmy.zip 12 points 3 months ago (1 children)

This was an incredibly interesting article.

[–] jayk@lemmy.ca 10 points 3 months ago (1 children)

Right? I feel like Ars Technica has been on a roll this year.

[–] NOT_RICK@lemmy.world 8 points 3 months ago

I subbed because I’ve really enjoyed their content for the past few years

[–] schizo@forum.uncomfortable.business 5 points 3 months ago (1 children)

You have to wonder how many other things are out there with effectively worthless encryption because some old document or default option told them to/allowed them to implement it without any 'hey! some 14 year old with a TI-83 could crack this key!' warnings.

[–] shortwavesurfer@lemmy.zip 9 points 3 months ago

There was a book with Bitcoin wallet generator code in it that specifically said the code was vulnerable and was only meant as a demo, and yet somebody released a wallet built on that code and fucked a bunch of people over by accident.

[–] shortwavesurfer@lemmy.zip 4 points 3 months ago (3 children)

You know, at least when I've had to generate RSA keys for SSH, it seems like the highest I can go is 4096. It makes me wonder why you can't generate a key of any length that's a multiple of 1024. Like, what if I wanted a 20,480-bit key?
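For what it's worth, `openssl` will happily generate RSA keys larger than 4096 bits, up to a compile-time cap (typically 16384 bits, which is why something like 20,480 fails). A quick sketch (the filenames are made up; sizes kept modest so it finishes quickly):

```shell
# Generate RSA keys of increasing size and note how generation slows down.
for bits in 1024 2048 4096; do
    openssl genrsa -out "rsa_${bits}.pem" "$bits" 2>/dev/null
    # Confirm the actual modulus size of the generated key.
    openssl rsa -in "rsa_${bits}.pem" -noout -text 2>/dev/null | head -1
done
```

Generation time grows steeply with key size because finding two random primes of the right length gets much harder, which is part of why nobody bothers with enormous RSA keys in practice.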

[–] solrize@lemmy.world 5 points 3 months ago (1 children)

Current recommendation is to stop using RSA in new deployments altogether. ECC is preferred now, and the major programs (OpenSSL, OpenSSH, etc.) support it.

[–] shortwavesurfer@lemmy.zip 3 points 3 months ago (2 children)

That's ECDSA, correct? Or is that something different?

ECDSA

Yup, that's an implementation that uses ECC (elliptic curve cryptography).

[–] solrize@lemmy.world 1 points 3 months ago* (last edited 3 months ago)

ECDSA is the elliptic curve digital signature algorithm. Key exchange is usually done with ECDH (elliptic curve Diffie-Hellman). There has been some debate on the exact best way to do ECDH, but I think the FOSS world has currently settled on Curve25519. Anyway, it's best to leave stuff like that to specialists if you're not one yourself. As mentioned, OpenSSL and OpenSSH both provide working implementations, so go ahead and use them. The NIST curve P-256 is also perfectly fine as far as anyone can tell. It has the practical drawback that it's especially easy to make implementation mistakes and screw up the security if you don't know what you're doing, but the deployed implementations out there have been checked carefully and should be OK to use. Bitcoin uses the closely related secp256k1 curve, so if anything were fundamentally wrong with this family of curves, someone would have broken it and gotten pretty darn rich ;).
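The two tools mentioned above make the curve-based options a one-liner each. A sketch (filenames and the comment string are made up for the demo):

```shell
# OpenSSH: Ed25519 signature key, the usual recommendation for new SSH keys.
# -N "" sets an empty passphrase so this runs non-interactively.
ssh-keygen -t ed25519 -N "" -f ./id_ed25519_demo -C "demo key"

# OpenSSL: X25519 key pair, used for Diffie-Hellman style key agreement.
openssl genpkey -algorithm X25519 -out x25519_demo.pem
openssl pkey -in x25519_demo.pem -pubout -out x25519_demo.pub
```

Note the division of labor: Ed25519 is for signatures (authentication), while X25519 is for key exchange; both are built on Curve25519.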

[–] umami_wasbi@lemmy.ml 5 points 3 months ago* (last edited 3 months ago)

I believe you can with OpenSSL, but it will take a lot of time both to generate and to use the key. Imagine you sign something with that key and the other party is using a low-end device: it might take a few minutes to verify the signature. The drawbacks just outweigh the benefits. Security is a balancing act between complexity and usability.
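The slowdown is easy to see without real keys. RSA private-key operations are modular exponentiations with a full-size exponent, and their cost grows roughly cubically with the modulus size. A stdlib-only sketch (the modulus and exponent below are synthetic stand-ins, not a real key):

```python
import time

def modexp_time(bits: int, trials: int = 3) -> float:
    """Average wall time of one RSA-style private operation: m^d mod n.

    n and d are arbitrary values of the right size; only the cost of
    pow() with a full-size exponent is being measured here.
    """
    n = (1 << bits) - 159           # arbitrary odd modulus of `bits` bits
    d = (1 << (bits - 1)) + 12345   # stand-in full-size private exponent
    m = 0xC0FFEE
    start = time.perf_counter()
    for _ in range(trials):
        pow(m, d, n)
    return (time.perf_counter() - start) / trials

for bits in (2048, 4096, 8192):
    print(f"{bits:5d}-bit: {modexp_time(bits) * 1000:.1f} ms")
```

Each doubling of the key size costs several times more per operation, so a 20,480-bit key would make every signature or decryption painfully slow for only a marginal security gain over 3072- or 4096-bit RSA.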

[–] barsquid@lemmy.world 3 points 3 months ago

Yet another reason to never connect your devices to the cloud.