this post was submitted on 18 May 2024
361 points (95.5% liked)

Technology

  • Google Cloud accidentally deleted UniSuper's account and backups, causing a major data loss and downtime for the company.
  • UniSuper was able to recover data from backups with a different provider after the incident.
  • The incident highlighted the importance of having safeguards in place for cloud service providers to prevent such catastrophic events from occurring.
[–] Gormadt@lemmy.blahaj.zone 51 points 4 months ago (5 children)

As the saying goes: if you only have one backup you have zero backups.

How the fuck does Google of all companies manage to accidentally delete that‽

[–] smokinliver@sopuli.xyz 21 points 4 months ago (1 children)

If this is the thing I heard of a few days ago, then Google had multiple backups on different sites, but they managed to delete all of them.

[–] brbposting@sh.itjust.works 5 points 4 months ago (1 children)

I guess they weren’t paying quite enough to have offline backups? I believe financial institutions can keep stuff stored in caves (think records of all the mortgages a bank wants to be repaid for - data loss isn’t an option).

[–] T156@lemmy.world 9 points 4 months ago

From the sounds of it, they did, since they were able to recover the data from elsewhere.

They just lost the data they kept and stored with Google.

[–] RedditRefugee69@lemmy.world 10 points 4 months ago

I’m betting job cuts and someone was in a hurry

[–] T156@lemmy.world 8 points 4 months ago (2 children)

Backups all tied to the same Google account that got mistakenly terminated, and automation did the rest?

It didn't matter that they might have had backups on different services; since it was all centralised through Google, it was all blown away simultaneously.

[–] erwan@lemmy.ml 2 points 4 months ago (1 children)

It's weird that the backups got deleted immediately. I would imagine they'd get marked for deletion but only really deleted something like a month later, to prevent this kind of issue.

[–] Hootz@lemmy.ca 1 points 4 months ago (1 children)

That's for when accounts are closed or payments are missed. I think in this case they deleted the sub itself, which bypassed everything for instant deletion.

[–] asdfasdfasdf@lemmy.world 1 points 4 months ago* (last edited 4 months ago) (1 children)

I don't see why it matters that it was a subscription. Anything which deletes data should be a soft delete.

[–] lolcatnip@reddthat.com 1 points 4 months ago

Sometimes it has to be a hard delete to comply with a user's request to remove data.
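
The grace-period scheme erwan describes, plus the hard-delete exception lolcatnip raises, could look roughly like the minimal Python sketch below. All names and the 30-day window are hypothetical, not Google's or anyone's actual implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

GRACE_PERIOD = timedelta(days=30)  # hypothetical window before real deletion


@dataclass
class BackupRecord:
    backup_id: str
    deleted_at: datetime | None = None  # soft-delete marker; None means live

    def soft_delete(self) -> None:
        # Mark for deletion; data stays recoverable during the grace period.
        self.deleted_at = datetime.now(timezone.utc)

    def restore(self) -> None:
        # Undo the soft delete any time before the purge job runs.
        self.deleted_at = None

    def purgeable(self, now: datetime) -> bool:
        # Eligible for real deletion only once the grace period has elapsed.
        return self.deleted_at is not None and now - self.deleted_at >= GRACE_PERIOD


def purge(records: list[BackupRecord], hard_delete_ids: set[str]) -> list[BackupRecord]:
    # Keep everything except records past the grace period or covered by an
    # explicit data-removal request (the hard-delete case mentioned above).
    now = datetime.now(timezone.utc)
    return [r for r in records
            if r.backup_id not in hard_delete_ids and not r.purgeable(now)]
```

In this sketch a restore during the grace period is just clearing the marker; only the purge job or an explicit removal request actually destroys data.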

[–] ricdeh@lemmy.world 2 points 4 months ago

UniSuper was able to recover data from backups with a different provider after the incident.

[–] elucubra@sopuli.xyz 5 points 4 months ago (1 children)

My first job was in a Big Iron shop in the late '80s, where I was in charge of backups. We kept three sets of backups on two different media: one on hand, one in a different location in the main building in a water- and fireproof safe, and one offsite. We had a major failure one day and had to do a restore.

Both in-house copies failed to restore, and we were in a panic. Thankfully the offsite copy worked. That taught me to keep all my important data on three sets. As the old saying goes: data loss is not an if question, but a when question. Also, remember that "the cloud" simply means someone else's remote servers over which you have no control.

[–] Diplomjodler3@lemmy.world 9 points 4 months ago (1 children)

And had you ever tested the restore process?

[–] elucubra@sopuli.xyz 1 points 4 months ago

In a big iron shop? Everything gets tested, dry runs, etc., but shit happens, hence the backups.
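
For reference, the layout elucubra describes is roughly the classic 3-2-1 rule (at least three copies, on two different media, with one offsite). A minimal sketch of a check for it, with made-up media types and field names:

```python
def satisfies_3_2_1(copies: list[dict]) -> bool:
    # 3-2-1 rule: at least 3 copies, on at least 2 different media,
    # with at least 1 of them offsite.
    media = {c["medium"] for c in copies}
    offsite = [c for c in copies if c["offsite"]]
    return len(copies) >= 3 and len(media) >= 2 and len(offsite) >= 1


# Roughly the layout described above (media types are assumptions):
copies = [
    {"medium": "tape", "offsite": False},  # set kept on hand
    {"medium": "disk", "offsite": False},  # set in the fire/waterproof safe
    {"medium": "tape", "offsite": True},   # set stored offsite
]
print(satisfies_3_2_1(copies))  # True
```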

[–] Hootz@lemmy.ca 3 points 4 months ago (2 children)

Everything is tied to the subscription: they deleted the sub, and that automatically deleted all the backups.

[–] linearchaos@lemmy.world 3 points 4 months ago

That sounds like a pretty trashy backup scheme. I don't care what your subscription status is; I'm keeping those backups until retention's over.

[–] capital@lemmy.world 2 points 4 months ago

Very stupid.

AWS has a holding period after account deletion where nothing is actually deleted, just made inaccessible, and access can be regained without data loss.

Since first hearing about this, I've been wondering how TF Google Cloud doesn't have a similar SOP.
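
A minimal sketch of the kind of holding-period SOP capital describes, combined with the retention point linearchaos makes above. The 30- and 90-day windows and all names are assumptions, not AWS's or Google Cloud's actual policy.

```python
from datetime import datetime, timedelta, timezone

HOLDING_PERIOD = timedelta(days=30)    # assumed lock-but-don't-purge window after cancellation
BACKUP_RETENTION = timedelta(days=90)  # assumed retention that outlives the subscription


def can_purge(cancelled_at: datetime, last_backup_at: datetime, now: datetime) -> bool:
    # Purge customer data only after BOTH the post-cancellation holding period
    # and the backup retention window have expired; cancelling the subscription
    # alone never triggers an immediate delete.
    holding_over = now - cancelled_at >= HOLDING_PERIOD
    retention_over = now - last_backup_at >= BACKUP_RETENTION
    return holding_over and retention_over


# During the holding period the account is merely locked, not erased:
now = datetime.now(timezone.utc)
print(can_purge(now - timedelta(days=5), now - timedelta(days=5), now))  # False
```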