this post was submitted on 24 Jan 2024
Permacomputing
Well yes, but there are >8 billion humans out there, and a single human's genome alone would be of limited use. So it's more like 8 exabytes. Quite a lot.
Of course, there is a heap of research on efficient lossless and lossy compressed representations of pangenomes, so my blind guess would be that we could probably losslessly store all human genomes in a petabyte or so.
I think it might be even less than that, considering that two humans have 99.9% the same genome.
So the differences are maybe 1 MB per person.
That would make for 8000 TB.
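The back-of-envelope numbers above can be sanity-checked in a few lines. The genome size and bit packing are my assumptions, not from the thread: roughly 3.2 billion base pairs per genome, 2 bits per base.

```python
# Sanity check of the storage estimates above.
# Assumptions: ~3.2e9 base pairs per genome, 2 bits per base,
# ~1 MB of differences per person (the estimate from the thread).

population = 8_000_000_000           # > 8 billion humans
bases_per_genome = 3_200_000_000     # approximate human genome length
raw_bytes = bases_per_genome // 4    # 2 bits per base -> 4 bases per byte

total_raw = population * raw_bytes
print(f"raw, uncompressed: {total_raw / 1e18:.1f} EB")   # ~6.4 EB

diff_bytes = 1_000_000               # ~1 MB of differences per person
total_diff = population * diff_bytes
print(f"diff-only: {total_diff / 1e15:.0f} PB")          # 8 PB = 8000 TB
```

So the "8 exabytes" figure is the right order of magnitude for raw storage, and storing only per-person differences lands at the 8000 TB mentioned above.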
And then there's compression. Obviously, there's going to be a lot of redundancy if you take the genomes of all humans, since they are more or less just passed on, with slight modifications.
So it's definitely going to be 80 TB or less.
Right, peta is two steps above giga. Then I'll go with one terabyte. Well, that would be roughly 125 bytes per genome. Hmm, that is a bit little. Maybe the 80 TB estimate is quite good; that would be about 10 KB per genome.
You could probably build a phylogenetic tree by some heuristic, and then the differences on the edges are very small.
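A toy sketch of that tree idea, with hypothetical `diff`/`apply_diff` helpers standing in for real variant-calling tools: each genome is stored as its list of edits against its parent in the tree, and reconstructed by replaying them.

```python
# Tree-based delta encoding, toy version: genomes as edit lists
# against a parent sequence. Real pipelines would diff VCF-style
# variant calls, not raw same-length strings.

def diff(parent: str, child: str) -> list[tuple[int, str]]:
    """Positions where child differs from parent (same-length toy case)."""
    return [(i, c) for i, (p, c) in enumerate(zip(parent, child)) if p != c]

def apply_diff(parent: str, edits: list[tuple[int, str]]) -> str:
    """Reconstruct the child by replaying the edits on the parent."""
    seq = list(parent)
    for i, base in edits:
        seq[i] = base
    return "".join(seq)

root = "ACGTACGTAC"
child = "ACGTTCGTAA"            # differs at positions 4 and 9
edits = diff(root, child)
print(edits)                    # [(4, 'T'), (9, 'A')]
assert apply_diff(root, edits) == child
```

Since close relatives differ in very few positions, the edit lists on the tree edges stay tiny compared to the full sequences.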
Or, build an index of all variants, and then represent each genome as a compressed bitvector with a one for each variant it contains.
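A minimal sketch of that variant-index encoding, with a made-up five-entry catalogue; real pangenome tools use succinct bitvector structures rather than generic zlib, but the shape of the idea is the same:

```python
# Encode a genome as a bitvector over a shared variant catalogue,
# then compress the (highly repetitive) bitvector. Toy data throughout.
import zlib

# Hypothetical catalogue of all known variants (position, alt base).
variant_index = [(101, "A"), (205, "T"), (310, "G"), (452, "C"), (599, "T")]

def encode(genome_variants: set) -> bytes:
    """One bit per catalogued variant: 1 if this genome carries it."""
    bits = [1 if v in genome_variants else 0 for v in variant_index]
    # Pack bits into bytes (MSB first), then compress.
    packed = bytes(
        sum(bit << (7 - j) for j, bit in enumerate(bits[i:i + 8]))
        for i in range(0, len(bits), 8)
    )
    return zlib.compress(packed)

genome = {(101, "A"), (452, "C")}
blob = encode(genome)
# Decoding reverses the steps: decompress, unpack bits, look up the index.
```

Each genome then costs one bit per catalogued variant before compression, and far less after, since most people share most variants.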
Well, now it seems that this would still be many variants, given how many single bases can differ. So maybe 80 TB is a bit too low.
Yeah, but nobody's gonna encode all of humanity's genomes at once. It's like adding up the storage of all users' game save data combined. It doesn't make sense.
Normally, you look at the storage space for one individual at a time.
There is an entire research field about looking at sets of genomes. It's called pangenomics. I think there are hundreds of thousands of human genomes of available data right now; ten thousand from a few years ago I know for sure.
Considering multiple genomes is one of the keys to understanding the effects of the different genomic variants on the individual. One can for example look at various chronic diseases and see if there is anything special about the genomes of the sick individuals compared to the healthy ones.
This requires a lot of samples, because just comparing one sick and one healthy individual will bring up a lot of false-positive variants that differ between the individuals but are not related to the disease.
thanks, I hadn't thought of that.