There was a study on the energy efficiency of programming languages, and Rust was in the top few. Bugs aside, Lemmy, which is written in Rust, has the potential to be among the most efficient ways to solve the problem. I'd think the total lack of ads and algorithmic smarts would also help efficiency.
Solarpunk
The space to discuss Solarpunk itself and Solarpunk related stuff that doesn't fit elsewhere.
Join our chat: Movim or XMPP client.
I was recently at a conference for AWS (Amazon Web Services, AKA the cloud provider for a HUGE chunk of the internet), and part of the keynote claimed that it was greener to run in the cloud because... uh... well, they didn't exactly say. Don't get me wrong, I could see how it would be easier to make all AWS data centers compliant with using green energy than it would be to convince every random financial institution that their on-premises servers need to be green, but quite frankly it's Amazon and I don't trust that they're telling the truth about themselves and not just greenwashing.
Quite frankly, for things like Lemmy instances, I think we could achieve a totally solar-powered setup easily... but not easily at scale, or reliably.
I've thought about how cool it would be to have a server room linked up with a solar array and batteries, and basically only have the servers up when there's enough energy to power them. In theory, it sounds fun to have a static splash page shown when the servers are down that explains why they're down, as a way to make people think about how energy-expensive servers are. In practice, an intentionally flaky server sounds like a nightmare for a ton of reasons. But it sounds like this is already a Thing with Low-tech Magazine, which is neat!
But that's not to say we couldn't build and self-host a reliable and sustainable server room. Just that I don't know the numbers on what a server room actually pulls energy-wise, or how much generation capacity we'd need.
What is often omitted is that large centralized data centers need a lot of cooling. Efficiency gains have reduced this somewhat lately, but cooling used to account for up to 60% of the total electricity used.
Smaller decentralized servers don't need nearly as much of it, since they can easily dissipate heat into their cooler surroundings even if they use older, less efficient equipment.
Thus, up-cycling older server hardware in decentralized locations can save a lot of energy if you consider the entire life cycle of the equipment.
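To put that cooling share into perspective, here's a rough sketch of the arithmetic using PUE (Power Usage Effectiveness: total facility energy divided by IT equipment energy). The 60% and 10% overhead shares are illustrative, loosely based on the figures above, not measurements of any particular facility:

```python
def pue_from_overhead_share(overhead_share: float) -> float:
    """PUE when `overhead_share` of total electricity goes to cooling
    and other non-IT overhead, and the rest reaches the servers."""
    return 1.0 / (1.0 - overhead_share)

# An older facility where overhead was 60% of the bill:
old_dc = pue_from_overhead_share(0.60)     # 2.5 -> 1.5 W wasted per watt of compute
# A modern, well-optimized one at ~10% overhead:
modern_dc = pue_from_overhead_share(0.10)  # ~1.11
print(old_dc, round(modern_dc, 2))
```

A small server in a basement sheds its heat for free (or even usefully, in winter), which is one way the up-cycling argument can come out ahead despite less efficient chips.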
I agree with this. Efficiency gains versus cooling the infrastructure and replacing hardware after a maximum of five years. Still, I'm not 100% sure about the statistics. Do you know of any comparative studies or the like?
Just one fitting side note: we had an interview with a local data centre manager, and during the discussion we somehow started talking about alternative setups, like a Raspberry Pi server. The interviewee reminded us of the efficiency of their virtual servers. He even gave us a tour through their digital dashboards and showcased the 1 watt used by a virtual server (vs. roughly 4 watts for a Pi, with much less performance).
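Taking those dashboard numbers at face value (1 W for the virtual server, roughly 4 W for a Pi, both assumed to run continuously), the back-of-the-envelope annual energy is tiny either way:

```python
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_kwh(watts: float) -> float:
    """Energy in kWh for a device drawing `watts` continuously for one year."""
    return watts * HOURS_PER_YEAR / 1000

vm_kwh = annual_kwh(1)  # the 1 W virtual server from the tour: ~8.8 kWh/year
pi_kwh = annual_kwh(4)  # a Raspberry Pi at roughly 4 W: ~35 kWh/year
print(round(vm_kwh, 1), round(pi_kwh, 1))
```

Either figure is on the order of a few euros of electricity per year, which supports the point that the raw wattage of small servers is not where the big savings are.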
This is not to say that low-tech is not the way to go. Less mining and hazardous work conditions are always good and need no measurement for emphasis.
Also omitted - the amount of speculative buying for planned capacity that never actually happens. I worked for one of the big tech companies for several years, and specialized hardware in particular (ML accelerators) was spun up with the notion of "we don't know who will need these, but we don't want to not have them if they're needed". Cue massive amounts of expensive hardware sitting plugged in and idle for months as dev teams scramble to adapt their stuff to new hardware that has just enough difference in behavior and requirements to make it hard to migrate over.
Also also, there's a bunch of "when in doubt, throw it out" - automated systems detecting hardware failure that automatically decommission it after a couple strikes. False positive signals were common, so a lot gets thrown out despite being perfectly fine.
There are actually quite a few projects to use waste heat from data centers to heat homes. With aquifer storage, that can even be done in a seasonal fashion.
Did I read that you host this on renewable energy?
Not fully. Well, I am pretty sure most of the power comes from the nearby geothermal plant and the three windmills up the hill, but overall the grid power here is still around 50% fossil.
I did start building a solar PV system for it, but ran into some issues with the batteries that I am still trying to solve without having to buy expensive new ones.
Ok, so I'm not fully crazy, haha. I did read something; I just didn't fully remember it.
@okasen @stefanlaser you are right to be skeptical about AWS https://www.fastcompany.com/90879223/amazon-claims-to-champion-clean-energy-so-why-did-it-just-help-kill-an-emissions-bill-in-oregon.
FWIW, I think you are pointing to a larger problem-- like, it's not a coincidence that the harder the sales pitch of the cloud, the more obscure such numbers become.
To take a car analogy--there's a reason most Americans have an intuition for what miles per gallon *feels* like, but wouldn't know where to start with the equivalent for EVs.
While I was looking for an instance I saw one that said it was run on 99% renewables! So unless they’re lying I’d imagine it’s possible.
Nearly every data center in Europe claims that. They use the same electricity as everyone else, but have a contract with a utility company that tracks the amount of renewable energy fed into the grid (or bought on the energy market) on their behalf, so that on average those claims are technically true.
But of course with the grid being mostly nonrenewable this means little.
Iceland has a clean grid and Norway is very close to 99%. If you allow for nuclear Sweden also comes close, as does Switzerland.
Can you share the instance? I've also come across a few Mastodon instances that do the same.
Then I wonder how the material remains of hardware (the e-waste) are addressed by providers. Or whether virtual servers would be a better solution anyway.
turns out it’s one of the more popular ones! though, I’ve done a quick search and I can’t see where it’s sourced.
How do virtual servers help?
Perhaps for medium- to large-size setups, for example bundling multiple fediverse instances in one cooperative data centre. Virtualization allows for efficiently allocating resources where they are needed most.
*Edit: Turns out I almost joined that particular instance. Awesome name, too. But Canada is quite far away from my home. And home-y it shall be.
Talking #lemmy, a fediverse alternative to #Reddit. It works. During the past years, Reddit turned into my quasi-search engine. #enshittification Thus, it's cool to see non-commercial alternatives.
Above is a post of mine, shared via the #solarpunk instance https://slrpnk.net/
👉 @stefanlaser@slrpnk.net
Half a year ago, we wrote a piece about the environmental footprint of Mastodon hosting. The same questions can be raised with Lemmy and #kbin in mind.
#climate #e-waste #sustainability @ecologies
I dunno about Lemmy, but there are some lightweight Mastodon-compatible servers under development on GitHub.
Also, I saw some tests from a Mastodon admin that showed a huge decrease in server power use simply from lowering the retention time for cached media below the default. I forget the numbers, but I remember being shocked by them.
I also plan to experiment with this (but, well, time and stuff). Do you have any clues on how to find those projects? There's probably a lot of network traffic that could be saved through tweaking.
All in all, these things are not energy-hungry. It probably costs more energy to display these pages on a big screen than to do all the data processing required. When it comes to energy efficiency, huge "energy-hungry" data centers are usually more efficient: they achieve economies of scale, and for them a Wh saved is money saved, so they are usually well designed in that respect and keep getting more efficient.
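A quick sanity check on the "big screen vs. data processing" claim. All numbers here are assumed for illustration (a 30 W display, five minutes of reading, a small 20 W server share handling 1000 requests per hour), not measurements:

```python
# Energy to *view* a page: 30 W display for 5 minutes of reading
viewer_wh = 30 * 5 / 60            # 2.5 Wh

# Server-side energy per request: 20 W VPS share spread over 1000 req/hour
server_wh_per_request = 20 / 1000  # 0.02 Wh

# Under these assumptions the display dominates by a wide margin
print(viewer_wh / server_wh_per_request)
```

The exact ratio depends heavily on the assumed traffic and hardware, but it takes fairly extreme numbers to make the server side dominate for a lightweight text-based service.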
I have resources like Low-Tech Magazine in mind, which uses solar power to host a website.
I have a friend who is really serious about energy savings and about having a sustainable lifestyle. And who does the maths and his homework. His advice was that IT is probably the last of your concerns. Insulate. Insulate your water-heater, insulate your house. Find your main source of wasted energy. It probably won't be your webserver and by several orders of magnitude.
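A rough sketch of why the homework comes out that way, with assumed figures (a 10 W self-hosted server running 24/7 versus a household heating demand of ~10,000 kWh/year; both numbers are illustrative, not from the friend's actual maths):

```python
def annual_kwh(watts: float) -> float:
    """kWh for a constant electrical draw sustained over one year."""
    return watts * 24 * 365 / 1000

server_kwh = annual_kwh(10)  # small self-hosted server: ~88 kWh/year
heating_kwh = 10_000         # assumed household heating demand, kWh/year

# Heating dwarfs the server by roughly two orders of magnitude
print(round(heating_kwh / server_kwh))
```

Which is the friend's point: fixing insulation moves numbers a hundred times bigger than anything you can do to a webserver.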
I agree with most of this. And our little Lemmy servers will certainly not count. We definitely should not care about individual consumers, or rather, it should not be about blaming people. It's more about experiments and learning. And fun.
However, what I would like to do is complicate the data centre narrative. Yes, data centres are superbly efficient. But this is a relative measure. Companies demand exponentially more computing and storage power; more capacity to process data for 'intelligent' applications and to serve ads.
Ergo, the landlords of the internet build massive new data centres that do indeed need a considerable amount of electricity, water, and all the new, resource-heavy high-tech chips we're reading about in the news. Corporate social media platforms are part of this, too. Around 2 per cent of current global electricity demand comes from data centres, and scholars agree that this share is growing. But, yeah. This is an interesting field of research, because it's quite difficult when it comes to the concrete numbers.
So this post here is a typical "let's improve our society somewhat" contribution.