this post was submitted on 30 Sep 2022

Self-hosting


Hosting your own services. Preferably at home and on low-power or shared hardware.

At my day job, four of these nice blue sticks are sitting on my shelf gathering dust. I have the option of putting them online via a Raspberry Pi.

So if anyone has an interesting idea for putting them to work for a good cause, talk to me.

poVoq:

Hmm, I'm not sure how much can be done with these. Most ML software you can self-host requires CUDA or OpenCL, i.e. a GPU.

I am planning to set up a LibreTranslate instance on an old CUDA-enabled gaming laptop turned server soon:

https://github.com/LibreTranslate/LibreTranslate
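
A minimal sketch of talking to such an instance once it is running, assuming the default port 5000 and no API key configured; the base URL is a placeholder for wherever the server actually lives.

```python
import requests

# Assumed location of the self-hosted LibreTranslate instance (default port 5000).
BASE_URL = "http://localhost:5000"

# List the languages the instance has translation models for.
languages = requests.get(f"{BASE_URL}/languages", timeout=10).json()
for lang in languages:
    print(lang["code"], lang["name"])
```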

It would be cool to have an auto-translate button on Lemmy posts backed by the LibreTranslate API, like the one that exists for Discourse forums.
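
For illustration, here is a rough sketch of the request such a button would send to the LibreTranslate /translate endpoint; the instance URL, helper name, and example text are placeholders, and an api_key field would be needed if the instance requires one.

```python
import requests

# Assumed location of the self-hosted LibreTranslate instance.
BASE_URL = "http://localhost:5000"

def translate(text: str, target: str, source: str = "auto") -> str:
    """Translate text via the LibreTranslate /translate endpoint."""
    resp = requests.post(
        f"{BASE_URL}/translate",
        json={"q": text, "source": source, "target": target, "format": "text"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["translatedText"]

# Example: auto-detect the source language and translate into English.
print(translate("Selbst gehostete Dienste sind toll.", target="en"))
```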