this post was submitted on 12 Jun 2023
258 points (100.0% liked)

Selfhosted


A simple question for this community: what are you self-hosting? It would be fun to hear from each other what services we're all running.

Please mention at least the service (e.g. e-mail) and the software (e.g. Postfix). Extra bonus points for also mentioning the OS and/or hardware (e.g. Linux distribution, Raspberry Pi, etc.) you are running it on.

[–] behohippy@lemmy.world 2 points 1 year ago (4 children)

Stable Diffusion (Stability AI version), text-generation-webui (WizardLM), a text embedder service with Spacy, Bert and a bunch of sentence-transformer models, PiHole, Octoprint, Elasticsearch/Kibana for my IoT stuff, Jellyfin, Sonarr, FTB Minecraft (customized pack), a few personal apps I wrote myself (todo lists), SMB file shares, qBittorrent and Transmission (one dedicated to Sonarr)... Probably a ton of other stuff I'm forgetting.

[–] Kaerey@lemmy.world 2 points 1 year ago (2 children)

Do you have a GPU in there for Stable Diffusion? If not, how is it working? I'm debating moving to a machine that I can't guarantee my spare GPU will fit in.

[–] behohippy@lemmy.world 1 point 1 year ago

Yep, I'm using an RTX 2070 for that right now. The LLMs are just executing on the CPU.
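
As a rough sketch of that split (assuming a PyTorch-based setup such as diffusers; the helper name here is made up, not from the thread), the usual device-fallback logic looks like:

```python
def pick_device() -> str:
    """Return "cuda" when a usable NVIDIA GPU is visible, else "cpu".

    The import is guarded so the check degrades gracefully on boxes
    without torch installed at all.
    """
    try:
        import torch
        if torch.cuda.is_available():
            return "cuda"
    except ImportError:
        pass
    return "cpu"

device = pick_device()
print(f"running on {device}")
```

With diffusers you would then move the pipeline with `pipe.to(device)`; CPU-only LLM inference is typically done through quantized runtimes (e.g. llama.cpp-style backends) rather than full-precision PyTorch.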

[–] jerrimu@lemmy.world 1 point 1 year ago (1 child)

Without a GPU, it's pretty horrible.

[–] Kaerey@lemmy.world 1 point 1 year ago

That's what I was afraid of. I was gifted a Dell VRTX blade chassis and I'm trying to figure out how to shove my spare 2080 Super in there and get it powered.
