this post was submitted on 09 Jul 2023
17 points (90.5% liked)

Self Hosted - Self-hosting your services.

A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.


What the title says. Are there any good ChatGPT alts that can be self hosted?

top 9 comments
[–] zephyrvs@lemmy.ml 5 points 1 year ago

Llama.cpp + Wizard Vicuna (the Uncensored variant, if you want the real thing) + one of the compatible web interfaces; they should be listed in the readme.

Or try GPT4All, which is much easier to use and even offers a selection of downloadable models.

Whether you can run 7B, 13B, or 30B+ models depends on your hardware, especially your GPU.
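
If you'd rather script this than use a web interface, here's a minimal sketch using the llama-cpp-python bindings (my assumption, since the comment only mentions llama.cpp itself; the model filename and layer count are placeholders for whatever quantization you actually download):

```python
# Minimal sketch: run a local Wizard Vicuna model through the
# llama-cpp-python bindings. Model path and n_gpu_layers are placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/wizard-vicuna-13b.Q4_K_M.gguf",  # placeholder filename
    n_ctx=2048,       # context window size
    n_gpu_layers=20,  # offload some layers to the GPU if you have one; 0 = CPU only
)

out = llm(
    "Q: What are good self-hosted ChatGPT alternatives? A:",
    max_tokens=128,
    stop=["Q:"],
)
print(out["choices"][0]["text"])
```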

[–] frap129@lemmy.maples.dev 4 points 1 year ago

I use koboldcpp with the Vicuna model. Generation is reasonably fast (<1 minute) on a 4th-gen i7, and it would probably be on par with ChatGPT in terms of speed if you used a GPU.
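
For anyone who wants to query a running koboldcpp instance from a script rather than its UI, a rough sketch against its KoboldAI-compatible HTTP API (the default localhost:5001 port is an assumption; adjust to however you launched it):

```python
# Rough sketch: call a local koboldcpp server over its KoboldAI-compatible
# API. Port 5001 is an assumption based on the default setup.
import requests

payload = {
    "prompt": "Explain self-hosting in one sentence:",
    "max_length": 80,    # number of tokens to generate
    "temperature": 0.7,
}
resp = requests.post("http://localhost:5001/api/v1/generate", json=payload)
resp.raise_for_status()
print(resp.json()["results"][0]["text"])
```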

[–] Weirdbeardgame@lemmy.ml 4 points 1 year ago

Serge. I've heard good things about this one as well.

[–] CanOpener@sh.itjust.works 3 points 1 year ago

I've tried https://github.com/oobabooga/text-generation-webui with LLaMA, though I didn't have enough VRAM to run it.

[–] mrmojo@beehaw.org 3 points 1 year ago* (last edited 1 year ago)

Hi, I think LocalAI is a good place to start.
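
LocalAI exposes an OpenAI-compatible REST API, so anything that already speaks the OpenAI chat-completions format can point at it. A quick sketch (the port and model name are assumptions; match them to your own deployment):

```python
# Quick sketch: talk to a local LocalAI instance through its
# OpenAI-compatible endpoint. Port 8080 and the model name are
# assumptions -- adjust them to your own setup.
import requests

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "ggml-gpt4all-j",  # placeholder: whatever model you've configured
        "messages": [
            {"role": "user", "content": "Give me three self-hosted ChatGPT alternatives."}
        ],
        "temperature": 0.7,
    },
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```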

[–] HumanPerson@sh.itjust.works 1 points 1 year ago

I believe GPT4All has a self-hostable web interface, but I could be wrong. Still, it can run on relatively low-end hardware (relatively, because it still needs a decent amount), and you could just use it on your local computer.
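
Even without a web interface, recent versions of the gpt4all Python bindings are enough to run it locally from a script. A small sketch (the model filename is a placeholder; the library downloads it on first use if it's missing):

```python
# Small sketch using the gpt4all Python bindings. The model name is a
# placeholder -- the library fetches it on first run if not present.
from gpt4all import GPT4All

model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")  # placeholder model name
with model.chat_session():
    reply = model.generate("What can I self-host instead of ChatGPT?", max_tokens=120)
print(reply)
```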

[–] ddtfrog@lemm.ee 1 points 1 year ago

Codestar for dev
