this post was submitted on 20 Jan 2025
47 points (88.5% liked)

Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ


Of course, I'm not in favor of this "AI slop" that we're getting this century (although I admit it has some legitimate uses, but greed always speaks louder). Still, I wonder whether AI will suffer some kind of piracy, whether it already is, or whether people simply aren't interested in "pirated AI".

[–] OminousOrange@lemmy.ca 6 points 2 days ago* (last edited 2 days ago) (1 children)

There are quite a few options for running your own LLM. Ollama makes it fairly easy to run one (with a big selection of models; Hugging Face hosts even more to suit various use cases), and OpenWebUI makes it easy to operate.

Some self-hosting experience doesn't hurt, but it's pretty straightforward to configure if you follow along with Networkchuck in this video.
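For anyone curious what "running your own" looks like once Ollama is installed, here's a minimal Python sketch against Ollama's local REST API (it listens on port 11434 by default). The model name and prompt are just examples, and it assumes you've already started the server and pulled a model:

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    # stream=False asks for a single JSON reply instead of a token stream
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return its reply."""
    body = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires `ollama serve` running and the model pulled,
    # e.g. `ollama pull llama3.2` (example model name)
    print(ask("llama3.2", "Explain self-hosting in one sentence."))
```

OpenWebUI talks to the same local server, so the chat UI and scripts like this can share the models you've pulled.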

[–] can@sh.itjust.works 1 points 2 days ago (1 children)

Any that are easier to set up on a phone? I tried something before but had trouble despite having enough RAM.

[–] OminousOrange@lemmy.ca 3 points 2 days ago

Not that I'm familiar with. I would guess that the limited processing power of a phone would bring a pretty poor experience though.