this post was submitted on 28 Feb 2024

Stable Diffusion

[–] poVoq 6 points 8 months ago (1 children)

This is odd reporting: Stable Diffusion XL AFAIK already runs on a GPU with 8GB of RAM, and it usually doesn't need that much time to generate an image either (depends on the GPU).

[–] Even_Adder@lemmy.dbzer0.com 5 points 8 months ago (1 children)

I think they got their numbers wrong. It says they shrank it down to 700 million parameters, which would make it smaller than SD 1.5, so it should take way less than 8GB of RAM.
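For rough scale, the weight-only footprint is just parameter count times bytes per parameter (2 bytes at fp16). A minimal back-of-the-envelope sketch, assuming fp16 weights and ignoring activations, the text encoder, and the VAE (the ~860M figure for SD 1.5's UNet is a commonly cited estimate, not something from the article):

```python
def weight_footprint_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Weight-only memory footprint in GB (fp16 = 2 bytes per parameter)."""
    return num_params * bytes_per_param / 1e9

# 700M-parameter distilled model: weights alone are roughly 1.4 GB
print(f"{weight_footprint_gb(700e6):.1f} GB")  # 1.4 GB
# SD 1.5's UNet (~860M params, commonly cited): roughly 1.7 GB
print(f"{weight_footprint_gb(860e6):.1f} GB")  # 1.7 GB
```

Actual memory use during inference is higher once activations and the other pipeline components are loaded, so the weights-only number is a floor, not a prediction of the 8GB figure.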

[–] Stampela@startrek.website 3 points 8 months ago

I’m guessing there’s a mix. The smallest version is 700 million parameters, possibly the one used to generate the reported timing data, but the largest (or not?) still runs in 8GB. If I remember correctly, SD3 is supposed to come in multiple versions, starting at 800 million parameters and going up, so this is going to be interesting.

[–] RootBeerGuy@discuss.tchncs.de 5 points 8 months ago (3 children)

Is that feasible on a Raspberry Pi?

[–] Even_Adder@lemmy.dbzer0.com 3 points 8 months ago

Probably. FastSD CPU already runs on a Raspberry Pi 4.

[–] Scew@lemmy.world 3 points 8 months ago* (last edited 8 months ago) (2 children)

No, lol. Well, at least I'm not 100% familiar with the Pi's new offerings, but I don't know about their PCIe capabilities. Direct quote:

The tool can run on low-cost graphics processing units (GPUs) and needs roughly 8GB of RAM to process requests — versus larger models, which need high-end industrial GPUs.

Your question seems silly when I try to imagine hooking up my GPU, which is probably bigger than a Pi, to a Pi.

I've been running all the image generation models on a 2060 Super (8GB VRAM) up to this point, including SDXL, the model they "distilled" theirs from... Reading the article, I'm not really sure what exactly they think they're differentiating themselves from.

[–] grue@lemmy.world 3 points 8 months ago

Your question seems silly when I try to imagine hooking up my GPU, which is probably bigger than a Pi, to a Pi.

Jeff Geerling has entered the chat

[–] Even_Adder@lemmy.dbzer0.com 3 points 8 months ago

There are three models and the smallest one is 700M parameters.

[–] Wooki@lemmy.world 0 points 8 months ago* (last edited 8 months ago)

Lol, read the article: it cites "8GB VRAM", and if I had to guess, it will only support Nvidia out of the gate.