this post was submitted on 19 Jul 2023
16 points (90.0% liked)

Stable Diffusion

Do you keep your prompts simple? Just long enough? Go wild with them? How about embeddings, do you use those too?

The more I learn about this, the less I understand it. Outside of some basic enhancers (masterpiece, best quality, worst quality, and bad anatomy/hands etc. if I'm generating a human), I don't see any big improvements. Every combination gives a different result; some look better, some look worse depending on the seed, sampler, etc. It's basically a matter of taste. Note that I only do illustrations/paintings, so the differences might not be as noticeable. Do you keep tweaking your prompts, or just settle on the prompts you've been using?
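For anyone who wants to test this systematically: below is a minimal sketch using Hugging Face's diffusers library that holds the prompt constant and varies only the seed, with an optional sampler swap. The checkpoint ID, prompt text, and seed values are placeholder assumptions, not anything specific to this thread.

```python
# A minimal sketch (assuming the diffusers library and the public
# runwayml/stable-diffusion-v1-5 checkpoint) for comparing how the
# same prompt behaves across seeds and samplers.
import torch
from diffusers import StableDiffusionPipeline, EulerAncestralDiscreteScheduler

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Swap in a different sampler (scheduler) to compare its effect.
pipe.scheduler = EulerAncestralDiscreteScheduler.from_config(pipe.scheduler.config)

prompt = "masterpiece, best quality, oil painting of a lighthouse at dusk"
negative = "worst quality, bad anatomy, bad hands"

# Fixing the seed makes each run reproducible; changing only the seed
# shows how much variation comes from noise rather than prompt wording.
for seed in (1, 2, 3):
    generator = torch.Generator("cuda").manual_seed(seed)
    image = pipe(prompt, negative_prompt=negative, generator=generator).images[0]
    image.save(f"lighthouse_seed{seed}.png")
```

Running this with a couple of different schedulers makes it easier to see how much of the difference is down to sampling noise versus the enhancer keywords themselves.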

[–] DrakeRichards@lemmy.world 4 points 1 year ago (5 children)

I don’t bother with prompt enhancers anymore. Stable Diffusion isn’t Midjourney; generating in quantity matters far more than polishing a single prompt. I just prompt for what I want and add negative prompts for things that show up that I don’t want. I’ll use textual inversions like badhandv4 if the details look really bad. If the model isn’t understanding the prompt at all, then I’ll use ControlNet.
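For readers on diffusers rather than a web UI, the workflow described above might look roughly like the sketch below. This is written under assumptions: the checkpoint ID is a common public one, and the badhandv4 embedding has to be downloaded separately (the file path here is a placeholder).

```python
# A rough sketch of the workflow above: plain prompt, negative prompts
# for unwanted features, and a textual inversion for bad hands.
# Assumes diffusers plus a locally downloaded badhandv4 embedding file.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Load the negative embedding and give it a token to reference in prompts.
# The path is a placeholder; badhandv4 is distributed separately.
pipe.load_textual_inversion("./badhandv4.pt", token="badhandv4")

image = pipe(
    prompt="portrait of a violinist, oil painting",  # just what you want
    negative_prompt="badhandv4, extra fingers",      # just what shows up unwanted
).images[0]
image.save("violinist.png")
```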

[–] akai@kbin.social 1 points 1 year ago (1 child)

Do you have any good beginner's guides for ControlNet?
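Not a substitute for a proper guide, but as a bare-bones starting point, a first ControlNet run in diffusers with the Canny edge model can look like the sketch below. The model IDs, reference image path, and prompt are illustrative assumptions.

```python
# A bare-bones ControlNet starting point using diffusers with the
# Canny edge model; model IDs and the input image are assumptions.
import cv2
import numpy as np
import torch
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline
from diffusers.utils import load_image
from PIL import Image

# Turn a reference photo into an edge map that will constrain composition.
source = load_image("reference.jpg")  # placeholder: any photo whose layout you want
edges = cv2.Canny(np.array(source), 100, 200)
edges = Image.fromarray(np.stack([edges] * 3, axis=-1))

controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-canny", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")

# The edge map steers layout while the prompt controls style and content.
image = pipe("watercolor painting of a cottage", image=edges).images[0]
image.save("cottage_controlnet.png")
```

The idea is that the edge map pins down the composition while the text prompt stays responsible for style and content, which is why it helps when the model "isn't understanding" a layout from text alone.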
