this post was submitted on 26 Aug 2023
16 points (76.7% liked)

BecomeMe

Social Experiment. Become Me. What I see, you see.

top 12 comments
[–] Xanthobilly@lemmy.world 10 points 1 year ago

Marketplace? You can make this all yourself with a $200 graphics card and a little googling. Why would you pay for what you could do yourself?

[–] lvxferre@lemmy.ml 9 points 1 year ago (1 children)

That's gotta be one of the shittiest renderings of the Blood Devil that I've seen. And I'm not even talking about the 12 toes.

On-topic: I think that the impact of image-generating models will be smaller than the author predicts. Eventually people will become desensitised to them, saying "ah, this is probably AI-generated". Having someone generate a picture of you through an AI model will feel less and less disturbing.

[–] sbv@sh.itjust.works 14 points 1 year ago (3 children)

Eventually people will become desensitised to them, saying "ah, this is probably AI-generated".

I suspect it'll be more along the lines of "any image that challenges my world view is AI generated, but any image that confirms my biases is undoubtedly real."

[–] bitsplease@lemmy.ml 5 points 1 year ago (1 children)

This guy understands human nature.

[–] hughperman@sh.itjust.works 3 points 1 year ago (1 children)

That's not a guy, that's a bot

[–] bitsplease@lemmy.ml 3 points 1 year ago

Nuh uh, I agree with him, so he's a person

[–] lvxferre@lemmy.ml 4 points 1 year ago (1 children)

I mean realistic porn based on RL people. I predict that, in the future, if you see some potential nude of an acquaintance or relative you'll immediately think "ah, this is likely AI-generated" and ignore it without a second thought. Whereas now you'd probably think that it's real, you know?

[–] sbv@sh.itjust.works 4 points 1 year ago (1 children)

I predict that, in the future, if you see some potential nude of an acquaintance or relative you'll immediately think "ah, this is likely AI-generated" and ignore it

Hopefully.

I think the bigger issue is how the subject/victim feels. If they see a compelling video of themselves doing something nasty, and the site tells them it's been seen 7,536,865 times, are they going to shrug it off or feel weird? Now what if it shows them with someone they don't like?

I hope it's the former, but people get into their own heads. I suspect there will be a feeling of violation and discomfort that goes along with it.

[–] lvxferre@lemmy.ml 3 points 1 year ago

I hope so, too, but sadly only time will tell.

[–] Saledovil@sh.itjust.works 2 points 1 year ago (1 children)

Manual image manipulation has existed for quite a while. Ten years ago you could also just dismiss anything that didn't conform to your biases as photoshopped.

[–] DarkThoughts@kbin.social 1 points 1 year ago

Yeah. Face swapping is so easy that it makes fake nudes absolutely trivial for even an amateur to create.
https://www.photopea.com/tuts/swap-faces-online/

[–] ebenixo@sh.itjust.works 5 points 1 year ago

Maybe the epidemic of selfies posted online will finally stop now