this post was submitted on 22 Aug 2023
Technology
you are viewing a single comment's thread
Exactly. In another thread on here recently, someone said something that basically boiled down to "your protest against AI isn't going to stop it; there's too much corporate power behind it, so you might as well embrace it," and I just cannot get my head around that mentality.
Also, you can absolutely see the models who were used as references in some of the images generated by apps these days. Like that popular one right now that everyone is using to make idealized images of themselves. A few of my family and friends used it recently, and you could clearly see in some of the pics the A-list celebs who were used as pose references, like Gal Gadot, Scarlett Johansson, etc. It's creepy as hell.
Creepy isn't illegal. Never has been.
I never said it was. But like the person I was replying to said: we need to take a good hard look at what the hell these tools are doing and allowing and decide as a society if we're going to tolerate it.
The real issue here is what things like deepfakes can do. It's already starting, and it's going to keep accelerating, generating mis- and disinformation: for private citizens, celebs, and politicians. While you might say "it's creepy, but there's nothing we can do about people deepfaking Nancy Pelosi's face onto their spank material," it's extremely problematic when someone decides to make a video where Joe Biden admits to running a CP ring, or some right-wing chud makes a video of Trump appearing to say something they all want to hear, and it leads to a civil war. Those are the real stakes here. How we react to what's happening with regular folk and celebs is just the canary in the coal mine.
It's the only sensible answer. Anything else would require an extreme violation of everybody's privacy and the implementation of total surveillance. See France's recent attempt at giving police full access to people's phones; that's the kind of thing you end up with when you go down that route.
This AI is out there today, can be run on any half-decent gaming PC, and can generate new images in about 30 seconds. And it will only get better going forward. Images are as malleable as text now; you can accept that, or keep tilting at windmills.
Of course they can, and most already do. But on the whole, that really doesn't have much of an effect; anybody can make their own site, and you don't even have to go deep into the dark web for that. It's the first link on Google when you search for it.