this post was submitted on 31 Jan 2024
255 points (93.5% liked)

News

23259 readers
3016 users here now

Welcome to the News community!

Rules:

1. Be civil


Attack the argument, not the person. No racism/sexism/bigotry. Good faith argumentation only; this includes not accusing another user of being a bot or paid actor. Trolling is uncivil and is grounds for removal and/or a community ban. Do not respond to rule-breaking content; report it and move on.


2. All posts should contain a source (url) that is as reliable and unbiased as possible, and must contain only one link.


Obvious right or left wing sources will be removed at the mods' discretion. We have an actively updated blocklist, which you can see here: https://lemmy.world/post/2246130. If you feel any website is missing, contact the mods. Supporting links can be added in comments or posted separately, but not in the post body.


3. No bots, spam or self-promotion.


Only approved bots, which follow the guidelines for bots set by the instance, are allowed.


4. Post titles should be the same as the article used as source.


Posts whose titles don't match the source won't be removed, but the autoMod will notify you; if your title misrepresents the original article, the post will be deleted. If the site changed its headline, the bot might still contact you. Just ignore it; we won't delete your post.


5. Only recent news is allowed.


Posts must be news from the most recent 30 days.


6. All posts must be news articles.


No opinion pieces, listicles, editorials, or celebrity gossip are allowed. All posts will be judged on a case-by-case basis.


7. No duplicate posts.


If a source you used was already posted by someone else, the autoMod will leave a message. Please remove your post if the autoMod is correct. If the matching post is very old, we refer you to rule 5.


8. Misinformation is prohibited.


Misinformation / propaganda is strictly prohibited. Any comment or post containing or linking to misinformation will be removed. If you feel that your post has been removed in error, credible sources must be provided.


9. No link shorteners.


The autoMod will contact you if a link shortener is detected; please delete your post if it is right.


10. Don't copy the entire article into your post body


For copyright reasons, you are not allowed to copy an entire article into your post body. This is an instance-wide rule that is strictly enforced in this community.

founded 1 year ago
MODERATORS
 

A bipartisan group of US senators introduced a bill Tuesday that would criminalize the spread of nonconsensual, sexualized images generated by artificial intelligence. The measure comes in direct response to the proliferation of pornographic AI-made images of Taylor Swift on X, formerly Twitter, in recent days.

The measure would allow victims depicted in nude or sexually explicit “digital forgeries” to seek a civil penalty against “individuals who produced or possessed the forgery with intent to distribute it” or anyone who received the material knowing it was not made with consent. Dick Durbin, the US Senate majority whip, and senators Lindsey Graham, Amy Klobuchar and Josh Hawley are behind the bill, known as the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or the “Defiance Act.”

Archive

[–] Merlin404@lemmy.world 119 points 9 months ago (1 children)

Tragic that it took a celebrity going through this for them to do something. But when children or others have it happen to them, they just shrug..

[–] FuglyDuck@lemmy.world 52 points 9 months ago (2 children)

a rich-as-fuck celebrity, at that.

[–] Viking_Hippie@lemmy.world 47 points 9 months ago (2 children)

Probably helps that she's super white too.

This has been happening to AOC constantly since before she was first sworn in and it's been crickets.

When it happens once to the media's favourite white billionaire, though? THAT'S when they start to take it seriously.

[–] Ledivin@lemmy.world 30 points 9 months ago (2 children)

To be clear, this has been happening to Swift for years. She's been very public about the problem and, IIRC, pays up to a million per year for a firm to get fakes taken down.

[–] HandBreadedTools@lemmy.world 8 points 9 months ago

"But my narrative!"

[–] Grandwolf319@sh.itjust.works 6 points 9 months ago (1 children)

Okay, then why is it getting attention now though?

[–] drmoose@lemmy.world 68 points 9 months ago* (last edited 9 months ago) (24 children)

What a weird populist law, tbh. There's already an established legal framework that covers this: defamation. Not a lawyer, but it seems like that should be applied instead of writing up some new memes.

They'll use this as an opportunity to sneak in more government spyware/control is my guess.

[–] quindraco@lemm.ee 14 points 9 months ago (1 children)

It's not defamation. And the new law will likely fail to hold up to 1A scrutiny, if the description of it is accurate (it often is not, for multiple reasons that include these bills generally changing over time). This is more of a free speech issue than photoshopping someone's head onto someone else's nude body, because no real person's head or body is involved, just an inhumanly good artist drawing a nude, and on top of that the law punishes possession, not just creation.

An example question any judge is going to have for the prosecutor if this goes to trial is how the image the law bans is meaningfully different from writing a lurid description of what someone looks like naked without actually knowing. Can you imagine going to jail because you have in your pocket a note someone else wrote and handed you that describes Trump as having a small penis? Or a drawn image of Trump naked? Because that's what's being pitched here.

[–] drmoose@lemmy.world 6 points 9 months ago (3 children)

It actually proposes "possession with the intention to distribute," which just shows what a meme law this is. How do you determine the intention to distribute an image?

And I disagree with your take that this can't be defamation. Quick googling says the general consensus is that this would fall into the defamation family of laws, which makes absolute sense, since a deepfake is an intentional misrepresentation.

[–] PP_BOY_@lemmy.world 52 points 9 months ago

The 1% only look out for the 1%, remember that.

[–] sphericth0r@kbin.social 26 points 9 months ago

I believe libel laws already exist, but when you're in Congress you must make laws in a reactionary way otherwise considered thought and reason might begin to permeate the law. We wouldn't want that.

[–] AlternatePersonMan@lemmy.world 25 points 9 months ago (1 children)

Why just nonconsensual sexual deep fakes? How about nonconsensual deep fakes of anything?

[–] FuglyDuck@lemmy.world 20 points 9 months ago (1 children)

there is a place for deep fakes in satire (albeit they should be labeled as such).

[–] AlternatePersonMan@lemmy.world 8 points 9 months ago (3 children)

I agree with the right to satire, but probably not as a deep fake. Comics, skits, etc., sure. Deep fakes are too convincing for an alarming number of folks.

[–] FuglyDuck@lemmy.world 22 points 9 months ago (2 children)

so how do you feel about skilled impersonators?

what if they're convincing? or are we going to allow just the shitty ones? or only if they offend the subject?

what you're proposing is a very slippery slope.

[–] MagicShel@programming.dev 5 points 9 months ago* (last edited 9 months ago) (4 children)

An alarming number of folks think the world is flat and the moon is made of cheese. We need a better standard than that.

[–] leaky_shower_thought@feddit.nl 19 points 9 months ago (1 children)

individuals who produced or possessed the forgery with intent to distribute it

this is going to be a wild ride.

there's a scenario where the creator is not the leaker, but angry people with pitchforks won't even care about the distinction.

[–] Grandwolf319@sh.itjust.works 9 points 9 months ago

Wanna get someone arrested? Send them an email of a deepfake.

[–] Zugyuk@lemmy.world 18 points 9 months ago (1 children)
[–] lolcatnip@reddthat.com 18 points 9 months ago

Alternate headline: Legislators Vow To Put Genie Back In Bottle

[–] doingthestuff@lemmy.world 17 points 9 months ago (7 children)

So they can still make them the old school way using Photoshop?

[–] Asafum@lemmy.world 8 points 9 months ago

This is exactly what has me irritated about this whole nonsense... People have been doing that since Photoshop existed, but big scary AI is in the news now so we are going to attack it full force because people are using it in the way they've used everything that has similar capabilities...

Still no action on our actual issues though, just some performative bullshit to assist the truly needy of our society, billionaires...

[–] theneverfox@pawb.social 15 points 9 months ago (1 children)

Hot take, but I feel like this is entirely the wrong direction to take. I feel like this will go badly in one of many ways if passed, and I feel like leaning into this would lead to a better world

Women, especially teachers, lose their jobs because their nudes leaked. This technology is in the wild; it can't be put back in the box. It can be done at home by a technically gifted teenager with a gaming computer. While all of this is certainly true, I don't think the common person will understand it until it's everywhere.

Yeah, I get that it must feel horribly violating, but imagine the world where we go the other direction - where nude pictures have no power, because anyone could have whipped them up.

Where the response to seeing them is anger or disgust, not fear

But my biggest concern is the fact that most technical people don't understand generative AI... There's no way in hell Congress grasps the concept. I'm scared to read the full wording of this bill

[–] Serinus@lemmy.world 11 points 9 months ago (9 children)

I don't get it. Why care? It's not her.

Maybe if they're making money off of her likeness. But without a money trail, it just seems like chasing ghosts for not much reason.

[–] gapbetweenus@feddit.de 24 points 9 months ago

If you're interested, you can search for interviews with people who have been deepfaked in a sexual way, where they explain how they feel and why they care.

[–] shiroininja@lemmy.world 20 points 9 months ago (6 children)

Because it's gross, and they do it to minors now. And all they need are pictures of your kids from your social media profile. They even use AI to undress them.

[–] MagicShel@programming.dev 17 points 9 months ago (5 children)

Generating sexual images of minors is already illegal. And since these images can be generated by anyone modestly technical on their own computer, you can't go after people for creating or possessing the images (except if they look too young), only for distribution.

This is unfortunately theater and will do basically nothing. How does a person even know if they are deep fakes? Or consensual? Hell, what's too close of a likeness? Some of those images didn't look that much like her, and at least one wasn't even realistic.

I'm not saying it's cool people are doing this, just that enforcement of this law is going to be a mess. You wind up with weird standards like how on Instagram you can show your labia but only through sheer material. Are deep fakes fine if you run them through an oil painting filter?

[–] fishos@lemmy.world 16 points 9 months ago (5 children)

And here we have the real answer: prudishness. "It's gross." And of course, "think of the children." You don't have a real answer; you have fear mongering.

[–] MagicShel@programming.dev 15 points 9 months ago (1 children)

I agree the issue is one of puritan attitudes toward sex and nudity. If no one gave a fuck about nude images, they wouldn't be humiliating, and if they weren't humiliating then the victim wouldn't really even be a victim.

However we live in the world we live in and people do find it embarrassing and humiliating to have nude images of themselves made public, even fakes, and I don't think it's right to tell them they can't feel that way.

They shouldn't ever have been made to feel their bodies are something to be embarrassed about, but they have been and it can't be undone with wishful thinking. Societal change must come first. But that complication aside, I agree with you completely.

[–] gapbetweenus@feddit.de 7 points 9 months ago (5 children)

Even without being puritan, there are just different levels of intimacy we are willing to share with different social circles - which might be different for everyone. It's fundamental to our happiness (in my opinion) to be able to decide for ourselves what we share with whom.

[–] gapbetweenus@feddit.de 7 points 9 months ago (3 children)

So you would not mind if I sent AI sex videos of you to your parents and friends? How about a video where you are sexually degraded playing in a public space? How would you feel about that? Maybe you performing sexual acts that you find gross yourself? You just need a bit of empathy to understand that not everyone is into exhibitionism and wants intimate things made public.

[–] Serinus@lemmy.world 10 points 9 months ago (5 children)

I'd really prefer that people not send my parents any kind of porn.

I look at it like someone took my face out of a Facebook picture, printed it, cut it out, pasted it over some porn, and did the same thing.

It'd be a weird thing for them to do, but I don't really need to send the law after them for it. Maybe for harassment?

Laws have a cost, even good intentioned laws. I don't believe we need new ones for this.

[–] Copernican@lemmy.world 9 points 9 months ago (2 children)

So what happens if a person allows their likeness to be 3d modeled and textured for something like a video game, and that 3d model is used to create explicit images. Is that not a problem (or maybe a different kind of problem) because it's not a deepfake and instead a use of a digital asset?

[–] General_Effort@lemmy.world 8 points 9 months ago

This appears to be the bill: https://www.documentcloud.org/documents/24397944-defiance-act

For reference, this is the section getting altered: https://www.law.cornell.edu/uscode/text/15/6851

This looks to be an absolute shitshow. I fear it’ll be made even worse before it passes. Maybe they’ll curtail the abuse potential. Then again, maybe not. It may be seen as hurting the right people.

[–] alienanimals@lemmy.world 7 points 9 months ago

It's already impossible to stop.

Also, doing something ONLY when a billionaire complains is a very bad look.
