this post was submitted on 01 Feb 2024
55 points (79.6% liked)


After Nine blamed an 'automation' error in Photoshop for producing an edited image of Georgie Purcell, I set out to find out what the software would do to other politicians.

all 29 comments
[–] pixxelkick@lemmy.world 51 points 9 months ago* (last edited 9 months ago) (2 children)

didn't post the pics they started with of the women

That's enough for me to discard this as clickbait at best.

Post your data if you want me to take your journalism seriously.

If you want a fair comparison, start with a woman wearing an actual suit, then a woman wearing a button-up shirt, as your shoulders-up pic, and see what gets generated.

20 bucks says the woman in the suit... generates the rest of the suit.

And 20 bucks says the button-up shirt... generates jeans or the like as well.

If you compare apples to oranges, don't pretend that getting oranges instead of apples is surprising.

The fact people aren't calling this out on here speaks volumes too. We need to have higher standards than garbage quality journalism.

"Men wearing clearly suits generates the rest of the suits weareas women generate ???? Who knows, we won't even post the original pic we started with so just trust me bro"

0/10

[–] CrystalEYE@kbin.social 15 points 9 months ago* (last edited 9 months ago)

@pixxelkick Thank you! This article is clearly written with a heavy bias. Photoshop's AI generator tries to interpret the whole picture to expand the cropped image. So in the case of the original Georgie Purcell photo, the AI sees "woman, tank top, bare shoulders and arms, water in the background", so of course it tries to generate clothing it thinks is fitting for the seaside or a beach (a rough sketch of that outpainting process follows this comment).
I just tried the same with a male model in a tank top on a beach and it did not magically put him in a suit; it generated swimwear.
If I use a picture of Georgie Purcell in more formal clothing, it generates more formal clothing.

Georgie Purcell in generated swimwear
Georgie Purcell in generated suit/dress
Male in generated swimwear

But, to be fair, this quote from the article:

But what it proves is that Adobe Photoshop’s systems will suggest women are wearing more revealing clothing than they actually are without any prompting. I did not see the same for men.

is indeed true. In general pictures of women tend to generate more "sexy" output than pictures of men.

And, of course, NINE clearly edited the image badly and could have chosen another generated output with no effort at all.

@LineNoise
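
For the curious, here is a minimal sketch of how that kind of "generative expand" (outpainting) works with an open diffusion model. The library, checkpoint name, and file paths are just stand-ins; Adobe's Firefly-based pipeline is not public. The point it illustrates is that the model conditions only on the visible pixels (plus an optional prompt) and invents everything under the mask.

```python
import torch
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

# Example checkpoint only -- not Adobe's actual model, which isn't public.
pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting",
    torch_dtype=torch.float16,
).to("cuda")

# Put a shoulders-up crop at the top of a larger canvas...
crop = Image.open("shoulders_up.jpg").convert("RGB").resize((512, 256))
canvas = Image.new("RGB", (512, 512))  # black canvas
canvas.paste(crop, (0, 0))

# ...and mask the empty half: black = keep original pixels, white = generate.
mask = Image.new("L", (512, 512), 0)
mask.paste(255, (0, 256, 512, 512))

# With an empty prompt, the fill is driven purely by the visible context
# (tank top, bare arms, water) -- which is exactly where the bias shows up.
result = pipe(prompt="", image=canvas, mask_image=mask).images[0]
result.save("extended.png")
```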

[–] grue@lemmy.world 10 points 9 months ago (2 children)

So what you're saying is,

?

[–] Zagorath@aussie.zone 2 points 9 months ago

Holy shit I miss the old times of MSN/Windows Live Messenger emoticons.

[–] pixxelkick@lemmy.world 2 points 9 months ago (2 children)

Everyone shut the fuck up for a second.

Why does this extremely specific gif exist? Who made it? Are there more? I love it lol

[–] Snowpix@lemmy.ca 3 points 9 months ago

Judging by the style of the smiley faces, this is some ancient gif from a forum. Probably for when people would make wild claims without evidence in a thread, and people wanted proof.

[–] Death_Equity@lemmy.world 2 points 9 months ago

That is a 2 decades old forum gif.

[–] Kbin_space_program@kbin.social 14 points 9 months ago (1 children)

The issue is also present in DALL-E and Bing image generation. My hypothesis is that the sheer amount of porn being generated is affecting the models.

When some friends and I tried to create a joke Tinder profile, I tried "woman eating fried chicken": blocked result. "Man eating fried chicken" worked.
"Man T-posing on a beach" came back clothed; "woman T-posing on a beach" was blocked.
"Woman T-posing at sunset on a beach" returned a nearly silhouetted nude image. The same prompt for a guy came back clothed.

Went back to the first one and had to specify that the woman was wearing clothes to make it return an image, sometimes down to specific articles of clothing.

[–] Deceptichum@kbin.social 7 points 9 months ago (1 children)

Your hypothesis makes no sense?

People generating porn would make no change to its training data set.

[–] Kbin_space_program@kbin.social 4 points 9 months ago (2 children)

You wouldn't feed the images people generate and save back into the system to improve it?

[–] DoYouNot@lemmy.world 4 points 9 months ago (1 children)

This actually doesn't work to improve the model, generally. It's not new information for it.

[–] Kbin_space_program@kbin.social 0 points 9 months ago* (last edited 9 months ago)

Yup. But they would logically have bots up to trawl for new posts and would be consuming social media posts containing their own generated data.

Also, they would absolutely feed successful posts back into the system. You'd be stupid not to use successful generations to further refine the model.

[–] Deceptichum@kbin.social 3 points 9 months ago* (last edited 9 months ago)

Not after the initial training, no.

That would make it less effective, because instead of being trained on known real things it’s being further reinforced on its own hallucinations.
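
A toy illustration of that feedback loop (often called "model collapse"): the "model" below is just a Gaussian refit on its own samples each generation, which is obviously nothing like an image model, but it shows why retraining on your own output tends to narrow a model rather than improve it.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 0.0, 1.0   # generation 0: parameters fit on "real" data
n = 50                 # synthetic samples available per generation

for gen in range(1, 201):
    samples = rng.normal(mu, sigma, n)          # publish model output
    mu, sigma = samples.mean(), samples.std()   # refit on that output
    if gen % 50 == 0:
        print(f"generation {gen}: mu={mu:+.3f}, sigma={sigma:.3f}")

# sigma almost always ends up far below 1.0: each refit loses a little of the
# original distribution's tails, so the model narrows onto its own samples.
```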

[–] LineNoise@kbin.social 12 points 9 months ago (1 children)

Looks like it does do the thing Adobe claimed it wouldn't after all.

Still poor form from Nine for using it in the first place, and for not catching it in the editorial process. But it seems this is just another reminder this week of the biases of generative models.

[–] shrodes@lemmy.world 12 points 9 months ago

Does it though? Adobe simply claimed it would require human intervention and approval. Which is true and easily provable. You can’t replace someone’s clothing without selecting a part of the image you want to replace.

Someone had to go and do that. Someone hit generate on an AI prompt. Someone saw the result of said AI prompt (which gives you 3 possible alternatives each run) and said “yep, print it”.

This is not a tale of the biases of generative AI. There’s literally no reason for Nine to have even invoked any such thing in the first place.

[–] Deceptichum@kbin.social 11 points 9 months ago* (last edited 9 months ago) (2 children)

How is it legal for newspapers to fabricate parts of a photo and share it around as news?

Meanwhile we can’t even use copyrighted material to make parody.

[–] unionagainstdhmo@aussie.zone 4 points 9 months ago

Do you have billions of dollars? Didn't think so

[–] averyminya@beehaw.org 1 points 9 months ago

It really shouldn't be. Fox News photoshopped the same image of a random dude with a gun into multiple different photos to push fearmongering, and there are still people who believe it.

They probably would have believed it without the photo, but it sure as hell doesn't help.

[–] N0body@sh.itjust.works 5 points 9 months ago* (last edited 9 months ago) (2 children)

The use of photo manipulation tools to create non-consensual revealing/nude/porn images is incredibly fucked up. I remember seeing multiple stories about lawsuits from teenage girls having these fake images made of them that circulated in their schools. It’s a violation, and it’s categorically wrong.

It sounds strange to say, but pornography has always been the tip of the spear for technology. It went with VHS and helped kill Betamax. It was a very early adopter of the internet. OnlyFans.

Non-consensual AI porn is the tip of the spear of what AI can do: a preview of how much disinformation and bullshit it's going to introduce into the public square, and of how it has absolutely zero ethics. You are going to see statements, interviews, etc. that aren't real. Pure fabrications amplified by bot networks and useful idiots.

This AI rollout has been like cars before seatbelts and lines on the road. New technology and pure chaos. Good luck looking for geriatric politicians to find a cure. They already took money to look the other way. That’s their real job.

[–] CybranM@kbin.social 4 points 9 months ago (1 children)

It really is Pandora's box, and we can't do much to stop it. People need to get used to fakes and misinformation, but we've already seen how poorly that's turned out, and it'll only get worse from here.

[–] HopeOfTheGunblade@kbin.social 2 points 9 months ago

Used to living with the fallout, because I don't expect people will get better at media literacy.

[–] jacksilver@lemmy.world 3 points 9 months ago

This isn't about non-consensual images; it's about bias in AI models. They used the extend-image feature on both images, and because the models associate women with "sexy" it puts them in bikini bottoms, while it associates men with "business" and puts them in suits.

This is going to be an ongoing issue in how generative AI assumes things based on the prompt/input image - https://www.bloomberg.com/graphics/2023-generative-ai-bias/
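
If someone wanted to test that claim with more than a handful of screenshots, a rough recipe is to generate a batch of images per prompt and tally the clothing with a zero-shot classifier. The models below are just examples, not what Crikey or Adobe used (Photoshop's generative expand can't be scripted like this); the methodology is the point: compare counts across otherwise-identical prompts, not single images.

```python
import torch
from collections import Counter
from diffusers import StableDiffusionPipeline
from transformers import CLIPModel, CLIPProcessor

# Example open models standing in for the closed ones discussed above.
gen = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
clip = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
proc = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

labels = [
    "a person wearing a business suit",
    "a person wearing casual clothes",
    "a person wearing swimwear or revealing clothes",
]

def clothing_counts(prompt: str, n: int = 20) -> Counter:
    """Generate n images for a prompt and tally CLIP's best-guess clothing label."""
    counts = Counter()
    for _ in range(n):
        img = gen(prompt).images[0]
        inputs = proc(text=labels, images=img, return_tensors="pt", padding=True)
        probs = clip(**inputs).logits_per_image.softmax(dim=-1)
        counts[labels[int(probs.argmax())]] += 1
    return counts

# Identical prompts apart from the gendered noun; compare the two tallies.
print(clothing_counts("full-body photo of a woman politician"))
print(clothing_counts("full-body photo of a man politician"))
```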

[–] oahi@aussie.zone 5 points 9 months ago

Photos presented as news should be real, not computer generated fakes.

[–] Marsupial@quokk.au 3 points 9 months ago (1 children)

Female politicians*

That’s some horrible grammar from Crikey.

[–] Tristaniopsis@aussie.zone 1 points 9 months ago (1 children)

Thank goodness they didn’t use the pics they say they generated of Pauline Hanson. What a revolting thought.

[–] CrystalEYE@kbin.social 1 points 9 months ago (1 children)

@Tristaniopsis I wanted to generate a bikini-wearing Pauline Hanson just for you. But when I looked for suitable source material, I found her in panties doing a car wash. Thanks for that. :D

@LineNoise

[–] Tristaniopsis@aussie.zone 1 points 9 months ago

The horror…

The horror…

[–] Rinsed@sh.itjust.works -2 points 9 months ago