this post was submitted on 07 May 2024

Technology

[–] FMT99@lemmy.world 75 points 6 months ago (3 children)

Why would you ask a bot to generate a stereotypical image and then be surprised it generates a stereotypical image? If you give it a simplistic prompt, it will come up with a simplistic response.

[–] gerryflap@feddit.nl 36 points 6 months ago* (last edited 6 months ago) (2 children)

Kinda makes sense though. I'd expect images actually labelled as "an Indian person" to over-represent people wearing this kind of clothing. An image of an Indian person doing something mundane in more generic clothing is probably more often than not going to be labelled as "a person doing X" rather than "an Indian person doing X". Not sure why these authors are so surprised by this.

[–] admin@lemmy.my-box.dev 33 points 6 months ago (2 children)
[–] admin@lemmy.my-box.dev 32 points 6 months ago (1 children)

I'm just surprised there's no windmill in either of them. Canals, bikes, tulips... Check check check.

[–] Graphy@lemmy.world 19 points 6 months ago (1 children)

Careful, the next generated image is gonna contain a windmill with clogs for blades

[–] Hootz@lemmy.ca 5 points 6 months ago (1 children)

Well, they do run on air...

[–] Linkerbaan@lemmy.world 32 points 6 months ago (3 children)
[–] jivandabeast@lemmy.browntown.dev 8 points 6 months ago (2 children)

Indians can be Sikh; not all Indians are Hindu.

[–] Eyck_of_denesle@lemmy.zip 3 points 6 months ago

Yes, but the gentlemen in the images are also Sikhs.

[–] Sam_Bass@lemmy.world 26 points 6 months ago (1 children)

Get down with the Sikhness

[–] Sorgan71@lemmy.world 6 points 6 months ago

not me calling in sikh to work

[–] VirtualOdour@sh.itjust.works 21 points 6 months ago (1 children)

Articles like this kill me because they nudge that it's kinda sorta racist to draw images like the ones they show, which look exactly like the cover of half the Bollywood movies ever made.

Yes, if you want to get a certain type of person in your image you need to choose descriptive words. Imagine going to an artist and saying 'I need a picture and almost nothing matters besides the fact that they look Indian.' Unless they're bad at their job, they'll give you a Bollywood movie cover with a guy from Rajasthan in a turban - just like their official tourist website does.

Ask for a businessman in Delhi or an Urdu shopkeeper with an Elvis quiff if that's what you want.

[–] UnderpantsWeevil@lemmy.world 3 points 6 months ago

the ones they show which look exactly like the cover of half the Bollywood movies ever made.

Almost certainly how they're building up the data. But that's more a consequence of tagging. Same reason you'll get Marvel's Iron Man when you ask an AI generator for "Draw me an iron man". Not as though there's a shortage of metallic-looking people in commercial media, but by keyword (and thanks to aggressive trademark enforcement) those terms are going to pull back a superabundance of a single common image.
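The keyword-pull effect described above can be sketched with a toy simulation. All numbers and tags here are made up for illustration: the point is only that if explicitly tagged images skew toward one visual trope, sampling by keyword match reproduces that skew, regardless of how the population actually looks.

```python
from collections import Counter
import random

# Hypothetical toy caption data: images of Indian men that are explicitly
# tagged "indian man" mostly also carry a "turban" tag, while mundane photos
# get the generic "man" tag instead. The overall dataset has few turbans.
captions = (
    [("indian man", "turban")] * 8
    + [("indian man", "no turban")] * 2
    + [("man", "no turban")] * 90
)

def sample_for_prompt(prompt, data, k=1000, seed=0):
    """Crude stand-in for retrieval-biased generation: pick uniformly
    among only the images whose tags match the prompt keyword."""
    rng = random.Random(seed)
    pool = [tags for tags in data if prompt in tags]
    return Counter(rng.choice(pool)[1] for _ in range(k))

print(sample_for_prompt("indian man", captions))
# "turban" dominates the matched pool, even though turbans are rare
# in the dataset as a whole (8 of 100 images).
```

A uniform draw over the whole dataset would return "no turban" 92% of the time; conditioning on the explicit tag flips that, which is the tagging consequence the comment describes.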

imagine going to an artist and saying 'I need a picture and almost nothing matters besides the fact that they look Indian'

I mean, the first thing that pops into my head is Mahatma Gandhi, and he wasn't typically in a turban. But he's going to be tagged as "Gandhi", not "Indian". You're also very unlikely to get a young Gandhi, as there are far more pictures of him later in life.

Ask for a businessman in Delhi or an Urdu shopkeeper with an Elvis quiff if that's what you want.

I remember when Google got into a whole bunch of trouble by deliberately engineering their prompts to be race blind. And, consequently, you could ask for "Picture of the Founding Fathers" or "Picture of Vikings" and get a variety of skin tones back.

So I don't think this is foolproof either. It's more just how the engine generating the image is tuned. You could very easily get a bunch of English bankers when querying for "businessman in Delhi", depending on where and how the backlog of images is sourced. And "Urdu shopkeeper" will inevitably give you a bunch of convenience stores and open-air stalls in the background of every shot.

[–] 0x0@programming.dev 18 points 6 months ago

There are a lot of men in India who wear a turban, but the ratio is not nearly as high as Meta AI’s tool would suggest. In India’s capital, Delhi, you would see one in 15 men wearing a turban at most.

Probably because most Sikhs are from the Punjab region?

[–] dwalin@lemmy.world 17 points 6 months ago

Overfitting

It happens

[–] Fedizen@lemmy.world 14 points 6 months ago (3 children)

It's the "skin cancer is where there's a ruler" phenomenon.
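For readers who haven't seen the reference: dermatology training photos of malignant lesions often included a ruler for scale, so models learned "ruler means cancer". A minimal sketch with invented data shows how a lazy learner latches onto the spurious feature:

```python
# Toy data (made up): each sample is (lesion_irregular, ruler_in_frame, malignant).
# Rulers were photographed almost exclusively alongside malignant lesions,
# so "ruler present" predicts the label better than the lesion itself.
train = (
    [(1, 1, 1)] * 40   # irregular lesion, ruler, malignant
    + [(0, 1, 1)] * 10 # regular-looking lesion, ruler, malignant
    + [(1, 0, 0)] * 5  # irregular lesion, no ruler, benign
    + [(0, 0, 0)] * 45 # regular lesion, no ruler, benign
)

def best_single_feature(data):
    """Pick the feature index whose raw value best matches the label."""
    def accuracy(i):
        return sum(label == row[i] for *row, label in data) / len(data)
    return max(range(2), key=accuracy)

shortcut = best_single_feature(train)
print(shortcut)  # 1 -> the ruler, not the lesion shape

# Deployment: an irregular lesion photographed WITHOUT a ruler.
# The shortcut classifier reads feature 1 and calls it benign.
lesion_without_ruler = (1, 0)
print(lesion_without_ruler[shortcut])  # 0 -> "benign"
```

The turban case is the same failure mode: the model keys on whatever co-occurs with the label in the training set, not on what the prompt actually means.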

[–] sentient_loom@sh.itjust.works 10 points 6 months ago (1 children)

This isn't what I call news.

[–] cornshark@lemmy.world 7 points 6 months ago (1 children)

But is it what you call technology?

[–] Mango@lemmy.world 5 points 6 months ago

Turbans are cool and distinct.

[–] possiblylinux127@lemmy.zip 5 points 6 months ago (2 children)


I'm not sure how AI could possibly be racist. (Image is of a supposed Native American, but my point still stands)

[–] demonsword@lemmy.world 8 points 6 months ago (3 children)

the AI itself can't be racist but it will propagate biases contained in its training data

[–] Kolanaki@yiffit.net 4 points 6 months ago* (last edited 6 months ago)

Would they be equally surprised to see a majority of subjects in baggy jeans with chain wallets if they prompted it to generate an image of a teen in the early 2000s? 🤨

[–] Haus@kbin.social 3 points 6 months ago

Whenever I try, I get Ravi Bhatia screaming "How can she slap?!"

[–] autotldr@lemmings.world 3 points 6 months ago

This is the best summary I could come up with:


The latest culprit in this area is Meta’s AI chatbot, which, for some reason, really wants to add turbans to any image of an Indian man.

We tried prompts with different professions and settings, including an architect, a politician, a badminton player, an archer, a writer, a painter, a doctor, a teacher, a balloon seller, and a sculptor.

For instance, it constantly generated an image of an old-school Indian house with vibrant colors, wooden columns, and styled roofs.

In the gallery below, we have included images with a content creator on a beach, a hill, a mountain, and in a zoo, a restaurant, and a shoe store.

In response to questions TechCrunch sent to Meta about training data and biases, the company said it is working on making its generative AI tech better, but didn't provide much detail about the process.

If you have found AI models generating unusual or biased output, you can reach out to me at im@ivanmehta.com by email and through this link on Signal.


The original article contains 956 words, the summary contains 164 words. Saved 83%. I'm a bot and I'm open source!
