this post was submitted on 18 Dec 2023
247 points (98.1% liked)

Technology

[–] thanks_shakey_snake@lemmy.ca 67 points 10 months ago (3 children)

This is going to get soooo much more treacherous as this becomes ubiquitous and harder to detect. Apply the same pattern, but instead of wood carvings, it's an election, or sexual misconduct trial, or war.

Our ability to make sense of things that we don't witness personally is already in bad shape, and it's about to get significantly worse. We aren't even sure how bad it is right now.

[–] otter@lemmy.ca 12 points 10 months ago (3 children)

And the flipside is also a problem:

Now legitimate evidence can be dismissed as "AI generated."

[–] merc@sh.itjust.works 7 points 10 months ago (1 children)

Relevant:

Criminals will start wearing extra prosthetic fingers to make surveillance footage look like it's AI generated and thus inadmissible as evidence

https://twitter.com/bristowbailey/status/1625165718340640769?lang=en

[–] otter@lemmy.ca 7 points 10 months ago

Well that's a new one lol, hadn't thought of that. That's another level of planning

[–] thanks_shakey_snake@lemmy.ca 5 points 10 months ago (1 children)

Exactly-- They're two sides of the same coin. Being convinced by something that isn't real is one type of error, but refusing to be convinced by something that is real is just as much of an error.

Some people are going to fall for just about everything. Others are going to be so apprehensive about falling for something that they never believe anything. I'm genuinely not sure which is worse.

[–] Anticorp@lemmy.ml 3 points 10 months ago (1 children)

We already saw that with nothing more than two words. Trump started the "fake news" craze, and now 33% of Americans dismiss anything that contradicts their views as fake news, without giving it any thought or evaluation. If a catchphrase is that powerful, imagine how much more powerful video and photography will be. Even in 2019 there was a deepfake floating around of Biden with a Gene Simmons tongue, licking his lips, and I personally know several people who thought it was real.

[–] thanks_shakey_snake@lemmy.ca 2 points 10 months ago

Great example. Yeah, I've had to educate family members about deepfakes because they didn't even know that they were possible. This was on the back of some statement like "the only way to know for sure is to see video." Uh... Sorry fam, I have some bad news...

[–] DrPop@lemmy.ml 4 points 10 months ago

Analog is the way to go now

[–] Bitrot@lemmy.sdf.org 1 points 10 months ago* (last edited 10 months ago)

It's already happening. Adobe is selling them, but even if they weren't, it's not hard to do.

I think the worst of it is going to be places like Facebook, where people already fall for terrible and obvious Photoshop images. They won't notice if there are mistakes, and as AI gets better there will be fewer mistakes anyway (DALL-E used to be awful at hands; it's not so bad now). However, even smart folks will fall for these.