this post was submitted on 29 Jul 2023
Technology

[–] Xoriff@lemmy.ml 11 points 1 year ago* (last edited 1 year ago) (1 children)

I'm also curious how this would stop someone from using AI to generate an image and then just using a digital camera to take a photo of their monitor. "The photo of <politician> executing someone in the street seems to be legit. There's provenance metadata showing that the image hasn't been tampered with since it was taken and cryptographically signed by Nikon's physical sensors" edit: formatting

[–] pup_atlas@pawb.social 1 point 1 year ago

It would stop pretty much any counterfeit if they also embedded some rudimentary depth data in the image format, within the signed contents. That way, simply taking a picture of a monitor would be obviously detectable, and the depth data couldn't be altered without invalidating the signature. It wouldn't have to be a high-resolution depth map, either.
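The idea above can be sketched in a few lines. This is a hypothetical illustration only, not any real camera's scheme: the key name and both functions are made up, and an HMAC stands in for the asymmetric signature a real sensor would produce. The point is that the depth map is covered by the same signature as the pixels, so it can't be dropped or edited separately.

```python
import hashlib
import hmac

# Hypothetical stand-in for a per-device secret a real sensor would hold.
SENSOR_KEY = b"per-device-secret-burned-into-hardware"

def sign_capture(image_bytes: bytes, depth_map: bytes) -> bytes:
    """Sign the image and its coarse depth map as one payload."""
    payload = hashlib.sha256(image_bytes).digest() + hashlib.sha256(depth_map).digest()
    return hmac.new(SENSOR_KEY, payload, hashlib.sha256).digest()

def verify_capture(image_bytes: bytes, depth_map: bytes, signature: bytes) -> bool:
    """Verification fails if either the pixels or the depth data changed."""
    expected = sign_capture(image_bytes, depth_map)
    return hmac.compare_digest(expected, signature)
```

A re-photographed monitor would still get a valid signature from the second camera, but its depth map would show a flat plane, which a verifier could flag.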

[–] beejjorgensen@lemmy.sdf.org 7 points 1 year ago (1 children)

Seems easy to strip off or just not add, and then you'd have a video with unknown provenance which the echo chamber will swallow hook, line, and sinker. Wouldn't you?
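Stripping really is that trivial if provenance travels as a detachable block alongside the pixels. A toy sketch, assuming a made-up dict-based container rather than any real file format:

```python
# Toy illustration: provenance is carried as a separable metadata block,
# so removing it leaves an ordinary-looking file with no history at all.

def strip_provenance(capture: dict) -> dict:
    """Return a copy of the capture with its provenance block removed."""
    stripped = dict(capture)
    stripped.pop("provenance", None)  # signature and claim chain gone
    return stripped

signed = {
    "pixels": b"raw pixels",
    "provenance": {"signer": "Nikon", "signature": b"sensor signature"},
}
anonymous = strip_provenance(signed)
```

The image data is untouched; only its history is missing, and nothing about the remaining file reveals that a signature ever existed.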

[–] MolochAlter@lemmy.world 7 points 1 year ago (1 children)

Yeah, this seems to presume that people care to fact-check, which, if the last few years have proved anything, they don't.

[–] andruid@lemmy.ml 1 point 1 year ago (1 children)

People need to be better educated on it, expected to do better, etc.

[–] sugar_in_your_tea@sh.itjust.works 4 points 1 year ago (1 children)

Nah, that's not going to be enough; well-educated people fall for scams all the time.

The best option imo is for independent organizations that people trust to handle fact-checking, and to have such organizations fact-check each other. I also think it's completely appropriate for the government to fact-check news agencies as well, provided it doesn't attempt to shut down orgs with a poor track record.

We should still educate people better, but there's still going to be a ton of people falling for nonsense like this.

[–] andruid@lemmy.ml 1 point 1 year ago

It's both; uneducated people don't have a basis on which to establish trust.

[–] KrimsonBun@lemmy.ml 1 point 1 year ago (1 children)

And it would be a massive attack on online privacy! Yay!

[–] ashley@lemmy.ca 1 point 1 year ago

Not really; you can strip the cryptographic signature from a photo. What would be more concerning is the (likely) corporation controlling the certificates.
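The certificate concern can be made concrete with a small sketch, under the assumption (everything here is hypothetical, including the names) that verification means checking the signer against a centrally curated list of trusted roots. Whoever controls that list decides whose cameras produce "authentic" photos:

```python
# Hypothetical: a single, centrally controlled trust list. A stripped photo
# merely reads as "no provenance"; a signed one is only "trusted" if its
# signer appears on whatever list the certificate holder curates.
TRUSTED_ROOTS = {"ExampleCameraCorp Root CA"}

def provenance_status(signer):
    """Classify a photo by its (optional) signer name."""
    if signer is None:
        return "no provenance"       # signature stripped or never added
    if signer in TRUSTED_ROOTS:
        return "trusted"
    return "untrusted signer"        # valid signature, unrecognized root
```

Under this model, an independent camera maker left off the list is indistinguishable, trust-wise, from a forger.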