this post was submitted on 20 Jul 2023
1819 points (98.3% liked)

[–] Auli@lemmy.ca 2 points 1 year ago (1 children)

Who said it was giving up privacy? The worst I heard is the slippery-slope argument that if they do this, they might add more to it later. And how was it privacy-compromising?

[–] 4AV@lemmy.world 1 points 1 year ago* (last edited 1 year ago) (1 children)

And how was it privacy-compromising?

  1. Anything could be added to the hashes, with the user having no way to know what's being searched for beyond "trust us". This could be partially alleviated if, for example, each hash had to be signed by organizations in a combination of states that would make it difficult to push through hashes for anything other than actual CSAM (so not just the Five Eyes)

  2. Adversarial examples that intentionally set off the filter were demonstrated to be possible. Apple made it clear that there are types of content they'd be legally obligated to report once they became aware of them, and it'd be well within a government agency's capabilities to honeypot with, say, terrorist recruitment material initially

  3. Coincidental false positives are also entirely possible (ImageNet had some naturally occurring clashes) and can result in their employees seeing your sensitive photographs

  4. The user's device acting against the user cements other user-hostile and privacy-hostile behavior. "People could circumvent the CSAM scan" would be given as another reason against right to repair and ability to see/modify the software your own device is running

  5. Tech companies erode privacy by flip-flopping between "sure we're giving ourselves abusable power, but we'll stand up to governments pressuring us to expand this" and then "well what were we supposed to do, leave the market?" when they inevitably concede
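The first concern above can be sketched in code. This is a hypothetical illustration, not Apple's actual system (which used a perceptual NeuralHash and a private set intersection protocol rather than a plain digest lookup): the device only ever sees opaque digests, so nothing about the list itself tells the user what is being searched for.

```python
import hashlib

def digest(data: bytes) -> str:
    """Opaque fingerprint of some content (SHA-256 here for simplicity)."""
    return hashlib.sha256(data).hexdigest()

# The provider ships an opaque list of digests. Nothing about these
# strings reveals whether they correspond to CSAM or to something else
# entirely -- the user just has to trust the list's contents.
watchlist = {
    # digest of b"foo", standing in for some flagged image
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def scan(photo_bytes: bytes) -> bool:
    """Client-side check: is this content on the (unauditable) list?"""
    return digest(photo_bytes) in watchlist

print(scan(b"foo"))  # True: flagged, but the user cannot audit why
print(scan(b"bar"))  # False
```

The point is structural: whether the matcher is a cryptographic hash or a perceptual one, the client performs a membership test against a set it cannot inspect.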

[–] Auli@lemmy.ca 1 points 1 year ago* (last edited 1 year ago) (1 children)

What's "anything"? They are not looking for just any CSAM pictures; they are looking for specific ones that are in a database. It's not like they can create a hash for a guy letting his dog on a horse and find all those pictures.

[–] 4AV@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

they are looking for specific ones that are in a database

They could be looking for any images without your knowing - there's no guarantee that those images came from a CSAM database.

It's not like they can create a hash for a guy letting his dog on a horse

They could trivially create a hash for a picture of a guy letting his dog on a horse (which would also include other very similar images).
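To illustrate how a hash of one specific picture can also catch very similar images, here is a toy difference-hash in the style of dHash. This is only a sketch: the real system used a neural-network-based perceptual hash (NeuralHash), but the matching principle, comparing fingerprints by Hamming distance, is the same idea.

```python
def dhash(pixels):
    """Toy perceptual hash: 8 rows x 9 columns of grayscale values -> 64-bit int.

    Each bit records whether a pixel is brighter than its right neighbor,
    so the hash captures image structure rather than exact pixel values.
    """
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A specific picture (any image could be hashed this way)...
original = [[(r * 9 + c) % 256 for c in range(9)] for r in range(8)]
# ...a very similar image: same picture with slight brightness change...
similar = [[min(255, v + 2) for v in row] for row in original]
# ...and a structurally different one.
different = [list(reversed(row)) for row in original]

h = dhash(original)
print(hamming(h, dhash(similar)))    # 0: near-duplicate still matches
print(hamming(h, dhash(different)))  # 64: unrelated structure, no match
```

A small Hamming-distance threshold is what makes the match robust to resizing or recompression, and it is also why a hash of "a guy letting his dog on a horse" would sweep in close variants of that exact photo, though not the general concept.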

I didn't necessarily mean to claim that they can scan for a concept lacking a fixed image, if that's what you're saying. That would theoretically be possible with enough hashes, but impractical.