zephyrvs

joined 1 year ago
[–] zephyrvs@lemmy.ml 1 points 1 year ago

And most people seem to be incapable of seeing what's right in front of them. It's astounding.

[–] zephyrvs@lemmy.ml 1 points 1 year ago (1 children)

I know the Right likes to throw around Biden Bot's text-to-speech timeouts, motor-control issues, ghost handshakes and the gibberish word salad he likes to produce, but I like to think there's a parallel dimension where this is front-page news worldwide, no matter where you stand politically.

But we're in this dimension, unfortunately. 🌚

[–] zephyrvs@lemmy.ml 2 points 1 year ago* (last edited 1 year ago)

I'm referring to the CSAM scanning systems that are outside the control of almost anyone except governments, three-letter agencies, other law enforcement and parts of the private sector.

To be effective, these systems must be fed the hash of every file submitted to as many instances as possible, with close to no oversight or public scrutiny.

Pass.

Edit: I'm not blocking you, but I noticed intermittent connectivity issues on lemmy.ml today, possibly around the time I replied.

[–] zephyrvs@lemmy.ml 5 points 1 year ago (7 children)

researchers found 112 instances of known CSAM across 325,000 posts on the platform

So you're willing to vacuum up the hashes of every image file uploaded on thousands of decentralized systems into a centralized system (that is out of "our" control and coupled with direct access for law enforcement and corporations) to prevent the distribution of the 0.034% of files that are CSAM and that could just as well be reported and deleted by admins and moderators? Remember how Snowden warned us about metadata?
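As a quick sanity check on that figure (112 flagged posts out of 325,000, the numbers quoted above):

```python
# Verify the quoted ratio: 112 instances across 325,000 posts.
flagged = 112
total = 325_000
pct = flagged / total * 100
print(f"{pct:.3f}%")  # ≈ 0.034%
```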

If you think that's a wise tradeoff, I guess, go ahead. But then I'd have to question the entire goal of being decentralized in the first place. If it's all about "a billionaire can't wreak havoc upon my social network", then yeah, I guess decentralization helps a bit, but even that remains to be seen.

But if you're actually willing to do that, you'd probably also be in favor of government backdoors into chat encryption (rendering the entire concept moot, because you can't have backdoors that cannot be discovered by other nefarious actors), and of cracking down on even more censorship-resistant systems like Tor, because evil people use it to exchange CSAM anonymously as well?

[–] zephyrvs@lemmy.ml 1 points 1 year ago* (last edited 1 year ago)

Of course, I didn't say that though.

[–] zephyrvs@lemmy.ml 3 points 1 year ago (2 children)

And the instances who want to engage with that material would all opt for the fork and be done with it. That's all I meant.

[–] zephyrvs@lemmy.ml 0 points 1 year ago

Why was invading Afghanistan justified? And who gets to decide such things?

[–] zephyrvs@lemmy.ml 8 points 1 year ago* (last edited 1 year ago)

The researchers can't be taken seriously if they don't acknowledge that you can't force free software to do something you don't want it to.

Even if we started way down at the stack and we added a CSAM hash scanner to the Linux kernel, people would just fork the kernel and use their own build without it.

Same goes for nginx or any other web server or web proxy. Same goes for Tor. Same goes for Mastodon or any other Fedi/ActivityPub implementation.
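In every one of those cases, the component being removed is just a hash lookup, so a fork deletes one function call. A minimal sketch of what such a scanner reduces to (the blocklist digest below is just the SHA-256 of an empty file, chosen so the demo has a match; real systems use perceptual hashes and centrally maintained lists, not this):

```python
import hashlib

# Hypothetical blocklist. The single entry is the SHA-256 of empty input,
# included only so the example below produces a hit.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    # The entire "scanner": one set-membership check a fork can remove.
    return sha256_hex(data) in KNOWN_BAD_HASHES

print(is_flagged(b""))       # True (matches the placeholder digest)
print(is_flagged(b"hello"))  # False
```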

It. Does. Not*. Work.

* Please, prove me wrong; I'm not all-knowing. But short of total surveillance, I see no technical solution to this.

[–] zephyrvs@lemmy.ml 14 points 1 year ago (13 children)

That'd be useless though, because first, it'd probably be opt-in via configuration settings, and even if it weren't, people would just fork and modify the codebase or simply switch to another ActivityPub implementation.

We're not gonna fix society using tech unless we're all hooked up to some all-knowing AI under government control.
