this post was submitted on 17 Jul 2023
249 points (96.6% liked)

Technology

Well this is terrifying...

top 48 comments
[–] radix@lemmy.world 111 points 1 year ago (1 children)

"And that's why we, the benevolent and peaceful police, need to track all your movements at all hours of the day. For the children. You don't want to be anti-children do you? Skynet told us where you live."

[–] foggy@lemmy.world 39 points 1 year ago (1 children)

Won't somebody PLEASE think of the children?!

[–] Zrybew@lemmy.ml 77 points 1 year ago (4 children)

So, they tracked his plate as he crossed the state line multiple times.

I wonder how many false positives they stopped on the road before getting one successful case to boast about it.
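
As a rough back-of-the-envelope illustration of why that question matters (every number below is hypothetical; nothing comes from the article or any disclosed specs of the system), even an unusually accurate classifier flags far more innocent drivers than traffickers when traffickers are rare:

```python
# Base-rate arithmetic with made-up numbers, purely illustrative.
drivers_scanned = 1_000_000      # cars the cameras see in some period
traffickers = 100                # actual traffickers among them (0.01%)
true_positive_rate = 0.90        # assume the model catches 90% of them
false_positive_rate = 0.001      # and wrongly flags only 0.1% of everyone else

true_hits = traffickers * true_positive_rate                           # 90
false_flags = (drivers_scanned - traffickers) * false_positive_rate    # ~1000
print(f"innocent flags per real hit: {false_flags / true_hits:.1f}")   # ~11.1
```

Even with those generous assumptions, roughly eleven innocent drivers get flagged for every real trafficker; the stops we never hear about are baked into the math.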

Want to see a really fucked up case? Check this one: https://www.washingtonpost.com/technology/2022/07/15/predictive-policing-algorithms-fail/

[–] Rouxibeau@lemmy.world 11 points 1 year ago (1 children)

Can't. WP paywalls suck dick.

[–] nandeEbisu@lemmy.world 4 points 1 year ago

Unexplainable results should never be probable cause, because you can't determine that the decisions weren't made using protected traits, either directly or by inference.

[–] AbidanYre@lemmy.world 2 points 1 year ago

I was expecting a link to Minority Report.

[–] housepanther@lemmy.goblackcat.com 37 points 1 year ago (1 children)

Indeed it is terrifying. I definitely don't think drugs should be illegal to begin with. I am so anti-drug war it's not even funny.

[–] FlyingSquid@lemmy.world 38 points 1 year ago (2 children)

I'm anti-drug war myself, but I'm also sitting here thinking: today it's drug traffickers. What is it tomorrow?

[–] motorwerks@sopuli.xyz 13 points 1 year ago (1 children)

Yes, the question of current purpose is nearly irrelevant. It's the question of possible purpose that's concerning, because once it's A) available & B) left to human subjectivity, privacy & 'innocent until proven guilty' are no longer guaranteed.

[–] Nobilmantis@feddit.it 3 points 1 year ago (1 children)

This is the first time I'll be on the other side of this argument, but let me disagree. The technology behind it isn't inherently bad; it's the people running the system and having access to it that scare us. Take Snowden, for example: when he exposed what the NSA was doing with US citizens' data (with the help of big companies), do you think he meant that the internet or security cameras were the threat? They sure as hell are a good vector, but you don't trash or blame your PC for being the means through which that is achieved. The problem is who we put in power and how we hold them accountable for misusing it.

[–] motorwerks@sopuli.xyz 7 points 1 year ago (1 children)

Right, but the moment you're relying on who is in charge then the process is already broken. You have to assume the process is usable no matter who is in charge. I know it's absolute, but it's the only way.

[–] Nobilmantis@feddit.it 2 points 1 year ago* (last edited 1 year ago) (1 children)

The process is broken if the people you rely on suck. It is inevitable that someone, in one form or another, will be representative of the group of people you are part of (be it a dictator, an influential priest, or an elected representative); we have the luxury of living in (somewhat?) democratic countries. The way out of surveillance misuse is making (or forcing) our politicians to pass laws that restrict what companies or agencies can do with our data, or how they can use it. I think spreading awareness about this topic is the most effective way to push these kinds of rules into effect.

While individualistic "guerrilla privacy" might be effective for yourself, it's like a band-aid on a broken bone. If 99% of the people around you don't care about it, or are simply unaware (family, neighbours, friends), you will join the surveillance system no matter what: from a family member uploading your details to Meta, to a stranger taking a picture with you in it and posting it, to your neighbour's Ring camera, to your friend's iPhone constantly scanning the surroundings to report nearby devices (your phone, for instance) to "improve location data".

If there are no laws preventing evil actors from misusing this power, very little changes in the bigger picture by you using Signal or Protonmail (you should still do it, don't get me wrong).

EDIT: I know this will be controversial, but to me this is a good metaphor for it: the world is slowly getting hotter because companies only care about profits and politicians pass no laws to reverse the process, instead taking bribes from those same companies to do nothing about it (look, look, it's the same duo again), and your solution is... you dig an underground bunker to survive the next heatwave/hurricane.

[–] motorwerks@sopuli.xyz 2 points 1 year ago

I guess this is where I'd love to have this discussion in person over a drink of your choice, because my point, albeit unclear, was that these systems that, on the face of it, "solve crime" shouldn't exist no matter how much 'good' they offer. They have no control &/or limitation on their powers except by the person who decides to use them. I don't see that as manageable. Ultimate power breeds ultimate corruption, if you will. It seems we're at an 'agree to disagree' point & I'm OK w/ that result. Have a good day/week/month & please continue your efforts at healthy debate!

[–] qprimed@lemmy.ml 5 points 1 year ago* (last edited 1 year ago) (1 children)

> What is it tomorrow?

quite literally anything the majority quorum of your elected representatives want.

an educated, engaged electorate matters, kids!

edit: letter

[–] Treczoks@lemmy.world 30 points 1 year ago (2 children)

And for that, they processed the data of thousands of innocent people too. Probably without any legal basis or permission.

[–] FlyingSquid@lemmy.world 10 points 1 year ago (1 children)

Exactly. This is why it scares me. The police are vacuuming up data on everyone, and who knows who else they'll go after, especially if the wrong person gets into power. Even at the state level. I sure hope DeSantis' Florida doesn't have this ability.

[–] MigratingApe@lemmy.world 3 points 1 year ago

ctOS, it is happening

[–] nandeEbisu@lemmy.world 2 points 1 year ago* (last edited 1 year ago)

I'm less concerned about that if it's purely public data. If a police officer sat in a helicopter looking for drivers driving erratically, then notified a trooper on the ground to check on the car and perform a field sobriety test if there is cause to do so, I think that would fall within the confines of the law, even though thousands of cars could have been in their field of view and considered for potential DUI.

I am of the opinion that if the data is either directly in public view, or the user can opt out of it being persisted and it is available to the general public (even if for a fee), then it's fine to use. I think any kind of AI algorithm's suggestions on their own should not be considered probable cause; you can use them to narrow down suspects, but you need actual evidence for a warrant or arrest.

I think the issue I have with this situation is collecting and storing such a vast amount of travel data on individuals without their consent. If leaked, that data could be used to track down victims of stalking and abuse, or political dissidents.

[–] jocanib@lemmy.world 17 points 1 year ago

It's OK. Ordinary people will have no trouble at all making sure they use a different vehicle every time they drive their kid to college or collect an elderly relative for the holidays. This will only inconvenience serious criminals.

[–] SirNuke@kbin.social 13 points 1 year ago (1 children)

Is there any actual analysis showing this went down as written? This sets off two eyebrow alarms for me: 1. AI doing something revolutionary without serious issues, and 2. clean-cut police work, which never happens (at least not anymore).

Honestly, I'd put money down that the police caught him by chance and worked backwards to find a good explanation for how. I'd also be highly skeptical of an AI system actually catching drug dealers without also catching, like, everyone else.

[–] symmetricsilliness@lemmynsfw.com 7 points 1 year ago (1 children)

It’s the new excuse for parallel construction.

[–] dismalnow@kbin.social 3 points 1 year ago

We'll need 10 million dollars to reverse engineer how the unshackled AI did it, or you can take the plea.

Go back to your cell, and we'll ask you again right before your trial.

[–] BilboBargains@lemmy.world 12 points 1 year ago (1 children)

I would like to propose a toast to the end of the war on drugs, thanks to this technology which will surely be decisive in convincing people not to want drugs. Once we've dealt with all these pesky low level dealers the cartels will pack it all in and give up the chance of huge profits.

Kicking the same can down the road. Incredibly depressing and dumb. Stop voting for these idiots and join the likes of Portugal by legalising drugs and treating addiction as a health issue.

[–] paris@lemmy.blahaj.zone 9 points 1 year ago

Also don't defund the program like Portugal did. The conservatives there didn't like that decriminalizing drug possession for personal use actually works, so they immediately worked to cut funding to the program by like 80% and surprise surprise the program stopped being as effective as it was at the start. Essentially every piece of data we have on Tough on Crime™ politics shows that the approach doesn't work. If you want people to stop using drugs, make it easy for them to do so without fear of being arrested/imprisoned.

[–] blazera@kbin.social 12 points 1 year ago (2 children)

Where the heck are they getting training data for traffic patterns of drug dealers?

[–] guyrocket@kbin.social 4 points 1 year ago (1 children)

In my area there are these so-called traffic cameras all over the place. Maybe those.

[–] blazera@kbin.social 3 points 1 year ago (1 children)

The problem is the driver's life outside of the car being part of the equation. Imagine a headline like "AI learns the driving patterns of anime fans." How is the traffic camera going to know which cars are being driven by anime fans in the first place? Of course, drug dealers are going to be much less likely to have drug dealer bumper stickers that might tip the cameras off.

[–] TheYear2525@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

Correlation after the fact could work. Arrest a few hundred drug traffickers over the course of several years, then feed their plate numbers and the past decade of everyone’s traffic data to the AI.
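
A minimal sketch of what that after-the-fact correlation might look like, assuming per-plate travel summaries as features and conviction records as labels. The feature names, synthetic data, and model choice are all illustrative assumptions, not anything described in the article:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_plates = 50_000        # hypothetical number of plates seen by the cameras
n_traffickers = 300      # hypothetical number of plates tied to convictions

# Per-plate features: total reads, distinct cameras, out-of-state trips, night reads.
X = rng.poisson(lam=[40.0, 5.0, 1.0, 3.0], size=(n_plates, 4)).astype(float)
y = np.zeros(n_plates, dtype=int)
y[:n_traffickers] = 1
# Pretend trafficker plates make more cross-border trips, so there is some signal.
X[:n_traffickers, 2] += rng.poisson(6.0, size=n_traffickers)

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# With ~0.6% positives, accuracy is meaningless; what matters is how many
# innocent plates get flagged for every real hit (precision of the alerts).
flagged = model.predict(X_test)
print("precision of flags:", precision_score(y_test, flagged, zero_division=0))
```

Nothing exotic is needed, which is exactly the worry: the hard part isn't the model, it's the years of everyone's location history sitting around to feed it.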

[–] Pons_Aelius@kbin.social 1 points 1 year ago

The Wire and Breaking Bad.

[–] Hobo@lzrprt.sbs 11 points 1 year ago (1 children)

AI tracker as I drive out of my driveway: "Well, looks like Hobo's off to McDonald's again..."

[–] Radium@sh.itjust.works 8 points 1 year ago (1 children)

AI does not equal pattern matching and machine learning

[–] Nobilmantis@feddit.it 9 points 1 year ago

"AI" is officially the catch-all tech term in news headlines right now.

[–] Nobilmantis@feddit.it 5 points 1 year ago

Should have used public transport ;)

[–] PortableHotpocket@lemmy.ca 1 points 1 year ago

Ah yes, I knew Demolition Man was an accurate prediction of the future. Thanks for confirming the direction we are headed in!
