this post was submitted on 03 Dec 2023
461 points (99.1% liked)

Technology


Senate bill aims to stop Uncle Sam using facial recognition at airports / Legislation would eliminate TSA permission to use the tech, require database purge in 90 days

all 29 comments
[–] LEDZeppelin@lemmy.world 34 points 11 months ago (1 children)

Chances of this bill becoming a law are less than 0%

[–] affiliate@lemmy.world 1 points 11 months ago (1 children)

i long for a day when i see a policy proposal that i like and then think it has a good chance of getting passed

[–] paysrenttobirds@sh.itjust.works 21 points 11 months ago (2 children)

The TSA's use of CAT-2 involves scanning a passenger's face and comparing it to a scanned ID card or passport. The system can detect fake IDs "very quickly," a TSA official told us in July, and is also able to verify the person is on any additional screening lists and is actually scheduled to travel in the next 24 hours.

This I'm ok with, actually? The airport is already a place where you expect to have to give your real identity, and for the unfortunate people who share a name with someone on a watchlist, this technology helps them travel normally without hours-long interviews at every stop. I think that's mainly because the TSA agent can say the computer ok'd it instead of having to stick their neck out personally.

I guess the problem would be if the new scans of your face collected by this software are connected to your identity and/or travel data and then exported to third parties who didn't already have that info.

Because by itself it isn't really giving the TSA any new information. They have your id and your boarding pass. The government already knows who you are and where you're going and this bill doesn't stop them acquiring or keeping that information.
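
For anyone curious what that check actually involves, here is a minimal sketch in Python of the three steps the quoted passage describes (face-to-ID match, screening-list lookup, 24-hour itinerary check). Every name, threshold, and data structure here is made up for illustration; none of it reflects a real TSA or CAT-2 API.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from math import sqrt

MATCH_THRESHOLD = 0.8  # assumed similarity cutoff, purely illustrative


@dataclass
class Traveler:
    name: str
    id_photo_embedding: list[float]  # face embedding derived from the ID photo
    departure: datetime | None       # next scheduled flight, if any


def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0


def verify(live_embedding: list[float], traveler: Traveler, screening_list: set[str]) -> str:
    # 1. Does the live camera capture match the photo on the ID?
    if cosine_similarity(live_embedding, traveler.id_photo_embedding) < MATCH_THRESHOLD:
        return "refer to manual document check"
    # 2. Is this name on any additional screening list?
    flagged = traveler.name in screening_list
    # 3. Is there a booking departing within the next 24 hours?
    flies_soon = (traveler.departure is not None
                  and timedelta(0) <= traveler.departure - datetime.now() <= timedelta(hours=24))
    if not flies_soon:
        return "no imminent travel found"
    return "additional screening" if flagged else "cleared"


# Example: a traveler whose face matches their ID and who flies in three hours.
omar = Traveler("Omar", [0.9, 0.4, 0.2], datetime.now() + timedelta(hours=3))
print(verify([0.88, 0.42, 0.21], omar, screening_list={"Haani"}))  # -> cleared
```

The point of the sketch is just that every input to the decision (ID photo, watchlists, booking data) is information the checkpoint already has, which is the argument the comment above is making.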

[–] NocturnalMorning@lemmy.world 38 points 11 months ago (2 children)

Facial recognition is bad for a multitude of privacy reasons. But the biggest problem is that it is also often wrong, and it is typically trained on biased data (which is almost impossible to completely remove).

[–] bobgusford@lemmy.world 7 points 11 months ago (1 children)

Sorry, this needs more clarification! Do you mean "intent recognition", where some AI trained with biased data will assume that some brown person is up to no good? Or do you mean that they will misidentify black and brown people more often due to how cameras work? Because the latter has nothing to do with biased data.

[–] yeather@lemmy.ca 6 points 11 months ago (1 children)

Both, in fact. Training data for things like this regularly mixes up minority people. If Omar is an upstanding citizen but gets his face mixed up with Haani, a known terrorist, Omar gets treated unfairly, potentially to the point of lethality.

[–] bobgusford@lemmy.world 3 points 11 months ago

For "intent recognition", I agree. A system trained on data of mostly black committing crimes might flag more black people with ill intent.

But for the sake of identification at security checkpoints, if a man named Omar - who has an eerie resemblance to Haani the terrorist - walks through the gates, then they probably need to do a more thorough check. If they confirm with secondary data that Omar is who he says he is, then the system needs to be retrained on more images of Omar. The bias was only that they didn't have enough images of Haani and Omar for the system to make a good enough distinction. With more training, it will probably be less biased and more accurate than a human.
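
A toy numerical illustration of that Omar/Haani confusion (invented three-number "embeddings" and an invented threshold, not output from any real recognizer): when two people's reference images sit too close together in embedding space, both can clear a single match threshold, which is exactly the false-positive case described above.

```python
from math import sqrt


def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))


# Invented three-number "embeddings" for two different people.
omar_live = [0.90, 0.40, 0.15]  # what the checkpoint camera just captured
gallery = {
    "Omar":  [[0.88, 0.42, 0.16]],                       # one enrolled image
    "Haani": [[0.86, 0.45, 0.20], [0.84, 0.47, 0.22]],   # watchlist images
}

THRESHOLD = 0.995  # a single global cutoff, also invented

for name, images in gallery.items():
    best = max(cosine(omar_live, img) for img in images)
    print(name, round(best, 4), "MATCH" if best >= THRESHOLD else "no match")

# With embeddings this close, both identities clear the threshold. The remedy
# is embeddings that separate people better (more, and more diverse, training
# and enrollment images), not merely a different cutoff.
```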

[–] paysrenttobirds@sh.itjust.works -2 points 11 months ago (1 children)

There is nothing in the article to suggest that the TSA program's errors have inconvenienced people, as the agent is right there to correct it, and more scans improve the accuracy. I get what you're saying, but the same biases are undoubtedly programmed into the brains of the agents and just as hard to eradicate.

There are many places I don't want to see facial recognition employed, but where people are already mandated to positively identify themselves seems like a natural fit. I think the senators and the ACLU can find much more persuasive examples of overreach.

[–] NocturnalMorning@lemmy.world 4 points 11 months ago (1 children)

You're free to offer up being tracked and monitored everywhere you go, but I'm not okay with that.

[–] paysrenttobirds@sh.itjust.works 1 points 11 months ago

My point is the bill would be more interesting if it was not restricted to the TSA in an airport, but maybe they have to start somewhere?

[–] inclementimmigrant@lemmy.world 8 points 11 months ago

Remember when those millimeter wave scanning machines rolled out and we were all reassured that the technology, which would create very detailed body scans, would blur out genitalia, would not be saved, and employees would not have access to the scanned data?

We then found out that nothing was blurred, the data was saved, it was available to TSA agents to be copied onto God damn flash drives, and it was being traded?

Yeah, notwithstanding all that, fuck this. I don't trust the shitty security theater that is the TSA not to abuse this technology, not to mention that facial recognition has a myriad of problems with false positives for POC due to the well-recognized racial bias baked into these systems by the programmers who build and train them.

[–] JimmyBigSausage@lemm.ee 17 points 11 months ago (2 children)

Japan uses this and scans you when you enter the country. You pass through quickly when you leave, and they know you are gone. It works but could definitely be misused in a country with lower standards of moral foundation.

[–] ChaoticEntropy@feddit.uk 14 points 11 months ago (1 children)

Are we supposed to pretend that Japan's government is somehow above reproach...?

[–] JimmyBigSausage@lemm.ee 1 points 11 months ago* (last edited 11 months ago) (2 children)

Not really. But culturally, it is very different. Robberies are basically unknown. If someone loses a wallet, people look to return it. People on the subway are quiet and respectful. Litter is uncommon. I have been there a couple of times and sometimes wish I could live there. It is not a perfect place by any means. Just like anywhere. But it is SO different than China, S. Korea, or the US. Edit. Spelling

[–] KevonLooney@lemm.ee 9 points 11 months ago (1 children)

People on the subway are quiet and respectful

This is the country with "women only" subway cars because there's so much sexual assault on the subway? You were a tourist. That's why they were polite to you.

[–] JimmyBigSausage@lemm.ee -4 points 11 months ago (2 children)

Not true. Never said I was a tourist. Not sure what country you went to?

[–] KevonLooney@lemm.ee 5 points 11 months ago

In Japan, women-only cars were introduced to combat lewd conduct, particularly groping (chikan).

https://en.wikipedia.org/wiki/Women-only_passenger_car

[–] LWD@lemm.ee 1 points 11 months ago* (last edited 11 months ago)
[–] sigmaklimgrindset@sopuli.xyz 2 points 11 months ago

People on the subway are quiet and respectful

Lmao I literally got someone trying to put their hand up my clothes THREE different times on the subway and my neighbour got mugged at knifepoint in the year I lived there. But sure, Japan is a crime-free utopia.

Not like Japanese cops are infamous for not prosecuting anything that isn’t a slam dunk or anything.

[–] mannycalavera@feddit.uk 10 points 11 months ago

It works but could definitely be misused in a country with lower standards of moral foundation.

No problem using it in the US then..... homerfadeintohedge.gif

[–] NeoNachtwaechter@lemmy.world 11 points 11 months ago* (last edited 11 months ago)

Once upon a time, an artist performed this at a train station in my city:

He walked up in front of one of the surveillance cameras, where it is known that they use face recognition all the time.

He was wearing a good-quality rubber mask resembling a human face. (Side note: it is forbidden to go there with your face covered.)

The mask looked like his own face.

Now the question is: did the camera see his face, or not?

[–] autotldr@lemmings.world 5 points 11 months ago

This is the best summary I could come up with:


The US Transportation Security Administration's plans to expand its use of facial recognition tech, already in use at several American airports, may be over before it begins if a newly introduced Senate bill becomes law.

The bipartisan Traveler Privacy Protection Act [PDF], SB 3361, was introduced this week by Senators Jeff Merkley (D-OR) and John Kennedy (R-LA), and would stop the TSA's use of facial biometrics dead in its tracks.

"Every day, TSA scans thousands of Americans' faces without their permission and without making it clear that travelers can opt out of the invasive screening," said Senator Kennedy.

Idemia's full suite of biometric technology was recently rolled out by Interpol, which used it to make its first biometric-based arrest of a suspected smuggler who presented false papers at a police checkpoint in Bosnia and Herzegovina.

The system can detect fake IDs "very quickly," a TSA official told us in July, and is also able to verify the person is on any additional screening lists and is actually scheduled to travel in the next 24 hours.

University of Illinois at Urbana-Champaign computer science professor and aviation security expert Sheldon Jacobson described the senators behind the proposal as well-intentioned, "albeit ill-qualified and ill-informed" to make such a call.


The original article contains 706 words, the summary contains 206 words. Saved 71%. I'm a bot and I'm open source!

[–] OutlierBlue@lemmy.ca 4 points 11 months ago

No facial recognition? That's fine. They still have the scanner that sees you naked. They can set up genital recognition systems instead.

[–] hackitfast@lemmy.world 1 points 11 months ago

I travelled recently, and was filled with disgust when I was told to scan my boarding pass and look into the camera to get in line. I would have asked to go through an alternative line but didn't want to miss my flight.

What the fuck is wrong with people

[–] Gregorech@lemmy.world -2 points 11 months ago (1 children)

I recently took a cruise to Mexico; when we returned, the customs line was very quick because of this technology. It scanned my face, gave me a green light, and I moved along.

Since we were on a dedicated ship to one port, I didn't even get the "where are you coming from and where are you going" questions.

[–] brlemworld@lemmy.world 6 points 11 months ago (1 children)

You must not be brown. This kind of technology is often very racist because of training data.

[–] Gregorech@lemmy.world 1 points 11 months ago

To be honest, no one did, since it was just a three-day trip to Mexico with only one port of call; everyone was just shuffled through.

I'm not saying it can't be misused, or that new tech isn't often riddled with issues. I appreciate not having to wait three hours to get through customs.

I can't imagine offloading one of the new 8000 passenger ships without this.