this post was submitted on 25 Nov 2023
732 points (97.3% liked)

Technology

59559 readers
3618 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; to ask if your bot can be added, please contact us.
  9. Check for duplicates before posting; duplicates may be removed.

Approved Bots


founded 1 year ago
MODERATORS
(page 2) 50 comments
[–] CCF_100@sh.itjust.works 12 points 1 year ago (1 children)

Okay, are they actually insane?

[–] janus2@lemmy.zip 7 points 1 year ago
[–] yardy_sardley@lemmy.ca 10 points 1 year ago (1 children)

For the record, I'm not super worried about AI taking over because there's very little an AI can do to affect the real world.

Giving them guns and telling them to shoot whoever they want changes things a bit.

[–] GutsBerserk@lemmy.world 9 points 1 year ago

So, it starts...

[–] Uranium3006@kbin.social 9 points 1 year ago (1 children)
[–] autotldr@lemmings.world 9 points 1 year ago

This is the best summary I could come up with:


The deployment of AI-controlled drones that can make autonomous decisions about whether to kill human targets is moving closer to reality, The New York Times reported.

Lethal autonomous weapons that can select targets using AI are being developed by countries including the US, China, and Israel.

Critics say the use of so-called "killer robots" would mark a disturbing development, handing life-and-death battlefield decisions to machines with no human input.

"This is really one of the most significant inflection points for humanity," Alexander Kmentt, Austria's chief negotiator on the issue, told The Times.

Frank Kendall, the Air Force secretary, told The Times that AI drones will need to have the capability to make lethal decisions while under human supervision.

The New Scientist reported in October that AI-controlled drones have already been deployed on the battlefield by Ukraine in its fight against the Russian invasion, though it's unclear if any have taken action resulting in human casualties.


The original article contains 376 words, the summary contains 158 words. Saved 58%. I'm a bot and I'm open source!

[–] 5BC2E7@lemmy.world 9 points 1 year ago (5 children)

I hope they put in some failsafe so that it cannot take action if the estimated casualties would put humans below a minimum viable population.

[–] sukhmel@programming.dev 5 points 1 year ago (2 children)

Of course they will, and the threshold is going to be 2 or something like that; it was enough last time, or so I heard.

[–] rustyriffs@lemmy.world 9 points 1 year ago (21 children)

Well that's a terrifying thought. You guys bunkered up?

[–] cosmicrookie@lemmy.world 8 points 1 year ago (2 children)

The only fair approach would be to start with the police instead of the army.

Why test this on everybody else before your own? On top of that, AI might even do a better job than the US police.

[–] ElBarto@sh.itjust.works 8 points 1 year ago

Cool, I needed a reason to stay inside the bunker I'm about to build.

[–] FlyingSquid@lemmy.world 6 points 1 year ago (3 children)

I'm guessing their argument is that if they don't do it first, China will. And they're probably right, unfortunately. I don't see a way around a future with AI weapons platforms if technology continues to progress.

[–] afraid_of_zombies@lemmy.world 6 points 1 year ago

It will be fine. We can just make drones that can autonomously kill other drones. There is no obvious way to counter that.

Cries in Screamers.

[–] tsonfeir@lemm.ee 5 points 1 year ago

If we don’t, they will. And we can only learn by seeing it fail. To me, the answer is obvious. Stop making killing machines. 🤷‍♂️

[–] gandalf_der_12te@feddit.de 5 points 1 year ago (9 children)

Netflix has a documentary about it; it's quite good. I watched it yesterday, but forgot its name.

[–] Rockyrikoko@lemm.ee 8 points 1 year ago

I think I found it here. It's called Terminator 2: Judgment Day
