this post was submitted on 10 Dec 2023
177 points (94.0% liked)

Technology

all 20 comments
[–] streetfestival@lemmy.ca 33 points 11 months ago (1 children)
[–] BrianTheeBiscuiteer@lemmy.world 22 points 11 months ago (1 children)

"The AI gave us bad advice. Our hands are clean."

[–] Ultraviolet@lemmy.world 9 points 11 months ago (2 children)

There should be some sort of law where if you want to offload decisions to AI, the person who decides to let the AI make those decisions needs to step up to take full civil and criminal liability for everything it does.

[–] orrk@lemmy.world 5 points 11 months ago (1 children)

so you only really need one martyr for the cause?

[–] jonne@infosec.pub 2 points 10 months ago

Yes, one person we can pin all of humanity's sins on, and then we just kill them. It's almost like a religious ritual.

[–] ForgotAboutDre@lemmy.world 2 points 10 months ago

No, every decision maker in the chain of command should be responsible. They should know what the intelligence is based on and whether the people sharing it are competent, and they should be validating the information themselves.

Using AI to perform these tasks requires gross negligence at several stages. However, it does appear that killing civilians and children is the intended outcome, so the negligence around AI is likely just a cover.

[–] match@pawb.social 31 points 11 months ago (1 children)

"as humans come to rely on these systems they become cogs in a mechanised process and lose the ability to consider the risk of civilian harm in a meaningful way."

that's not even an AI problem, that's a regular society problem

[–] SkyezOpen@lemmy.world 8 points 11 months ago (1 children)

And since "ai" isn't actual artificial intelligence but rather a neural net trained on data from that society, it's only going to reinforce existing issues. Remember the articles about crime ai being racist? That's because they fed policing statistics as the training set, and uh.. I'm sure you know the rest.

[–] LWD@lemm.ee 0 points 10 months ago* (last edited 10 months ago)
[–] crazyCat@sh.itjust.works 11 points 11 months ago

This is fucking insane dystopian shit, it’s worse than I thought and has become real sooner than I thought it would, bloody hell.

[–] frontporchtreat@lemmy.ca 10 points 10 months ago

Yeah, we're getting really good at teaching computers to analyze satellite imagery and other forms of spatial data to find the spots we want. All we have to do is decide whether we put green spaces, Walmarts or bombs in those spots.
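As a toy illustration of the "find the spots we want" step (plain Python with NumPy; the suitability raster here is random noise standing in for real model output over satellite bands), the generic core is just scoring patches of a spatial grid and ranking them:

```python
import numpy as np

np.random.seed(0)

# Stand-in "suitability" raster; a real pipeline would derive this
# from satellite bands (vegetation index, built-up index, etc.).
raster = np.random.rand(100, 100)

def top_sites(grid: np.ndarray, k: int = 5, window: int = 5):
    """Score non-overlapping window-sized patches by mean suitability
    and return the k best candidate locations."""
    scores = []
    h, w = grid.shape
    for i in range(0, h - window + 1, window):
        for j in range(0, w - window + 1, window):
            scores.append((grid[i:i + window, j:j + window].mean(), (i, j)))
    return sorted(scores, reverse=True)[:k]

for score, (row, col) in top_sites(raster):
    print(f"candidate patch at ({row}, {col}), score {score:.3f}")
```

Whether the top-ranked patches get a park, a store, or a strike is exactly the human decision being pointed at here; nothing in the ranking code knows or cares.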

[–] Zehzin@lemmy.world 4 points 11 months ago

That explains a lot

[–] autotldr@lemmings.world 4 points 11 months ago

This is the best summary I could come up with:


As Israel resumes its offensive after a seven-day ceasefire, there are mounting concerns about the IDF’s targeting approach in a war against Hamas that, according to the health ministry in Hamas-run Gaza, has so far killed more than 15,000 people in the territory.

The latest Israel-Hamas war has provided an unprecedented opportunity for the IDF to use such tools in a much wider theatre of operations and, in particular, to deploy an AI target-creation platform called “the Gospel”, which has significantly accelerated a lethal production line of targets that officials have compared to a “factory”.

The Guardian can reveal new details about the Gospel and its central role in Israel’s war in Gaza, using interviews with intelligence sources and little-noticed statements made by the IDF and retired officials.

This article also draws on testimonies published by the Israeli-Palestinian publication +972 Magazine and the Hebrew-language outlet Local Call, which have interviewed several current and former sources in Israel’s intelligence community who have knowledge of the Gospel platform.

In the IDF’s brief statement about its target division, a senior official said the unit “produces precise attacks on infrastructure associated with Hamas while inflicting great damage to the enemy and minimal harm to non-combatants”.

Multiple sources told the Guardian and +972/Local Call that when a strike was authorised on the private homes of individuals identified as Hamas or Islamic Jihad operatives, target researchers knew in advance the number of civilians expected to be killed.


The original article contains 1,734 words, the summary contains 241 words. Saved 86%. I'm a bot and I'm open source!