A series of investigations has revealed the Israeli military's use of an artificial intelligence program, named Lavender, to identify targets in Gaza since October. The AI system has marked approximately 37,000 Gazans as potential Hamas-linked targets, with reports indicating a 10% error rate in identification. The program, operated with minimal human oversight, has led to systematic targeting, including in family homes, raising significant ethical and legal concerns. According to sources, the Israeli Defense Forces (IDF) had pre-authorized allowances for civilian casualties, permitting 15 to 20 civilian deaths in strikes on low-ranking militants and up to 100 for senior commanders. These revelations have sparked international scrutiny and debate over the use of AI in warfare, with the United States investigating the reports. The IDF, however, has denied using AI software to target individuals in Gaza bombing campaigns. Additionally, a second AI system, called "Where's Daddy?", was reported as part of the targeting process.
Report: Israel Uses AI to Target Palestinians for Assassination https://t.co/hbyCBAgLIt
NEW: Did you know that #Israel has used AI with minimal human influence to target tens of thousands of Palestinians, many with little or no connection to Hamas? @connor_echols Q&A w/ @br_rosen https://t.co/VYLquNtW9U
US looking at media report that Israel used AI to identify bombing targets in Gaza - Reuters
The United States was looking into a media report that the Israeli military has been using artificial intelligence to help identify bombing targets in Gaza, White House national security spokesperson John Kirby told CNN in an interview on Thursday. https://t.co/gUznemOmbj
THE UNITED STATES IS INVESTIGATING A MEDIA REPORT ALLEGING THAT THE ISRAELI MILITARY UTILIZED ARTIFICIAL INTELLIGENCE TO IDENTIFY BOMBING TARGETS IN GAZA, ACCORDING TO WHITE HOUSE NATIONAL SECURITY SPOKESPERSON JOHN KIRBY IN AN INTERVIEW WITH CNN ON THURSDAY - SOURCES
The United States is looking into a media report that the #Israeli military has been using artificial intelligence to help identify bombing targets in #Gaza, White House national security spokesperson John Kirby tells CNN in an interview. https://t.co/wqOA5UMTWe https://t.co/VeJu3mvCXp
WHITE HOUSE'S KIRBY ON REPORTED USE OF AI FOR BOMBINGS BY ISRAEL IN GAZA: U.S. LOOKING INTO THAT REPORT, HAS NOT VERIFIED ITS CONTENT - CNN INTERVIEW. Real AI use case.
The Israeli military used an artificial intelligence program known as "Lavender" to develop a "kill list" in Gaza with little human oversight. A second AI called "Where's Daddy?" was designed to target Palestinian men on the list at home at night with their families. https://t.co/yx3jFxE21E
One of the programs is called "Where's Daddy?" No joke. ‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza https://t.co/Vy5qDkvWTy
Must read ⬇️ Lavender’: The AI machine directing Israel’s bombing spree in Gaza https://t.co/Vy5qDkvWTy
Israel using AI to pick targets in Gaza – report The program is reportedly designed to detect Hamas operatives, but Israeli military sources say it often marks innocents for death https://t.co/JCTVHklMoC https://t.co/fRuDNBcFeZ
The AI nightmare is it becomes autonomous killing all it identifies as an enemy @972mag reports that Israel has developed an AI targeting system with little human oversight We need a moratorium on AI targeting and ban weapon sales to all that use them https://t.co/BtHOBvgVKT
'The IDF uses artificial intelligence to cross-reference its database and then from there human operators look at all of the risks' Military analyst @YaakovLappin breaks down the 'fake news' circulating that Israel uses AI to determine targets in #Gaza https://t.co/TKyTaIlo9u
Am I crazy for thinking the “AI targeting” story (which fundamentally presupposes that the IDF are striking legitimate targets “disproportionately”) is a kind of “limited hangout” which serves to distract from incidents like this where civilians are being deliberately massacred? https://t.co/TIE2dw0LPB
Israeli army used controversial 'Lavender' AI system to create 'kill list' of Palestinian militants and bomb 37,000 targets, report claims https://t.co/cOosiKIKCt https://t.co/YZSRygYZid
Report: Israel used AI to identify bombing targets in Gaza https://t.co/wQL2qi2LOd
Israel's reported use of AI in its Gaza war may explain thousands of civilian deaths. https://t.co/rueC1ICH0Y
The Israeli military’s bombing campaign in Gaza used a previously undisclosed AI-powered database that at one stage identified 37,000 potential targets based on their apparent links to Hamas https://t.co/Cp6ug1RuZc
“War crimes.” Two Israeli media outlets report the Israeli military’s use of an AI-assisted system called Lavender to identify Gaza targets https://t.co/6fWr8zuZfq https://t.co/2xSmysZ1Vt
AI Kill List - The Future of Targeted War is Here Israel's "Lavender" AI targeting of tens of thousands for assassination. And, killing them at home to take out the entire family — the "Where's Daddy" program. https://t.co/dnSrclgfos
Explosive allegations that Israel has a secret AI-powered killing machine called ‘Lavender’ spread on Wednesday in a pair of news reports citing anonymous intelligence sources involved in the Hamas-Israel war. https://t.co/eunwYgWo3o
Israel under pressure over use of AI in #IsraelHamasWar 📹— @ArielLW_i24 https://t.co/l4hILjjPcZ
🔴 Israel denies using AI to target 37,000 people in Gaza with airstrikes https://t.co/0XyM8w6BsR
On #AI use by the Israeli military in Gaza--2 links: https://t.co/CjEeIf5ZWQ and https://t.co/jBP233VvUt #ethics #tech #warfare
The Guardian @guardian: Israel Used AI Machine 'Lavender' To Target Hamas - Outside the Beltway. #aistrategy #ArtificialIntelligence #aiact https://t.co/QIKhDAivMt
Israeli Military Uses AI to Target Palestinians for Assassination https://t.co/hbyCBAgLIt
"AI (is) removing the resource constraints that in the past would prevent the IDF from identifying enough targets. Now they're able to go after significantly lower targets with tenuous or no connections at all to Hamas" -- @rosen_br https://t.co/SZYyXTy2D1
Real or Ai? I don't know anymore... https://t.co/PjzPXiko6f
Israel’s argument that they kill so many civilians b/c Hamas uses “human shields” is torn apart by the revelation that the IDF prefers to attack its “targets” when they are at home with their families. It is not Hamas using human shields, it is Israel deliberately hunting… https://t.co/AbWppvdp4j
Israel's using AI to target Palestinians. A very clever trick, so then they can blame the AI, instead of taking the blame themselves. https://t.co/q5Po6pkVmt
Or, to be pedantic, when laughably imprecise “AI” systems probabilistically determined someone was a worthy target of death (think ad targeting but for killing) IDF waited until they went home and killed them and their family and their neighbors and their pets and etc. https://t.co/p6wEhaArMN
🇵🇸🇮🇱‼️🚨 BREAKING: Israel uses an AI for targeting and allows up to 20 civilian deaths per approved strike! Israel’s Military Bombing Campaign Uses AI-Powered Database – Which “Identified” 37K Potential Hamas-Linked Targets The Lavender system processed masses of data which was used to… https://t.co/KtRjQOUOww
‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets https://t.co/YGyouf1PtX
A new investigation by +972 Magazine and Local Call reveals that the Israeli army has developed an artificial intelligence-based program known as "Lavender". The two magazines, quoting six Israeli intelligence officers, who have all served in the occupation army during the… https://t.co/Bx3yLEQUck
The Guardian @guardian: Israel is using artificial intelligence to help pick bombing targets in Gaza, report says. #industry40 #MachineLearning #ArtificialIntelligence https://t.co/lPeRfHNr7I
‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza The system has been used to identify and attack 37,000 targets, even though about 10% of them are misidentified. There was no ‘zero-error’ policy; mistakes were treated statistically. The machine gives almost… https://t.co/vRXl7MbmWe
The Guardian @guardian: Israel disputes it has powerful AI program for targeted killing that tolerates civilian casualties. #MachineLearning #aistrategy #ArtificialIntelligence https://t.co/0E1jleMbcj
Must-read reporting by +972 on the IDF using “AI” in their indiscriminate murder in Gaza. An absolute nightmare of AI harms cranked up to the extreme: mass surveillance, "we don't have any choice but to automate", AI as pretext for deadly violence. https://t.co/0OHGM3jDCH
Israel is aggressively disputing assertions that it is using an artificial intelligence system for a targeted killing program that tolerates civilian deaths as acceptable collateral damage in its war against Hamas. https://t.co/WnbyoACrNa
So you know how we've been having this big debate for several years now about the ethics of giving military AI systems release authority for lethal weapons? Discussion's over. Israel's been doing AI targeting for months now and their soldiers appear to have been blindly… https://t.co/HyDeq83P6z
Another blockbuster investigative report from @yuval_abraham and @972mag on Israel's use of AI systems in Gaza. https://t.co/IpafiRB8qY
An extraordinary piece in Israel's @972mag reporting on the use by Israeli forces of deeply flawed AI processes to choose targets in Gaza and then destroy them with dumb bombs that pretty much guaranteed enormous numbers of civilian casualties. https://t.co/wkAw53NK63
Israel disputes it has powerful AI program for targeted killing that tolerates civilian casualties https://t.co/dZ7PT0zIAa
IDF denies it uses AI software to target individuals in Gaza bombing campaigns https://t.co/54vfCNbMvI
An investigation reported by the Guardian cited intelligence sources as stating that Israel used an AI called Lavender to target over 37,000 individuals in Gaza, accusing them of being connected to Hamas, and permitted large numbers of Palestinian civilians to be killed. https://t.co/6SBcOVIsHG
WTF, Israel has an AI targeting system called “Where’s Daddy?”, it waits until suspected Hamas members are in family home & then orders human pilots (in US-made planes) to drop (US-made) bombs on building— killing the target, their family & neighbors. Rinse & repeat 1000 times. https://t.co/cxltXf0GjF https://t.co/bsyZ7ImOmz
The Israeli army is using an artificial intelligence machine, Lavender, to select targets for assassination in Gaza. The choices are given only cursory human review, mainly to check if they are male, despite a known error rate of 10% (probably much more). https://t.co/AsWbWuw8Df
‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza https://t.co/78DnvRFahQ
As we embark on more and more powerful AI, we also have to be concerned about the inherent dangers and how AI can be used for nefarious reasons. There are two AI systems being used by the Israeli military from the article below written by journalist Yuval Abraham. The first… https://t.co/Gceo8m36F0
If you're publicly speculating about how AI systems might one day exterminate all of us, maybe speak up against how AI systems are being used now to exterminate Palestinians. 'Lavender' - IDF AI to generate a kill list of individuals https://t.co/rnARNPYugQ
The Guardian @guardian: AI News: Dark Side Of AI Uncovered In Israel-Hamas War - CoinGape. #AI #industry40 #aistrategy https://t.co/Z7zxXHPKxM
Sources: Israel's bombing campaign in Gaza used Lavender, an AI system that identified 37,000 potential human targets based on their apparent links to Hamas (The Guardian) https://t.co/V5IaujdeKu 📫 Subscribe: https://t.co/OyWeKSRpIM https://t.co/uofSqChdMR
using ai to determine war targets? israel in uncharted waters and testing efficacy in real time in gaza war https://t.co/xwfmu3lPEH
This story is horrific, and basically confirms every one of the fears we had back in October when we knew the IDF was using AI—it's error-prone, there's scant human oversight, and it facilitates the mass rubber stamping of targets, of mass killing. https://t.co/LnwPGVUFAn
Israeli Military Using AI to Select Targets in Gaza With 'Rubber Stamp' From Human Operator: Report https://t.co/fHaogXhzAe https://t.co/UmuGiO8mQe
Where are the "AI safety" folks re Israel's militarization & misuse of the tech?
'During the first weeks of the war, Israel's AI-based targeting system called "Lavender" (I think this is a typo for "Lucifer") clocked as many as 37,000 Palestinians as 'suspected' militants — and their homes — for possible air strikes.' https://t.co/UmUTJ62Jvc
The Guardian @guardian: Israel Used Lavender AI To Zero In On Hamas Targets, Say Intel Officers, IDF Denies Claims. #MachineLearning #aiact #industry40 https://t.co/Hen2cSLdDU
NEW investigation in #Israel finds @IDF AI system favors strikes on #Hamas member homes -- with 15-20 civilian deaths acceptable in strikes on fighters & ~100 civilian deaths for #Hamas commanders. An AI system called "Where's Daddy" triggers a strike when a suspect comes home. https://t.co/AqIcUi1wxF
Frightening use of AI https://t.co/QUAKtaSBW8
Israel used an automated system called "Where's Daddy?" to ensure it killed AI-designated targets while the targets were at home with their spouses and children: https://t.co/E18lbV8Dif
Horrific account of near-indiscriminate bombing in Gaza 'directed' by an AI machine. https://t.co/Zv4YmFBnsg
Really must-read reporting by @yuval_abraham on the Israeli government's use of AI tech and the bombing of Gaza https://t.co/7wfWqJwxV1
Gaza aid worker deaths heighten scrutiny of Israel’s use of AI to select targets https://t.co/iNfTL4R9ey
Description of an AI system advising the Israeli army on the choice of targets. Allegedly it was to be used to target people in their homes (among their families etc.), sometimes with 20 seconds to decide about an attack. Supposedly a statistical recommender system. https://t.co/ePUwFMKsSg https://t.co/z7IgWyuRmQ
https://t.co/ucEWSCijYN "During the first weeks of the war, the army allowed for the killing of up to 15 or 20 civilians for every junior Hamas operative targeted. For senior officials, the permissible collateral damage was even higher, with more than 100 civilians allowed to be…
Another deeply disturbing expose about how the Israeli military is using artificial intelligence machines to target Palestinians en masse in Gaza. Start Up Genocide. https://t.co/50BM1XN6G3 https://t.co/KUb7zfjXzS
Genocidal AI: "In an unprecedented move, according to two of the sources, the army also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians" https://t.co/2JDwRtzLp6
This report describes, in detail, the horrors of machine-based targeting in warfare. Specifically an AI program called "Lavender" developed by the Israeli army, to generate targets for strikes in Gaza, often in people's homes. https://t.co/9jVOrfbJsR
This is without a doubt one of the most important pieces of reporting on Gaza, and by far one of the most disturbing. All by Israeli journalist @yuval_abraham based on Israeli whistleblower accounts from within the IDF and intelligence agencies. Israel has developed an AI called… https://t.co/aEcOxuT5tX
🚨Absolutely stunning new investigation by @yuval_abraham of @972mag. TLDR: In Oct, IDF expanded kill lists from "top commanders" to "everyone Hamas"—37k ppl. Via a slap-up AI bot. And it increased quota of permitted collateral damage from "zero" to "20" civilians per target.
Gaza aid worker deaths heighten scrutiny of Israel’s use of AI to select targets ➡️ https://t.co/pg4i685SPz https://t.co/HCUYg4lmUi
Fake AI. https://t.co/O5dAUwdpxb
The Israeli military is using an AI targeting program called Lavender that tagged around 37,000 Gazans as suspected militants, has around a 10% error rate, and led to systematic targeting of suspects in their family homes, +972 Magazine reports: https://t.co/ykfJVSGsZy
AI IDing human targets in Gaza based on probabilistic stereotypes (37K people!). People familiar w/AI know how inaccurate such assessments are. Targets' homes are then bombed at night when they (their family, neighbors, pets) are most likely to be home https://t.co/ApQqiciMbK https://t.co/ZFjEklpNfk
🤖🇮🇱 'AI in warfare': Israel's AI tool Lavender identified a staggering 37k potential Hamas targets. The revelations raise ethical, legal implications & push us further into uncharted AI territory. https://t.co/ofZs1XuOws
Israel's "unprecedented" calculations for how many Palestinian civilians the IDF is authorized to kill in order to kill a single suspected Hamas operative: 10-15 civilians for every junior operative. 100 civilians for a single "senior" commander. https://t.co/TjpI2o0wAs https://t.co/90naI0tOAI
I spoke with Israeli intelligence officers about the AI-based target machine they used which marked 37,000 Gazans as suspects for assassination. These whistleblowers expose numerous machines & policies that killed thousands of civilians since October. https://t.co/462q4P6y2q
#Israel used #AI to identify 37,000 #Hamas targets - “the IDF applied pre-authorised allowances for the number of civilians who could be killed.” 2 sources said they were permitted to kill 15 or 20 civilians during airstrikes on low-ranking militants. https://t.co/bwZila0syd
New investigation by @yuval_abraham in @972mag & @mekomit reveals that the Israeli army has developed an AI program known as “Lavender” to mark 37,000 Gazans as suspects for assassination, with little human oversight and a permissive policy for casualties. https://t.co/x9wqQWbMJC
‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza. An investigation by @yuval_abraham, in partnership with @mekomit. https://t.co/Prxl1MydmC