
Israel-Palestine War: How the ‘Habsora’ AI System Masks Random Killings with Math


Israel’s war on Gaza has seen it strike the Palestinian enclave with a new and deadly ferocity. According to a recent report, this attack is powered by an artificial intelligence system that experts say is indiscriminate and inherently flawed.

As part of a joint investigation, the Israeli media outlets +972 Magazine and Local Call interviewed several former and current Israeli intelligence officials, revealing that the military has lowered its standards for limiting civilian casualties.

The relaxation of the rules was combined with the use of “Habsora” (“The Gospel” in Hebrew), an AI system capable of generating targets at a faster rate than before, leading one former intelligence officer to call it a “mass assassination factory”.

Officials admitted to the media outlets that the homes of low-ranking members of Hamas and other Palestinian armed factions had been deliberately targeted, even if it meant killing everyone in the building.

In one case, Israeli military intelligence approved the killing of hundreds of Palestinians to assassinate a single Hamas member.


“This is the first time they are talking about how civilians are being targeted on a large scale just because they hit a military target based on AI technology,” Anwar Mhajne, a political science professor at Stonehill College in Massachusetts, told Middle East Eye.

“Not feasible at all”

When discussing the Habsora system, a source told the Israeli outlets that the emphasis was on quantity, not quality. They added that even though a human eye examines the targets before each attack, it does not spend much time on them.

“If you are waging a war on the scale of the one being waged now in Gaza, how much can you really review?” Mhajne said.

An Israeli expert on the military use of AI, who spoke to MEE on condition of anonymity, said it is “not at all feasible” to have a human examine every AI-generated target in Gaza.

He added that the algorithm does not explain how it arrives at its conclusions, making it difficult to verify the validity of a strike’s outcome.


While Israel estimates that Gaza is home to around 30,000 Hamas members, experts are concerned about the massive civilian casualties that can be caused by the use of these systems.

The Israeli army estimates that it has killed between 1,000 and 2,000 Hamas members in Gaza since October 7. More than 15,000 Palestinians were killed during this period, including at least 6,150 children.

“We are talking about thousands of civilians who have been killed (due to the use) of such technology,” said Mona Shtaya, a non-resident fellow at the Washington-based Tahrir Institute for Middle East Policy.

“A larger surveillance system”

According to Shtaya, Israel’s use of AI as a military and surveillance tool is neither new nor unexpected.

“AI is part of a larger surveillance system, in which Palestinians live under constant surveillance,” she said.

In 2021, a Washington Post investigation found that Israeli soldiers used an extensive facial recognition programme to increase their surveillance of Palestinians in the occupied West Bank city of Hebron. The army also installed facial recognition cameras throughout the city “to help soldiers at checkpoints identify Palestinians before they even show their ID cards”.

The same year, Amazon Web Services and Google signed a $1.2bn deal with the Israeli government known as Project Nimbus. Employees of the two companies warned that the cloud service “enables increased surveillance and illegal data collection on Palestinians, and facilitates the expansion of Israel’s illegal settlements on Palestinian lands”.

Israel also reportedly used AI during its previous major offensive in Gaza in 2021, in what it called “the world’s first AI war”. During the 11-day assault, drones reportedly killed civilians, damaged schools and medical clinics, and flattened high-rise buildings.

Today, more sophisticated systems are used in the Gaza war, going so far as to predict the number of civilian casualties a strike would cause.

“Nothing happens by accident,” a source told +972 Magazine and Local Call. “When a three-year-old girl is killed in a house in Gaza, it is because someone in the army decided that it was okay for her to be killed – that it was a price worth paying to hit (another) target. We are not Hamas. These are not random rockets. It’s all intentional. We know exactly how much collateral damage there is in every house.”

“They have a testing ground”

The current war began when Palestinian factions led by Hamas launched an attack on Israel, killing more than 1,200 Israelis and taking around 240 prisoners. Israel responded by heavily bombing the Gaza Strip and invading the coastal enclave, destroying much of the civilian infrastructure.

Sources involved in the investigation said they believed the widespread killing and destruction could be used to give the Israeli public an image of victory. Mhajne believes this goal also extends to the image of Israeli technology.


“The Hamas attacks showed the weaknesses of AI in surveillance,” she said.

She said Hamas’s ability to enter Israel unnoticed after its fighters disabled signal towers around the Gaza Strip had seriously damaged the reputation of Israeli surveillance technology.

Israeli spyware has notably been used in many countries to target journalists and activists.

Israel is also the world’s 10th-largest arms exporter, with a particularly strong reputation in cybersecurity and AI weaponry.

“They are testing things on Palestinians. This is why Israel is leading in developing cybersecurity and AI, because they have a testing ground,” Mhajne said.

“No one questions them about how they develop it and how they test it. I guarantee you that after the war, this technology will be sold to every repressive regime you know.”

Shtaya agrees, saying that AI warfare technologies such as Habsora are “simply used to speed up and facilitate their work in destroying the Gaza Strip”.

Even if the system remains strictly in the hands of the Israeli army for now, the Israeli expert believes that will change.

“In the future, people who work there will go to the private sector, make similar things and export them, that’s for sure,” he said, noting that Israeli arms sales have already soared. “This war is already a boon for Israeli arms dealers and their exports.”

‘There are no limits’

While many are calling for Israel to be held accountable for its actions in Gaza, with UN bodies warning it could lead to accusations of war crimes and genocide, holding it accountable for its use of AI could prove more complicated.

While some governments and international organizations regulate the use of AI for military purposes by asserting that it must remain within the bounds of international law, there are few if any regulations specific to AI in warfare.

Moreover, Israel has so far shown no signs of regulating its use of this new technology, even if it means killing more civilians.

“Because Israel now considers Hamas an existential threat, there are no limits,” the Israeli expert told MEE, suggesting the military could go as far as killing Israeli captives if that allowed it to reach Hamas’s most senior commanders.

“AI certainly gives the military an illusion of precision and mathematical analysis, which is false,” he said. “All the human flaws the algorithm has learned from are automated.”

The International Committee of the Red Cross believes that AI can be a tool to make better decisions in conflicts and avoid civilian casualties. Shtaya also believes that these technological advances, when used correctly, can generally improve people’s quality of life.

“It is painful and devastating to see this type of technology used by the state to oppress people, make their lives more difficult and subject them to collective punishment,” she said.