
How Israel harnesses technology to advance its offensive in the Middle East

From data mining to air strikes, Israeli warfare may be more digital than previously thought.
Published October 7, 2024

In September, thousands of pagers exploded across Lebanon in what seemed to be a sophisticated attack planned months in advance by Israel, turning the spotlight on the country’s cyber capabilities and its use of artificial intelligence (AI) in warfare.

Since October 7, 2023, Israel has shown no signs of slowing down its military rampage on multiple fronts. It is currently engaged in conflicts with Hezbollah in Lebanon, Hamas in the Gaza Strip, and the Houthis in Yemen.

Despite the Israeli military’s claims of using “precision strikes” and acquiring a host of new technologies to specifically target Hamas members and limit civilian casualties, the death toll in Gaza now exceeds 41,000, raising a critical question: how does the Israeli military decide on its targets, and what role does advanced technology play in its operations?

Worryingly, Racheli Dembinsky, commander of Israel’s Centre of Computers and Information Systems unit, termed Israel’s offensive in Gaza its “first digital war” at a conference called IT for IDF, with soldiers claiming they were fighting “from inside their laptops”.

In a video of Dembinsky’s talk posted on YouTube, the logos of big tech companies such as Microsoft Azure, Google Cloud, and Amazon Web Services (AWS) appear, pointing to Israel’s contracts with these companies for their cloud services.

Israeli army terms the onslaught in Gaza its ‘first digital war’, with soldiers claiming they are fighting ‘from inside their laptops’.

While multiple reports have confirmed that Israel uses the cloud to store large amounts of data, it is important to note that much of this data serves the increased surveillance of Palestinians.

The elusive Unit 8200

Israel has a specialist cyber warfare and intelligence unit: the secretive and elusive Unit 8200. According to a Forbes report, 90 per cent of intelligence reports in Israel come from this unit alone, and its espionage footprint is visible in nearby states.

Unit 8200, Shmone Matayim in Hebrew, is said to be part of the Israeli Military Intelligence Directorate and came into the spotlight after the pager attack in Lebanon. Its activities are usually considered “highly secretive and range from signals intelligence to data mining and technological attacks and strikes”, according to Reuters.

It is also claimed that its personnel are recruited from young people in their late teens and early 20s, some identified through “highly competitive high school programmes”, with many going on to careers in Israel’s tech and cyber security sector.

In fact, it was one of the unit’s officials who claimed that Israel has effectively used AI to identify Palestinian targets, saying the military has relied on an AI system called Lavender to mark potential Hamas members since the conflict began.

How does Lavender work?

Testimonies given by Israeli officials to +972 Magazine show that Israel used the Lavender system in the early stages of the conflict, and that it can “process massive amounts of data to generate thousands of potential targets for military strikes”.

It is directly involved in generating targets for the army, according to the +972 Magazine report, and it played a massive role in the “unprecedented” bombing of the Gaza Strip in the early stages.

An image showing a high-tech military command center utilising advanced technology for real-time data analysis and strategic decision-making. — Image generated by ChatGPT

The report, citing an Israeli investigative journalist, said that Lavender assigns each Palestinian a number ranging from 1 to 100, based on features defined by the Israeli military, reflecting how likely that person is to be affiliated with Hamas, with 100 being the most likely.

Israel used an AI system called Lavender to lock in Hamas targets and carry out an ‘unprecedented’ bombing of the Gaza Strip in the early stages of the conflict.

However, the system also brought in people who had “loose connections” or “no connections” to Hamas, with one source saying it even included people who simply shared a name with a Hamas member.

The system is said to have marked some 37,000 Palestinians as targets in the early stage of the conflict alone. As a database, it was reportedly used to cross-reference intelligence sources and produce updated target lists for military operations.

Reports also suggest that the Israeli military “fine-tuned” the parameters of the system’s algorithm to produce potential Hamas targets, with one source saying it achieved about a “90 per cent accuracy rate”.

That is not all: Aviv Kochavi, the former head of the IDF, said that the army’s target division was “powered by AI capabilities”, referring to a system, now known to be Lavender, which produced “a vast amount of data” and translated it into targets for strikes.

To give a sense of scale, Kochavi said that in 2021 the system generated 100 targets per day, whereas the army could produce only 50 targets a year without it. It did so by analysing drone footage and intercepted communications, and by monitoring the movements of Palestinians.

However, experts have questioned the effectiveness of AI in reducing civilian harm and in specifically targeting Hamas members.

Just days into the conflict, Amnesty International published a report titled “Damning evidence of war crimes as Israeli attacks wipe out entire families in Gaza”, in which it stated that “Israeli forces have shown a shocking disregard for civilian lives”.

“They have pulverised street after street of residential buildings killing civilians on a mass scale and destroying essential infrastructure,” the report said.

According to Amnesty, although the IDF claimed it attacked only military targets, the organisation found no evidence of the presence of any fighters in the vicinity of the attacks.

Gospel

It doesn’t end at Lavender. Israel is said to have another AI decision support tool called Gospel — or Habsora in Hebrew.

While Lavender is said to suggest individuals, including people marked as low-ranking Hamas members, Gospel is said to suggest buildings as targets rather than individuals.

The Israeli army is said to use the system to apply its Dahiya doctrine, a military strategy that calls for the mass destruction of infrastructure to exert civil pressure on the local authority.

The IDF has confirmed using Gospel. In a statement, it said that the system “allows the use of automatic tools to produce targets at a fast pace, and works by improving accurate and high-quality intelligence material according to the requirement”.

In a press release titled “A glimpse of the IDF target factory that works around the clock”, the IDF stated that with the support of AI and updated intelligence, the system “produces a recommendation for the researcher”.

AI system ‘Gospel’ targeting infrastructure in Gaza with precise identification of buildings and key locations for strikes, utilising advanced holographic visuals and satellite imagery. — Image generated by ChatGPT

However, despite having system-marked targets, an Israeli official said the army preferred to use “unguided” missiles, rather than precision strikes, when targeting alleged low-ranking Hamas officials, with permission given to accept 15 to 20 civilian casualties if needed.

An Israeli official says the army preferred using unguided missiles over precision strikes when targeting low-ranking Hamas officials, with permission to allow 15 to 20 civilian casualties if necessary.

One intelligence source, according to the report, even went as far as to say that they didn’t “want to waste expensive bombs on unimportant people — it’s very expensive for the country and there’s a shortage [of those bombs].”

‘Where’s Daddy’ system

However, this is not where Israel’s surveillance ends. According to Democracy Now!, in addition to Lavender, a second system called “Where’s Daddy” tracks targeted Palestinians when they are at home with their families.

In an interview with the publication, investigative journalist Yuval Abraham explained the concept of “linking” used in the mass surveillance of Palestinians.

He cited one source who said that to automate such systems, you have to quickly get an ID of a person and be able to link that ID to “other stuff”, a house being the most convenient. The system is therefore “designed to be able to automatically link between individuals and houses”.

‘Where’s Daddy’ AI system monitoring families in Gaza during daylight, using advanced surveillance technology to track movements in private homes. — Image generated by ChatGPT.

Systems like “Where’s Daddy?” alert intelligence officers when alleged fighters marked by AI enter their houses, allowing the Israeli military to carry out a large number of strikes against its targets in their homes.

Does the IDF try to limit civilian damage?

One Israeli military source told the Guardian that the military used “very accurate measurements” and ensured the evacuation of civilians before conducting strikes.

However, this claim has been disputed by experts on AI and conflict, who remain sceptical that AI-based systems limit strikes on civilians.

Moreover, the IDF itself has contradicted this claim by stating on its official website that it has expanded its bombing targets to include non-military sites designated as “power targets”.

More Israeli projects

  1. Project Nimbus

The $1.2 billion project has been in the spotlight lately after Google fired 28 workers affiliated with the No Tech for Apartheid campaign.

It is a contract awarded to Google and Amazon in 2021 to supply the Israeli government with cloud services, which in turn support the government’s development of military tools.

Dembinsky, in her video, explains that large-scale data storage can lead to advanced AI capabilities, which can include apps for marking targets and a portal for viewing live footage from unmanned aerial vehicles (UAVs).

Along with storage, cloud companies also provide AI capabilities and graphics processing units (GPUs), according to one Guardian report.

In a separate YouTube video posted by Israeli tech expert Itai Binyamin, titled “Boost your AI Capabilities with Azure”, the 27:00 mark shows how Microsoft Azure can help with facial recognition.

Microsoft says on its website that the Azure AI Face service provides algorithms that detect, recognise, and analyse human faces in images.
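For context, this is roughly what a basic call to that publicly documented detection capability looks like. The sketch below uses Microsoft’s documented Face “detect” REST endpoint; the endpoint URL and key are placeholders, and it is illustrative of the service alone, not of how any military system integrates it.

```python
# Illustrative sketch only: a minimal request to the publicly documented
# Azure AI Face "detect" REST endpoint. Endpoint URL and key are placeholders.
import requests

AZURE_FACE_ENDPOINT = "https://<your-resource-name>.cognitiveservices.azure.com"
AZURE_FACE_KEY = "<your-face-api-key>"

def detect_faces(image_url: str) -> list:
    """Return the bounding boxes Azure reports for faces in a publicly hosted image."""
    response = requests.post(
        f"{AZURE_FACE_ENDPOINT}/face/v1.0/detect",
        params={"returnFaceId": "false", "detectionModel": "detection_03"},
        headers={"Ocp-Apim-Subscription-Key": AZURE_FACE_KEY},
        json={"url": image_url},
    )
    response.raise_for_status()
    # Each detected face is returned with a faceRectangle (top, left, width, height).
    return [face["faceRectangle"] for face in response.json()]
```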

  2. Project Sirius

Project Sirius is allegedly related to Nimbus; however, it is yet to be signed by any tech company.

According to several reports, the IDF was on the lookout for experts who could work with cloud services to “transfer military systems into the public cloud (Nimbus)”, with Sirius being a private, airtight cloud used only by the IDF.

  3. Z-Tube

An application on the Israeli military’s cloud, which reports say looks very much like YouTube, is used for warfare. On the app, soldiers have access to real-time footage from the military’s devices in Gaza, including, but not limited to, drones.

  4. MapIt

Another application making the rounds is “MapIt”, which allegedly allows soldiers to mark live targets on a collaborative map.

The Jerusalem Post, however, has written about a “Waze-like” GPS tool being used by the Israeli army as a secret weapon in Gaza.

  5. Hunter

According to +972 Magazine, the application Hunter is used for “signaling targets in Gaza and detecting patterns of behavior using AI”.

“Imagine Google Maps software—but for the battlefield in Gaza. The system gathers indications of threats and allows users to receive information about them,” according to one Israeli official.

A separate video posted by Times Now World shows an almost game-room-like environment, where soldiers sit at computers they use to “remotely locate and confirm” attacks on infrastructure and human targets.


Header image: This image was generated using AI from Meta.