
‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza

Good piece on Lavender’s use, from +972 Magazine:

Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets.

The sources told +972 and Local Call that, during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants — and their homes — for possible air strikes.

During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based.

One source stated that human personnel often served only as a “rubber stamp” for the machine’s decisions, adding that, normally, they would personally devote only about “20 seconds” to each target before authorizing a bombing — just to make sure the Lavender-marked target is male. This was despite knowing that the system makes what are regarded as “errors” in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.

Moreover, the Israeli army systematically attacked the targeted individuals while they were in their homes — usually at night while their whole families were present — rather than during the course of military activity.

What I’ve noticed is that the use of Lavender is being presented as the cause of the excessive civilian casualties. But it’s not the AI that’s to blame; it is, in my opinion, Israel’s intentional goal to cause mass civilian casualties.

Read the entire piece at the opening link.

6 replies on “‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza”

Israel will use any excuse she can, any diversion she can. Israel will blame whoever, whatever. Eternal victims are always victims; in this case they blame AI for their foul deeds. Too much glee for anyone to buy that.

I’m on board with this being an excuse for the mass civilian casualties. The AI was programmed to do as it has, and any checks and balances that should have occurred appear not to have been done.

Small correction, Penny: you can’t program an AI. You train AIs by feeding them training data, and the trained model is then used to classify real data.

When you program something, it’s an algorithm, and an algorithm can’t be an AI.

And that’s the problem with AIs: they are black boxes.

In case “black box” is an unfamiliar term: a black box has visible inputs and visible outputs, but how it processes an input to produce a given output is unknown.
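To make the program-vs-train distinction concrete, here is a minimal Python sketch, assuming scikit-learn is installed. The feature names and toy data are entirely invented for illustration and have nothing to do with how Lavender actually works:

```python
# Minimal sketch of "programmed rule" vs "trained model".
# Toy feature names (calls_per_day, contacts_flagged) and data are
# made up for illustration; nothing here reflects any real system.
from sklearn.ensemble import RandomForestClassifier

# "Programming": a human writes an explicit rule anyone can read and audit.
def programmed_rule(calls_per_day: int, contacts_flagged: int) -> bool:
    return calls_per_day > 50 and contacts_flagged > 3

# "Training": a model infers its own rule from labeled examples.
X_train = [[10, 0], [60, 5], [5, 1], [80, 4], [20, 0], [70, 6]]
y_train = [0, 1, 0, 1, 0, 1]  # 0 = not flagged, 1 = flagged
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Both give a visible output for a visible input...
print(programmed_rule(55, 2))    # False, and you can point at the exact line that says why
print(model.predict([[55, 2]]))  # e.g. [0] or [1] -- but the "why" is buried in
                                 # hundreds of learned tree splits: the black-box problem
```

The hand-written rule can be audited line by line; the trained classifier can only be probed from the outside, which is the black-box property described above.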

Thanks for clarifying/correcting, Kaz. I’ve been honest about my lack of understanding of these techie kinds of things.
“Black box” is a term I’d associated with plane crashes. Same thing, or am I way off?

It’s true that it’s associated with plane crashes, but with a different meaning.

In relation to airplanes, the “black box” (bright orange in real life) records all the events in an airplane. When a plane crashes, investigators can retrieve the black box to see what happened.

This is a different definition from the engineering “black box” I described above.

