When ‘Proceed with Caution’ Is Ignored, Accountability Must Still Prevail

Date: May 27, 2024.

Artificial Intelligence (AI) has the capability to improve the well-being of every person on Planet Earth. Its capacity to recognise patterns in datasets and provide ‘answers’ to questions goes well beyond that of a human. It also underpins the viability of autonomous technological systems.

However, there are concerns that, rather than improve human well-being, AI will devastate it, especially through the destruction of workplace employment. Further, there is one application of AI that raises the spectre of ‘doom’.

This is the use of AI in ‘Autonomous Weapon Systems’ (AWS). My previous article, ‘Proceed with Caution’, commented on the recent UK House of Lords report on the use of Artificial Intelligence (AI) in defence. That report emphasised the importance of human control, responsibility and accountability, and that the development and deployment of AI-empowered weapon systems be ethically grounded.

However, there are arguments about why AI should not be used in weaponry. For example, it is dehumanising for a technology to select and kill a human. Added to this is the challenge of compliance with International Humanitarian Law (IHL) in terms of the principles of ‘Humanity’, ‘Necessity’, ‘Distinction’, ‘Proportionality’ and ‘Precaution’.

Nevertheless, the attractions of AI-empowered weapon systems are speed, accuracy and efficiency, and thereby an expected reduction in casualties, especially collateral damage. Irrespective, if we really wish to understand AWS, then we ought to look at the world of practice, and the best testbed is a war. Does AI create more humane wars with reduced collateral damage?

Israel-Hamas conflict in Gaza

One high-profile ‘war’ in which AI is acknowledged to be used is the Israel-Hamas conflict in Gaza. This was triggered by the events of 7th October 2023, when Hamas attacked border communities in Israel.

However, all events have historical context. This is the latest confrontation between Israel and the Palestinians in an ongoing process of settler-colonialism, which dates back to the 1940s, if not earlier. What confuses our understanding of what is happening is that The Times of Israel reports that ‘Netanyahu propped up Hamas’ for years.

The significance of the current conflict in Gaza is that it has been described as ‘among the deadliest and most destructive in history’. Irrespective of the events of 7th October 2023, at the time of writing there are reported to be over 35,000 Palestinian deaths, at least 50% of them women and children, though the true toll may be significantly higher.

This is aside from an undisclosed number of maimed and orphaned children and those lost in the rubble of destroyed buildings. Around two thirds of all homes (‘domicide’?) and over 80% of schools (‘scholasticide’?) have been damaged or destroyed. It is also reported that only around one third of Hamas fighters have been killed and, likewise, only around one third of its tunnels destroyed. What characterises this conflict is its ‘continuous systematic onslaught’, with every aspect of humanity being degraded, as well as atrocities that include torture and execution.

An insight into the principles that perhaps guide Israel’s self-acknowledged use of AI in the selection of and striking at targets is provided in the book ‘The Human Machine Team’. It was published in 2021 by Brigadier General Y.S., who was reported to be a commander of Unit 8200 in the Israel Defense Forces’ Military Intelligence.

The book advocates a policy in which ‘you put your enemy in a situation that is worse than the situation he faced the day before’. This is ‘futuristically’ enacted through a collaboration between humans and AI to identify targets.

The Human-Machine Team has the ability to create tens of thousands of targets before a battle begins, and to assemble thousands of new targets every day during a war. In addition, the ability to create these targets in context means that the military can attack the right targets at the right time.

This relies upon the collection of a very high volume and variety of data, not only about the battlefield, but also about the population, which includes pictures, social media connections and cellphone contacts. The book calls for a ‘system of laws and ethics’ underpinning any decisions about the taking of human lives.

The realisation of this book’s vision of the weaponisation of AI is revealed in two reports by the Israeli-Palestinian +972 Magazine and the Hebrew-language news site Local Call.

The first report (30th November 2023) reveals the existence of ‘Habsora’ (‘The Gospel’), an AI-empowered target-generation system which, the report argues, transforms Gaza into a ‘mass assassination factory’.

There are four target categories: military (tactical) targets, underground targets (i.e. tunnels), ‘power’ targets (e.g. buildings whose destruction is intended to shock and generate ‘civil pressure’ upon Hamas) and operatives’ family homes (strikes on which can kill entire families).

Moreover, collateral damage appears to be accepted, as revealed in the statements ‘we are authorized to strike all targets that will kill five or less civilians’ and ‘Everything is intentional. We know exactly how much collateral damage there is in every home’.

The second report (3rd April 2024) provides clarification, explaining that the ‘Gospel’ focused upon physical targets (e.g. buildings), and reveals two other systems, ‘Lavender’ and ‘Where’s Daddy?’.

‘Lavender’ appears to be an AI-empowered system that identifies suspected military operatives from a dataset covering most of the Gaza population, compiled from a mass surveillance system. Each person is then rated on the likelihood of being a militant, based upon the characteristics of known militants, thereby establishing ‘targets’.

It was ‘estimated’ to be 90% accurate, yet there was very limited human supervision to address its errors, and civilians could be selected.
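
To see why a ‘mere’ 10% error rate matters at this scale, a toy calculation helps. The sketch below is illustrative only: the number of flagged people is a hypothetical assumption, not a figure taken from the reports.

```python
# Toy arithmetic only: the scale of misidentification implied by a
# ~10% error rate. The flagged count is hypothetical, not reported.

flagged_people = 30_000  # hypothetical number of people flagged as suspects
error_rate = 0.10        # the reported ~10% misidentification rate

misidentified = int(flagged_people * error_rate)
print(f"People wrongly flagged: {misidentified}")  # -> 3000
# With only cursory human review of each decision, even a 'small' error
# rate translates into thousands of civilians selected as targets.
```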

Underpinned by the same mass surveillance system, ‘Where’s Daddy?’ tracks these targets so that they can be attacked once they have returned to their family home. Entire families could be killed: ‘It’s much easier to bomb a family’s home’. If there was a lag between triggering an attack and the attack itself, the target might have left the house, leading to the family being killed but not the target. From a collateral damage perspective, the report suggests that this was not questioned, even when the number was of the order of 15 to 20, nor was any effort made to minimise it. In the case of senior-ranking targets, the number of civilians who could be killed was of the order of hundreds.
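
The compounding effect of such per-strike allowances can be made concrete with another deliberately hypothetical calculation (the strike count below is an illustrative assumption, not a reported total):

```python
# Hypothetical arithmetic only: per-strike 'proportionality' allowances
# compound across many automated strikes. Counts are illustrative.

junior_strikes = 1_000       # hypothetical number of strikes on junior operatives
civilians_allowed_each = 15  # lower bound of the reported 15-20 allowance

print(junior_strikes * civilians_allowed_each)  # -> 15000
# An allowance that sounds bounded per strike becomes unbounded in
# aggregate once an AI system can generate thousands of such targets.
```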

Where does this leave us?

First is the question of the authority or validity of the two reports referred to here. The assumption is that the insight provided by the military sources is truthful. Indeed, why would +972 Magazine and Local Call risk misinformation? Corroboration lies not only in the visible level of destruction and death in Gaza, but also in the words of Defense Minister Yoav Gallant: ‘We are fighting human animals and we are acting accordingly’.

The second issue, irrespective of the level of autonomous behaviour, is the weaponisation of AI to select, track and target people and objects to be killed or destroyed.

Given that AI is embedded in a national military system that results in death and destruction, there is a need to understand the complexity of this system and how it is regulated.

This raises such issues as the nature of the underpinning policies and principles, as well as how much discretion there is throughout the system to make decisions without referral to a higher authority. Moreover, it invites the question of the nature of any human intervention, especially the issue of accountability for unnecessary death and destruction (i.e. collateral damage). Ultimately, it is the leaders of the system who are accountable for collateral damage.

Third is the argument used to justify the high level of civilian deaths (i.e. collateral damage). This appears to rest primarily upon the claim that Hamas uses human shields, including its use of hospitals, mosques, schools and UN facilities.

Moreover, the 500 km tunnel network is claimed to lie below civilian infrastructure. However, given the urban density of the small space of Gaza, how much of this ‘presence’ is inevitable? Additionally, given the intentional bombing of ‘power’ targets and family homes, with its inevitable loss of human life, particularly women and children, how are the IHL principles of ‘Distinction’ and ‘Proportionality’ to be explained?

This leads to the fourth issue, which concerns compliance with IHL and the principles of ‘Humanity’, ‘Necessity’, ‘Distinction’, ‘Proportionality’ and ‘Precaution’.

On 26th January 2024, the International Court of Justice (ICJ) issued its first order in response to South Africa’s application alleging that Israel was committing violations in Gaza relating to the Crime of Genocide.

On 20th May 2024, the Prosecutor of the International Criminal Court (ICC) announced that he was seeking arrest warrants for Israeli Prime Minister Benjamin Netanyahu and Defence Minister Yoav Gallant, on the basis that there are reasonable grounds to believe they bear responsibility for war crimes and crimes against humanity.

Fifth is the response by Israel and its supporters to the views and judgements of the UN and internationally recognised legal institutions. On 15th May 2024, the Financial Times reported that, because officials and agencies of the UN have been vocal about the devastation caused by Israel in Gaza, Israel’s ambassador to the UN stated that the UN has ‘turned into a collaborator with Hamas... maybe even more than that — a terror organisation unto itself’.

In response, the US White House issued a statement that included the comment that ‘The ICC prosecutor’s application for arrest warrants against Israeli leaders is outrageous’. In response to the ICJ’s decision that Israel must ‘immediately halt its military offensive in Rafah’, US Senator Lindsey Graham is quoted as saying, ‘As far as I’m concerned, the ICJ can go to hell’.

In April 2024, the US was the only country to veto a resolution that would have made Palestine a full member of the UN. This is despite a significant number (143) of countries recognising the statehood of Palestine, with Norway, Spain and Ireland recently joining them.

Sixth is the recent statement by UN experts ‘deploring’ the purported use of artificial intelligence (AI) and related military directives by Israel in occupied Gaza, leading to an unprecedented toll on the civilian population, housing, vital services and infrastructure.

To summarise

This article, in many respects, sanitises the true horror of the weaponisation of AI in a conflict, as evidenced in Gaza, an Israeli ‘mass assassination factory’. Further, no mention has been made of the possible (yet unreported?) use of AI and surveillance systems to control West Bank Palestinians and support the illegal and violent seizure of Palestinian land.

Irrespective, AI changes the way a war can be conducted. Moreover, AI should not be viewed in isolation, but as a technology that is part of a system comprising technologies, people, organisations, political institutions and the rest.

Underpinning any act resulting from AI are the principles, policies and intent of humans. AI should not make people vulnerable to harm. Core to this is the valuing of humanity, over and beyond IHL or other relevant legislation. Moreover, no nation state can think itself above the law.

Accountability and Justice must prevail over political power and affiliations.

What is the alternative?

Source TA, Photo: Shutterstock