Artificial Intelligence in Weapon Systems? Bishop Helen-Ann joins the debate

First published on: 19th April 2024


On 19 April, Bishop Helen-Ann spoke in the House of Lords during the debate on the report of the Artificial Intelligence in Weapon Systems Committee. Read her full speech below.

My Lords, I’m grateful to Lord Lisvane for his opening summary of this important report, and to Lord Stevens for his remarks, just delivered, reminding us of the role of maritime contexts in this debate. I too wish to thank all those involved in the creation of the Report. Perhaps this alone is worth noting: AI didn’t produce this report; human beings did.

My Lords, my friend the Rt Rev Prelate the former Bishop of Coventry was a member of the Committee that produced this report, and he will be delighted that it is receiving the attention it deserves. He is present today, and I hope he doesn’t mind me saying this on his behalf.

The principles of ‘just war’ are strongly associated with the Christian moral tradition, in which it is for politicians to ensure that any declaration of war is just, and then for the military to pursue that war’s aims by just means. In both cases ‘justice’ must be measured against the broader moral principles of proportionality and discrimination.

This, then, is where AI begins to raise important and urgent questions. AI opens new avenues of military practice that cannot be refused, together with new risks that must not be ignored. The report rightly says we must proceed with caution. But it does say proceed. Here, my Lords, there is an opportunity for the UK to fulfil its commitment to offer international leadership in this sphere.

There is a risk of shifting the decision-making process and the moral burden for each outcome onto a system that has no moral capacity and too much opacity. To implement AI’s benefits well, military cultural values need to be named, explained and then translated into autonomous weapon systems’ command and control – especially where the meaning of ‘just’ diverges from the kind of utilitarian calculus that most easily ‘aligns’ digital processes with moral choice.

Inherent human values, including virtue, should also be embedded in the development, and not just the deployment, of new AI-enabled weapon systems. As recent use of AI systems in the context of global conflict shows, AI changes questions of proportionality and discrimination. When a database identifies 37,000 potential targets using ‘apparent links’ to an enemy, or an intelligence officer says ‘The machine did it coldly. And that made it easier’, it is vital to attach human responsibility to each action.

AWS designed according to military culture will, at best, practically strengthen the moral aspects of just war by reducing or eliminating collateral damage. But we should guard against a cultural re-wiring or feedback loop that dilutes or corrodes the moral human responsibility on which just war depends. It is reassuring, therefore, as other noble Lords have noted, to see in the Government’s response to this report a clear statement that accountability cannot be delegated to a machine, together with the Government’s commitment to fully uphold national and international law.

My Lords, in conclusion: current events across the globe and the rapid pace of development of AI in both civil and military contexts make this a timely and important debate. I commend the Committee, and those in government and in the MOD charged with transforming its helpful insights and practical recommendations into concrete action. Finally, my Lords, public confidence in and democratic endorsement of any plans the Government might have for the development of AI are vital. I therefore urge the Government to commit to ensuring public confidence and education in its ongoing response to this Report.
