Archbishop Tomasi: address to UN weapons convention
(Vatican Radio) The Permanent Representative of the Holy See to the United Nations
and Other International Organizations in Geneva, Archbishop Silvano Tomasi, addressed
the Meeting of Experts on Lethal Autonomous Weapons Systems of the High Contracting
Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional
Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate
Effects. Below, please find the full text of his statement, in English.
********
Statement
by H.E. Archbishop Silvano M. Tomasi Permanent Representative of the Holy See to
the United Nations and Other International Organizations in Geneva at the Meeting
of Experts on Lethal Autonomous Weapons Systems of the High Contracting Parties to
the Convention on Prohibitions or Restrictions on the Use of Certain Conventional
Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate
Effects
13 May 2014

Mr. President,

Let me first commend you for the
good preparation for this very important meeting, even if the mandate is simply to
discuss in an informal setting emerging concerns around new technologies which would
not only impact the way of conducting war but more importantly would question the
humanity of our societies in relying on machines to make decisions about death and
life. In 2013, this Delegation expressed its deep concerns about the
use of drones and the troubling ethical consequences for users and victims alike.
While in many fields, autonomous technology may indeed prove beneficial to humanity,
the application of autonomy to weapons technology is entirely distinct: it seeks to
place a machine in the position of deciding over life and death. We are most troubled
by emerging technologies of autonomous weapon systems which may move beyond surveillance
or intelligence-gathering capabilities into actually engaging human targets. Good
intentions could be the beginning of a slippery slope. When humanity is confronted
with big and decisive challenges—from health to the environment, to war and peace—taking
time to reflect, relying on the principle of precaution, and adopting a reasonable
attitude of prevention are far more suitable than venturing into illusions and self-defeating
endeavours. Autonomous weapon systems, like any other weapon system, must be reviewed
and pass the examination of international humanitarian law (IHL). Respect for international law, for human rights law,
and IHL is not optional. The Holy See supports the view that autonomous weapon systems
have, like drones, a huge deficit which cannot be addressed only by respecting the
rules of IHL. To comply, these systems would require human qualities that they inherently
lack. The ethical consequences of such systems, if deployed and used, cannot be overlooked
or underestimated. The increasing trend of dehumanisation of warfare compels all
nations and societies to reassess their thinking. The prospect of developing armed
robots designed to engage human targets has the potential of changing the fundamental
equation of war. Taking humans “out of the loop” presents significant ethical questions,
primarily because of the absence of meaningful human involvement in lethal decision-making.
Mr. President,

For the Holy See, the fundamental question is the following:
Can machines—well-programmed with highly sophisticated algorithms to make decisions
on the battlefield in compliance with IHL—truly replace humans in decisions over life
and death? The answer is no. Humans must not be taken out of the loop over decisions
regarding life and death for other human beings. Meaningful human intervention over
such decisions must always be present. Decisions over life and death inherently
call for human qualities, such as compassion and insight, to be present. While imperfect
human beings may not perfectly apply such qualities in the heat of war, these qualities
are neither replaceable nor programmable. Studies of soldiers' experiences suggest
that human beings are innately averse to taking life, and this aversion
can show itself in moments of compassion and humanity amidst the horrors of war. Programming
an “ethical governor” or “artificial intelligence” to enable autonomous weapon systems
to technically comply with the law of war in the areas of distinction and proportionality,
even if possible, is not sufficient. The fundamental problem still exists: a lack
of humanity, a lack of meaningful involvement by human beings in decisions over the
life and death of other human beings. The human capacity for moral reasoning and ethical
decision-making is more than simply a collection of algorithms. The human factor in
decisions over life and death can never be replaced. It is already extremely complex
to apply the rules of distinction and proportionality in the context of war. Distinguishing
combatant from civilian, or weighing military gain and human suffering, in the heat
of war, is not reducible to technical matters of programming. Meaningful intervention
by humans, with our unique capacity for moral reasoning, is absolutely essential in
making these decisions. Part of the justification for developing these weapons
may be the idea that “if we don’t develop this technology, someone else will.” The
development of complex autonomous weapon systems is likely out of the reach of smaller
states or non-state actors. However, once such systems are developed by larger states,
it will not be difficult for others to copy them. History shows that developments in
military technology, from crossbows to drones, give the inventing side a temporary
military advantage. The inevitable widespread proliferation of these weapon systems
will fundamentally alter the nature of warfare for the whole human family. Minimizing
the risks to one's own forces is understandable and legitimate. However, with no casualties
or tales of horror from one side, the domestic political cost of waging war becomes
less significant. This represents an important deterrent to overly hasty military
action, one that should not be lightly disregarded. Autonomous weapon
systems technology makes war too easy and removes its reliance on soldierly virtues.
Several military experts and professionals, who consider killing people a most serious
matter, are deeply troubled by the idea of delegating these decisions to machines.
These experts certainly value the potential of robots to assist in bomb disposal, evacuation
of the wounded, or surveying a battle scene, but the potential for robots to completely
replace soldiers on the field remains of grave concern to them. Furthermore, the
delegation of human decision-making responsibility to an autonomous system designed
to take human lives creates an accountability vacuum that makes it impossible to hold
anyone sufficiently accountable for violations of international law committed by an
autonomous weapon system. It is exactly these concerns that call for a multilateral
approach to questioning the development and implementation of autonomous weapon systems.
As in the case of actions like the Protocol on Blinding Laser Weapons, it is imperative
to act before the technology for autonomous weapon systems progresses and proliferates,
before such weapons fundamentally alter warfare into an even less humane, less human,
affair.

Mr. President,

In conclusion, it is important to recognise that meaningful
human involvement is absolutely essential in decisions affecting the life and death
of human beings, to recognise that autonomous weapon systems can never replace the
human capacity for moral reasoning, including in the context of war, to recognise
that development of autonomous weapon systems will ultimately lead to widespread proliferation,
and to recognise that the development of complex autonomous weapon systems which remove
the human actor from lethal decision-making is short-sighted and may irreversibly
alter the nature of warfare in a less humane direction, leading to consequences we
cannot possibly foresee, but that will in any case increase the dehumanisation of
warfare.