(Vatican Radio) Archbishop Silvano Tomasi, the Holy See’s Permanent Observer to the United Nations in Geneva, said it would be wise to prohibit those robotic weapons with lethal capacity that cannot be effectively controlled by humans. He was speaking at an Informal Expert Meeting on Lethal Autonomous Weapon Systems held in Geneva.
Please find below the full text of Archbishop Tomasi’s address (made in English) to the meeting:
In today's conflicts, we note the growing importance of conventional military robots and computerized robots. In both cases, research is being developed to grant them greater autonomy from their human operators and a capacity for innovation. This does not happen without raising deep juridical and ethical questions. Furthermore, the analysis of combat robots cannot be separated from an analysis of the nature of conflicts, which increasingly take place in essentially urban areas.
These new developments prompt a serious evaluation of combat robots before we move too far in integrating them into the methods of modern warfare. However, it would be insufficient to approach the question merely from the perspective of International Humanitarian Law. The substitution of sophisticated and autonomous machines for human decision-making raises a series of questions linked to anthropology and ethics.
Mr. President,
Allow me to enumerate certain elements that my Delegation considers of fundamental importance for our discussion. A more substantial document will be distributed; it deals essentially with an ethical reflection concerning lethal autonomous weapon systems. International Humanitarian Law is necessary, and many agree that it is an important framework to regulate the use of autonomous armed robots. However, this framework is certainly not sufficient. In addition, we know that attempts have sometimes been made to justify immoral conflicts by appealing to “just war theory”. It is therefore necessary to go beyond International Humanitarian Law to grasp, in all their dimensions, the challenges raised by roboticized weapon systems. Certain issues deserve particular attention and demand discussion if we want to approach the questions of military robotization from an ethical point of view.
1. The temporary delegation or the permanent transfer of certain powers to machines endowed with the capacity for innovation and action must always be evaluated in accordance with anthropological coherence. In adopting civilian or military technological instruments designed to assist in a series of difficult or dangerous tasks, human beings must always be attentive so as not to end up, after having delegated these powers to their machines, in a situation where they become the slaves of their own inventions.
2. The relational aspect is equally important. Today, perhaps, the risk of progressively substituting human actors with machines is not given sufficient consideration. As the philosopher Emmanuel Levinas has shown, the encounter with the face of another is one of the fundamental experiences that awaken moral consciousness and responsibility. All wars are a step backwards from human dignity. However, a war waged unilaterally by technological systems where man is absent can increase this dehumanization.
3. To decide whether this or that action is legal or legitimate from an ethical point of view, it is necessary to refer to norms and principles whose application to particular contexts demands capacities for interpretation and judgment that are not easily translatable into algorithms. A tradition going back to Aristotle and Thomas Aquinas has defined “prudence” as the virtue by which universal knowledge is applied to particular contingent realities.
4. Another important aspect to which we need to be attentive is the fascination created by armed robots and the feeling of power that they elicit. Their use may be linked implicitly to a desire for omnipotence, rather than the desire to make available means proportionate to a just defense. The development of “augmented soldiers”, that is, of combatants given unusual perceptive, cognitive and performance capacities by means of roboticized systems (such as exoskeletons), or of completely autonomous robots endowed with great abilities, may come not from direct military interests but from dreams of power. Our humanity is recognized particularly in our ability to restrain our own power.
5. The roboticized weapon, particularly one that is autonomous, poses important questions from the point of view of responsibility. In the event of collateral damage caused by a weapon system controlled by a human operator, the latter is naturally recognized as the one bearing the responsibility. But if this collateral damage is the result of an autonomous machine, even though the overall responsibility would clearly be incumbent on the authority that put it into action, it will always be possible to excuse oneself by invoking a series of malfunctions (bugs, technical failures, design flaws, scrambled communication, etc.) for which the authority would not be responsible. This, in turn, could promote the use of such armed robots because of the impunity they allow. The evaluation of personal responsibility is also related to one’s intention. Taking intention into account is central in applying, for example, “the principle of double effect”, characteristic of situations where an action causes good and bad consequences at the same time. Indeed, this principle demands that we fundamentally intend to produce the good consequences and not the bad ones (even if the latter appear to be inevitable). This notion of intention, equally important for law and for ethics, and one which greatly influences the question of responsibility, cannot easily be reduced to a concept or a reality that today’s robotics or computing can technically apprehend.
6. One of the crucial points the ethicist will have to evaluate is whether one can permit the action of machines whose behavior we cannot entirely predict. One of the conditions for being able to use an armed robot is the assurance that it will never produce any action forbidden a priori by its user. However, this is never guaranteed in an absolute way, whatever the degree of sophistication of the program, due to the logical limitations brought to light by the theory of algorithms. In addition, these complex systems are not immune to bugs or hacking.
7. The economic and financial implications of civilian and military robots are colossal. It is certain that the richer countries will profit from the advantage provided by roboticized weapons. A gap will therefore develop between those who possess and utilize this kind of combat technology and those who do not, giving rise to situations of injustice, as well as to possible easy violations of the national airspace of less technologically favored countries. We could therefore witness a widening of the gap that already exists between nations.
Besides these problems, a race to obtain roboticized weapons of ever-greater efficiency risks leading to the squandering of resources that could have been useful in other sectors such as health or education. In conclusion, Mr. President, the arguments presented show that, among the new roboticized weapon systems, it is wise to prohibit those that possess lethal capacity and at the same time escape effective human control. The prohibition we advocate is concerned not so much with the autonomy of these systems as with the conjunction of their lethal capacity and the possibility of the loss of effective control over them by the human person.
I thank you, Mr. President.