Thank you, Chair.
Human Rights Watch, a co-founding member of Stop Killer Robots, welcomes the discussion of human rights that has taken place in this session. This body of law is highly relevant to autonomous weapons systems because it applies in both wartime and peacetime, and such systems could be used both on the battlefield and in law enforcement operations.
The right to life, highlighted on the agenda of the consultations, has been referred to as the "supreme right" because it is the foundation for all other rights. To avoid arbitrarily depriving someone of their right to life, the use of force must be necessary to achieve a legitimate aim and applied in a proportionate manner. Lethal force must be used only as a last resort to protect human life. Autonomous weapons systems would face difficulties assessing the necessity and proportionality of the use of force because they would lack the capacity to interpret complex contexts and the distinctly human judgment such assessments require. In addition, they would be unable to defuse a situation and ensure that force was used only as a last resort.
The human rights concerns raised by autonomous weapons systems extend far beyond the right to life. The use of force by these weapons systems in law enforcement situations would present risks to the right to peaceful assembly. Police officers have a duty to protect peaceful protests and to use non-violent means before resorting to force, in accordance with criteria similar to those governing the right to life. Autonomous weapons systems would be a poor tool for this purpose. The use or threat of use of autonomous weapons systems could strike fear among protesters and thus have a chilling effect on free expression and peaceful assembly.
There are two fundamental human rights principles that autonomous weapons systems would undermine: human dignity and non-discrimination. Human dignity means that all people have an inherent worth. Autonomous weapons systems, however, would kill without the uniquely human capacity to understand or respect the true value of a human life or the significance of its loss. Furthermore, they would dehumanize their targets by relying on algorithms that reduce people to data points.
The principle of non-discrimination calls for the protection of all people's human rights, irrespective of race, sex and gender, ability, or other status under the law. Developers' biases could influence the design and decision-making of an autonomous weapons system using AI. In addition, once such a system is deployed, the inability to understand how and why it makes determinations could prevent a human operator from intervening to correct discriminatory errors.
Autonomous weapons systems would also threaten human rights across the systems' lifecycle. The development and use of autonomous weapons systems based on AI technology could violate the right to privacy because they would likely require mass surveillance. To avoid being arbitrary, such data-gathering practices must be both necessary for reaching a legitimate aim and proportionate to the end sought. Mass surveillance fails both of these requirements.
The right to remedy seeks to deter future violations of the law and provide retribution for victims. It is triggered at the end of an autonomous weapon system's lifecycle, that is, after it has applied force. There are obstacles to holding individual operators criminally liable for the unpredictable actions of a machine they cannot understand. There are also legal challenges to finding programmers and developers responsible under civil law. Thus, the use of autonomous weapons systems would create an accountability gap, one relevant to international humanitarian law and international criminal law as well as international human rights law.
A new legally binding instrument could help address these issues by clarifying which autonomous weapons systems should be prohibited because they violate international human rights law and how the rest should be regulated. Clearer norms are easier to implement and enforce, and they can influence those outside a treaty by increasing stigma. The specific proposal to prohibit systems that lack meaningful human control or that target people would help avert the human rights abuses I've described.
For a more in-depth discussion of the threats autonomous weapons systems pose to these rights, see A Hazard to Human Rights, a new report distributed this week by Human Rights Watch and Harvard Law School's International Human Rights Clinic.
Thank you.