New Delhi: There are legal, ethical and operational imperatives for human control over the use of force and, therefore, over autonomous weapons systems (AWS), says a report jointly prepared by the Stockholm International Peace Research Institute (SIPRI) and the International Committee of the Red Cross (ICRC).
The 53-page report says humans must retain control (and judgement) over the use of force in specific attacks in armed conflict, and therefore over the use and effects of weapons, for legal, ethical and operational reasons.
However, AWS—weapons that select and attack targets without human intervention—pose fundamental difficulties for humans in exerting such control, which raises associated legal, ethical and operational challenges.
The report prepared by a four-member team said measures for human control must ensure that users have reasonable certainty about the effects of the AWS in the specific environment of use—in other words, that the consequences will be sufficiently predictable.
This requires sufficient situational understanding of the environment of use, understanding of the technical functioning of the AWS, and foreseeability of the interactions between the two. Further, users must exercise judgement and intent in decisions to use force, which requires effective influence over the functioning of the AWS in a specific attack.
According to the report, measures for human control can be grouped into three main categories:
- Controls on the weapon system’s parameters of use are measures that constrain how, where and for how long the AWS may operate.
- Controls on the environment are measures that control or structure the environment in which the AWS is used, and which overlap with the weapon system’s parameters.
- Controls through human–machine interaction include measures that allow the user to supervise the AWS and to intervene in its operation where necessary, through direct active control, vetoing or overriding AWS functions, aborting a task or mission, or deactivating the AWS.
Each of these three categories provides different ways to reduce or compensate for unpredictability in the use of an AWS and to mitigate the risk of wrongful and unintended consequences, especially risks to civilians. To address legal, ethical and operational considerations, all three types of control measures would likely need to be combined in any scenario.
The report is authored by Vincent Boulanin, Neil Davison, Netta Goussac and Moa Peldán Carlsson.
The report makes five recommendations that are intended to inform international efforts to agree on limits on AWS, whether in new legally binding rules, non-binding standards or best practice guidance.
- States should focus their work on determining how measures needed for human control apply in practice.
- Measures for human control should inform any development of internationally agreed limits on AWS—whether new rules, standards or best practices.
- States should clarify where international humanitarian law already sets constraints on the development and use of autonomous weapon systems, and where new rules, standards and best practice guidance may be needed.
- New rules, standards and best practices must build on existing limits on autonomy under international humanitarian law, and should draw on existing practice.
- Human control measures should be considered in the study, research and development, and acquisition of new weapon systems.