
Statement by Ambassador (Dr.) Pankaj Sharma, Permanent Representative of India to the Conference on Disarmament, on Agenda item 5(c), Further consideration of the human element in the use of lethal force; aspects of human-machine interaction in the development, deployment and use of emerging technologies in the area of Lethal Autonomous Weapons Systems, at the 2020 session of the GGE on LAWS, held in Geneva from 21 to 25 September 2020

Mr. Chairperson,

India has well-established and time-tested procedures for training military personnel at all levels to plan the deployment and utilization of existing conventional weapons and systems, pursuant to our obligations under international law, in particular International Humanitarian Law.

In the case of emerging technologies in the area of LAWS, we believe that both the Commander and the Operator need adequate knowledge of the attributes of the weapon systems, along with training and clearly laid-down Standard Operating Procedures that incorporate the nuances of international law, including International Humanitarian Law. The responsibility to remain compliant with IHL rests with humans, and we are in agreement that machines, even the most complex ones, cannot be programmed to make ethical decisions in all cases.

In the absence, at this stage, of a broad definition or of the characteristics that constitute LAWS, it would be premature to discuss in specific terms the attribution of responsibility for violations of IHL by LAWS. It is our view that States using lethal autonomous weapon systems cannot absolve themselves of the intended as well as unintended consequences of the application of force under international law, in particular IHL.

Meaningful human control needs to be retained over the critical functions of a weapon system, which include the identification, selection and engagement of targets, to align its use with international law, particularly IHL. In respect of a lethal autonomous weapon system with self-learning ability, the degree of human control needs to provide for human supervision and intervention to prevent an automated redefining of the mission, and to negate any autonomous function if necessary.

Incorporation of a human-in-the-loop, or a controlled human takeover at any or all stages of design, development and operation, are factors to be considered in respect of the human element in emerging technologies in the area of LAWS. Such a weapon system must have the ability to perform in a predictable manner and conform to its intended use within the laid-down parameters. Challenges concerning reliability, predictability and secure communication between lethal autonomous weapon systems and operators, including risks of interference and system failure, also need to be considered.

Mr. Chairperson,

In our view, an in-depth discussion with the in-person participation of our experts is needed to arrive at a common understanding on the degree of the human element required in emerging technologies in the area of LAWS.

Thank you, Mr. Chairperson.
