UN urged to ban fully autonomous drones and killer robots before they can be developed

A report by Human Rights Watch and Harvard Law School has urged the UN to create an international treaty banning so-called "killer robots" before they can be developed.

Researchers from Human Rights Watch and Harvard Law School have jointly produced a report titled "Mind the Gap," which details how fully autonomous weapons lack regulation and could cause human deaths without any accountability. On the basis of this report, the groups are now pressing the UN to ban the development of any fully autonomous weapon that could harm humans in the absence of a human operator. Although fully autonomous weapons are still in the developmental stage, Human Rights Watch and Harvard Law School have recommended that the UN adopt strict international laws that would not only prohibit the development, production, and use of fully autonomous weapons but also legally bind nations against creating them.

Artificial intelligence is advancing rapidly, and manufacturers are making full use of the technology to create autonomous drones and vehicles. Examples include Google's fully autonomous car, the hobby drones made by Parrot, and Amazon's Prime Air project, which would deliver customers' packages by unmanned aerial vehicle; all of these are still in development. While regulators are working out how to approve licenses for such autonomous commercial vehicles, Human Rights Watch and Harvard Law School are urging the authorities to put laws in place before any licenses are granted to manufacture autonomous "killer robots." The FAA (Federal Aviation Administration) recently imposed restrictions on the commercial use of autonomous drones.

The Guardian recently reported on "Mind the Gap," the joint report by Human Rights Watch and Harvard Law School. The report argues that under current law, manufacturers, programmers, and military personnel could easily evade accountability if autonomous weapons caused deaths in the field. It further brings to the UN's attention that, because no proper legal groundwork governs these autonomous drones and weapons from manufacture through deployment, it would be difficult to hold anyone responsible if an error occurred. The report also warns that, with no human assigned to a fully autonomous weapon, there is a risk that such weapons could erroneously target civilians rather than military forces.

The report states: "Fully autonomous weapons do not yet exist, but technology is moving in their direction, and precursors are already in use or development. For example, many countries use weapons defense systems — such as the Israeli Iron Dome and the US Phalanx and C-RAM — that are programmed to respond automatically to threats from incoming munitions. In addition, prototypes exist for planes that could autonomously fly on intercontinental missions (UK Taranis) or take off and land on an aircraft carrier (US X-47B)." It continues: "Existing mechanisms for legal accountability are ill-suited and inadequate to address the unlawful harms fully autonomous weapons might cause. These weapons have the potential to commit criminal acts — unlawful acts that would constitute a crime if done with intent — for which no one could be held responsible. A fully autonomous weapon itself could not be found accountable for criminal acts that it might commit because it would lack intentionality."

The report adds that in current warfare, the automated weapons and drones used by military personnel have a human operator who makes the decision to pull the trigger, and that operator can be held responsible for any misuse of the weapon. With a fully autonomous weapon, however, it would be difficult to pin responsibility on anyone if the weapon or drone were misused. Likewise, if a blunder occurred because of some error, it would be an open question who should be held responsible. Giving full control to a machine would mean a lack of meaningful human control in selecting and engaging targets.

The report also offers an alternative suggestion: a commander or programmer could be held responsible for any error or negligence. This would impose civil liability on that individual and would help provide compensation, a legal judgment, and a sense of justice to the victims harmed. The report adds: "The lack of meaningful human control places fully autonomous weapons in an ambiguous and troubling position. On one hand, while traditional weapons are tools in the hands of human beings, fully autonomous weapons, once deployed, would make their own determinations about the use of lethal force. They would thus challenge long-standing notions of the role of arms in armed conflict, and for some legal analyses, they would be more akin to a human soldier than to an inanimate weapon. On the other hand, fully autonomous weapons would fall far short of being human."

A meeting of international UN officials will be held at the end of this month in Geneva to discuss the regulation of emerging military technology. The report appears to have been released beforehand so that officials can review it and come prepared with their suggestions.
