Artificial intelligence (AI) is one of the marvels of modern technology. It is used across sectors such as the automobile industry, law, business process outsourcing (BPO), and manufacturing, and the number of applications grows by the day; some even predict that machines will one day hold intellectual debates with humans. From ever-sassier AI assistants to self-driving cars, AI has achieved feats once thought unimaginable. AI also powers drones used in farming and in military surveillance, among other applications.
Now, development is underway on drones that could kill people on the battlefield. Such drones are depicted in Slaughterbots, a short film published by the Campaign to Stop Killer Robots, an initiative promoted by the International Committee for Robot Arms Control (ICRAC) and other organizations. Powered by AI, they detect their targets and fire explosives into the skull. They are also nearly impossible to take down, as they move far faster than any human. The video ends with a warning that such lethal weapons could cause more damage than we imagine.
Stuart Russell, professor of computer science at the University of California, Berkeley (U.S.), has stated that this short film is more than speculation or a work of fiction: the technology already exists, and lethal autonomous drones will become a reality in the near future. In November 2017, the U.S. Department of Defense issued a statement regarding the development of “automatic target recognition of personnel and vehicles from an unmanned aerial system using learning algorithms.”
Drones have been part of the battlefield for the past few years, but they have always been controlled by humans from a distance. The technology has since leapt forward: unmanned drones can now be built to the size of a consumer model while deciding for themselves, without human intervention, whether to kill a target. Many countries have ruled out manufacturing autonomous drones, while others are building advanced robotic weapons with varying levels of autonomy. These countries have not yet decided whether the decision to kill a human should be handed entirely to a machine, but experts believe some nations will eventually develop fully autonomous drones.
Advocates of developing autonomous drones to kill point to their ability to avoid the mistakes and emotions that afflict human operators, and argue that they remove the moral burden that casualties place on soldiers. On the other hand, experts point out that deaths could result from software bugs or from errors in recognizing potential targets. Steve Wright, professor in the Politics & International Relations Group at Leeds Beckett University (United Kingdom) and a member of ICRAC, told OpenMind: “The negative legal, political and ethical consequences of autonomous armed drones far outweigh any temporary military utility.”
He also explained that ICRAC’s aim is to urge the United Nations to halt the development of autonomous killer drones under the Convention on Certain Conventional Weapons (CCW). In addition, nearly one hundred senior executives of technology companies signed a letter to the CCW requesting action on the development of these drones, although they did not explicitly demand that the programs be withdrawn. If push-button assassination is stopped, future generations will thank us, Wright stated.