Smart Weapons and Killer Drones

What are smart weapons?

The principle of the “smart bullet” is based on the detection of a target. Two security researchers, Runa Sandvik and Michael Auger, examined the security of the on-board software of the computer-assisted rifles manufactured by TrackingPoint.

Unsurprisingly, they discovered major flaws, which they were due to present the following week at the US edition of the Black Hat conference. The main weak point of these rifles is an easy-to-guess default password, which is supposed to restrict access to the Wi-Fi network generated by the firearm.

This network provides access to a real-time video stream from the scope (the ShotView function). The wireless connection can also be used with an app, for instance on connected eyewear, to aim and fire without exposing oneself.

The same connection also provides access to the Linux operating system embedded in the assisted rifle. From there, the two researchers worked out how to modify the parameters used to compute the weapon’s firing solution, without the user’s knowledge: enough to make the shooter miss the target, or hit a different one.
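
To give an order-of-magnitude sense of why silently changing a ballistic input matters, here is a deliberately simplified sketch in Python (a flat-fire model with no drag or wind and made-up numbers; the rifle’s actual solver is far more complex and its internals are not public). It only illustrates how a tampered stored muzzle velocity shifts the aim correction a scope would apply:

```python
# Illustrative only: how a silently altered ballistic parameter shifts the aim point.
# Simplified flat-fire model (no drag, no wind); not the rifle's real firmware logic.

G = 9.81  # gravitational acceleration, m/s^2

def bullet_drop(range_m: float, muzzle_velocity_mps: float) -> float:
    """Vertical drop in metres over range_m, assuming constant horizontal speed."""
    time_of_flight = range_m / muzzle_velocity_mps
    return 0.5 * G * time_of_flight ** 2

def aim_error(range_m: float, true_velocity: float, stored_velocity: float) -> float:
    """How far off the shot lands when the scope compensates using a stored
    (possibly tampered) muzzle velocity instead of the true one."""
    return bullet_drop(range_m, stored_velocity) - bullet_drop(range_m, true_velocity)

if __name__ == "__main__":
    distance = 500.0     # metres to target (hypothetical)
    true_v = 900.0       # actual muzzle velocity, m/s (hypothetical)
    tampered_v = 700.0   # value an attacker could silently write into the configuration

    print(f"Aim-point error at {distance:.0f} m: {aim_error(distance, true_v, tampered_v):.2f} m")
    # ~0.99 m of over-correction: the scope compensates for 2.50 m of drop
    # while the bullet actually drops only about 1.51 m, so the shot lands roughly a metre high.
```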

They were able to go even further and obtain root-level privileges: enough to permanently alter the embedded software and its configuration.

The only reassuring point, noted by our colleagues at Wired, who were the first to report the demonstration: it is impossible to force the rifle to fire remotely; the trigger must still be pulled by hand.

 

What are killer drones?

A combat drone (UCAV: Unmanned Combat Aerial Vehicle) is a particular type of drone (UAV: Unmanned Aerial Vehicle) equipped with observation equipment and/or various weapons. It should be distinguished from the suicide drone, which is also a combat drone but is itself the munition.

Second, the MQ-9 Reaper is no more autonomous than other drones. Its use does not raise the “question of whether it is useful and conceivable to leave to the machine the decision to open fire on a target”.

Nor the question of whether, “on a battlefield, an armed drone can demonstrate as much or even more ‘humanity’ than a human soldier”, contrary to what the authors claim. In the case of the Reaper, as with every drone in existence, it is always a human, acting under political guidelines, who chooses the target and orders the crew to open fire.

Some models do have a certain degree of autonomy, or rather of piloting assistance, but it concerns only navigation (they can take off and land on their own, like any civil Airbus or Boeing aircraft, for example), and in no case targeting or opening fire.

 

Why and how do armies care about killer robots?

A parliamentary report highlights a new arms race based on artificial intelligence. The United States, Russia, and China are investing in it to gain technological superiority over their adversaries. France, for its part, prohibits the use of weapon systems that operate beyond human control.

 

The risk of escalation

Beyond this semantic battle over killer robots, all the world’s armies are taking an interest and taking positions. The deputies even describe a risk of one-upmanship in this area.

This is particularly true of the great military powers: to minimize human losses, Russian generals are reportedly betting on replacing their soldiers with robotic units, prompting the Americans to do the same in case they ever face such an army, while the Chinese military is preparing by integrating AI heavily into its new weapon systems.

“There is a form of escalation at work,” warn the rapporteurs. Their document points to the technological advances made by these great powers. The United States is developing the Sea Hunter, a 60-meter autonomous transoceanic vessel dedicated to anti-submarine warfare and capable of navigating international waters while adapting autonomously to the navigation rules in force.

In the naval field, China has designed the HSU 001, a relatively small autonomous submarine, to cover the oceans. Russia, for its part, relies above all on robotization: it has developed a small tank capable of following a soldier and firing at the same target.

Faced with these military strategies, the report’s authors nevertheless believe that the development of ever more autonomous weapon systems is inevitable and should even be encouraged.

“It is important that France, and Europe more generally, do not remain on the sidelines of the artificial intelligence revolution. That is not a viable choice. At the risk of strategic downgrading, research must be carried out and developments pursued in the field of defense artificial intelligence,” the deputies recommend.

According to them, the autonomy of weapon systems lies at the heart of a new arms race, and Europe must take part in it while respecting ethical principles and international humanitarian law.

Considering all the issues and problems these advanced systems will raise in the future, there is a real risk of misuse, and of machines gaining a degree of control over humans that would lead to consequences difficult to imagine.