Killer robots and autonomous weapons


Technological innovation has accompanied warfare since time immemorial. Although the conflict between Russia and Ukraine seems conventional, the use of Artificial Intelligence and autonomous weapons is a growing concern in this new war.

Stop Killer Robots is an international coalition that has raised the alarm and campaigns to ban killer robots: autonomous weapons systems based on Artificial Intelligence that can decide to attack programmed targets based on coordinates and images.

According to Stop Killer Robots, Russia has already used in Ukraine a drone that works autonomously, guided by algorithms, to select, attack and detonate on certain targets.

It is the KUB-BLA system, a drone manufactured by the contractor Kalashnikov Concern (yes, the company is named after the famous engineer who designed the AK-47 assault rifle, but this weapon operates on its own). The aerial vehicle is a loitering munition directed to a target by an operator via video. The KUB-BLA carries an explosive charge of three kilograms, has a flight time of 30 minutes and a maximum speed of 130 kilometers per hour.

The drone is a kamikaze that can go unnoticed by traditional air defense systems. The KUB-BLA loiters and attacks targets once their coordinates are entered into the system. It can attack a target based on a preloaded image (tanks, vehicle columns, artillery), identify it on the battlefield, and rush toward it to detonate its payload. It is an autonomous weapon that attacks without human sanction, even though its programming is human.

“KUB-BLA is part of a new generation of weapon systems where the role of the human operator is becoming blurred and risks shrinking over time,” explains Stop Killer Robots.

Ukraine has also used the Turkish-made Bayraktar TB2 drone, capable of taking off, landing and flying autonomously, though it relies on a human operator to decide when to launch laser-guided explosives.

The TB2 is 11 meters long, has a wingspan of 6 meters, flies at an altitude of 5,000 meters, reaches a speed of more than 200 kilometers per hour and can carry 150 kilograms. It was used in Syria in 2019; Ukraine bought it that same year for $69 million and deployed it in the Donbas against Russian separatists.

According to the British newspaper The Guardian, Joe Biden’s “unprecedented assistance” to Ukraine includes 100 drones of the Switchblade model, a kamikaze drone that self-destructs upon hitting its programmed target, like the KUB-BLA.

In December 2021, during the Sixth Review Conference of the Convention on Certain Conventional Weapons (CCW), the negotiation attempts of the Group of Governmental Experts to legislate, regulate and prohibit the use of autonomous weapons, or killer robots, failed. The most militarized countries, such as Russia and the United States, blocked those attempts.

The use of AI, autonomous weapons and military robots with a “license to kill” is seen as a serious violation of human rights. It is also immoral, because algorithms are incapable of understanding the value of life and should not have the power to decide who lives and who dies.

The Secretary General of the United Nations, Antonio Guterres, declared before the CCW Group of Governmental Experts that “machines with the power and discretion to take lives without human participation are politically unacceptable, morally repugnant and should be prohibited under international law.”

In March 2022, in their ninth year of discussions and with the war in Ukraine already underway, the states met again to consider “detailed proposals for the regulatory and operational framework on autonomous weapons systems.”

Russia complained of discrimination against its full participation, the discussions moved to an informal format, and both the public recording and the broadcasting of the sessions stopped.

The objective is to draw clear legal and moral lines that guarantee human control over the use of force and prevent the use of lethal autonomous weapons systems enabled by Artificial Intelligence with the capacity to select targets and kill people without human supervision.

For the US, China and Russia, Artificial Intelligence is strategic, including its military applications. Russian President Vladimir Putin told a group of students in 2017, during the Day of Knowledge: “Artificial Intelligence is the future, not only for Russia, but for all humanity. It comes with colossal opportunities, but also threats that are difficult to predict. Whoever becomes the leader in this sphere will become the ruler of the world.”

However, a study by the Center for Naval Analyses, conducted on behalf of the US Department of Defense’s Joint Artificial Intelligence Center, reveals that “Russia is not a leader in AI research, but it has the potential to be a world leader in AI weapons manufacturing.”

The study explains that the goals of Russia’s AI and automation ecosystem “are best understood in the context of Russia’s economic development and modernization efforts, and include initiatives aimed at improving the well-being of Russian citizens, as well as the conditions for business and entrepreneurial activity.”

The major armed forces are investing heavily in the research and development of autonomous weapons. The United States alone earmarked $18 billion for AI weapons between 2016 and 2020 (Scientific American). Many other nations are also producing killer robots.

Twitter: @beltmondi

Jorge Bravo

President of the Mexican Association for the Right to Information (Amedi)


Media and telecommunications analyst and academic at UNAM. He studies the media, new technologies, telecommunications, political communication and journalism. He is the author of the book Media Presidentialism: Media and Power During the Government of Vicente Fox.


