Russia
The future is now: Russia and the movement to end killer robots
Introduction
Of all the contentious issues regarding the misuse of artificial intelligence (AI), the most frightening might be its use in weapons. In films such as the Terminator series or 2001: A Space Odyssey, malevolent machines such as Skynet or the computer HAL can think for themselves and aim to control or eliminate humans. But what might seem like science fiction is already a reality: weapons can now use AI to kill without active human control.
Countries including the United States (US), South Korea and Russia are investing in AI technologies for use in lethal autonomous weapons systems (LAWS), also dubbed “killer robots”. LAWS are distinguished from other forms of AI-enabled warfare, such as cyberwarfare, which are not directly lethal. The concept of autonomous weapons is not new: the landmine is an early example, a device that is triggered autonomously and kills without active human intervention. But the use of AI takes such weapon systems to an entirely new level, as it allows machines to independently search for, target and/or eliminate perceived enemies.
The US has already developed weapons such as AI-enabled drones and tanks, as well as the Sea Hunter, an autonomous warship. So far, it has kept humans in the loop for these machines, meaning weapons are not fired for lethal purposes without direct human intervention. However, these weapons systems have the capacity to be operated by AI alone.[1] The US spends far more annually on weapons research and development than any other country, to the tune of USD 100 billion on “everything from hypersonic weapons to robotic vehicles to quantum computing.”[2]
The Kremlin has announced AI as a priority for Russia.[3] Although at a far lower rate than the US, Russia is investing heavily in AI with a view to developing AI-driven defence capabilities. This investment has included AI-enabled missiles capable of changing their target mid-flight and AI-assisted tanks, although these are not yet fully autonomous.[4] Russia has even invited foreign investors to fund AI research and development, raising USD 2 billion to support domestic tech companies.[5]
Russian activism against LAWS
This misuse of AI technology represents a clear danger to humanity on many fronts. Algorithms cannot yet make perfect decisions, especially in varying warfare conditions. For example, a human soldier might not kill a child holding a toy weapon, but a robotic device might not be able to distinguish between a child and an adult soldier. A software glitch causing a weapon to fire on friendly soldiers or civilians is not just a remote possibility: it has happened before, as when a robotic cannon mistakenly killed nine soldiers and wounded 14 others during a military exercise in South Africa over a decade ago.[6]
There is currently a global campaign against weapons controlled by AI. The Campaign to Stop Killer Robots started as a project of Human Rights Watch, but now has chapters and affiliates in countries around the world.[7] The Russian affiliate is the NGO Ethics and Technology, established and managed by Alena Popova, Russia’s primary campaigner against LAWS.[8] She recently attended a Convention on Certain Conventional Weapons (CCW) meeting[9] at the UN in Geneva (20-21 August 2019) as an expert for formal and informal consultations. One of her main missions is to raise awareness of the issue among ordinary Russians, and to this end she promotes the anti-LAWS cause on both Russian and English social media and blogs on the issue. Popova has spent many years working with the Russian parliament, serving as an advisor and expert.
Her NGO takes a five-stage approach in the campaign against killer robots. First, it collects available information and builds a relevant database. Second, it processes the information, preparing data for its working group and experts. Third, it develops solutions through open discussions, with the aim of eventually turning them into formalised legal acts. Fourth, it informs the public via social media, news releases, press conferences and similar channels. Fifth, it persuades and exerts public pressure on the legal system to adopt laws and measures implementing those solutions, and keeps track of the results.[10]
At the CCW meeting in Geneva, she appeared optimistic about the eventual success of the campaign. Issuing a video statement during the CCW meeting, she told her Russian audience that in fact the sci-fi scenario of algorithmic weapons is already a reality. She appealed strongly to the scientists in the Russian Academy of Sciences – indeed scientists everywhere – to join the fight against killer robots. She asked all technology workers to join the movement in declaring they will not support the development of LAWS. She also urged Russians to be a part of this movement before it is too late to stop this nightmare.[11]
To this end, she also connects with other Russian-speaking campaigners from Commonwealth of Independent States (CIS) countries, including the prominent Kazakh activist Alimzhan Akhmetov, director of the Center for International Security and Policy in Nur-Sultan.[12] While Kazakhstan itself is not developing AI-enabled weapons, activists there understand that regional cooperation is needed to stop Russia from deploying such weapons. Akhmetov takes a moral and legal approach to banning LAWS. He cites the Martens Clause,[13] first introduced in the 1899 Hague Convention II – Laws and Customs of War on Land and added to the Geneva Conventions in the Additional Protocols of 1977:
In cases not included in the present Protocol or other international treaties, civilians and combatants remain under the protection and the rule of the law of nations, as they result from the usages established among civilized peoples, from the laws of humanity and the dictates of public conscience.[14]
Applying the Martens Clause to a ban on LAWS, Akhmetov argues that public conscience demands that “principles of morality and ethics, the exclusive sphere of human responsibility” be applied, and that “robots are not able to appreciate the value of human life” in order to make life-or-death decisions.[15] It is arguable that countries should already stop developing LAWS on the basis of the Martens Clause. According to the Arms Control Association:
The Martens clause requires in particular that emerging technology comply with the principles of humanity and dictates of public conscience. Fully autonomous weapons would fail this test on both counts.[16]
Meanwhile, Ethics and Technology has managed to hold meetings to work on a range of issues, including meaningful human control; current and new legal acts; responsibility for system operations; elements of system operation; defence and attack; and expansion beyond the military sphere.[17] So far, the NGO has not been banned and its campaign against LAWS has not faced major overt opposition from the Russian government. In fact, the Russian government has simply moved forward with developing its AI-enabled weapon systems. Most notably, on 22 August 2019, Russia sent a Skybot F-850 robot called FEDOR (Final Experimental Demonstration Object Research) to the International Space Station as the sole passenger on a Soyuz rocket. FEDOR, a humanoid-looking robot, was sent to practise routine maintenance and repair tasks on the space station.[18] But FEDOR is a dual-use robot that is also capable of lethal actions: in a video shared on Twitter by Russia’s then Deputy Prime Minister Dmitry Rogozin, FEDOR can be seen firing pistols with high accuracy at a shooting range.[19]
While dual-use machines arguably cannot be prevented, Russia has developed other LAWS that are less likely to be used outside of war. They are not fully autonomous, operating mostly by remote control, but could potentially soon be fitted with AI technologies to work independently. They include tanks, mine-clearing bulldozers, and surveillance and utility seacraft.[20] The most advanced of these might be the Soratnik, an “unmanned vehicle capable of autonomously picking, tracking and shooting targets” without direct active human intervention.[21]
The rising tide against LAWS
Many countries have already expressed a desire for a global ban on AI-enabled weapons, with 29 agreeing to such a declaration, Jordan being the latest addition.[22] Significantly, the People's Republic of China is also on this list, arguing for a ban on the use of fully autonomous weapons, but not on their development or production.[23] Arguably, China has a point in wanting to continue developing such machines, because they can have dual-use capabilities. While AI-enabled machines with peaceful functions can be converted into killing machines, an outright ban on development would also prevent useful machines such as the above-mentioned Skybot F-850.
At the recent CCW meeting in Geneva, the US and Russia continued to argue for the legitimisation of LAWS, but they appear to be waging a “losing fight against the inevitable treaty that’s coming for killer robots.”[24] The only countries known to be developing LAWS are China, Israel, Russia, South Korea, the United Kingdom and the US. None of them is so far along the AI weapons development path that the course cannot be reversed. Just as the ban on landmines succeeded, a ban on AI weapons is within reach.
Support for the Campaign to Stop Killer Robots must be worldwide and cannot be country-specific. With enough global support, Russia can be persuaded to come around to supporting a ban as well, since the huge costs involved in AI weapons development put it at a disadvantage compared to the US, which is further along in the development of and investment in AI weapons. Russia reportedly spends an estimated USD 12.5 million on developing AI for use in weapons, although the true figure is unknown.[25] A ban on LAWS would disarm the US, neutralise its economic advantage, and as a result actually benefit Russia’s military position.
Action steps
Activists working in this field need to step up their work by connecting with campaigners in more countries than those currently in the movement.[26] Establishing organisations in more countries could eventually persuade additional nations to turn against LAWS at the United Nations, isolating the few countries that are developing such weapons. Another approach is convincing tech workers to pressure their companies not to develop such weapons. This has worked at Google, where workers refused to work on an AI project with implications for weapons.[27] Finally, NGOs should engage their country’s government directly in productive and frank discussions. This can happen in Russia as elsewhere: the budget for LAWS in Russia could be better spent on other priorities. While the global campaign might seem overly optimistic, a ban on such lethal weapons appears inevitable, and governments should be persuaded to consider the benefits of coming on board.
Footnotes
[1] Piper, K. (2019, 21 June). Death by algorithm: the age of killer robots is closer than you think. Vox. https://www.vox.com/2019/6/21/18691459/killer-robots-lethal-autonomous-weapons-ai-war
[2] Thompson, L. (2019, 8 March). Pentagon May Come To Regret Prioritizing R&D Spending Over Weapons It Needs Now. Forbes. https://www.forbes.com/sites/lorenthompson/2019/03/08/weapons-budgets-are-way-up-so-why-isnt-the-pentagon-buying-weapons-faster/#29b7c2a84263
[3] Daws, R. (2019, 31 May). Putin outlines Russia’s national AI strategy priorities. AI News. https://artificialintelligence-news.com/2019/05/31/putin-russia-national-ai-strategy-priorities
[4] RT. (2018, 5 May). Race of the war machines: Russian battlefield robots rise to the challenge. RT. https://www.rt.com/news/425902-war-machines-russian-robots
[5] bne IntelliNews. (2019, 31 May). Russia Raises $2Bln for Investment in Artificial Intelligence. The Moscow Times. https://www.themoscowtimes.com/2019/05/31/russia-raises-2bln-for-investment-in-artificial-intelligence-a65824
[6] Shachtman, N. (2007, 18 October). Robot Cannon Kills 9, Wounds 14. Wired. https://www.wired.com/2007/10/robot-cannon-ki
[7] https://www.stopkillerrobots.org/members
[9] https://www.giplatform.org/events/group-governmental-experts-lethal-autonomous-weapons-systems-gge-laws-2nd-meeting-2019
[11] https://twitter.com/alenapopova/status/1164166965361074181
[13] https://en.wikipedia.org/wiki/Martens_Clause
[14] Email exchange with author, 29 July 2019.
[15] Ibid.
[16] Docherty, B. (2018, October). REMARKS: Banning ‘Killer Robots’: The Legal Obligations of the Martens Clause. Arms Control Association. https://www.armscontrol.org/act/2018-10/features/remarks-banning-%E2%80%98killer-robots%E2%80%99-legal-obligations-martens-clause
[18] Cuthbertson, A. (2019, 22 August). Russia sends gun-toting humanoid robot into space. The Independent. https://www.independent.co.uk/life-style/gadgets-and-tech/news/russia-robot-space-fedor-humanoid-iss-a9074716.html
[19] https://twitter.com/Rogozin/status/852869162493935617
[20] RT. (2018, 5 May). Op. cit.
[21] Ibid.
[22] Campaign to Stop Killer Robots. (2019, 22 August). Russia, United States attempt to legitimize killer robots. https://www.stopkillerrobots.org/2019/08/russia-united-states-attempt-to-legitimize-killer-robots
[23] Campaign to Stop Killer Robots. (2019, 21 August). Country Views on Killer Robots. https://www.stopkillerrobots.org/wp-content/uploads/2019/08/KRC_CountryViews21Aug2019.pdf
[24] Campaign to Stop Killer Robots. (2019, 22 August). Op. cit.
[25] Bendett, S. (2018, 4 April). In AI, Russia Is Hustling to Catch Up. Defense One. https://www.defenseone.com/ideas/2018/04/russia-races-forward-ai-development/147178
[26] https://www.stopkillerrobots.org/members
[27] Campaign to Stop Killer Robots. (2019, 14 January). Rise of the tech workers. https://www.stopkillerrobots.org/2019/01/rise-of-the-tech-workers
Notes:
This report was originally published as part of a larger compilation: “Global Information Society Watch 2019: Artificial intelligence: Human rights, social justice and development”.
Creative Commons Attribution 4.0 International (CC BY 4.0) - Some rights reserved.
ISBN 978-92-95113-12-1
APC Serial: APC-201910-CIPP-R-EN-P-301
ISBN 978-92-95113-13-8
APC Serial: APC-201910-CIPP-R-EN-DIGITAL-302