Are we on the edge of science fiction becoming the future of warfare?
‘Killer robots’ – machines that hunt down and exterminate humanity – have captured the public’s imagination in science fiction and dystopian genres for generations. While we are still far from Terminators roaming the streets, the sci-fi of the past is starting to become today’s reality.
From your social media feed to cyberattacks, from increasingly autonomous drones to intelligent missile systems – all mimicking some aspect of human intelligence – AI is becoming the arsenal of 21st-century warfare. Demand is growing for machines that can tackle problems once reserved for the human mind, such as pattern and speech recognition, decision-making, and data analysis.
Military AI focuses on three things: cyber-attack software, data crunchers, and autonomous machines. The accuracy and precision of modern weapons are pushing human combatants off today’s battlefields, as AI acts as a ‘force multiplier’ – a tool that allows the same number of people to achieve more without sacrificing human lives.
Big tech befriending the military
These methods are not new. Large corporations like Google, Facebook, and Amazon have long used big data – the vast and seemingly endless amounts of information generated online – to predict customer wants and needs and to build personalized algorithms. Now AI is transforming the military in similar ways.
Defense establishments worldwide have realized that their AI needs and potential are not being met. In the US, for instance, the Pentagon’s Defense Advanced Research Projects Agency (DARPA) has approached Silicon Valley for tools that can process ever larger amounts of information and enhance its autonomous technologies.
A wingman or sci-fi horrors?
Another area where AI shines is mass surveillance and counter-insurgency: scanning images from millions of CCTV cameras or drones and following multiple potential targets at once. This type of big data surveillance is already being used with significant success in counter-insurgency operations. Round-the-clock surveillance beats the human eye in every respect, leaving insurgent groups only two options – to stay constantly on the move or to hide.
In conflict, militaries increasingly prefer to keep humans out of harm’s way when the stakes of being injured or killed are high, and machines can take their place. That is not yet the case, however, with high-end intelligence-gathering robots. One such technology is the American-built unarmed drone ‘Global Hawk,’ which can carry out orders without vulnerable data links. Money is now flowing into intelligent military aircraft that can fully fly themselves: under development are France’s ‘Dassault Neuron,’ Russia’s Sukhoi S-70, and Australia’s AI escort plane ‘Loyal Wingman.’
A glimpse of the future conflicts
Countries’ drive to build and produce these new technological combatants is likely to tip the balance of future warfare, and it has already sparked an arms race between the US, Russia, and China.
Some of these smart lethal weapons are already in use in the war in Ukraine. The Russian ‘suicide drone’ KUB-BLA, which uses artificial intelligence to identify targets, has appeared in images of the ongoing war. Recently, the Biden administration announced it would provide Ukraine with 100 US-made ‘Switchblade’ killer drones – robotic smart bombs fitted with cameras, guidance systems, and explosives – which have also been used in Afghanistan.
Concerns over the ethics of war and fears of humans losing control over future robotic weapons have been growing, prompting strong diplomatic efforts to prohibit and regulate AI weapons. Human Rights Watch has called for a ban on fully autonomous AI units, comparing them to biological and chemical weapons, and the United Nations has held several meetings to discuss the matter and advocate for restrictions.
The technology behind these weapons is still in its early stages and error-prone, though it is developing rapidly. We are not facing the Terminator yet, but it may not be long before warfare starts to look a lot like a science fiction movie.
Help us fight for world peace, along with all the other pressing issues of our world, by contributing however you can. You can write for us, share our articles, advise, volunteer, intern, donate, etc.
We use this help to provide awareness, training, and education to youth from underserved communities (though our material is available for all) to help them become better leaders of tomorrow.
Thank you and take care!
Kristiana Nitisa is an investigative journalist based in Sweden. She is also a research journalist at the International Youths Organization for Peace and Sustainability.
Inputs and Edits by Sovena Ngeth.