Lethal Autonomous Weapons: The Gendered Consequences of 'Killer Robots'
By Kirthi Jayakumar
Technological advancements have long been deployed to enhance military capabilities because they are known to increase efficiency and reduce risk (Wilcox, 2017). However, the deployment of AI can cause unintended escalation of conflict, misidentification of targets and civilian harm, vulnerability to cyberattacks and manipulation, and an arms race, alongside a range of ethical dilemmas emerging from the question of accountability and the black-box nature of AI decision-making (Acheson, 2020a). One area where the benefits and disadvantages of military AI come to a head is "Lethal Autonomous Weapons Systems," or LAWS, more commonly known as killer robots.
LAWS represent one of the most contentious developments in modern military technology (Manjikian, 2014). They have the capacity to select and engage targets without any meaningful human control, fundamentally altering the nature of armed conflict in the process (Manjikian, 2014; Acheson, 2020a). The challenges their adoption and use raise are too significant to be overlooked in any case made for their speedy deployment.
The Anatomy of LAWS: Technical Capabilities and Current State
LAWS operate through the integration of AI, machine learning algorithms, and sensor technologies into weapon systems (Sexton & Carneal, 2024; Walsh, 2022). Together, these technologies allow such systems to process vast amounts of data of various kinds from a wide range of sources in order to identify, classify, and prioritize potential targets (Sexton & Carneal, 2024). They are deployed both as defensive systems that intercept incoming projectiles and as offensive systems capable of autonomous target selection and engagement.
The technical architecture of LAWS comprises perception systems that gather environmental data, decision-making algorithms that process these data according to pre-programmed parameters, and actuation mechanisms that execute the use of force (Walsh, 2022). With machine learning capabilities, these systems can adapt their behaviour based on experience, a feature that has raised concerns about their potential to evolve beyond their original programming constraints.
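To make this three-layer architecture concrete, the sketch below models it as a toy sense-decide-act pipeline. It is purely illustrative: every name (Detection, perceive, decide, actuate) and every parameter is hypothetical and drawn from no real weapon system. The point is only to show how a classifier's label and confidence score become the sole gate between detection and the use of force.

```python
# A minimal, illustrative sketch of the perception -> decision -> actuation
# loop described above. All names and parameters are hypothetical.
from dataclasses import dataclass

@dataclass
class Detection:
    track_id: int
    label: str         # classifier output, e.g. "hostile" or "civilian"
    confidence: float  # classifier confidence in [0, 1]

def perceive(sensor_frame: list[dict]) -> list[Detection]:
    """Perception layer: turn raw sensor returns into classified tracks."""
    return [Detection(d["id"], d["label"], d["conf"]) for d in sensor_frame]

def decide(tracks: list[Detection], threshold: float = 0.9) -> list[Detection]:
    """Decision layer: apply pre-programmed engagement parameters.

    Everything hinges on the classifier: a mislabelled civilian with a
    high confidence score clears this gate like any other track.
    """
    return [t for t in tracks if t.label == "hostile" and t.confidence >= threshold]

def actuate(engagements: list[Detection]) -> None:
    """Actuation layer (stubbed out here): in a fielded system this step
    would execute force with no further human review."""
    for t in engagements:
        print(f"ENGAGE track {t.track_id} (confidence {t.confidence:.2f})")

# One pass of the loop over simulated sensor data.
frame = [
    {"id": 1, "label": "hostile", "conf": 0.95},
    {"id": 2, "label": "civilian", "conf": 0.60},
]
actuate(decide(perceive(frame)))
```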
Even as these technological advancements offer military advantages, there are still significant limitations. Current AI systems struggle with context recognition and perform poorly at distinguishing between combatants and civilians in complex environments (Viveros Álvarez, 2024). Recognition algorithms also carry documented biases, especially in identifying people of colour and people from certain ethnic backgrounds (Acheson, 2020a). This is particularly dangerous when these systems make life-or-death decisions.
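The bias concern can be made concrete with a toy calculation. The numbers below are invented purely for illustration, not drawn from any cited study; they show how a system can report high overall accuracy while the rate at which it wrongly flags non-combatants as targets differs sharply between a group well represented in its training data and one that is not.

```python
# Illustrative only: per-group false-positive rates from hypothetical
# evaluation counts. A single aggregate accuracy figure hides the gap.
eval_counts = {
    # group: (non-combatants wrongly flagged, non-combatants correctly cleared)
    "well_represented_group": (2, 998),
    "underrepresented_group": (40, 960),
}

for group, (fp, tn) in eval_counts.items():
    fpr = fp / (fp + tn)  # share of non-combatants wrongly flagged as targets
    print(f"{group}: false-positive rate = {fpr:.1%}")

# Output: 0.2% vs 4.0% -- a 20x disparity in wrongly flagged civilians
# that an overall accuracy number would never reveal.
```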
Nearly 400 partly autonomous weapon systems have been deployed or are being developed in 12 countries, including China, France, Israel, the Republic of Korea, Russia, the UK, and the US (Acheson, 2020b). The Republic of Korea has deployed mechanised sentries in the demilitarized zone. Israel has deployed the Iron Dome to detect and destroy short-range rockets. The US operates semi-autonomous missile-defence systems such as Patriot and Aegis, and has tested an autonomous anti-submarine vessel that can sink submarines and ships with no one on board. The UK has developed Taranis, a drone that can avoid radar detection and fly in autonomous mode. Russia has built a robot tank that can be fitted with a machine gun or grenade launcher, as well as a fully automated gun that uses artificial neural networks to choose targets. China has developed weapon swarms of small drones that can be fitted with heat sensors and programmed to attack anything emitting a body temperature (Acheson, 2020b). Turkey has deployed autonomous and semi-autonomous drones, such as the Bayraktar TB2 and other advanced platforms, in Syria (Acheson, 2020b).
A look through the feminist lens
LAWS represent an extreme manifestation of what Enloe (2016) and Cohn (1987) called militarized masculinities, which prioritize technological solutions to war over diplomatic engagement. The development and deployment of LAWS valorise technological dominance and lethal capability over conflict transformation and resolution, such that a "solution" means attaining and maintaining military superiority and dominance (Acheson, 2020a). Militarized masculinities make male bodies, often racialised ones, "expendable" by determining that they are inherently violent and a threat to the system (WILPF, 2025). This rests on the assumption that cis het men of colour are inherently violent and that cis het women of colour need protection from them, reiterating a long-held colonial idea (Pratt, 2013).
A feminist lens invites us to ask the most pressing question of all: should we place the capacity to decide and execute a decision to kill a human being in the hands of a machine at all? LAWS have been considered "superior" to human soldiers because they would make the "perfect killing machine": devoid of conscience and empathy, they lack the emotion that could, potentially, hold a human soldier back from killing another (Acheson, 2020a). A feminist lens asks us to interrogate why we would lean into this level of dehumanization so far as to want a killing machine at all.
The very nature of LAWS suggests an inherent tendency to violate human rights and dignity, as the human is almost entirely out of the loop in key life-and-death decisions. There should be meaningful human control over decisions to use lethal force, and LAWS take any form of human control out of the equation. To the military-industrial complex, which is inherently built on necropolitical worldviews, the lives of the racialised other matter little against the specific aims of war and war-mongering policies (Acheson, 2020b). Accordingly, the imprecision of LAWS, including their potential to confuse civilians with combatants and their failure to distinguish among targets by race or gender, has not received enough attention in the pursuit of military superiority (Walsh, 2022). This is largely because these systems are trained on datasets that underrepresent women, non-binary folks, and people from marginalized backgrounds, which results in tangible biases in target recognition and threat assessment (Acheson, 2020a; Acheson, 2020b). The deliberate targeting of racialized men also points to the White supremacist and colonial attitudes underlying war (Acheson, 2020b).
The deployment of LAWS creates acute risks for civilian populations, significantly exacerbating existing discrimination and vulnerability in the process. Women, non-binary folks, and children tend to comprise the majority of civilian casualties and face heightened vulnerability from LAWS, given these systems' inability to accurately distinguish between combatants and non-combatants (Human Rights Watch, 2020). The deployment of these weapons restricts mobility and exacts a psychological cost by normalizing an ambient fear, given their unpredictable nature. LAWS may also engage targets based on algorithmic assessments that lack contextual understanding of civilian activities (Sharkey, 2019).
Given the complete absence of a human in the loop, LAWS are unlikely to comply with the established standards of international humanitarian, criminal, and human rights law (International Committee of the Red Cross, 2021). This poses serious questions around accountability, liability, and legal responsibility: what does justice look like when the entity deciding the target, the extent of force, and the scale of violence is not human? This legal grey area has unfortunately been exploited rather than addressed; more actors are deploying LAWS for their efficiency than are interrogating their legality or developing meaningful legislative regimes that cover accountability for their deployment and use (International Committee of the Red Cross, 2021).
As has been the case with many weapon systems, particularly small arms and light weapons, autonomous weapons can be deployed outside of conflict settings, such as in border control, policing, and surveillance, and by actors beyond state militaries, such as private military and security contractors. The proliferation of LAWS outside conflict settings also poses the very real risk of lowering the threshold for war, as the cost and effort of waging war become far lower than they would be in a purely human-driven endeavour (Human Rights Watch, 2020).
Steering away from violent futures
The development and deployment of LAWS represents a critical juncture in the evolution of warfare technology. While proponents emphasize potential military advantages and reduced risks to soldiers, a feminist analysis reveals profound concerns about civilian protection, gender equity, and human dignity. The consequences of LAWS extend far beyond immediate battlefield effects, encompassing psychological impacts, social disruption, and the broader militarization of international relations. As the international community grapples with regulatory responses to LAWS, incorporating feminist perspectives upfront rather than as an afterthought is essential for developing policies that protect civilian populations and preserve human agency in life-and-death decisions.
References:
Acheson, R. (2019). Autonomous weapons and gender. Women's International League for Peace and Freedom.
Acheson, R. (2020a). Autonomous weapons and patriarchy. Reaching Critical Will. https://reachingcriticalwill.org/images/documents/Publications/aws-and-patriarchy.pdf
Acheson, R. (2020b). A WILPF guide to killer robots. Reaching Critical Will. https://www.reachingcriticalwill.org/images/documents/Publications/wilpf-guide-aws.pdf
Cohn, C. (1987). Sex and death in the rational world of defense intellectuals. Signs: Journal of Women in Culture and Society, 12(4), 687-718.
Cohn, C. (Ed.). (2013). Women and wars: Contested histories, uncertain futures. Polity Press.
Enloe, C. (2016). Globalization and militarism: Feminists make the link. Rowman & Littlefield.
Human Rights Watch. (2020). Stopping killer robots: Country positions on banning fully autonomous weapons.
International Committee of the Red Cross. (2021). Autonomous weapons systems: Technical, military, legal and humanitarian aspects.
Manjikian, M. (2014). Becoming unmanned: The gendering of lethal autonomous warfare technology. International Feminist Journal of Politics, 16(1), 52-53.
Pratt, N. (2013). Reconceptualizing gender, reinscribing racial-sexual boundaries in international security: The case of UN Security Council Resolution 1325 on "Women, Peace and Security". International Studies Quarterly, 57(4), 772-783.
Sexton, M., & Carneal, E. (2024). Lethal autonomous weapons 101. Third Way. https://www.thirdway.org/memo/lethal-autonomous-weapons-101
Sharkey, N. (2019). Autonomous weapons systems, killer robots and human dignity. Ethics and Information Technology, 21(2), 75-87.
Viveros Álvarez, J. S. (2024). The risks and inefficacies of AI systems in military targeting support. ICRC Humanitarian Law & Policy Blog. https://blogs.icrc.org/law-and-policy/2024/09/04/the-risks-and-inefficacies-of-ai-systems-in-military-targeting-support/
Walsh, T. (2022). The problem with artificial (general) intelligence in warfare. Centre for International Governance Innovation. https://www.cigionline.org/articles/the-problem-with-artificial-general-intelligence-in-warfare/
Wilcox, L. (2017). Embodying algorithmic war: Gender, race, and the posthuman in drone warfare. Security Dialogue, 48(1), 13.