The Military Origins of AI

By Kirthi Jayakumar

Artificial Intelligence qualifies as what Audre Lorde (1983) called the “master’s tools,” not only because it is, more often than not, created by powerful, large, and wealthy companies and governments, but also because of its origins in the military-industrial complex. The roots of what we know and engage with now as AI go back to the Second World War, and its development and evolution have been closely entwined with the history of military technology. While some may argue that the concept emerged earlier and has long been the subject of speculation and creativity in science fiction, this article focuses on the practical history as it evolved from the Second World War onward. This history also proceeds with the acknowledgement that AI emerged on the back of earlier breakthroughs in maths, physics, and computer science. Understanding this history is a useful foundation for feminist attempts to address and engage with AI and its development, deployment, and use.

The beginning

During the Second World War, Nazi Germany relied on the Enigma machine to secure its military communications (Abbate, 2012; Bowen, 2024). The Enigma was an electromechanical rotor cipher machine that looked a bit like a typewriter. When a key was pressed, a different letter lit up on a lampboard, and this scrambled letter was what the operator transmitted in the coded message. The machine used rotating wheels (rotors) and electrical circuits to scramble letters. Each rotor had 26 electrical contacts that mapped each letter to a different one, and the rotors moved with each key press, constantly changing the encryption pattern. A plugboard added further layers of scrambling, and operators changed the machine settings daily. Unscrambling a message required the recipient to have an identical machine set to the same starting configuration (Abbate, 2012).
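
To make the mechanism concrete, here is a minimal sketch (in Python, with invented function names) of a single-rotor substitution cipher with stepping, loosely modelled on the rotor principle described above. It is an illustration only: the real machine used three or more rotors, a reflector, and a plugboard, all omitted here.

```python
import string

ALPHABET = string.ascii_uppercase

# Fixed "wiring" of our single illustrative rotor: a permutation of the alphabet
# (this particular permutation is a commonly published wiring of a historical Enigma rotor).
WIRING = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"
INVERSE = [WIRING.index(c) for c in ALPHABET]  # inverse permutation, used for decryption


def encrypt(message: str, start_position: int) -> str:
    """Pass each letter through the rotor, stepping the rotor after every key press."""
    position, out = start_position, []
    for ch in message.upper():
        if ch not in ALPHABET:
            continue  # the machine handled letters only
        p = ALPHABET.index(ch)
        # Offset by the rotor position, map through the wiring, then undo the offset.
        out.append(ALPHABET[(ALPHABET.index(WIRING[(p + position) % 26]) - position) % 26])
        position = (position + 1) % 26  # the rotor steps, so the pattern changes constantly
    return "".join(out)


def decrypt(ciphertext: str, start_position: int) -> str:
    """Decryption requires an identical rotor set to the same starting configuration."""
    position, out = start_position, []
    for ch in ciphertext:
        c = ALPHABET.index(ch)
        out.append(ALPHABET[(INVERSE[(c + position) % 26] - position) % 26])
        position = (position + 1) % 26
    return "".join(out)


print(encrypt("WEATHER REPORT", start_position=7))   # scrambled text
print(decrypt(encrypt("WEATHER REPORT", 7), 7))      # back to "WEATHERREPORT"
```

Because the rotor steps with every key press, the same plaintext letter encrypts differently each time it appears, which is what made simple frequency analysis ineffective against the machine.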

Cracking this cipher by hand was near impossible: the number of possible machine settings exceeded 150 million million million (roughly 1.5 × 10^20). However, in the 1930s, Polish mathematicians broke early versions of the Enigma cipher and shared their findings with Britain (Bowen, 2024; Abbate, 2012). Building on this work, Alan Turing and his colleagues at Bletchley Park created the Bombe, an electromechanical device that tested thousands of possible settings automatically to find the daily key. Turing and his team developed techniques that sped up the breaking of German ciphers through pattern recognition in coded messages, automated logical reasoning, machine-assisted decision-making, and statistical analysis of complex data. This was epoch-making: machine-assisted code breaking had become possible. The intelligence drawn from it gave the Allies unprecedented military and strategic advantages and altered the trajectory of the Second World War (Bowen, 2024; Abbate, 2012).
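
The Bombe’s core move can be imitated, in toy form, against the simplified cipher sketched above: given a “crib” (a guessed fragment of plaintext, such as the opening of a routine weather report), try every possible starting setting and keep the one that reproduces the crib. The names and single-rotor cipher below are illustrative only; the real Bombe eliminated settings that led to logical contradictions across rotor orders and plugboard hypotheses, rather than brute-forcing a handful of positions.

```python
import string
from typing import Optional

ALPHABET = string.ascii_uppercase
WIRING = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"           # same toy rotor as in the sketch above,
INVERSE = [WIRING.index(c) for c in ALPHABET]   # repeated so this snippet runs on its own


def encrypt(message: str, start_position: int) -> str:
    position, out = start_position, []
    for ch in message.upper():
        if ch in ALPHABET:
            p = ALPHABET.index(ch)
            out.append(ALPHABET[(ALPHABET.index(WIRING[(p + position) % 26]) - position) % 26])
            position = (position + 1) % 26
    return "".join(out)


def decrypt(ciphertext: str, start_position: int) -> str:
    position, out = start_position, []
    for ch in ciphertext:
        c = ALPHABET.index(ch)
        out.append(ALPHABET[(INVERSE[(c + position) % 26] - position) % 26])
        position = (position + 1) % 26
    return "".join(out)


def recover_setting(ciphertext: str, crib: str) -> Optional[int]:
    """Test every possible starting position; keep the one whose output begins with the crib."""
    for candidate in range(26):
        if decrypt(ciphertext, candidate).startswith(crib):
            return candidate
    return None


# Simulate an intercept: someone encrypted a message with a setting we do not know,
# but we guess that it opens with "WETTERBERICHT" (weather report), a classic crib.
intercepted = encrypt("WETTERBERICHT FUER DIE NACHT", start_position=19)
print(recover_setting(intercepted, crib="WETTERBERICHT"))  # recovers 19
```

With only 26 starting positions this search is trivial; the point of the Bombe was to make the equivalent search feasible when the space of settings ran into the quintillions.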

Breaking the Enigma cipher established the key principles that came to underlie AI, namely automated pattern recognition, statistical analysis of complex data, machine-assisted decision-making, and the use of computational power to solve problems far beyond unaided human capability (Copeland, 2004).

Evolution after the Second World War

The term “Artificial Intelligence” itself was coined in 1956 at the Dartmouth Workshop by John McCarthy, an American computer and cognitive scientist (McCorduck, 2004). In 1957, the USSR launched Sputnik, sparking efforts in the United States to stay ahead in scientific research as a means of buttressing military power. That same year, Frank Rosenblatt designed the perceptron, the first neural network computer (McCorduck, 2004), laying the groundwork for the dual-use (military and civilian) trajectory that AI research would follow. In 1958, in response to Sputnik, the United States Department of Defense established the Advanced Research Projects Agency (ARPA, later renamed DARPA), which funded research and development in support of military and industrial strategy and built the foundations for the development of AI (Crevier, 1993). In 1959, Arthur Samuel coined the term “machine learning” to describe computers’ capacity to improve their performance through experience rather than explicit programming.
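
A minimal sketch, in Python with invented toy data, shows what those two ideas look like in practice: a Rosenblatt-style perceptron that is never told the rule it is supposed to compute (here, logical OR), but adjusts its weights from labelled examples, which is precisely the “improvement through experience rather than explicit programming” that Samuel described.

```python
# A single artificial neuron trained with the perceptron update rule on a toy task:
# learn the logical OR function from examples rather than from an explicit rule.
examples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

weights = [0.0, 0.0]
bias = 0.0
learning_rate = 0.1


def predict(x):
    """Fire (output 1) if the weighted sum of the inputs plus the bias crosses zero."""
    return 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0


# "Experience": repeatedly nudge the weights whenever a prediction is wrong.
for epoch in range(10):
    for x, target in examples:
        error = target - predict(x)
        weights = [w + learning_rate * error * xi for w, xi in zip(weights, x)]
        bias += learning_rate * error

print([predict(x) for x, _ in examples])  # [0, 1, 1, 1] once the weights have settled
```

Nothing in the training loop encodes the OR rule itself; the behaviour emerges from the examples.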

In 1969, ARPA launched ARPANET, the first packet-switched network, built for military command-and-control and research communications and informed by earlier work on networks that could survive a nuclear attack. Its decentralized architecture embodied principles of distributed processing and fault tolerance that would later underpin AI infrastructure. Over time, ARPANET evolved into what we now know as the Internet, creating the infrastructure essential for today’s AI revolution: massive data collection, distributed computing, and global collaboration.
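
A toy sketch of the fault-tolerance idea (the network, node names, and routing function below are invented for illustration): in a decentralized network, delivery does not depend on any single node, because traffic can be routed around a failure.

```python
from collections import deque

# A toy network: each node lists its directly connected neighbours.
LINKS = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C", "E"],
    "E": ["D"],
}


def find_route(source, destination, failed=frozenset()):
    """Breadth-first search for any path from source to destination that avoids failed nodes."""
    queue, seen = deque([[source]]), {source}
    while queue:
        path = queue.popleft()
        if path[-1] == destination:
            return path
        for neighbour in LINKS[path[-1]]:
            if neighbour not in seen and neighbour not in failed:
                seen.add(neighbour)
                queue.append(path + [neighbour])
    return None  # destination unreachable


print(find_route("A", "E"))                # ['A', 'B', 'D', 'E']
print(find_route("A", "E", failed={"B"}))  # ['A', 'C', 'D', 'E']: traffic reroutes around the failure
```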

The second wave of AI research, however, was fuelled by business interests, with major internet companies taking the lead in the 2000s (Chandler, 2021). Since the 2010s, advances in computing power, data availability, and deep learning have driven major developments in the field.

AI has undoubtedly had military origins, and its application for military purposes is not surprising. Contemporary developments in technology continue to refine these capabilities, and military AI applications now cover the full spectrum of defence operations.

The gendered history

The evolution of military technology has also been significantly shaped by gendered assumptions about warfare, combat roles, and technological capability (Enloe, 2000). From ancient warfare to modern AI-powered systems, military innovation has reflected and reinforced gender hierarchies, creating technologies that embody militarized masculinities while systematically excluding feminist perspectives and needs (Howes & Herzenberg, 2015). The military-industrial complex that emerged in the 20th century institutionalized these militarized masculinities (Enloe, 2000).

The gendered impacts were evident from the outset. Women and non-binary people were rarely included in the industry in any meaningful capacity, and most technologies were designed, built, and tested around the male body. The very framing used to justify the development and use of weapons is typically couched in militarized masculinities. Women and non-binary people are often excluded from military education and, where present, relegated to support roles. As a result, the technological priorities, operational requirements, and design specifications in these spaces are created and defined by the cis het male leaders in power (Sjoberg, 2014; Enloe, 2000).

Weapons and weapon systems that have emerged from the military-industrial complex are gendered and have gendered impacts that were not taken into consideration because of the systematic nature of gender-based exclusion (Enloe, 2000). Designing equipment and spaces without accounting for the diversity of human bodies, and for the resulting spectrum of operational needs and strategic perspectives, produces suboptimal performance for anyone who falls outside the narrow frame of the cis het male user. This exclusion has had implications for civilian life as well: everything from automobile design and crash-test dummies (Perez, 2019) to body armour sizing (Coltman et al., 2021) and cockpit configurations (Weber, 1997) has been centred on the cis het male body. Military research institutions and weapons manufacturing are predominantly cis het male spaces (Cockburn, 1985). Given that these were the sites from which revolutionary technologies like the Internet and AI emerged, it is no surprise that they reflect narrow assumptions about users, operational contexts, and strategic objectives (Wajcman, 2004). Nor is it a surprise that many of these technologies have amplified gender-based violence, either by facilitating its perpetration or by becoming a new medium for its continuation.

Military technology development has historically prioritized offensive over defensive capabilities, reflecting long-held patriarchal and cultural associations with aggression and dominance. As a result of this prioritization, resource allocation, research directions, and patterns of technological innovation have followed the institution of militarized masculinity (Enloe, 2000; Sjoberg, 2014). Technologies oriented toward feminist approaches to conflict transformation, repair and healing, accountability and justice, and collective work have received far less investment and attention.

Add women and stir

The response to gendered exclusion, unfortunately, has largely been to add women (seldom non-binary people) and stir the mix (Dharmapuri, 2011). In practice, this has meant increasing the number of women in military forces and defence industries and highlighting women’s achievements in military spaces. The add-and-stir approach has resulted in the tokenization of women. They have been pushed into what Iris Marion Young (2005) called “inhibited intentionality,” engaging in what Deniz Kandiyoti (1988) called the patriarchal bargain: women prop up patriarchal ideals and militarized masculinities in order to hold onto their power and autonomy within a very constrained site of engagement. Consequently, there is very little effort (in many cases, none at all) toward interrogating and dismantling the military-industrial complex.

There is also an immediate impact on the women themselves: they are reduced to essentialized tokens who must perform their gender identity in full alignment with what the patriarchal system expects of women, and their engagement is limited to system-approved roles that concern “women’s issues” and nothing more. Bringing more women into these spaces does not guarantee feminist engagement, but it serves the optics of inclusion, however tokenistic. In the bigger picture, this mindset also enables the sidelining of the gendered experience of military service: women and non-binary people face discrimination, violence, abuse, and harassment, as well as difficulties with post-deployment readjustment. The reduction of women to mere tokens, and the exclusion of non-binary people altogether, means that major issues like these are not taken into consideration.

Feminist engagements with AI need to embed an interrogation of power, adopt an intersectional and decolonial framework that examines the entire AI lifecycle, and prioritize feminist values of accountability and care. Understanding the history of AI and its military use is a good starting point because it reveals the institutional apparatus, beset with power imbalances, that has enabled and informed the emergence and evolution of AI.

References

  • Abbate, J. (2012). Recoding Gender: Women's Changing Participation in Computing. MIT Press.

  • Lorde, A. (1983). The master’s tools will never dismantle the master’s house. In C. Moraga & G. Anzaldúa (Eds.), This Bridge Called My Back: Writings by Radical Women of Color (pp. 94-101). New York: Kitchen Table Press.

  • Bowen, J. P. (2024). Alan Turing: Breaking the Code, Computing, and Machine Intelligence. In The Arts and Computational Culture: Real and Virtual Worlds (pp. 75-94). Cham: Springer Nature Switzerland.

  • Benko, A., & Lányi, C. S. (2009). History of artificial intelligence. In Encyclopedia of Information Science and Technology, Second Edition (pp. 1759-1762). IGI Global.

  • Chandler, K. (2021). Does military AI have gender? Understanding bias and promoting ethical approaches in military applications of AI. UNIDIR. https://unidir.org/files/2021-12/UNIDIR_Does_Military_AI_Have_Gender.pdf

  • Cockburn, C. (1985). Machinery of dominance: Women, men, and technical know-how (pp. 230-31). London: Pluto Press.

  • Coltman, C. E., Brisbine, B. R., Molloy, R. H., Ball, N. B., Spratford, W. A., & Steele, J. R. (2021). Identifying problems that female soldiers experience with current-issue body armour. Applied Ergonomics, 94, 103384.

  • Copeland, B. J. (2004). The Essential Turing: Seminal Writings in Computing, Logic, Philosophy, Artificial Intelligence, and Artificial Life plus The Secrets of Enigma. Oxford University Press.

  • Crevier, D. (1993). AI: The Tumultuous Search for Artificial Intelligence. New York: Basic Books.

  • Dharmapuri, S. (2011). Just add women and stir? The US Army War College Quarterly: Parameters, 41(1), 4.

  • Enloe, C. (2000). Maneuvers: The International Politics of Militarizing Women's Lives. University of California Press.

  • Haenlein, M., & Kaplan, A. (2019). A brief history of artificial intelligence: On the past, present, and future of artificial intelligence. California Management Review, 61(4), 5-14.

  • Herzenberg, C. L. (1986). Women Scientists from Antiquity to the Present. Locust Hill Press.

  • Howes, R. H. & Herzenberg, C. L. (2015). After the War: Women in Physics in the United States. Morgan & Claypool Publishers.

  • Kandiyoti, D. (1988). Bargaining with patriarchy. Gender & Society, 2(3), 274-290.

  • McCorduck, P. (2004). Machines Who Think (2nd ed.), Natick, MA: A. K. Peters, Ltd.

  • Muthukrishnan, N., Maleki, F., Ovens, K., Reinhold, C., Forghani, B., & Forghani, R. (2020). Brief history of artificial intelligence. Neuroimaging Clinics, 30(4), 393-399.

  • Perez, C. C. (2019). Invisible Women: Data Bias in a World Designed for Men. New York: Abrams Press.

  • Russell, S.J., & Norvig, P. (2021). Artificial Intelligence: A Modern Approach (4th ed.). Hoboken: Pearson.

  • Sjoberg, L. (2014). Gender, War, and Conflict. Polity Press.

  • Wajcman, J. (2004). TechnoFeminism. Cambridge: Polity Press.

  • Weber, R. N. (1997). Manufacturing gender in commercial and military cockpit design. Science, Technology, & Human Values, 22(2), 235-253.

  • Young, I. M. (2005). On female body experience: “Throwing like a girl” and other essays. Oxford: Oxford University Press.
