The Military-AI Complex: Who is Building These Systems and Why

By Kirthi Jayakumar

Rapid advances in AI have enabled the creation of the military-AI complex, which includes within its fold defence contractors, technology companies, and government actors. The military-AI complex shapes the development and deployment both of AI that enhances and augments particular kinds of weapons and of autonomous weapons systems. It encompasses far more than technological advancement: it is also a mindset that determines whose security counts, what security itself means, how power and control operate, and whose lives matter more than whose. Very often, these decisions carry gendered and racial costs (Wilcox, 2015; Sjoberg, 2014).

Built on foundations laid by capitalism, the military-AI complex grows steadily on research, innovation, and development spearheaded by corporate establishments and buttressed by profit-enabling policies and laws adopted by governments (Cowen & Siciliano, 2011). The militarized masculinities inherent in these spaces suffuse their operations with deeply embedded power dynamics and deep structures that come to the surface when viewed through a feminist lens (Enloe, 2016; Higate & Hopton, 2005).

The Anatomy of the Military-AI Complex

The foundation of the military-AI complex comprises established defence contractors who have dominated military procurement for decades. Companies like Lockheed Martin, Raytheon, Northrop Grumman, and Boeing have invested heavily in AI capabilities, leveraging their existing relationships with the Pentagon and international defence markets (Gilli & Gilli, 2019). SIPRI, in its 2023 report on military expenditure, indicated that these companies continue to rank among the world's largest arms manufacturers, with AI integration becoming a key differentiator in their product offerings (SIPRI, 2023).

These traditional defence contractors operate within a well-established ecosystem of revolving doors between corporate and government leadership, creating what Cynthia Enloe described as "masculinized networks of power" (Enloe, 2000). The defence industry's history as a male-dominated space means that decisions about military AI development are predominantly made by men who have been socialized within militaristic institutional cultures (Cockburn, 1985; Basham, 2016). Very often, these men are elites, enjoying racial, gender, heteronormative, class, caste, and other privileges (Crenshaw, 1991; Chowdhury, 2020).

The entry of technology giants into military contracting shifted the paradigm significantly, breaking Silicon Valley's historical distance from military applications, as the Transnational Institute noted in 2021 (Transnational Institute, 2021). Companies like Amazon, Microsoft, Palantir, and others have come to rely on defence contracts as significant revenue streams (Shane & Metz, 2018; Crawford, 2021). This integration builds on what Shoshana Zuboff termed "surveillance capitalism": business models built entirely on extracting and analysing human data (Zuboff, 2019). These models enable unprecedented levels of social control, surveillance, and violence, often affecting the most vulnerable more severely (Eubanks, 2018; Noble, 2018).

Another feature of the military-AI complex is the accelerated adoption of commercial AI technologies by militaries. For example, the Pentagon established the Defense Innovation Unit in Silicon Valley and the Joint Artificial Intelligence Center to carry out research on AI for military use (Allen & Chan, 2017). The Pentagon has also partnered with Google, Anthropic, OpenAI, and xAI to build AI workflows for key national security missions (Kania, 2019; Horowitz, 2018). France has partnered with Thales' AI lab, which Thales describes as the most powerful integrated laboratory for critical AI in Europe, and with the CEA, to focus on generative AI use cases with intelligence and command applications; Thales is also working to integrate AI into the TALIOS laser designation pod for the French Air and Space Force. The UK has collaborated with QinetiQ to integrate AI into its military operations and enhance the efficiency of ground and aerial vehicles in various combat scenarios, and Israel collaborates with Elbit Systems to develop and deploy AI in its military applications (Boulanin & Verbruggen, 2017).

Militarization refers to the gradual spread of military values, priorities, and ways of organizing social life into civilian spheres (Enloe, 2016; Lutz, 2002). When commercial tech companies integrate military contracts into their business models, they normalize military applications of technologies that were originally developed for civilian purposes (Brunton & Nissenbaum, 2015; Benjamin, 2019).

Accountability Gaps and Democratic Deficits

The military-AI complex, much like the military-industrial complex, has been structured in a way that normalizes accountability gaps with disproportionate impacts on historically marginalized communities (Crenshaw, 1991; Garcia, 2017). Those with positional and relational power make the key decisions that shape the trajectory of these technologies, whereas those most likely to be adversely affected are seldom heard or accounted for in those decision-making processes (Young, 2011; Fraser, 2009). These gaps reflect broader patterns of power that put profit above people: costs are externalized onto affected communities while benefits are gatekept for already privileged groups (Eubanks, 2018; Benjamin, 2019).

The absence of governance mechanisms for military AI development enables these accountability gaps. Currently, the arms export governance regime focuses on hardware transfers as opposed to software and data, which has created a loophole that allows AI technologies to be exported without adequate oversight (Boulanin & Verbruggen, 2017; Horowitz et al., 2019). Existing corporate transparency requirements do not cover military contracting – making it challenging to monitor corporate involvement in military AI projects (Biddle et al., 2020).

Existing international humanitarian law and international criminal law obligations were designed primarily to bind states, leaving private military and security contractors weakly accountable under these regimes (Singer, 2003; Leander, 2005). As a result, such contractors have little incentive to align with accountability standards (Avant, 2005; Krahmann, 2010).

The military-AI complex represents a concentration of power that serves existing hierarchies while marginalizing alternative approaches to security and technology governance (Haraway, 1991; Wajcman, 2004). When viewed through a feminist lens, we can see how the military-AI complex perpetuates gendered inequalities through its decision-making structures, its impacts on affected communities, and its reinforcement of militarized approaches to security (Sjoberg & Via, 2010; Cohn, 1987).

Moving forward requires corporate transparency and accountability, as well as fundamental changes to how decisions about military technology development are made (Winner, 1980; Feenberg, 1999). This could look like ensuring the mandatory participation of affected communities in technology assessment processes, worker representation on corporate boards making decisions about military contracts, and international governance frameworks that prioritize human security over technological dominance (Reardon, 2018; True, 2015).

The stakes of these debates extend far beyond individual corporate decisions. As AI technologies become increasingly powerful and autonomous, the question of who controls their military applications and in whose interests will shape global patterns of violence and security for decades to come (Scharre, 2018; Suchman, 2020). A sustained commitment to feminist principles is essential to ensure that these decisions serve human flourishing rather than corporate profits (D'Ignazio & Klein, 2020; Noble, 2018).  

References

  • Allen, G., & Chan, T. (2017). Artificial intelligence and national security. Cambridge, MA: Belfer Center for Science and International Affairs, Harvard Kennedy School.

  • Avant, D. D. (2005). The market for force: The consequences of privatizing security. Cambridge: Cambridge University Press.

  • Basham, V. M. (2016). Gender, race, militarism and remembrance: The everyday geopolitics of the poppy. Gender, Place & Culture, 23(6), 883-896.

  • Benjamin, R. (2019). Race after technology: Abolitionist tools for the new Jim Code. Cambridge: Polity Press.

  • Biddle, S., Breland, A., & Devereaux, R. (2020). Police surveilled George Floyd protests with help from Twitter-affiliated startup Dataminr. The Intercept, July 9.

  • Boulanin, V., & Verbruggen, M. (2017). Mapping the development of autonomy in weapon systems. Stockholm: Stockholm International Peace Research Institute (SIPRI).

  • Brunton, F., & Nissenbaum, H. (2015). Obfuscation: A user's guide for privacy and protest. Cambridge, MA: MIT Press.

  • Chowdhury, R. (2020). Radical AI ethics for computing. AI and Ethics, 1, 131-137.

  • Cockburn, C. (1985). Machinery of dominance: Women, men and technical know-how. London: Pluto Press.

  • Cohn, C. (1987). Sex and death in the rational world of defense intellectuals. Signs: Journal of Women in Culture and Society, 12(4), 687-718.

  • Cowen, D., & Siciliano, A. (2011). Surplus masculinities and security. Antipode, 43(5), 1516-1541.

  • Crawford, K. (2021). Atlas of AI: Power, politics, and the planetary costs of artificial intelligence. New Haven, CT: Yale University Press.

  • Crenshaw, K. (1991). Mapping the margins: Intersectionality, identity politics, and violence against women of color. Stanford Law Review, 43(6), 1241-1299.

  • D'Ignazio, C., & Klein, L. F. (2020). Data feminism. Cambridge, MA: MIT Press.

  • Enloe, C. (2000). Maneuvers: The international politics of militarizing women's lives. Berkeley: University of California Press.

  • Enloe, C. (2016). Globalization and militarism: Feminists make the link (2nd ed.). Lanham, MD: Rowman & Littlefield.

  • Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. New York: St. Martin's Press.

  • Feenberg, A. (1999). Questioning technology. London: Routledge.

  • Fraser, N. (2009). Scales of justice: Reimagining political space in a globalizing world. European Journal of Political Theory, 8(3), 355-377.

  • Garcia, D. (2017). Lethal artificial intelligence and change: The future of international peace and security. International Studies Review, 20(2), 334-341.

  • Gilli, A., & Gilli, M. (2019). Why China has not caught up yet: Military-technological superiority and the limits of imitation, reverse engineering, and cyber espionage. International Security, 43(3), 141-189.

  • Haraway, D. (1991). A cyborg manifesto: Science, technology, and socialist-feminism in the late twentieth century. In Simians, cyborgs and women: The reinvention of nature (pp. 149-181). New York: Routledge.

  • Higate, P., & Hopton, J. (2005). War, militarism, and masculinities. In M. S. Kimmel, J. Hearn, & R. W. Connell (Eds.), Handbook of studies on men and masculinities (pp. 432-447). Thousand Oaks, CA: SAGE Publications.

  • Horowitz, M. C. (2018). Artificial intelligence, international competition, and the balance of power. Texas National Security Review, 1(3), 37-57.

  • Horowitz, M. C., Allen, G. C., Kania, E. B., & Scharre, P. (2019). Strategic competition in an era of artificial intelligence. Washington, DC: Center for a New American Security.

  • Kania, E. B. (2019). Chinese military innovation in artificial intelligence. RUSI Journal, 164(5-6), 26-34.

  • Krahmann, E. (2010). States, citizens and the privatisation of security. Cambridge: Cambridge University Press.

  • Leander, A. (2005). The market for force and public security: The destabilizing consequences of private military companies. Journal of Peace Research, 42(5), 605-622.

  • Lutz, C. (2002). Making war at home in the United States: Militarization and the current crisis. American Anthropologist, 104(3), 723-735.

  • Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York: NYU Press.

  • Reardon, B. A. (2018). Sexism and the war system (Revised edition). Syracuse, NY: Syracuse University Press.

  • Scharre, P. (2018). Army of none: Autonomous weapons and the future of war. New York: W. W. Norton & Company.

  • Shane, S., & Metz, C. (2018). Inside a Pentagon project that has technology and ethical concerns. The New York Times, November 13.

  • Singer, P. W. (2003). Corporate warriors: The rise of the privatized military industry. Ithaca, NY: Cornell University Press.

  • SIPRI. (2023). SIPRI Military Expenditure Database. Stockholm: Stockholm International Peace Research Institute.

  • Sjoberg, L. (2014). Gender, war, and conflict. Cambridge: Polity Press.

  • Sjoberg, L., & Via, S. (Eds.). (2010). Gender, war, and militarism: Feminist perspectives. Santa Barbara, CA: Praeger.

  • Suchman, L. (2020). Algorithmic warfare and the reinvention of accuracy. Critical Studies on Security, 8(2), 175-187.

  • Transnational Institute. (2021). The business of building the walls: The militarisation and privatisation of borders. Amsterdam: Transnational Institute.

  • True, J. (2015). A tale of two feminisms in international relations? Feminist political economy and the Women, Peace and Security agenda. Politics & Gender, 11(2), 419-424.

  • Wajcman, J. (2004). TechnoFeminism. Cambridge: Polity Press.

  • Wilcox, L. (2015). Bodies of violence: Theorizing embodied subjects in international relations. Oxford: Oxford University Press.

  • Winner, L. (1980). Do artifacts have politics? Daedalus, 109(1), 121-136.

  • Young, I. M. (2011). Responsibility for justice. Oxford: Oxford University Press.

  • Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. New York: PublicAffairs.

 
