Feminist Approaches to Tech: Interview with Eleonora Sironi

Eleonora Sironi holds a Bachelor of Arts in International Studies from the University of Milan and a Master of Arts in International Affairs from the Hertie School, specialising in Security. Through roles at the feminist tech start-up VIOLA, the Global Solutions Initiative, and MSF, their work focuses on gender equity, human rights, and sustainable development. Fluent in Italian, English, German, and Spanish, they have worked in project management, advocacy, feminist policy, and violence prevention.

Can you start by telling me a little bit about yourself, the work you do, and how you got to this point in your journey?

I am a researcher in gender, technology, and digitalization. I interrogate power structures and how they are reproduced, and sometimes produced, by technologies and data practices. I got here through personal experience first: I entered this field through activism, advocating for feminist approaches to public safety and for the right to reclaim the streets and feel safe in public spaces. Online activism soon paved the way for my work, and I started leaning into the tech start-up field. This was also an opportunity to think more deeply about how I relate to technology and digital media, because I realized I had held certain assumptions about the nature of technology itself. I used to think it was too technical and not political, and therefore not my thing; we use these technologies every day, yet it didn’t feel like an area of concern for me. That changed, of course. With time, I began to study the multiple intersections between technology, feminism, and decolonization, and to explore what feminist approaches to such platforms may mean, and what community-based applications and feminist approaches to AI can look like in practice.

If social justice is deeply connected to technology, why is the exclusion of social justice principles so normalized in tech spaces?

This is one of the key questions that everyone in the field of social justice and technology tries to answer. I work in Europe, where less than 5% of all venture capital goes to female founders and majority- or all-female teams. For other marginalized communities and identities, it is even more of a challenge, especially when it comes to their seat at the decision-making table. All of this is rendered even more complex in profit-driven fields, and technology is one. The distance between tech practitioners and social justice really comes down to the nature of this work and industry, which reproduces a series of biases. These biases are structured within the profit-driven ethos of the industry, which uses the language of “data for good” and “green data.” There are few accountability structures to uphold basic human rights and embed basic principles of non-discrimination. As practitioners and researchers in the field, it is important to look at the systemic underpinnings to ensure that representation is real. It is not enough to just put a woman on your team to fix all your biases.

We really need a more serious, regular, rigorous, and intersectional approach that looks at all the various types of discrimination there are. It is also important to advocate for more holistic approaches to education on these topics, because much of the current thinking about how these technologies are developed is purely technical and does not account for historical oppressions. We need multidisciplinary collaboration, and it is important to engage with different stakeholders throughout the process. For instance, within the AI4D program in Africa, we gender equality researchers work very closely with AI researchers to develop innovations that are all locally owned. It is really a project of knowledge exchange and support, and it covers a variety of areas like agriculture, sexual health, gender equality, and disability inclusion. Knowledge exchange and communication together constitute the first step to mainstreaming the idea that multiple aspects of technology impact our lives in differentiated ways. I think it is something we learn with time, but once we do learn it, we must mainstream this idea and recognize the impact technology has on all spheres of life.

The benevolent philanthropy argument is consistently used within the industry to market these technologies, and it fits very well with the digital colonialism discourse, unfortunately. I think this will become more and more present with all the governmental lobbying by tech corporations, which have been rebranding very extractivist practices through the terminology of growth and advancement and the very strategic use of the word “intelligence.” We see them marketing the economic framework of technology through the lens of development, and now also as something for the good of all humanity. They claim to be doing the right thing and that this will bring progress for everyone, but it is not really like that currently, is it? The negative consequences are not redistributed equally; they fall heavily on the majority world, while the positive consequences are locked up in the global north. The data is extracted from the global majority, and these systems are built entirely on the backs of low-wage labour from and in the global majority. This has huge repercussions on the mental and physical health of the people being exploited in data-labeling jobs. We need to embed social justice principles upfront in our development, deployment, use of, and engagement with technology.

A lot of conversations in the policy space talk about data being the new oil. This view has shaped how geopolitics is performed and engaged with today. What stands out to you about this right now?

I see how it is the new oil in the sense that it is a very profitable business model, given both the extractivism and the profit motive. Something we are perhaps not fully equipped to address is the unprecedented concentration of power. Governments and tech corporations own a vast amount of data on users and citizens, like nothing we have seen before. This is not just a bunch of statistics about citizens; it is much more than that, and it enables the massive role of big tech giants in this field by mainstreaming the lack of transparency and accountability as core features of how our data is managed.

A few private companies, which we know are based in the US, control so much of the digital space and infrastructure. There are no benefits for the people who helped build these infrastructures, especially those who come from the majority world. Before we can really address this in a serious feminist way, with serious feminist policies, it is important to underline how these technologies are being developed and to ask who is developing them, in whose interests, who will be protected, and who will be harmed. We see this a lot with the facial recognition systems that Joy Buolamwini and other scholars and practitioners have spoken about: these tools reproduce human biases, which become deeply embedded in the technologies.

These are all the points we need to address in doing a power analysis of these technologies, especially so that we can make feminist policies for the future. It is important to ensure that these practices are not the norm. We have seen the accumulation of power in history, but this is an unprecedented, completely new reality that we must all address. We have been conditioned to accept the accumulation of power and assume that it cannot be changed, but within feminist work, we must dismantle power hierarchies and ensure an equal distribution of power within data practices for the future. 

You have a background in GBV advocacy, which happens to be one of the areas where tech is being used to enable and facilitate harm. What are you noticing on this front?

Feminists engaged in technology studies have been trying to convey the idea that technology is not necessarily just disruptive or just positive. There is a lot of complexity around this, and all of it needs to be addressed. On the one hand, technology can of course bring greater possibilities for survivors of gender-based violence, in the form of expanded access to resources, support, and community. But this is not unconditional. We need strict privacy and security policies, and also a basis to trust that the platforms we rely on for this type of support will not use our data in ways that can harm us further. Technology can be of use for services that support survivors of violence, especially for women facing control by their partners. Maybe they can’t move around freely and have to rely on technology. But at the same time, they must be able to trust that this type of support will not create greater dangers for them by giving out their information in ways that could harm them.

When we address gender-based violence, it is important to always do no harm. This is what we have all been doing in our work in gender-based violence advocacy and what I’ve learned working with safety apps. This is not about reinventing the wheel, but rather about providing more structural support to many people, especially women, who rely on such services. In our use of technology, we can reproduce the violence we already see in our societies. This is the root issue, and we need to address it. Slow regulation plays a role too: when regulation lags, the risks connected to these technologies rise, because there is nothing to control them. There is a price for the lack of regulation, and it can look like invasive data-sharing requests and invasive data-processing practices, which are so embedded in our normal usage of technologies. Our data is part of us. It's not something outside of us; it's literally us. And in the case of technology-facilitated gender-based violence, think about deepfakes: it's literally our image. All these aspects are connected, and platforms should be held accountable when they don't build consent into the design.

If technology were owned, deployed, or developed by communities rather than by capitalists, so many of the crises we find ourselves in the middle of would not happen. Maybe you agree, maybe you disagree. What's coming up?

I love this idea. I think it's very interesting to think about, because much of the strength of the tech industry and this type of so-called development lies in the idea that this rhythm of advancement is inevitable. We see these companies presenting their products as unstoppable, as things that need to be developed and used right now, and rejecting regulation based on a sense of urgency. The type of imagination you sowed with this prompt is one of the seeds of hope that drove me to this work. Things don't need to be the way they are. Alternative ways to imagine technology and imagine reality are possible!

Although we need to constantly address risks, the possibilities that come with innovation are not the enemy. I don't know exactly how this would look in practice, but I would love to see it work. I strongly advocate for platforms that work for the protection of communities and that are owned by communities. I would love to see personal data not be sold for profit. One way to get there is to center perspectives that are not Western-led. I think this would be a way to feel safer in our digital bodies, instead of fearing exploitation by platforms that treat us as data objects. I'd love to see more of this in practice!

What might it mean for us to engage with technology through a feminist lens?

Let's make space for alternatives! I think alternative feminist imaginations of technologies are possible, and highlighting collective engagement and organization is strategic and important. For example, Microsoft cut off the Israeli military's access to a vast database for a technology that allowed Israel to access Palestinian civilians’ phone calls and to collect, process, and store that data. This happened after mobilization by tech workers and many others. This is not the solution to all our problems, but it is important to highlight these victories and show how users and tech workers have the power to change the structures and create alternative futures, grounded in feminist principles.

Tech practices can produce discrimination in their own right, but we also know that they reproduce human dynamics and human discriminatory behaviors. And just like discriminatory human behaviors, they can be challenged. We should aim to address the root issues and ensure that we engage meaningfully with all affected groups throughout the technology lifecycle. This is crucial if we want to create an ethical benchmark that is truly grounded in the principles of transparency, accountability, and human rights. We do have an alternative for our futures with technology. Grounding ourselves in feminist principles like intersectionality is key. If we want our work to be intentional and meaningful, intersectional power analysis must be fundamental to our work, our practices, and the tools we use and deploy.
