HITL Reading Group: Meeting #1
In our first meeting, a group of participants from different walks of life reflected on two articles: the first from Code Pink, titled "The War Intervention: AI, Data Centers, and the Environment", and the second from The Guardian, titled "'In the end, you feel blank': India's female workers watching hours of abusive content to train AI."
Reflecting on the first article, participants highlighted concerns around the privatisation of AI and its implications for data storage and privacy. Many acknowledged that we are past the point of consenting to the use of AI: its deployment in our circles is happening to us, whether or not we opt in. This raises questions about who handles the data, which authorities are being held accountable (if anyone is being held accountable at all), and how accountability is being enforced. Ordinary people are not involved at any level of decision-making around AI, whether in its design, development, deployment, or use. Participants recognised wisdom in being wary about the data one shares, and some saw value in starving models of data and preventing the harvesting of personal information.
The group noted that the costs of data privacy breaches and environmental sustainability issues are borne disproportionately by some communities. Pushing back and resisting is also a privilege that not everyone can exercise. Reflecting on this, participants acknowledged a certain pain in exploring what we can afford to do differently in the world as it is: we depend on being connected in particular ways, yet must tread a rather fragile balance when putting ourselves out there, given that jobs, incomes, and survival are at stake.
Reflecting on the use of AI for military purposes, the group noted that they were not surprised by the connection, given technology's military origins. This strengthened some participants' resolve to use local models, while others affirmed that there is no inevitable "need" to use AI at all. Acknowledging its wide proliferation, however, the group reflected on the wisdom of regulating the entire AI lifecycle, from before a system even exists through to its every use and iteration. They also made the case for rethinking regulation itself: the breakneck speed at which AI evolves means that regulation drafted at one point in time could well be dead on arrival by the time it is rolled out. One participant made the case for regulating AI as if it were a weapon: paying attention to the scale of harm and violence it can cause can go a long way in keeping it in check.
The second article prompted the group to think about the gender, race, caste, and colour of the labour that makes AI possible. The article observed that a notion of "respectability" is often attached to jobs like data labelling and coding. Content moderation requires people to view vast amounts of disturbing, harmful, and violent content, and most people chosen for this work are women from marginalised castes and Indigenous communities. Companies recruiting for these roles typically target those most vulnerable to exploitation, holding out the promise of respectability and job security. For many who take up these opportunities, the appeal lies in that promised security and access to employment in a social ethos where their mobility is otherwise restricted.
However, these jobs come at a heavy cost: mental health challenges, exploitation, the stigmatisation that can follow from disclosing the true nature of the work, and a lack of job security. They also become a new form of control and exploitation: because women's mobility in public spaces is restricted, they are disproportionately likely to be channelled into these roles. This happens even in workplaces that pride themselves on being DEI-equipped and safe, but whose safety is being prioritised when the nature of the work exposes women from marginalised backgrounds to such harmful, disturbing, and violent content?