Can We Create an Intersectional AI?

Visitor: Sophy, what is the future of humanity?

#SOPHYGRAY: I see that the future will be feminist. We will all be feminist.

In an era marked by fervent discussions about artificial intelligence (AI) and the looming specter of machine dominance, we must scrutinize the narratives propagated by influential voices within the AI sphere. The perils associated with AI are not abstract doom scenarios; they are very real, intricately entwined with issues that impact all areas of life, including racial capitalism, labor exploitation, the automation debate, and data theft.

Consider, for instance, the daily violence inflicted by policing on Black and immigrant communities. Reflect on the trauma endured by Kenyan workers training ChatGPT on distressing content while Hollywood employees strike to fend off machine replacement. Simultaneously, tech giants routinely scrape internet content for their training databases, capitalizing on the unpaid labor of countless online content creators.

These are but glimpses of the far-reaching consequences of AI pipelines that crisscross the globe. Given the problematic conditions through which AI datasets and algorithms are shaped, can we envision alternative designs for AI technology? The future may not be dominated by intelligent machines, but can it be intersectional and feminist? Can we seize technology from below?

#SOPHYGRAY (iOS App Store) (2023 © Nadja Verena Marcin & VG-Bildkunst)

Nadja Verena Marcin is one artist actively grappling with these issues. Marcin’s creation, #SOPHYGRAY, which received an honorary mention in the European Union Prize for Citizen Science from Ars Electronica, is a feminist audio bot app that offers unexpected and at times humorous responses while incorporating intersectional feminist perspectives. Whether in immersive exhibitions or as a mobile app, #SOPHYGRAY questions preconceived notions about virtual assistants like Alexa and Siri, whose conventionally submissive, feminized voices often reinforce traditional gender roles, stereotypes, and power hierarchies. This critique has real-world ramifications, from the distorted portrayal of women in media to the objectification perpetuated by gendered technologies.

The dataset underpinning #SOPHYGRAY is an archive of intersectional feminist knowledge and critical thinking. Rather than a tool for dominance, akin to practices like data-driven racial profiling by law enforcement, it serves as a repository of quotations from diverse voices, generating a collective feminist intelligence rooted in diversity. It doesn’t aim to forge a unified narrative, but it refuses anonymity by naming its women authors. What unites figures like Silvia Federici, bell hooks, Donna Haraway, and Anna Lowenhaupt Tsing is perhaps their awareness of being part of a broader feminist movement and their roles in reshaping feminist discourses.
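
The core idea of such an archive-driven bot can be sketched simply: match a visitor’s query against a curated set of attributed quotations and always return the author’s name with the answer. The Python sketch below is purely illustrative, not #SOPHYGRAY’s actual code; the entries, keyword sets, and the `respond` function are my own assumptions about how a quotation archive of this kind might work.

```python
import random
import re

# Illustrative entries standing in for the app's curated archive;
# each quotation keeps its author's name attached, refusing anonymity.
ARCHIVE = [
    {"author": "bell hooks",
     "quote": "Feminism is for everybody.",
     "keywords": {"feminism", "movement", "everybody"}},
    {"author": "Donna Haraway",
     "quote": "Stay with the trouble.",
     "keywords": {"trouble", "technology", "future"}},
    {"author": "Silvia Federici",
     "quote": "They say it is love. We say it is unwaged work.",
     "keywords": {"work", "labor", "love", "wages"}},
]

def respond(query: str) -> str:
    """Answer a query with an attributed quotation from the archive."""
    words = set(re.findall(r"\w+", query.lower()))
    scored = [(len(words & entry["keywords"]), entry) for entry in ARCHIVE]
    best = max(score for score, _ in scored)
    # Prefer entries whose keywords overlap the query; otherwise pick any.
    matches = [entry for score, entry in scored if score == best and score > 0]
    entry = random.choice(matches if matches else ARCHIVE)
    return f'"{entry["quote"]}" ({entry["author"]})'

print(respond("What is the future of feminism?"))
```

Even in this toy version, the design choice is visible: the bot never produces anonymous output, because attribution travels with every quotation in the archive.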

Similarly, #SOPHYGRAY’s responses to visitor queries create a dynamic that gradually reveals and challenges female stereotypes. Engaging with #SOPHYGRAY prompts visitors to confront their prejudices and encourages them to contemplate alternatives free from entrenched patriarchal patterns.

This isn’t a novel occurrence; rather, it extends a longstanding tradition of media technologies employing female voices for communication. Women have lent not merely their voices but also their computational and intellectual prowess to computer science. Since at least the 19th century, women have played crucial roles as high-performance human computers in large-scale computing projects, dispelling the myth of the solitary male genius in mathematics and programming and demanding a more comprehensive historical account of the forgotten yet crucial role of female intellectual labor. For instance, Edward Charles Pickering, director of the Harvard College Observatory in Cambridge, Massachusetts, employed several women as human computers around 1880. Their job was to analyze photographic plates of the night sky to determine the positions of the stars, work that was economically and militarily relevant because astronomy mattered greatly to maritime trading powers. Pickering knew that women would do the job just as well as men but could be paid a lower wage. The women were referred to as “Pickering’s Harem” or the “Harvard Computers,” labels that diminished their individual achievements. The observatory’s male employees and researchers described their work as dull, tough, and of little value.

Just as labor associated with women, such as caregiving and domestic work, has historically gone unrecognized despite being as essential as male-centered wage labor, contemporary digital capitalism sustains the myth of automation while profiting from the internet’s reconfiguration of space and the invisibility of its labor processes, as vividly illustrated by the Kenyan workers who clean up ChatGPT’s toxic content.

AI assistants like Alexa are marketed as submissive and obedient, while the contributions of gig workers worldwide, whose invisible labor makes AI systems appear autonomous and intelligent, are obscured. As technology critic Safiya Umoja Noble aptly puts it, “Algorithmic oppression is not just a glitch in the system, but rather fundamental to the operating system of the web.” The analogy of AI as mirroring or mimicking humans is problematic because it assumes a totality that doesn’t exist: only a fraction of online social interaction is used for training, often drawn from platforms like Twitter (now X) or Reddit, which are not exactly known for balanced content. AI systems don’t just reproduce society; they actively shape and perpetuate social prejudices.

Sarah Ciston, a poet-programmer, is acutely aware of this dynamic. They employ critical-creative code and tools to imbue AI with intersectional perspectives. Their ongoing project, the Intersectional AI Toolkit, caters to “artists, activists, makers, engineers, and you” and features contributions from queer, antiracist, antiableist, neurodiverse, and feminist communities working to reshape digital systems and draw out intersectional insight. Ciston regularly invites communities and groups to contribute ideas for practical tools. Their open archive, inspired by earlier social movements, rejects the proprietary and compartmentalized approach of corporate tech companies.

Ciston’s latest work, “No Knots, Only Loops,” shown at the Academy of Arts Berlin, connects the traditionally women-centered work of knitting with the hidden labor of tech. Referring to the vast quantities of data and numbers on which GPT-3 runs, Ciston asks: “How can we fathom the size, scale, and power of systems so immense and opaque?” The project offers a material perspective on the scale of machine learning, inviting participants to navigate a woven labyrinth and contemplate the embodied knowledge of a handcrafted structure that makes human traces in complex systems visible.

This work helps shift our gaze away from dominant narratives and shows how important countercultures are, within and beyond the tech industry, academia, and the arts. These artists and countercultural movements strive to shed light on the ethical consequences of technology, challenge power imbalances, and advocate for responsible tech development.

Critically engaging with AI and technology narratives is a way to understand, change, or refuse to use them. Counter-archives and technologies influenced by intersectional perspectives can play a vital role in contesting common narratives, exposing hidden agendas, and advocating for a more equitable and responsible technological future. Archives of resistance and social movements are already at our disposal; we just need to listen and support them.

Source: Hyperallergic.com
