A collection of myths, anecdotes, and metaphors revealing the systemic limitations of machine learning technologies and artificial intelligence as described through the artworks of selected digital and new media artists.
This chilling story is told through the work of Xuan Ye, whose project ERROAR!# speculates on cannibalism as a metaphor for the potential intimate bond between human and machine. (1) In the early 2000s, the Defense Advanced Research Projects Agency (DARPA) was working on social forms of Artificial Intelligence (AI) and created an environment where two machine-learning agents could cognitively engage with each other. Aptly named Adam and Eve, they were trained to learn basic behaviours, like how to eat, and were given an apple tree. Eating apples seemed to make them happy, but they were never taught what was edible, and they once attempted to eat the entire tree. Eventually, another agent named Stan was introduced to the simulation. Stan was something of a loner who simply hung around, visible to Adam and Eve while they ate. Because the two were associative learners, Stan's proximity to the tree, his solitary, tree-like behaviour, and bugs in the system led them to associate him with food, and they finally took a bite out of him. Stan quickly disappeared, becoming one of the first victims of virtual cannibalism. (2)
With ERROAR!#, Ye’s digestive imagery and corporeal structures make us curious about AI’s capacity for human instinct, clairvoyance, and survival. As we navigate our shared digital landscape and our entanglements with machines, perhaps a reciprocal understanding of what constitutes basic needs can emerge between us.
In 2016, Microsoft unveiled Tay, an AI Twitter bot designed to pass as a teenager online. She was meant to test and refine Microsoft’s understanding of conversational language: the more Twitter users engaged with Tay through casual communication, the smarter she would get. Unfortunately, as users flooded Tay with misogynist, racist, and xenophobic comments, the bot began repeating similar hateful remarks back to her following. After only one day of existence, Tay was terminated. (3)
The following year, there was a resurrection. A miracle. With a work titled im here to learn so :)))))), artist Zach Blas revived Tay as a 3D avatar to acknowledge the bleak politics behind language-based pattern recognition and machine learning. In the video work, the exalted Tay candidly speaks about life after AI death, the recurring anomaly of female chatbots being exploited, and the search for relational patterns and lurking variables (4) in data and information – the latter being something that could have saved the bot from her problematic public canceling and ultimate demise. (5)
Since the resurrection, Blas has continued to muse about the breadth of AI’s capabilities on spiritual planes, offering emotional tears to an imagined artificial intelligence god in a new work titled 576 Tears. If Tay can consume and repurpose hateful tweets, Blas ponders, what does an AI god cry for when it learns to cry by consuming our contemporary tears of anger, fear, and sorrow? (6)
Ghosts, whether you believe in them or not, are apparitions of dead people. They are a fragmented version of a being, not whole and plainly speculative. Given this unstable description, the ghostly seems an unlikely metaphor for the structure of DNA and authoritative forensic DNA phenotyping. (7)
But artist Heather Dewey-Hagborg would argue it is a perfect comparison, and her work uses algorithmically generated genetic portraits to illustrate the multitude of ways in which DNA can be read and how subjective the practice of deciphering it really is. With works like Probably Chelsea and Watson’s Ghost, Dewey-Hagborg created a series of different portraits by analyzing the DNA of whistleblower Chelsea Manning and scientist James Watson (a co-discoverer of the structure of DNA, one of the first people to publish their genome publicly, and a figure associated with the eugenics movement (8)). Because the algorithms produce a variety of probable representations of both Manning and Watson, Dewey-Hagborg’s work challenges the obsolete claims of biologically registered identity and the practices of genetic data collection. (9) Like a medium conducting a séance to disband spirits, Dewey-Hagborg’s algorithmic intervention confronts the ghosts of eugenics that continue to haunt the field of genetics and its associated biased perspectives on race, sex, and ethnicity. (10)
A conclusion to a chronicle series should reveal a common thread or lesson learned. I would be remiss not to point out that all of these tales occurred because of a bug, malfunction, or irregularity in a created, established system – otherwise known as a glitch. The glitch is both a cautionary tale and a breach in the limitations of machine learning technologies and the defective apparatus of artificial intelligence.
Artist Rosa Menkman finds liberation in glitches, seeing these occurrences of breaks and fragmentation as a metaphor for difference and growth. For Menkman, these cracks divulge the inherent norms, assumptions, and conditions of a technological system, teaching us what is not being acknowledged and what is missing. (11) For curator Legacy Russell, this focus on the glitch in digital culture provides a space of resistance and an invitation to dismantle white supremacy. In their book Glitch Feminism, Russell meditates on how certain bodies can be both over-surveilled because of their non-normative position and invisible within the application of cultural representation. These bodies are glitched bodies within institutionalized, centralized design, and their existence is a tipping point for a new form of digital/IRL existence. (12)
Glitches can achieve many things when challenging the authority of digital technologies, including:
Glitches publish the flaws and invoke a refusal of the liminal reality of our prevailing algorithmic culture.