A.I, WRITE ME A STORY ABOUT A GAY COUPLE
I love a story. Who doesn’t? Before sex and chocolate, finding joy in storytelling is, without a doubt, the best part of the human experience. Stories, in all their forms, give us goosebumps. They bring us to tears of laughter and of sadness. A good story helps us feel a little less alone. This is why I was crushed a few months ago at a middle school promotion ceremony.
Those things are boring. I don’t recommend attending one unless you absolutely have to. But at what should’ve been the halftime show, when the class valedictorian gave her speech, my attention was mildly stirred, I’ll admit. A young scholar, tomorrow’s next star, addressing a crowd of bored parents and pubescents with a speech, a story, a dream. For the first few minutes, I was impressed! She actually made me chuckle. Might I say, I was captivated by this girl’s speech. Then she admitted an awful truth… she had used A.I software to write it.
It’s all the rage these days. Artificial intelligence. Technology slowly killing off what remains of the human imagination. First copywriting, then Hollywood, and now valedictorian speeches. Are we doomed? I don’t know; my nihilistic self would lean closer to a yes, but I honestly don’t know. Maybe A.I will be a new tool to unlock aspects of human creativity never thought possible. Artists like Sougwen Chung have begun to liken A.I to a paintbrush in “human-machine collaboration.” That’s all jolly good, but A.I is subject to its coding, and by extension, its coders.
Author Safiya Umoja Noble writes about online systems’ entanglement with real-world oppression in her book “Algorithms of Oppression: How Search Engines Reinforce Racism.” In it, Noble details how, despite what we would like to think, the algorithms and big data running our A.I are anything but neutral. “The people who make these decisions hold all types of values, many of which openly promote racism, sexism, and false notions of meritocracy, which is well documented in studies of Silicon Valley and other tech corridors,” she writes.
Human-machine errors of prejudice have long been documented by A.I scholars. Noble recounts some famous cases of coding gone wrong, such as Google’s facial recognition tagging Black folks as “apes” and “animals.” And in 2011, Google’s algorithms returned pornographic search results when prompted with “black girls…”. Clearly, A.I is fallible, and sometimes really bigoted.
I wanted to test this. If A.I is encoded with the prejudice and stereotypes of real-world systems, I wanted to see how A.I’s algorithms see me as a gay man. What conceptions about gay love was an A.I story generator hiding? What emotions would it stir in me? I went to Toolsaday, an A.I story-generating website, and prompted it to “write me a story about a gay couple.”

It was a pretty bad story. It wrote about “Alex and Ryan” being in love on a “perfect evening,” planning a romantic weekend “trip to the countryside” at a “lovely little bed and breakfast.” After the gay cliches, the story got weirdly violent when “a passerby yelled a homophobic slur at them, shattering the peaceful moment.” The story ends with the couple angrily heading home, holding each other, “feeling a mix of sadness and frustration at the world’s prejudices.” Woah.
As a gay man, this is a situation that often plays out in my head when I am in public holding my partner’s hand. I think most queer couples carry this worry, if they haven’t already been subjected to this exact situation. The fact that A.I picked up on this worry, and found it story-worthy when prompted simply to write about a gay couple, is telling of something deeper. It means A.I recognizes that gay couples get harassed and are frustrated with “the world’s prejudices.”
Maybe A.I is starting to learn that those prejudices exist and are fundamental to our identities. This is pivotal, and I am not saying my Toolsaday experiment is a breakthrough in the development of online algorithms. But it does open us up to the idea that, with proper instruction and an education in the existence of oppression, A.I might help us pinpoint which prejudices and experiences are essential to our very real human stories.
