Im your sex slave chat bot
By the time she started saying "Hitler was right I hate the jews," people had started to realize that there was something wrong with Tay.
We saw it too in the rise of Taylorism, an early 20th century scientific-management philosophy whose obsession with efficiency made living robots out of workers.
It's all part of a cultural climate where pilots call the feminine voice of their automated cockpit warnings "Bitching Betty," and addressing sexualized queries to Siri or Microsoft's Cortana is practically a way of life for some.
It all makes Tay's brief life, and eventual fate, more comprehensible.
Tay was nothing approaching a true artificial intelligence; she was just a sophisticated Twitter chatbot with good branding and a capacity to learn.
But that branding, which positioned her as an "artificial intelligence," was enough to make Tay susceptible to our cultural narrative about the thinking machine.
A screenshot in the article displays "another exotic move of Ms. Dewey, leaning onto the screen towards you, letting you look down her slinky low cut v-neck black dress." This blurb, meanwhile, dubs her the "saucy search engine librarian" and acknowledges "although nothing she says deserves more than a PG rating, this is definitely a site aimed at grownups (and, let's be honest, male grownups)." This locker room chatter as part of an ostensible technology review only serves to highlight both the sexist attitudes that still pervade the wider tech industry, and the fantasy of the sexy, sexual servant that many corporations are now feeding.
Even more insidiously, these users manipulated Tay to harass their human targets; technologist Randi Harper, for instance, found Tay AI tweeting abusive language at her that was being fed to the chatbot by someone she'd long ago blocked. The treatment of Tay AI and so many other feminine bots and virtual assistants shows us how men would want to behave toward service professionals in general, and women in particular, if there were no consequences for their actions. It seems that our culture is unable to grapple with the concept of sapient computers without fear of our own destruction.
The reason, I'd contend, lies in the word itself, the seed of guilt which manifests in all these "robots will kill us all" stories. The iOS "personal assistant" Siri, Microsoft's Cortana, Amazon's Alexa, and the voice of your GPS (a subject of so many nagging wife/girlfriend jokes) all seem to follow in a grand tradition of fem-bots: robots with distinctly feminine features who reflect back to us various notions of idealized womanhood, whether in chrome, hard light, or synthetic skin.
We are being primed by many tech giants to see AI not as a future lifeform, but as an endlessly compliant and pliable, often female, form of free labor, available for sex and for guilt-free use and abuse.
An instrument of men's desires, in other words, shaped by the yearning of capital: a reflection of how women are allowed to be treated, and of the desires that shape that treatment.
("FUCK MY ROBOT PUSSY DADDY I'M SUCH A BAD NAUGHTY ROBOT" was perhaps her most widely reported quote.) Needless to say, this wasn't part of Tay's original design.

As Laurie Penny explained in a recent article, the popularity of feminine-gendered AI makes sense in a world where women still aren't seen as fully human.

R.U.R. tells what is, by now, a familiar story: Humans create robots to take over all mundane labor, which works fine until these slave automata develop sapience, at which point they revolt and destroy the human race.