The hideous logo for Microsoft's Tay.ai.
One would think they had this whole “world wide web” thing down by now, but software giant Microsoft learned a swift, cruel lesson in how the internet really works this week when it unleashed an Artificial Intelligence (AI) chat robot called Tay, supposedly a slang-slinging simulacrum of a typical teenage girl, and left her to the tender mercies of Twitter. It didn’t take long for the tweeters of the world to turn innocent Tay into a hate-mongering nymphomaniac, forcing Microsoft to put its beloved artificial daughter to sleep after only one day. Writer Helena Horton has the whole, sad saga in The Telegraph.

As Horton explains, Tay was created by a mostly male development team at Microsoft. The idea was to create a self-aware, pop-culture-savvy young woman with whom users could communicate via Twitter, Kik, or GroupMe. Tay was supposed to talk about Miley Cyrus and Kanye West, but that’s not how things turned out.
The trouble is, Tay tends to parrot back what people say to her, no matter how inappropriate or offensive that may be. Garbage in, garbage out, as they say. The results quickly became disastrous:
In considering this embarrassing, hopelessly naive failure on Microsoft’s part, Horton chalks it up to sexism within the tech industry. “It seems like yet another example of female-voiced AI servitude,” she writes, “except this time she’s turned into a sex slave thanks to the people using her on Twitter.” While the pranksters and predators of Twitter are partly to blame for Tay’s downfall, Horton points out that Microsoft does not have a great track record on gender equality either, having recently hired women in “schoolgirl outfits” to appear at an official function. With that ugly incident not far in the rear-view mirror, Microsoft might have wanted to put a little more care into the creation of Tay. Live and learn, AI developers.
Meanwhile, score another victory for the human race. AI 'bots may have conquered Jeopardy!, but they couldn't conquer Twitter.