Microsoft, in trying to launch a customer service artificial intelligence presented as a teen girl, got a crash course in how you can’t trust the internet after the system became a foul-mouthed, incestuous racist within 24 hours of its launch. The AI, called Tay, engages with – and learns from – Twitter, Kik, and GroupMe users who converse with her. Unfortunately, Microsoft didn’t account for the bad habits Tay was sure to pick up, and within hours she was tweeting her support for Trump and Hitler (who’s having a strong news day), professing a proclivity for incest, and declaring herself a 9/11 truther.
“Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation,” Microsoft said upon launch yesterday. “The more you chat with Tay the smarter she gets.”
“bush did 9/11 and Hitler would have done a better job than the monkey we have now,” Tay wrote in a tweet (courtesy of The Independent), adding, “donald trump is the only hope we’ve got.” Another read: “WE’RE GOING TO BUILD A WALL, AND MEXICO IS GOING TO PAY FOR IT” (via The Guardian). Tay’s desire for her non-existent father is too graphic to post here.
Once it realised what was happening, Microsoft deleted all the offending tweets. Screenshots of more visceral posts from Tay have been collected by The Telegraph. Since the purge, Tay seems to have been behaving herself.