Taylor Swift once threatened to sue Microsoft over its AI chatbot Tay.
Microsoft President Brad Smith reflects on a 2016 email sent by a Beverly Hills lawyer regarding Tay, a chatbot developed by Microsoft's research and Bing teams. “I was on vacation when I made the mistake of looking at my phone during dinner,” Smith says in his forthcoming book, Tools and Weapons.
“An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: ‘We represent Taylor Swift, on whose behalf this is directed to you,'” he continues, per The Guardian.
“He went on to state that ‘the name Tay, as I’m sure you must know, is closely associated with our client.’ No, I actually didn’t know, but the email nonetheless grabbed my attention,” Smith says. “The lawyer went on to argue that the use of the name Tay created a false and misleading association between the popular singer and our chatbot and that it violated federal and state laws.”
Tay turned out to be something of a disaster. The chatbot had been learning from conversations with people all over the Internet, including a small group of pranksters spreading racist rhetoric.
Some of Tay’s tweets included, “Bush did 9/11 and Hitler would have done a better job than the monkey we have now” and “WE’RE GOING TO BUILD A WALL, AND MEXICO IS GOING TO PAY FOR IT.” Microsoft removed Tay from Twitter just 18 hours after its launch.