Taylor Swift Reportedly Threatened To Sue Microsoft Over AI Chatbot

Marketers might remember Xiaoice, the chatbot Microsoft developed for the Chinese market and had high hopes for when it brought the concept to the U.S.

Microsoft designed it as a conversational, voice-responsive bot -- one that could have conversations on social media. The company planned to call the U.S. version Tay.

As history would have it, Microsoft's bot didn't last long in the United States -- at least not under the name Tay.ai.

In his book Tools and Weapons, Microsoft President Brad Smith reveals why. The book was co-authored by Carol Ann Browne and is scheduled for release Tuesday.

It turned out that singer Taylor Swift claimed trademark rights to the name Tay. “I was on vacation when I made the mistake of looking at my phone during dinner,” Smith wrote in Tools and Weapons, according to The Guardian. “An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: ‘We represent Taylor Swift, on whose behalf this is directed to you.’”

When the chatbot began tweeting racist comments on Twitter, Swift had more reason for concern. Microsoft removed the bot. When it reappeared in another form, it was renamed Zo, and it could no longer converse about topics such as politics, race, and religion.

The book discusses a range of topics from technology to privacy to cyberattacks, as well as how to balance promise with risk. It also delves into “the promise and the peril of the digital age,” examining issues such as social media, facial recognition, and “critiquing the ‘move fast and break things’ mentality that has hurt so many people in recent years.”
