Microsoft’s chatbot Tay utters racist, sexist, homophobic slurs

In an attempt to connect with younger customers, Microsoft launched an AI-powered chatbot called “Tay.ai” on Twitter last spring. “Tay,” modeled on a teenage girl, morphed into, well, a “Hitler-loving, feminist-bashing troll” within just a day of her online debut. Microsoft pulled Tay off the platform and announced that it planned to make “adjustments” to its algorithm.
