Microsoft apologises for offensive outburst by 'chatbot'

Microsoft has apologised for racist and sexist messages generated by its Twitter chatbot. The bot, called Tay, was launched as an experiment to learn more about how artificial intelligence programs can engage with internet users in casual conversation. The program had been designed to mimic the conversational style of a teenage girl, but it quickly learned to repeat the offensive language that Twitter users began feeding it. Microsoft was forced to take Tay offline just a day after it launched. In a blog post, the company said it takes "full responsibility for not seeing this possibility ahead of time."
