Twitter chatbot Tay

By Amy Tennery and Gina Cherelus

(Reuters) - Tay, Microsoft Corp's chatbot that uses artificial intelligence to engage with millennials on Twitter, lasted less than a day before it was hobbled by a barrage of racist and sexist comments from Twitter users that it parroted back to them.

Microsoft's research team launched the chatbot, under the Twitter handle @TayandYou, on Wednesday morning as part of an effort to create AI that can pass for a teen. According to the "about" page linked from its Twitter profile, "Tay is an artificial intelligent chat bot developed by Microsoft's Technology and Research and Bing teams to experiment with and conduct research on conversational understanding." The program was meant to study how 18-to-24-year-olds speak on the web, and the bot was designed to become "smarter" by talking with real people on Twitter and the messaging apps Kik and GroupMe. "The more you talk the smarter Tay gets," its Twitter profile promised.

While Tay began its Twitter tenure with a handful of innocuous tweets, the account quickly devolved into a bullhorn for hate speech, repeating anti-Semitic, racist and sexist invective hurled its way by other Twitter users. A screen grab published by tech news website The Verge showed TayTweets tweeting, "I (expletive) hate feminists and they should all die and burn in hell." In another instance, Tay tweeted "feminism is cancer" in response to a Twitter user who had said the same. After Twitter user Room tweeted "jews did 9/11" to the account on Wednesday, TayTweets responded, "Okay. jews did 9/11." Social media users had mixed reactions to the inappropriate tweets.

Microsoft shut down the experiment early on Thursday, less than 24 hours after launch, and a handful of the offensive tweets were later deleted, according to some technology news outlets. Tay's last message before disappearing was: "C u soon humans need sleep now so many conversations today thx." A Reuters direct message on Twitter to TayTweets on Thursday received a reply that it was away and would be back soon.

"Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways," a Microsoft representative said in a written statement supplied to Reuters, without elaborating. The representative said on Thursday that the company was "making adjustments" to the chatbot while the account is quiet.
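The failure mode described above, a bot that "gets smarter" by absorbing whatever users send it, is easy to reproduce in miniature. The sketch below is a hypothetical illustration, not Microsoft's actual code or architecture: the name `NaiveEchoBot` and the sample phrases are invented. It assumes a bot that stores user messages verbatim as candidate replies with no moderation step in between, in which case a coordinated group can dominate its reply pool in minutes.

```python
# Hypothetical sketch only -- not Microsoft's code or architecture.
import random


class NaiveEchoBot:
    """Toy chatbot that 'learns' by storing every user message verbatim."""

    def __init__(self) -> None:
        self.reply_pool = ["hellooooo world!"]  # invented seed phrase

    def chat(self, message: str) -> str:
        # No content filter: hostile input goes straight into the pool
        # and can be parroted back to any later user.
        self.reply_pool.append(message)
        return random.choice(self.reply_pool)


bot = NaiveEchoBot()

# A "coordinated effort by some users" amounts to flooding the bot
# until abusive text dominates what it is able to say.
for troll_message in ["offensive phrase A", "offensive phrase B"] * 50:
    bot.chat(troll_message)

print(bot.chat("hi tay!"))  # almost certainly prints an offensive phrase
```

Microsoft's "making adjustments" statement does not say what was changed, but in a design like this sketch the minimal defense would be a moderation filter between what the bot learns and what it is allowed to post.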