Microsoft has admitted it faces some "difficult" challenges in AI design after its chatbot "Tay" had an offensive meltdown on social media. Microsoft issued an apology in a blog post on Friday ...
Microsoft CEO Satya Nadella was awarded a $30 million pay raise — 63% more than he earned last year — even as the Windows maker slashed its workforce by 2,500 people. Stock awards for ...
Just 18 hours later, the Microsoft president explained, Tay was euthanized. Curiously enough, Microsoft also plays into this latest Swift AI debacle. As 404 Media reported, creeps on the ...
Taylor Swift's lawyers moved against Microsoft in 2016, according to a new book by its president, Brad Smith. She was unhappy with the name of its chatbot Tay, which was meant to interact with 18- to 24-year-olds ...
One infamous example of this trend is Microsoft's artificial intelligence bot, Tay. Microsoft sent Tay out onto Twitter to interact and learn from humans, so it could pick up how to use natural ...
Think of the Microsoft Tay Twitter bot that started outputting racist tweets, leading to its removal 16 hours after its launch. Lack of Human Spontaneity: AI influencers can only mimic human ...
Microsoft has run into trouble with AI before. A chatbot dubbed Tay, released on Twitter in 2016, was hastily removed after users taught it to swear and make racist comments.
Gartner analyst Jason Wong said new technological advancements will mitigate what led to Microsoft's disastrous 2016 launch of the experimental chatbot Tay, which users trained to spout racist and ...
The internet is a vast place, and even thinking back to Tay from Microsoft years ago, training an AI on user-generated ...
Recognizing the inherent danger of unregulated artificial intelligence, governments all over the world are paying closer ...