
Microsoft's AI makes racist error and then publishes stories about it


Hey, at least Microsoft's news-curating artificial intelligence doesn't have an ego. That much was made clear today after the company's news app highlighted Microsoft's most recent racist failure.

The inciting incident for this entire debacle appears to be Microsoft's late May decision to fire some of the human editors and journalists responsible for MSN.com and have its AI curate and aggregate stories for the site instead. Following that move, The Guardian reported earlier today that Microsoft's AI confused two members of the pop band Little Mix, who both happen to be women of color, in a republished story originally reported by The Independent. After being called out by band member Jade Thirlwall for the screwup, the AI then published stories about its own failing.

So, to recap: Microsoft's AI made a racist error while aggregating another outlet's reporting, got called out for doing so, and then elevated the coverage of its own outing. Notably, this is after Microsoft's human employees were reportedly told to manually remove stories about the Little Mix incident from MSN.com.



Still with me?

"This shit happens to @leighannepinnock and I ALL THE TIME that it's become a running joke," Thirlwall reportedly wrote in an Instagram story, which is no longer visible on her account, about the incident. "It offends me that you couldn't differentiate the two women of colour out of four members of a group … DO BETTER!"

As of the time of this writing, a quick search on the Microsoft News app shows at least one such story remains.

[Image: A story from T-Break Tech covering the AI's failings as it appears on the Microsoft News app. Credit: screenshot / Microsoft News app]

Notably, Guardian editor Jim Waterson spotted several more examples before they were apparently pulled.

"Microsoft's artificial intelligence news app is now swamped with stories selected by the news robot about the news robot backfiring," he wrote on Twitter.

We reached out to Microsoft in an attempt to determine just what, exactly, the hell is going on over there. According to a company spokesperson, the problem is not one of AI gone wrong. No, of course not. It's not like machine learning has a long history of bias (oh, wait). Instead, the spokesperson insisted, the issue was simply that Microsoft's AI selected the wrong photo for the initial article in question.

"In testing a new feature to select an alternate image, rather than defaulting to the first photo, a different image on the page of the original article was paired with the headline of the piece," wrote the spokesperson in an email. "This made it erroneously appear as though the headline was a caption for the picture. As soon as we became aware of this issue, we immediately took action to resolve it, replaced the incorrect image and turned off this new feature."

Unfortunately, the spokesperson did not respond to our question about human Microsoft employees deleting coverage of the initial AI error from Microsoft's news platforms.

Microsoft has a troubled recent history when it comes to artificial intelligence and race. In 2016, the company released a social media chatbot dubbed Tay. In under a day, the chatbot began publishing racist statements. The company subsequently pulled Tay offline, attempted to release an updated version, and then had to pull it offline again.

As evidenced today by the ongoing debacle with its own news-curating AI, Microsoft still has some work to do — both in the artificial intelligence and not-being-racist departments.

Topics: Artificial Intelligence, Microsoft, Racial Justice
