
The first look a Twitter user gets at a tweet might be an unintentionally racially biased one.

Twitter said Sunday that it would investigate whether the neural network that selects which part of an image to show in a photo preview favors showing the faces of white people over Black people.

The trouble started over the weekend when Twitter users posted several examples of how, in an image featuring a photo of a Black person and a photo of a white person, Twitter's preview of the photo in the timeline more frequently displayed the white person.

The public tests got Twitter's attention - and now the company is apparently taking action.

"Our team did test for bias before shipping the model and did not find evidence of racial or gender bias in our testing," Liz Kelly, a member of the Twitter communications team, told Mashable. "But it’s clear from these examples that we’ve got more analysis to do. We're looking into this and will continue to share what we learn and what actions we take."

Twitter's Chief Design Officer Dantley Davis and Chief Technology Officer Parag Agrawal also chimed in on Twitter, saying they're "investigating" the neural network.

The conversation started when one Twitter user posted about racial bias in Zoom's facial detection. He noticed that Twitter's preview of the side-by-side image of him (a white man) and his Black colleague repeatedly showed his face.

After multiple users got in on the testing, one user even showed that the preference for lighter faces extended to cartoon characters from The Simpsons.

Twitter's promise to investigate is encouraging, but Twitter users should take the crowd-sourced analyses with a grain of salt. It's problematic to claim bias from a handful of examples. To really assess bias, researchers need a large sample with many examples gathered under a variety of circumstances.

Anything less amounts to claiming bias by anecdote – something conservatives do to allege anti-conservative bias on social media. These arguments can be harmful because people can usually find one or two examples of just about anything to prove a point, which undermines the authority of genuinely rigorous analysis.
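To make the contrast with anecdote concrete, here is a minimal sketch of how a larger-scale check might be structured: run the cropper over many paired-face images and test the outcome against a 50/50 null hypothesis. The helper crop_prefers_lighter_face is hypothetical, a stand-in for whatever cropping model is under test; nothing here is Twitter's actual evaluation code.

```python
# Sketch of a sample-based bias check, as opposed to a handful of screenshots.
# ASSUMPTION: crop_prefers_lighter_face() is a hypothetical stand-in for the
# cropping model under test; it is not Twitter's API or internal tooling.
from scipy.stats import binomtest

def crop_prefers_lighter_face(image_pair) -> bool:
    """Hypothetical oracle: True if the preview crop centers on the lighter face."""
    raise NotImplementedError("replace with the model under test")

def assess_crop_bias(image_pairs, alpha=0.01):
    """Run many paired trials and test the results against a 50/50 null hypothesis."""
    hits = sum(crop_prefers_lighter_face(pair) for pair in image_pairs)
    n = len(image_pairs)
    # One-sided binomial test: is the lighter face chosen more often than chance?
    result = binomtest(hits, n, p=0.5, alternative="greater")
    return hits, n, result.pvalue, result.pvalue < alpha
```

With hundreds of varied image pairs, a test like this can distinguish a systematic skew from a run of unlucky examples; with two or three screenshots, it cannot.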

That doesn't mean the previews question is not worth looking into, as this could be an example of algorithmic bias: when automated systems reflect the biases of their human makers, or make decisions that have biased implications.

SEE ALSO: People are fighting algorithms for a more just and equitable future. You can, too.

In 2018, Twitter published a blog post explaining how it used a neural network to decide which part of a photo to show in previews. One of the factors that leads the system to select a region of an image is higher contrast. This could account for why the system appears to favor white faces. The decision to use contrast as a determining factor might not be intentionally racist, but more frequently displaying white faces than Black ones is a biased result.
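Twitter's real system is a trained saliency network, not a hand-written rule, but a toy sketch can make the contrast intuition concrete: if the preview window is placed wherever local contrast peaks, regions whose brightness varies most against their surroundings could win by construction. Everything below is invented for illustration, including the toy_contrast_crop function and its stride.

```python
import numpy as np

def toy_contrast_crop(gray_image: np.ndarray, crop_h: int, crop_w: int) -> np.ndarray:
    """Pick the crop window with the highest local pixel standard deviation.

    This is NOT Twitter's model; it only illustrates how a contrast-driven
    saliency signal, scanned across an image, decides which region survives
    into the preview.
    """
    best_score, best_xy = -1.0, (0, 0)
    h, w = gray_image.shape
    for y in range(0, h - crop_h + 1, 8):      # coarse stride keeps the toy fast
        for x in range(0, w - crop_w + 1, 8):
            window = gray_image[y:y + crop_h, x:x + crop_w]
            score = float(window.std())        # local contrast proxy
            if score > best_score:
                best_score, best_xy = score, (y, x)
    y, x = best_xy
    return gray_image[y:y + crop_h, x:x + crop_w]
```

A heuristic like this has no notion of race, yet if one face in a composite image consistently produces higher local contrast than the other, the same face will consistently be the one shown, which is the kind of biased outcome the anecdotes pointed to.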

There's still a question of whether these anecdotal examples reflect a systemic problem. But responding to Twitter sleuths with gratitude and action is a good place to start no matter what.

Related Video: Why you should always question algorithms

Topics: Artificial Intelligence, Twitter
