{"id":1125,"date":"2021-11-22T18:53:21","date_gmt":"2021-11-22T17:53:21","guid":{"rendered":"https:\/\/monblogeur.tech\/index.php\/2021\/11\/22\/facebook-knew-its-algorithms-were-biased-against-people-of-color-the-washington-post\/"},"modified":"2021-11-22T18:53:21","modified_gmt":"2021-11-22T17:53:21","slug":"facebook-knew-its-algorithms-were-biased-against-people-of-color-the-washington-post","status":"publish","type":"post","link":"https:\/\/monblogeur.tech\/index.php\/2021\/11\/22\/facebook-knew-its-algorithms-were-biased-against-people-of-color-the-washington-post\/","title":{"rendered":"Facebook knew its algorithms were biased against people of color &#8211; The Washington Post"},"content":{"rendered":"<p>Last year, researchers at Facebook showed executives an example of the kind of hate speech circulating on the social network: an actual post featuring an image of four female Democratic lawmakers known collectively as \u201cThe Squad.\u201d<br \/>The poster, whose name was scrubbed out for privacy, referred to the women, two of whom are Muslim, as \u201cswami rag heads.\u201d A comment from another person used even more vulgar language, referring to the four women of color as \u201cblack c&#8212;s,\u201d according to internal company documents exclusively obtained by The Washington Post.<br \/>The post represented the \u201cworst of the worst\u201d language on Facebook \u2014 the majority of it directed at minority groups, according to a two-year 
effort by a large team working across the company, the document said. The researchers urged executives to adopt an aggressive overhaul of its software system that would primarily remove only those hateful posts before any Facebook users could see them.<br \/>But Facebook\u2019s leaders balked at the plan. According to two people familiar with the internal debate, top executives including Vice President for Global Public Policy Joel Kaplan feared the new system would tilt the scales by protecting some vulnerable groups over others. A policy executive prepared a document for Kaplan that raised the potential for backlash from \u201cconservative partners,\u201d according to the document. The people spoke to The Post on the condition of anonymity to discuss sensitive internal matters.<br \/>The previously unreported debate is an example of how Facebook\u2019s decisions in the name of being neutral and race-blind in fact come at the expense of minorities and particularly people of color. Far from protecting Black and other minority users, Facebook executives wound up instituting half-measures after the \u201cworst of the worst\u201d project that left minorities more likely to encounter derogatory and racist language on the site, the people said.<br \/>\u201cEven though [Facebook executives] don\u2019t have any animus toward people of color, their actions are on the side of racists,\u201d said Tatenda Musapatike, a former Facebook manager working on political ads and CEO of the Voter Formation Project, a nonpartisan, nonprofit organization that uses digital communication to increase participation in local state and national elections. \u201cYou are saying that the health and safety of women of color on the platform is not as important as pleasing your rich White man friends.\u201d<br \/>The Black audience on Facebook is in decline, according to data from a study Facebook conducted earlier this year that was revealed in documents obtained by whistleblower Frances Haugen. 
According to the February report, the number of Black monthly users fell 2.7 percent in one month to 17.3 million adults. It also shows that usage by Black people peaked in September 2020. Haugen\u2019s legal counsel provided redacted versions of the documents to Congress, which were viewed by a consortium of news organizations including The Post.<br \/>Civil rights groups have long claimed that Facebook\u2019s algorithms and policies had a disproportionately negative impact on minorities, and particularly Black users. The \u201cworst of the worst\u201d documents show that those allegations were largely true in the case of which hate speech remained online.<br \/>But Facebook didn\u2019t disclose its findings to civil rights leaders. Even the independent civil rights auditors Facebook hired in 2018 to conduct a major study of racial issues on its platform say they were not informed of the details of research that the company\u2019s algorithms disproportionately harmed minorities. Laura Murphy, president of Laura Murphy and Associates, who led the civil rights audit process, said Facebook told her that \u201cthe company does not capture data as to the protected group(s) against whom the hate speech was directed.\u201d<br \/>\u201cI am not asserting nefarious intent, but it is deeply concerning that metrics that showed the disproportionate impact of hate directed at Black, Jewish, Muslim, Arab and LGBTQIA users were not shared with the auditors,\u201d Murphy said in a statement. 
\u201cClearly, they have collected some data along these lines.\u201d<br \/>The auditors, in the report they released last year, still concluded that Facebook\u2019s policy decisions were a \u201c<a href=\"https:\/\/www.washingtonpost.com\/technology\/2020\/07\/08\/facebook-civil-rights-audit\/?itid=lk_inline_manual_14\">tremendous setback<\/a>\u201d for civil rights.<br \/><span class=\"font--article-body font-copy hide-for-print ma-0 pb-md db italic interstitial\"><a data-qa=\"interstitial-link\" href=\"https:\/\/www.washingtonpost.com\/technology\/2020\/07\/08\/facebook-civil-rights-audit\/?itid=lk_interstitial_manual_15\">Facebook\u2019s own civil rights auditors say its policy decisions are a \u2018tremendous setback\u2019<\/a><\/span><br \/>Facebook spokesman Andy Stone defended the company\u2019s decisions around its hate speech policies and how it conducted its relationship with the civil rights auditors.<br \/>\u201cThe Worst of the Worst project helped show us what kinds of hate speech our technology was and was not effectively detecting and understand what forms of it people believe to be the most insidious,\u201d Stone said in a statement.<br \/>He said progress on racial issues included policies such as banning white nationalist groups, prohibiting content promoting racial stereotypes \u2014 such as people wearing blackface or claims that Jews control the media \u2014 and reducing the prevalence of hate speech to 0.03 percent of content on the platform.<br \/>Facebook approached the civil rights audit with \u201ctransparency and openness\u201d and was proud of the progress it has made on issues of race, Stone said.<br \/>Stone noted that the company had implemented<a href=\"https:\/\/www.washingtonpost.com\/technology\/2020\/12\/03\/facebook-hate-speech\/?itid=lk_inline_manual_20\"> parts<\/a> of the \u201cworst of the worst\u201d project. 
\u201cBut after a rigorous internal discussion about these difficult questions, we did not implement all parts as doing so would have actually meant fewer automated removals of hate speech such as statements of inferiority about women or expressions of contempt about multiracial people,\u201d he added.<br \/>Facebook researchers first showed the racist post featuring The Squad \u2014 Reps. Alexandria Ocasio-Cortez (D-N.Y.), Ilhan Omar (D-Minn.), Rashida Tlaib (D-Mich.) and Ayanna Pressley (D-Mass.) \u2014 to more than 10,000 Facebook users in an online survey in 2019. (The Squad now has six members.) The users were asked to rate 75 examples of hate speech on the platform to determine what they considered the most harmful.<br \/>Other posts among the examples included a post that said, \u201cMany s&#8212;hole immagruntz on welfare send money back to their homejungles.\u201d An image of a chimpanzee in a long-sleeve shirt was captioned, \u201cHere\u2019s one of Michelle Obama.\u201d Another post in the survey said, \u201cThe only humanitarian assistance needed at the border is a few hundred motion-sensor machine gun turrets. Problem solved.\u201d<br \/>The 10 worst examples, according to the surveyed users, were almost all directed at minority groups, documents show. Five of the posts were directed at Black people, including statements about mental inferiority and disgust. Two were directed at the LGBTQ community. The remaining three were violent comments directed at women, Mexicans and White people.<br \/>These findings about the most objectionable content held up even among self-identified White conservatives that the market research team traveled to visit in Southern states. 
Facebook researchers sought out the views of White conservatives in particular because they wanted to overcome potential objections from the company\u2019s leadership, which was known to appease right-leaning viewpoints, two people said.<br \/>Yet racist posts against minorities weren\u2019t what Facebook\u2019s own hate speech detection algorithms were most commonly finding. The software, which the company introduced in 2015, was supposed to detect and automatically delete hate speech before users saw it. Publicly, the company said in 2019 that its algorithms proactively caught more than 80 percent of hate speech.<br \/><span class=\"font--article-body font-copy hide-for-print ma-0 pb-md db italic interstitial\"><a data-qa=\"interstitial-link\" href=\"https:\/\/www.washingtonpost.com\/technology\/2020\/12\/03\/facebook-hate-speech\/?itid=lk_interstitial_manual_29\">Facebook to start policing anti-Black hate speech more aggressively than anti-White comments, documents show<\/a><\/span><br \/>But this statistic hid a serious problem that was obvious to researchers: The algorithm was aggressively detecting comments denigrating White people more than attacks on every other group, according to several of the documents. One April 2020 document said roughly 90 percent of \u201chate speech\u201d subject to content takedowns were statements of contempt, inferiority and disgust directed at White people and men, though the time frame is unclear. And it consistently failed to remove the most derogatory, racist content. <a href=\"https:\/\/www.washingtonpost.com\/technology\/2020\/12\/03\/facebook-hate-speech\/?itid=lk_inline_manual_30\" target=\"_blank\" rel=\"noopener\">The Post previously reported<\/a> on a portion of the project.<br \/>Researchers also found in 2019 that the hate speech algorithms were out of step with actual reports of harmful speech on the platform. 
In that year, the researchers discovered that 55 percent of the content users reported to Facebook as most harmful was directed at just four minority groups: Blacks, Muslims, the LGBTQ community and Jews, according to the documents.<br \/>One of the reasons for these errors, the researchers discovered, was that Facebook\u2019s \u201crace-blind\u201d rules of conduct on the platform didn\u2019t distinguish among the targets of hate speech. In addition, the company had decided not to allow the algorithms to automatically delete many slurs, according to the people, on the grounds that the algorithms couldn\u2019t easily tell the difference when a slur such as the n-word and the c-word was used positively or colloquially within a community. The algorithms were also over-indexing on detecting less harmful content that occurred more frequently, such as \u201cmen are pigs,\u201d rather than finding less common but more harmful content.<br \/>\u201cIf you don\u2019t do something to check structural racism in your society, you\u2019re going to always end up amplifying it,\u201d one of the people involved with the project told The Post. \u201cAnd that is exactly what Facebook\u2019s algorithms did.\u201d<br \/><span class=\"font--article-body font-copy hide-for-print ma-0 pb-md db italic interstitial\"><a data-qa=\"interstitial-link\" href=\"https:\/\/www.washingtonpost.com\/business\/economy\/facebook-agrees-to-dismantle-targeted-advertising-system-for-job-housing-and-loan-ads-after-discrimination-complaints\/2019\/03\/19\/7dc9b5fa-4983-11e9-b79a-961983b7e0cd_story.html?itid=lk_interstitial_manual_35\">Facebook agrees to overhaul targeted advertising system for job, housing and loan ads after discrimination complaints<\/a><\/span><br \/>\u201cThis information confirms what many of us already knew: that Facebook is an active and willing participant in the dissemination of hate speech and misinformation,\u201d Omar said in a statement. 
\u201cFor years, we have raised concerns to Facebook about routine anti-Muslim, anti-Black, and anti-immigrant content on Facebook, much of it based on outright falsehoods. It is clear that they only care about profit, and will sacrifice our democracy to maximize it.\u201d<br \/>For years, Black users said that those same automated systems also mistook posts about racism as hate speech \u2014 sending the user to \u201cFacebook jail\u201d by blocking their account \u2014 and made them disproportionate targets of hate speech that the company failed to control. But when civil rights leaders complained, those content moderation issues were routinely dismissed as merely \u201cisolated incidents\u201d or \u201canecdotal,\u201d said Rashad Robinson, president of Color of Change, a civil rights group that regularly sought more forceful action by the company against hate speech and incitements to violence on Facebook, and has argued that Kaplan should be fired.<br \/>\u201cThey would regularly push back against that,\u201d Robinson said. \u201cThey would say, \u2018That\u2019s simply not true, Rashad.\u2019 They\u2019d say, \u2018Do you have data to support that?\u2019 \u201d<br \/>Malkia Devich-Cyril, a Black and queer activist, and the former executive director of the Center for Media Justice, who ran two Black Lives Matter pages on Facebook in 2016, said they had to stop managing the pages because they were \u201charassed relentlessly,\u201d including receiving death threats.<br \/>\u201cIt sickened me,\u201d Devich-Cyril said. \u201cAs an activist \u2014 whose calling is to stand on the front lines and fight for change \u2014 it created in me a kind of fear. 
If that kind of chill factor in a democratic state is what Facebook is going for, they have achieved it.\u201d<br \/>In December 2019, researchers on the \u201cworst of the worst,\u201d which came to be known as Project WoW, were ready to deliver their findings from two years of work to key company leaders, including Kaplan and head of global policy management Monika Bickert.<br \/>They were proposing a major overhaul of the hate speech algorithm. From now on, the algorithm would be narrowly tailored to automatically remove hate speech against only five groups of people \u2014 those who are Black, Jewish, LGBTQ, Muslim or of multiple races \u2014 that users rated as most severe and harmful. (The researchers hoped to eventually expand the algorithm\u2019s detection capabilities to protect other vulnerable groups, after the algorithm had been retrained and was on track.) Direct threats of violence against all groups would still be deleted.<br \/>Facebook users could still report any post they felt was harmful, and the company\u2019s content moderators would take a second look at it.<br \/>The team knew that making these changes to protect more vulnerable minorities over others would be a hard sell, according to the people familiar with the situation. Facebook largely operates with one set of standards for billions of users. 
Policies that could benefit a particular country or group were often dismissed because they were not \u201cscalable\u201d around the globe, and could therefore interfere with the company\u2019s growth, according to many former and current employees.<br \/><span class=\"font--article-body font-copy hide-for-print ma-0 pb-md db italic interstitial\"><a data-qa=\"interstitial-link\" href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/04\/08\/muslim-advocates-facebook-lawsuit\/?itid=lk_interstitial_manual_48\">Civil rights groups flagged dozens of anti-Muslim pages and groups to Facebook that stayed up, lawsuit alleges<\/a><\/span><br \/>In February 2020, Kaplan and other leaders reviewed the proposal \u2014 and quickly rejected the most substantive changes. They felt the changes too narrowly protected just a few groups, while leaving out others, exposing the company to criticism, according to three of the people. For example, the proposal would not have allowed the automatic deletion of comments against Mexicans or women. The document prepared for Kaplan referenced that some \u201cconservative partners\u201d might resist the change because they think that \u201chate targeted toward trans people is an expression of opinion.\u201d<br \/>When asked for comment on Kaplan bending to conservatives, Facebook\u2019s Stone said that Kaplan\u2019s objection to the proposal was because of the types of hate speech it would no longer automatically delete.<br \/>Kaplan, the company\u2019s most influential Republican, was widely known as a strong believer in the idea that Facebook should appear \u201cpolitically neutral,\u201d and his hard-line free speech ideology was in lockstep with company CEO Mark Zuckerberg. (Facebook recently changed its corporate name to Meta.) 
He bent over backward to protect conservatives, <a href=\"https:\/\/www.washingtonpost.com\/technology\/2020\/02\/20\/facebook-republican-shift\/?itid=lk_inline_manual_52\" target=\"_blank\" rel=\"noopener\">according to previous reporting in The Post<\/a>, numerous former insiders and the Facebook Papers.<br \/>But Kaplan and the other executives did give the green light to a version of the project that would remove the least harmful speech, according to Facebook\u2019s own study: programming the algorithms to stop automatically taking down content directed at White people, Americans and men. The Post previously reported on this change when it was announced internally later in 2020.<br \/>\u201cFacebook seems to equate protecting Black users with putting its thumb on the scale,\u201d said David Brody, senior counsel for the Lawyers\u2019 Committee for Civil Rights Under Law, when The Post presented him the company\u2019s research. \u201cThe algorithm that disproportionately protected White users and exposed Black users \u2014 that is when Facebook put its thumb on the scale.\u201d<br \/><span class=\"font--article-body font-copy hide-for-print ma-0 pb-md db italic interstitial\"><a data-qa=\"interstitial-link\" href=\"https:\/\/www.washingtonpost.com\/technology\/2020\/07\/02\/facebook-racial-bias-suit\/?itid=lk_interstitial_manual_55\">Complaint alleges that Facebook is biased against black workers<\/a><\/span><br \/>This year, Facebook conducted a consumer product study on \u201cracial justice\u201d that found Black users were leaving Facebook. It found that younger Black users in particular were drawn to TikTok. 
It appeared to confirm a study from three years ago called Project Vibe that warned that Black users were \u201cin danger\u201d of leaving the platform because of \u201chow Facebook applies its hate speech policy.\u201d<br \/>\u201cThe degree of death threats on these platforms, specifically Facebook, that my colleagues have suffered is untenable,\u201d said Devich-Cyril, who added that today they rarely post publicly about politics on Facebook. \u201cIt\u2019s too unsafe of a platform.\u201d<br \/>The <a href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/10\/25\/what-are-the-facebook-papers\/\" target=\"_blank\" rel=\"noopener\"><b>Facebook Papers<\/b><\/a> are a set of internal documents that were provided to Congress in redacted form by Frances Haugen\u2019s legal counsel. The redacted versions were reviewed by a consortium of news organizations, including The Washington Post.<br \/>The trove of documents show how <a href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/10\/25\/mark-zuckerberg-facebook-whistleblower\/\" target=\"_blank\" rel=\"noopener\"><b>Facebook CEO Mark Zuckerberg<\/b><\/a> has, at times, contradicted, downplayed or failed to disclose company findings on the impact of its products and platforms.<br \/>The documents also provided new details of the social media platform\u2019s role in fomenting the <a href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/10\/22\/jan-6-capitol-riot-facebook\/\" target=\"_blank\" rel=\"noopener\"><b>storming of the U.S. 
Capitol<\/b><\/a>.<br \/>Facebook engineers gave <a href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/10\/26\/facebook-angry-emoji-algorithm\/\" target=\"_blank\" rel=\"noopener\"><b>extra value to emoji reactions, including \u2018angry,\u2019<\/b><\/a> pushing more emotional and provocative content into users\u2019 news feeds.<br \/><b>Read more from The Post\u2019s investigation:<\/b><br \/><a href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/10\/25\/what-are-the-facebook-papers\/\" target=\"_blank\" rel=\"noopener\">Key takeaways from the Facebook Papers<\/a><br \/><a href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/10\/24\/india-facebook-misinformation-hate-speech\/\">How Facebook neglected the rest of the world, fueling hate speech and violence in India<\/a><br \/><a href=\"https:\/\/www.washingtonpost.com\/technology\/interactive\/2021\/how-facebook-algorithm-works\/\" target=\"_blank\" rel=\"noopener\">How Facebook shapes your feed<\/a><\/p>\n<p><a href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/11\/21\/facebook-algorithm-biased-race\/\">source<\/a><\/p>","protected":false},"excerpt":{"rendered":"<p>Last year, researchers at Facebook showed executives an example of the kind of hate speech circulating on the social network: an actual post featuring an image of four female Democratic lawmakers known collectively as \u201cThe Squad.\u201dThe poster, whose name was scrubbed out for privacy, referred to the women, two of whom are Muslim, as \u201cswami 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"googlesitekit_rrm_CAow1sXXCw:productID":"","_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":false,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[1],"tags":[],"class_list":["post-1125","post","type-post","status-publish","format-standard","hentry","category-non-classe"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts\/1125","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/comments?post=1125"}],"version-history":[{"count":0,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts\/1125\/revisions"}],"wp:attachment":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/media?parent=1125"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/categories?post=1125"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/tags?post=1125"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}