{"id":2154,"date":"2021-12-01T04:50:00","date_gmt":"2021-12-01T03:50:00","guid":{"rendered":"https:\/\/monblogeur.tech\/index.php\/2021\/12\/01\/the-facebook-papers-explained-the-washington-post\/"},"modified":"2021-12-01T04:50:00","modified_gmt":"2021-12-01T03:50:00","slug":"the-facebook-papers-explained-the-washington-post","status":"publish","type":"post","link":"https:\/\/monblogeur.tech\/index.php\/2021\/12\/01\/the-facebook-papers-explained-the-washington-post\/","title":{"rendered":"The Facebook Papers, explained &#8211; The Washington Post"},"content":{"rendered":"<div class=\"cfbc967f0983488262956e73eca9483a\" data-index=\"1\" style=\"float: none; margin:10px 0 10px 0; text-align:center;\">\n<script async src=\"https:\/\/pagead2.googlesyndication.com\/pagead\/js\/adsbygoogle.js?client=ca-pub-3859091246952232\" crossorigin=\"anonymous\"><\/script>\r\n<!-- blok -->\r\n<ins class=\"adsbygoogle\" data-ad-client=\"ca-pub-3859091246952232\" data-ad-slot=\"1334354390\"><\/ins>\r\n<script>\r\n     (adsbygoogle = window.adsbygoogle || []).push({});\r\n<\/script>\r\n\n<\/div>\n<p>A personal decision by Facebook CEO Mark Zuckerberg <a href=\"https:\/\/washingtonpost.com\/technology\/2021\/10\/25\/mark-zuckerberg-facebook-whistleblower\/?itid=lk_inline_manual_1\" target=\"_blank\" rel=\"noopener\">leads to a crackdown<\/a> on dissent in Vietnam. Measures to suppress hateful, <a href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/10\/22\/jan-6-capitol-riot-facebook\/?itid=lk_inline_manual_1\" target=\"_blank\" rel=\"noopener\">deceptive content are lifted<\/a> after the American presidential election in 2020, as pro-Trump groups disputing the legitimacy of the election experience \u201cmeteoric\u201d growth. A dummy test account on Facebook in India <a href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/10\/24\/india-facebook-misinformation-hate-speech\/?itid=lk_inline_manual_1\" target=\"_blank\" rel=\"noopener\">is flooded with violent<\/a> anti-Muslim propaganda \u2014 which remains visible for weeks on the real account of a frightened Muslim<b> <\/b>college student in northern India.<br \/>A trove of internal Facebook documents reveals that the social media giant has privately and meticulously tracked real-world harms exacerbated by its platforms, ignored warnings from its employees about the risks of their design decisions and exposed vulnerable communities around the world to a cocktail of dangerous content.<br \/>Disclosed to the U.S. Securities and Exchange Commission by <a href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/10\/11\/facebook-whistleblower-frances-haugen\/?itid=lk_inline_manual_4\" target=\"_blank\" rel=\"noopener\">whistleblower Frances Haugen<\/a>, the Facebook Papers were provided to Congress in redacted form by Haugen\u2019s legal counsel. The redacted versions were reviewed by a consortium of news organizations, including The Washington Post, which obtained additional internal documents and conducted interviews with dozens of current and former Facebook employees.<br \/>A mix of presentations, research studies, discussion threads and strategy memos, the Facebook Papers provide an unprecedented view into how executives at the social media giant weigh trade-offs between public safety and their own bottom line. 
Some of the documents were first reported by the Wall Street Journal.

Here are key takeaways from The Post’s investigation:

Haugen references Zuckerberg’s public statements at least 20 times in her SEC complaints, asserting that the CEO’s unique degree of control over Facebook forces him to bear ultimate responsibility for a litany of societal harms caused by the company’s relentless pursuit of growth.

The documents also show that Zuckerberg’s <a href="https://washingtonpost.com/technology/2021/10/25/mark-zuckerberg-facebook-whistleblower/">public statements</a> are often at odds with internal company findings.

For example, Zuckerberg testified last year before Congress that the company removes 94 percent of the hate speech it finds before a human reports it. But in internal documents, researchers estimated that the company was removing less than 5 percent of all hate speech on Facebook.

Facebook spokeswoman Dani Lever denied that Zuckerberg “makes decisions that cause harm” and dismissed the findings, saying they are “based on selected documents that are mischaracterized and devoid of any context.”

<i><a href="https://washingtonpost.com/technology/2021/10/25/mark-zuckerberg-facebook-whistleblower/">The case against Mark Zuckerberg: How Facebook’s CEO chose growth and free speech over safety</a></i>

During the run-up to the 2020 U.S. presidential election, the social media giant dialed up efforts to police content that promoted violence, misinformation and hate speech. But after Nov. 6, Facebook rolled back many of the dozens of measures aimed at safeguarding U.S. users. A ban on the main Stop the Steal group didn’t apply to the dozens of look-alike groups that popped up in what the company later concluded was a “coordinated” campaign, documents show.

By the time Facebook tried to reimpose its “break the glass” measures, it was too late: A pro-Trump mob was storming the U.S. Capitol.

Facebook officials said they planned exhaustively for the election and its aftermath, anticipated the potential for post-election violence, and always expected the challenges to last through the inauguration of President Biden on Jan. 20.

<i><a href="https://www.washingtonpost.com/technology/2021/10/22/jan-6-capitol-riot-facebook/">Inside Facebook, Jan. 6 violence fueled anger, regret over missed warning signs</a></i>

For all Facebook’s troubles in North America, its problems with hate speech and misinformation are dramatically worse in the developing world.
Documents show that Facebook has <a href="https://www.washingtonpost.com/technology/2021/10/24/india-facebook-misinformation-hate-speech/">meticulously studied</a> its approach abroad, and is well aware that weaker moderation in non-English-speaking countries leaves the platform vulnerable to abuse by bad actors and authoritarian regimes.

According to one 2020 summary, the vast majority of its efforts against misinformation — 84 percent — went toward the United States, the documents show, with just 16 percent going to the “Rest of World,” including India, France and Italy.

Though Facebook considers India a top priority, activating large teams to engage with civil society groups and protect elections, the documents show that Indian users experience Facebook without critical guardrails common in English-speaking countries.

Facebook’s Lever said the company has made “progress,” with “global teams with native speakers reviewing content in over 70 languages along with experts in humanitarian and human rights issues.”

“We’ve hired more people with language, country and topic expertise,” Lever said, adding that Facebook has “also increased the number of team members with work experience in Myanmar and Ethiopia to include former humanitarian aid workers, crisis responders and policy specialists.”

<i><a href="https://www.washingtonpost.com/technology/2021/10/24/india-facebook-misinformation-hate-speech/">How Facebook neglected the rest of the world, fueling hate speech and violence in India</a></i>

Zuckerberg has said the company does not design its products to persuade people to spend more time on them. But dozens of documents suggest the opposite.

The company exhaustively studies potential policy changes for their effects on user engagement and other factors key to corporate profits. Amid this push for user attention, Facebook abandoned or delayed initiatives to reduce misinformation and radicalization.

One 2019 report tracking a dummy account set up to represent a conservative mother in North Carolina found that Facebook’s recommendation algorithms led her to QAnon, an extremist ideology that the FBI has deemed a domestic terrorism threat, in just five days. Still, Facebook allowed QAnon to operate on its site largely unchecked for another 13 months.

“We have no commercial or moral incentive to do anything other than give the maximum number of people as much of a positive experience as possible,” Facebook’s Lever said, adding that the company is “constantly making difficult decisions.”

Starting in 2017, Facebook’s algorithm <a href="https://www.washingtonpost.com/technology/2021/10/26/Facebook-angry-emoji-algorithm">gave emoji reactions</a> like “angry” five times the weight of “likes,” boosting those posts in users’ feeds.
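At its core, that weighting is a scoring rule over reaction counts. The following toy sketch is purely illustrative: the function name, reaction labels and everything except the reported five-to-one ratio are hypothetical, and it is not Facebook’s actual ranking code, which is far more complex and not public.

```python
# Illustrative sketch only: reaction-weighted scoring as described above.
# Assumption: reaction emoji count five times as much as a plain "like".
# All names and values here are hypothetical, not Facebook's real system.

WEIGHTS = {
    "like": 1.0,
    "angry": 5.0,  # the weight the article says Facebook later set to zero
    "wow": 5.0,
    "haha": 5.0,
}

def engagement_score(reactions: dict) -> float:
    """Sum weighted reaction counts for a post; higher scores rank higher in the feed."""
    return sum(WEIGHTS.get(kind, 0.0) * count for kind, count in reactions.items())

# Under this weighting, 100 "angry" reactions outweigh 400 "likes":
print(engagement_score({"angry": 100}))  # 500.0
print(engagement_score({"like": 400}))   # 400.0
```

In a scheme like this, zeroing out the “angry” entry removes that boost without touching the rest of the formula, which is consistent with the change the article describes below.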
The theory was simple: Posts that prompted lots of reaction emoji tended to keep users more engaged, and keeping users engaged was the key to Facebook’s business.

The company’s data scientists eventually confirmed that the “angry” reaction, along with “wow” and “haha,” occurred more frequently on “toxic” content and misinformation.

Last year, when Facebook finally set the weight on the angry reaction to zero, users began to get less misinformation, less “disturbing” content and less “graphic violence,” company data scientists found. Lever said that the company continues to work on its understanding of negative experiences in order to reduce their spread.

<i><a href="https://www.washingtonpost.com/technology/2021/10/26/Facebook-angry-emoji-algorithm">Five points for anger, one for a ‘like’: How Facebook’s formula fostered rage and misinformation</a></i>

<i>Elizabeth Dwoskin, Shibani Mahtani, Cat Zakrzewski, Craig Timberg, Will Oremus and Jeremy Merrill contributed to this report.</i>

Correction: A previous version of this article incorrectly described the content of congressional testimony by Facebook’s CEO, Mark Zuckerberg. Zuckerberg testified that the company removes 94 percent of the hate speech it finds before a human reports it, not just that it removes 94 percent of the hate speech it finds. The article has been corrected.

The <a href="https://www.washingtonpost.com/technology/2021/10/25/what-are-the-facebook-papers/"><b>Facebook Papers</b></a> are a set of internal documents that were provided to Congress in redacted form by Frances Haugen’s legal counsel. The redacted versions were reviewed by a consortium of news organizations, including The Washington Post.

The trove of documents shows how <a href="https://www.washingtonpost.com/technology/2021/10/25/mark-zuckerberg-facebook-whistleblower/"><b>Facebook CEO Mark Zuckerberg</b></a> has, at times, contradicted, downplayed or failed to disclose company findings on the impact of its products and platforms.

The documents also provide new details of the social media platform’s role in fomenting the <a href="https://www.washingtonpost.com/technology/2021/10/22/jan-6-capitol-riot-facebook/"><b>storming of the U.S. Capitol</b></a>.
Facebook engineers gave <a href="https://www.washingtonpost.com/technology/2021/10/26/facebook-angry-emoji-algorithm/"><b>extra value to emoji reactions, including ‘angry,’</b></a> pushing more emotional and provocative content into users’ news feeds.

<b>Read more from The Post’s investigation:</b>

<a href="https://www.washingtonpost.com/technology/2021/10/25/what-are-the-facebook-papers/">Key takeaways from the Facebook Papers</a>

<a href="https://www.washingtonpost.com/technology/2021/10/24/india-facebook-misinformation-hate-speech/">How Facebook neglected the rest of the world, fueling hate speech and violence in India</a>

<a href="https://www.washingtonpost.com/technology/interactive/2021/how-facebook-algorithm-works/">How Facebook shapes your feed</a>

<a href="https://www.washingtonpost.com/technology/2021/10/25/what-are-the-facebook-papers/">source</a>