{"id":695,"date":"2021-11-19T05:40:10","date_gmt":"2021-11-19T04:40:10","guid":{"rendered":"https:\/\/monblogeur.tech\/index.php\/2021\/11\/19\/why-facebook-wont-let-you-control-your-own-news-feed-the-washington-post\/"},"modified":"2021-11-19T05:40:10","modified_gmt":"2021-11-19T04:40:10","slug":"why-facebook-wont-let-you-control-your-own-news-feed-the-washington-post","status":"publish","type":"post","link":"https:\/\/monblogeur.tech\/index.php\/2021\/11\/19\/why-facebook-wont-let-you-control-your-own-news-feed-the-washington-post\/","title":{"rendered":"Why Facebook won&#039;t let you control your own news feed &#8211; The Washington Post"},"content":{"rendered":"<p>In at least two experiments over the years, Facebook has explored what happens when it turns off its controversial news feed ranking system \u2014 the software that decides for each user which posts they\u2019ll see and in what order, internal documents show. That leaves users to see all the posts from all of their friends in simple, chronological order.<br \/>Both tests appear to have taught Facebook\u2019s researchers the same lesson: Users are better off with Facebook\u2019s software calling the shots.<br \/>The internal research documents, some previously unreported, help to explain why Facebook seems so wedded to its automated ranking system, known as the news feed algorithm. That system is under intense public scrutiny.<br \/>In testimony to U.S. 
Congress and abroad, whistleblower Frances Haugen has pointed to the algorithm as central to the social network\u2019s problems, arguing that it systematically amplifies and rewards hateful, divisive, misleading and sometimes outright false content by putting it at the top of users\u2019 feeds. And previously reported internal documents, which Haugen provided to regulators and media outlets, including The Washington Post, have shown how Facebook <a href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/10\/26\/facebook-angry-emoji-algorithm\/?itid=lk_inline_manual_5\" target=\"_blank\" rel=\"noopener\">crafts its ranking system<\/a> to keep users hooked, sometimes <a href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/10\/26\/facebook-angry-emoji-algorithm\/?itid=lk_inline_manual_5\" target=\"_blank\" rel=\"noopener\">at the cost of angering or misinforming them<\/a>.<br \/>A growing number of lawmakers in both parties now think users should have an option to disable such automated ranking systems \u2014 for good. A bill introduced in the House of Representatives this week would require social media companies to offer a version of their services that <a href=\"https:\/\/www.axios.com\/algorithm-bill-house-bipartisan-5293581e-430f-4ea1-8477-bd9adb63519c.html\" target=\"_blank\" rel=\"noopener\">doesn\u2019t rely on opaque algorithms<\/a> to decide what users see. It joins a similar bill in the Senate. Both are sponsored by high-ranking members of both parties, giving the legislation a viable path to become law. 
(They are distinct from previous proposals that seek to regulate algorithms through other means, such as by <a href=\"https:\/\/www.washingtonpost.com\/politics\/2021\/10\/14\/top-democrats-unveil-bill-rein-tech-companies-malicious-algorithms\/?itid=lk_inline_manual_6\" target=\"_blank\" rel=\"noopener\">allowing platforms to be sued<\/a> when they amplify illegal content.)<br \/><span class=\"font--article-body font-copy hide-for-print ma-0 pb-md db italic interstitial\"><a data-qa=\"interstitial-link\" href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/10\/12\/congress-regulate-facebook-algorithm\/?itid=lk_interstitial_manual_7\">Lawmakers\u2019 latest idea to fix Facebook: Regulate the algorithm<\/a><\/span><br \/>The political push raises an old question for Facebook: Why not just give users the power to turn off their feed ranking algorithms voluntarily? Would letting users opt to see every post from the people they follow, in chronological order, be so bad?<br \/>The documents suggest that Facebook\u2019s defense of algorithmic rankings stems not only from its business interests, but from a paternalistic conviction, backed by data, that its sophisticated personalization software knows what users want better than the users themselves. 
It\u2019s a view that likely extends beyond Facebook: Rivals such as Twitter, TikTok and YouTube rely heavily on automated content recommendation systems, as does Facebook\u2019s corporate sibling Instagram.<br \/>But critics say this view misses something important: the value of giving users more agency over their information diet.<br \/>Since 2009, three years after it launched the news feed, Facebook has used <a href=\"https:\/\/www.washingtonpost.com\/technology\/interactive\/2021\/how-facebook-algorithm-works\/?itid=ap_willoremus&#038;itid=lk_inline_manual_14\" target=\"_blank\" rel=\"noopener\">software that predicts which posts each user will find most interesting<\/a> and places those at the top of their feeds while burying others. That system, which has evolved in complexity to take in as many as 10,000 pieces of information about each post, has fueled the news feed\u2019s growth into a dominant information source.<br \/>The proliferation of false information, conspiracy theories and partisan propaganda on Facebook and other social networks has led some to wonder whether we wouldn\u2019t all be better off with a simpler, older system: one that simply shows people all the messages, pictures and videos from everyone they follow, in the order they were posted. That was more or less how Instagram and Twitter worked until 2016. But Facebook has long resisted it.<br \/>\u201cResearch we\u2019ve done shows that unranked feeds can lead to integrity issues and other problems,\u201d spokeswoman Ariana Anthony said, asked why Facebook won\u2019t let users turn off ranking permanently.<br \/>Internal documents make clear that Facebook\u2019s decisions around feed ranking have not always been guided by concerns about \u201cintegrity,\u201d which is Facebook\u2019s term for content that may be harmful or misleading. 
Rather, they appear to have been informed mostly by data on user engagement, at least until recently.<br \/>\u201cWhenever we\u2019ve tried to compare ranked and unranked feeds, ranked feeds just seem better,\u201d wrote an employee in a memo titled, \u201cIs ranking good?\u201d, which was posted to the company\u2019s internal network, Facebook Workplace, in 2018. That employee, who said they had worked on and studied the news feed for two years, went on to question whether automated ranking might also come with costs that are harder to measure than the benefits. \u201cEven asking this question feels slightly blasphemous at Facebook,\u201d they added.<br \/>In 2014, another internal report, titled \u201cFeed ranking is good,\u201d summarized the results of tests that found allowing users to turn off the algorithm led them to spend less time in their news feeds, post less often and interact less. Ultimately, they began logging into Facebook less often, imperiling the years-long growth in user engagement that has long powered the company\u2019s lucrative advertising business. Without an algorithm deciding which posts to show at the top of users\u2019 feeds, concluded the report\u2019s author, whose name was redacted, \u201cFacebook would probably be shrinking.\u201d<br \/>What many users may not realize is that Facebook actually does offer an option to see a mostly chronological feed, called \u201cmost recent,\u201d if you <a href=\"https:\/\/www.facebook.com\/help\/218728138156311\" target=\"_blank\" rel=\"noopener\">select it from a settings menu<\/a>. To reach it today on Facebook\u2019s mobile app, you have to tap the tiny \u201cmenu\u201d icon at the bottom of your feed, then find and select \u201cmost recent.\u201d A shortcut that Facebook introduced in March, called the \u201cfeed filter bar,\u201d did not work at all on this reporter\u2019s account.<br \/>But there\u2019s a catch: The setting only applies for as long as you stay logged in. 
When you leave and come back, the ranking algorithm will be back on.<br \/>In the 2014 test, which has not been previously reported, the company toyed with honoring the \u201cmost recent\u201d setting for longer and shorter periods of time after a user selected it from the settings \u2014 that is, with leaving the ranking algorithm off for longer and shorter periods before reverting to it. The results were not encouraging, from Facebook\u2019s standpoint. The longer Facebook left the user\u2019s feed in chronological order, the less time they spent on it, the less they posted, and the less often they returned to Facebook.<br \/>In a comment on the report, one Facebook employee asked whether the company would be better off removing the chronological feed option altogether: \u201cIt seems like a really bad experience to click \u2018Most recent\u2019 and then have it default back after 12 hours. This seems like it would be more frustrating than not having the option at all.\u201d<br \/><span class=\"font--article-body font-copy hide-for-print ma-0 pb-md db italic interstitial\"><a data-qa=\"interstitial-link\" href=\"https:\/\/www.washingtonpost.com\/technology\/interactive\/2021\/how-facebook-algorithm-works\/?itid=lk_interstitial_manual_25\">How Facebook shapes your feed<\/a><\/span><br \/>A separate report from 2018, first described by <a href=\"https:\/\/bigtechnology.substack.com\/p\/facebook-removed-the-news-feed-algorithm\" target=\"_blank\" rel=\"noopener\">Alex Kantrowitz\u2019s newsletter Big Technology<\/a>, found that turning off the algorithm unilaterally for a subset of Facebook users, and showing them posts mostly in the order they were posted, led to \u201cmassive engagement drops.\u201d Notably, it also found that users saw more low-quality content in their feeds, at least at first, although the company\u2019s researchers were able to mitigate that with more aggressive \u201cintegrity\u201d measures.<br \/>That last finding has since become 
Facebook\u2019s go-to justification for its ranking algorithm.<br \/>Nick Clegg, the company\u2019s vice president of global affairs, said in <a href=\"https:\/\/twitter.com\/ThisWeekABC\/status\/1447210036422393861\" target=\"_blank\" rel=\"noopener\">a TV interview last month<\/a> that if Facebook were to remove the news feed algorithm, \u201cthe first thing that would happen is that people would see more, not less, hate speech; more, not less, misinformation; more, not less, harmful content. Why? Because those algorithmic systems precisely are designed like a great sort of giant spam filter to identify and deprecate and downgrade bad content.\u201d<br \/>Some critics say that\u2019s a straw-man argument. Simply removing automated rankings for a subset of users, on a social network that has been built to rely heavily on those systems, is not the same as designing a service to work well without them, said Ben Grosser, a professor of new media at University of Illinois at Urbana-Champaign. Those users\u2019 feeds are no longer curated, but the posts they\u2019re seeing are still influenced by the algorithm\u2019s reward systems. That is, they\u2019re still seeing content from people and publishers who are vying for the likes, shares and comments that drive Facebook\u2019s recommendations.<br \/>And because the algorithm has always been there, Facebook users haven\u2019t been given the time or the tools to curate their feeds for themselves in thoughtful ways. 
In other words, Facebook has never really given a chronological news feed a fair shot to succeed.<br \/>Grosser runs a<a href=\"https:\/\/bengrosser.com\/projects\/minus\/\" target=\"_blank\" rel=\"noopener\"> small, experimental social network called \u201cMinus,\u201d<\/a> which has a chronological feed, no likes or other visible reward system, and no ranking algorithm.<br \/>\u201cMy experience from watching a chronological feed within a social network that isn\u2019t always trying to optimize for growth is that a lot of these problems\u201d \u2014 such as hate speech, trolling and manipulative media \u2014 \u201cjust don\u2019t exist.\u201d<br \/>Facebook is not the only social platform with an opaque ranking algorithm, of course. Twitter also uses machine-learning software to rank the tweets people see in their timelines. Like Facebook, it offers an option to see tweets in chronological order. In Twitter\u2019s case, that setting is much more accessible, requiring a single tap on the \u201csparkle\u201d icon above the main feed. TikTok\u2019s \u201cFor You\u201d page, meanwhile, is entirely algorithmic, with no option to turn off automated rankings. The same is true of Instagram.<br \/>Facebook has not taken an official stand on the legislation that would require social networks to offer a chronological feed option, but Clegg said in <a href=\"https:\/\/www.usatoday.com\/story\/opinion\/todaysdebate\/2021\/10\/12\/facebook-congress-pass-new-internet-regulations\/6092534001\/\" target=\"_blank\" rel=\"noopener\">an op-ed last month<\/a> that the company is open to regulation around algorithms, transparency, and user controls.<br \/>Twitter, for its part, signaled potential support for the bills.<br \/>\u201cWe agree that increased transparency and choice in tech are important, and we\u2019re encouraged that Congress is focusing on these issues,\u201d said Lauren Culbertson, Twitter\u2019s head of U.S. public policy. 
\u201cWe firmly believe that people should have meaningful control over their experience on Twitter, and that people should be provided with the information they need to make informed choices.\u201d<br \/>Interesting as Facebook\u2019s own research on chronological feeds might be, it shouldn\u2019t be considered definitive for purposes of policymaking, said Nathalie Mar\u00e9chal, senior policy and partnerships manager for the nonprofit Ranking Digital Rights.<br \/>\u201cOnly companies themselves can do the experiments to find the answers. And as talented as industry researchers are, we can\u2019t trust executives to make decisions in the public interest based on that research, or to let the public and policymakers access that research.\u201d<br \/>\u201cI think users have the right to expect social media experiences free of recommendation algorithms,\u201d Mar\u00e9chal added. \u201cAs a user, I want to have as much control over my own experience as possible, and recommendation algorithms take that control away from me.\u201d<br \/><i>Correction: Twitter launched its algorithmic timeline in 2016. An earlier version of this story incorrectly said it launched in 2017.<\/i><br \/>The <a href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/10\/25\/what-are-the-facebook-papers\/\" target=\"_blank\" rel=\"noopener\"><b>Facebook Papers<\/b><\/a> are a set of internal documents that were provided to Congress in redacted form by Frances Haugen\u2019s legal counsel. 
The redacted versions were reviewed by a consortium of news organizations, including The Washington Post.<br \/>The trove of documents shows how <a href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/10\/25\/mark-zuckerberg-facebook-whistleblower\/\" target=\"_blank\" rel=\"noopener\"><b>Facebook CEO Mark Zuckerberg<\/b><\/a> has, at times, contradicted, downplayed or failed to disclose company findings on the impact of its products and platforms.<br \/>The documents also provided new details of the social media platform\u2019s role in fomenting the <a href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/10\/22\/jan-6-capitol-riot-facebook\/\" target=\"_blank\" rel=\"noopener\"><b>storming of the U.S. Capitol<\/b><\/a>.<br \/>Facebook engineers gave <a href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/10\/26\/facebook-angry-emoji-algorithm\/\" target=\"_blank\" rel=\"noopener\"><b>extra value to emoji reactions, including \u2018angry,\u2019<\/b><\/a> pushing more emotional and provocative content into users\u2019 news feeds.<br \/><b>Read more from The Post\u2019s investigation:<\/b><br \/><a href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/10\/25\/what-are-the-facebook-papers\/\" target=\"_blank\" rel=\"noopener\">Key takeaways from the Facebook Papers<\/a><br \/><a href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/10\/24\/india-facebook-misinformation-hate-speech\/\">How Facebook neglected the rest of the world, fueling hate speech and violence in India<\/a><br \/><a href=\"https:\/\/www.washingtonpost.com\/technology\/interactive\/2021\/how-facebook-algorithm-works\/\" target=\"_blank\" rel=\"noopener\">How Facebook shapes your feed<\/a><\/p>\n<p><a 
href=\"https:\/\/www.washingtonpost.com\/technology\/2021\/11\/13\/facebook-news-feed-algorithm-how-to-turn-it-off\/\">source<\/a><\/p>","protected":false},"excerpt":{"rendered":"<p>In at least two experiments over the years, Facebook has explored what happens when it turns off its controversial news feed ranking system \u2014 the software that decides for each user which posts they\u2019ll see and in what order, internal documents show. That leaves users to see all the posts from all of their friends [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"googlesitekit_rrm_CAow1sXXCw:productID":"","_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":false,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[1],"tags":[],"class_list":["post-695","post","type-post","status-publish","format-standard","hentry","category-non-classe"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts\/695","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/comments?post=695"}],"version-history":[{"count":0,"href":"https:\/\/monblogeur.tech\/index.php\/wp-js
on\/wp\/v2\/posts\/695\/revisions"}],"wp:attachment":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/media?parent=695"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/categories?post=695"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/tags?post=695"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}