{"id":724,"date":"2021-11-19T10:12:21","date_gmt":"2021-11-19T09:12:21","guid":{"rendered":"https:\/\/monblogeur.tech\/index.php\/2021\/11\/19\/facebook-papers-history-will-not-judge-us-kindly-the-atlantic\/"},"modified":"2021-11-19T10:12:21","modified_gmt":"2021-11-19T09:12:21","slug":"facebook-papers-history-will-not-judge-us-kindly-the-atlantic","status":"publish","type":"post","link":"https:\/\/monblogeur.tech\/index.php\/2021\/11\/19\/facebook-papers-history-will-not-judge-us-kindly-the-atlantic\/","title":{"rendered":"Facebook Papers: \u2018History Will Not Judge Us Kindly\u2019 &#8211; The Atlantic"},"content":{"rendered":"<p>Thousands of pages of internal documents offer the clearest picture yet of how Facebook endangers American democracy\u2014and show that the company\u2019s own employees know it.<br \/><span class=\"AboutTheAuthors_label__2SWZW\">About the author: <\/span><span><span><a href=\"https:\/\/www.theatlantic.com\/author\/adrienne-lafrance\/\" class=\"author-link\" data-label=\"https:\/\/www.theatlantic.com\/author\/adrienne-lafrance\/\" data-action=\"click author - name\"  >Adrienne LaFrance<\/a> is the executive editor of <em>The Atlantic<\/em>. 
She was previously a senior editor and staff writer at <em>The Atlantic, <\/em>and the editor of TheAtlantic.com.<\/span><\/span> <br \/>B<span class=\"smallcaps\">efore I tell you what happened<\/span> at exactly 2:28 p.m. on Wednesday, January 6, 2021, at the White House\u2014and how it elicited a very specific reaction, some 2,400 miles away, in Menlo Park, California\u2014you need to remember the mayhem of that day, the exuberance of the mob as it gave itself over to violence, and how several things seemed to happen all at once.<br \/>At 2:10 p.m., a live microphone captured a Senate aide\u2019s panicked warning that \u201cprotesters are in the building,\u201d and both houses of Congress began evacuating.<br \/>At 2:13 p.m., Vice President Mike Pence was hurried off the Senate floor and out of the chamber.<br \/>At 2:15 p.m., thunderous chants were heard: \u201cHang Mike Pence! Hang Mike Pence!\u201d<br \/>At the White House, President Donald Trump was watching the insurrection live on television. The spectacle excited him. Which brings us to 2:28 p.m., the moment when Trump shared a message he had just tweeted with his 35 million Facebook followers: \u201cMike Pence didn\u2019t have the courage to do what should have been done to protect our Country and our Constitution \u2026 USA demands the truth!\u201d<br \/><a href=\"https:\/\/www.theatlantic.com\/ideas\/archive\/2021\/01\/attempted-coup\/617570\/\">David A. Graham: This is a coup<\/a><br \/>Even for the Americans inured to the president\u2019s thumbed outbursts, Trump\u2019s attack against his own vice president\u2014at a moment when Pence was being hunted by the mob Trump sent to the Capitol\u2014was something else entirely. Horrified Facebook employees scrambled to enact \u201cbreak the glass\u201d measures, steps they could take to quell the further use of their platform for inciting violence. 
That evening, Mark Zuckerberg, Facebook\u2019s founder and CEO, posted a message on Facebook\u2019s internal chat platform, known as Workplace, under the heading \u201cEmployee FYI.\u201d<br \/>\u201cThis is a dark moment in our nation\u2019s history,\u201d Zuckerberg wrote, \u201cand I know many of you are frightened and concerned about what\u2019s happening in Washington, DC. I\u2019m personally saddened by this mob violence.\u201d<br \/>Facebook staffers weren\u2019t sad, though. They were angry, and they were very specifically angry at Facebook. Their message was clear: <i>This is our fault.<\/i><br \/>Chief Technology Officer Mike Schroepfer asked employees to \u201chang in there\u201d as the company figured out its response. \u201cWe have been \u2018hanging in there\u2019 for years,\u201d one person replied. \u201cWe must demand more action from our leaders. At this point, faith alone is not sufficient.\u201d<br \/>\u201cAll due respect, but haven\u2019t we had enough time to figure out how to manage discourse without enabling violence?\u201d another staffer responded. \u201cWe\u2019ve been fueling this fire for a long time and we shouldn\u2019t be surprised it\u2019s now out of control.\u201d<br \/>\u201cI\u2019m tired of platitudes; I want action items,\u201d another staffer wrote. \u201cWe\u2019re not a neutral entity.\u201d<br \/>\u201cOne of the darkest days in the history of democracy and self-governance,\u201d yet another staffer wrote. \u201cHistory will not judge us kindly.\u201d<br \/>Facebook employees have long understood that their company undermines democratic norms and restraints in America and across the globe. Facebook\u2019s hypocrisies, and its hunger for power and market domination, are not secret. Nor is the company\u2019s conflation of free speech and algorithmic amplification. 
But the events of January 6 proved for many people\u2014including many in Facebook\u2019s workforce\u2014to be a breaking point.<br \/><i>The Atlantic <\/i>reviewed thousands of pages of documents from Facebook, including internal conversations and research conducted by the company, from 2017 to 2021. Frances Haugen, the whistleblower and former Facebook engineer who testified before Congress earlier this month, filed a series of disclosures about Facebook to the Securities and Exchange Commission and to Congress before her testimony. Redacted versions of those documents were obtained by a consortium of more than a dozen news organizations, including <i>The Atlantic<\/i>. The names of Facebook employees are mostly blacked out.<br \/>The documents are astonishing for two reasons: First, because their sheer volume is unbelievable. And second, because these documents leave little room for doubt about Facebook\u2019s crucial role in advancing the cause of authoritarianism in America and around the world. Authoritarianism predates the rise of Facebook, of course. But Facebook makes it much easier for authoritarians to win.<br \/>Again and again, the Facebook Papers show staffers sounding alarms about the dangers posed by the platform\u2014how Facebook amplifies extremism and misinformation, how it incites violence, how it encourages radicalization and political polarization. Again and again, staffers reckon with the ways in which Facebook\u2019s decisions stoke these harms, and they plead with leadership to do more.<i> <\/i><br \/>And again and again, staffers say, Facebook\u2019s leaders ignore them.<br \/><span class=\"smallcaps\">By nightfall on January 6, 2021,<\/span> the siege had been reversed, though not without fatalities. Washington\u2019s mayor had issued a citywide curfew and the National Guard was patrolling the streets. 
Facebook announced that it would lock Trump\u2019s account, effectively preventing him from posting on the platform for 24 hours.<br \/>\u201cDo you genuinely think 24 hours is a meaningful ban?\u201d one Facebook staffer wrote on an internal message board. The staffer then turned, just as others had, to the years of failures and inaction that had preceded that day. \u201cHow are we expected to ignore when leadership overrides research based policy decisions to better serve people like the groups inciting violence today. Rank and file workers have done their part to identify changes to improve our platform but have been actively held back. Can you offer any reason we can expect this to change in the future.\u201d<br \/>It was a question without a question mark. The employee seemed to know that there wouldn\u2019t be a satisfying answer.<br \/>Facebook later extended the ban at least until the end of Trump\u2019s presidential term, and then, when Facebook\u2019s Oversight Board ruled against imposing an indefinite ban, it extended the temporary ban until at least January 7, 2023. But for some Facebook employees, the decision to crack down on Trump for inciting violence was comically overdue. Facebook had finally acted, but to many at the company, it was too little, too late. For months, Trump had incited the insurrection\u2014in plain sight, <i>on Facebook. <\/i><br \/>Facebook has dismissed the concerns of its employees in manifold ways. One of its cleverer tactics is to argue that staffers who have raised the alarm about the damage done by their employer are simply enjoying Facebook\u2019s \u201cvery open culture,\u201d in which people are encouraged to share their opinions, a spokesperson told me. 
This stance allows Facebook to claim transparency while ignoring the substance of the complaints, and the implication of the complaints: that many of Facebook\u2019s employees believe their company operates without a moral compass.<br \/><a href=\"https:\/\/www.theatlantic.com\/magazine\/archive\/2021\/11\/facebook-authoritarian-hostile-foreign-power\/620168\/\">Adrienne LaFrance: The largest autocracy on Earth<\/a><br \/>\u201cEmployees have been crying out for months to start treating high-level political figures the same way we treat each other on the platform,\u201d one employee wrote in the January 6 chat. \u201cThat\u2019s all we\u2019re asking for \u2026 Today, a coup was attempted against the United States. I hope the circumstances aren\u2019t even more dire next time we speak.\u201d<br \/>R<span class=\"smallcaps\">ewind two months<\/span> to November 4, 2020, the day after the presidential election. The outcome of the election was still unknown when a 30-year-old political activist created a Facebook group called \u201cStop the Steal.\u201d<br \/>\u201cDemocrats are scheming to disenfranchise and nullify Republican votes,\u201d the group\u2019s manifesto read. \u201cIt\u2019s up to us, the American people, to fight and to put a stop to it.\u201d Within hours, \u201cStop the Steal\u201d was growing at a mind-scrambling rate. At one point it was acquiring 100 new members every 10 seconds. It soon became one of the fastest-growing groups in Facebook history.<br \/>As \u201cStop the Steal\u201d metastasized, Facebook employees traded messages on the company\u2019s internal chat platform, expressing anxiety about their role in spreading election misinformation. \u201cNot only do we not do something about combustible election misinformation in comments,\u201d one wrote on November 5; \u201cwe amplify and give them broader distribution. 
Why?\u201d<br \/>By then, less than 24 hours after the group\u2019s creation, \u201cStop the Steal\u201d had grown to 333,000 members, and the group\u2019s administrator couldn\u2019t keep up with the pace of commenting. Facebook employees were worried that \u201cStop the Steal\u201d members were inciting violence, and the group came to the attention of executives. Facebook, to its credit, promptly shut down the group. But we now know that \u201cStop the Steal\u201d had already reached too many people, too quickly, to be contained. The movement jumped from one platform to another. And even when the group was removed by Facebook, the platform remained a key hub for people to coordinate the attack on the U.S. Capitol.<br \/>After the best-known \u201cStop the Steal\u201d Facebook group was dismantled, copycat groups sprang up. All the while, the movement was encouraged by President Trump, who posted to Facebook and Twitter, sometimes a dozen times a day, his complaint always the same\u2014he won, and Joe Biden lost. His demand was always the same as well: It was time for his supporters to fight for him and for their country.<\/p>\n<p>N<span class=\"smallcaps\">ever before in the history<\/span> of the Justice Department has an investigation been so tangled up with social media. Facebook is omnipresent in the related court documents, woven throughout the stories of how people came to be involved in the riot in the first place, and reappearing in accounts of chaos and bloodshed. More than 600 people have been charged with crimes in connection to January 6. Court documents also detail how Facebook provided investigators with identifying information about its users, as well as metadata that investigators used to confirm alleged perpetrators\u2019 whereabouts that day. 
Taken in aggregate, these court documents from January 6 are themselves a kind of facebook, one filled with selfies posted on Facebook apps over the course of the insurrection.<br \/><a href=\"https:\/\/www.theatlantic.com\/international\/archive\/2021\/05\/facebook-oversight-board-trump-problem\/618809\/\">Helen Lewis: The problem is Facebook<\/a><br \/>On a bright, chilly Wednesday weeks after the insurrection, when FBI agents finally rolled up to Russell Dean Alford\u2019s Paint &amp; Body Shop in Hokes Bluff, Alabama, <a href=\"https:\/\/www.justice.gov\/usao-dc\/case-multi-defendant\/file\/1393326\/download\">they said<\/a> Alford\u2019s reaction was this: \u201cI wondered when y\u2019all were going to show up. Guess you\u2019ve seen the videos on my Facebook page.\u201d Alford pleaded not guilty to four federal charges, including knowingly entering a restricted building and disorderly conduct.<br \/>Not only were the perpetrators live-streaming their crimes as they committed them, but federal court records show that those who have been indicted spent many weeks stoking violence on Facebook with posts such as \u201cNO EXCUSES! NO RETREAT! NO SURRENDER! TAKE THE STREETS! TAKE BACK OUR COUNTRY! 1\/6\/2021=7\/4\/1776\u201d and \u201cGrow a pair of balls and take back your government!\u201d<br \/>When you stitch together the stories that spanned the period between Joe Biden\u2019s election and his inauguration, it\u2019s easy to see Facebook as instrumental to the attack on January 6. (A spokesperson told me that the notion that Facebook played an instrumental role in the insurrection is \u201cabsurd.\u201d) Consider, for example, the case of Daniel Paul Gray. According to an FBI agent\u2019s affidavit, Gray posted several times on Facebook in December about his plans for January 6, commenting on one post, \u201cOn the 6th a f[*]cking sh[*]t ton of us are going to Washington to shut the entire city down. 
It\u2019s gonna be insane I literally can\u2019t wait.\u201d In a private message, he bragged that he\u2019d just joined a militia and also sent a message saying, \u201care you gonna be in DC on the 6th like trump asked us to be?\u201d Gray was later indicted on nine federal charges, including obstruction of an official proceeding, engaging in acts of physical violence, violent entry, assault, and obstruction of law enforcement. He has pleaded not guilty to all of them.<br \/>Then there\u2019s the case of Cody Page Carter Connell, who allegedly encouraged his Facebook friends to join him in D.C. on January 6. Connell ended up charged with eight federal crimes, and he pleaded not guilty to all of them. After the insurrection, according to an FBI affidavit, he boasted on Facebook about what he\u2019d done.<br \/>\u201cWe pushed the cops against the wall, they dropped all their gear and left,\u201d he wrote in one message.<br \/>\u201cYall boys something serious, lol,\u201d someone replied. \u201cIt lookin like a civil war yet?\u201d<br \/>Connell\u2019s response: \u201cIt\u2019s gonna come to it.\u201d<br \/>All over America, people used Facebook to organize convoys to D.C., and to fill the buses they rented for their trips. Facebook users shared and reshared messages like this one, which appeared before dawn on Christmas Eve in a Facebook group for the Lebanon Maine Truth Seekers:<br \/>This election was stolen and we are being slow walked towards Chinese ownership by an establishment that is treasonous and all too willing to gaslight the public into believing the theft was somehow the will of the people. Would there be an interest locally in organizing a caravan to Washington DC for the Electoral College vote count on Jan 6th, 2021? I am arranging the time off and will be a driver if anyone wishes to hitch a ride, or a lead for a caravan of vehicles. If a call went out for able bodies, would there be an answer? 
Merry Christmas.<br \/>The post was signed by Kyle Fitzsimons, who was later indicted on charges including attacking police officers on January 6. Fitzsimons has pleaded not guilty to all eight federal charges against him.<br \/>You may be thinking: <i>It\u2019s 2021; of course people used Facebook to plan the insurrection. It\u2019s what they use to plan all aspects of their lives.<\/i> But what emerges from a close reading of Facebook documents, and observation of the manner in which the company connects large groups of people quickly, is that Facebook isn\u2019t a passive tool but a catalyst. Had the organizers tried to plan the rally using other technologies of earlier eras, such as telephones, they would have had to identify and reach out individually to each prospective participant, then persuade them to travel to Washington. Facebook made people\u2019s efforts at coordination highly visible on a global scale. The platform not only helped them recruit participants but offered people a sense of strength in numbers. Facebook proved to be the perfect hype machine for the coup-inclined.<br \/>Among those charged with answering Trump\u2019s call for revolution were 17 people from Florida, Ohio, North Carolina, Georgia, Alabama, Texas, and Virginia who allegedly coordinated on Facebook and other social platforms to join forces with the far-right militia known as the Oath Keepers. One of these people, 52-year-old Kelly Meggs from rural Florida, allegedly participated with his wife in weapons training to prepare for January 6.<br \/>\u201cTrump said It\u2019s gonna be wild!!!!!!!\u201d Meggs wrote in a Facebook message on December 22, according to an indictment. \u201cIt\u2019s gonna be wild!!!!!!! He wants us to make it WILD that\u2019s what he\u2019s saying. He called us all to the Capitol and wants us to make it wild!!! Sir Yes Sir!!! 
Gentlemen we are heading to DC pack your shit!!\u201d Meggs and his Facebook friends arrived in Washington with paramilitary gear and battle-ready supplies\u2014including radio equipment, camouflage combat uniforms, helmets, eye protection, and tactical vests with plates. They\u2019re charged with conspiracy against the United States. Meggs has pleaded not guilty to all charges. His wife, Connie Meggs, has a trial date set for January 2022.<br \/>Ronald Mele, a 51-year-old California man, also used Facebook to share his plans for the insurrection, writing in a December Facebook post that he was taking a road trip to Washington \u201cto support our President on the 6th and days to follow just in case,\u201d according to his federal indictment. <a href=\"https:\/\/www.justice.gov\/opa\/press-release\/file\/1403191\/download\">Prosecutors say <\/a>he and five other men mostly used the chat app Telegram to make their plans\u2014debating which firearms, shotgun shells, and other weapons to bring with them and referring to themselves as soldiers in the \u201cDC Brigade\u201d\u2014and three of them posted to Instagram and Facebook about their plans as well. On January 2, four members of the group met at Mele\u2019s house in Temecula, about an hour north of San Diego. Before they loaded into an SUV and set out across the country, someone suggested that they take a group photo. The men posed together, making hand gestures associated with the Three Percenters, a far-right militia movement that\u2019s classified as a terrorist organization in Canada. (Mele has pleaded not guilty to all four charges against him.)<br \/>On January 6, federal prosecutors say, members of the DC Brigade were among the rioters who broke through the final police line, giving the mob access to the West Terrace of the Capitol. 
At 2:30 p.m., just after President Trump egged on the rioters on Facebook, Mele and company were on the West Terrace celebrating, taking selfies, and shouting at fellow rioters to go ahead and enter the Capitol. One of the men in the group, Alan Hostetter, a 56-year-old from San Clemente, posted a selfie to his Instagram account, with a crowd of rioters in the background. Hostetter, who has pleaded not guilty to all charges, tapped out a caption to go with the photo: \u201cThis was the \u2018shot heard \u2019round the world!\u2019 \u2026 the 2021 version of 1776. That war lasted 8 years. We are just getting warmed up.\u201d<br \/>I<span class=\"smallcaps\">n November 2019<\/span>, Facebook staffers noticed they had a serious problem. Facebook offers a collection of one-tap emoji reactions. Today, they include \u201clike,\u201d \u201clove,\u201d \u201ccare,\u201d \u201chaha,\u201d \u201cwow,\u201d \u201csad,\u201d and \u201cangry.\u201d Company researchers had found that the posts dominated by \u201cangry\u201d reactions were substantially more likely to go against community standards, including prohibitions on various types of misinformation, according to internal documents.<br \/>But Facebook was slow to act. In July 2020, researchers presented the findings of a series of experiments. At the time, Facebook was already weighting the reactions other than \u201clike\u201d more heavily in its algorithm\u2014meaning posts that got an \u201cangry\u201d reaction were more likely to show up in users\u2019 News Feeds than posts that simply got a \u201clike.\u201d Anger-inducing content didn\u2019t spread just because people were more likely to share things that made them angry; the algorithm gave anger-inducing content an edge. 
Facebook\u2019s Integrity workers\u2014employees tasked with tackling problems such as misinformation and espionage on the platform\u2014concluded that they had good reason to believe targeting posts that induced anger would help stop the spread of harmful content.<br \/>By dialing anger\u2019s weight back to zero in the algorithm, the researchers found, they could keep posts to which people reacted angrily from being viewed by as many users. That, in turn, translated to a significant (up to 5 percent) reduction in the hate speech, civic misinformation, bullying, and violent posts\u2014all of which are correlated with offline violence\u2014to which users were exposed. Facebook rolled out the change in early September 2020, documents show; a Facebook spokesperson confirmed that the change has remained in effect. It was a real victory for employees of the Integrity team.<br \/>But it doesn\u2019t normally work out that way. In April 2020, according to Frances Haugen\u2019s filings with the SEC, Facebook employees had recommended tweaking the algorithm so that the News Feed would deprioritize the surfacing of content for people based on their Facebook friends\u2019 behavior. The idea was that a person\u2019s News Feed should be shaped more by people and groups that a person had chosen to follow. Up until that point, if your Facebook friend saw a conspiracy theory and reacted to it, Facebook\u2019s algorithm might show it to you, too. The algorithm treated any engagement in your network as a signal that something was worth sharing. But now Facebook workers wanted to build circuit breakers to slow this form of sharing.<br \/>Experiments showed that this change would impede the distribution of hateful, polarizing, and violence-inciting content in people\u2019s News Feeds. But Zuckerberg \u201crejected this intervention that could have reduced the risk of violence in the 2020 election,\u201d Haugen\u2019s SEC filing says. 
An internal message characterizing Zuckerberg\u2019s reasoning says he wanted to avoid new features that would get in the way of \u201cmeaningful social interactions.\u201d But according to Facebook\u2019s definition, its employees say, engagement is considered \u201cmeaningful\u201d even when it entails bullying, hate speech, and reshares of harmful content.<br \/>This episode, like Facebook\u2019s response to the incitement that proliferated between the election and January 6, reflects a fundamental problem with the platform. Facebook\u2019s<a href=\"https:\/\/www.theatlantic.com\/technology\/archive\/2020\/12\/facebook-doomsday-machine\/617384\/\"> megascale<\/a> allows the company to influence the speech and thought patterns of billions of people. What the world is seeing now, through the window provided by reams of internal documents, is that Facebook catalogs and studies the harm it inflicts on people. And then it keeps harming people anyway.<br \/>\u201cI am worried that Mark\u2019s continuing pattern of answering a different question than the question that was asked is a symptom of some larger problem,\u201d wrote one Facebook employee in an internal post in June 2020, referring to Zuckerberg. \u201cI sincerely hope that I am wrong, and I\u2019m still hopeful for progress. But I also fully understand my colleagues who have given up on this company, and I can\u2019t blame them for leaving. Facebook is not neutral, and working here isn\u2019t either.\u201d<br \/>\u201cI just wish we could hear the truth directly,\u201d another added. \u201cAnything feels like we (the employees) are being intentionally deceived.\u201d<br \/><a href=\"https:\/\/www.theatlantic.com\/technology\/archive\/2017\/10\/what-facebook-did\/542502\/\">Read: What Facebook did to American democracy<\/a><br \/>I\u2019ve been covering Facebook for a decade now, and the challenges it must navigate are novel and singularly complex. 
One of the most important, and heartening, revelations of the Facebook Papers is that many Facebook workers are trying conscientiously to solve these problems. One of the disheartening features of these documents is that these same employees have little or no faith in Facebook leadership. It is quite a thing to see, the sheer number of Facebook employees\u2014people who presumably understand their company as well as or better than outside observers\u2014who believe their employer to be morally bankrupt.<br \/>I spoke with several former Facebook employees who described the company\u2019s metrics-driven culture as extreme, even by Silicon Valley standards. (I agreed not to name them, because they feared retaliation and ostracization from Facebook for talking about the company\u2019s inner workings.) Facebook workers are under tremendous pressure to quantitatively demonstrate their individual contributions to the company\u2019s growth goals, they told me. New products and features aren\u2019t approved unless the staffers pitching them demonstrate how they will drive engagement. As a result, Facebook has stoked an algorithm arms race within its ranks, pitting core product-and-engineering teams, such as the News Feed team, against their colleagues on Integrity teams, who are tasked with mitigating harm on the platform. These teams establish goals that are often in direct conflict with each other.<br \/>One of Facebook\u2019s Integrity staffers wrote at length about this dynamic in a goodbye note to colleagues in August 2020, describing how risks to Facebook users \u201cfester\u201d because of the \u201casymmetrical\u201d burden placed on employees to \u201cdemonstrate legitimacy and user value\u201d before launching any harm-mitigation tactics\u2014a burden not shared by those developing new features or algorithm changes with growth and engagement in mind. 
The note said:<br \/>We were willing to act only after things had spiraled into a dire state \u2026 Personally, during the time that we hesitated, I\u2019ve seen folks from my hometown go further and further down the rabbithole of QAnon and Covid anti-mask\/anti-vax conspiracy on FB. It has been painful to observe.<br \/>Current and former Facebook employees describe the same fundamentally broken culture\u2014one in which effective tactics for making Facebook safer are rolled back by leadership or never approved in the first place. (A Facebook spokesperson rejected the notion that it deprioritizes the well-being of its users.) That broken culture has produced a broken platform: an algorithmic ecosystem in which users are pushed toward ever more extreme content, and where Facebook knowingly exposes its users to conspiracy theories, disinformation, and incitement to violence.<br \/>One example is a program that amounts to a whitelist for VIPs on Facebook, allowing some of the users most likely to spread misinformation to break Facebook\u2019s rules without facing consequences. Under the program, internal documents show, millions of high-profile users\u2014including politicians\u2014are left alone by Facebook even when they incite violence. Some employees have flagged for their superiors how dangerous this is, explaining in one internal document that Facebook had solid evidence showing that when \u201ca piece of content is shared by a co-partisan politician, it tends to be perceived as more trustworthy, interesting, and helpful than if it\u2019s shared by an ordinary citizen.\u201d In other words, whitelisting influential users with massive followings on Facebook isn\u2019t just a secret and uneven application of Facebook\u2019s rules; it amounts to \u201cprotecting content that is especially likely to deceive, and hence to harm, people on our platforms.\u201d<br \/>Facebook workers tried and failed to end the program. 
Only when its existence was <a href=\"https:\/\/www.wsj.com\/articles\/facebook-files-xcheck-zuckerberg-elite-rules-11631541353\">reported in September by <i>The<\/i> <i>Wall Street Journal<\/i><\/a> did Facebook\u2019s Oversight Board ask leadership for more information about the practice. Last week, the board <a href=\"https:\/\/oversightboard.com\/news\/215139350722703-oversight-board-demands-more-transparency-from-facebook\/\">publicly rebuked<\/a> Facebook for not being \u201cfully forthcoming\u201d about the program. (Although Oversight Board members are selected by Facebook and paid by Facebook, the company characterizes their work as independent.)<br \/>The Facebook Papers show that workers agonized over trade-offs between what they saw as doing the right thing for the world and doing the right thing for their employer. \u201cI am so torn,\u201d one employee wrote in December 2020 in response to a colleague\u2019s comments on how to fight Trump\u2019s hate speech and incitements to violence. \u201cFollowing these recommendations could hasten our own demise in a variety of ways, which might interfere [with] all the other good we do in the world. How do you weigh these impacts?\u201d Messages show workers wanting Facebook to make honorable choices, and worrying that leadership is incapable of doing so. At the same time, many clearly believe that Facebook is still a net force for good, and they also worry about hurting the platform\u2019s growth.<br \/>These worries have been exacerbated lately by fears about a decline in new posts on Facebook, two former employees who left the company in recent years told me. People are posting new material less frequently to Facebook, and its users are on average older than those of other social platforms. The explosive popularity of platforms such as TikTok, especially among younger people, has rattled Facebook leadership. 
All of this makes the platform rely more heavily on ways it can manipulate what its users see in order to reach its goals. This explains why Facebook is so dependent on the infrastructure of groups, as well as making reshares highly visible, to keep people hooked.<br \/>But this approach poses a major problem for the overall quality of the site, and former Facebook employees repeatedly told me that groups pose one of the biggest threats of all to Facebook users. In a particularly fascinating document, Facebook workers outline the downsides of \u201ccommunity,\u201d a buzzword Zuckerberg often deploys as a way to justify the platform\u2019s existence. Zuckerberg has defined Facebook\u2019s mission as making \u201csocial infrastructure to give people the power to build a global community that works for all of us,\u201d but in internal research documents his employees point out that communities aren\u2019t always good for society:<br \/>When part of a community, individuals typically act in a prosocial manner. They conform, they forge alliances, they cooperate, they organize, they display loyalty, they expect obedience, they share information, they influence others, and so on. Being in a group changes their behavior, their abilities, and, importantly, their capability to harm themselves or others \u2026 Thus, when people come together and form communities around harmful topics or identities, the potential for harm can be greater.<br \/>The infrastructure choices that Facebook is making to keep its platform relevant are driving down the quality of the site, and exposing its users to more dangers. Those dangers are also unevenly distributed, because of the manner in which certain subpopulations are algorithmically ushered toward like-minded groups. 
And the subpopulations of Facebook users who are most exposed to dangerous content are also most likely to be in groups where it won\u2019t get reported.<br \/>Many Facebook employees believe that their company is hurting people. Many have believed this for years. And even <i>they<\/i> can\u2019t stop it. \u201cWe can\u2019t pretend we don\u2019t see information consumption patterns, and how deeply problematic they are for the longevity of democratic discourse,\u201d a user-experience researcher wrote in an internal comment thread in 2019, in response to a <a href=\"https:\/\/www.nytimes.com\/2020\/01\/07\/technology\/facebook-andrew-bosworth-memo.html\">now-infamous memo<\/a> from Andrew \u201cBoz\u201d Bosworth, a longtime Facebook executive. \u201cThere is no neutral position at this stage, it would be powerfully immoral to commit to amorality.\u201d<br \/>I<span class=\"smallcaps\">n the months since<\/span> January 6, Mark Zuckerberg has made a point of highlighting Facebook\u2019s willingness to help federal investigators with their work. \u201cI believe that the former president should be responsible for his words, and the people who broke the law should be responsible for their actions,\u201d Zuckerberg said in <a href=\"https:\/\/www.c-span.org\/video\/?510053-1\/house-hearing-combating-online-misinformation-disinformation&amp;live=\">congressional testimony<\/a> last spring. \u201cSo that leaves the question of the broader information ecosystem. Now, I can\u2019t speak for everyone else\u2014the TV channels, radio stations, news outlets, websites, and other apps. But I can tell you what we did. Before January 6, we worked with law enforcement to identify and address threats. During and after the attack, we provided extensive support in identifying the insurrectionists, and removed posts supporting violence. 
We didn\u2019t catch everything, but we made our services inhospitable to those who might do harm.\u201d<br \/><a href=\"https:\/\/www.theatlantic.com\/technology\/archive\/2019\/05\/how-powerful-mark-zuckerberg\/589129\/\">Read: Mark Zuckerberg\u2019s power is unprecedented<\/a><br \/>Zuckerberg\u2019s positioning of Facebook\u2019s role in the insurrection is odd. He lumps his company in with traditional media organizations\u2014something he\u2019s ordinarily loath to do, lest the platform be expected to take more responsibility for the quality of the content that appears on it\u2014and suggests that Facebook did more, and did better, than journalism outlets in its response to January 6. What he fails to say is that journalism outlets would never be in the position to help investigators this way, because insurrectionists don\u2019t typically use newspapers and magazines to recruit people for coups.<br \/>In hindsight, it is easy to say that Facebook should have made itself far more hostile to insurrectionists before they carried out their attack. But people post passionately about lawful protests all the time. How is Facebook to know which protests will spill into violence and which won\u2019t? The answer here is simple: because its own staffers have obsessively studied this question, and they\u2019re confident that they\u2019ve already found ways to make Facebook safer.<br \/>Facebook wants people to believe that the public must choose between Facebook as it is, on the one hand, and free speech, on the other. This is a false choice. Facebook has a sophisticated understanding of measures it could take to make its platform safer without resorting to broad or ideologically driven censorship tactics.<br \/>Facebook knows that no two people see the same version of the platform, and that certain subpopulations experience far more dangerous versions than others do. 
Facebook knows that people who are isolated\u2014recently widowed or divorced, say, or geographically distant from loved ones\u2014are disproportionately at risk of being exposed to harmful content on the platform. It knows that repeat offenders are disproportionately responsible for spreading misinformation. And it knows that 3 percent of Facebook users in the United States are super-consumers of conspiracy theories, accounting for 37 percent of known consumption of misinformation on the platform.<br \/>The most viral content on Facebook is basically untouchable\u2014some is so viral that even turning down the distribution knob by 90 percent wouldn\u2019t make a dent in its ability to ricochet around the internet. (A Facebook spokesperson told me that although the platform sometimes reduces how often people see content that has been shared by a chain of two or more people, it is reluctant to apply that solution more broadly: \u201cWhile we have other systems that demote content that might violate our specific policies, like hate speech or nudity, this intervention reduces all content with equal strength. Because it is so blunt, and reduces positive and completely benign speech alongside potentially inflammatory or violent rhetoric, we use it sparingly.\u201d)<br \/>Facebook knows that there are harmful activities taking place on the platform that don\u2019t break any rules, including much of the coordination leading up to January 6. And it knows that its interventions touch only a minuscule fraction of Facebook content anyway. Facebook knows that it is sometimes used to facilitate large-scale societal violence. And it knows that it has acted too slowly to prevent such violence in the past.<br \/>Facebook could ban reshares. It could consistently enforce its policies regardless of a user\u2019s political power. It could choose to optimize its platform for safety and quality rather than for growth. 
It could tweak its algorithm to prevent widespread distribution of harmful content. Facebook could create a transparent dashboard so that all of its users can see what\u2019s going viral in real time. It could make public its rules for how frequently groups can post and how quickly they can grow. It could also automatically throttle groups when they\u2019re growing too fast, and cap the rate of virality for content that\u2019s spreading too quickly.<br \/>Facebook could shift the burden of proof toward people and communities to demonstrate that they\u2019re good actors\u2014and treat reach as a privilege, not a right. Facebook could say that its platform is not for everyone. It could sound an alarm for those who wander into the most dangerous corners of Facebook, and those who encounter disproportionately high levels of harmful content. It could hold its employees accountable for preventing users from finding these too-harmful versions of the platform, thereby preventing those versions from existing.<br \/>It could do all of these things. But it doesn\u2019t.<br \/>Facebook certainly isn\u2019t the only harmful entity on the social web. Extremism thrives on other social platforms as well, and plenty of them are fueled by algorithms that are equally opaque. Lately, people have been debating just how nefarious Facebook really is. One argument goes something like this:<i> Facebook\u2019s algorithms aren\u2019t magic, its ad targeting isn\u2019t even that good, and most people aren\u2019t that stupid. <\/i><br \/>All of this may be true, but that shouldn\u2019t be reassuring. An algorithm may just be a big dumb means to an end, a clunky way of maneuvering a massive, dynamic network toward a desired outcome. But Facebook\u2019s enormous size gives it tremendous, unstable power. Facebook takes whole populations of people, pushes them toward radicalism, and then steers the radicalized toward one another. 
For those who found themselves in the \u201cStop the Steal\u201d corners of Facebook in November and December of last year, the enthusiasm, the sense of solidarity, must have been overwhelming and thrilling. Facebook had taken warped reality and distributed it at scale.<br \/>I\u2019ve sometimes compared Facebook to <a href=\"https:\/\/www.theatlantic.com\/technology\/archive\/2020\/12\/facebook-doomsday-machine\/617384\/\">a Doomsday Machine<\/a> in that it is technologically simple and unbelievably dangerous\u2014a black box of sensors designed to suck in environmental cues and deliver mutually assured destruction. When the most powerful company in the world possesses an instrument for manipulating billions of people\u2014an instrument that only it can control, and that its own employees say is badly broken and dangerous\u2014we should take notice.<br \/>The lesson for individuals is this: You must be vigilant about the informational streams you swim in, deliberate about how you spend your precious attention, unforgiving of those who weaponize your emotions and cognition for their own profit, and deeply untrusting of any scenario in which you\u2019re surrounded by a mob of people who agree with everything you\u2019re saying.<br \/>And the lesson for Facebook is that the public is beginning to recognize that it deserves much greater insight into how the platform\u2019s machinery is designed and deployed. Indeed, that\u2019s the only way to avoid further catastrophe. 
Without seeing how Facebook works at a finer resolution, in real time, we won\u2019t be able to understand how to make the social web compatible with democracy.<\/p>\n<p><a href=\"https:\/\/www.theatlantic.com\/ideas\/archive\/2021\/10\/facebook-papers-democracy-election-zuckerberg\/620478\/\">source<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Only through November 30: Try subscriber newsletters for freeThousands of pages of internal documents offer the clearest picture yet of how Facebook endangers American democracy\u2014and show that the company\u2019s own employees know it.About the author: Adrienne LaFrance is the executive editor of The Atlantic. She was previously a senior editor and staff writer at The [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"googlesitekit_rrm_CAow1sXXCw:productID":"","_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":false,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[1],"tags":[],"class_list":["post-724","post","type-post","status-publish","format-standard","hentry","category-non-classe"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts\/724","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\
/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/comments?post=724"}],"version-history":[{"count":0,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts\/724\/revisions"}],"wp:attachment":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/media?parent=724"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/categories?post=724"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/tags?post=724"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}