{"id":972,"date":"2021-11-21T03:32:16","date_gmt":"2021-11-21T02:32:16","guid":{"rendered":"https:\/\/monblogeur.tech\/index.php\/2021\/11\/21\/how-facebook-and-google-fund-global-misinformation-mit-technology-review\/"},"modified":"2021-11-21T03:32:16","modified_gmt":"2021-11-21T02:32:16","slug":"how-facebook-and-google-fund-global-misinformation-mit-technology-review","status":"publish","type":"post","link":"https:\/\/monblogeur.tech\/index.php\/2021\/11\/21\/how-facebook-and-google-fund-global-misinformation-mit-technology-review\/","title":{"rendered":"How Facebook and Google fund global misinformation &#8211; MIT Technology Review"},"content":{"rendered":"<div class=\"cfbc967f0983488262956e73eca9483a\" data-index=\"1\" style=\"float: none; margin:10px 0 10px 0; text-align:center;\">\n<script async src=\"https:\/\/pagead2.googlesyndication.com\/pagead\/js\/adsbygoogle.js?client=ca-pub-3859091246952232\" crossorigin=\"anonymous\"><\/script>\r\n<!-- blok -->\r\n<ins class=\"adsbygoogle\" data-ad-client=\"ca-pub-3859091246952232\" data-ad-slot=\"1334354390\"><\/ins>\r\n<script>\r\n     (adsbygoogle = window.adsbygoogle || []).push({});\r\n<\/script>\r\n\n<\/div>\n<p><span style=\"font-weight:400\">The tech giants are paying millions of dollars to the operators of clickbait pages, bankrolling the deterioration of information ecosystems around the world.<\/span><br \/>Myanmar, March 2021.<br \/>A month after the fall of the democratic government.<br \/>A Facebook Live video showed hundreds of people protesting against the military coup on the streets of Myanmar.<br \/>It had nearly 50,000 shares and over 1.5 million views, in a country with a little over 54 million people.<br \/>Observers, unable to see the events on the ground, used the footage, along with hundreds of other live feeds, to track and document the unfolding situation. 
(MIT Technology Review blurred the names and images of the posters to avoid jeopardizing their safety.)<br \/>But less than a day later, the same video would be broadcast again multiple times, each still claiming to be live.<br \/>In the middle of a massive political crisis, there was no longer a way to discern what was real and what wasn\u2019t.<\/p>\n<p>In 2015, six of the 10 websites in Myanmar getting the most engagement on Facebook were from legitimate media, according to data from CrowdTangle, a Facebook-run tool. A year later, Facebook (which recently rebranded to Meta) offered global access to Instant Articles, a program publishers could use to monetize their content.<br \/>One year after that rollout, legitimate publishers accounted for only two of the top 10 publishers on Facebook in Myanmar. By 2018, they accounted for zero. All the engagement had instead gone to fake news and clickbait websites. In a country where Facebook is synonymous with the internet, the low-grade content overwhelmed other information sources.<br \/>It was during this rapid degradation of Myanmar\u2019s digital environment that a militant group of Rohingya\u2014a predominantly Muslim ethnic minority\u2014attacked and killed a dozen members of the security forces, in August of 2017. As police and military began to crack down on the Rohingya and push out anti-Muslim propaganda, fake news articles capitalizing on the sentiment went viral. They claimed that Muslims were armed, that they were gathering in mobs 1,000 strong, that they were around the corner coming to kill you.<br \/>It\u2019s still not clear today whether the fake news came primarily from political actors or from financially motivated ones. But either way, the sheer volume of fake news and clickbait acted like fuel on the flames of already dangerously high ethnic and religious tensions. 
It shifted public opinion and escalated the conflict, which ultimately led to the death of 10,000 Rohingya, by conservative estimates, and the displacement of 700,000 more.<br \/>In 2018, a United Nations investigation determined that the violence against the Rohingya constituted a genocide and that Facebook had played a \u201cdetermining role\u201d in the atrocities. Months later, Facebook admitted it hadn\u2019t done enough \u201cto help prevent our platform from being used to foment division and incite offline violence.\u201d<br \/>Over the last few weeks, the revelations from the Facebook Papers, a collection of internal documents provided to Congress and a consortium of news organizations by whistleblower Frances Haugen, have reaffirmed what civil society groups have been saying for years: Facebook\u2019s algorithmic amplification of inflammatory content, combined with its failure to prioritize content moderation outside the US and Europe, has fueled the spread of hate speech and misinformation, dangerously destabilizing countries around the world.<br \/>But there\u2019s a crucial piece missing from the story. Facebook isn\u2019t just amplifying misinformation.<br \/>The company is also funding it.<br \/>An MIT Technology Review investigation, based on expert interviews, data analyses, and documents that were not included in the Facebook Papers, has found that Facebook and Google are paying millions of ad dollars to bankroll clickbait actors, fueling the deterioration of information ecosystems around the world.<br \/>Facebook <a href=\"https:\/\/www.facebook.com\/formedia\/blog\/introducing-instant-articles\">launched its Instant Articles program<\/a> in 2015 with a handful of US and European publishers. The company billed the program as a way to improve article load times and create a slicker user experience.<br \/>That was the public sell. But the move also conveniently captured advertising dollars from Google. Before Instant Articles, articles posted on Facebook would redirect to a browser, where they\u2019d open up on the publisher\u2019s own website. The ad provider, usually Google, would then cash in on any ad views or clicks. With the new scheme, articles would open up directly within the Facebook app, and Facebook would own the ad space. If a participating publisher had also opted in to monetizing with Facebook\u2019s advertising network, called Audience Network, Facebook could insert ads into the publisher\u2019s stories and take a 30% cut of the revenue.<br \/>Instant Articles quickly fell out of favor with its original cohort of big mainstream publishers. For them, the payouts weren\u2019t high enough compared with other available forms of monetization. But that was not true for publishers in the Global South, which Facebook began <a href=\"https:\/\/www.wsj.com\/articles\/facebook-opens-up-instant-articles-to-all-publishers-1455732001\">accepting into the program in 2016<\/a>. In 2018, the company reported paying out <a href=\"https:\/\/web.archive.org\/web\/20200106054957\/https:\/\/www.facebook.com\/audiencenetwork\/\">$1.5 billion<\/a> to publishers and app developers (who can also participate in Audience Network). By 2019, that figure had reached <a href=\"https:\/\/www.facebook.com\/audiencenetwork\/resources\/blog\/preparing-audience-network-for-ios14\">multiple billions<\/a>.<br \/>Early on, Facebook performed little quality control on the types of publishers joining the program. The platform\u2019s design also didn\u2019t sufficiently penalize users for posting identical content across Facebook pages\u2014in fact, it rewarded the behavior. 
Posting the same article on multiple pages could as much as double the number of users who clicked on it, and with them the ad revenue it generated.<br \/>Clickbait farms around the world seized on this flaw as a strategy\u2014one they still use today.<br \/>A farm will create a website or multiple websites\u2026<br \/>\u2026for publishing predominantly plagiarized content.<br \/>It registers them with <span id=\"instant-articles\" class=\"csti-underline\">Instant Articles<\/span> and <span id=\"audience-network\" class=\"csti-underline\">Audience Network<\/span>,<br \/>which inserts ads into their articles.<br \/>Then it posts those articles across a cluster of as many as dozens of Facebook pages at a time.<br \/>Clickbait actors cropped up in Myanmar overnight. With the right recipe for producing engaging and evocative content, they could generate thousands of US dollars a month in ad revenue, or 10 times the average monthly salary\u2014paid to them directly by Facebook.<br \/>\ud83d\udd34 Scammers used to make their $$ from naive people. Now they get their payments straight from some of the world&rsquo;s biggest tech companies. Sorry David &#8212; but this is NOT equivalent. 
<a href=\"https:\/\/t.co\/mhMZMTNi6e\">https:\/\/t.co\/mhMZMTNi6e<\/a> <a href=\"https:\/\/t.co\/hgqYBcHw8U\">pic.twitter.com\/hgqYBcHw8U<\/a><br \/>If this is wild &#8211; let&rsquo;s check out <a href=\"https:\/\/twitter.com\/Google?ref_src=twsrc%5Etfw\">@google<\/a>, who officially says it does not enable monetisation in Myanmar (too risky) but in practice turns a blind eye and finds no issue with making payments into Myanmar bank accounts \u00f0\u0178\u00a4\u00a6\u00e2\u20ac\u008d\u00e2\u2122\u20ac\u00ef\u00b8\u008f<br \/>The rules \u00f0\u0178\u2018\u2030<a href=\"https:\/\/t.co\/vCdrTpkGbf\">https:\/\/t.co\/vCdrTpkGbf<\/a> <a href=\"https:\/\/t.co\/gX2FbeIa00\">https:\/\/t.co\/gX2FbeIa00<\/a> <a href=\"https:\/\/t.co\/dYV3eJp2eH\">pic.twitter.com\/dYV3eJp2eH<\/a><br \/>An internal company document, first <a href=\"https:\/\/www.technologyreview.com\/2021\/09\/16\/1035851\/facebook-troll-farms-report-us-2020-election\/\">reported by MIT Technology Review in October<\/a>, shows that Facebook was aware of the problem as early as 2019. The author, former Facebook data scientist Jeff Allen, found that these exact tactics had allowed clickbait farms in Macedonia and Kosovo to reach nearly half a million Americans a year before the 2020 election. The farms had also made their way into Instant Articles and Ad Breaks, a similar monetization program for inserting ads into Facebook videos. At one point, as many as 60% of the domains enrolled in Instant Articles were using the spammy writing tactics employed by clickbait farms, the report said. Allen, bound by a nondisclosure agreement with Facebook, did not comment on the report.<br \/>Despite pressure from both internal and external researchers, Facebook struggled to stem the abuse. Meanwhile, the company was rolling out more monetization programs to open up new streams of revenue. Besides Ad Breaks for videos, there was IGTV Monetization for Instagram and In-Stream Ads for Live videos. 
\u201cThat reckless push for user growth we saw\u2014now we are seeing a reckless push for publisher growth,\u201d says Victoire Rio, a digital rights researcher fighting platform-induced harms in Myanmar and other countries in the Global South.<br \/>MIT Technology Review has found that the problem is now happening on a global scale. Thousands of clickbait operations have sprung up, primarily in countries where Facebook\u2019s payouts provide a larger and steadier source of income than other forms of available work. Some are teams of people while others are individuals, abetted by cheap automated tools that help them create and distribute articles at mass scale. They\u2019re no longer limited to publishing articles, either. They push out Live videos and run Instagram accounts, which they monetize directly or use to drive more traffic to their sites.<br \/>Google is also culpable. Its AdSense program fueled the Macedonia- and Kosovo-based farms that targeted American audiences in the lead-up to the 2016 presidential election. And it\u2019s AdSense that is incentivizing new clickbait actors on YouTube to post outrageous content and viral misinformation.<br \/>Many clickbait farms today now monetize with both Instant Articles and AdSense, receiving payouts from both companies. And because Facebook\u2019s and YouTube\u2019s algorithms <a href=\"https:\/\/www.technologyreview.com\/2021\/03\/11\/1020600\/facebook-responsible-ai-misinformation\/\">boost whatever is engaging to users<\/a>, they\u2019ve created an information ecosystem where content that goes viral on one platform will often be recycled on the other to maximize distribution and revenue.<br \/>\u201cThese actors wouldn\u2019t exist if it wasn\u2019t for the platforms,\u201d Rio says.<br \/>\u201cThis is not normal. 
This is not healthy.\u201d<br \/>In response to the detailed evidence we provided to each company of this behavior, Meta spokesperson Joe Osborne disputed our core findings, saying we\u2019d misunderstood the issue. \u201cRegardless, we\u2019ve invested in building new expert-driven and scalable solutions to these complex issues for many years, and will continue doing so,\u201d he said.<br \/>Google confirmed that the behavior violated its policies and terminated all of the YouTube channels MIT Technology Review identified as spreading misinformation. \u201cWe work hard to protect viewers from clickbait or misleading content across our platforms and have invested heavily in systems that are designed to elevate authoritative information,\u201d YouTube spokesperson Ivy Choi said.<br \/>Clickbait farms are not just targeting their home countries. Following the example of actors from Macedonia and Kosovo, the newest operators have realized they need to understand neither a country\u2019s local context nor its language to turn political outrage into income.<br \/>MIT Technology Review partnered with Allen, who now leads a nonprofit called the <a href=\"https:\/\/integrityinstitute.org\/\">Integrity Institute<\/a> that conducts research on platform abuse, to identify possible clickbait actors on Facebook. We focused on pages run out of Cambodia and Vietnam\u2014two of the countries where clickbait operations are now cashing in on the situation in Myanmar.<br \/>We obtained data from CrowdTangle, whose development team <a href=\"https:\/\/www.nytimes.com\/2021\/07\/14\/technology\/facebook-data.html\">the company broke up earlier this year<\/a>, and from <a href=\"https:\/\/www.facebook.com\/business\/help\/1738245236284278?id=1769156093197771\">Facebook\u2019s Publisher Lists<\/a>, which record which publishers are registered in monetization programs. 
Allen wrote a custom clustering algorithm to find pages posting content in a highly coordinated manner and targeting speakers of languages used primarily outside the countries where the operations are based. We then analyzed which clusters had at least one page registered in a monetization program or were heavily promoting content from a page registered with a program.<br \/>We found over 2,000 pages in both countries engaged in this clickbait-like behavior. (That could be an undercount, because not all Facebook pages are tracked by CrowdTangle.) Many have millions of followers and likely reach even more users. In his 2019 report, Allen found that <a href=\"https:\/\/www.technologyreview.com\/2021\/09\/16\/1035851\/facebook-troll-farms-report-us-2020-election\/\">75% of users<\/a> who were exposed to clickbait content from farms run in Macedonia and Kosovo had never followed any of the pages. Facebook\u2019s content-recommendation system had instead pushed it into their news feeds.<br \/>When MIT Technology Review sent Facebook a list of these pages and a detailed explanation of our methodology, Osborne called the analysis \u201cflawed.\u201d \u201cWhile some Pages here may have been on our publisher lists, many of them didn\u2019t actually monetize on Facebook,\u201d he said.<br \/>Indeed, these numbers do not indicate that all of these pages generated ad revenue. Instead, they represent an estimate, based on data Facebook has made publicly available, of the number of pages associated with clickbait actors in Cambodia and Vietnam that Facebook has made eligible to monetize on the platform.<br \/>Osborne also confirmed that more of the Cambodia-run clickbait-like pages we found had directly registered with one of Facebook\u2019s monetization programs than we previously believed. In our analysis, we found 35% of the pages in our clusters had done so in the last two years. 
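The coordination-detection step can be sketched in a few lines. This is only a simplified illustration, not Allen's actual algorithm (his open-sourced `sm_content_clustering` code is the real implementation): the choice of shared posted links as the coordination signal, the Jaccard similarity measure, and the 0.5 threshold are all assumptions made for the example.

```python
# Rough sketch of coordination clustering: group pages that post largely
# identical sets of links. Illustration only -- the similarity signal
# (shared links), the Jaccard measure, and the 0.5 threshold are
# assumptions, not details of the algorithm described in the article.
from itertools import combinations


def jaccard(a, b):
    """Overlap between two pages' sets of posted article URLs."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0


def cluster_pages(posts, threshold=0.5):
    """posts: dict mapping page name -> set of URLs it posted.

    Returns clusters (sets of page names) as connected components of a
    graph with an edge wherever two pages' posts overlap heavily --
    a stand-in for the "highly coordinated" posting signal.
    """
    neighbors = {page: set() for page in posts}
    for p1, p2 in combinations(posts, 2):
        if jaccard(posts[p1], posts[p2]) >= threshold:
            neighbors[p1].add(p2)
            neighbors[p2].add(p1)

    clusters, seen = [], set()
    for page in posts:
        if page in seen:
            continue
        stack, component = [page], set()
        while stack:  # depth-first walk of one component
            node = stack.pop()
            if node not in component:
                component.add(node)
                stack.extend(neighbors[node] - component)
        seen |= component
        clusters.append(component)
    return clusters
```

A cluster flagged this way would then be cross-referenced against the Publisher Lists, as described above, to see whether any member page is registered with a monetization program.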
The other 65% would have <em>indirectly<\/em> generated ad revenue by heavily promoting content from the registered page to a wider audience. Osborne said that in fact about <em>half<\/em> of the pages we found, or roughly 150 more pages, had directly registered at one point with a monetization program, primarily Instant Articles.<br \/>Shortly after we approached Facebook, some of the Cambodian operators of these pages began complaining in online forums that their pages had been booted out of Instant Articles. Osborne declined to respond to our questions about the latest enforcement actions the company has taken.<br \/>Facebook has continuously sought to weed these actors out of its programs. For example, only 30 of the Cambodia-run pages are still monetizing, Osborne said. But our data from Facebook\u2019s publisher lists shows enforcement is often delayed and incomplete\u2014clickbait pages can stay within monetization programs for hundreds of days before they are taken down. The same actors will also spin up new pages once their old ones have been demonetized.<br \/>Allen is now <a href=\"https:\/\/github.com\/jdallen83\/sm_content_clustering\">open-sourcing the code<\/a> we used, to encourage other independent researchers to refine and build on our work.<br \/>Using the same methodology, we also found more than 400 foreign-run pages targeting predominantly US audiences in clusters that appeared in Facebook\u2019s Publisher Lists over the last two years. (We did not include pages from countries whose primary language is English.) The set includes a monetizing cluster run in part out of Macedonia aimed at women and the LGBTQ community. 
It has eight Facebook pages, including two verified ones with over 1.7 million and 1.5 million followers respectively, and posts content from five websites, each registered with Google AdSense and Audience Network. It also has three Instagram accounts, which monetize through gift shops and collaborations and by directing users to the same largely plagiarized websites. Admins of the Facebook pages and Instagram accounts did not respond to our requests for comment.<br \/>Osborne said Facebook is now investigating the accounts after we brought them to the company\u2019s attention. Choi said Google has removed AdSense ads from hundreds of pages on these sites in the past because of policy violations but that the sites themselves are still allowed to monetize based on the company\u2019s regular reviews.<br \/>While it\u2019s possible that the Macedonians who run the pages do indeed care about US politics and about women\u2019s and LGBTQ rights, the content is undeniably generating revenue. This means what they promote is most likely guided by what wins and loses with Facebook\u2019s news feed algorithm.<br \/>The activity of a single page or cluster of pages may not feel significant, says Camille Fran\u00e7ois, a researcher at Columbia University who studies organized disinformation campaigns on social media. But when hundreds or thousands of actors are doing the same thing, amplifying the same content, and reaching millions of audience members, it can affect the public conversation. \u201cWhat people see as the domestic conversation on a topic can actually be something completely different,\u201d Fran\u00e7ois says. 
\u201cIt\u2019s a bunch of paid people pretending to not have any relationship with one another, optimizing what to post.\u201d<br \/>Osborne said Facebook has created several new policies and enforcement protocols in the last two years to address this issue, including penalizing pages run out of one country that behave as if they are local to another, as well as penalizing pages that build an audience on the basis of one topic and then pivot to another. But both Allen and Rio say the company\u2019s actions have failed to close fundamental loopholes in the platform\u2019s policies and designs\u2014vulnerabilities that are fueling a global information crisis.<br \/>\u201cIt\u2019s affecting countries first and foremost outside the US but presents a massive risk to the US long term as well,\u201d Rio says. \u201cIt\u2019s going to affect pretty much anywhere in the world when there are heightened events like an election.\u201d<br \/>In response to MIT Technology Review\u2019s initial reporting on Allen\u2019s 2019 internal report, which we <a href=\"https:\/\/www.technologyreview.com\/2021\/09\/16\/1035851\/facebook-troll-farms-report-us-2020-election\/\">published in full<\/a>, David Agranovich, the director of global threat disruption at Facebook, <a href=\"https:\/\/twitter.com\/DavidAgranovich\/status\/1439020401946816516?s=20\">tweeted<\/a>, \u201cThe pages referenced here, based on our own 2019 research, are financially motivated spammers, not overt influence ops. Both of these are serious challenges, but they\u2019re different. Conflating them doesn\u2019t help anyone.\u201d Osborne repeated that we were conflating the two groups in response to our findings.<br \/>But disinformation experts say it\u2019s misleading to draw a hard line between financially motivated spammers and political influence operations. <br \/>There is a distinction in intent: financially motivated spammers are agnostic about the content they publish. 
They go wherever the clicks and money are, letting Facebook\u2019s news feed algorithm dictate which topics they\u2019ll cover next. Political operations are instead targeted toward pushing a specific agenda.<br \/>But in practice <a href=\"https:\/\/tsjournal.org\/index.php\/jots\/article\/view\/17\/8\">it doesn\u2019t matter<\/a>: in their tactics and impact, they often look the same. On an average day, a financially motivated clickbait site might be populated with celebrity news, cute animals, or highly emotional stories\u2014all reliable drivers of traffic. Then, when political turmoil strikes, it drifts toward hyperpartisan news, misinformation, and outrage bait, because that content gets more engagement.<br \/>The Macedonian page cluster is a prime example. Most of the time the content promotes women\u2019s and LGBTQ rights. But around the time of events like the 2020 election, the January 6 insurrection, and the passage of Texas\u2019s antiabortion \u201cheartbeat bill,\u201d the cluster amplified particularly pointed political content. Many of its articles have been widely circulated by legitimate pages with huge followings, including those run by Occupy Democrats, the Union of Concerned Scientists, and Women\u2019s March Global.<\/p>\n<p>Political influence operations, meanwhile, might post celebrity and animal content to build out Facebook pages with large followings. They then also pivot to politics during sensitive political events, capitalizing on the huge audiences already at their disposal.<br \/>Political operatives will sometimes also pay financially motivated spammers to broadcast propaganda on their Facebook pages, or buy pages to repurpose them for influence campaigns. Rio has already seen evidence of a black market where clickbait actors can sell their large Facebook audiences.<br \/>In other words, pages look innocuous until they don\u2019t. 
\u201cWe have empowered inauthentic actors to accumulate huge followings for largely unknown purposes,\u201d Allen wrote in the report.<br \/>This shift has happened many times in Myanmar since the rise of clickbait farms, in particular during the Rohingya crisis and again in the lead-up to and aftermath of this year\u2019s military coup. (The latter was precipitated by events much like those leading to the US January 6 insurrection, including <a href=\"https:\/\/restofworld.org\/2021\/how-misinformation-fueled-a-coup-in-myanmar\/\">widespread fake claims of a stolen election<\/a>.)<br \/>In October 2020, Facebook took down a number of pages and groups engaged in coordinated clickbait behavior in Myanmar. In <a href=\"https:\/\/graphika.com\/reports\/myanmar-inauthentic-behavior-takedown\/\">an analysis<\/a> of those assets, Graphika, a research firm that studies the spread of information online, found that the pages focused predominantly on celebrity news and gossip but pushed out political propaganda, dangerous anti-Muslim rhetoric, and covid-19 misinformation during key moments of crisis. Dozens of pages had more than 1 million followers each, with the largest reaching over 5 million.<br \/>The same phenomenon played out in the Philippines in the lead-up to president Rodrigo Duterte\u2019s 2016 election. Duterte has been compared to Donald Trump for his populist politics, bombastic rhetoric, and authoritarian leanings. During his campaign, a clickbait farm, registered formally as the company Twinmark Media, shifted from covering celebrities and entertainment to promoting him and his ideology.<br \/>At the time, it was widely believed that politicians had hired Twinmark to conduct an influence campaign. 
But in interviews with <a href=\"https:\/\/news.abs-cbn.com\/news\/02\/26\/19\/exclusive-twinmark-media-earned-millions-of-dollars-before-facebook-takedown\">journalists<\/a> and <a href=\"https:\/\/stratcomcoe.org\/publications\/four-work-models-of-political-trolling-in-the-philippines\/40\">researchers<\/a>, former Twinmark employees admitted they were simply chasing profit. Through experimentation, the employees discovered that pro-Duterte content excelled during a heated election. They even paid other celebrities and influencers to share their articles to get more clicks and generate more ad revenue, <a href=\"https:\/\/stratcomcoe.org\/publications\/four-work-models-of-political-trolling-in-the-philippines\/40\">according to research<\/a> from media and communication scholars Jonathan Ong and Jason Vincent A. Caba\u00f1es.<br \/>In the final months of the campaign, Duterte dominated the political discourse on social media. Facebook itself named him the \u201c<a href=\"https:\/\/web.archive.org\/web\/20180422224044\/http:\/\/news.abs-cbn.com\/halalan2016\/focus\/04\/22\/16\/how-social-media-is-shaping-the-2016-elections\">undisputed king of Facebook conversations<\/a>\u201d when it found he was the subject of <a href=\"https:\/\/www.gmanetwork.com\/news\/hashtag\/content\/563679\/facebook-data-duterte-still-most-discussed-presidential-candidate\/story\/\">68% of all election-related discussions<\/a>, compared with 46% for his next closest rival.<br \/>Three months after the election, Maria Ressa, CEO of the media company Rappler, who won the Nobel Peace Prize this year for her work fighting disinformation, <a href=\"https:\/\/www.rappler.com\/nation\/propaganda-war-weaponizing-internet\">published a piece<\/a> describing how a concert of coordinated clickbait and propaganda on Facebook \u201cshift[ed] public opinion on key issues.\u201d<br \/>\u201cIt\u2019s a strategy of \u2018death by a thousand cuts\u2019\u2014a chipping away at facts, using half-truths that 
fabricate an alternative reality by merging the power of bots and fake accounts on social media to manipulate real people,\u201d she wrote.\u00a0<br \/>In 2019, Facebook <a href=\"https:\/\/news.abs-cbn.com\/news\/02\/26\/19\/exclusive-twinmark-media-earned-millions-of-dollars-before-facebook-takedown\">finally took down<\/a> 220 Facebook pages, 73 Facebook accounts, and 29 Instagram accounts linked to Twinmark Media. By then, Facebook and Google had already paid the farm as much as $8 million (<a href=\"https:\/\/news.abs-cbn.com\/news\/02\/26\/19\/exclusive-twinmark-media-earned-millions-of-dollars-before-facebook-takedown\">400 million Philippine pesos<\/a>).<br \/>Neither Facebook nor Google confirmed this amount. Meta\u2019s Osborne disputed the characterization that Facebook had influenced the election.<br \/>Facebook made a major effort to weed clickbait farms out of Instant Articles and Ad Breaks in the first half of 2019, according to Allen\u2019s internal report. Specifically, it began checking publishers for content originality and demonetizing those who posted largely unoriginal content.<br \/>But these automated checks are limited. They primarily focus on assessing the originality of videos, and not, for example, on whether an article has been plagiarized. Even if they did, such systems would only be as good as the company\u2019s artificial-intelligence capabilities in a given language. Countries with languages not prioritized by the AI research community receive far less attention, if any at all. \u201cIn the case of Ethiopia there are 100 million people and six languages. Facebook only supports two of those languages for integrity systems,\u201d Haugen said during her testimony to Congress.<br \/>Rio says there are also loopholes in enforcement. Violators are taken out of the program but not off the platform, and they can appeal to be reinstated. 
The appeals are processed by a separate team from the one that does the enforcing, which performs only basic topical checks before reinstating the actor. (Facebook did not respond to questions about what these checks actually look for.) As a result, it can take mere hours for a clickbait operator to rejoin after removal, again and again. \u201cSomehow all of the teams don\u2019t talk to each other,\u201d she says.<br \/>This is how Rio found herself in a state of panic in March of this year. A month after the military had arrested former democratic leader Aung San Suu Kyi and seized control of the government, protesters were still violently clashing with the new regime. The military was sporadically cutting access to the internet and broadcast networks, and Rio was terrified for the safety of her friends in the country.<br \/>She began looking for them in Facebook Live videos. \u201cPeople were really actively watching these videos because this is how you keep track of your loved ones,\u201d she says. She wasn\u2019t concerned to see that the videos were coming from pages with credibility issues; she believed that the streamers were using fake pages to protect their anonymity.<br \/>Then the impossible happened: she saw the same Live video twice. She remembered it because it was horrifying: hundreds of kids, who looked as young as 10, in a line with their hands on their heads, being loaded into military trucks.<br \/>When she dug into it, she discovered that the videos were not live at all. Live videos are meant to indicate a real-time broadcast and include important metadata about the time and place of the activity. 
These videos had been downloaded from elsewhere and rebroadcast on Facebook using third-party tools to make them look like livestreams.<br \/>There were hundreds of them, racking up tens of thousands of engagements and hundreds of thousands of views. As of early November, MIT Technology Review found dozens of duplicate fake Live videos from this time frame still up. One duplicate pair with over 200,000 and 160,000 views, respectively, proclaimed in Burmese, \u201cI am the only one who broadcasts live from all over the country in real time.\u201d Facebook took several of them down after we brought them to its attention, but dozens more, as well as the pages that posted them, still remain. Osborne said the company is aware of the issue and has significantly reduced these fake Lives and their distribution over the past year.<br \/>Ironically, Rio believes, the videos were likely ripped from footage of the crisis uploaded to YouTube as human rights evidence. The scenes, in other words, are indeed from Myanmar\u2014but they were all being posted from Vietnam and Cambodia.<br \/>Over the past half-year, Rio has tracked and identified several page clusters run out of Vietnam and Cambodia. Many used fake Live videos to rapidly build their follower numbers and drive viewers to join Facebook groups disguised as pro-democracy communities. Rio now worries that Facebook\u2019s latest rollout of in-stream ads in Live videos will further incentivize clickbait actors to fake them. One Cambodian cluster with 18 pages began posting highly damaging political misinformation, reaching a total of 16 million engagements and an audience of 1.6 million in four months. 
Facebook took all 18 pages down in March, but new clusters continue to spin up while others remain.<br \/>For all Rio knows, these Vietnamese and Cambodian actors do not speak Burmese. They likely do not understand Burmese culture or the country\u2019s politics. The bottom line is they don\u2019t need to. Not when they\u2019re stealing their content.<br \/>Rio has since found several of the Cambodians\u2019 private Facebook and Telegram groups (one with upward of 3,000 individuals), where they trade tools and tips about the best money-making strategies. MIT Technology Review reviewed the documents, images, and videos she gathered, and hired a Khmer translator to interpret a tutorial video that walks viewers step by step through a clickbait workflow.<br \/>The materials show how the Cambodian operators gather research on the best-performing content in each country and plagiarize it for their clickbait websites. One Google Drive folder shared within the community has two dozen spreadsheets of links to the most popular Facebook groups in 20 countries, including the US, the UK, Australia, India, France, Germany, Mexico, and Brazil.<br \/>The tutorial video also shows how they find the most viral YouTube videos in different languages and use an automated tool to convert each one into an article for their site. We found 29 YouTube channels spreading political misinformation about the current political situation in Myanmar, for example, that were being converted into clickbait articles and redistributed to new audiences on Facebook.<br \/>After we brought the channels to its attention, YouTube terminated all of them for violating its community guidelines, including seven that it determined were part of coordinated influence operations linked to Myanmar. Choi noted that YouTube had previously also stopped serving ads on nearly 2,000 videos across these channels. 
\u201cWe continue to actively monitor our platforms to prevent bad actors looking to abuse our network for profit,\u201d she said.<br \/>Then there are other tools, including one that allows prerecorded videos to appear as fake Facebook Live videos. Another randomly generates <a href=\"https:\/\/www.mmoapi.com\/contact-generator\/random-male-contact-in-united-states\">profile details for US men<\/a>, including image, name, birthday, Social Security number, phone number, and address, so yet another tool can mass-produce fake Facebook accounts using some of that information.<br \/>It\u2019s now so easy to do that many Cambodian actors operate solo. Rio calls them micro-entrepreneurs. In the most extreme scenario, she\u2019s seen individuals manage as many as 11,000 Facebook accounts on their own.<br \/>Successful micro-entrepreneurs are also training others to do this work in their community. \u201cIt\u2019s going to get worse,\u201d she says. \u201cAny Joe in the world could be affecting your information environment without you realizing.\u201d<br \/>During her Senate testimony in October of this year, Haugen highlighted the fundamental flaws of Facebook\u2019s content-based approach to platform abuse. The current strategy, focused on what can and cannot appear on the platform, can only be reactive and never comprehensive, she said. Not only does it require Facebook to enumerate every possible form of abuse, but it also requires the company to be proficient at moderating in every language. Facebook has failed on both counts\u2014and the most vulnerable people in the world have paid the greatest price, she said.<br \/>The main culprit, Haugen said, is Facebook\u2019s desire to maximize engagement, which has turned its algorithm and platform design into a giant bullhorn for hate speech and misinformation. 
An <a href=\"https:\/\/www.technologyreview.com\/2021\/03\/11\/1020600\/facebook-responsible-ai-misinformation\/\">MIT Technology Review investigation<\/a> from earlier this year, based on dozens of interviews with Facebook executives, current and former employees, industry peers, and external experts, corroborates this characterization.<br \/>Her testimony also echoed what Allen wrote in his report\u2014and what Rio and other disinformation experts have repeatedly seen through their research. For clickbait farms, getting into the monetization programs is the first step, but how much they cash in depends on how far Facebook\u2019s content-recommendation systems boost their articles. They would not thrive, nor would they plagiarize such damaging content, if their shady tactics didn\u2019t do so well on the platform.<br \/>As a result, weeding out the farms themselves isn\u2019t the solution: highly motivated actors will always be able to spin up new websites and new pages to get more money. Instead, it\u2019s the algorithms and content reward mechanisms that need addressing.<br \/>In his report, Allen proposed <a href=\"https:\/\/www.technologyreview.com\/2021\/09\/16\/1035851\/facebook-troll-farms-report-us-2020-election\/\">one possible way<\/a> Facebook could do this: by using what\u2019s known as a graph-based authority measure to rank content. This would amplify higher-quality pages like news and media and diminish lower-quality pages like clickbait, reversing the current trend.<br \/>Haugen emphasized that Facebook\u2019s failure to fix its platform was not for want of solutions, tools, or capacity. \u201cFacebook can change but is clearly not going to do so on its own,\u201d she said. \u201cMy fear is that without action, the divisive and extremist behaviors we see today are only the beginning. 
What we saw in Myanmar and are now seeing in Ethiopia are only the opening chapters of a story so terrifying no one wants to read the end of it.\u201d<br \/>(Osborne said Facebook has a fundamentally different approach to Myanmar today with greater expertise in the country\u2019s human rights issues and a dedicated team and technology to detect violating content, like hate speech, in Burmese.)<br \/>In October, the outgoing UN special envoy on Myanmar said the country had <a href=\"https:\/\/www.reuters.com\/world\/asia-pacific\/outgoing-un-envoy-says-myanmar-has-spiraled-into-civil-war-2021-10-21\/\">deteriorated into civil war<\/a>. <a href=\"https:\/\/www.nytimes.com\/2021\/10\/19\/world\/asia\/myanmar-refugees-india.html\">Thousands of people have since fled<\/a> to neighboring countries like Thailand and India. As of mid-November, clickbait actors were continuing to post fake news hourly. In one post, the democratic leader, \u201cMother Suu,\u201d had been assassinated. In another, she had finally been freed.<br \/><em>Special thanks to our team. Design and development by Rachel Stein and Andre Vitorio. Art direction and production by Emily Luong and Stephanie Arnett. Editing by Niall Firth and Mat Honan. Fact checking by Matt Mahoney. Copy editing by Linda Lowenthal.<\/em><br \/>\u00a9 2021 MIT Technology Review<\/p>\n<p><a href=\"https:\/\/www.technologyreview.com\/2021\/11\/20\/1039076\/facebook-google-disinformation-clickbait\/\">source<\/a><\/p>","protected":false},"excerpt":{"rendered":"<p>The tech giants are paying millions of dollars to the operators of clickbait pages, bankrolling the deterioration of information ecosystems around the world.Myanmar, March 2021.A month after the fall of the democratic government.A Facebook Live 
video showed hundreds of people protesting against the military coup on the streets of Myanmar.It had nearly 50,000 shares and [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"googlesitekit_rrm_CAow1sXXCw:productID":"","_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":false,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[1],"tags":[],"class_list":["post-972","post","type-post","status-publish","format-standard","hentry","category-non-classe"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts\/972","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/comments?post=972"}],"version-history":[{"count":0,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts\/972\/revisions"}],"wp:attachment":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/media?parent=972"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/categories?post=972"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/tags?post=972"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}