{"id":946,"date":"2021-11-21T00:17:04","date_gmt":"2021-11-20T23:17:04","guid":{"rendered":"https:\/\/monblogeur.tech\/index.php\/2021\/11\/21\/in-india-facebook-struggles-to-combat-misinformation-and-hate-speech-the-new-york-times\/"},"modified":"2021-11-21T00:17:04","modified_gmt":"2021-11-20T23:17:04","slug":"in-india-facebook-struggles-to-combat-misinformation-and-hate-speech-the-new-york-times","status":"publish","type":"post","link":"https:\/\/monblogeur.tech\/index.php\/2021\/11\/21\/in-india-facebook-struggles-to-combat-misinformation-and-hate-speech-the-new-york-times\/","title":{"rendered":"In India, Facebook Struggles to Combat Misinformation and Hate Speech &#8211; The New York Times"},"content":{"rendered":"<div class=\"cfbc967f0983488262956e73eca9483a\" data-index=\"1\" style=\"float: none; margin:10px 0 10px 0; text-align:center;\">\n<script async src=\"https:\/\/pagead2.googlesyndication.com\/pagead\/js\/adsbygoogle.js?client=ca-pub-3859091246952232\" crossorigin=\"anonymous\"><\/script>\r\n<!-- blok -->\r\n<ins class=\"adsbygoogle\" data-ad-client=\"ca-pub-3859091246952232\" data-ad-slot=\"1334354390\"><\/ins>\r\n<script>\r\n     (adsbygoogle = window.adsbygoogle || []).push({});\r\n<\/script>\r\n\n<\/div>\n<p>Advertisement<br \/>Supported by<br \/>The Facebook Papers<br \/>Internal documents show a struggle with misinformation, hate speech and celebrations of violence in the country, the company\u2019s biggest market.<br \/><strong>Send any friend a story<\/strong><br \/>As a subscriber, you have <strong class=\"css-8qgvsz ebyp5n10\">10 gift articles<\/strong> to give each month. 
<span class=\"byline-prefix\">By <\/span><span class=\"css-1baulvz\" itemprop=\"name\"><a href=\"https:\/\/www.nytimes.com\/by\/sheera-frenkel\" class=\"css-mrorfa e1jsehar0\">Sheera Frenkel<\/a><\/span> and <span class=\"css-1baulvz last-byline\" itemprop=\"name\"><a href=\"https:\/\/www.nytimes.com\/by\/davey-alba\" class=\"css-mrorfa e1jsehar0\">Davey Alba<\/a><\/span><br \/>On Feb. 4, 2019, a <a class=\"css-1g7m0tk\" href=\"https:\/\/www.nytimes.com\/2021\/10\/24\/business\/media\/facebook-leak-frances-haugen.html\" title=\"\">Facebook<\/a> researcher created a new user account to see what it was like to experience the social media site as a person living in Kerala, India.<br \/>For the next three weeks, the account operated by a simple rule: Follow all the recommendations generated by Facebook\u2019s algorithms to join groups, watch videos and explore new pages on the site.<br \/>The result was an inundation of hate speech, misinformation and celebrations of violence, which were documented in an internal Facebook report published later that month.<br \/>\u201cFollowing this test user\u2019s News Feed, I\u2019ve seen more images of dead people in the past three weeks than I\u2019ve seen in my entire life total,\u201d the Facebook researcher wrote.<br \/> \tFrom the Document: An Indian Test User\u2019s Descent Into a Sea of Polarizing, Nationalistic Messages <br \/> \t\u201cThe test user\u2019s News Feed has become a near constant <strong>barrage of polarizing nationalist content, misinformation, and violence and gore<\/strong>.\u201d <br \/>The report was one of dozens of studies and memos written by Facebook employees grappling with the effects of the platform on India. 
They provide stark evidence of one of the most serious criticisms levied by human rights activists and politicians against the world-spanning company: It moves into a country without fully understanding its potential impact on local culture and politics, and fails to deploy the resources to act on issues once they occur.<br \/>With 340 million people using Facebook\u2019s various social media platforms, <a class=\"css-1g7m0tk\" href=\"https:\/\/www.nytimes.com\/2021\/11\/09\/world\/asia\/india-hospital-fire.html\" title=\"\">India<\/a> is the company\u2019s largest market. And Facebook\u2019s problems on the subcontinent present an amplified version of the issues it has faced throughout the world, made worse by a lack of resources and a lack of expertise in India\u2019s 22 officially recognized languages.<br \/>The internal documents, obtained by a consortium of news organizations that included The New York Times, are part of a larger cache of material called <a class=\"css-1g7m0tk\" href=\"https:\/\/www.nytimes.com\/2021\/10\/25\/business\/facebook-papers-takeaways.html\" title=\"\">The Facebook Papers<\/a>. They were collected by Frances Haugen, a former Facebook product manager who became a whistle-blower and recently testified before a Senate subcommittee about the company and its social media platforms. References to India were scattered among documents filed by Ms. Haugen to the Securities and Exchange Commission in a complaint earlier this month.<br \/>The documents include reports on how <a class=\"css-1g7m0tk\" href=\"https:\/\/www.nytimes.com\/2019\/04\/01\/technology\/india-elections-facebook.html\" title=\"\">bots and fake accounts tied to<\/a> the country\u2019s ruling party and opposition figures were wreaking havoc on national elections. 
They also detail how a plan championed by Mark Zuckerberg, Facebook\u2019s chief executive, to focus on \u201cmeaningful social interactions,\u201d or exchanges between friends and family, was leading to more misinformation in India, particularly <a class=\"css-1g7m0tk\" href=\"https:\/\/www.nytimes.com\/2021\/04\/25\/business\/india-covid19-twitter-facebook.html\" title=\"\">during the pandemic<\/a>.<br \/>Facebook did not have enough resources in India and was unable to grapple with the problems it had introduced there, including anti-Muslim posts, according to its documents. Eighty-seven percent of the company\u2019s global budget for time spent on classifying misinformation is earmarked for the United States, while only 13 percent is set aside for the rest of the world \u2014 even though North American users make up only 10 percent of the social network\u2019s daily active users, according to one document describing Facebook\u2019s allocation of resources.<br \/>Andy Stone, a Facebook spokesman, said the figures were incomplete and don\u2019t include the company\u2019s third-party fact-checking partners, most of whom are outside the United States.<br \/>That lopsided focus on the United States has had consequences in a number of countries besides India. Company documents showed that Facebook installed measures to demote misinformation during the November election in <a class=\"css-1g7m0tk\" href=\"https:\/\/www.nytimes.com\/2021\/11\/10\/world\/asia\/danny-fenster-myanmar.html\" title=\"\">Myanmar<\/a>, including disinformation shared by the Myanmar military junta.<br \/>The company rolled back those measures after the election, despite research that showed they lowered the number of views of inflammatory posts by 25.1 percent and photo posts containing misinformation by 48.5 percent. 
Three months later, the military carried out <a class=\"css-1g7m0tk\" href=\"https:\/\/www.nytimes.com\/article\/myanmar-news-protests-coup.html\" title=\"\">a violent coup in the country<\/a>. Facebook said that after the coup, it implemented <a class=\"css-1g7m0tk\" href=\"https:\/\/about.fb.com\/news\/2021\/02\/an-update-on-myanmar\/\" title=\"\" rel=\"noopener noreferrer\" target=\"_blank\">a special policy<\/a> to remove praise and support of violence in the country, and later banned the Myanmar military from Facebook and Instagram.<br \/>In Sri Lanka, people were able to automatically add hundreds of thousands of users to Facebook groups, exposing them to violence-inducing and hateful content. In Ethiopia, a nationalist youth militia group successfully coordinated calls for violence on Facebook and posted other inflammatory content.<br \/>Facebook has invested significantly in technology to find hate speech in various languages, including Hindi and Bengali, two of the most widely used languages, Mr. Stone said. He added that Facebook reduced the amount of hate speech that people see globally by half this year.<br \/>\u201cHate speech against marginalized groups, including Muslims, is on the rise in India and globally,\u201d Mr. Stone said. \u201cSo we are improving enforcement and are committed to updating our policies as hate speech evolves online.\u201d<br \/>In India, \u201cthere is definitely a question about resourcing\u201d for Facebook, but the answer is not \u201cjust throwing more money at the problem,\u201d said Katie Harbath, who spent 10 years at Facebook as a director of public policy, and worked directly on securing India\u2019s national elections. Facebook, she said, needs to find a solution that can be applied to countries around the world.<br \/>Facebook employees have run various tests and conducted field studies in India for several years. 
That work increased ahead of India\u2019s 2019 national elections; in late January of that year, a handful of Facebook employees traveled to the country to meet with colleagues and speak to dozens of local Facebook users.<br \/>According to a memo written after the trip, one of the key requests from users in India was that Facebook \u201ctake action on types of misinfo that are connected to real-world harm, specifically politics and religious group tension.\u201d<br \/>Ten days after the researcher opened the fake account to study misinformation, a <a class=\"css-1g7m0tk\" href=\"https:\/\/www.nytimes.com\/2019\/02\/14\/world\/asia\/pulwama-attack-kashmir.html\" title=\"\">suicide bombing<\/a> in the disputed border region of Kashmir set off a round of violence and a spike in accusations, misinformation and conspiracies between Indian and Pakistani nationals.<br \/>After the attack, anti-Pakistan content began to circulate in the Facebook-recommended groups that the researcher had joined. Many of the groups, she noted, had tens of thousands of users. 
A different report by Facebook, published in December 2019, found Indian Facebook users tended to join large groups, with the country\u2019s median group size at 140,000 members.<br \/>Posts Recommended to the Test Account<br \/>The &ldquo;Popular Across Facebook&rdquo; feature began recommending viral, unverified material about military action following Indian airstrikes against Pakistan in the wake of a suicide bombing in Kashmir.<br \/>The feature also began recommending military-themed posts unrelated to the strikes.<br \/>Graphic posts, including a meme showing the beheading of a Pakistani national and dead bodies wrapped in white sheets on the ground, circulated in the groups she joined.<br \/>After the researcher shared her case study with co-workers, her colleagues commented on the posted report that they were concerned about misinformation about the upcoming elections in India.<br \/> \tFrom the Document: An Indian Test User\u2019s Descent Into a Sea of Polarizing, Nationalistic Messages <br \/> \t\u201cThese groups become <strong>perfect distribution channels when they want to promote bad content<\/strong> within short period of time.\u201d <br \/> \t\u201cThe <strong>admins of these groups tended to take a lax position\/hands-off attitude<\/strong> towards ensuring that the content shared in the group was on a particular topic of focus, and allowed users to freely post whatever they found interesting\/wanted to share.\u201d <br \/>Two months later, after India\u2019s national elections had begun, Facebook put in place a series of steps to stem the flow of misinformation and hate speech in the country, according
to an internal document called Indian Election Case Study.<br \/>The case study painted an optimistic picture of Facebook\u2019s efforts, including adding more fact-checking partners \u2014 the third-party network of outlets with which Facebook works to outsource fact-checking \u2014 and increasing the amount of misinformation it removed. It also noted how Facebook had created a \u201cpolitical white list to limit P.R. risk,\u201d essentially a list of politicians who received a special exemption from fact-checking.<br \/>The study did not note the immense problem the company faced with bots in India, nor issues like voter suppression. 
During the election, Facebook saw a spike in bots \u2014 or fake accounts \u2014 linked to various political groups, as well as efforts to spread misinformation that could have affected people\u2019s understanding of the voting process.<br \/>In a separate report produced after the elections, Facebook found that over 40 percent of top views, or impressions, in the Indian state of West Bengal were \u201cfake\/inauthentic.\u201d One inauthentic account had amassed more than 30 million impressions.<br \/>A report published in March 2021 showed that many of the problems cited during the 2019 elections persisted.<br \/>In the internal document, called Adversarial Harmful Networks: India Case Study, Facebook researchers wrote that there were groups and pages \u201creplete with inflammatory and misleading anti-Muslim content\u201d on Facebook.<br \/>The report said there were a number of dehumanizing posts comparing Muslims to \u201cpigs\u201d and \u201cdogs,\u201d and misinformation claiming that the Quran, the holy book of Islam, calls for men to rape their female family members.<br \/>Much of the material circulated around Facebook groups promoting Rashtriya Swayamsevak Sangh, an Indian right-wing and nationalist group with close ties to India\u2019s ruling\u00a0Bharatiya Janata Party, or B.J.P. The groups took issue with an expanding Muslim minority population in West Bengal and near the Pakistani border, and published posts on Facebook calling for the ouster of Muslim populations from India and promoting a Muslim population control law.<br \/>Facebook knew that such harmful posts proliferated on its platform, the report indicated, and it needed to improve its \u201cclassifiers,\u201d which are automated systems that can detect and remove posts containing violent and inciting language. Facebook also hesitated to designate R.S.S. 
as a dangerous organization because of \u201cpolitical sensitivities\u201d that could affect the social network\u2019s operation in the country.<br \/>Of India\u2019s 22 officially recognized languages, Facebook said it has trained its A.I. systems on five. (It said it had human reviewers for some others.) But in Hindi and Bengali, it still did not have enough data to adequately police the content, and much of the content targeting Muslims \u201cis never flagged or actioned,\u201d the Facebook report said.<br \/>Five months ago, Facebook was still struggling to efficiently remove hate speech against Muslims. Another company report detailed efforts by Bajrang Dal, an extremist group linked with the B.J.P., to publish posts containing anti-Muslim narratives on the platform.<br \/>Facebook is considering designating the group as a dangerous organization because it is \u201cinciting religious violence\u201d on the platform, the document showed. But it has not yet done so.<br \/>\u201cJoin the group and help to run the group; increase the number of members of the group, friends,\u201d said one post seeking recruits on Facebook to spread Bajrang Dal\u2019s messages. \u201cFight for truth and justice until the unjust are destroyed.\u201d<br \/>Ryan Mac, Cecilia Kang and Mike Isaac contributed reporting.<\/p>\n<p><a href=\"https:\/\/www.nytimes.com\/2021\/10\/23\/technology\/facebook-india-misinformation.html\">source<\/a><\/p>","protected":false},"excerpt":{"rendered":"<p>AdvertisementSupported byThe Facebook PapersInternal documents show a struggle with misinformation, hate speech and celebrations of violence in the country, the company\u2019s biggest market.Send any friend a storyAs a subscriber, you have 10 gift articles to give each month. 
Anyone can read what you share.By Sheera Frenkel and Davey AlbaOn Feb. 4, 2019, a Facebook researcher [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"googlesitekit_rrm_CAow1sXXCw:productID":"","_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":false,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[1],"tags":[],"class_list":["post-946","post","type-post","status-publish","format-standard","hentry","category-non-classe"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts\/946","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/comments?post=946"}],"version-history":[{"count":0,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts\/946\/revisions"}],"wp:attachment":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/media?parent=946"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/categories?post=946"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/tags?post=946"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}