{"id":1561,"date":"2021-11-26T01:00:46","date_gmt":"2021-11-26T00:00:46","guid":{"rendered":"https:\/\/monblogeur.tech\/index.php\/2021\/11\/26\/facebook-fueling-violence-and-instability-across-the-globe-the-organization-for-world-peace\/"},"modified":"2021-11-26T01:00:46","modified_gmt":"2021-11-26T00:00:46","slug":"facebook-fueling-violence-and-instability-across-the-globe-the-organization-for-world-peace","status":"publish","type":"post","link":"https:\/\/monblogeur.tech\/index.php\/2021\/11\/26\/facebook-fueling-violence-and-instability-across-the-globe-the-organization-for-world-peace\/","title":{"rendered":"Facebook Fueling Violence And Instability Across The Globe &#8211; The Organization for World Peace"},"content":{"rendered":"<p>Recent revelations show that Facebook has been fueling violence and instability across the globe. As algorithms amplify divisive content and moderating efforts prove ineffective or otherwise negligent, hate speech proliferates across the social media platform. 
Although Facebook was well aware of these \u201creal world\u201d harms, the company willingly disregarded them in pursuit of profit.<br \/>The revelations come from <em>The Wall Street Journal<\/em>, which has published a series of articles reviewing leaked company documents titled <a href=\"https:\/\/www.wsj.com\/articles\/the-facebook-files-11631713039\">\u201cThe Facebook Files.\u201d<\/a> The documents \u2013 including internal research reports, employee discussions, and draft presentations to senior management \u2013 were leaked to the <em>Journal<\/em>\u00a0 by Frances Haugen, a former product manager at Facebook, who left the company this May.<br \/>On the 5<sup>th<\/sup> of October, <a href=\"https:\/\/www.rev.com\/blog\/transcripts\/facebook-whistleblower-frances-haugen-testifies-on-children-social-media-use-full-senate-hearing-transcript\">Haugen testified before a U.S. Senate subcommittee on the leak.<\/a> While the testimony, and subsequent coverage, was principally concerned with the <a href=\"https:\/\/www.wsj.com\/articles\/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739?mod=hp_lead_pos7&amp;mod=article_inline\">effects of social media on children,<\/a> Haugen outlined far broader concerns. 
Facebook, she said, was \u2018tearing apart our democracy, putting our children in danger and sowing ethnic violence around the world.\u2019 In Myanmar, India, and Ethiopia, the platform has provided a vehicle for hate speech and incitements to violence \u2013 <a href=\"https:\/\/www.reuters.com\/article\/us-myanmar-rohingya-facebook\/u-n-investigators-cite-facebook-role-in-myanmar-crisis-idUSKCN1GO2PN\">often with lethal consequences.<\/a><br \/>Facebook insists that it is <a href=\"https:\/\/about.fb.com\/news\/2017\/06\/hard-questions-hate-speech\/\">\u2018opposed to hate speech in all its forms.\u2019<\/a> Responding to <em>The Wall Street Journal<\/em>, spokesman Andy Stone stressed that the company has invested significantly in technology to find hate speech across the platform, and noted that such content has been declining on Facebook globally. Stone even seemed to challenge the extent to which Facebook was responsible. Given its global audience, argued Stone, \u2018everything that is good, bad and ugly in our societies will find expression on our platform.\u2019 Hatred is perhaps an inevitable reality in our societies, but such assertions understate Facebook\u2019s role in spreading it. As Haugen explained in her testimony, it \u2018is not simply a matter of certain social media users being angry or unstable\u2019; rather, algorithms designed by Facebook amplify divisive content through \u201cengagement-based ranking.\u201d<br \/>Across the platform, content is ranked according to user engagement, which Facebook terms \u201cmeaningful social interaction,\u201d or MSI. Effectively, posts that attract more likes, comments, and shares are adjudged to have generated more MSI. 
The algorithm then organizes the \u201cNews Feed\u201d to promote content with higher MSI, giving these posts greater visibility on the site.<br \/>In 2018, when this system was introduced, <a href=\"https:\/\/www.facebook.com\/zuck\/posts\/10104413015393571\">Mark Zuckerberg framed the change<\/a> as promoting \u2018personal connections.\u2019 It was aimed at improving the \u2018well-being and happiness,\u2019 of users. Instead, <a href=\"https:\/\/www.wsj.com\/articles\/facebook-algorithm-change-zuckerberg-11631654215?mod=article_inline\">internal research found the change cultivating outrage and hatred on the platform.<\/a> As content that elicits an extreme reaction is more likely to get a click, comment, or re-share, incendiary posts generate the most MSI. The algorithm accordingly amplifies this content across the platform, rewarding divisive content, like misinformation, hate speech and incitements to violence. Such a system entails \u201creal world\u201d consequences. \u2018In places like Ethiopia,\u2019 Haugen claimed \u2018it is literally fanning ethnic violence.\u2019<br \/>Facebook has long been well aware of the impacts associated with its algorithm. Yet, executives have repeatedly disregarded them. In her testimony, Haugen related one such instance, alleging that, in April 2020, Mark Zuckerberg was presented with the option to remove MSI but refused. Zuckerberg purportedly even rejected calls to remove it from Facebook services in countries at risk of violence, including Ethiopia, citing concerns that it might lead to a loss in engagement \u2013 despite escalating ethnic tensions in the region. These tensions culminated in the ongoing Tigray conflict. 
As the hostilities unfolded, <a href=\"https:\/\/www.wsj.com\/articles\/facebook-drug-cartels-human-traffickers-response-is-weak-documents-11631812953?mod=article_inline\">groups turned to Facebook, using the platform as a vehicle to incite violence and disseminate hate speech.<\/a><br \/>When Haugen recounted the role of Facebook in Ethiopia, it prompted outrage from Senator Maria Cantwell. As Cantwell recalled, it was not the first time the company was implicated in ethnic violence within the developing world. <a href=\"https:\/\/www.reuters.com\/article\/us-myanmar-rohingya-facebook\/u-n-investigators-cite-facebook-role-in-myanmar-crisis-idUSKCN1GO2PN\">In 2018, Facebook was blamed by UN investigators for playing a \u2018determining role,\u2019 in the Rohingya crisis.<\/a> As in Ethiopia years later, the platform provided groups in Myanmar with a vehicle to sow hatred and encourage violence. For over half a decade, the Myanmar military used Facebook to orchestrate a <a href=\"https:\/\/www.nytimes.com\/2018\/10\/15\/technology\/myanmar-facebook-genocide.html\">systematic propaganda campaign against the Rohingya minority<\/a>, portraying them as terrorists and circulating misinformation about imminent attacks. With the beginning of the crisis in August 2017, <a href=\"https:\/\/www.theguardian.com\/world\/2018\/apr\/03\/revealed-facebook-hate-speech-exploded-in-myanmar-during-rohingya-crisis\">hate speech exploded on the platform<\/a>, as the Rohingya were subjected to forced labour, rape, extrajudicial killings, and the displacement of more than 700,000 people. Facebook ultimately issued an apology for its failure to adequately respond to the crisis and pledged that it would do more. But, it seems to have neglected these promises in Ethiopia and elsewhere.<br \/>In India, incendiary content similarly proliferates across Facebook services, exacerbating the deep-seated social and religious tensions that divide the nation. 
<a href=\"https:\/\/www.wsj.com\/articles\/facebook-drug-cartels-human-traffickers-response-is-weak-documents-11631812953?mod=article_inline\">An internal report in 2019 saw researchers set up a test account as a female user.<\/a> After following pages and groups recommended by the algorithm, the account\u2019s News Feed became a \u2018near constant barrage of polarizing nationalist content, misinformation, and violence.\u2019 <a href=\"https:\/\/www.wsj.com\/articles\/facebook-services-are-used-to-spread-religious-hatred-in-india-internal-documents-show-11635016354?mod=article_inline\">In another internal report, the company collected user testimonies to assess the scale of the problem.<\/a> \u2018Most participants,\u2019 the report found, \u2018felt that they saw a large amount of content that encourages conflict, hatred and violence.\u2019<br \/>Facebook insists that it has a \u2018comprehensive strategy,\u2019 to keep people safe on its services, with \u2018sophisticated systems,\u2019 in place to combat hate. But, these accounts highlight continued failings in its efforts to moderate content \u2013 particularly in developing countries. While these markets now constitute Facebook\u2019s principal source of new users, the company continues to commit fewer resources to content moderation in these regions. In 2020, Facebook employees and contractors spent over 3.2 million hours investigating and addressing misinformation on the platform. Only 13% of that time was dedicated to content from outside the U.S., despite Americans making up less than 10% of the platform\u2019s monthly users.<br \/>Meanwhile, the automated systems, which Facebook has repeatedly lauded as the solution to its problem with hate, continue to prove ineffective. Facebook researchers themselves estimate that their A.I. 
addresses less than 5% of hate speech posted on the platform. In places like Ethiopia and India, the company neglected even to build systems for several local languages, allowing dangerous content to circulate effectively unmoderated, despite real threats of violence.<br \/>More serious still, where this content is identified, the response from Facebook is often inconsistent. The company has been shown to be willing <a href=\"https:\/\/www.wsj.com\/articles\/facebook-files-xcheck-zuckerberg-elite-rules-11631541353?mod=article_inline\">to bend its own rules in favour of elites<\/a>\u00a0to avoid scandal, even if that means leaving incendiary material on its platform. In one instance, the company refused to remove a Hindu nationalist group, Rashtriya Swayamsevak Sangh (or RSS), despite internal research highlighting its role in promoting violence and hate speech towards Muslims. <a href=\"https:\/\/www.wsj.com\/articles\/facebook-services-are-used-to-spread-religious-hatred-in-india-internal-documents-show-11635016354?mod=article_inline\">A report cited \u2018political sensitivities,\u2019 as the basis for the decision.<\/a> India\u2019s Prime Minister, Narendra Modi, worked for the RSS for decades, and in the past year has used <a href=\"https:\/\/www.wsj.com\/articles\/india-threatens-jail-for-facebook-whatsapp-and-twitter-employees-11614964542\">threats<\/a> and <a href=\"https:\/\/www.wsj.com\/articles\/facebook-whatsapp-and-twitter-face-new-rules-in-india-11614250424?mod=article_inline\">legislation<\/a> as part of a wider attempt to exercise greater control over social media in the country.<br \/>\u2018At the heart of these accusations,\u2019 <a href=\"https:\/\/www.facebook.com\/zuck\/posts\/10113961365418581\">wrote Zuckerberg in response to the Haugen testimony<\/a>, \u2018is the idea that we prioritize profit over safety and well-being. 
That\u2019s just not true.\u2019 Yet these findings show that Zuckerberg and other executives repeatedly made decisions not to address harms linked to Facebook. Rather than learn from its failings in places like Myanmar, the company continued to prioritize profit and growth, ignoring the human costs.<br \/>Unless the incentives underpinning the economics of social media alter radically, there is little chance that Facebook will pursue the necessary changes independently. As the buoyancy of its share prices, despite the leak, shows, moral integrity does not equate with profit. Regulation is a necessity.<br \/>Some states have already taken regulatory action, with more in the pipeline: amongst U.S. lawmakers, calls to reform Section 230 are increasingly prominent; a Digital Services Act has been submitted to the European Council; and in the UK, an Online Safety Bill is currently being scrutinized by Parliament.<br \/>However, regulation is complex, with diverse approaches entailing distinct legal, administrative and ethical challenges. Pre-eminent among the concerns of policymakers must be freedom of expression. This is particularly pertinent for regulation that proposes to establish rules surrounding content moderation, especially those that include provisions for \u201charmful\u201d (though not illegal) content \u2013 like vaccine misinformation. By requiring social media companies to take down content deemed harmful by the state, such policies could set dangerous precedents that threaten the freedoms of citizens. Proponents of these policies rightly insist that a balance must be struck between freedoms and potential harms. But from a global perspective, it is an especially precarious balance. In more authoritarian states, regulations of this sort might serve less as a means to reduce the harms of social media than as a tool for silencing dissent. 
<a href=\"https:\/\/www.wsj.com\/articles\/india-threatens-jail-for-facebook-whatsapp-and-twitter-employees-11614964542\">Modi\u2019s attempts to bully social media companies into taking down content related to the Farmers\u2019 Protests should give cause for caution.<\/a><br \/>An internationally harmonized approach to regulation (<a href=\"https:\/\/www.ft.com\/content\/5dc4e2d5-d7bd-4000-bf94-088f17e21936\">like the OECD global tax deal<\/a>) might blunt potential regulatory excesses, but any agreement would need to be \u201ccontent-neutral\u201d if it is to be practicable internationally. As attitudes towards policing speech vary massively worldwide, neutrality is the only viable option. <a href=\"https:\/\/knightcolumbia.org\/content\/amplification-and-its-discontents\">Indeed, anything else is unlikely to survive constitutional scrutiny in the U.S.<\/a><br \/>However, a content-neutral approach is not necessarily an ineffective one. As outlined in this report, one critical element in the problems surrounding Facebook is its algorithm. \u201cEngagement-based\u201d ranking has been shown to amplify incendiary content, and in doing so foster division and sow the seeds of violence. But, there are alternatives to the algorithm. Organizing social media feeds chronologically, for instance, would not limit freedom of expression online, but it would prevent the disproportionate amplification of hate. Policymakers might force social media companies to adopt such alternatives by making them liable for any illegal content amplified on their services. As no system of content moderation could identify every instance, companies would likely be forced to scrap algorithmic feeds altogether. 
This would address the fundamental problem with Facebook: not that hatred exists on the platform (as it inevitably does), but that it is given so much reach.<br \/>As is evident in Myanmar, Ethiopia, and India, the prominence given to this hatred has \u201creal world\u201d consequences. Executives at Facebook were well-aware of these consequences but neglected to act upon them, prioritizing profit over people. It is time for regulators to act and put people first.<\/p>\n<p><a href=\"https:\/\/theowp.org\/reports\/facebook-fuelling-violence-and-instability-across-the-globe\/\">source<\/a><\/p>","protected":false},"excerpt":{"rendered":"<p>Recent revelations show that Facebook has been fueling violence and instability across the globe. As algorithms amplify divisive content and moderating efforts prove ineffective or otherwise negligent, hate speech proliferates across the social media platform. 
Although Facebook was well aware of these \u201creal world\u201d harms, the company willingly disregarded them in pursuit of profit.The revelations [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"googlesitekit_rrm_CAow1sXXCw:productID":"","_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":false,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[1],"tags":[],"class_list":["post-1561","post","type-post","status-publish","format-standard","hentry","category-non-classe"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts\/1561","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/comments?post=1561"}],"version-history":[{"count":0,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts\/1561\/revisions"}],"wp:attachment":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/media?parent=1561"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/categories?post=1561"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/tags?post=1561"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}