{"id":1838,"date":"2021-11-28T12:48:17","date_gmt":"2021-11-28T11:48:17","guid":{"rendered":"https:\/\/monblogeur.tech\/index.php\/2021\/11\/28\/the-facebook-trap-harvard-business-review\/"},"modified":"2021-11-28T12:48:17","modified_gmt":"2021-11-28T11:48:17","slug":"the-facebook-trap-harvard-business-review","status":"publish","type":"post","link":"https:\/\/monblogeur.tech\/index.php\/2021\/11\/28\/the-facebook-trap-harvard-business-review\/","title":{"rendered":"The Facebook Trap &#8211; Harvard Business Review"},"content":{"rendered":"<div class=\"cfbc967f0983488262956e73eca9483a\" data-index=\"1\" style=\"float: none; margin:10px 0 10px 0; text-align:center;\">\n<script async src=\"https:\/\/pagead2.googlesyndication.com\/pagead\/js\/adsbygoogle.js?client=ca-pub-3859091246952232\" crossorigin=\"anonymous\"><\/script>\r\n<!-- blok -->\r\n<ins class=\"adsbygoogle\" data-ad-client=\"ca-pub-3859091246952232\" data-ad-slot=\"1334354390\"><\/ins>\r\n<script>\r\n     (adsbygoogle = window.adsbygoogle || []).push({});\r\n<\/script>\r\n\n<\/div>\n<p>Facebook has a clear mission: Connect everyone in the world. Clarity is good, but in Facebook&#8217;s case, it has also put the company in a bind because the mission &#8212; and the company&#8217;s vision for creating value through network effects &#8212; has also become the source of its biggest problems. As the company moved from connecting existing friends online to making new global connections (both examples of direct network effects) and now to connecting users to professional creators (indirect network effects), it has come under fire for everything from violating individual privacy to bullying small companies as a monopoly to radicalizing its users. Now, it is struggling to find solutions that don&#8217;t undercut its mission. 
The author calls this &#8220;the Facebook Trap.&#8221; To address the problems created by the platform &#8212; and by other social networks, too &#8212; it helps to clearly establish where the company should be held accountable. While it&#8217;s reasonable to push for changes in how Facebook&#8217;s recommendations work, it&#8217;s harder to decide how the platform should deal with organic connections, which would likely entail censoring users and blocking them from making connections that they want to make. Facebook isn&#8217;t the only company facing the conundrum of needing to undermine its own mission to minimize harm, and companies and governments will need to develop strategies for how to deal with this issue.<br \/><em>Founded in 2004, Facebook\u2019s mission is to give people the power to build community and bring the world closer together. People use Facebook to stay connected with friends and family, to discover what\u2019s going on in the world, and to share and express what matters to them.<\/em><br \/><em>&nbsp;\u2015 <a href=\"https:\/\/investor.fb.com\/resources\/default.aspx\">Facebook Mission Statement<\/a><\/em><br \/><em>Our mission is to connect every person in the world.<\/em><br \/><em>\u2015 <a href=\"https:\/\/time.com\/facebook-world-plan\/\">Mark Zuckerberg<\/a>, CEO and Co-Founder of Facebook<\/em><br \/>Depending on who you ask, Facebook\u2019s biggest problem might be almost anything. 
Critics have argued that it\u2019s <a href=\"https:\/\/www.ftc.gov\/news-events\/press-releases\/2019\/07\/ftc-imposes-5-billion-penalty-sweeping-new-privacy-restrictions\">violating<\/a> <a href=\"https:\/\/www.theverge.com\/2019\/7\/24\/20707013\/ftc-facebook-settlement-data-cambridge-analytica-penalty-privacy-punishment-5-billion\">individual privacy<\/a> or <a href=\"https:\/\/www.washingtonpost.com\/technology\/2020\/12\/09\/facebook-antitrust-dominance\/\">bullying small companies<\/a> as a monopoly, <a href=\"https:\/\/www.wsj.com\/articles\/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739\">damaging teens\u2019 mental health<\/a> or <a href=\"https:\/\/www.wired.com\/story\/opinion-platforms-must-pay-for-their-role-in-the-insurrection\/\">inciting<\/a> <a href=\"https:\/\/apnews.com\/article\/donald-trump-violence-misinformation-a3dc1c9479e7677d6d34b4bae7dc9680\">violent<\/a> <a href=\"https:\/\/www.nytimes.com\/2021\/01\/14\/opinion\/facebook-far-right.html\">insurrections<\/a> \u2014 the list of possibilities goes on (and on). But varied as these troubles may seem, they are actually all facets of one big, fundamental problem that is staring all of us \u2014 policymakers, the general public, and Facebook\u2019s own employees \u2014 right in the face.<br \/>Facebook exists to \u201cconnect every person in the world,\u201d as CEO Mark Zuckerberg himself has clearly and frequently pronounced. At face value, there is nothing wrong with that goal. In fact, it is exactly the kind of strategic clarity that strategy professors would like to see from more companies. As the guiding vision of Facebook leadership, this aspirational ideal has been deeply <a href=\"https:\/\/www.buzzfeednews.com\/article\/ryanmac\/growth-at-any-cost-top-facebook-executive-defended-data\">ingrained into Facebook\u2019s company culture<\/a>. 
Importantly, connecting people is the fundamental basis on which Facebook has been so successful over the last 15 years.<br \/>In my course on technology strategy, we teach students that the most important driver of value creation today is network effects: My own value from using Facebook \u2014 and Instagram, Messenger, and WhatsApp \u2014 grows as other users adopt and use Facebook. By following through on its mission to connect people, Facebook facilitates powerful network effects that have propelled its organic growth and reinforced its dominant position in social networking.<br \/>But as we are all experiencing today, that core purpose leads to myriad negative impacts on all parts of our society. That mission of connecting people is also destroying people\u2019s lives and threatening our established institutions. Facebook faces a monumental challenge, because fixing these issues isn\u2019t as simple as adding more moderators to watch for hate speech or changing the news feed \u2014 it will require a fundamental shift in the company\u2019s core strategic goal. In that sense, Facebook is trapped: Network effects made the company a success and now they\u2019re threatening to unmake it, but the company can\u2019t just turn off the engine that makes it work. So, what can it do?<br \/>Following its core purpose means Facebook must continue to connect people and to connect them in more intense ways. It can grow the user base and <em>connect more people<\/em> who otherwise wouldn\u2019t have been connected, and it can get the existing user base to <em>connect more intensely<\/em> by using Facebook more, i.e., by increasing engagement. 
Both these things directly drive advertising revenue, the predominant mode by which Facebook captures value, i.e., monetizes the user base that otherwise uses Facebook for free, something I <a href=\"https:\/\/journals.aom.org\/doi\/10.5465\/amr.2020.0222\">have written about with a coauthor<\/a>.<br \/>The issue is that even if Facebook were only motivated to create value for users \u2014 through connecting people \u2014 without any incentive to capture value through advertising, it would still be on the road to disaster. The difficulties it faces are a fundamental consequence of connecting people.<br \/>To understand why, let\u2019s consider how Facebook has changed since its idyllic early days.<br \/>Initially, Facebook connected users to their real-life extended social circle \u2014 their <em>local connections<\/em>. As a millennial, I joined Facebook in high school as an extension of the friendships I already had. In this world, Facebook facilitated <em>direct network effects<\/em>, or reciprocal content generation between parties: I create content for my friends, and my friends create content for me. I would post some prom photos, my friends would post slightly different prom photos, and we would all comment on how great everyone looked. Even if someone looked bad in a photo, no one would ever write that: We still had to see each other in real life.<br \/>This version of Facebook had some major limitations. First, it didn\u2019t <em>really<\/em> give me access to anything I didn\u2019t already have in my life. When I became interested in DJing \u2014 a niche interest \u2014 I couldn\u2019t connect with other DJs on Facebook, because I didn\u2019t have any in my immediate network of real-life friends. Second, there was a finite amount of content \u2014 there are only so many prom photos. Third, regular users don\u2019t have the resources to generate \u201chigh-quality\u201d content \u2014 no one was professionally airbrushing all these prom photos. 
In this world, the connections were relatively weak, in the sense that they did not optimize for the intense, ongoing engagement that keeps me using Facebook.<br \/>Facebook solved this problem by bringing on millions \u2014 and eventually billions \u2014 of users and then facilitating <em>global connections<\/em>. Suddenly, through the stronger network effects of a larger user base, users with niche interests could connect and reach a critical mass. There are plenty of other DJs on Facebook and Instagram for me to connect with.<br \/>These global connections aren\u2019t always good, however. Users with interests that are dangerous \u2014 to themselves and to others \u2014 can easily connect with one another and reinforce those interests. A user with suicidal thoughts may now seek advice from others with the same thoughts. A user with racist views can choose to be surrounded by other racists. And once connected, these users gather together at a critical mass and can coordinate activities. This can range from the relatively benign but still damaging, such as <a href=\"https:\/\/www.theatlantic.com\/business\/archive\/2018\/04\/multilevel-marketing-yoga-pants-facebook\/558296\/\">multi-level marketing schemes<\/a>, to the coordination of events such as the January 6, 2021 attack on the U.S. Capitol, which was organized across many social networks, but was <a href=\"https:\/\/www.wsj.com\/articles\/facebook-knew-calls-for-violence-plagued-groups-now-plans-overhaul-11612131374\">stoked by online communities<\/a> of users drawn to conspiracy theories about election fraud.<br \/>There\u2019s another important shift that\u2019s happened, too. As Facebook has evolved, it has begun to rely heavily on <em>indirect network effects<\/em>. 
Instead of peers reciprocally generating content for one another, a large user base of content consumers incentivizes the \u201cprofessional\u201d content producers to keep pushing out content, and the professional content keeps the large user base on Facebook and engaged.<br \/>Relying on professional content producers to drive indirect network effects has a number of damaging consequences. First, it encourages elite individuals \u2014 celebrities or quasi-professional \u201cinfluencers\u201d \u2014 to portray an unachievable body image and lifestyle as normal, which Facebook\u2019s own research finds <a href=\"https:\/\/www.wsj.com\/articles\/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739\">can exacerbate depression, anxiety, and suicidal thoughts in young people<\/a>. Second, it professionalizes the generation of \u201cclickbait.\u201d Both traditional media businesses and all-out bad actors have the incentive to pump out content and headlines that <a href=\"https:\/\/www.wsj.com\/articles\/buzzfeed-expects-to-break-even-this-year-thanks-to-heavy-cost-cuts-11603738660\">exploit the curiosity and emotional reaction of users<\/a>. Third, it empowers professional extremists to spread explicitly dangerous messages at scale. ISIS has used Facebook effectively for its recruiting efforts by sharing videos of grotesque violence <a href=\"https:\/\/www.bbc.com\/news\/technology-53389657\">that resonate with disaffected youth<\/a>.<br \/>The challenge for Facebook, and for us as a society, is that everything Facebook can do to solve its \u201cproblem\u201d works directly against how it creates value and its core mission. In essence, critics of Facebook are asking it to connect fewer people and connect them less intensely. But that violates the core ethos of what Facebook has always set out to do. This is the <em>Facebook Trap<\/em>.<br \/>So what is Facebook to do about this problem? 
And how much of this problem can we as a society actually hold Facebook accountable for, through public pressure, regulatory policy, or other means? To answer these questions, let\u2019s consider Facebook\u2019s role in facilitating <em>user-originated connections<\/em> vs. <em>algorithm-originated connections<\/em>.<br \/><em>User-originated connections<\/em> are the direct interactions between parties that the platform facilitated in the beginning. When Facebook started as a registry of Harvard undergraduates, a user could scroll through all the other students and choose to connect with the few they wanted to see content from. A Facebook with only user-originated connections would be limited to fairly local connections and more of the direct network effects.<br \/>However, as a platform scales, it becomes harder and harder for a user to sift through and find the connections valuable to them. To ensure Facebook could continue to effectively connect people, it deployed <em>algorithm-originated connections<\/em>. This recommendation engine uses the data users give the platform to suggest new friends and groups and populate the news feed and search results. This heavy hand is necessary to allow global connections to form and indirect network effects to come about, and to bring users the connections they want and would engage with most intensely.<br \/>Separating out which issues are a result of organic user-originated connections vs. Facebook-driven algorithm-originated connections gives us a sense of what Facebook can reasonably be held accountable for. Unfortunately, it doesn\u2019t present easy solutions.<br \/>The scenarios where Facebook uses a heavy hand to facilitate connections are where we can rightfully look for some accountability \u2014 even if doing so works against Facebook\u2019s mission. 
For instance, just because the data says that others really like being connected to incendiary parties or content does not mean that Facebook has to bring that content to my attention. The choice to not expose users to new content they wouldn\u2019t have gone looking for is a relatively straightforward one.<br \/>The question of accountability becomes less clear when we consider whether the engine should recommend connections that a specific user actually wants, as revealed by data on the user\u2019s own activity. Facebook\u2019s mission implies that it should intentionally facilitate these connections, but these connections can <a href=\"https:\/\/www.washingtonpost.com\/opinions\/2020\/10\/26\/facebook-algorithm-conservative-liberal-extremes\/\">intensify a user\u2019s behavior and worldview<\/a>. If a user with mild political leanings shows interest in reading about national politics, how much political content can Facebook recommend before it becomes extreme or even dangerous? Yes, Facebook can limit how it makes these recommendations \u2014 if only because individual users cannot hold themselves accountable \u2014 but there is no obvious line in the sand for Facebook to draw here.<br \/>But clear accountability goes completely out the window when users are making connections on their own. To deal with the problematic user-originated connections, Facebook would ultimately need to censor content and ban users who create the content that we deem problematic. There are some bright lines \u2014 of course explicit planning of violent activity should be barred \u2014 but the bulk of the potentially damaging content falls into a massive gray area. Consider the dark gray area of anti-vaccine content: If, say, we want Facebook to censor explicit misinformation, what should be done about nuanced, evidence-based content that describes a vaccine\u2019s side effects? 
Facebook can adjust its algorithm to suppress recommendations of this content, but if users are going out of their way to find it, can or should Facebook censor it? Do we want it to?<br \/>This is the area where Facebook struggles the most. The company has repeatedly <a href=\"https:\/\/www.wsj.com\/articles\/facebook-mark-zuckerberg-vaccinated-11631880296\">been inconsistent<\/a> and <a href=\"https:\/\/www.wsj.com\/articles\/facebook-files-xcheck-zuckerberg-elite-rules-11631541353\">non-transparent<\/a> about how it censors content. Zuckerberg has tried to defer responsibility to a quasi-independent oversight panel, but <a href=\"https:\/\/techcrunch.com\/2020\/01\/28\/under-consideration\/\">critics accuse Facebook<\/a> of intentionally not giving the panel the resources or control to do its job comprehensively and effectively.<br \/>But this evasiveness derives from the accountability challenge intrinsic to social networking. Yes, we can hold Facebook accountable for what Facebook goes out of its way to connect us with. But can we hold Facebook accountable for what we go out of our way to connect with? And as a company dedicated to connecting people as its mission, Facebook clearly does not want to be accountable for the connections that users genuinely want, independent of whether Facebook gives them to users or users find them themselves.<br \/>As a strategy professor, I am probably more empathetic to Facebook than most. Facebook has a strategy of connecting people that has created a tremendous amount of value, but that same strategy is getting Facebook into a lot of trouble today. There are hard tradeoffs on all sides. 
My view is that there is no clear solution, but there are three broad routes that Facebook can pursue, potentially in conjunction.<br \/>In past efforts to project responsibility, <a href=\"https:\/\/www.washingtonpost.com\/news\/the-switch\/wp\/2018\/04\/10\/transcript-of-mark-zuckerbergs-senate-hearing\/\">Facebook has implied that it has solutions<\/a> to the problems it creates, which at present it doesn\u2019t seem to have. As one route, Facebook can be more transparent about the fundamental tradeoffs that come with social networking by releasing research that documents specific issues, such as its research on body image and Instagram, alongside its ongoing advocacy for the value that comes with connecting people. These insights can guide regulators and put Facebook in a good position to take regulation in a favorable direction for the industry, and regulation that imposes costly compliance requirements can be a barrier to entry that protects incumbents like Facebook, e.g., <a href=\"https:\/\/www.politico.eu\/article\/europe-data-protection-gdpr-general-data-protection-regulation-facebook-google\/\">GDPR in Europe<\/a>.<br \/>To comprehensively moderate all its content, Facebook would need to continue advancing the frontier on algorithmic detection of undesirable content <em>and<\/em> increase the number of human moderators by an order of magnitude (or several). As of 2020, Facebook employed 15,000 human moderators who each view hundreds of content items daily, and <a href=\"https:\/\/www.technologyreview.com\/2020\/06\/08\/1002894\/facebook-needs-30000-of-its-own-content-moderators-says-a-new-report\/\">it will need many more<\/a>. This effort will cost billions of dollars, and perhaps more painfully for Facebook, force it to decide what content to restrict: curating for one person is censoring another. 
However, no moderation effort can do much about the content running through encrypted WhatsApp or Messenger communications.<br \/>Facebook needs clear boundaries on which aspects of its platform it wants to be \u2014 and can be \u2014 accountable for, and it needs to clearly delegate accountability to governments, independent agencies, and users where it doesn\u2019t. On algorithm-originated connections, it will be impractical to delegate accountability for what is often a black-box process \u2014 and this technology is a core piece of intellectual property for Facebook \u2014 so Facebook needs to be ready to take responsibility for the connections the algorithm promotes.<br \/>But on user-originated connections to undesirable content, Facebook has been unclear about who is accountable. The quasi-independent Oversight Board moves Facebook in this direction of delegating accountability, but it is still evasive and incomplete: The board only reviews Facebook content decisions after the fact on appeal, and the board is still financially dependent on Facebook and too small to operate at scale.<br \/>Moving forward, Facebook can itself take on genuine accountability by massively ramping up its own moderating efforts; publicly and credibly give that accountability to an outside authority; or leave that accountability in the hands of individual users by taking a stand and fighting for its original mission of connecting people freely however they want. Right now, Facebook is ambiguously doing all three, leaving no one accountable at the end of the day.<br \/>Facebook serves as a convenient lightning rod for ire, but Facebook could disappear off of the face of the earth tomorrow and we would still face these problems again and again. 
The Facebook Trap is intrinsic to social networking as a whole, and reflects the consequences of digital technology facilitating a more connected world.<br \/>Twitter has evolved on the same path as Facebook towards using algorithms to connect people globally, imparting many of the same adverse consequences as Facebook. Snap(chat), originally reliant on connecting friends, drastically redesigned its platform to drive indirect network effects <a href=\"https:\/\/www.wsj.com\/articles\/evan-spiegel-stands-by-the-big-bet-that-sank-snaps-stock-11572667238\">that increase the amount of time users spend watching professional content<\/a>. TikTok has rapidly become a powerhouse by using its best-in-class algorithms to connect users to the most engaging content globally without having to build from a network of real-life friends.<br \/>We all need to reckon with the consequences of what it means to connect more people more intensely. To do that, and navigate this trap we\u2019re in, Facebook and all the social networking platforms today (and yet to come) need a clear sense of what they will be accountable for. It\u2019s time these companies \u2014 along with governments and users \u2014 tackle the Facebook Trap head on.<\/p>\n<p><a href=\"https:\/\/hbr.org\/2021\/10\/the-facebook-trap\">source<\/a><\/p>\n<div style=\"font-size: 0px; height: 0px; line-height: 0px; margin: 0; padding: 0; clear: both;\"><\/div>","protected":false},"excerpt":{"rendered":"<p>Facebook has a clear mission: Connect everyone in the world. Clarity is good, but in Facebook&#8217;s case, it has also put the company in a bind because the mission &#8212; and the company&#8217;s vision for creating value through network effects &#8212; has also become the source of its biggest problems. 
As the company moved from [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"googlesitekit_rrm_CAow1sXXCw:productID":"","_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":false,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[1],"tags":[],"class_list":["post-1838","post","type-post","status-publish","format-standard","hentry","category-non-classe"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts\/1838","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/comments?post=1838"}],"version-history":[{"count":0,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts\/1838\/revisions"}],"wp:attachment":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/media?parent=1838"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/categories?post=1838"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/tags?post=1838"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}