{"id":674,"date":"2021-11-19T02:39:44","date_gmt":"2021-11-19T01:39:44","guid":{"rendered":"https:\/\/monblogeur.tech\/index.php\/2021\/11\/19\/facebooks-algorithm-is-broken-we-collected-some-suggestions-on-how-to-fix-it-fivethirtyeight\/"},"modified":"2021-11-19T02:39:44","modified_gmt":"2021-11-19T01:39:44","slug":"facebooks-algorithm-is-broken-we-collected-some-suggestions-on-how-to-fix-it-fivethirtyeight","status":"publish","type":"post","link":"https:\/\/monblogeur.tech\/index.php\/2021\/11\/19\/facebooks-algorithm-is-broken-we-collected-some-suggestions-on-how-to-fix-it-fivethirtyeight\/","title":{"rendered":"Facebook\u2019s Algorithm Is Broken. We Collected Some Suggestions On How To Fix It. &#8211; FiveThirtyEight"},"content":{"rendered":"<div class=\"cfbc967f0983488262956e73eca9483a\" data-index=\"1\" style=\"float: none; margin:10px 0 10px 0; text-align:center;\">\n<script async src=\"https:\/\/pagead2.googlesyndication.com\/pagead\/js\/adsbygoogle.js?client=ca-pub-3859091246952232\" crossorigin=\"anonymous\"><\/script>\r\n<!-- blok -->\r\n<ins class=\"adsbygoogle\" data-ad-client=\"ca-pub-3859091246952232\" data-ad-slot=\"1334354390\"><\/ins>\r\n<script>\r\n     (adsbygoogle = window.adsbygoogle || []).push({});\r\n<\/script>\r\n\n<\/div>\n<p> \t\t\t\t\t\t\t\t<time class=\"datetime\">Nov. 
16, 2021<\/time>, \t\t\t\t\t\t\t\tat \t\t\t\t\t\t\t\t<time class=\"datetime updated\" title=\"2021-11-16T11:00:00+00:00\">6:00 AM<\/time>  \t\t\t\t\t\t\t<br \/>By <a href=\"https:\/\/fivethirtyeight.com\/contributors\/kaleigh-rogers\/\" title=\"\" class=\"author url fn\" rel=\"author\">Kaleigh Rogers<\/a><br \/>Filed under <a href=\"https:\/\/fivethirtyeight.com\/tag\/technology\/\" class=\"term\" name=\"\">Technology<\/a><br \/>ILLUSTRATION BY EMILY SCHERER<br \/>Facebook\u2019s algorithm<a class=\"espn-footnote-link\" data-footnote-id=\"1\" href=\"#fn-1\" data-footnote-content=\"&lt;p&gt;&lt;p&gt;Facebook\u2019s platform runs on a system of many algorithms that all have different functions and interact with one another. This system is colloquially referred to as \u201cthe algorithm.\u201d&lt;\/p&gt; &lt;\/p&gt;\"><sup id=\"ss-1\">1<\/sup><\/a> is its superpower \u2014 and its kryptonite. Yes, it <a href=\"https:\/\/www.cnn.com\/2021\/10\/10\/tech\/facebook-whistleblower-algorithms-fix\/index.html\" target=\"_blank\" rel=\"noopener\">leads to higher engagement<\/a> that earns the company <a href=\"https:\/\/www.nbcnews.com\/tech\/tech-news\/facebook-reports-9-billion-profit-day-documents-highlight-internal-ang-rcna3757\" target=\"_blank\" rel=\"noopener\">billions of dollars<\/a>, but it\u2019s also tied to some of <a href=\"https:\/\/www.theguardian.com\/technology\/2018\/mar\/17\/facebook-cambridge-analytica-kogan-data-algorithm\" target=\"_blank\" rel=\"noopener\">the company\u2019s biggest scandals<\/a>. 
Last month, when the Facebook Papers — a trove of leaked corporate documents provided to reporters and Congress — were released, a mountain of news coverage blamed the algorithm for spreading misinformation and divisive content, for radicalizing users and for failing to protect them from some of the most graphic content on the site.

If the algorithm is to blame, can Facebook change the algorithm to make it better? What would that look like? To find out, I interviewed 12 leading experts on data and computer science, as well as former Facebook employees, and asked them to propose changes that could help the algorithm suck less. What I got was a range of ideas about how Facebook could start to solve this problem — or whether a solution is even possible.
Some are more radical than others, so I've categorized these ideas from mild to spicy (though we know Facebook CEO Mark Zuckerberg prefers it sweet).

Many experts pointed out that, along with identifying some of the problems with the algorithm, the Facebook Papers also included a number of possible solutions.

"Some of the internal research found shockingly simple tweaks [to improve the algorithm]," said Noah Giansiracusa, a mathematics professor at Bentley University and author of "How Algorithms Create and Prevent Fake News." "For example, if you limit the number of reshares, that will actually reduce the amount of disinformation."

Resharing is a crucial way that Facebook gets engaging content into users' newsfeeds. It allows content to travel through Facebook networks to reach users who wouldn't otherwise see it, and it's how you wind up with viral content, for better or for worse. Many of the experts I spoke to mentioned creating "friction" or "speed bumps" to slow down bad content — like disinformation, hate speech or extreme content — before it goes viral.
<a href=\"https:\/\/www.npr.org\/2021\/10\/22\/1048543513\/facebook-groups-jan-6-insurrection\" target=\"_blank\" rel=\"noopener\">Internal research at Facebook<\/a> found that limiting \u201cdeep reshares\u201d (where content is reshared not only by the original poster\u2019s network of friends or followers, but also their friends\u2019 friends, and their friends\u2019 friends\u2019 friends, and so on) of political content could reduce misinformation shared via external links by 25 percent and reduce misinformation found in images (think misleading memes) by half. Facebook implemented these changes, but only temporarily. The site <a href=\"https:\/\/about.fb.com\/news\/2018\/05\/inside-feed-reduce-remove-inform\/\" target=\"_blank\" rel=\"noopener\">does moderate misleading content<\/a> by removing it if it violates the site\u2019s community standards, and downranking content that the algorithm deems likely to be harmful, but the experts I spoke to still felt there was a gap where harmful content is slipping through.<br \/>Karen Kornbluh, a senior fellow and the director of the Digital Innovation and Democracy Initiative at the German Marshall Fund, suggested Facebook adopt a kind of \u201ccircuit breaker,\u201d where the share button is automatically but temporarily removed on content that starts getting deep reshares very quickly, until the content can be evaluated. Something like this, Kornbluh noted, could have stopped the disinformation video \u201cPlandemic\u201d (which falsely cast doubts on the severity of the COVID-19 pandemic and the safety of vaccines) <a href=\"https:\/\/www.nytimes.com\/2020\/05\/20\/technology\/plandemic-movie-youtube-facebook-coronavirus.html\" target=\"_blank\" rel=\"noopener\">from getting millions of views<\/a> before it was ultimately removed from the site. 
Katie Harbath, the former public policy director at Facebook, noted that WhatsApp (which is owned by Facebook's parent company, Meta) was able to cut the virality of similar kinds of "deep reshared" content by limiting how many contacts a user could forward a message to at one time, and said similar limits on sharing could be helpful at Facebook.

The algorithm could also identify "bad actors" who have repeatedly shared misleading content and demote all of their future posts, said Nathaniel Persily, co-director of the Stanford Cyber Policy Center. "They should always be demoted in the newsfeed, regardless if they're talking about baseball or QAnon," he said.

Multiple experts also pointed to more prominent user controls, to allow users to decide what content they'd like to see.
While Facebook does offer quite a lot of user-control options, studies have shown most users are unaware of how they work, and there's no intuitive way for users to signal dissatisfaction with content, said Karrie Karahalios, a computer science professor at the University of Illinois at Urbana-Champaign who has studied user experience with Facebook.

Roddy Lindsay, a former Facebook data scientist who went on to co-found a startup, wants the algorithm to prioritize content that users are likely to deem "good for the world." It's an admittedly subjective metric, but Facebook experimented with it by having users rate content on whether they felt it was "good" or "bad" for the world, then using that feedback to train the algorithm to prioritize the "good" stuff. Facebook researchers found this reduced the amount of negative content in users' feeds, but it also reduced how often users logged onto Facebook, so a watered-down version was ultimately adopted instead.

"It's not that these algorithms can't be improved," Lindsay said.
"The problem is that the only decision makers for what these algorithms optimize for are the companies."

We know from the Facebook Papers that some of these recommendations were implemented slowly or for a limited period of time, while others were, at least at the time, rejected. It's unclear which, if any, of the recommendations from the papers have since been implemented at Facebook. Mari Melguizo, a spokesperson for the platform, pointed to the site's recently published "content distribution guidelines," which list the kinds of content that are demoted on the site, including spam and clickbait. "These working documents from years ago show our efforts to understand these issues and don't reflect the product and policy solutions we've implemented since," Melguizo said.

Some experts felt Facebook's algorithm needed a more substantial overhaul to tackle its worst byproducts, including changes that might make Facebook slightly less fun for users. Laura Edelson, a computer science Ph.D. candidate at New York University who studies disinformation and political advertising on social media, said one thing revealed in the Facebook Papers was the algorithm's prioritization of something called "downstream meaningful social interactions." (Which is also how I describe my Friday nights.)
In a nutshell, imagine two pieces of content: Post A is something you, the Facebook user, are highly likely to engage with, but nobody else in your network is. Post B is something you're less likely to engage with, but many people in your network are. By prioritizing downstream MSI, the algorithm is more likely to show you Post B, even though it might be less relevant to you. That can expose you to more polarizing or extreme content than you would like to see. Internal research at Facebook showed that reducing how much the algorithm considered this metric for posts about civic (i.e., political) and health information helped reduce the spread of misinformation, and Facebook did implement the changes for those categories (and is currently experimenting with further reducing how much the algorithm considers the potential for comments and shares on political content). But Edelson argued those changes should be made for all content on the site.

"Facebook is not perfect at detecting these categories — far from it — and they're particularly bad at the beginning of a piece of content's life," Edelson said. "That means it's entirely possible that it won't detect that civic content is civic until after it's already gone viral."

Jinyan Zang, a researcher at the Public Interest Tech Lab at Harvard University, said one thing Facebook could do is shift the balance of what it deems valuable.
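The Post A/Post B trade-off can be made concrete with a toy scoring function. This is a hypothetical sketch: the weights, field names and per-category reduction are invented for illustration and are not Facebook's actual model.

```python
# Toy illustration of "downstream MSI" weighting as described above.
# A post's score blends how likely *you* are to engage with how much
# engagement the post is predicted to trigger across your network.
# All weights and field names here are invented for the sketch.

DOWNSTREAM_WEIGHT = {"civic": 0.1, "health": 0.1, "other": 0.6}

def feed_score(post, p_user_engage):
    """Blend personal relevance with predicted downstream engagement.
    Civic and health posts get a reduced downstream weight, mirroring
    the category-specific change described in the article."""
    w = DOWNSTREAM_WEIGHT.get(post["category"], DOWNSTREAM_WEIGHT["other"])
    return (1 - w) * p_user_engage + w * post["p_downstream"]

# Post A: highly relevant to you, little predicted network buzz.
post_a = {"category": "other", "p_downstream": 0.1}
# Post B: less relevant to you, but your network will pile on.
post_b = {"category": "other", "p_downstream": 0.9}

# With a heavy downstream weight, B outranks A despite being less relevant.
print(round(feed_score(post_a, p_user_engage=0.8), 2))  # 0.38
print(round(feed_score(post_b, p_user_engage=0.3), 2))  # 0.66

# Tagging the same posts as civic shrinks the downstream term, and A wins.
print(round(feed_score({**post_a, "category": "civic"}, p_user_engage=0.8), 2))  # 0.73
print(round(feed_score({**post_b, "category": "civic"}, p_user_engage=0.3), 2))  # 0.36
```

Edelson's objection maps directly onto the lookup table: the reduced weight only applies if the post is already labeled "civic," which may not happen until after it has gone viral.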
Rather than focusing on quantitative metrics like clicks and reshares, the algorithm could prioritize qualitative metrics, like how positive or relevant a post might be to a user. You might be more likely to engage with your cousin's college boyfriend's post about a conspiracy theory, but you might prefer to see a photo of your neighbor's kids' Halloween costumes. Facebook's algorithm already takes many factors into account when ranking content, including more qualitative measures, so there's no reason it couldn't crank up that dial.

Another, more dramatic change would be eliminating the ranking algorithm for the newsfeed altogether and returning to a reverse-chronological feed. In other words, just show everybody everything the people and pages they follow have posted, newest first, rather than trying to personalize the feed for each user (and for whatever the algorithm thinks that user is most likely to click, or rage-click). This notion is controversial. Some of the experts I spoke to said it would never work because it incentivizes quantity over quality — a fast road to spam — while also making it less likely that you'll see anything relevant, interesting or engaging (in every sense of the word) in your feed.

But proponents of this idea (including Facebook whistleblower Frances Haugen) say the downsides of reverting to a reverse-chronological feed may be outweighed by the benefits. True, such a feed would, by definition, favor content from users who post frequently, so your newsfeed could easily get clogged with posts from a particularly active group you're in, or memes from a particularly bored acquaintance you forgot you friended.
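For contrast, the reverse-chronological alternative needs no prediction model at all: sort by timestamp and stop. A minimal sketch, with post shapes and author names invented for the illustration:

```python
# Minimal sketch of a reverse-chronological feed: no engagement model,
# no personalization, just everything you follow, newest first.
# Timestamps and author names are invented for the illustration.

from datetime import datetime

posts = [
    {"author": "very_active_group", "created": datetime(2021, 11, 16, 9, 30)},
    {"author": "old_friend",        "created": datetime(2021, 11, 16, 11, 5)},
    {"author": "meme_page",         "created": datetime(2021, 11, 16, 10, 45)},
]

# Newest first; note that a prolific poster naturally dominates such a feed,
# which is exactly the "clogged newsfeed" criticism discussed in the article.
feed = sorted(posts, key=lambda p: p["created"], reverse=True)
print([p["author"] for p in feed])  # ['old_friend', 'meme_page', 'very_active_group']
```

The simplicity is the point: with no ranking objective, there is nothing for viral content to game, but also nothing filtering out noise.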
"You would be exposed to things that are more mundane, but that content plays an important role — we're not all having the best day ever," Lindsay, the former Facebook data scientist, said. "People say it's boring or noisy or has content from random pages, and my response to that is, 'Yes, of course, but that's where user controls come in.'"

At the furthest end of the hot-take Scoville scale are two ideas, one optimistic and the other pessimistic.

Jeff Allen and Sahar Massachi are both former Facebook employees who worked on integrity teams (the groups tasked with finding ways to deal with the worst bits of Facebook, like disinformation and extremism). They said that as long as Facebook's mission ("to give people the power to build community and bring the world closer together") and its metrics (user engagement) are at odds, no amount of algorithmic tweaking will solve its problems. Instead, Facebook needs to measure its success with different metrics that align with its stated values.

"If you're counting harmful content towards your success, you're just setting yourself up for internal conflict," Allen said.

But some experts felt that even the blue-sky version of Facebook realigning its metrics with its values simply wouldn't work. I asked each of the interviewees what they would do if I waved a magic wand and gave them total control over Facebook. "I would turn off Facebook and apologize to the people of the world," said Cathy O'Neil, a data scientist and algorithmic auditor.
"They can't actually solve their problems."

O'Neil argued that as long as Facebook is a for-profit company that earns revenue through ads, it will only ever be able to play catch-up with negative content, and that any external pressure, like regulation, that would actually be effective would only end up making Facebook so unprofitable that the business would collapse. Imran Ahmed, the founding CEO of the Center for Countering Digital Hate, made a similar argument, noting that asking Facebook to make changes to an algorithm that — from a business perspective — works quite effectively is "the axiom of insanity." Instead, he called for regulations that would impose costs on Facebook for the harm its product creates, as an incentive for change.

"The cost of the harms created by Facebook are not in any way internal to Facebook. Users pay the price and society pays the price," Ahmed said. "Impunity leads to terrible things, and we're seeing an experiment in impunity now."

In fact, the experts I spoke to almost unanimously called for regulation. Blue-sky ideas are great, but we've been trusting Facebook to get better for 15 years, and it's arguably worse than it's ever been. It might be time to put some limits on its superpower, and its supreme power.

1. Facebook's platform runs on a system of many algorithms that all have different functions and interact with one another. This system is colloquially referred to as "the algorithm."

Kaleigh Rogers is FiveThirtyEight's technology and politics reporter.

Filed under: Technology, Facebook, Algorithms, Big Tech

© 2021 ABC News Internet Ventures. All rights reserved.

Source: https://fivethirtyeight.com/features/facebooks-algorithm-is-broken-we-collected-some-spicy-suggestions-on-how-to-fix-it/