{"id":672,"date":"2021-11-19T02:22:55","date_gmt":"2021-11-19T01:22:55","guid":{"rendered":"https:\/\/monblogeur.tech\/index.php\/2021\/11\/19\/revealed-the-software-that-studies-your-facebook-friends-to-predict-who-may-commit-a-crime-the-guardian\/"},"modified":"2021-11-19T02:22:55","modified_gmt":"2021-11-19T01:22:55","slug":"revealed-the-software-that-studies-your-facebook-friends-to-predict-who-may-commit-a-crime-the-guardian","status":"publish","type":"post","link":"https:\/\/monblogeur.tech\/index.php\/2021\/11\/19\/revealed-the-software-that-studies-your-facebook-friends-to-predict-who-may-commit-a-crime-the-guardian\/","title":{"rendered":"Revealed: the software that studies your Facebook friends to predict who may commit a crime &#8211; The Guardian"},"content":{"rendered":"<p>Voyager, which pitches its tech to police, has suggested indicators such as Instagram usernames that show Arab pride can signal inclination towards extremism<br \/>Last modified on Wed 17 Nov 2021 13.20 GMT<br \/><span class=\"dcr-o5gy41\">What do your Facebook posts, who you follow on Instagram and who you interact with the most on social media say about you? 
According to the tech startup Voyager Labs, that information could help police figure out if you have committed or plan to commit a crime.<\/span><br \/>Voyager Labs is one of dozens of US companies that have popped up in recent years with technology that purports to harness social media to help solve and predict crime.<\/p>\n<p>Pulling information from every part of an individual\u2019s various social media profiles, Voyager helps police investigate and surveil people by reconstructing their entire digital lives \u2013 public and private. By relying on artificial intelligence, the company claims, its software can decipher the meaning and significance of online human behavior, and can determine whether subjects have already committed a crime, may commit a crime or adhere to certain ideologies.<br \/>But new <a href=\"https:\/\/www.brennancenter.org\/our-work\/research-reports\/lapd-social-media-monitoring-documents\" data-link-name=\"in body link\">documents<\/a>, obtained through <a href=\"https:\/\/www.brennancenter.org\/our-work\/analysis-opinion\/lapd-documents-show-what-one-social-media-surveillance-firm-promises\" data-link-name=\"in body link\">public information requests<\/a> by the Brennan Center, a non-profit organization, and shared with the Guardian, show that the assumptions the software relies on to draw those conclusions may run afoul of first amendment protections. 
In one case, Voyager indicated that it considered using an Instagram name that showed Arab pride or tweeting about Islam to be signs of a potential inclination toward extremism.<br \/>The documents also reveal Voyager promotes a variety of ethically questionable strategies to access user information, including enabling police to use fake personas to gain access to groups or private social media profiles.<br \/>Voyager, a nine-year-old startup registered as Bionic 8 Analytics with offices in Israel, Washington, New York and elsewhere, is a small fish in a big pond that includes companies like Palantir and Media Sonar. The <a href=\"https:\/\/www.theguardian.com\/us-news\/los-angeles\" data-component=\"auto-linked-tag\" data-link-name=\"in body link\">Los Angeles<\/a> police department trialed Voyager software in 2019, the Brennan Center documents show, and engaged in a lengthy back-and-forth with the company about a permanent contract.<br \/>But experts say Voyager\u2019s products are emblematic of a broader ecosystem of tech players answering law enforcement\u2019s calls for advanced tools to expand their policing capabilities.<br \/>For police, the appeal of such tools is clear: use technology to automatically and quickly see connections that might take officers much longer to uncover, or to detect unnoticed behaviors or leads that a human might not pick up on because of lack of sophistication or capacity. With immense pressure on departments to keep crime rates low and prevent attacks, using technology to be able to make fast and efficient law enforcement decisions is an attractive value proposition. 
New and existing documents show the LAPD alone has worked or considered working with companies such as <a href=\"https:\/\/www.theguardian.com\/us-news\/2021\/nov\/07\/lapd-predictive-policing-surveillance-reform\" data-link-name=\"in body link\">PredPol<\/a>, <a href=\"https:\/\/www.theguardian.com\/us-news\/2021\/sep\/08\/revealed-los-angeles-police-officers-gathering-social-media\" data-link-name=\"in body link\">MediaSonar, Geofeedia,<\/a> Dataminr, and now Voyager.<br \/>But for the public, social media-informed policing can be a privacy nightmare that effectively criminalizes casual and at times protected behavior, experts who have reviewed the documents for the Guardian say.<br \/>As the Guardian previously <a href=\"https:\/\/www.theguardian.com\/us-news\/2021\/nov\/07\/lapd-predictive-policing-surveillance-reform\" data-link-name=\"in body link\">reported<\/a>, police departments are often unwilling to relinquish the use of those tools even in the face of public outcry and in spite of little proof it helps to reduce crime.<br \/>Experts also point out that companies like Voyager often use buzzwords such as \u201cartificial intelligence\u201d and \u201calgorithms\u201d to explain how they analyze and process information but provide little evidence that it works.<br \/>A Voyager spokesperson, Lital Carter Rosenne, said the company\u2019s software was used by a wide range of clients to enable searches through databases but said that Voyager did not build those databases on its own or supply Voyager staffers to run its software.<br \/>\u201cThese are our clients\u2019 responsibilities and decisions, in which Voyager has no involvement at all,\u201d Rosenne said in an email. \u201cAs a company, we follow the laws of all the countries in which we do business. 
We also have confidence that those with whom we do business are law-abiding public and private organizations.\u201d<br \/>\u201cVoyager is a software company,\u201d Rosenne said in answer to questions about how the technology works. \u201cOur products are search and analytics engines that employ artificial intelligence and machine learning with explainability.\u201d<br \/>Voyager did not respond to the detailed questions about who it has contracts with or how its software draws conclusions on a person\u2019s support for specific ideologies.<br \/>LAPD declined to respond to a request for comment.<br \/>The way Voyager and companies like it work is not complicated, the documents show. Voyager software hoovers up all the public information available on a person or topic \u2013 including posts, connections and even emojis \u2013 analyzes and indexes it and then, in some cases, cross-references it with non-public information.<br \/>Internal documents show the technology creates a topography of a person\u2019s entire social media existence, specifically looking at users\u2019 posts as well as their connections, and how strong each of those relationships is.<br \/>The software visualizes how a person\u2019s direct connections are connected to each other, where all of those connections work, and any \u201cindirect connections\u201d (people with at least four mutual friends). Voyager also detects any indirect connections between a subject and other people the customer has previously searched for.<br \/>Voyager\u2019s data collection is far reaching. If a person tracked by Voyager software deletes a friend or a post from their own profile, it remains archived in their Voyager profile. The system catalogues not only a subject\u2019s contacts, but also any content or media those contacts posted, including status updates, pictures and geotags. 
And it draws in second- and third-degree friendships to \u201cunearth previously unknown middlemen or instances of improper association\u201d.<br \/>Meredith Broussard, a New York University data journalism professor and author of Artificial Unintelligence: How Computers Misunderstand the World, said it appeared Voyager\u2019s algorithms were making assessments about people based on their online activity and networks, using a process that resembled online ad targeting.<br \/>Ad targeting systems place people in \u201caffinity groups\u201d, determining who is most likely to be interested in buying a new car, for example, based on their friends and connections, Broussard explained: \u201cSo instead of grouping people into buckets like \u2018pet owners\u2019, what Voyager seems to be doing is putting people into \u2018buckets\u2019 of likely criminals.\u201d<br \/>In the advertising context, many consumers have come to accept this kind of targeting, she said, but the stakes are much higher when it comes to policing. <br \/>\u201cIt\u2019s a \u2018guilt by association\u2019 system,\u201d she said, adding that this kind of algorithm was not particularly sophisticated.<br \/>Voyager software applies a similar process to Facebook groups, pages and events \u2013 both public and closed \u2013 cataloging recently published content and mapping out the most active users. Documents show the company also allows users to search for posts about specific topics, pulling up all mentions of that term, as well as the location tagged in those posts.<br \/>The company claims all of this information on individuals, groups and pages allows its software to conduct real-time \u201csentiment analysis\u201d and find new leads when investigating \u201cideological solidarity\u201d. 
In proposals to the LAPD, the company claimed its artificial intelligence platform was unmatched in its ability to analyze \u201chuman behavior indicators\u201d.<br \/>Voyager claims its AI can provide insights such as an individual or group\u2019s \u201csocial whereabouts\u201d, can uncover hidden relationships and can perform a \u201csentiment analysis\u201d to determine where someone stands ideologically on various topics, including extremism.<br \/>\u201cWe don\u2019t just connect existing dots,\u201d a Voyager promotional document read. \u201cWe create new dots. What seem like random and inconsequential interactions, behaviors or interests, suddenly become clear and comprehensible.\u201d<br \/>A service the company calls VoyagerDiscover presents social profiles of people who \u201cmost fully identify with a stance or any given topic\u201d. The company says the system takes into account personal involvement, emotional involvement, knowledge and calls to action, according to documents. Unlike other companies, Voyager claims it doesn\u2019t need extra time to study and process online behavior and instead can make this type of judgment \u201con the fly\u201d. <br \/>\u201cThis ability moves the discussion from those who are most engaged online to those most engaged in their hearts,\u201d the documents read.<br \/>In one redacted case study Voyager presented to LAPD when it was pursuing a contract with the agency, the company examined the ways in which it would have analyzed the social media profile of Adam Alsahli, who was killed last year while attempting to attack the Corpus Christi naval base in Texas.<br \/>The company wrote that its software used artificial intelligence to examine whether subjects have ties to Islamic fundamentalism, and color coded these profiles as green, orange or red (orange and red seemingly indicating a proclivity toward extremism). 
\u201cThis provides a flag or indication for further vetting or investigation, before an incident has occurred, as part of an effort to put in place a \u2018trip wire\u2019 to indicate emerging threats,\u201d the company wrote.<br \/>In Alsahli\u2019s case, Voyager said, the company concluded his social media activity reflected \u201cIslamic fundamentalism and extremism\u201d and suggested investigators could further review Alsahli\u2019s accounts to \u201cdetermine the strength and nature of his direct and indirect connections to other Persons of Interest\u201d. <br \/>But the documents show that many aspects of what Voyager pointed out as tripwires or signals of fundamentalism could also qualify as free speech or other protected activity. Voyager, for instance, said 29 of Alsahli\u2019s 31 Facebook posts were pictures with Islamic themes and that one of Alsahli\u2019s Instagram account handles, which was redacted in the documents, reflected \u201chis pride in and identification with his Arab heritage\u201d.<br \/>When examining the list of accounts he followed and who followed him, Voyager said that \u201cmost are in Arabic\u201d \u2013 one of the 100 languages the company said it can automatically translate \u2013 and \u201cgenerally appear\u201d to be accounts posting religious content. On his Twitter account, Voyager wrote, Alsahli mostly tweeted about Islam.<br \/>The documents also implicated Alsahli\u2019s connections, writing that three Facebook users he shared posts with could \u201chave had other interactions with him outside social media, or been connecting in the same Islamist circles and forums\u201d.<br \/>Parts of the documents were redacted. 
However, the only visible mention of content that could be seen as explicitly tying Alsahli to fundamentalism consisted of tweets Voyager said he had posted in support of mujahideen.<br \/>Julie Mao, the deputy director of Just Futures, a legal support group for immigrants, said she worried the color-coded risk algorithm and Voyager\u2019s choice to study this particular case showed potential bias.<br \/>\u201cIt\u2019s always easy in hindsight to pick out someone who was violent and say \u2018hey, tech works based on them,\u2019\u201d Mao said. \u201cIt\u2019s incredibly opaque how Voyager arrived to this threat level (was it something more than expressing religious devotion?) and how many individuals receive similar threat levels based on innocuous conduct. So even by its own logic, it\u2019s a flawed example of accuracy and could lead to over-policing and harassment.\u201d<br \/>It\u2019s \u201cbasically a stop and frisk tool for police\u201d, Mao said.<br \/>Voyager\u2019s claims that it used \u201ccutting-edge AI-based technologies\u201d such as \u201cmachine learning\u201d, \u201ccognitive computing\u201d, and \u201ccombinatorial and statistical algorithms\u201d were, in effect, just \u201cword salad\u201d, said Cathy O\u2019Neil, a data scientist and CEO of Orcaa, a firm that audits algorithms. \u201cThey\u2019re saying, \u2018We use <em>big math<\/em>.\u2019 It doesn\u2019t actually say anything about what they\u2019re doing.\u201d<br \/>In fact, O\u2019Neil said, companies like Voyager generally provided little evidence demonstrating their algorithms had the capabilities they claim. 
And often, she said, police departments did not require or ask for this kind of data, and companies would be unable to provide evidence if it were requested \u2013 because their claims are frequently hyperbolic and unfounded.<br \/>The problem with this kind of marketing, O\u2019Neil added, was that it could provide cover for biased policing practices: \u201cIf they successfully get people to trust their algorithm, with zero evidence that it works, then it can be weaponized.\u201d <br \/>Melina Abdullah, a Black Lives Matter LA co-founder, said she was disturbed to learn about the conclusions Voyager\u2019s software had made about the online activity of Muslim users.<br \/>\u201cAs a Black Muslim, I\u2019m concerned. I always know that my last name alone flags me differently than other folks, that I\u2019m seen with heightened scrutiny, that there are assumptions made about \u2018extremism\u2019 because I\u2019m Muslim,\u201d she said, adding that the records left her with many unanswered questions: \u201cWho have they been flagging and what are the justifications? \u2026 It sounds like everybody\u2019s vulnerable to this.\u201d<br \/>Relying on publicly available information, Voyager\u2019s software cobbles together a fairly comprehensive and invasive picture of a person\u2019s private life, the experts said. But the company supplements that data with non-public information it gains access to through two primary channels: warrants or subpoenas and what the company calls an \u201cactive persona\u201d.<br \/>In the first case, Voyager tech sifts through vast swaths of data law enforcement gets through <a href=\"https:\/\/www.theguardian.com\/us-news\/2021\/sep\/16\/geofence-warrants-reverse-search-warrants-police-google\" data-link-name=\"in body link\">various types of warrants<\/a>. Such information can include subjects\u2019 private messages, their location information or the keywords they have searched for. 
Voyager catalogs and analyzes these often vast troves of user data \u2013 an undertaking LAPD officers wrote in emails they would appreciate help with \u2013 and cross-references it with social and geographic maps drawn up from public information.<br \/>For its Facebook-specific warrant service, Voyager software analyzes private messages to identify profiles subjects are communicating most frequently with. It then shows a subject\u2019s public posts alongside these private messages to provide \u201cvaluable\u201d context. \u201cIn numerous cases, its effectiveness has prompted our clients to request additional PDF warrant returns\u201d from tech companies, the documents read. Voyager said it planned to roll out the same warrant-indexing capabilities for Instagram and Snap, which would include image processing capabilities.<br \/>John Hamasaki, a criminal defense lawyer and member of San Francisco\u2019s police commission, said he had already had concerns about how judges grant law enforcement access to people\u2019s private online accounts, especially Black and Latino defendants accused of being in gangs: \u201cThe degree to which private information is being seized, purportedly lawfully under search warrants, is just way over-broad.\u201d<br \/>If police were additionally using software and algorithms to analyze the private data, it compounded the potential privacy and civil liberties violations, he said: \u201cAnd what conclusions are they drawing from it, and what spin is an expert giving to it? Because \u2018gang experts\u2019 are notorious for coming to a conclusion that supports the prosecution.\u201d<br \/>There is less detail about the second means through which Voyager software accesses non-public information: its premium service called \u201cactive persona\u201d. The documents indicate customers can use what Voyager calls \u201cavatars\u201d to \u201ccollect and analyze information that is otherwise inaccessible\u201d on select networks. 
Using the active persona feature, the company said, its software was able to access and analyze information from encrypted messaging app Telegram. A 2019 product roadmap also shows plans to roll out the \u201cactive persona\u201d mechanism for WhatsApp groups, \u201cmeaning the user will have to provide the system with an avatar with access to the group from which he wishes to collect\u201d. A timeline Voyager provided to the LAPD shows the company also had plans to introduce a feature that enabled \u201cInstagram private profile collection\u201d.<br \/>Experts say the \u201cactive persona\u201d feature appears to be another name for fake profiles and an LAPD officer described the function in an email with Voyager as the ability to \u201clog in with fake accounts that are already friended with the target subject and pulling data\u201d. While police departments across the country have increasingly used fake social media profiles to conduct investigations, the practice may violate Facebook and other platforms\u2019 community standards.<br \/>Facebook <a href=\"https:\/\/t.co\/qITPpvcOoQ?amp=1\" data-link-name=\"in body link\">rules<\/a> require people to use \u201cthe name they go by in everyday life\u201d. The company removes or temporarily restricts accounts that \u201ccompromise the security of other accounts\u201d or try to impersonate others. In <a href=\"https:\/\/www.eff.org\/deeplinks\/2018\/09\/facebook-warns-memphis-police-no-more-fake-bob-smith-accounts\" data-link-name=\"in body link\">2018<\/a> police in Memphis, Tennessee, used a fake account under the name Bob Smith to befriend and gather information on activists. In response, Facebook deactivated the account and others like it and told the police department it needed to \u201ccease all activities on Facebook that involve the use of fake accounts or impersonation of others.\u201d Facebook said everyone, including law enforcement, was required to use their real names on their profiles. 
<br \/>\u201cAs stated in our terms of services, misrepresentations and impersonations are not allowed on our services and we take action when we find violating activity,\u201d a Facebook spokesperson, Sally Aldous, said in a statement.<br \/>The feature also posed privacy and ethical questions, experts said. \u201cI worry about how low the threshold is for tech companies explicitly enabling police surveillance,\u201d said Chris Gilliard, a professor at Macomb Community College and a research fellow at the Harvard Kennedy School\u2019s Shorenstein Center.<br \/>\u201cThere\u2019s a long history of law enforcement spying on activists \u2013 who are engaging in entirely legal activities \u2013 in efforts to intimidate people or disrupt movements. Because of this, the bar for when companies aid police surveillance should be really high.\u201d<\/p>\n<p><a href=\"https:\/\/www.theguardian.com\/us-news\/2021\/nov\/17\/police-surveillance-technology-voyager\">source<\/a><\/p>","protected":false},"excerpt":{"rendered":"<p>Voyager, which pitches its tech to police, has suggested indicators such as Instagram usernames that show Arab pride can signal inclination towards extremismLast modified on Wed 17 Nov 2021 13.20 GMTWhat do your Facebook posts, who you follow on Instagram and who you interact with the most on social media say about you? 
According to [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"googlesitekit_rrm_CAow1sXXCw:productID":"","_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":false,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[1],"tags":[],"class_list":["post-672","post","type-post","status-publish","format-standard","hentry","category-non-classe"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts\/672","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/comments?post=672"}],"version-history":[{"count":0,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/posts\/672\/revisions"}],"wp:attachment":[{"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/media?parent=672"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/categories?post=672"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/monblogeur.tech\/index.php\/wp-json\/wp\/v2\/tags?post=672"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}