Facebook Plans to Shut Down Its Facial Recognition System

Saying it wants “to find the right balance” with the technology, the social network will delete the face scan data of more than one billion users.
By Kashmir Hill and Ryan Mac
Facebook plans to shut down its decade-old facial recognition system this month, deleting the face scan data of more than one billion users and effectively eliminating a feature that has fueled privacy concerns, government investigations, a class-action lawsuit and regulatory woes.
Jerome Pesenti, vice president of artificial intelligence at Meta, Facebook’s newly named parent company, said in a blog post on Tuesday that the social network was making the change because of “many concerns about the place of facial recognition technology in society.” He added that the company still saw the software as a powerful tool, but “every new technology brings with it potential for both benefit and concern, and we want to find the right balance.”
The decision shutters a feature that was introduced in December 2010 so that Facebook users could save time. The facial recognition software automatically identified people who appeared in users’ digital photo albums and suggested that users “tag” them all with a click, linking their accounts to the images. Facebook has since built one of the largest repositories of digital photos in the world, partly thanks to this software.
Facial-recognition technology, which has advanced in accuracy and power in recent years, has increasingly been the focus of debate because of how it can be misused by governments, law enforcement and companies. In China, authorities use the capabilities to track and control the Uyghurs, a largely Muslim minority. In the United States, law enforcement has turned to the software to aid policing, leading to fears of overreach and mistaken arrests. Some cities and states have banned or limited the technology to prevent potential abuse.
Facebook used its facial recognition capabilities only on its own site and did not sell its software to third parties. Even so, the feature became a privacy and regulatory headache for the company. Privacy advocates repeatedly raised questions about how much facial data Facebook had amassed and what the company could do with such information. Images of faces found on social networks can be used by start-ups and other entities to train facial recognition software.
When the Federal Trade Commission fined Facebook a record $5 billion to settle privacy complaints in 2019, the facial recognition software was among the concerns. Last year, the company also agreed to pay $650 million to settle a class-action lawsuit in Illinois that accused Facebook of violating a state law that requires residents’ consent to use their biometric information, including their “face geometry.”
The social network announced the change to its facial recognition system as it grapples with intense public scrutiny. Lawmakers and regulators have been up in arms over the company in recent months after a former Facebook employee, Frances Haugen, leaked thousands of internal documents showing that the firm was aware of how it enabled the spread of misinformation, hate speech and violence-inciting content.
The revelations have led to congressional hearings and regulatory inquiries. Last week, Mark Zuckerberg, the chief executive, renamed Facebook’s parent company as Meta and said he would shift resources toward building products for the next online frontier, a digital world known as the metaverse.
The change affects the more than one-third of Facebook’s daily users who had facial recognition turned on for their accounts, according to the company. Those users received alerts when new photos or videos of them were uploaded to the social network. The feature had also been used to flag accounts that might be impersonating someone else and was incorporated into software that described photos to blind users.
“Making this change required us to weigh the instances where facial recognition can be helpful against the growing concerns about the use of this technology as a whole,” said Jason Grosse, a Meta spokesman.
Although Facebook plans to delete more than one billion facial recognition templates, which are digital scans of facial features, by December, it will not eliminate the software that powers the system, which is an advanced algorithm called DeepFace. The company has also not ruled out incorporating facial recognition technology into future products, Mr. Grosse said.
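For readers who want a concrete picture of the distinction the company is drawing, here is a minimal, purely illustrative sketch in Python. It is not Meta’s DeepFace code; the embedding size, user names and similarity threshold are hypothetical. It only shows, in generic terms, what a facial recognition “template” is (a stored numeric vector describing a face) and why deleting the templates disables matching even though the underlying model survives.

```python
# Purely illustrative sketch: NOT Meta's DeepFace code. It shows, in generic
# terms, what a facial recognition "template" is (an embedding vector stored
# per user) and why deleting templates disables matching even though the
# model that produces embeddings still exists. All names, the 128-dimension
# embedding size and the 0.8 threshold are hypothetical.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(query_embedding, stored_templates, threshold=0.8):
    """Return the user ID whose stored template best matches the query,
    or None if no similarity clears the (hypothetical) threshold."""
    best_id, best_score = None, threshold
    for user_id, template in stored_templates.items():
        score = cosine_similarity(query_embedding, template)
        if score > best_score:
            best_id, best_score = user_id, score
    return best_id

# Hypothetical stored templates for two users.
templates = {"user_a": np.random.rand(128), "user_b": np.random.rand(128)}
query = np.random.rand(128)          # embedding of a face in a new photo
print(match_face(query, templates))  # may print a user ID or None

templates.clear()                    # analogous to deleting the stored face scan data
print(match_face(query, templates))  # always None once templates are gone
```

In this generic framing, wiping the stored templates removes the ability to recognize individuals, while the model that produces the embeddings, DeepFace in Facebook’s case, remains available for possible future use.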
Privacy advocates nonetheless applauded the decision.
“Facebook getting out of the face recognition business is a pivotal moment in the growing national discomfort with this technology,” said Adam Schwartz, a senior lawyer with the Electronic Frontier Foundation, a civil liberties organization. “Corporate use of face surveillance is very dangerous to people’s privacy.”
Facebook is not the first large technology company to pull back on facial recognition software. Amazon, Microsoft and IBM have paused or ceased selling their facial recognition products to law enforcement in recent years, while expressing concerns about privacy and algorithmic bias and calling for clearer regulation.
Facebook’s facial recognition software has a long and expensive history. When the software was rolled out to Europe in 2011, data protection authorities there said the move was illegal and that the company needed consent to analyze photos of a person and extract the unique pattern of an individual face. In 2015, the technology also led to the filing of the class action suit in Illinois.
Over the last decade, the Electronic Privacy Information Center, a Washington-based privacy advocacy group, filed two complaints about Facebook’s use of facial recognition with the F.T.C. When the F.T.C. fined Facebook in 2019, it named the site’s confusing privacy settings around facial recognition as one of the reasons for the penalty.
“This was a known problem that we called out over 10 years ago but it dragged out for a long time,” said Alan Butler, EPIC’s executive director. He said he was glad Facebook had made the decision, but added that the protracted episode exemplified the need for more robust U.S. privacy protections.
“Every other modern democratic society and country has a data protection regulator,” Mr. Butler said. “The law is not well designed to address these problems. We need more clear legal rules and principles and a regulator that is actively looking into these issues day in and day out.”
Mr. Butler also called for Facebook to do more to prevent its photos from being used to power other companies’ facial recognition systems, such as Clearview AI and PimEyes, start-ups that have scraped photos from the public web, including from Facebook and from its sister app, Instagram.
In Meta’s blog post, Mr. Pesenti wrote that facial recognition’s “long-term role in society needs to be debated in the open” and that the company “will continue engaging in that conversation and working with the civil society groups and regulators who are leading this discussion.”
Meta has discussed adding facial recognition capabilities to a future product. In an internal meeting in February, an employee asked if the company would let people “mark their faces as unsearchable” if future versions of a planned smart glasses device incorporated facial recognition technology, according to attendees. The meeting was first reported by BuzzFeed News.
In the meeting, Andrew Bosworth, a longtime company executive who will become Meta’s chief technology officer next year, told employees that facial recognition technology had real benefits but acknowledged its risks, according to attendees and his tweets. In September, the company introduced a pair of glasses with a camera, speakers and a computer processing chip in partnership with Ray-Ban; it did not include facial recognition capabilities.
“We’re having discussions externally and internally about the potential benefits and harms,” Mr. Grosse, the Meta spokesman, said. “We’re meeting with policymakers, civil society organizations and privacy advocates from around the world to fully understand their perspectives before introducing this type of technology into any future products.”