By Nancy Kim
Facebook has had a rough month, and deservedly so. The company has earned a special place of distrust in the hearts of many: A CNN poll published in November found that 3 of 4 U.S. adults say Facebook is making American society worse.
In a U.S. Senate hearing in October, former Facebook employee Frances Haugen made explosive allegations that the company’s own research documented the harms its site inflicts upon users. In other words, Facebook itself allegedly knew its business harmed others in concrete and preventable ways, like promoting photo sharing that damages the mental health of young people, especially girls.
How has Facebook gotten away with it? Part of the answer lies with Section 230 of the Communications Decency Act, the controversial federal law that essentially gives websites broad protection against liability for content posted by others. The law shields Facebook from the responsibility and liability of a traditional publisher.
Though a newspaper might be sued for libel over a defamatory article, Section 230 protects online platforms from liability for the content they distribute as long as they did not create it. In effect, Facebook has received a federal subsidy in the form of Section 230, which largely protects it from an important form of societal regulation: lawsuits.
Lawsuits bring issues into a public forum for scrutiny and discussion. In the absence of adequate regulation, the public depends upon private citizens to assert their rights and redress wrongs in court. When companies deploy new technology and business models, legislators and regulators often are slow to react. As a result, the legality of these new practices often is litigated — meaning they get debated by attorneys, reported by the news media and discussed by the public.
Social media companies have escaped these lawsuits mostly unscathed. For example, Facebook was sued by a victim of sex trafficking who had connected with her abuser through the site. In June, the Texas Supreme Court dismissed most of her claims based on Section 230 immunity.
In a different case, family members of victims killed by terrorist attacks sued Twitter, Facebook and Google, alleging these companies provided material support to terrorist organizations. The 9th U.S. Circuit Court of Appeals ruled in June that most of the claims were barred by Section 230.
But there are grounds for civil liability lawsuits against Facebook outside the scope of Section 230. While 230 lets social media companies off the hook for harmful content posted by users, Facebook’s internal documents and Haugen’s Senate testimony suggest its business model and products themselves are harmful and addictive.
The “like” button and the endless scrolling feature may have negative consequences for mental and physical health by keeping users glued to their screens, as noted by tech insiders such as Tristan Harris and former Facebook executive Chamath Palihapitiya. The company’s product design also rewards misinformation. When Facebook overhauled its algorithm to increase user engagement, it boosted amplification of divisive and provocative content.
Facebook should further be held liable for misleading public statements about the nature of its products. For example, the company’s statements about the mental health benefits of social apps for young people glaringly omit its own internal research showing that Instagram use makes body image issues worse for 1 in 3 teenage girls.
Facebook’s products and what the company says about them should be fair game for product liability lawsuits. People who suffer physical or emotional harm from those products — especially teenagers and young adults who are particularly vulnerable to the site’s features — should be able to sue the company without getting bogged down by Section 230.
Certainly Section 230 needs to be modified. Courts currently interpret it so broadly that it confers blanket immunity even when the claims against a company are not based on publisher or speaker liability. The law should be updated to clarify that companies are responsible for their own business practices and products, a line that could be drawn without upending the important protections for free speech and content moderation that 230 provides.
But legislative reform won’t happen fast, and accountability for Facebook shouldn’t have to wait. In addition to compensating injured victims, lawsuits serve another purpose: they would compel the famously evasive company to disclose more of what it knows about its own products.
Nancy Kim is a law professor at Chicago-Kent College of Law, Illinois Institute of Technology. © 2021 Los Angeles Times. Distributed by Tribune Content Agency.