Facebook has had a rough month, and deservedly so. The company has earned a special place of distrust in the hearts of many: A CNN poll published last week found that 3 out of 4 U.S. adults say Facebook is making American society worse.
In an October Senate hearing, former Facebook employee Frances Haugen made explosive allegations that the company’s own research documented the harms its site inflicts upon users. In other words, Facebook itself allegedly knew that its business harmed others in concrete and preventable ways, like promoting photo sharing that damages the mental health of young people, especially girls. How has Facebook gotten away with it?
Part of the answer lies with Section 230 of the Communications Decency Act, the controversial federal law that essentially gives websites broad protection against liability for content posted by others. The law shields Facebook from the responsibility and liability of a traditional publisher.
Though a newspaper might be sued for libel over a defamatory article, Section 230 protects online platforms from liability for the content they distribute as long as they did not create it. In effect, Facebook has received a federal subsidy in the form of Section 230, which largely protects it from an important form of societal regulation: lawsuits.
Lawsuits bring issues into a public forum for scrutiny and discussion. In the absence of adequate regulation, the public depends upon private citizens to assert their rights and redress wrongs in court. When companies deploy new technology and business models, legislators and regulators are often slow to react. As a result, the legality of these new practices is often litigated — meaning they get debated by attorneys, reported by the news media and discussed by the public.
Social media companies have escaped these lawsuits mostly unscathed. For example, Facebook was sued by a victim of sex trafficking who had connected with her abuser through the site. In June the Texas Supreme Court dismissed most of her claims based on Section 230 immunity. In a different case, family members of victims killed by terrorist attacks sued Twitter, Facebook and Google, alleging that these companies provided material support to terrorist organizations. The 9th Circuit ruled (also in June) that most of the claims were barred by Section 230.
But there are grounds for civil liability lawsuits against Facebook outside the scope of Section 230. While 230 lets social media companies off the hook for harmful content posted by users, Facebook’s internal documents and Haugen’s Senate testimony suggest its business model and products are themselves harmful and addictive.
The “like” button and the endless scrolling feature may have negative consequences for mental and physical health by keeping users glued to their screens, as noted by tech insiders such as Tristan Harris and former Facebook executive Chamath Palihapitiya. The company’s product design also rewards misinformation. When Facebook overhauled its algorithm to increase user engagement, it boosted amplification of divisive and provocative content.
Facebook should further be held liable for misleading public statements about the nature of its products. For example, the company’s statements about the mental health benefits of social apps for young people glaringly omit its own internal research showing that Instagram use makes body image issues worse for 1 in 3 teenage girls.
Facebook’s products and what the company says about them should be fair game for product liability lawsuits. People who suffer physical or emotional harm from those products — especially teenagers and young adults who are particularly vulnerable to the site’s features — should be able to sue the company without getting bogged down by Section 230.
Certainly Section 230 needs to be modified. As currently written, courts have interpreted it so broadly that it confers near-blanket immunity even when the claims against a company are not based on publisher or speaker liability. The law should be updated to clarify that companies are responsible for their business practices and products — a line that could be drawn without upending the important protections for free speech and content moderation that 230 provides.
But legislative reform won’t happen fast, and accountability for Facebook shouldn’t have to wait. Beyond compensating injured victims, lawsuits serve another purpose: They can compel the famously evasive company to disclose more of what it knows about its own products.
Nancy Kim is a law professor at Chicago-Kent College of Law, Illinois Institute of Technology.