The federal government has let the animal spirits of the social media industry run wild by giving it Section 230 immunity for the harm it causes to young children, e.g., anxiety, depression, addiction, sleep disruption, and cyberbullying. Refusing to enforce the ban on TikTok until the President’s friends can acquire it is further proof of the industry’s privileged status with the federal government. Finally, the industry knows it can cheaply buy its way out of liability or regulation by contributing a few million to the President’s library. Given the industry’s immunity protection, it falls to parents whose children are at risk to find trial lawyers willing to take on their cause for a share of the monetary recovery. These challengers will be the “algorithm trainers.”
An ‘algorithm trainer’ is a legal professional who, like a lion trainer, seeks to tame primal forces. They do this by entering the ‘cage’ of the courtroom and using legal tools such as ‘discovery’ (the process of obtaining evidence from the opposing party) to force the tech industry to conform to civil society. These ‘algorithm trainers’ are not afraid to take on issues that mainstream firms shy away from; their notable successes include reining in the tobacco, asbestos, and automobile industries over defective products.
How Federal Law Protects Social Media.
Section 230 of the federal Communications Decency Act grants immunity to social media platforms for all third-party content and for good-faith moderation decisions, such as the removal of obscene, lewd, or excessively violent content. Courts have also found that this federal immunity preempts state and local claims that are inconsistent with its protections, such as defamation, negligence, and fraud.
Section 230 provides immunity protections for platforms that publish the works of third parties, but these protections apply only when the platform acts as an intermediary. The immunity does not extend to the platforms’ own opinions or actions, or to any conduct that occurs outside of publishing third-party content. Recognizing this limitation, trial lawyers have developed a theory that algorithms can cause harm through their defective design. On this theory, it is the design of the algorithms, not the content they publish, that leads to the damage. In simpler terms, trial lawyers distinguish between the structure of the algorithms and the content on the platform. Their most promising argument is the concept of addictive design.
A ‘tort’ is a civil wrong that causes harm to a person. Tort law dates to 1250 AD in England. It recognized the right of a victim of an unjustified physical attack to sue their attacker for damages. By 1500, it had expanded to recognize torts ranging from medical malpractice to defamation. The fundamental tasks of tort law include defining wrongs and empowering victims to initiate court proceedings as a form of recourse.
Tort law in the U.S. has a proven track record of securing significant financial recoveries for injured parties. In the 1998 Tobacco Master Settlement Agreement, the major tobacco companies agreed to pay $206 billion over 25 years to 46 states to cover the states’ smoking-related medical costs.
In Anderson v. General Motors (1999), six people received a jury verdict of $4.9 billion when their Chevrolet Malibu’s fuel tank exploded in a rear-end collision and they were severely burned.
If the trial lawyers can get the social media cases to a jury, the recoveries could reach the hundreds of billions, depending on the number of victims and on proof that the industry prioritized profits over the safety of children.
Trial lawyers: It’s the platform, Your Honor, not the user-generated content.
The leading theory proposed by trial lawyers suggests that social media companies designed their platforms with “addictive design” features that capture the attention of young people by maximizing user engagement. It’s argued that this design, rather than the content itself, is responsible for causing harm, particularly youth addiction, anxiety, and depression.
One technique employed by social media platforms is the “feedback loop.” This technique generates unpredictable rewards that create a sense of pleasure or urgency. It uses a continuous cycle of user interactions to influence both content creators and platform algorithms, which in turn shape future content. When users like, comment, share, or react to a post, they signal to both the creator and the platform that the content is engaging. This engagement ultimately determines what gets prioritized in users’ feeds, the type of content produced next, and the overall evolution of social media experiences.
“Essentially, it’s the reason you keep seeing more of what you engage with – whether it’s funny memes, cat videos, or conspiracy theories.”
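To make the mechanism concrete, here is a minimal, purely illustrative Python sketch of an engagement feedback loop. The names, weights, and scoring are hypothetical and do not reflect any platform’s actual code; the point is only to show how interactions raise a post’s score, and how that score then decides what the feed surfaces next.

```python
# Illustrative sketch of an engagement feedback loop (hypothetical; not any
# platform's actual code). Interactions raise a post's score, and the score
# decides what the feed shows next, so engagement begets more exposure.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    topic: str
    engagement: int = 0  # likes + comments + shares, crudely summed

def record_interaction(post: Post, weight: int = 1) -> None:
    """A like, comment, or share signals that the post is engaging."""
    post.engagement += weight

def rank_feed(posts: list[Post]) -> list[Post]:
    """Prioritize whatever has drawn the most interaction so far."""
    return sorted(posts, key=lambda p: p.engagement, reverse=True)

posts = [Post("p1", "cat videos"), Post("p2", "news"), Post("p3", "memes")]
record_interaction(posts[0], weight=3)  # the user engages with cat videos
record_interaction(posts[2])            # and, once, with memes
for post in rank_feed(posts):           # cat videos now lead the feed,
    print(post.post_id, post.topic, post.engagement)  # inviting more of the same
```

The loop closes because the ranking created by past engagement determines future exposure, which invites still more engagement of the same kind.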
Promising litigation.
While most of these defective-algorithm cases have been dismissed based on Section 230 immunity, a few have survived dismissal. If even a few cases continue into discovery, trial lawyers will uncover the facts underlying the algorithms, i.e., what the industry knew and how it utilized that information when structuring the algorithms. This information is essential for distinguishing harm caused by the algorithms from harm caused by the site’s third-party content.
In September 2025, a trial court in Los Angeles, California, denied the industry’s motion to prohibit expert testimony. It accepted the plaintiffs’ argument that the case rests on how the platforms are designed, not on the user-generated content, which is protected by immunity. The consolidated cases are known as JCCP 5255. With scientific evidence allowed, trial lawyers will present expert witnesses in psychiatry, neuroscience, pediatrics, and media psychology, who will testify on the impact of social media on the mental health of young people, including addiction, anxiety, and depression.
The trial lawyers believe that by distinguishing between how the platforms are designed (the algorithms and notifications) and the publishing of the site’s third-party user-generated content, they can persuade juries that the industry profits handsomely by developing algorithms to capture the minds of children.
Defective design lawsuits target business practices, not user-generated products.
Defective design lawsuits seek to “hold tech companies liable in their capacity as businesses for the harm that occurs because of irresponsible ways they design their services, not because of the harms from the actual content of the information they spread.”
An illustration of the type of evidence needed to differentiate these cases from a Section 230 immunity case involves the use of intermittent variable rewards (IVRs). Companies that implement IVRs do not deliver “likes” in real time; instead, they release them in artificially spaced patterns that trigger larger dopamine responses. This spacing induces more anxiety in users, leading to increased time spent on the platform and more return visits than real-time responses would produce.
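As a purely hypothetical sketch of the delivery pattern the plaintiffs describe (not any platform’s actual code; the delays and batch sizes are invented for illustration), the following Python fragment queues incoming “likes” and releases them in randomized bursts rather than in real time.

```python
# Hypothetical sketch of intermittent variable reward (IVR) delivery, as the
# plaintiffs describe it: "likes" are held back and released at randomized
# intervals instead of in real time. All names and numbers are illustrative.
import random
import time
from collections import deque

pending_likes: deque[str] = deque()

def receive_like(from_user: str) -> None:
    """A like arrives but is queued rather than shown immediately."""
    pending_likes.append(from_user)

def release_batch() -> list[str]:
    """After a variable delay, release an unpredictably sized batch of likes."""
    time.sleep(random.uniform(0.5, 3.0))            # variable, not fixed, spacing
    batch_size = random.randint(1, len(pending_likes))
    return [pending_likes.popleft() for _ in range(batch_size)]

for user in ["ana", "ben", "cy", "dee"]:
    receive_like(user)
while pending_likes:
    print("notify:", release_batch())  # bursts arrive on an unpredictable schedule
```

The contrast matters legally because the queuing and randomized release are design choices of the platform, separate from whatever third-party content is being delivered.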
In a significant class action lawsuit, a Massachusetts court permitted scientific testimony regarding the harm caused by these practices. The court determined that not all content-neutral tools are immune from legal action. Instead, it noted that “[l]iability may arise from such tools, provided that plaintiffs’ claims do not attribute the content generated by the parties to the tools themselves.”
A contrary decision was reached by a federal district court in California, which found the IVRs to be a neutral part of the protected platform.
Intentional torts and punitive damages.
An intentional tort is a wrongful civil act committed deliberately. Once discovery in these cases begins, the trial lawyers will seek to establish the tech industry’s knowledge of the algorithms, their potential for harm, and whether the industry intentionally applied that knowledge to cause harm for profit. In such cases, juries can award punitive damages if they find a defendant’s conduct egregious, malicious, or in reckless disregard of the plaintiff’s rights.
With the right set of facts, voiding for violation of public policy is a nuclear option to eliminate Section 230 immunity.
For over a hundred years, courts have voided contracts and laws for violating public policy. The public-policy void is a legal principle that enables courts to nullify contracts and statutes that contradict the broader interests of society or undermine fundamental moral, economic, or political values. If the trial lawyers can establish that the social media industry had a business practice of intentionally designing algorithms it knew were harming the mental health of young children, a court could void all of the industry’s defective-algorithm practices in its jurisdiction. Since there would be hundreds, if not thousands, of these cases, the industry could be economically devastated.
The first personal injury social-media defective design lawsuit is scheduled for November 2025.
William L. Kovacs, author of Devolution of Power: Rolling Back the Federal State to Preserve the Republic. It received five stars from Readers’ Favorite. His previous book, Reform the Kakistocracy, received the 2021 Independent Press Award for Political/Social Change. He served as senior vice president for the U.S. Chamber of Commerce and chief counsel to a congressional committee. He can be contacted at wlk@ReformTheKakistocracy.com