South Carolina is proposing an age-appropriate design code bill that would create a new legal duty of care for online platforms likely to be used by minors. While this legislation reflects growing concern about the need to protect children online, it risks forcing platforms to censor constitutionally protected speech and collect sensitive user data, all while imposing vague standards that are difficult to enforce.
Under Senate Bill 268 (SB 268), companies would be required to mitigate risks to minors that the bill identifies, including compulsive usage, severe psychological harm, and severe emotional distress. In legal terms, severe psychological harm refers to a more clinical injury to mental health, while severe emotional distress refers to intense mental suffering, anguish, or fear that may not amount to a diagnosable condition. Companies that fail to meet this duty may face significant civil penalties, and their executives may be held personally liable.
SB 268 is modeled on other laws, including California’s Age-Appropriate Design Code Act and the United Kingdom’s Online Safety Act. Those laws require online services to adjust features and design elements to minimize harm to minors. In practice, however, these frameworks serve as government-imposed content standards. In other words: censorship. They ask platforms to predict when exposure to information might cause emotional distress, but there is no clear way to determine when high engagement becomes “compulsive” or what level of emotional impact would qualify as “severe.” A company attempting to comply would have to determine which information is suitable for young audiences and which ideas may be off-limits. That type of decision-making is inherently editorial and directly conflicts with constitutional protections for free speech and Section 230 of the Communications Decency Act.
Federal courts have already found constitutional problems with California’s Age-Appropriate Design Code Act, including its requirement that covered businesses analyze how their services might present risks to minors. The Ninth Circuit held that this impact assessment requirement was likely unconstitutional because it burdened otherwise-legal speech and compelled companies to disclose detailed information about their editorial and design choices.
South Carolina’s SB 268 includes a closely related requirement by imposing a broad duty of care on platforms to identify and prevent harms to minors, along with mandated reporting and compliance tied to age estimation, profiling, targeted advertising, and default settings. Those provisions create the same basic problem that California’s law did: they require platforms to make predictive judgments about lawful content and user experience to avoid liability. Speech that may be considered “distressing” to minors is still protected by the First Amendment, and the government cannot suppress lawful expression simply because it might unsettle, offend, or emotionally affect young people.
SB 268 also raises serious concerns about privacy and anonymity. To distinguish minors from adults, platforms would rely on systems that infer a user’s age from device data, browsing patterns, and viewing behavior. These age-inferencing tools are widely used but frequently inaccurate. A parent who shares a tablet with a child, or an adult who watches nostalgic cartoons, may be identified as underage. When this occurs, users may be asked to upload a government ID to verify their age. Courts have repeatedly held that requiring users to identify themselves before accessing information discourages people from exploring controversial or sensitive topics, a chilling effect that conflicts with long-established First Amendment principles.
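To make the misclassification problem concrete, here is a toy sketch of the kind of behavioral inference described above. The signal names, thresholds, and function are invented purely for illustration and do not reflect any platform's actual system; they simply show why weak behavioral proxies flag atypical adults as minors.

```python
# Hypothetical sketch of a naive behavioral age-inference rule.
# All signals and thresholds here are invented for illustration only.

def infer_is_minor(watch_history: list[str], daily_minutes: int) -> bool:
    """Guess 'minor' from viewing behavior alone: the kind of
    inference a duty-of-care mandate would push platforms toward."""
    kid_genres = {"cartoons", "gaming", "animation"}
    kid_share = sum(g in kid_genres for g in watch_history) / max(len(watch_history), 1)
    # Flag as a likely minor if most viewing looks child-oriented
    # and usage is heavy; both are weak proxies for actual age.
    return kid_share > 0.5 and daily_minutes > 90

# An adult who rewatches childhood cartoons on a shared family tablet:
adult_history = ["cartoons", "cartoons", "animation", "news"]
print(infer_is_minor(adult_history, daily_minutes=120))  # prints True: misclassified
```

A user misclassified this way would then face the ID-upload demand described above, which is how inference errors turn into identification requirements.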
The privacy risks of widespread age verification are even more serious. Collecting and storing vast numbers of identity documents would create valuable targets for hackers and criminals. In October 2025, attackers breached a third-party verification vendor and exposed government-ID images and personal documents belonging to more than 70,000 users of the chat and media platform Discord. Discord had adopted that verification to comply with the United Kingdom’s Online Safety Act, which had taken effect three months earlier. South Carolina should avoid creating similar vulnerabilities within its borders.
Smaller developers would also face steep compliance burdens because they lack the resources to securely handle large volumes of sensitive data. The result would be higher costs and increased cybersecurity risks without a clear improvement in child safety.
Instead of imposing universal age-verification systems, policymakers should emphasize parental involvement and existing safety tools. Smartphones and operating systems already include features that allow parents to set screen time limits, block certain types of content, and monitor usage. These tools can be adapted to each family’s values and adjusted as children grow older. Encouraging parents to use these features respects privacy and choice while avoiding constitutional problems. South Carolina can also promote digital literacy programs and education campaigns to help families teach responsible online behavior without restricting lawful expression.
SB 268 aims to safeguard minors, but its approach would undermine free speech and privacy while creating new opportunities for data theft. Experience in other jurisdictions shows that such laws fail both legally and practically. Protecting children online is an important goal, but it should be achieved through parental empowerment and education, not through government-mandated surveillance or restrictions on speech.