How The SCOTUS Netchoice Ruling On Social Media Platforms’ First Amendment Rights Could Affect A Host Of Tech Legislation, Including Kids’ Online Safety Laws (Lauren Feiner/The Verge)

In a highly anticipated ruling, the United States Supreme Court handed down its decision in the consolidated NetChoice cases, Moody v. NetChoice, LLC and NetChoice, LLC v. Paxton, on July 1st, holding that social media platforms’ content moderation is expressive activity protected by the First Amendment. The justices were unanimous in the judgment, vacating the lower courts’ rulings and sending the cases back for further review, and the decision has significant implications for the tech industry, particularly in the realm of online safety and children’s welfare.
At issue were laws passed by Florida and Texas that would restrict social media platforms’ ability to moderate user content, barring them from removing or demoting certain posts and requiring individualized explanations for moderation decisions. The states argued that the largest platforms function more like common carriers than publishers, and that the First Amendment does not insulate their moderation choices from regulation.
The Supreme Court rejected this argument. Writing for the majority, Justice Kagan explained that a platform’s curation of its feed, deciding which posts to display, remove, or deprioritize, is an editorial judgment of the kind the First Amendment protects, much as it protects a newspaper’s choices about what to print. The court stopped short of striking down the laws outright, however, remanding the cases because the lower courts had not properly analyzed the facial challenges to them.
The ruling is a significant win for social media platforms, which have long argued that their content moderation decisions are their own protected speech. The court’s decision has left many wondering, however, what it means for the future of tech legislation aimed at promoting online safety and children’s welfare.
One of the most immediate concerns is the impact on the Children’s Online Privacy Protection Act (COPPA), which requires online services to obtain verifiable parental consent before collecting personal data from children under the age of 13. The law has been criticized as difficult to enforce and as failing to keep up with the rapidly changing landscape of online technology.
Other measures may also be affected, such as state age-verification requirements and design codes like California’s Age-Appropriate Design Code Act, which NetChoice has separately challenged on First Amendment grounds. The court’s decision could also shape the development of new legislation aimed at addressing the spread of misinformation and disinformation online.
In response to the ruling, some advocates have called for new, tech-specific legislation that would require social media platforms to take explicit steps to protect children and promote online safety. Others have argued that the ruling is a step backwards for online safety and that the court has failed to recognize the unique risks and challenges posed by social media platforms.
The Supreme Court’s decision in the NetChoice cases is a narrow win for social media platforms, but it leaves many questions unanswered about the future of tech legislation and online safety. As the tech industry continues to evolve and the law struggles to keep pace, it is clear the debate is far from over.
