In the polarized political climate of the United States, one can feel hard-pressed to find a topic that all sides can champion. The Senate’s hearing this past week, in which lawmakers grilled five of the biggest names in social media over kids’ online safety, seemed to do just that. Representatives from Meta, TikTok, Discord, X, and Snapchat were all present, with Meta and TikTok taking the brunt of the Senate Judiciary Committee’s ire for their products’ impact on children. Notably, Mark Zuckerberg of Meta apologized to the families who had been harmed by social media, and Senator Lindsey Graham proclaimed that Zuckerberg had “blood on his hands.” Tensions were high, but the chief executives of Snapchat and X agreed to support the Kids Online Safety Act (KOSA), legislation proposed last year that would require social media platforms to take “reasonable measures” to prevent harm to minors, specifically mentioning sexual exploitation, mental health, substance abuse, and suicide.
The name alone sounds like a positive step. The safety of children truly is a universally agreed-upon issue, and the internet, with its far-reaching corners, catfishing, and countless opportunities for abuse, could surely stand to receive some protective boundaries. Though the aim is laudable, critics say that problems of free speech and, perhaps to a lesser extent, privacy could make passing the act more harmful than finding another way to create safer online interactions for kids.
In a letter entitled “Vote ‘No’ on the Kids Online Safety Act,” the American Civil Liberties Union (ACLU) raises issues like silencing important conversations, limiting minors’ access to potentially vital resources, and violating the First Amendment by imposing a government-mandated content moderation rule. By permitting restrictions to be content-specific, the ACLU claims the act will “chill resources” meant to help with the very issues the legislation hopes to prevent. The internet is not the sole cause of ailments like eating disorders, depression, and anxiety, and limiting online lifelines is unlikely to improve outcomes for people already dealing with these concerns.
The ACLU is only one among over one hundred human rights groups that have come out against the bill, saying it will actually endanger kids, especially those in the LGBTQ+ community and young people who can become pregnant. Evan Greer, director of Fight for the Future, pointed out that any topic lawmakers deem controversial could end up on the chopping block, from flashpoints such as abortion, gender-affirming care, and substance abuse treatment to something as simple as important current events.
Activists also pointed to moments in last week’s hearing that signaled a desire to suppress political content. Senator Lindsey Graham, for example, claimed that TikTok is “being used to basically destroy Israel,” and Senator Marsha Blackburn previously stated that social media “is where children are being indoctrinated” and expressed hope that KOSA would “protect children from the transgender in the culture.”
Opponents of KOSA also take issue with the privacy infringements that would follow from the likely collection of more sensitive data from both children and adults. “Rather than misguided efforts to track every user’s age and identity, we need privacy protections for every American,” said Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project (S.T.O.P.).
Though the Senate Judiciary hearing brought this discussion to the forefront, history suggests that all of this may not amount to much. Executives from Meta alone have testified thirty-three times since 2017 on a range of topics, yet no federal laws have been passed to limit or regulate social media companies. As the ACLU puts it, content-based regulation like KOSA is unconstitutional because it regulates protected speech, and the lack of “a direct causal link” between social media and the harms Congress hopes to prevent would keep the law from surviving strict-scrutiny review in the courts. Harmful speech can be tempting to “regulate away,” but this line of defense is a double-edged sword. When the desire to prevent one type of harm leads to limiting speech, there is always a possibility that regulation will equally limit speech meant to provide resources for the same vulnerable groups of people. Furthermore, the line our legal system has drawn against content-based speech restrictions has not allowed many exceptions. The more exceptions deemed acceptable, the weaker the wall protecting our freedom of speech becomes.
Brynn Story attended the University of North Carolina at Chapel Hill, where she double-majored in Strategic Communication and Media Production with a minor in Global Cinema. She worked for five years in television and film in Los Angeles before attending the UNC School of Law. Since starting law school, Brynn has worked in A24’s legal department and was published in the Winter 2024 issue of the North Carolina Journal of Law and Technology.