Digital "Censorship" Bills Pose Risks to Kids

State lawmakers, beware: bills that limit tech companies' ability to enforce their own rules could make online spaces worse for children and teens.

Lawmakers in states across the country are racing to introduce bills that would limit Big Tech's ability to remove content the companies deem problematic, offensive, or harmful from their sites. However, these proposals would actually undermine the ability of social media platforms and digital apps to create safe and welcoming online environments for young people and families.

The pandemic has made the importance of online communities apparent. Kids use online platforms to connect with family, learn, and play, but these digital playgrounds are difficult for parents to monitor and make it easy for kids to stumble into mature content. In 2020, parents' top three child health concerns were overuse of social media, bullying/cyberbullying, and internet safety.

Young people on social media are regularly exposed to violence, self-harm, profanity, porn, hate speech, and even violent livestreams. Much of this content is not necessarily illegal, which is exactly why companies need the ability to respond to problems themselves. This has become even more evident over the past year. During the pandemic, reports have found that 47% of children and teens have seen content they'd rather avoid, leaving them feeling uncomfortable (29%), scared (23%), and confused (19%). Sixty-one percent of parents whose kids watch YouTube say their child has encountered content they felt was unsuitable for children.

This exposure has consequences. Teens with low social and emotional well-being experience more negative effects from social media than teens with high social and emotional well-being. One report found that 45% of girls say social media makes them feel like they have to look or act a certain way. In short, we can all agree that social media platforms have done a terrible job of moderating content.

Moderation rules can be inconsistent and opaque, and companies should improve how they communicate and enforce their community guidelines. But less moderation isn't the solution. Rather than creating contentious moderation rules that may well limit how tech companies can act to protect young people, Common Sense encourages lawmakers to consider proposals that support digital well-being and encourage the creation of healthy online spaces.

Tech companies could do much right now to improve children's online experiences. Social media companies could act immediately to limit the information they collect from kids and families, and then restrict how they use that data. Doing so would go a long way toward dismantling the divisive and manipulative filter bubbles people are sorted into, but it would also cut into the lucrative business models that fuel the larger problem.

Legislative solutions need to address the heart of the problem. We need bright-line rules around the online content and advertising that's amplified and targeted toward kids. Protections should:

  • Limit children's exposure to unhealthy content through social media and other platforms curated by algorithms.

  • Curb incentives to push inappropriate ads and other disturbing, even illegal, content to kids.

  • Control algorithmic amplification and UX/UI design that undermine users' choices and push negative content.

Allegations about censorship are ultimately a by-product of our opaque online ecosystem, and that is something lawmakers can correct. Common Sense has supported the Kids Internet Design and Safety Act, which would implement these protections. Proposals like these can also go hand in hand with the privacy bills popping up in state after state.

Joe Jerome
Joseph Jerome serves as Director, Platform Accountability and State Advocacy at Common Sense Media, where he focuses on common sense legislative and policy solutions that support kids’ digital well-being. Joseph has worked at the intersection of law and technology, and has written about AR/VR, the privacy implications of big data, data portability policy, trust deficits in the online sharing economy, and emerging technologies in video games. Previously, he was part of the Privacy & Data Project at the Center for Democracy & Technology, an associate in the cybersecurity and privacy practice at WilmerHale, and counsel at the Future of Privacy Forum. He was a fellow with the Internet Law & Policy Foundry and has taught courses on cybersecurity and privacy compliance. Joseph has a J.D. from the New York University School of Law, where he was an International Law and Human Rights Student Fellow.