Opportunities and Challenges in Online Content Management

Common Sense feedback on R Street's project on content regulation

Today's internet makes it easy to endlessly consume ads, media, and user-generated content. Kids face a dizzying array of digital platforms, some overly powerful, some truly toxic, and others just plain confusing to adults and parents. Underlying many of these platforms and services, unfortunately, is a business model designed to engage kids and extract their attention. Teens understand this: They believe these platforms are designed to make them spend more time on their devices and to distract them and their friends. Tech companies claim otherwise, but that claim cannot be reconciled with online reality.

While Common Sense supports laws and regulations that could improve online experiences, such as limits on manipulative design (so-called "dark patterns") and on algorithmic amplification of harmful content to kids, online platforms themselves (video streaming, social media, and gaming apps) are already in the best position to make the biggest changes to protect kids. Plain and simple, we think tech companies should prioritize kids' interests over pure profit. Making simple design changes, or even just providing more information about how online services rank and categorize people, pages, and content, is something the biggest tech companies and social media services can do right now.

Unfortunately, online platforms too often take a laissez-faire approach to their responsibility to promote a safe and healthy online community. That isn't to say one should not support or enable a wide variety of online communities and approaches to content moderation, particularly for adults, but it is rarely, if ever, appropriate for a platform to disclaim responsibility and put the onus entirely on users to protect themselves. This is particularly true when children and teens are involved.

Content management is challenging, but Common Sense believes platforms should take the following approach when it comes to protecting kids online.

  • First, platforms should know where children and teens are on their services. Ongoing debates about operators' knowledge of kids' activity under the Children's Online Privacy Protection Act (COPPA) at the very least confirm that platforms should know their audience. Some platforms are clearly directed to children, but all platforms should honestly assess their user base and recognize when large populations of kids use their products.

  • Second, platforms should embrace a holistic approach that considers the best interests of the child online. Among the considerations identified by the UK Information Commissioner's Office, platforms should consider how they can keep kids safe from exploitation and protect and support kids' overall well-being and developmental needs, while giving parents the ability to meaningfully oversee their children's activity.

Common Sense believes any platform or service that is directed to or ultimately used by children and teens should have the costs of these protections baked into the product or service. Further, the oft-invoked claim that content moderation does not scale is an excuse that prioritizes free expression over safety, good digital citizenship, and other community values, a calculation that must be questioned where young people are concerned.

There are many challenges and opportunities when it comes to managing the huge volume of online content today. Chris Riley, who worked on these issues for Mozilla, has kicked off a project at the R Street Institute to explore the challenges, ambiguities, and issues in how social media platforms, large and small, grapple with the worst and most inappropriate content posted online. Common Sense submitted comments encouraging this effort to take seriously the negative social experiences that children and teens face online. We hope any effort that looks to improve the digital ecosystem considers the interests of younger users, embraces adding friction, and discusses how content moderation practices can be turned into learning experiences that promote good digital citizenship for internet users, young and old.

Joe Jerome
Joseph Jerome serves as Director, Platform Accountability and State Advocacy at Common Sense Media, where he focuses on common sense legislative and policy solutions that support kids’ digital well-being. Joseph has worked at the intersection of law and technology, and has written about AR/VR, the privacy implications of big data, data portability policy, trust deficits in the online sharing economy, and emerging technologies in video games. Previously, he was part of the Privacy & Data Project at the Center for Democracy & Technology, an associate in the cybersecurity and privacy practice at WilmerHale, and counsel at the Future of Privacy Forum. He was a fellow with the Internet Law & Policy Foundry and has taught courses on cybersecurity and privacy compliance. Joseph has a J.D. from the New York University School of Law, where he was an International Law and Human Rights Student Fellow.