
KOSA: More Spies and New Censors for Kids and Adults

By Heidi Boghosian

[Image: child on a laptop. Image credit: N-region]

A new bill designed to strengthen kids’ privacy online might have the opposite effect. The Kids Online Safety Act of 2022 (KOSA) could expose users to heightened surveillance and data collection while also leading to digital content censorship. Increases in surveillance and content cutbacks will affect not just kids but adults as well.

Introduced by Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), the bill is intended to enhance kids’ data privacy and to modernize the Children’s Online Privacy Protection Act of 1998 (COPPA). COPPA was designed to give parents control over the online collection, use, or disclosure of their children’s personal data.

Censorship through Digital Filtering

KOSA is poised to become a high-octane censorship engine. It turns technology companies and parents into gatekeepers of young users’ online activities to prevent their seeing content the government defines as “not in their best interest.” One problem is that the law gives tech platforms discretion to interpret this overly broad term. They may err on the side of blocking benign information.

That’s because the law creates a duty of care for platforms to prevent or mitigate certain dangers to teens in their design and operation of products. Platforms face the specter of legal challenges if teens come across content promoting “self-harm, suicide, eating disorders, substance abuse, and other matters that pose a risk to physical and mental health of a minor.”

Censorship concerns aside, the law isn’t the way to keep kids safe, and it could make some problems worse. For instance, after Tumblr and Pinterest banned self-harm blogs, including those promoting anorexia, a study in Perspectives in Public Health found that the bans created a more impenetrable wall between people with eating disorders and the health professionals trying to help them. Blocking certain content would also prevent teens from accessing information on topics they often wish to explore without their parents’ knowledge, including sexual health, gender identity, eating disorders, and substance abuse.

KOSA would also allow individual state attorneys general to bring actions against platforms if they deem state residents are “threatened or adversely affected by the engagement of any person in a practice that violates this Act.” That gives politically motivated officials wide latitude to decide which topics pose a risk to the mental or physical health of a minor. States with high rates of teen pregnancy, such as Arizona, Mississippi, Texas, Florida, and Arkansas, might deem information about reproductive rights harmful. Tech platforms could then restrict access to that vital information, even though most parents (93 percent, according to a 2017 survey) support comprehensive sex education in schools.

Increased Tracking and Data Collection

Platforms will face challenges trying to comply with KOSA. One dilemma is whether to implement age-verification systems or simply block certain content for all users. If a fifteen-year-old falsely identifies as an adult, the site could still be held legally liable; it is also liable if that same user truthfully identifies as a minor. That’s because websites must “reasonably know” a user’s age.

If platforms institute age-verification systems, all users will have to submit personal data, and user tracking will increase across the board. That would feed the already outsized data-aggregation industry, which profits from reselling private data and using it to serve targeted advertisements. KOSA also requires parental consent when kids create accounts, and it requires providers to give parents the ability to toggle privacy settings on any service their child uses, which means disclosing which sites and online services teens are using. That is a significant privacy violation in itself.

End-to-end encrypted messaging services like iMessage, Signal, and WhatsApp could fall under KOSA’s definition of covered platforms. To avoid litigation, they may feel pressured to eliminate encryption in order to satisfy a perceived duty to monitor minors’ communications. Such a move would significantly undermine privacy for their vast user bases, and it would result in more data being gathered from users who sought out these channels precisely for their privacy.

KOSA’s requirements that platforms hide specific types of content and that they track and log other content through parental tools would force them to intensify surveillance of all user activity.

Enforcing COPPA Is a Better Approach

Children’s rights advocacy groups have had success in pressuring the Federal Trade Commission (FTC) to hold two companies accountable for violations of the existing privacy law, COPPA.

In 2018, a coalition spearheaded by Fairplay and the Center for Digital Democracy (CDD) alerted the FTC to COPPA violations by Google and YouTube. This led to a landmark 2019 settlement in which the companies were fined a record $170 million for gathering personal data from children without obtaining parental consent.

In August 2023, both groups again approached the FTC, filing a Request for Investigation alleging that Google and YouTube were breaching COPPA, their 2019 settlement agreement, and the FTC Act. Research conducted by Adalytics and Fairplay indicates that Google serves personalized ads on YouTube’s “made for kids” videos and transmits viewer data to data brokers and ad tech firms, while making misleading claims about its targeting of children, all in violation of COPPA. Joining Fairplay and CDD were Common Sense Media and the Electronic Privacy Information Center. They urged the Commission to investigate and sanction Google for violating children’s privacy, suggesting that the FTC seek penalties of up to tens of billions of dollars.

Continued and consistent enforcement of COPPA offers a better approach than KOSA to protect children and teens from online surveillance. By avoiding increased surveillance and more data aggregation from new age-verification systems, and without the specter of pervasive content censorship, we can ensure a safer and freer online experience for everyone.


About the Author 

Heidi Boghosian is an attorney and co-host of Law & Disorder Radio. She is executive director of the A.J. Muste Institute, a charitable foundation supporting activist organizations, and was previously executive director of the National Lawyers Guild. Boghosian has written numerous articles and reports on policing and activism, and is the author of Spying on Democracy: Government Surveillance, Corporate Power, and Public Resistance and “I Have Nothing to Hide”: And 20 Other Myths About Surveillance and Privacy. Connect with her online at heidiboghosian.com and on Twitter (@HeidiBoghosian).
