Children’s online safety took center stage last summer when the Senate passed the Children and Teens’ Online Privacy Protection Act (“COPPA 2.0”) and the Kids Online Safety Act (“KOSA”). Should they pass the House and be signed into law, the measures proposed in the bills could drastically change how children and their parents interact with the internet and social media.
Background on COPPA and the COPPA Rule
Internet use by children was on the rise heading into the 2000s. With that increased use, more online content and websites became available to and targeted children. In 1998, Congress passed the first internet safety legislation aimed at protecting children: the Children’s Online Privacy Protection Act (“COPPA”), which established rules for the use and collection of online data from children under the age of 13. The law gave the Federal Trade Commission (the “FTC”) more power to oversee and regulate the protection of children’s data in hopes of providing them more privacy. Under the FTC’s Children’s Online Privacy Protection Rule (“COPPA Rule” or “Rule”), a federal regulation enacted in 2000, operators of websites or online services directed to children under 13 must give notice to parents and obtain verifiable consent before collecting, using, or disclosing the personal information of their minor users. This consent requirement also extends to website operators and online service providers who have actual knowledge that they are collecting such personal information, not just those online services directed at children. The Rule also prohibits operators from requiring more personal information than is reasonably necessary as a condition of children’s use of, access to, and participation in online activities.

Even with the Rule in place, it is undeniable that the internet of the 1990s and early 2000s was a wildly different environment than the internet of today. Over the last twenty years, e-commerce and social media platforms drove a surge in internet use, and the prevalence of mobile devices and apps made accessing it easier than ever before. After initiating its first review of the COPPA Rule in 2011, the FTC adopted final amendments that strengthened existing protections and sought to keep pace with technological advancements.
These amendments went into effect in 2013 and modified definitions, data security protections, data retention and deletion requirements, and third-party obligations. Among the amendments, the definition of “personal information” that cannot be collected without parental notice and consent was broadened to include information like geolocation, photographs, and videos. This was a critical development because, in addition to personal information such as names, addresses, and telephone numbers that could directly identify users, the Rule now accounted for other information that could identify users indirectly. The FTC also extended the Rule to cover “persistent identifiers” such as IP addresses and device IDs, which can be used to recognize users over time and across different websites.
Calls for Proposed Change in 2019
In 2019, the FTC initiated its most recent review of the COPPA Rule and requested public comment on whether changes to the Rule were needed. After thousands of public comments, the FTC proposed several changes.
In one of the most significant proposed changes, the FTC would expand the definition of “personal information” to include biometric identifiers “that can be used for the automated or semi-automated recognition of an individual, including fingerprints or handprints; retina and iris patterns; genetic data, including a DNA sequence; or data derived from voice data, gait data, or facial data.” After considering public comment, the FTC stated that, “The Commission believes that, as with a photograph, video, or audio file containing a child’s image or voice, biometric data is inherently personal in nature…the privacy interest in protecting such data is a strong one.” The FTC also proposed regulations that would affect the use of educational technology (“EdTech”), an area ripe for the collection of children’s personal information. Schools would be allowed to authorize EdTech providers to collect and use students’ personal information, but only for educational purposes authorized by the school. The EdTech providers would not be permitted to use the information for any commercial purpose.
While the process to finalize the amendments is still underway, if finalized, this newly amended COPPA Rule would be a large step toward improving children’s online privacy and keeping pace with evolving technologies.
The Children and Teens’ Online Privacy Act (COPPA 2.0)
Last July, new legislation aimed at drastically changing children’s online privacy made headway after passing the Senate. This bill, called the Children and Teens’ Online Privacy Protection Act (COPPA 2.0), not only strengthens safety precautions for children online, but for the first time acknowledges that teens—not just children under 13—should be protected as well. The bill is grounded in the frequency and prevalence of internet use by children and teens and acknowledges concerns about children’s wellbeing stemming from social media use. In the face of worries about algorithms pushing harmful content to children and teens, the bill focuses on Big Tech (technology companies including Apple, Amazon, Meta, Microsoft, Google, and others) and its role in promoting enticing yet harmful content. COPPA 2.0 points to Big Tech’s use of data to capture children’s attention and keep them online, often at the expense of their mental health.
In 2023, U.S. Surgeon General Dr. Vivek Murthy released an advisory on social media and youth mental health, calling attention to the potential harms from social media use on children and teens. The report highlighted recent research showing that adolescents who spend more than three hours a day on social media face increased risks of anxiety and depression. Further, the report called attention to the risks social media poses specifically to adolescent girls. Low self-esteem, body dissatisfaction, and disordered eating were just some of the concerns brought to the forefront in the Surgeon General’s advisory. Notably, the Surgeon General had previously called on tech companies to take responsibility for creating safer online environments for children in an earlier advisory issued in 2021. All of this lends further support to implementing updated policies around child and teen engagement with internet-based activities.
If signed into law, COPPA 2.0 would create massive changes in the online privacy landscape that would simultaneously strengthen the online privacy rights of children and parents. COPPA 2.0 would:
- Build on the original 1998 COPPA legislation by prohibiting internet companies from collecting personal information from users who are 13 to 16 years old without their consent;
- Ban targeted advertising to children and teens;
- Revise COPPA’s “actual knowledge” standard, under which technology providers can hide behind willful ignorance of children’s use of their platforms, so that platforms “reasonably likely to be used” by children must abide by COPPA 2.0’s standards and protect users who are “reasonably likely to be” children or minors;
- Require companies offering their platforms to children and teens to create an “Eraser Button” that lets parents and kids delete a child or teen’s personal information when technologically feasible;
- Establish a “Digital Marketing Bill of Rights for Teens” which limits the collection of personal information of teens; and
- Establish a Youth Marketing and Privacy Division at the FTC.
Along with strengthening parental and children’s rights online, many proposed measures under COPPA 2.0 place greater responsibility on internet companies. Steps such as outright bans on targeted advertising and revising the “actual knowledge” standard would require companies to take a more proactive role in protecting the privacy of children and teens. Legislators, parents, and children’s advocacy groups have long argued that the power to curb children’s online privacy issues lies primarily with internet companies, and COPPA 2.0 reflects that.
Kids Online Safety Act
Accompanying COPPA 2.0, the Senate also passed a bill entitled the Kids Online Safety Act (“KOSA”). The bill, which targets social media platforms, online video games, streaming services, and social messaging applications, creates new obligations for the companies providing these services. Under KOSA, social media platforms would be required to default to the strongest privacy setting for kids while providing additional protections including the ability to disable addictive features and opt out of personalized recommendations. Most notably, KOSA would create a duty for online platforms to prevent specific harms by restricting the promotion of content related to eating disorders, suicide, sexual exploitation, and substance abuse. As with COPPA 2.0, KOSA could put more responsibility on internet companies and technology platforms to protect children.
Practical Implications
COPPA 2.0 and KOSA largely target Big Tech platforms, but stricter requirements for parental notice and consent stand to affect smaller companies as well. These more stringent restrictions, however, will not take effect unless COPPA 2.0 and KOSA are made into law, and an amended COPPA Rule is finalized.