On April 26, 2023, a bipartisan group of federal lawmakers introduced the Protecting Kids on Social Media Act, new legislation aimed at enhancing protections for children under eighteen on social media. The Act follows a trend of heightened focus on children's online privacy among both federal and state lawmakers. However, despite multiple attempts in recent years to pass federal legislation updating the Children's Online Privacy Protection Act (COPPA), none has gained traction. This stagnation at the federal level has led state legislatures to pass their own laws, rapidly creating a patchwork of regulations that poses challenges for social media platforms and may have unintended consequences for children.
The Protecting Kids on Social Media Act primarily focuses on regulating the relationship between young people and social media. The Act defines "social media platform" broadly but excludes platforms not intended for media content sharing, like business transaction sites, teleconferencing platforms, and email, among others. It sets a minimum age of thirteen for social media use and requires parental consent for teens who wish to use social media. Additionally, it bans platforms from using algorithms to recommend content to users under eighteen.
The Act introduces a strict age-verification requirement: social media platforms must verify the age of all account holders. It proposes a novel program under the Secretary of Commerce that would verify an individual's age by checking their identity and issuing "secure digital identification credentials." This identity check would require users either to upload their ID card or to consent to the use of other means of age verification, such as state DMV data, IRS records, or Social Security Administration records. Notably, this approach aims to mitigate the privacy and data security risks associated with users uploading identity documents to each social media company's own servers.
Parental Consent and Algorithmic Recommendations
In line with laws recently passed in Utah and Arkansas, the Act would require parental consent for users under eighteen to create a social media account and prohibit children under thirteen from using a social media platform unless no data is collected from these individuals. Furthermore, the Act would prohibit social media platforms from using the personal data of users under eighteen for algorithmic recommendation systems. However, an exception is carved out for context-based advertising or recommendations.
Enforcement
The Federal Trade Commission and state attorneys general are granted authority to enforce all provisions of the Act, aligning with how COPPA is currently enforced. Any violations of the Act would be considered violations of the Federal Trade Commission Act and could result in significant civil penalties.
Broader Context and Implications
The introduction of the Protecting Kids on Social Media Act comes as part of a broader focus on children's privacy. At the state level, several laws have recently been passed. In March, Utah became the first state to limit young people's access to social media; Arkansas followed suit in April, requiring parental consent and age verification on social media platforms. Other states, including Texas, Ohio, New Jersey, and Louisiana, have pending legislation on similar issues. This flurry of activity follows California's passage of the California Age-Appropriate Design Code, which has prompted companies offering online services likely to be accessed by children to rethink their approach to children's privacy.
It is uncertain whether these federal proposals will advance or if more focused efforts at the state level will continue to gain traction. However, it is expected that children's privacy, being a bipartisan issue, will remain at the forefront of federal privacy legislation discussions. Amidst the stalled progress on comprehensive privacy legislation, children's privacy appears to be one of the most likely areas for compromise and actual lawmaking.
Given this focus on children's privacy, companies that have a significant number of teen users or know that there are teen users on their platforms should seriously consider whether they offer adequate protections to those users and options for age verification. The issues raised in federal proposals and state laws are complex, but there will likely be federal changes around regulating teens and children online in the near future, requiring a thoughtful approach from companies.