The Kids Online Safety Act (KOSA) responds to the escalating youth mental health crisis and its ties to harmful online content and addictive social media algorithms. Given the rising rates of depression, anxiety, and loneliness among children in the U.S., KOSA aims to establish a legal framework that promotes a healthier online ecosystem for youth.
KOSA: An Introduction
For clarity, it is essential to differentiate between two important technology policies: the Kids Online Safety Act (KOSA) and the Children’s Online Privacy Protection Act (COPPA), which are often considered together because they passed the Senate concurrently. This brief focuses specifically on KOSA. COPPA primarily addresses privacy through provisions such as prohibiting the collection of personal data from children under 13 without parental consent, whereas KOSA takes a different approach to online safety, one that emphasizes the mental health impacts of social media use.
In a time of increasing legislative partisanship, KOSA is a strongly bipartisan proposal with 72 Senate co-sponsors and the support of President Biden. Despite passing the Senate 91-3, however, KOSA now faces resistance from House GOP members over potential censorship issues, making it unlikely that the bill will advance in its current form. This opposition underscores the challenge of balancing online safety regulations with First Amendment rights.
KOSA: The Youth Mental Health Crisis
Research suggests that youth today struggle with mental illness at higher rates than past generations. From 2007 to 2018, the suicide rate among Americans ages 10-24 increased by nearly 60%. According to the CDC, three in five teen girls report feeling “persistently sad or hopeless,” and 30% have seriously contemplated suicide. One study links increased social media use to higher levels of anxiety, depression, and self-doubt. Moreover, an NIH study found that adolescents’ risk of depression rose by 13% for each additional hour of social media use.
Modern social media algorithms are intentionally addictive, promoting compulsive use. Congress defines compulsive use as any behavior driven by external stimuli that causes individuals to engage in repetitive behavior reasonably likely to cause psychological distress, loss of control, anxiety, or depression. Algorithms also often recommend inappropriate or harmful content. For example, as teens click on advertisements for weight-loss products, their feeds can become dominated by such content, negatively affecting body image. In one study, 32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.
KOSA’s Major Provisions
KOSA emerged in response to the research linking youth mental health struggles to social media use. The bill aims to protect children’s mental and emotional safety online by:
- Requiring platforms to modify their content moderation policies to reduce major harms to minors, such as the onset of mental illness, sexual exploitation, and the sale of illicit drugs on their platforms.
- Giving minors the tools to restrict the collection and visibility of private information.
- Allowing minors to opt out of addictive algorithmic recommendations.
- Requiring platforms to default to the strongest safety settings for child users.
- Holding online platforms accountable through annual independent audits.
Arguments in Favor of KOSA
Proponents of the Kids Online Safety Act argue that the bill is crucial for protecting youth from mental health harms. Pediatricians across the nation have spoken out about the urgent need to create a healthier digital environment for children and adolescents. Additionally, the U.S. Surgeon General released an advisory stating that social media carries a “profound risk of harm” to youth, and called on Congress to require warning labels on social media platforms, similar to those on cigarette packaging, alerting users to their effects on young people’s lives. This emphasis on mental health is central to the arguments in favor of KOSA.
Autonomy is also a key argument in the discussion surrounding KOSA. Addictive algorithms keep kids endlessly scrolling and seeking dopamine hits from follower engagement, and there is currently no mechanism for opting out of recommendation algorithms. KOSA aims to change this by requiring social media platforms to provide parental controls and options to disable features like algorithmic recommendations. Supporters argue that this would give parents and kids more autonomy over their online experiences, including greater control over how much time they spend online.
Supporters also argue that KOSA is an important step toward holding companies accountable for the harm they cause. They emphasize that Big Tech companies have designed social media to maximize profit by exploiting children’s developmental vulnerabilities for commercial gain. After years of operating with few regulations, tech firms face increasing scrutiny for alleged harms to young users, exposed by whistleblower revelations. Frances Haugen, a former Facebook employee, disclosed tens of thousands of pages of internal Facebook documents to Congress and the Securities and Exchange Commission showing that Facebook was aware of its products’ negative impact on teens’ mental health. Microsoft, Snap, and X have all endorsed KOSA, likely in recognition that profits can still be made while taking reasonable steps to protect children.
Arguments Against KOSA
Critics of KOSA express concerns about potential censorship of online content. On one hand, many conservatives argue that granting the Federal Trade Commission (FTC) censorship authority could lead to the suppression of legitimate speech. For example, an FTC led by a Republican might deem content discussing LGBTQ+ lives, reproductive health, or climate change harmful to youth, while one led by a Democrat might censor discussions of automatic weapons, shootings, or religious viewpoints on the LGBTQ+ community. Critics fear KOSA could be enforced for political purposes.
On the other hand, many liberal legislators are concerned about KOSA’s Duty of Care provision, which requires companies to “prevent and mitigate” harms to children such as bullying, violence, and negative mental health impacts. They worry that this provision could lead to the censorship of LGBTQ+ and reproductive health content as companies over-filter to avoid legal repercussions. In September 2022, the Heritage Foundation seemingly endorsed KOSA in an editorial, praising the Duty of Care provision’s potential to limit what it claims is Big Tech’s influence on children’s gender identities. The Heritage Foundation later expressed intentions to use similar bills to limit transgender content online, fueling concerns about KOSA’s potential dangers for the LGBTQ+ community.
The Duty of Care provision has been revised since its initial introduction. Originally, KOSA enumerated specific mental health conditions that companies were required to mitigate; after revisions, the emphasis shifted to preventing the promotion of inherently dangerous behaviors. Despite these changes, critics maintain that the provision remains too broad and vague, potentially leading to censorship of crucial information. Although KOSA includes a limiting principle stating that nothing in the Duty of Care prevents a minor from deliberately searching for a specific type of content, companies may still over-censor to avoid compliance risk.
Another significant concern is that KOSA could be counterproductive, potentially increasing the risks of online harm to children. Critics argue that by restricting access to lawful speech on topics such as addiction and bullying, KOSA may hinder minors from finding supportive online communities and feeling comfortable in their identities.
Conclusion
KOSA aims to address the rising youth mental health crisis by holding platforms accountable for their negative mental health impacts and by enhancing parents’ and young users’ autonomy over their online experiences. While supporters believe the bill will create a safer digital environment, concerns about potential censorship and the implications of the Duty of Care provision underscore the complexities of balancing safety with free speech. This balance presents a continuing challenge as lawmakers debate the future of the bill.