Last month, Senators Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) reintroduced the Kids Online Safety Act (KOSA), initially introduced last term, noting that the bill now has 62 bipartisan cosponsors and is poised to pass in the Senate.

KOSA would apply to online platforms (including social media services and virtual reality environments), online video games, messaging applications, and video streaming services that are used, or are reasonably likely to be used, by an individual under 17 years of age, subject to enumerated exceptions.

Below we discuss some of KOSA’s key requirements, including notable changes in the most recent version of the bill, as well as in the incorporated Filter Bubble Transparency Act.

Key Requirements

KOSA’s requirements fall into four main categories: (1) product safeguards that all covered platforms must implement; (2) a duty of care to take additional measures as needed to address specified online harms; (3) transparency reporting on harm mitigation, including a third-party audit; and (4) algorithmic transparency. Below is a brief snapshot of each category.

  • Product safeguards. Covered platforms would need to implement a number of safeguards, including the following:
    • Limits on who is able to communicate with minors.
    • Restrictions on viewing minors’ personal data.
    • Limits on features that increase, extend, or result in compulsive use of the platform.
    • Controls for personalized recommendation systems.
    • Restrictions on sharing of minors’ geolocation.
    • Easy-to-use options for account and data deletion.
    • Easy-to-use options to limit the amount of time the minor spends on the platform.
    • Default privacy settings for minors.
    • Parental tools.
    • A reporting mechanism for harms to minors.
  • Duty of care. In addition to the product safeguards, KOSA would also impose a duty of care on covered platforms to prevent and mitigate specified harms to minors in connection with features that encourage or increase time and activity on the platform (e.g., infinite scroll, autoplay, notifications, personalized recommendation systems). The specified harms include mental health disorders, addiction-like behaviors, and online bullying.
  • Transparency obligations. Platforms that meet heightened threshold requirements would be required to issue a report, at least annually, that (1) describes the foreseeable risks of material harm to minors and (2) assesses the prevention and mitigation measures taken to address those risks, based on a third-party audit.
  • Algorithmic transparency requirements. The Filter Bubble Transparency Act was incorporated into KOSA in July 2023. Covered services would need to provide notice and user controls in connection with “opaque algorithms,” defined as algorithms that rank information based on user-specific data the user did not expressly provide for that purpose. Covered services would need to disclose how their content feeds are selected, sorted, and prioritized, and provide an option to switch between the “opaque algorithm” and an “input-transparent algorithm.” Notably, for this section of the bill, the “online platform” scoping standard is defined more broadly and includes any public-facing website, internet application, or mobile app (including social networking sites, video sharing services, search engines, and content aggregation services), subject to exclusions.
  • Enforcement. The FTC would be the primary enforcement authority. Under the amended bill, state attorneys general would also have limited powers to enforce certain violations.

Next Steps

While momentum and bipartisan support for the bill are building in the Senate, it is unclear whether the bill would pass in the House, where it has not been introduced. If the bill were to pass and be signed into law, it would take effect 18 months after enactment.

Providers that are developing compliance strategies for the various children’s online safety laws being enacted in U.S. states and abroad should continue to track KOSA over the coming months. The UK Online Safety Act and teen privacy and safety laws adopted in the EU and United States impose similar obligations.