Introduction

Maryland’s Enactment of the Age-Appropriate Design Code Act

The Maryland Age-Appropriate Design Code Act (SB 571 / HB 603) (MD AADC) was signed into law on May 9, 2024, with an October 1, 2024, effective date. The law is the second of its kind in the United States, following the California Age-Appropriate Design Code Act (CA AADC), which was passed in 2022 and is currently enjoined on constitutional grounds pending appeal in the U.S. Court of Appeals for the Ninth Circuit. Like the CA AADC (and the U.K.’s AADC), the MD AADC imposes privacy and safety requirements for children under age 18. Notably, the MD AADC also includes changes seemingly designed to survive constitutional challenges under U.S. law. We have outlined the major differences between the two U.S. AADCs below.

Kids Online Safety Act Gains Momentum in the Senate

Last month, Senators Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) reintroduced the Kids Online Safety Act (KOSA), first introduced last term, noting that the bill now has 62 cosponsors, enjoys bipartisan support, and is poised to pass in the Senate.

KOSA would apply to online platforms (including social media services and virtual reality environments), online video games, messaging applications, and video streaming services that are used, or are reasonably likely to be used, by an individual under 17 years of age, subject to enumerated exceptions.

Below we discuss some of KOSA’s key requirements, including notable changes in the most recent version of the bill, as well as in the incorporated Filter Bubble Transparency Act.

Safety risk assessments are becoming a preferred regulatory tool around the world. Online safety laws in Australia, Ireland, the United Kingdom, and the United States will require a range of providers to evaluate the safety and user-generated content risks associated with their online services.

While the specific assessment requirements vary across jurisdictions, the common thread …

California Law Requires Platforms To Take More Action Against Child Sexual Exploitation

Overview

California Governor Gavin Newsom recently signed AB 1394, a law that imposes new obligations on social media platforms to prevent and combat child sexual abuse and exploitation. The law is scheduled to take effect on January 1, 2025, and imposes two primary requirements on social media platforms: (1) a notice-and-staydown requirement for child sexual abuse material (CSAM), and (2) a prohibition against “knowingly facilitat[ing], aid[ing], or abett[ing] commercial sexual exploitation,” as defined by the statute. If a social media company violates the law, it may be liable to the reporting user for actual damages sustained and statutory damages of up to $250,000 per violation.

The law also allows a civil action to be brought by, or on behalf of, a minor who is a victim of commercial sexual exploitation; in such actions, social media platforms may face damages of up to $4 million per violation. The law includes a safe harbor provision for platforms that conduct safety audits.

Generative AI: How Existing Regulation May Apply to AI-Generated Harmful Content

Among the many open questions about large language models (LLMs) and generative artificial intelligence (AI) are the legal risks that may result from AI-generated content. While AI-specific regulation remains pending and continues to develop in jurisdictions around the world, the following article provides a high-level summary of illegal and harmful content risks under existing law, as well as mitigations that companies may wish to consider when developing baseline models and consumer-facing generative AI tools. (For copyright and intellectual property (IP)-related issues, see Perkins Coie Updates.)

Federal Courts Preliminarily Enjoin Arkansas Social Media Safety Act and California Age-Appropriate Design Code

After a flurry of legislative activity across the United States related to kids’ privacy and safety online, federal courts in Arkansas and California have, in recent weeks, enjoined two notable state laws. A federal court in Arkansas preliminarily enjoined the Arkansas Social Media Safety Act (AR SMSA) on August 31, the day before the statute was scheduled to take effect for in-scope social media platforms. The U.S. District Court for the Western District of Arkansas found that the plaintiff, NetChoice, LLC, is likely to succeed on the merits of its constitutional challenges.

Less than three weeks later, on September 18, the U.S. District Court for the Northern District of California also preliminarily enjoined California’s Age-Appropriate Design Code (CA AADC), holding that NetChoice is likely to succeed in showing that 10 CA AADC requirements violate the First Amendment.

Global Online Safety Regulators Issue Statement on Human Rights and Online Safety Regulation

The Global Online Safety Regulators Network (Network) issued a position statement on human rights and online safety regulation on September 13, 2023.

The Network is intended to facilitate a coherent international approach to online safety regulation by enabling online safety regulators to share insights, experience, and best practices. The current Network members include the eSafety Commissioner (Australia), Coimisiún na Meán (Ireland), the Film and Publication Board (South Africa), the Korea Communications Standards Commission (Republic of Korea), the Online Safety Commission (Fiji), and Ofcom (UK).

Florida Enacts “Digital Bill of Rights” Combining Narrowly Applicable “Comprehensive” Privacy Provisions and More Broadly Applicable Restrictions on Children’s Privacy and Social Media

On June 6, 2023, Florida Governor Ron DeSantis signed Senate Bill 262 into law. SB 262 departs from the comprehensive privacy laws enacted by other states in several respects, including its (1) ban on government-directed moderation of social media, (2) restrictions on online interactions with minors (somewhat akin to the California Age-Appropriate Design Code), and (3) establishment of a “digital bill of rights” that creates general consumer privacy rights similar in many respects to those adopted in other states but, unlike them, narrowly applicable. Governor DeSantis has not shied away from saying the new law is aimed directly at “Big Tech,” and the targeted application of certain aspects of the law reflects that goal.

The ban on government-directed moderation took effect on July 1, 2023, with the protections for minors and digital bill of rights provisions set to take effect on July 1, 2024.