Last month, Senators Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) reintroduced the Kids Online Safety Act (KOSA), first introduced last term, noting that the bill now has 62 cosponsors, enjoys bipartisan support, and is poised to pass in the Senate.

KOSA would apply to online platforms (including social media services and virtual reality environments), online video games, messaging applications, and video streaming services that are used, or are reasonably likely to be used, by an individual under 17 years of age, subject to enumerated exceptions.

Below we discuss some of KOSA’s key requirements, including notable changes in the most recent version of the bill, as well as in the incorporated Filter Bubble Transparency Act.

Safety risk assessments are becoming a preferred regulatory tool around the world. Online safety laws in Australia, Ireland, the United Kingdom, and the United States will require a range of providers to evaluate the safety and user-generated content risks associated with their online services.

While the specific assessment requirements vary across jurisdictions, the common thread is an obligation for providers to proactively identify, evaluate, and mitigate the safety risks associated with their services.

Overview

California Governor Gavin Newsom recently signed AB 1394, a law that imposes new obligations on social media platforms to prevent and combat child sexual abuse and exploitation. The law is scheduled to take effect on January 1, 2025, and imposes two primary requirements on social media platforms: (1) a notice-and-staydown requirement for child sexual abuse material (CSAM) and (2) a prohibition against “knowingly facilitat[ing], aid[ing], or abett[ing] commercial sexual exploitation,” as defined by the statute. If a social media company violates the law, it may be liable to the reporting user for actual damages sustained and statutory damages of up to $250,000 per violation.

The law allows for a civil action to be brought by, or on behalf of, a person who is a minor and a victim of commercial sexual exploitation. The law includes a safe harbor provision for platforms that conduct safety audits. Social media platforms may face damages of up to $4 million per violation.

Among the many open questions about large-language models (LLMs) and generative artificial intelligence (AI) are the legal risks that may result from AI-generated content. While AI-specific regulation remains pending and continues to develop in jurisdictions around the world, the following article provides a high-level summary of illegal and harmful content risks under existing law, as well as mitigations that companies may wish to consider when developing baseline models and consumer-facing generative AI tools. (For copyright and intellectual property (IP)-related issues, see Perkins Coie Updates.)

After a flurry of legislative activity across the United States related to kids’ privacy and safety online, federal courts in Arkansas and California have in recent weeks enjoined two notable state laws. A federal court in Arkansas preliminarily enjoined the Arkansas Social Media Safety Act (AR SMSA) on August 31, the day before the statute was scheduled to take effect for social media platforms in scope. The U.S. District Court for the Western District of Arkansas found that the plaintiff, NetChoice, LLC, is likely to succeed on the merits of its constitutional challenges.

Less than three weeks later, on September 18, the U.S. District Court for the Northern District of California also preliminarily enjoined California’s Age-Appropriate Design Code (CA AADC), holding that NetChoice is likely to succeed in showing that 10 CA AADC requirements violate the First Amendment.

The UK Online Safety Bill was passed by Parliament earlier this week and is expected to become law soon through royal assent. The Online Safety Act (UK OSA) will impose a series of sweeping obligations, including risk assessment, content moderation, and age assurance requirements, on a variety of online services that enable user-generated content, including but not limited to social media and search providers.

Among the most notable aspects of the UK OSA are its “duties of care.” The law will impose a series of affirmative obligations to assess and mitigate safety risks.

The Global Online Safety Regulators Network (Network) issued a position statement on human rights and online safety regulation on September 13, 2023.

The Network is intended to facilitate a coherent international approach to online safety regulation by enabling online safety regulators to share insights, experience, and best practices. The current Network members include the eSafety Commissioner (Australia), Coimisiún na Meán (Ireland), the Film and Publication Board (South Africa), the Korea Communications Standards Commission (Republic of Korea), the Online Safety Commission (Fiji), and Ofcom (UK).

Following the Council of the European Union’s approval last week, the Digital Services Act (DSA) has been officially adopted, starting the countdown to the law’s entry into force later this year. The DSA builds on the Electronic Commerce Directive 2000 (e-Commerce Directive) and regulates the obligations of digital services that act as intermediaries in connecting consumers with third-party goods, services, and content.