
Noah Bialos advises clients on digital safety, platform regulation, risk governance, and human rights.

Safety risk assessments are becoming a preferred regulatory tool around the world. Online safety laws in Australia, Ireland, the United Kingdom, and the United States will require a range of providers to evaluate the safety and user-generated content risks associated with their online services.

While the specific assessment requirements vary across jurisdictions, the common thread is an affirmative obligation for providers to identify, assess, and mitigate the safety risks their services may pose to users.

Overview

California Governor Gavin Newsom recently signed AB 1394, a law that imposes new obligations on social media platforms to prevent and combat child sexual abuse and exploitation. The law is scheduled to take effect on January 1, 2025, and imposes two primary requirements on social media platforms (SMPs): (1) a notice-and-staydown requirement for child sexual abuse material (CSAM) and (2) a prohibition against "knowingly facilitat[ing], aid[ing], or abett[ing] commercial sexual exploitation," as defined by the statute. An SMP that violates the law may be liable to the reporting user for actual damages sustained and statutory damages of up to $250,000 per violation.

The law allows a civil action to be brought by, or on behalf of, a minor who is a victim of commercial sexual exploitation, and it includes a safe harbor provision for platforms that conduct safety audits. Social media platforms may face damages of up to $4 million per violation.

Continue Reading: California Law Requires Platforms To Take More Action Against Child Sexual Exploitation

Among the many open questions about large language models (LLMs) and generative artificial intelligence (AI) are the legal risks that may result from AI-generated content. While AI-specific regulation remains pending and continues to develop in jurisdictions around the world, the following article provides a high-level summary of illegal and harmful content risks under existing law, as well as mitigations that companies may wish to consider when developing baseline models and consumer-facing generative AI tools. (For copyright and intellectual property (IP)-related issues, see Perkins Coie Updates.)

Continue Reading: Generative AI: How Existing Regulation May Apply to AI-Generated Harmful Content

After a flurry of state legislative activity across the United States related to kids' privacy and safety online, federal courts in Arkansas and California have, in recent weeks, enjoined two notable state laws. A federal court in Arkansas preliminarily enjoined the Arkansas Social Media Safety Act (AR SMSA) on August 31, the day before the statute was scheduled to take effect for in-scope social media platforms. The U.S. District Court for the Western District of Arkansas found that the plaintiff, NetChoice, LLC, is likely to succeed on the merits of its constitutional challenges.

Less than three weeks later, on September 18, the U.S. District Court for the Northern District of California also preliminarily enjoined California's Age-Appropriate Design Code (CA AADC), holding that NetChoice is likely to succeed in showing that 10 CA AADC requirements violate the First Amendment.

Continue Reading: Federal Courts Preliminarily Enjoin Arkansas Social Media Safety Act and California Age-Appropriate Design Code

The UK Online Safety Bill was passed by Parliament earlier this week and is expected to soon become law through royal assent. The Online Safety Act (UK OSA) will impose a series of sweeping obligations, including risk assessment, content moderation, and age assurance requirements, on a variety of online services that enable user-generated content, including but not limited to social media and search providers.

Among the most notable aspects of the UK OSA are its "duties of care." The law will impose a series of affirmative obligations to assess and mitigate safety risks.

Continue Reading: UK Parliament Passes a Sweeping and Controversial Online Safety Bill

The Global Online Safety Regulators Network (Network) issued a position statement on human rights and online safety regulation on September 13, 2023.

The Network is intended to facilitate a coherent international approach to online safety regulation by enabling online safety regulators to share insights, experience, and best practices. The current Network members are the eSafety Commissioner (Australia), Coimisiún na Meán (Ireland), the Film and Publication Board (South Africa), the Korea Communications Standards Commission (Republic of Korea), the Online Safety Commission (Fiji), and Ofcom (UK).

Continue Reading: Global Online Safety Regulators Issue Statement on Human Rights and Online Safety Regulation

On June 6, 2023, Florida Governor Ron DeSantis signed Senate Bill 262 into law. SB 262 departs from the comprehensive privacy laws enacted by other states in several respects, including its (1) ban on government-directed moderation of social media, (2) restrictions on online interactions with minors (somewhat akin to the California Age-Appropriate Design Code), and (3) establishment of a "digital bill of rights" that creates general consumer privacy rights similar in many respects to those adopted in other states but, unlike them, narrowly applicable. Governor DeSantis has not shied away from saying the new law is aimed directly at "Big Tech," and the targeted application of certain aspects of the law reflects that goal.

The ban on government-directed moderation took effect on July 1, 2023, with the protections for minors and digital bill of rights provisions set to take effect on July 1, 2024.

Continue Reading: Florida Enacts "Digital Bill of Rights" Combining Narrowly Applicable "Comprehensive" Privacy Provisions and More Broadly Applicable Restrictions on Children's Privacy and Social Media Restrictions

Texas has become the latest state to impose age-related privacy and safety restrictions on online service providers, joining Arkansas, California, Florida, and Utah. Signed by Governor Greg Abbott on June 13, 2023, the Securing Children Online through Parental Empowerment (SCOPE) Act is scheduled to go into effect on September 1, 2024, and will require digital service providers to "register" the age of potential users at account creation and implement a series of privacy and safety controls for known minors.

Continue Reading: Texas Becomes Latest State to Address Kids' Privacy and Safety Online