Brian Schatz

Schatz, Britt, Colleagues Press Meta For Answers On Delayed Protections For Children

Published: 02.05.2026

WASHINGTON - U.S. Senators Brian Schatz (D-Hawai'i), Katie Britt (R-Ala.), Chris Coons (D-Del.), James Lankford (R-Okla.), and Amy Klobuchar (D-Minn.) wrote to Meta CEO Mark Zuckerberg asking why the company delayed rolling out protections for young users even after it knew of the grave risks its platforms posed to children. Citing evidence from a court filing unsealed late last year, the senators pressed the company for answers on why it delayed launching its "private by default" feature for young users and how it reviews and acts on reports of sex trafficking and child sexual abuse material (CSAM) on its platforms.

"In the U.S. District Court for the Northern District of California's consideration of Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, a recently unsealed plaintiff's brief revealed new information about Meta's knowledge of risks to the wellbeing and safety of young users on its platforms," wrote the senators. "The documents allege that Meta employees were aware of potential and ongoing harms to young users, including impacts to attention and emotional wellbeing, exposure to inappropriate contact by adults, and challenges to taking down sex trafficking and child sexual abuse material (CSAM) on its platforms. These developments are alarming, and we are deeply concerned by allegations that Meta was not only aware of these risks, but may have delayed product design changes or prevented public disclosure of these findings."

Earlier this year, Schatz and Britt introduced the Kids Off Social Media Act, which would prohibit children under the age of 13 from using social media and ban algorithmic targeting for users under the age of 17.

A copy of the letter is below.

Dear Mr. Zuckerberg,

Following recently unsealed evidence regarding Meta's online safety practices toward children, we write to urge Meta to commit to prioritizing user safety over engagement. To that end, we request additional information about the company's online safety practices, including its expectations for public transparency and clarification of its trust and safety protocols.

In the U.S. District Court for the Northern District of California's consideration of Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, a recently unsealed plaintiff's brief revealed new information about Meta's knowledge of risks to the wellbeing and safety of young users on its platforms. The documents allege that Meta employees were aware of potential and ongoing harms to young users, including impacts to attention and emotional wellbeing, exposure to inappropriate contact by adults, and challenges to taking down sex trafficking and child sexual abuse material (CSAM) on its platforms. These developments are alarming, and we are deeply concerned by allegations that Meta was not only aware of these risks, but may have delayed product design changes or prevented public disclosure of these findings.

To that end, we respectfully request responses to the following questions by March 6, 2026:

  1. Please elaborate on Meta's protocol for disclosing risks associated with user wellbeing and safety to both its users and to the public.
  2. Please elaborate on Meta's evaluation of trade-offs between engagement and user safety and wellbeing in its product design and trust and safety protocols that impact users under the age of 18.
    1. What policies or risk frameworks inform this decision-making?
    2. What is the process for considering internal warnings and recommendations based on evidence of user harms?
    3. What teams or individual roles are responsible for making the decisions that shape the user safety experience?
    4. Has Meta ever seriously considered targeting new products or platform experiences to prospective users under the age of 13?
  3. What is Meta's process for reviewing and incorporating internal warnings and recommendations regarding user wellbeing and safety?
    1. Why did Meta delay launching its "private by default" feature for users under the age of 18?
    2. What teams, departments, or individual roles were responsible for the delay in this feature launch, despite internal recommendations to proceed?
  4. Please elaborate on Meta's knowledge of how use of its platforms can impact the wellbeing of young users.
    1. When did Meta first identify evidence or research linking its products to increased depression, anxiety, or other mental wellbeing harms for users under the age of 18?
    2. Has Meta ever halted any research or studies linking its platforms or products to undesirable outcomes, such as higher rates of anxiety, depression, or reduced emotional wellbeing?
    3. Has Meta ever prevented any public disclosure of research or studies linking its platforms or products to undesirable outcomes, such as higher rates of anxiety, depression, or reduced emotional wellbeing?
    4. Has Meta conducted any research on what percentage of users under the age of 18 exhibit "problematic use"? If so, what is this percentage? Additionally, how does Meta categorize and assess the severity of problematic use on its platforms?
    5. In July 2025, Meta rolled out additional safety features for minors. When did the development of these additional safety features begin? Were any of these features previously delayed or halted because of concerns about reduced engagement?
    6. Will Meta release the design details and findings of its "deactivation study"?
    7. Meta has claimed that design flaws prevented the public release of its "deactivation study." What prevented Meta from redesigning and re-conducting the study?
  5. Please elaborate on Meta's protocol for reviewing and acting on reports of sex trafficking and CSAM on its platforms.
    1. What is Meta's process for allowing users to report sex trafficking or CSAM content on its platforms?
    2. What is Meta's process for reviewing and acting on user reports of sex trafficking or CSAM content on its platforms?
    3. Has Meta ever had a policy that required a minimum number of user reports of sex trafficking or CSAM content before a perpetrating account was permanently banned?

Please direct any follow-up questions to our offices. Thank you for your prompt attention to these matters.

Sincerely,

###
