
Meta/Facebook

Child Safety Report

The U.S. Surgeon General’s advisory on Social Media and Youth Mental Health called this an “urgent public health issue.” Social media affects children’s brains differently than it does adults’. It also poses physical and psychological risks that many children and teens are unprepared for, including sextortion and grooming, hate group recruitment, human trafficking (for any purpose), cyberbullying and harassment, exposure to sexual or violent content, invasion of privacy, self-harm content, and financial scams, among others.

In 2023 and 2024, shareholders filed a Child Safety Report shareholder resolution at Meta. It asks the company to adopt targets and publish an annual report “that includes quantitative metrics appropriate to assess whether Meta has improved its performance globally regarding child safety impacts and actual harm reduction to children on its platforms.”


The 2024 Child Safety resolution at Meta received 18.5% of the vote, representing a 59.1% majority of the non-management-controlled vote. (CEO Mark Zuckerberg’s shares carry 10 votes each, while regular shareholders get one vote per share; as a result, he controls over 60% of the vote.) Over 925 million shares were voted FOR the resolution which, based on the $474 closing stock price on the day of the annual meeting, represented more than $439 billion in stock value: more than the combined shareholdings of the company’s four largest institutional investors, Vanguard, BlackRock, Fidelity and State Street.
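To see how a headline vote below 20% translates into a majority of the independent vote, here is a minimal sketch of the arithmetic in Python. The FOR count and closing price come from the figures above; the total and insider vote counts are assumed round numbers chosen only to reproduce the reported percentages (actual totals appear in Meta’s post-meeting filings).

```python
# Minimal sketch of Meta's dual-class vote arithmetic.
# votes_for and closing_price come from the 2024 figures above; the
# total and insider vote counts are ASSUMED round numbers for illustration.

votes_for = 925_000_000            # shares voted FOR the 2024 resolution
total_votes_cast = 5_000_000_000   # assumed total votes cast
insider_votes = 3_435_000_000      # assumed management-controlled votes

headline_support = votes_for / total_votes_cast
independent_support = votes_for / (total_votes_cast - insider_votes)

print(f"Headline support:    {headline_support:.1%}")     # -> 18.5%
print(f"Independent support: {independent_support:.1%}")  # -> 59.1%

# Dollar value of the supporting shares at the meeting-day close.
closing_price = 474.0  # dollars per share
value_billions = votes_for * closing_price / 1e9
print(f"Value of FOR shares: ${value_billions:.0f} billion")
# ~$438B with this rounded share count; the exact count pushes it past $439B.
```

The same division, FOR votes over total votes cast minus insider-controlled votes, yields the independent-support figures quoted for the other years below.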

Similarly, the 2023 Child Safety resolution received 16.27% support, which equals majority support of 53.8% of the non-management-controlled vote. The 817 million shares voted for the resolution were valued at more than $216 billion on the day of the annual meeting.


Before our broader focus on all child safety risks, shareholders had raised concerns with the company about online child sexual exploitation. Between 2020 and 2022, shareholders filed three resolutions asking the company to assess the risk of increased sexual exploitation of children as it develops and offers additional privacy tools such as end-to-end encryption. Meta never provided this information.

Child Sexual Exploitation Online

In 2023 there were nearly 36 million reports of online child sexual abuse material (CSAM); nearly 31 million of these (85%) stemmed from Meta platforms, including Facebook, WhatsApp, Messenger and Instagram. This represents a 93% increase from Meta’s nearly 16 million reports in 2019, when shareholders first raised this issue with the company. Meta is currently applying end-to-end encryption to all its platforms, which, without measures to first stop CSAM, could effectively make invisible 70% of the CSAM cases currently being detected and reported.


Meta’s decision to expand end-to-end encryption across its platforms without addressing the issue of child sexual exploitation has led to an immense backlash and poses legitimate risks to children worldwide. Governments, law enforcement agencies and child protection organizations have harshly criticized Meta’s planned encryption, claiming that it will cloak the actions of child predators and make children more vulnerable to sexual abuse. Pending legislation in the U.S. Congress and in other countries could make Meta legally liable for CSAM. The company faces increasing regulatory, reputational and legal risk as a result.


Meta executives have admitted that encryption will decrease the company’s ability to report CSAM, saying, “if it’s content we cannot see then it’s content we cannot report.” Yet Meta is intent on applying encryption, claiming that privacy is a valid tradeoff for the risk of increased child sexual abuse. Meta CEO Mark Zuckerberg stated, “Encryption is a powerful tool for privacy, but that includes the privacy of people doing bad things. When billions of people use a service to connect, some of them are going to misuse it for truly terrible things like child exploitation, terrorism, and extortion.”

As the world’s largest social media company and the largest source of reported child sexual exploitation online, Meta will, for better or worse, have a major impact on global child safety.

Shareholder resolutions & statements

The 2022 Meta/Facebook shareholder resolution, 2021 Facebook shareholder resolution and 2020 Facebook shareholder resolution asked the company to report on “the risk of increased sexual exploitation of children as the Company develops and offers additional privacy tools such as end-to-end encryption.” Our initial 2020 resolution was filed after an exponential surge in online child sex imagery. The volume of online child sexual abuse material continues to grow dramatically.


Proponents of this resolution are not opposed to encryption and recognize the need for improved online privacy and security. A number of technological developments should allow anti-CSAM practices to coexist with encryption; proponents believed these needed to be tested and applied before Meta launched its encryption plans. Support for this resolution was not a vote against privacy but a message to management that it needs to take extra precautions to protect the world’s most vulnerable population: children.

Sarah Cooper, a survivor of sex trafficking who was contacted by a predator through Facebook Messenger, spoke at the 2021 and 2022 Facebook annual shareholder meetings, where she described the link between Facebook and her own horrific ordeal. Read or listen to Sarah's 2021 statement. Proxy Impact CEO Michael Passoff spoke at the 2020 Facebook annual shareholder meeting. Read Michael's statement on the business and social risks of Facebook's encryption plan.

Like our current Child Safety Report resolution, our earlier Encryption and Child Sexual Abuse Material (CSAM) shareholder resolutions received majority support of the non-management-controlled vote.


Our 2022 CSAM resolution received 17.3% of the vote, representing 56.7% of the independent vote. The resolution garnered the support of over 910 million shares valued at about $167 billion (based on the closing stock price on the day of the annual meeting).


Our 2021 CSAM resolution received 17.25% of the vote, representing 56% of the non-Zuckerberg-controlled vote (up from 43% in 2020). The resolution garnered the support of nearly 980 million shares worth about $321 billion in stock value.


Our 2020 CSAM resolution received a 12.6% vote, about 43% of the non-management-controlled stock, with 712 million shares worth over $163 billion.

Despite receiving the support of nearly a billion shares representing hundreds of billions of dollars in value, Meta has barely engaged with shareholders on this issue, often taking 12 to 18 months to set up calls with the proponents of these resolutions.

Proxy memo / Exempt solicitation


Shareholders can file detailed information with the SEC to solicit support for their shareholder resolution. This document is officially called an "exempt solicitation," but is often referred to as a proxy memo.

Read the 2022 Meta-Facebook exempt solicitation, which provides details on the link between social media and child sexual abuse; Meta’s central role; the impact of end-to-end encryption on CSAM; the false choice between privacy and child protection; the financial risk to Meta; law enforcement and child protection agencies’ calls to delay encryption; Meta’s response; and a rebuttal to Meta’s proxy statement.

Press releases


Read our Facebook press releases:

Articles

Read our articles on Facebook and child sexual exploitation:
