Mark Zuckerberg says sorry to families of children who committed suicide

During a Senate Judiciary Committee hearing weighing child safety solutions on social media, Meta CEO Mark Zuckerberg stopped to apologize to families of children who committed suicide or experienced mental health issues after using Facebook and Instagram.

"I’m sorry for everything you have all been through," Zuckerberg told families. "No one should go through the things that your families have suffered, and this is why we invest so much, and we are going to continue doing industry-wide efforts to make sure no one has to go through the things your families have had to suffer."

This was seemingly the first time that Zuckerberg had personally apologized to families. It happened after Sen. Josh Hawley (R-Mo.) asked Zuckerberg if he had ever apologized and suggested that the Meta CEO personally set up a compensation fund to help the families get counseling.

"Internally you know your product is a disaster for teenagers," Hawley said, inciting applause from the audience.

Zuckerberg did not agree to set up any compensation fund, but he turned to address families in the crowded audience, which committee chair Dick Durbin (D-Ill.) described as the "largest" he'd ever seen at a Senate hearing. Some families in the audience held up photos of children harmed after using social media.

Zuckerberg was joined at the hearing by the CEOs of TikTok, Snap, Discord, and X (formerly Twitter). Each was asked if they supported an array of online child-safety bills introduced after years of what senators described in the hearing as insufficient action by social media companies to effectively reduce harms.

Among these bills is the Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment Act (STOP CSAM). When that bill was introduced, it originally promised to make platforms liable for "the intentional, knowing, or reckless hosting or storing of child pornography or making child pornography available to any person." Since then, Durbin has amended the bill to omit the word "reckless" to prevent platforms from interpreting the law as banning end-to-end encryption, Recorded Future News reported.

Durbin noted that X became the first social media company to publicly endorse the STOP CSAM Act when X CEO Linda Yaccarino agreed to support the bill during today's hearing. Yaccarino also seemed to stand alone supporting the Stopping Harmful Image Exploitation and Limiting Distribution (SHIELD) Act, which imposes criminal liability for sharing non-consensual intimate imagery and nude images of minors.

None of the platforms voiced support for other bills like the Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act, which limits the Section 230 liability protections of platforms for "claims alleging violations of child sexual exploitation laws."

Zuckerberg told the committee that Meta agrees with the goals of these bills but has proposed its own legislation that would require app stores to verify ages of users and gain parental consent for app use.

"This stuff doesn't work," Sen. Amy Klobuchar (D-Minn.) said in response, suggesting that the only way to get platforms to improve on child safety is to impose liability for harms caused. "I think the time for all of this immunity is done," she added.

Some CEOs made commitments to invest more in safety, like TikTok's Shou Chew, who said that his company would invest $2 billion in safety enhancements in 2024. Senators suggested that wasn't enough, especially since TikTok does not report total earnings, making it impossible to gauge how much of its profits go toward protecting child safety.

Other CEOs discussed new solutions in the works. Discord CEO Jason Citron said that his platform, which does not allow end-to-end encryption on text messages, is proactively working with Thorn to build a "grooming classifier," which could potentially be used to report more grooming incidents to law enforcement.

During the hearing, senators largely approached CEOs with hostility, cutting them off to prevent "double-speak" justifying the protections currently offered, which senators said is the reason children are still being harmed on social media platforms today. Sen. Richard Blumenthal (D-Conn.) said the "double-speak" is why "we can no longer trust Meta" or any of the other social media companies to "grade their own homework."

Near the end of the hearing, Sen. Peter Welch (D-Vt.) asked all platforms to respond in writing with any concerns about bills discussed in the hearing that lawmakers on both sides of the aisle believe would make the Internet safer for kids. Welch said that the shared recognition by lawmakers and platforms of a persistent threat to kids gave him "some optimism," especially since all CEOs appeared to agree that creating a regulatory body to steer the industry on child safety could be useful. Welch attributed this rare consensus to parents who refused to accept today's child safety standards. By standing up for their kids, parents turned "extraordinary grief" into "action and hope," Welch said.

Some senators asked CEOs to commit to answering questions they have seemingly long avoided answering.

Sen. Sheldon Whitehouse (D-RI) described a tragic incident where a teen CSAM victim committed suicide in 2020 after a video remained on X (then called Twitter) for months before the platform removed it. He asked each CEO to commit to sending within five days of the hearing an explanation of exemptions to Section 230 protections that they would be willing to accept to prevent harms to kids resulting from non-responsiveness such as Twitter's. All CEOs indicated that they would.

Sen. Mazie Hirono (D-Hawaii), referencing testimony from Facebook whistleblower Arturo Bejar, specifically asked Zuckerberg to commit to reporting not percentages of harms reported by minors on Meta products but actual numbers in quarterly earnings reports to the Securities and Exchange Commission. Zuckerberg did not agree, suggesting that Meta already publishes public reports that would be a more appropriate place for reporting of that kind. Bejar has suggested that Meta should make it easy for minors to report harms on the platform and that Meta should report precise data tracking how often harms occur in order to transparently track and address emerging and persistent problems.

Sen. Chris Coons (D-Del.) has introduced the Platform Accountability and Transparency Act, which "would require social media companies to share more data with the public and researchers." Notably, none of the platforms would endorse this bill.

"Let the record reflect a yawning silence," Coons said.