At a hearing focused on stepping up the liability that social media and technology companies face for the actions of their users, the CEOs did not endorse new rules.
Leading social media company CEOs faced probing questions over their safety policies on Capitol Hill during a hearing on online child sexual exploitation.
The session, held by the Senate Committee on the Judiciary on Jan. 31, grilled TikTok Inc. CEO Shou Chew, Discord Inc. CEO Jason Citron, Snap Inc. CEO and co-founder Evan Spiegel, X Corp. CEO Linda Yaccarino, and Meta Platforms Inc. founder and CEO Mark Zuckerberg over the harms the committee alleges the leading social media and online communications platforms are allowing to befall children in the United States.
Sen. Dick Durbin (D-Ill.), the committee’s chair, noted that only Mr. Chew and Mr. Zuckerberg appeared of their own volition. The remaining witnesses had to be subpoenaed to testify.
After opening remarks from Mr. Durbin and Sen. Lindsey Graham (R-S.C.), the committee’s ranking member, criticizing the existing laws that shield online companies from civil liability for the actions of their platforms’ users, the CEOs offered a defense of their platforms.
Mr. Citron, who oversees Discord, the instant messaging and voice-over-internet social platform that was initially designed to serve video game players, said more than 60 percent of its users are between 13 and 24 years old. He touted the money Discord spent to acquire an artificial intelligence tool aimed at eliminating bad actors.
Mr. Zuckerberg, whose company owns the popular image-sharing app Instagram as well as its signature product Facebook, said the existing body of scientific research shows that social media use does not harm young minds. Instead, he said, it provides young people a positive outlet for self-expression.
Mr. Zuckerberg added that Meta has about 40,000 people working on safety and security and has spent more than $20 billion on the issue since 2016.
Online Child Exploitation
Mr. Spiegel said Snap’s main product, Snapchat, is transparent about letting users know that images and videos can be saved from its disappearing message app. Additionally, it proactively scans messages and reports abusive accounts. Its interventions, he said, led to more than 1,000 arrests in 2023.
“We believe that people under the age of 13 are not ready to communicate on Snapchat,” Mr. Spiegel said.
Mr. Chew made similar remarks. He said the users of TikTok, the short-form video platform with more than 1 billion users worldwide, are protected by “industry-leading policies, use of innovative technology and significant ongoing investments in trust and safety.”
TikTok, Mr. Chew said, has more than 40,000 employees globally committed to trust and safety, and the company plans to invest more than $2 billion in that area. He declined to share revenue figures when asked by Mr. Graham.
Only Ms. Yaccarino said X is committed to being part of a federal solution to the issue of online child exploitation. She said enforcement has increased substantially since Elon Musk acquired the company formerly known as Twitter and rebranded it as X Corp.
“It is time for a federal standard to criminalize the sharing of non-consensual, intimate material,” Ms. Yaccarino said. “We need to raise the standards across the entire internet ecosystem, especially for those tech companies that are not here today and not stepping up.”
In his line of questioning, Mr. Durbin asked the witnesses whether they supported opening themselves to civil liability for hosting, storing, or making available child sexual abuse material.
For example, Mr. Durbin, citing law enforcement reports, identified Snapchat as “the pedophile’s go-to sexual exploitation tool.”
“We already work extensively to proactively detect this type of behavior. We make it very difficult for predators to find teens on Snapchat,” Mr. Spiegel replied. “There are no public friends lists, no public profile photos. When we recommend friends for teens, we make sure that they have several mutual friends in common before making that recommendation.”
Mr. Durbin then pressed Mr. Citron on how well Discord protects children when it does not use automated tools to patrol servers with fewer than 200 users.
“So how do you defend an approach to safety that relies on groups of fewer than 200 sexual predators to report themselves?” Mr. Durbin asked.
“We deploy a wide array of techniques that work across every surface on Discord,” Mr. Citron said.
Mr. Durbin was more pointed with Mr. Chew.
“Why is it TikTok is allowing children to be exploited into performing commercialized sex acts?” Mr. Durbin asked.
“I respectfully disagree with that characterization. Our live-streaming product is not for anyone below the age of 18,” Mr. Chew replied.
Mr. Graham asked the witnesses whether they could support any of the bipartisan bills the committee had approved that would expose technology companies to civil liability from which they are currently shielded. None of them directly said they did.
“If you’re waiting on these guys to solve the problem, we’re going to die waiting,” Mr. Graham said. “Nothing will change until the courtroom door is open to victims of social media.”