Social media executives are set to testify at a US Senate hearing on online child exploitation. The CEOs of Meta, TikTok, Snap, Discord and X, formerly known as Twitter, are expected to defend their companies’ records on Wednesday amid growing questions about how the platforms affect children.
Connecticut Democratic Senator Richard Blumenthal, one of the politicians attending the hearing, told reporters, “We’re going to work hard to hold their feet to the fire.” Ahead of the hearing, Blumenthal met with students to discuss the youth mental health crisis, which he says is driven by social media.
Our young people are on the frontlines of a youth mental health crisis driven by social media.
Today I met with Conard High School students to discuss their experiences on these platforms, the importance of the Kids Online Safety Act, & this week’s hearing with Big Tech CEOs. pic.twitter.com/baXV0VrWpE
— Richard Blumenthal (@SenBlumenthal) January 29, 2024
This is not the first time Big Tech executives have been summoned before the Senate: Meta founder Mark Zuckerberg appeared before Congress in 2018 over the Cambridge Analytica scandal and the spread of fake news on the platform. This time, representatives of the firms invited to the hearing said their CEOs plan to extend gestures of goodwill towards senators and the public by backing specific legislative proposals and making policy commitments.
X’s head of U.S. and Canadian public policy, Wifredo Fernandez, told NBC News that CEO Linda Yaccarino is set to express backing for the SHIELD Act and additional child protection laws.
The SHIELD Act, introduced by Minnesota Senator Amy Klobuchar, would criminalize the distribution of sexually exploitative images of minors, including material that falls short of the legal definition of child pornography.
According to POLITICO, a Snapchat spokesperson said its CEO, Evan Spiegel, would support the Kids Online Safety Act, which would force social networking sites to promote online safety by tackling illegal material and content that is harmful to children, conducting regular risk assessments, and properly enforcing age limits.
In October, a lawsuit filed by 33 states accused Meta of knowingly designing its platforms to be addictive and harmful to children’s mental health. The claims draw on internal research leaked by whistleblower Frances Haugen in 2021, which revealed that 13.5% of teenage girls reported Instagram worsens suicidal thoughts, and 17% stated it aggravates eating disorders.
CNN reported that Rosemarie Calvoni was also suing Meta and other social media companies over her daughter’s struggle with anorexia.
Only days ago, Meta announced a partnership with the Center for Open Science in a bid to deepen the understanding of how social media usage may affect users’ well-being, particularly among younger demographics. At the same time, it unveiled a major privacy update, marking a significant stride in teen user protection. However, it has received criticism for disbanding its responsible AI team last year amidst the ongoing issues surrounding user well-being.
Featured image: Canva