Facebook’s safety head was questioned by lawmakers on Thursday over what the company knew about the potential for Instagram to be harmful to young users’ mental health.
The Senate Subcommittee on Consumer Protection, Product Safety and Data Security convened the hearing in the wake of a Wall Street Journal investigation citing Facebook’s own internal research, allegedly leaked by a whistleblower, that found Instagram worsened mental health issues in teens, especially girls. Among the findings was that Instagram made body image issues worse for 1 in 3 teens.
The Journal’s reporting has sparked a fierce backlash amid accusations that the tech giant publicly downplayed what it knew about Instagram’s potential harms while doing nothing to prevent them.
“We’re here today because Facebook has shown us once again that it is incapable of holding itself accountable,” Committee Chair Sen. Richard Blumenthal, D-Conn., said in his opening remarks. “This month, a whistleblower approached my office to provide information about Facebook and Instagram. Thanks to documents provided by that whistleblower, as well as extensive public reporting by The Wall Street Journal and others, we now have deep insight into Facebook’s relentless campaign to recruit and exploit young users.”
“We now know that Facebook routinely puts profits ahead of kids’ online safety,” he added. “We know it chooses the growth of its products over the well-being of our children, and we now know that it is indefensibly delinquent in acting to protect them.”
In the wake of the Wall Street Journal exposé, Facebook announced earlier this week that it was “pausing” development of an Instagram for Kids platform, but stopped short of scrapping it.
Antigone Davis, Facebook’s global head of safety, faced bipartisan scrutiny as she defended the company during the roughly three-hour hearing. She denied Blumenthal’s claims.
“We understand that recent reporting has raised a lot of questions about our internal research, including research we do to better understand young people’s experiences on Instagram,” Davis stated in written testimony. “We strongly disagree with how this reporting characterized our work, so we want to be clear about what that research shows, and what it does not show.”
“We undertook this work to inform internal conversations about teens’ most negative perceptions of Instagram,” she added. “It did not measure causal relationships between Instagram and real-world issues.”
Davis said the reporting “implied that the results were surprising and that we hid this research,” which she said was not true, adding that the company has discussed the “strengths and weaknesses of social media and well-being publicly for more than a decade.”
She also highlighted aspects of Facebook’s in-house research that she said the Journal didn’t include in its recent stories, such as findings that Instagram made feelings of “sadness” and “loneliness” better for a majority of teenage girls.
Davis said the company removed some 600,000 accounts on Instagram alone between June and August for not meeting the minimum age requirement of 13. She also said Facebook has “put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17.”
The hearing comes as Big Tech has come under increased scrutiny from lawmakers on both sides of the aisle over myriad issues, from allowing the spread of misinformation to allegations of political censorship. Lawmakers on Thursday compared Instagram’s tactics to those Big Tobacco once used to attract users before government intervention.
Brooke Erin Duffy, a professor of communication at Cornell University whose research focuses on the intersection of media, culture and technology, told ABC News via email on Thursday that Big Tech’s self-regulation hasn’t worked.
Duffy pointed to remarks from Sen. Ed Markey, D-Mass., on traditional media’s regulation of material for children, including the advertising limitations that have long guided the television industry, saying they “attest to a growing recognition that external regulation of the platforms is critical.” “While Big Tech has long flaunted its mechanisms of self-regulation, these have failed — and continue to fail — its users,” she added.
Duffy said another key takeaway from Davis’ testimony was “a refusal to agree to a long-term promise to abandon plans of further developing Instagram for Kids.” She called the initiative “part of a long-term strategy by Big Tech to court younger — and less witting — users who the platforms can inevitably collect data from.”
Lawmakers on Thursday also called for updating the 1998 Children’s Online Privacy Protection Act.
In prepared remarks, Davis defended building an Instagram “for tweens,” noting that other companies, such as YouTube and TikTok, have already developed versions of their apps for those under 13.
“The principle is the same: It’s much better for kids to use a safer, more age-appropriate version of social media apps than the alternative,” Davis said. “That said, we recognize how important it is to get this right.”
“We have heard your concerns, and that is why we announced that we are pausing the project to take more time,” she added. “We’ll keep listening to parents, keep talking with policymakers and regulators, keep taking guidance from experts and researchers, and we’ll revisit this project at a later date.”