What’s good, bad, and missing in the Facebook whistleblower’s testimony

What Frances Haugen gets right — and wrong

Today let’s talk about Facebook whistleblower Frances Haugen’s testimony before the Senate: the good, the bad, and what ought to happen next.

For more than three hours on Tuesday, Haugen addressed a subcommittee of the Senate Commerce Committee. She appeared calm, confident, and in control as she read her opening remarks and fielded questions from both parties. While she brought more nuance to her critique than most Facebook critics — she supports reforming Section 230 rather than repealing it, for example, and opposes a breakup of the company — she also said the company should declare “moral bankruptcy.”

“This is not simply a matter of certain social media users being angry or unstable, or about one side being radicalized against the other,” Haugen told Congress. “It is Facebook choosing to grow at all costs, becoming an almost trillion-dollar company by buying its profits with our safety.”

The Senate largely ate it up. Long frustrated by Facebook’s size and power — and, one suspects, by its own inability to address those issues in any constructive way — senators yielded the floor to Haugen to make her case. During the hearing titled “Protecting Kids Online: Testimony from a Facebook Whistleblower,” Haugen walked senators through most of The Wall Street Journal’s Facebook Files, touching on ethnic violence, national security, polarization, and more during her testimony.

For their part, senators sought to paint the hearing in historic terms. There were repeated comparisons to Big Tobacco, and talk of a “Big Tobacco moment.” “This research is the definition of a bombshell,” said Sen. Richard Blumenthal (D-CT), who led the hearing.

Over at Facebook, the strategic response team lobbed a half-hearted smear at Haugen, noting bizarrely that while at the company, she “had no direct reports” and “never attended a decision-point meeting with C-level executives.” If there’s a point in there, I missed it.

Ultimately, Haugen said little on Tuesday that wasn’t already known, either because she had said it on 60 Minutes or because it had been covered in the Journal series.

What she might have done, though, is finally galvanize support in Congress for meaningful tech regulation.

Let’s walk through Haugen’s testimony.

The good parts

One, Haugen identified real harms that are taking place on Facebook services. For example, she talked about documents indicating that using Instagram can contribute to eating disorders in some teenagers. Too often, discussions about the harms of social networks are either abstract or emotional. The primary benefit of Haugen’s leaking is to bring some empirical rigor to those discussions — and to highlight the degree to which these issues are known, but not discussed, by Facebook executives. That’s powerful.

In response, Facebook’s Monika Bickert told CNN that the same research shows that the majority of teenagers find that Instagram improves their well-being. But one of the hearing’s most powerful moments came when Haugen noted that only about 10 percent of cigarette smokers ever get cancer. “So the idea that 20 percent of your users could be facing serious mental health issues, and that’s not a problem, is shocking,” she said, citing leaked data.

Two, Haugen highlighted the value of research in understanding problems and crafting solutions. For years now, we’ve watched Congress interrogate Facebook based on spurious anecdotes about who was censored or shadow banned, or what publisher was or wasn’t included on a list of trending topics, to no constructive end.

It was refreshing, then, to see members of Congress wrestling with the company’s own internal data. Sen. Ted Cruz, rarely seen operating in good faith on any subject, largely set aside his questions about censorship to ask Haugen about data exploring the link between Instagram and self-harm. Facebook will say, not unfairly, that senators were largely just cherry-picking with these questions. But we have to ground these discussions in something — why not Facebook’s own research?

Three, and maybe most potently, Haugen helped to shift the discussion of platform problems away from the contents of the speech they host and toward the design of the systems themselves. “The problems here are about the design of algorithms — of AI,” Haugen said, in response to a question about whether the company should be broken up. That wouldn’t solve anything, she said — the same engagement-based algorithms would likely create similar issues within the new baby Facebooks.

Haugen posited regulation of algorithms — specifically, banning engagement-based ranking like Facebook and Instagram use today — as a way to avoid the First Amendment issues that come with attempting to regulate internet speech. But as the scholar Daphne Keller has written, attempting to regulate the algorithms that rank speech will likely trigger First Amendment scrutiny anyway.

Still, Congress seemed receptive to the idea that it ought to focus on broader system incentives, rather than stunts like the recent efforts in Florida and Texas to force platforms to carry all speech regardless of content. The details get tricky, but that shift would be a welcome one.

The trouble spots

For all its strengths, Haugen’s testimony had some unfortunate aspects as well.

One, Haugen came across as a solutionist: someone who believes that any problem created by tech can also be solved by tech. That belief comes through most strongly in her advocacy for a reverse-chronological feed, which she argues would remove incentives to share polarizing or harmful content.

It seems possible that this is true, but only marginally so. Polarizing and harmful content was often shared on Twitter and Instagram during the many years that those services used reverse-chronological feeds. That’s not to say reducing algorithmic amplification is a bad idea, or that Facebook shouldn’t research the issue further and share what it finds. But given the broad range of harms identified in the Facebook Files, I found it surprising that Haugen’s pet issue is feed ranking: I just don’t believe it’s as powerful as others seem to.

My second, somewhat related concern is that Haugen’s testimony had tunnel vision. Those of us who opine about social networks are forever at risk of attempting to solve society-level problems at the level of the feed. To avoid that, we have to bring other subjects into the conversation. Subjects like how the US was growing more polarized long before the arrival of social networks. Or the research showing that long-term Fox News viewership tends to shift people’s political opinions more than Facebook usage does. Or the other reasons teenagers may face a growing mental health crisis, from rising inequality and housing insecurity to the threat of climate change.

It’s possible to consider a subject from so many angles that you find yourself paralyzed. But it’s equally paralyzing to begin your effort to rein in Big Tech with the assumption that if you can only “fix” Facebook, you’ll fix society as well.

Finally, Haugen’s testimony focused on the documents rather than on her own work at Facebook. I can’t have been alone in wanting to hear more about her time on the Civic Integrity team or her later work in counterespionage. But senators were more interested in the admittedly fascinating questions raised by the research that she leaked.

That’s understandable, but it also meant that Haugen regularly had to remind the subcommittee that it was asking her about subjects outside her expertise. In my own talks with current Facebook employees, this is the point on which I hear the most exasperation: just because you found some documents on a server, they tell me, doesn’t mean you are qualified to describe the underlying research.

There’s an obvious fix for that — summon better-qualified employees to testify! But in the meantime, I wish Haugen had taken more opportunities to discuss what she saw and learned with her own eyes.

What should happen next

Platforms should take the events of the past few weeks as a cue to begin devising ways to regularly share internal research on subjects in the public interest, annotated with relevant context and with data made available to third-party researchers in a privacy-protecting way. Facebook regularly tells us that most of its research shows that people like it, and the company’s market dominance suggests there is probably evidence to back it up, too. The company should show its hand, if only because soon enough governments will require it to anyway.

Congress should pass a law requiring large platforms to make data available to outside researchers for the study of subjects in the public interest. (Nate Persily argues here that the FTC could oversee such a design.) I think sharing more research is in Facebook’s long-term self-interest and that the company ought to do so voluntarily. But to get an ecosystem-level view, we need more platforms to participate. Unless we want to rely on whistleblowers and random caches of leaked documents to understand the effects of social networks on society, we should require platforms to make more data available.

What Congress should not do is pass a sweeping law intended to solve every problem hinted at in Haugen’s testimony in one fell swoop. Doing so would almost certainly curtail free expression dramatically, in ways that would likely benefit incumbent politicians at the expense of challengers and marginalized people. Too many of the bills introduced on these subjects this year fail to take that into account. (Unless they are taking it into account, and quashing dissent is their ulterior motive.)

Instead, I’d like to see Congress do a better job of naming the actual problem it’s trying to solve. Listening to the hearing, you heard a lot of possibilities: Facebook is too big. Facebook is too polarizing. Facebook doesn’t spend enough on safety. Facebook is a national security risk. There still appears to be no consensus on how to prioritize any of that, and it’s fair to wonder whether that’s one reason Congress has had so much trouble advancing any legislation.

In the meantime, right or wrong, Haugen appears to have persuaded Congress that Facebook is as bad as they feared, and that the company’s own research proves it. Simplistic though it may be, that narrative — Facebook is bad, a whistleblower proved it — is quickly hardening into concrete on Capitol Hill.

The question, as ever, is whether our decaying Congress will muster the will to do anything about it.


This column was co-published with Platformer, a daily newsletter about Big Tech and democracy.