AUDIE CORNISH, HOST:
Tomorrow marks a week since Facebook whistleblower Frances Haugen testified to Congress, accusing the company of incentivizing the spread of hate and misinformation and failing to share research on its platform's negative impact on kids' mental health.
(SOUNDBITE OF ARCHIVED RECORDING)
FRANCES HAUGEN: Until the incentives change, Facebook will not change. Left alone, Facebook will continue to make choices that go against the common good, our common good.
CORNISH: In the days since that testimony, the years-long drumbeat for Congress to take action and impose stricter regulations on social media and tech giants has grown louder. One plan includes forcing these companies to allow independent researchers to analyze data from the platforms on which, quote, "almost all of human experience is now taking place." Those are the words of Nathaniel Persily, a professor of law at Stanford Law School and director of the Stanford Cyber Policy Center. Welcome back to the program.
NATHANIEL PERSILY: Thanks for having me.
CORNISH: So you have actually tried your hand at drafting legislation that would allow the Federal Trade Commission to create a framework for social media companies to hand over their data to outside scholars. How would this mitigate the kind of harm that Frances Haugen is describing?
PERSILY: Well, we shouldn't have to wait for whistleblowers to blow their whistles before we understand what's happening inside these firms. And so one of the heroic consequences of Frances Haugen's testimony was that we got a window into some of the practices that are happening inside Facebook. But it's a rare window, and it shouldn't just be opened when employees decide to risk their futures by testifying before Congress. We really need a sort of steady stream of data that will be analyzed by independent researchers in a privacy-protected, secure way so we all understand what's actually happening on these platforms.
CORNISH: What can third-party investigators do that governments can't?
PERSILY: Well, the government can do a lot of this stuff, but we don't trust them to have sort of the keys to the kingdom when it comes to, you know, private posts and the like that are on these platforms. We don't want government to be sort of going into Facebook and essentially surveilling the population. We need to have a system that's set up that keeps the data with the firm, but that the reports, the inferences that are developed can be released by independent researchers who are not beholden either to the government or to Facebook or these other platforms to have that responsibility.
CORNISH: The flip side is if we look back to the Cambridge Analytica scandal, that involved a firm that was improperly saving Facebook user data, right? You know, how can the government go about assuring users that none of that data, which you say is private, wouldn't be kind of misused or abused by a third party tasked with analyzing it?
PERSILY: That's the key question here. And so that's why it's important that the data remain with the firm and that researchers essentially have to go there and analyze it in clean rooms and not be able to take the data outside of the facility. The data already exists there, as we've seen with Frances Haugen's testimony. The question is whether the only people who are going to be able to analyze it are those who are sort of tied to the profit-maximizing mission of the firm or whether you're going to get some independent sort of auditors or researchers to have the same ability to figure out what's going on on the platform.
CORNISH: You know, last week, Facebook CEO Mark Zuckerberg wrote this long denial of Frances Haugen's claims. And part of it, he said, quote, "The argument that we deliberately push content that makes people angry for profit is deeply illogical." What does your research say, and how do you hear the way he's responding to Haugen?
PERSILY: Well, this is, as we academics say, an empirical question. And I should say that from the sort of snippets that we've gleaned from the outside, there's a considerable debate about the role that algorithms and that Facebook is playing in things like polarization and harmful content and the like. But we just don't know. And that's because not only are these companies economic monopolies, but they're also data monopolies. Right? They have the unique ability to analyze sort of all human experience that's happening on these platforms. And until we get access to it, we're not going to be able to answer those questions.
CORNISH: So what would you do with that third-party investigation, right? Once it happens - if it happens, does the government need regulation with more teeth? Sort of, what do you envision?
PERSILY: So as a first stage, we need to really set up the infrastructure for this kind of oversight - making sure that privacy of users is protected, that there's a kind of nonpartisan, vetted way for researchers to get access. And then the hope is that once you sort of unearth what's happening inside the firms, then you can develop sound policy on it, whether it's in the context of content moderation or antitrust or privacy and the like. If we actually understand what's going on in these platforms, we can develop sound policy.
CORNISH: With the revelations of the last couple of weeks, we're hearing lawmakers talk about wanting to act. What are you going to be listening for that might indicate they actually are ready to?
PERSILY: Well, I really want to see if they're willing to sort of seize the day here and to treat this as the emergency that it is. We can't wait four years to set up a new Cabinet agency or to draft the perfect bill. I think requiring transparency right now is a first step. But I expect that you'll see bills dealing with, you know, antitrust, competition, taxation, privacy, as well as some of the harmful content, particularly as it relates to children, pretty soon.
CORNISH: Nathaniel Persily is a professor of law at Stanford Law School and director of Stanford's Cyber Policy Center. Thanks so much for speaking with us.
PERSILY: Thank you. Transcript provided by NPR, Copyright NPR.