FSFP Chairman and Senior Legal Advisor Ben Clements recently appeared at the Western New England Law Review Symposium and gave a presentation on the Digital Accountability Act and the urgent need to hold social media companies accountable when they amplify disinformation and threats of violence on their platforms.

The following is the full transcript from that presentation:

Thank you, Professor Taub. Morning, everybody. And thank you also to Kathleen and Suzanna and Zack and the whole team at the Western New England school for putting together this very important and timely symposium. I don’t want to keep everyone in suspense. I’m just going to cut to the chase and get to Professor Taub’s hypothetical. The answer from my perspective is a categorical yes. Social media companies must be liable just as any other publisher would be liable. I do want to add an additional fact to the hypothetical, not a fact that I’m making up but a fact that I think is necessarily true, based on all the other facts that Professor Taub included. And that is: the harm and the damage and the danger to the restaurant owner involved in that hypothetical would be dramatically worse, many, many times worse, if Alex’s false statements were posted on a social media site and then went through its algorithms and were amplified and targeted to the various people that Facebook realizes will eat those things up and act on them, than if the false claims were published in a newspaper or in the traditional media. So in this kind of circumstance, we really have the law on its head. We are giving a pass to the most dangerous publisher of false information.

At Free Speech For People, the group it was mentioned I’m the chair of, we have put together a comprehensive proposal to reform the social media companies and address this. It would change this aspect of the Communications Decency Act and enact a whole bunch of other reforms. And I’ll get to that in more detail, but before I do, let me tell you a little bit about how I got here, and the broader reasons why these reforms are needed. I’ve been a lawyer in the private, public, and nonprofit sectors, at both the state and federal level, for many years, and I’ve focused a lot of my work on how those sectors interact in our legal and Constitutional framework. And I’ve spent the last ten years working with Free Speech For People to protect our Constitutional democracy from a range of threats: from unfettered spending by the mega-wealthy and by corporations on our elections, to corporate control of the government, to voter suppression and other threats.

And in recent years, perhaps the biggest corporate threat to our democracy has emerged: the giant social media companies. They have mastered the corporate playbook of huge lobbying spending, huge political spending, and huge campaign propaganda to support their position. And the effect of this has been to allow them unprecedented monopolistic growth, to essentially deter or ward off any serious government regulation, and to preserve this immunity that has never been available to any other communications company, whether print or broadcast. And they’ve also developed a multibillion-dollar model that profits from exploiting and selling the personal information of virtually every American, and from facilitating and promoting disinformation and violence and other misconduct online that undermines our elections, undermines our civil rights, undermines our health care, and ultimately undermines our capacity for self-government. And this whole plan has worked brilliantly so far for the big tech companies. For the rest of us, not so much. We hear greater and greater public outrage, we hear the shows of total outrage from our politicians, but so far, Congress has done what it usually does in the face of grave public harm from powerful corporations. Nothing. Every now and then, every few months maybe, we hear new revelations of abuses and have show hearings down in Washington. The big tech execs are brought before Congress, and the members of Congress get angry, they threaten to do things, but basically the hearings just offer a platform for the big tech execs to make misleading and false defenses and then walk away with empty reassurances that they’re going to try harder to protect us from the very same venom that they’re actively promoting on their platforms. And big tech and their allies in the public and nonprofit sectors have been equally successful at using this campaign to control the public debate.
So we don’t hear serious talk in the public debate about actually holding them accountable. We buy into this myth that our hands are tied by the First Amendment. The big tech companies have somehow convinced the public of utterly contradictory propositions: that they can’t be held liable or accountable for what people post on their platforms, because they’re not the speakers, the people posting it are. And yet at the same time, they insist that it would violate their supposed corporate First Amendment rights to in any way hold them accountable for that speech that they claim they are not speaking. And this propaganda doublespeak has been so successful that instead of talking about holding them accountable, we treat the big tech companies like they are sovereign states. We go begging them, pleading with them to put in better systems of oversight, to moderate the content voluntarily. But these pleas are destined to go nowhere as long as these big tech companies can continue to profit, without accountability, from promoting disinformation and violence and other criminal conduct on their platforms.

So at Free Speech For People, we are trying to change that debate, to move away from this myth that the First Amendment prevents us from holding social media companies accountable. And we have proposed a comprehensive set of reforms that we call the Digital Accountability Act. The DAA includes five essential pieces. The first piece, as we’ve alluded to already, is to amend section 230 of the Communications Decency Act so that the immunity that’s provided in section 230 is only available to social media companies and internet companies that are truly acting as non-publishers: passive platforms for people to post. But companies that engage in manipulating the content, or that use algorithms or other methods to target it and amplify it through targeting, would be treated as the publishers that they’re acting like, and they would lose the immunity. This would be consistent with the original purpose of the CDA, to protect truly passive, bulletin-board-like platforms. But it would level the playing field by treating the big internet companies that are acting as publishers just like any other publisher. And of course it would address the hypothetical that Professor Taub has presented us with, so that the internet company in that situation would be liable. Second, the DAA would ban what’s come to be known as surveillance advertising. It would prohibit targeted online advertising, except for targeting based on geographic location or sorting for relevance based on search terms. Third, the Act would establish new federal civil and criminal liability for knowingly disseminating what we call fraudulent civic information: disinformation that relates to voting, healthcare, or essential government services. The knowing dissemination of that would give rise to liability, as would targeting or amplifying it in a manner that shows reckless disregard for the public harm that results from disseminating and amplifying that civic disinformation.
Fourth, the Act would create new federal civil and criminal liability for amplifying or targeting threats of violent criminal acts online. And this is completely consistent with laws that we already have on the books that make it unlawful to solicit or encourage crimes of violence. And finally, the Act would direct the FCC to make recommendations to prohibit the aggregation and selling of personal data without informed, meaningful consent. Now, these are big reforms. They’re not easy reforms; they’re harder reforms, and there’s a lot of resistance to every single one of them. But they’re necessary, and they can all be enacted consistent with the First Amendment, and consistent with time-tested models that we’ve had for holding people accountable for fraud and promotion of violence in other contexts. Now, there’s no question that these reforms would require significant changes in the big social media companies, and perhaps other internet companies. But these changes are needed if we want to have an internet that is compatible with democracy and with self-government.


From the Q and A:

…just as it interfered with our ability to tackle COVID-19, disinformation on the Internet interferes with our ability to deal with climate change, to deal with gun violence, to deal with racism, to deal with every issue that we face. So I think this is very serious. And I just want to add one more point. Every single one of us, to varying degrees and in varying ways, uses the internet a lot, all the time. It’s like it’s our life, in a way, even though we don’t want it to be. So I think there’s this understandable fear that we can’t change anything because we’re going to lose it.

There was a time when automobiles were killing tens of thousands of Americans a year. And for a long time nothing was done about it. And eventually the country said, “This is insane. We should not be letting this happen.” And we could have said, “Wait a minute. There’s nothing more American than driving your car. We can’t stop the automakers. We’ve gotta let them do what they’re doing. And if a bunch of us have to die every year, that’s just the cost.” But we didn’t do that. We had lawsuits. Courts started holding them accountable. We passed a lot of laws. And guess what? The car companies survived. We all still get to drive cars. The Internet will survive. These corporations, if they are forced to be held accountable, will find a way. We don’t have to have this system we have now.