Mark Wallace & Hany Farid

Combating Extremism Online

Air Date: March 3, 2018

Counter Extremism Project CEO Mark Wallace and Dartmouth Computer Science Chair Hany Farid discuss how they campaigned successfully to remove online extremist propaganda.


HEFFNER: I’m Alexander Heffner, your host on The Open Mind. In response to Google-operated YouTube’s decision to remove the sermons of an Al-Qaeda operative from its platform, one of our guests today commented approvingly, “This action will save lives. No longer will at-risk persons be able to casually encounter this kind of bigotry. As a society, we must say enough is enough to the extremist takeover of cyber-space. All other responsible social media companies, internet service providers, and communications platforms must take action once and for all to remove the world’s extremists from cyber-space.” Those are the words, truly important words, of Counter Extremism Project CEO Ambassador Mark Wallace. He and his colleague Dr. Hany Farid, senior adviser to the project and chair of computer science at Dartmouth College, deserve enormous credit for their work. The organization they lead and advise is a non-partisan, international policy organization formed to combat the growing threat from extremist ideologies; it is designed to pressure financial networks, counter the narratives of extremists online, deter their recruitment, and advocate for smart laws, policies, and regulations. Gentlemen, congratulations on that achievement and welcome.

WALLACE: Thank you. That’s quite a wind-up. I feel like one of my family members put that into your program! Thank you.

HEFFNER: Well, I think that we have to err on the side of pro-social solutions today. I was saying to you guys off camera, we did a show with the Anti-Defamation League CEO, which I called “Incubators of Hate.” And you are combating, or attempting to combat, these incubators of hate. What was your process, Ambassador, to get YouTube to the place where it would remove the content?

WALLACE: You know, it was really a process of cajoling, begging, a little bit of pressure, maybe a little bit of shame. Because look, as critical as we have been of a lot of the social media companies that, in my opinion, haven’t done enough: these are great companies that have brought wonderful technologies, but those technologies have frankly been misused in a manner that threatens our safety, our national security, our place in the world. And they hadn’t always done enough, so we had started calling on them through phone calls, letters, emails [LAUGHS], testimony, across the board. And I remember having a chat with some of our social media company friends and saying look, in my opinion, this is very much like any new industry. When the automobile industry first came along, we didn’t have airbags, we didn’t have bumpers, we didn’t have all sorts of things. The industry had to be collaborative in advancing technological solutions that ensured there were guard-rails, to continue the metaphor. I think the social media companies had to realize that they had to be part of the solution, and now I think they are catching up, probably a little bit late. But I think we have to celebrate them for now trying to do better. There’s still a lot more to do though.

HEFFNER: From the computer scientist vantage point, Dr. Farid, were you surprised that they finally took action after so many months, years of petitioning them?

FARID: I knew it was going to happen; it was just a matter of time. And that’s because this is a pattern of behavior we’ve seen, not just in the last few years, but frankly in the last few decades. Back in the mid-2000s we were fighting to help stop the global proliferation of child pornography, and the pattern of response on the extremism side was very similar. You deny the problem exists, you minimize it, you argue that there are no technological solutions, and over time, with pressure, emails, phone calls, testimony, press coverage, they eventually get there. And so I think the end-game was clear to me. We were eventually going to get there; I just knew it was going to take several years, because we have been here before. As Ambassador Wallace said, I would like it to have been faster; I think it was too slow, and I think there is more to do. But I’m extremely encouraged that we are now able to agree on the worst of the worst content. We are not talking about stifling free speech or expression or discussion or dialogue or disagreement; we are talking about people who are promoting violence against our citizens. We have to agree that there is no room on our platforms for this, there is no room for child pornography, there is no room for lots of this type of speech. And once we’ve agreed on that, then there is a path forward at a technological level, at a policy level, and at a legal level. I think we’re in the early stages of getting there, and I’m optimistic, but as Ambassador Wallace said, there’s more to be done.

HEFFNER: Based on how YouTube and Google arrived at this decision, what comes next in terms of the proliferation of bigotry beyond Islamist, fundamentalist, international terrorism? We here in this country have had an epidemic of significant proportions of hate directed against immigrants, against Jews, bigotry of all kinds. So what inspires hope that they’re gonna take action against this as a universal cause to champion?

WALLACE: Well look, in the case of the Anwar al-Awlaki example, which precipitated our discussion, his removal from Google’s YouTube, you had a fascinating dichotomy. You had President Obama undertaking a finding to kill an American, in Yemen at the time, with a drone strike. But at the same time we were somehow protecting the proliferation of his cyber-jihad, his recruitment, his call to action. We were somehow protecting him under our terms of service contracts, which aren’t free speech. So I think the question now has to become: what do we want, from a cultural perspective, of our social media platforms? These platforms are effectively our neighborhoods, our online neighborhoods, where we want our children and our families to be. I personally don’t wanna have hate in my discussion. We protect a variety of speech, but we’ve had a robust free speech debate for hundreds of years now in the United States, and I think we’ve done a pretty good job. I think we can make decisions on terms of service, and reward the companies that get it right. The reality is that pornography, in many instances of course, is free speech and is legal. But, for example, if you wanted to go on Facebook today and look at nudity or pornography, you couldn’t do that, because they have concluded that they don’t want their neighborhood, their community, to have pornography. That was a business decision. We hope that from a business perspective, from a social perspective, from a cultural perspective, these social media companies will say look, some of the worst parts of social media and the internet we don’t want on our platforms; it’s not permitted under our terms of service. Let’s take that out, let’s have a place that works and is successful, and really promotes the best aspects of the internet. That’s where I hope this goes next.

HEFFNER: I presume some of those videos of the terrorist in question amassed significant views, in the millions, if not more. Is that accurate?

WALLACE: Well, you know, it depends. Anwar al-Awlaki certainly had an enormous following, and his videos featured in numerous completed terrorist attacks that killed many people, and in planned terrorist attacks in the United States and in Europe. So to the extent that you can take that away, so it is no longer easy for somebody to follow an Anwar al-Awlaki, I think that’s key. Look, I studied terrorism back in the 1980s, the European, sort of nihilist, anarchist terrorist groups of the time, like the Red Army Faction and the Red Brigades. These were bad people; I don’t want to minimize that. But they were quaint in comparison to the conglomerate that is an ISIS or an Al-Qaeda. I was talking to Giulio Terzi, one of our colleagues, a former foreign minister of Italy, and we were talking about how the Red Brigades would hand out pieces of paper with five red stars in underground cafes to try to get people to join them, to recruit or to fundraise. Now look, if they were around today there’d be a Facebook page, there’d be secure messaging, there’d be a WordPress site. They would have an entire conglomerate. The terrorists have taken advantage of opportunities on the internet to reach into our neighborhoods, but we haven’t caught up from the legal, business, cultural, and political perspective.

HEFFNER: Well, you name it: political, cultural, and societal, ultimately, right? So were those videos of the cleric being monetized in the way that Infowars’ were? There was an excellent piece in 2017 in the New York Times Magazine on how some of this bigotry has been further emboldened through the vicious cycle of monetization.

FARID: Yeah yeah.

HEFFNER: Where does this parallel hold? In other words, would these companies have an incentive not to act in the case of the terrorist, and similarly not have an incentive to act because the KKK videos are being monetized?

FARID: So I mean, you ask sort of the right question, which is: what has been the tension keeping the technology companies from removing this stuff? Right? That’s a really interesting question, and you know, we might be able to have a reasonable discussion as to whether you should leave the al-Awlaki videos up or not. But I will tell you that back in the day when we were talking about child pornography, images and videos of eight-year-olds and four-year-olds and infants being sexually molested, the tech companies were saying the same thing: we don’t really feel like it’s our business to take this material down, we are a platform, we hide behind the Communications Decency Act that says we are simply a neutral broker, and if there’s a law enforcement issue, let law enforcement deal with it. I have a real issue with that. I have a real issue with a company saying that it has no moral fiber, no responsibility to the world it has unleashed this technology on. And the thing that I found very frustrating was that the same technology we developed to counter child pornography can be used to counter extremism, hate, and lots of other things. It is agnostic as to what it looks for. And so we have developed this very powerful technology that is capable of making these platforms safer, and the Googles and the Facebooks and the Twitters are choosing not to use it. We can have an interesting conversation as to why that is. I think part of it is financial; I think part of it is philosophical: we are a platform, we don’t wanna get in the business of arbitrating. I think that discussion is over, by the way; we’ve agreed now that we have to start cleaning up these platforms, they are poisonous. And I think the reason we are doing that is public pressure. I think it is institutional pressure. But I think it is also that the public has grown weary of the platforms. We are tired of hate and the horrific things that people are saying and doing online. And I think the platforms are saying, wow, if we don’t clean up, our platform is going to have a limited shelf-life, because people are going to …

HEFFNER: Mustn’t they act now? Because it can’t be done on an ad hoc basis; they have to commit. And if they take the videos of an Islamic fundamentalist down, don’t they have to take down the videos of the KKK profiteers?

FARID: Absolutely, and here’s where it gets tricky, because things get really complicated as soon as you open that door, so you can understand why they never wanted to open it in the first place. And these companies have been doing business like this for a long time. The thing you also have to understand is that the reason Google and Facebook and Twitter are so successful is the scale they operate at. It is an unprecedented scale, billions of users. How do you get to a scale like that? The only way is full automation. You can’t have humans in the loop and deal with a billion customers. So the way all these companies operate is by developing technology that allows them to scale very quickly, whether in selling advertising or in getting things uploaded to their sites automatically. As soon as you have moderation, you can’t operate at that scale anymore, and this is a really deep, fundamental problem with the business model of these tech companies: it’s not clear that they can actually control what they’ve created.

HEFFNER: Ambassador, isn’t this where you need to imagine the scientific possibilities of being able to eliminate imagery or footage on a computational basis?

WALLACE: Well, I think that’s exactly right, and look, the scientific basis is sitting next to me. Right? You know, Hany’s the world-leading scientist in hashing technology, which, and I know nothing about technology, is the technological solution to the moral question. We’ve decided culturally, and that’s the starting point, the moral point. We’ve decided that we don’t think child pornography, child molestation and abuse, is acceptable. I think we’ve decided culturally now that foreign hostile governments interfering in our elections is unacceptable. I think we’ve decided now, culturally and socially and legally, that extremists using online platforms to commit violent acts and hurt our people is unacceptable. So that’s the moral case, and I think the social media companies are now coming around to it. The technology absolutely exists. Hany did this in concert with Microsoft and others in the context of child exploitation and abuse, with the National Center for Missing and Exploited Children. And the technology is the same. What we’ve seen and learned is that whether it’s a child pornographer, a violent extremist, or a manipulative Russian bot that’s trying to take an election from us, they use the same content over and over again, and we can identify it. We should be able to identify that content instantaneously, as Hany has done with this algorithm called eGLYPH, and have it removed from the platforms where it adversely affects our people and our communities. That technology exists today; it’s accessible, it’s easy, it’s doable, and there’s no excuse.
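
[eGLYPH itself is not published code, but the general idea behind the robust hashing Wallace describes can be sketched in a few lines of Python: compute a compact signature of a piece of media that survives re-encoding or resizing, then compare each new upload against signatures of content already judged violating. This is a minimal illustration only; the filenames, the threshold, and the use of a simple average hash are assumptions, not the actual eGLYPH algorithm.]

```python
# A minimal sketch of robust ("perceptual") hashing in the spirit of
# PhotoDNA and eGLYPH; the real systems are far more sophisticated and
# extend to video and audio. Requires Pillow (pip install Pillow).
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Return a 64-bit signature that is stable under re-encoding and resizing."""
    img = Image.open(path).convert("L").resize((size, size))  # grayscale thumbnail
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p >= mean)  # 1 if pixel is brighter than average
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two signatures."""
    return bin(a ^ b).count("1")

# Hypothetical blocklist: signatures of content a human already judged violating.
blocklist = {average_hash("known_extremist_frame.jpg")}

def is_known_bad(upload_path: str, threshold: int = 5) -> bool:
    """Flag an upload whose signature is near any blocklisted signature."""
    h = average_hash(upload_path)
    return any(hamming(h, sig) <= threshold for sig in blocklist)
```

The design point this illustrates is the one Wallace makes: because bad actors reuse the same content over and over, a near-duplicate match against known material catches re-uploads automatically, with no judgment call needed at upload time.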

HEFFNER: I want to come back to you, Ambassador, in a second on the question of political will, which will be imperative. From the scientific perspective, this formula that you’ve designed, how could it be implemented to deal with the onslaught of memes and videos, the multimedia dimensions of that hate?

FARID: So one of the things we have been hearing a lot about these days is machine learning and artificial intelligence, and how these technologies are going to come to the rescue: just give us some time, we’ve heard Mark Zuckerberg say, and we will develop technology that will separate the bad content from the good content. The problem with that is, I think it’s at best naïve. We are not even close to being able to automatically and accurately deal with a billion uploads to Facebook a day, or hundreds of hours of video uploaded every minute to YouTube. Operating at internet scale is phenomenally hard from an engineering perspective, and there are still things that humans are uniquely qualified to do. One of those things is to understand subtlety and nuance and intent in content. So what we advocate is a collaboration between human moderators and technology. Human moderators are at the front end, saying this is child pornography, this is a call to extremism, this is a bomb-making video, this is hate speech. Humans make that determination, because that’s really, really hard for computers to do.
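
[A sketch of the division of labor Farid describes, with all names hypothetical rather than any platform’s actual API: a human moderator makes the nuanced judgment once, the system records the item’s signature, and automation then enforces that single decision against every future upload at scale.]

```python
# Illustrative human-in-the-loop moderation pipeline; the class and method
# names are hypothetical, not any platform's real interface.
from dataclasses import dataclass, field

@dataclass
class ModerationPipeline:
    blocked: set[int] = field(default_factory=set)  # signatures humans have flagged

    def human_review(self, signature: int, violates_policy: bool) -> None:
        """A human makes the hard judgment (nuance, intent, context) once."""
        if violates_policy:
            self.blocked.add(signature)

    def on_upload(self, signature: int) -> str:
        """Automation applies that one human decision to billions of uploads."""
        return "reject" if signature in self.blocked else "publish"

pipeline = ModerationPipeline()
pipeline.human_review(signature=0xDEADBEEF, violates_policy=True)  # moderator verdict
assert pipeline.on_upload(0xDEADBEEF) == "reject"  # future copies auto-blocked
assert pipeline.on_upload(0xCAFE) == "publish"     # unflagged content passes through
```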

HEFFNER: And that’s what Facebook has refused to do, establish itself as a media company, even though it’s made some acknowledgement to that effect. The fact that they don’t have an editorial board…

FARID: That’s exactly right.

WALLACE: But look, I learned this from Hany. Next time you have a guest from one of the social media companies, ask them how much child exploitation and abuse material they’ve removed from their platform in the last 15 years. They don’t want you to know, ‘cause it’s a really big number. You know, they’ve been chasing it for a long time. And they don’t want you to know because it’s embarrassing for them, it might hurt their sales. It also happens to be morally wrong. And we can do a lot better, because the science exists.

HEFFNER: One of the things from the political vantage point that impressed me during the recent testimonies on Capitol Hill was Senator Sasse’s question, which got at the core of the human dimension, which is, you know, how is a computer programmer equipped to address the interpretation of the Quran and assess whether it’s fair use or not? That really is the fundamental question. So don’t we need to petition them? And don’t you, as the Counter Extremism Project, need to ensure that they have the editorial equipment to grapple with the challenges?

WALLACE: Look, we could have a vigorous debate about what should or shouldn’t be removed. We’re not even there yet; if we were actually having that debate, the world would be a much better place. And look, maybe there would be parts I would disagree with and parts I would agree with, but what I can say is that there are so many calls that aren’t close. From Anwar al-Awlaki to child pornography to Russian bots, there is an entire universe of clearly wrong, immoral, illegal content that can be easily removed and that is replicated very quickly around the internet. Look, there will be close calls, and we as a society can debate those close calls. But before you get to the close calls, let’s agree as a community and as a society that we’re not gonna tolerate the most egregious stuff.

HEFFNER: I think what you said earlier was so compelling, that we’ve lost this shared common value with respect to speech.

WALLACE: Right. You know, there’s a real red herring in the room in this online extremism debate. The social media companies try to wrap themselves in a First Amendment debate. Remember, they’re governed by terms of service, as I said previously. But we have a really robust and well-developed First Amendment discourse that we’ve built over several hundred years now, and I think it’s been pretty well done. We’ve concluded as a society that some speech isn’t acceptable, that we don’t want to have it. We need to take that thoughtful discussion and debate that we’ve had and extend it onto social media platforms, and say look, we’re just not gonna tolerate this. We don’t tolerate hate speech. If you put a swastika on a temple here in New York City, it’s graffiti, it’s desecrating a building, it’s probably vandalism, but it’s also hate speech. If you put a swastika on a website, in a way, I think it’s the same thing. Why can’t we say, culturally, no more?

HEFFNER: And according to one expert, the climate in which Timothy McVeigh operated in the ‘90s, which was then largely closed off from the web, is now rampant online, and these companies have in effect enabled the peddlers of hate to have a legitimate platform. I hope what you’re saying is achievable, or is the sentiment of the new generation coming of age, but we hosted Trevor Timm, president of the Freedom of the Press Foundation, and he basically said he thought that Oliver Wendell Holmes’ formulation, that you don’t yell fire in a crowded theater, was obsolete. That that notion was obsolete. And to me that’s so problematic, because it gets at the core of this problem.

WALLACE: If we’re in a theater and someone yells fire when there’s not a fire, and people are trampled to death, it’s just as applicable today as when Justice Holmes said that. And, you know, our free speech, look, we have free speech, but no speech is totally free. We’ve concluded that there are things we won’t accept as a people. And all we’re saying is, let’s extend that to things that are online. The argument the companies make is that they’re just the phone line, the information just comes through, we can’t do anything about it. And that’s not true, because they have advertising, they have pop-up ads. And frankly, one of the things that moved the social media companies recently, and I’m probably mis-describing it, is that the ads popping up on an Anwar al-Awlaki video or the like were from prominent companies, and those companies said to the ad-sellers, wait a second, you can’t be selling our products on a kill video, an ISIS kill video.

HEFFNER: So, next steps for the technology and the implementation, and the lobbying, quite frankly, which you did feverishly and successfully. What’s next?

FARID: Well, I can tell you on the technology front: what you have to understand about these groups, whether they’re the child predators or the extremists, is that they don’t just go away. You don’t just create a technology and they disappear. Think about the spam issues we’ve been dealing with for decades, the virus issues, the malware issues; it is an ongoing fight. For years there’s been no barrier to entry for these groups, right? YouTube, Twitter, Facebook, easily accessible. Now we’ve created a slight barrier to entry, and now we have to keep moving that front, because they will adapt. We know the adversary in this situation will adapt, for financial reasons or ideological reasons. So we can’t simply say, okay, we’ve deployed a technology, we’re going to sit back for a decade and see what happens. We know it’s going to go bad again, and so there has to be constant development of new technologies, thinking about how the adversary will adapt and how we keep moving that front forward so we are constantly fighting them. We have to make the internet hostile to these organizations, and it is not hostile right now; it is in fact welcoming.

HEFFNER: And making sure, Ambassador, that they are living up to the al-Awlaki standard in removing future videos of clerics who preach the same hate. How can we stand guard to make sure they are?

WALLACE: Well look, I’m about to make a slightly sarcastic comment, and I have enormous respect for these technologies that have come mostly out of Silicon Valley. But the gods of Silicon Valley are no different than the titans of other industries. And what we want from them is transparency going in, knowing what they’re gonna remove, because it’s very clear. The artificial intelligence, the machine learning, that’s for the stuff that may happen in the future, so we pick it up quickly. But we already know the stuff we don’t want on their platforms. So transparency: let’s have a discussion about what they’re gonna remove. And then accountability coming out: tell us how much you’ve removed, where you removed it, how you removed it, and why.

HEFFNER: Did they promise that?

FARID: No.

WALLACE: No. So, and I think…

HEFFNER: This could have the potential of being a one-off.

WALLACE: And in my comment calling them the gods of Silicon Valley, I assure you, if we were referring to oil companies or car companies, which are more mature companies that have felt community and social criticism over the years, appropriately so… I think the social media and internet companies are still new. But they have the same responsibilities of community and social accountability that car companies, oil companies, and every other company have. And frankly, the dangers and risks in this industry are just as profound as in others.

FARID: Right. And I’m convinced, by the way, that without pressure from media, politicians, NGOs, there would be no action. So that pressure has to be sustained. We can’t simply step back and say, you’ve taken al-Awlaki down? We’re done. That, that is not going to …

HEFFNER: And why is the American body politic, or at least our representatives, why are they not up in arms about this?

WALLACE: They are!

FARID: They are now. I think this year has been pivotal; we had two Senate hearings and one House hearing this year alone, and there is more to come. I think it is finally the combination of the fake news, the election tampering, the extremism, the child pornography, the poison that is frankly affecting them personally. When these men and women are up for re-election, they are worried about the fake news, they are worried about the election tampering, and I think you are seeing bipartisan support for real, hard questions. What we saw reminded me of the tobacco executives. Right? The Google, Facebook, and Twitter…

HEFFNER: Exactly.

FARID: Exactly. Putting their hands up in front of the Senate and talking about how their platforms are being misused, how they are designed to be addictive. Right? It was not that dissimilar …

HEFFNER: Well, I have to say, and I’ve said this on a recent show, but I’ll repeat myself: it was really a shame that those elected officials did not subpoena or demand that the CEOs themselves appear…

FARID: I agree.

HEFFNER: Because it’s the only industry where that hasn’t happened before.

WALLACE: I personally yearn for subpoena power in this context. [LAUGHS] But no, it would be incredibly powerful. You’re citing the oil companies; well, we actually attended a briefing, and we’re not allowed to talk about it because of the ground rules, it had to be private, closed to the press, but it was a briefing with high, fancy leaders of the social media companies in front of important Senate leaders. And without breaking those confidences, one Senator, a great man, said: look, you social media companies either get your act together and deal with this, or we’re gonna deal with it for you. I think that’s the posture of Capitol Hill right now, and I hope the Capitol keeps up that pressure. And, you know, we called the al-Awlaki thing a watershed moment. I do think that was a big moment, and hopefully the social media companies are trying to do more. We’ve sensed, I think, a change…

FARID: This year has been really pivotal.

WALLACE: And you know, as critical as I am in this discussion, I do want to say, we want to work with them. We want them to continue to do the right thing; it will make their businesses better, more valuable, and they will be more respected and admired across the community and the business world.

HEFFNER: I’m unconvinced, gentlemen, that they will stand up against what has become this purist attitude around the First Amendment, which seems to have so much traction…

WALLACE: It’s ridiculous. It’s not the First Amendment; it’s terms of service. Go try to look at some pornography on Facebook. Well, I probably shouldn’t say that, this is TV: go look at some nudity or some pornographic acts on Facebook. You can’t. That may be free speech, but they’ve decided from a business perspective that they don’t want it on Facebook. It’s not a First Amendment issue; it’s a business decision.

HEFFNER: It’s mind-boggling to me that television regulation has been intact, for the most part, since the creation of the FCC, while the internet has been the Wild West, ungovernable, thanks to the absence of any regulation or of any kind of terms of use being adopted and followed.

WALLACE: Don’t say anything you can’t say on air just to prove the point.

FARID: I understand. [LAUGHS]

WALLACE: We’re limited in what we can say on this program. [LAUGHS]

HEFFNER: Thank you both, gentlemen.

FARID: Thank you.

WALLACE: Thank you.

HEFFNER: And thanks to you in the audience. I hope you join us again next time for a thoughtful excursion into the world of ideas. Until then, keep an open mind. Please visit The Open Mind website at Thirteen.org/OpenMind to view this program online or to access over 1,500 other interviews. And do check us out on Twitter and Facebook @OpenMindTV for updates on future programming.