Ali Breland
Breaking Up and Deradicalizing the Socials
Air Date: March 23, 2020
HEFFNER: I’m Alexander Heffner, your host on The Open Mind. Resuming our recent discussions on dis- and misinformation, we’re delighted to welcome Ali Breland, who reports on tech and viral deception for Mother Jones magazine. Breland covers Internet culture and its impact on society, including race and politics, and has also appeared in the Guardian, Vice, and on NPR and CNN. Welcome. It’s a pleasure to have you here.
BRELAND: Thank you.
HEFFNER: One of the most fascinating things that you’ve written about recently are the efforts to de-radicalize the far right on online platforms, where they have been culpable for spreading not just innuendo and dis- and misinformation, but often the origin of extremist hate and violence, most recently in Germany, but it’s happened often in the United States. Can you tell our viewers about where this de-radicalization is occurring?
BRELAND: Yeah, absolutely. So to start, the radicalization occurred on the traditional platforms. People found radical content on places like YouTube and Facebook and Twitter. And there’s this newer platform that’s kind of lesser known. It’s called Twitch, and it’s a video game streaming platform. The quote unquote normies don’t really know about it, but gamers are very familiar with it. It has a surprisingly large viewership, in the millions and tens of millions. And you’d expect gamers to be sort of libertarian or far right. Gamergate was this important moment in their culture where they had a big harassment campaign against women and against social progress in the gaming industry. And so gamers have this reputation for being sort of right-wing trolls. But there’s this sort of constellation of Twitch streamers who are fairly high profile who are progressive. Some are socialists or leftists who are trying to sort of win over the hearts and minds of gamers and de-radicalize these people who are far right.
And so the most prominent one that I’ve spoken with is this guy named Destiny, who’s a streamer. There are others like Hasan Piker, who’s also a host on The Young Turks, and this streamer named Trihex. And they basically just sit down and try to push their own progressive arguments and try to debunk a lot of right-wing conspiracies. Destiny’s particularly interesting because he’ll go and debate white nationalists, he’ll debate far-right figures. And that has a surprising effect: in addition to just challenging these people’s ideas, he’s getting his name in the YouTube algorithm against the name of someone like Lauren Southern, who’s a known white nationalist. So you search Lauren Southern, and if this video doesn’t come up on the first page, it’ll eventually come up in the recommendation algorithm, and you’ll see not just Lauren Southern’s ideas, but her ideas challenged and interrogated by someone who’s really adept at tackling the sort of dangerous hoaxes she peddles about things like white replacement.
HEFFNER: Is this de-radicalization occurring in dialogue on the platform at least initially in chat rooms while folks are gaming, and not in person?
BRELAND: A little bit. So it’s hard to say. I mean, if anything, some of the radicalization is probably being reinforced there, and then it’s sort of changing when these people go and watch these streams. And these streamers like Destiny and Hasan and Trihex have very devout and passionate followers who are presumably going out to other platforms, arguing with other people, and doing some de-radicalization efforts on their own.
HEFFNER: But that opening dialogue of de-radicalization, does it start with the video, just as the radicalization does, or does it start with linking up in conversations in chat rooms on these platforms or other platforms? It doesn’t really start necessarily with meeting in person.
BRELAND: Yeah.
HEFFNER: And I’m wondering, because technology has been at fault in a lot of instances for giving weapons to the arsonists, is this playing out in reverse, or at least in some corrective measure, where you can start having a conversation with someone in a sort of more peaceful or nonviolent context?
BRELAND: Yeah, totally. It’s facilitating both of those things in every avenue. It’s important to note too, that a lot of this is anecdotal and there’s a lot of good anecdotal evidence, but like we don’t have data that firmly shows us that this is like doing a tremendous job. But there’s a lot of anecdotal evidence to suggest this is definitely the case. But yeah, no, it’s definitely fostering these kinds of communications online and reversing the course in this kind of radicalization.
HEFFNER: I mean, have there been forensic analyses of the platforms so that we can say with authority now that the opposite is not anecdotal? In other words, the radicalization shouldn’t be understood as anecdotal anymore.
BRELAND: Yes.
HEFFNER: It is a scientific fact that the preponderance of hate occurs on these platforms and leads to, in some cases, massacre.
BRELAND: Yeah, that’s been pretty established by a lot of researchers. One of the preeminent researchers in the space who’s written about it is Becca Lewis, who did a really, really good study breaking down how radicalization pipelines occur. Other people have too; Bellingcat, for example, the open-source investigations website, has done a really good job. They did this study of 75 different extremists and tracked their progress from places like the comments on Alex Jones’s YouTube videos, through their introductions to Reddit, through their introductions to far-right message boards on websites like 4chan and 8chan. So it’s been well documented, and there’s a lot of substantial data that shows that radicalization is definitely occurring through these tech platforms.
HEFFNER: Gaming, just like they say about sports, can be a moral bridge and a kind of healer. You know, in the ’90s and 2000s, we thought of gaming culture as being an underbelly that might prompt people to actually enact violence.
And I’m wondering if, you know, the research now can establish the opposite, that in many cases, psychologically and morally, it’s tempered any sort of desire to physically harm people.
BRELAND: In terms of it being a positive force, I would say that could definitely be the case. The research firmly shows, though, that at the very least, it’s not the force that we thought it was in the ’90s. At worst, gaming is just neutral and doesn’t have an impact, but at best, yeah, it’s a really good release for people. But it’s also important to think about gaming not just as an entity of its own, but as a community. And so when you have a community of people who have ingratiated their voices in certain ways and become more powerful, their words go further. And so Destiny is a much more valuable source for de-radicalization than, like, you or me going on Twitch and trying to talk about things, because to a lot of these gamers, you and I are these normie-like shills who just hang out and we’re not a part of that community. Destiny is a gamer. He’s one of them. He’s extremely good at Warcraft. And so what he says in that community goes way further than anything anyone else could say.
HEFFNER: Ali, I feel like I’m immersed in an episode of Mr. Robot right now with the names you’re saying. And just this conversation.
BRELAND: You mean do you need me to explain normies?
HEFFNER: Yeah, sure.
BRELAND: Normie is Internet slang that, I think, maybe started in the far right but has been co-opted universally. A normie is, it’s equivalent to another slang word, basic, where you are a very normative person. You like to go to Starbucks and you like normal brands and things like that, watch The Office, and, like, you like good general things. Whereas if you’re not a normie, you’re deep in the recesses of the Internet.
You like weird things. You’re an edgy person.
HEFFNER: Well, I don’t go to Starbucks, but I think I still am a normie. I haven’t really watched The Office, but there are shows, like The West Wing, that you could say are more normie territory. But I think what’s going on with Twitch is an interesting example, because you were talking to me about our episodes with the executive directors of the Wikimedia Foundation, and we have described to our viewers how Wikimedia and Wikipedia are the exception to the rule of incentives that are striking at the heart of democracy or civil society. And what Destiny is doing is a corrective course that, you know, is sort of the M.O. of a Wikipedia, where we want to engage people with content. Now, it’s behind closed doors, and the moderators are not in discussion with the readers. But I’m wondering how folks like Destiny can integrate more fully into an algorithm or incentive system that is better, because we know that the Alphabets, YouTubes, Twitters, and Facebooks have corrupted us and society in a way that’s been unhealthy. So, you know, I’m just spit-balling here, but wondering how Destiny and Wikipedia, or at least those incentives, can hook up.
BRELAND: Yeah, definitely. For me, that’s a hard question to answer. Destiny in some ways is taking advantage of the arguably perverse ways that these platforms operate. All these platforms are trying to attract our eyeballs for as long as they possibly can, and so that creates an incentive where you’re going to produce very, very polemic content. And the right wing figured out that that was very easy for them to do to gain a lot of eyeballs, and that YouTube was going to prioritize their content because it was very polemic.
And so Destiny figured out how to do the inverted left version of that. And it’s sort of using their own weapon against the far right, against extremists. But I personally am inclined to believe that even though Destiny and Hasan Piker and Trihex are doing really useful, thoughtful, important things…
HEFFNER: They’re doing a public service.
BRELAND: They are. But I think that the ultimate answer isn’t in these guys doing one-off things where they’re de-radicalizing people. The ultimate answer is in restructuring platforms the way that Wikipedia works, where you don’t have these sorts of perverse incentives to produce hyper-polemic content that might or might not be true. You’re just creating a fully democratized system where there is not an algorithm prioritizing certain types of content.
The best information just wins out naturally because there’s no secret algorithm guiding everything.
HEFFNER: Right. And we’ve talked about this. You were alluding to a piece I wrote in Wired where greed is not prolific and isn’t the dominant incentive. How can we strive for even any small molecule of the system that you’re describing when we’re in a climate that allows YouTube and YouTube’s decision-makers to say to the Trump campaign, which they have as of this recording, come on down and buy YouTube’s homepage for the entire pre-election duration, the days leading up to the election, and Election Day?
BRELAND: I’m not sure what the best answer is. I’m not a policymaker, but there’s a few answers that different communities that people are pushing that I think are very interesting and worth considering. So I think the most common mainstream answer right now is something that Senator Elizabeth Warren is pushing for where you break up these companies or introduce some level of regulation to restrict their ability to prioritize certain types of information and operate in certain ways.
Breaking them up would also reduce their power in prioritizing certain types of information and feeding that to the masses. There are also, and these are way less popular ideas, some interesting thinkers further on the left, socialists, who believe that these are necessary functions of capitalism and that you would need to either nationalize these companies or replace them with private co-ops, or with nonprofits like Wikipedia, where there isn’t a financial incentive. A lot of people would be opposed to that for a lot of reasons. There are a lot of Americans who don’t think that’s a very good idea. But I think all of these considerations are very interesting and can be fused in different ways to potentially create the scenarios that we want.
HEFFNER: How do the folks that you interact with in reporting on disinformation, both the disseminators of disinformation and the folks that you mentioned who are trying to correct disinformation and help in a way pacify or moderate the more extreme voices, how do they react to the idea of breaking up the big social media companies? Among the folks who are in these chat rooms, both on the left and the right, is there any kind of agreement that breaking up the big social behemoths would be a good policy?
BRELAND: Yeah, I think that, probably save for some, you know, extreme exceptions, this is a very bipartisan, sort of post-ideological point of consensus. Ironically enough, the people that would probably benefit from these things, people like Ben Shapiro and others, want these companies to be broken up. People are very critical of them from both the left and the right. The right constantly calls YouTube and Facebook and Twitter biased against them. The left thinks that they’re elevating the right. And everyone thinks that there needs to be some solution. There is some disagreement as to specifically how to get to these types of things.
People like Ben Shapiro and Steven Crowder, who are two big right-wingers, would not advocate for nationalization and would not advocate for worker-owned co-ops. But…
HEFFNER: There is this tension, Ali, right, because there’s a huge cohort of libertarian activists on the web. And so do they want to preserve the rights of these major companies? In other words, do they follow the Mitt Romney idea of a company has human dignity and human rights, or do they come together with maybe some leftists, some liberals and say this is out of control and we need to democratize and have not, you know, three or four major companies in control the whole thing?
BRELAND: I’m sure that they exist. I personally haven’t come across them that frequently, but I’m sure they’re out there. I mean, Peter Thiel thinks like this. Peter Thiel thinks that pursuing a monopoly and successfully achieving one is the absolute apex of capitalism.
And he means that in a very good way. He thinks that that is an incredible thing to be able to do. But I would say, by and large, that’s probably a fringe position at this point. I think that most people on the fringe right are skeptical of tech companies and think they need less power. People in the mainstream right also agree with that. Same thing on the left, across all sides of it. There are very few exceptions to that.
HEFFNER: So I know you say you’re not a policymaker, but where does that leave us practically in thinking about the future of digital?
BRELAND: Yeah. In what sense actually?
HEFFNER: In the sense that the people who are occupying the space, the users if you will, or the readers or the viewers, they’re actually not very content with the system right now, but they’re still the cogs in the machine.
BRELAND: Yeah.
HEFFNER: And you know, that’s, I don’t know if you’re a fan of Mr. Robot, the show. Have you watched it?
BRELAND: No.
HEFFNER: You just haven’t seen it. But I urge you to, because I asked Joy Buolamwini the same thing when she was on here. I said, you have to see it. I hope she’s seen it now. But the reason is that show is foreseeing what happens when you have the spillover from a dissatisfied populace on the web wanting something more. In that case, it was, you know, zeroing out people’s debt and taking over the banks, basically, and saying this system is not equitable. But I’m just wondering if you see there ever being a perfect storm, if we don’t resolve the policy issues on tech, where there’s an opportunity for some massive upheaval. You know, right now tech is really top-down. In order for it to be bottom-up, there has to be a revolution, and it can’t just be clicks and likes. And I’m wondering where, or when, that may happen.
BRELAND: So lawmakers keep telling us that we’re not going to get to that because they’re going to legislate on this first, but, you know, we haven’t seen any movement after years of rhetoric on this. If that kind of thing does happen, I don’t know. It’s been a long time in the United States, I think, since there’s been massive upheaval, and I think a lot of people are having a hard time conceptualizing how; they know that something is bad, but they don’t understand the exact mechanics of it. But you might not need to if you’re just really frustrated. I think other countries like France, with a history of more quickly revolting and getting very angry, would maybe be where these things start. But at the same time, Europe has better frameworks for trying to address these kinds of issues. So I’m not really,
HEFFNER: It takes one deep fake.
BRELAND: Yeah, that’s true
HEFFNER: To kill a nation. Yeah. It takes one deep fake to inspire that revolt. Right. I mean, one powerful enough moment.
BRELAND: Yeah. I like where you’re going. Yeah.
HEFFNER: I mean that could lead to saying we need either regulation or a new system or better system. But you’ve done some reporting on deep fakes. Tell our audience what deep fakes are. We’ve said it before and how they’re potentially going to influence the digital economy.
BRELAND: Yeah. In that sense, deep fakes are these algorithmically generated videos where basically someone can take one person’s face and put it on another’s. So, for example, if someone were making a deep fake, they could take footage of both of our faces and essentially, in a weird way, have algorithms compete and basically make my face overlay yours. And then you could talk, or I could talk rather, and it would look like, you know, whatever’s happening on your face is actually happening on mine. And it’s this very interesting process. And the fear is that basically I could have a situation where my face is being used to control video of Donald Trump’s face or Jeff Bezos’s face. And the harm in that comes from, you know, let’s say I’ve shorted a bunch of Amazon stock. I make a video using a fake of Jeff Bezos’s face, and I say, hey, you know, I’m stepping down and I’m selling the company to someone who’s really bad at business. Then the stock tanks, and I make a bunch of money off the short position. But the real concern with deep fakes is not these sorts of effects in the West on things like stock markets, which are very important, but in nascent and fragile democracies, which, to get to your point earlier, could be a revolting point, and actually has been in this country called Gabon in Africa. The country’s leader, Ali Bongo, disappeared for a while. People didn’t know where he was. People were very curious. It turns out that he was in poor health, but this was being hidden from the public, and the conditions of his actual health, when it came out that he wasn’t doing well, weren’t revealed. It turns out that he was out of the country. People were starting to get frustrated. Unrest was happening in the country. So every year in Gabon, there’s a presidential address that the president gives. Bongo gave the address, but it was very different than it normally was.
It’s normally a 15-minute to 30-minute address.
This one was two or three minutes long. And it appeared very stilted. He barely moved. It had a lot of characteristics that made it look like a deep fake. To this day, we’re not sure if it is or isn’t. Experts have told me it could go both ways. But what they also stressed was that it doesn’t even matter whether or not it is, because at this point just the threat of something being a deep fake is enough to spark unrest. And so people thought that this might’ve been a deep fake, and it was very critical in sparking a military coup in the country. The coup wasn’t successful, but it’s still an example of just the presence of this technology and the impacts that it can have, and it can lead to the potential revolutions we talked about.
HEFFNER: In, you know, developing countries, and certainly, I think, in developed countries too, I mean, on the horizon. Could you foresee what I was hinting at: a deep fake that has more societal repercussions than a single stock? Even if it is a deep fake and it’s from a company or a government, like you’re saying, we don’t necessarily know whether or not it’s authentic. So it may not be that someone is hijacking a verified Twitter account or Facebook page. It may be that it’s intentional. This is what you’re saying about the African example, right? I mean, how can we be prepared for intentional deep fakes arriving at our doorstep, produced by malevolent parties that are intent on disinforming us?
BRELAND: So I think in some cases we will be ready in the sense that we have a freer information ecosystem where we can debunk these kinds of things.
And in countries like Gabon, the government can control the flow of information. But that being said, even the freest media can’t account for time. You have to clean up the mess of a deep fake in real time, and if someone puts out a deep fake and the press can’t do their job or verify it or get things done quickly enough, that can have massive implications that can’t be reversed. If someone were to put out a deep fake on Election Day or something like that, and have all of these votes go in a different direction in a specific county in Wisconsin, and have the person who wins that county go on to win the presidency, like, even if it’s a very small error, a small aberration, that could have a massive effect on the fate of the entire country. So you’re totally correct. And there is a reality I can see, I don’t know if this would actually be the case, but if Facebook and Twitter don’t do anything to stop this, maybe there is a reality where people get frustrated with these companies letting these things happen on their platforms, and people do get mad and potentially augur some sort of social upheaval over this.
HEFFNER: Is the situation so serious that every board of elections ought to have a data scientist who is prepared to forensically analyze these kinds of situations?
BRELAND: I think that would be great. It’s also important to note, too, that even if you have that, you need a way to get information out rapidly, because sometimes you won’t even need a data forensics person to look at these kinds of things; it’ll be a very obviously fake video that isn’t deep faked, it’ll just be edited. A great example of this is the infamous drunk Nancy Pelosi video, which is just a video of her speaking, slowed down, with a caption of, oh, Nancy Pelosi is drunk. She wasn’t; she was totally fine. And people believed it even though it was debunked immediately. It’s just…
HEFFNER: And to this day, Facebook and Twitter have permitted it to stay up as a video posted by many verified handles.
BRELAND: Exactly. I think YouTube has a bit more stringent policy, and I think YouTube will remove explicitly fake videos. But even then, all of these platforms have loopholes, and there’s no clear answer as to how these things get resolved. And that’s probably the scariest part: no one, even the supposed masters of the universe, has a clear way out for anyone.
HEFFNER: We hosted the directors of the Counter Extremism Project, and the one campaign that has been moderately successful in removing content from YouTube has targeted propaganda from radical jihadists, not white supremacists. They have made some inroads with white supremacists, but how would you assess, right now, the adequacy of YouTube’s response to white supremacy and racist content on YouTube?
BRELAND: I think it’s very easy to say that these companies, YouTube included, have not done enough. YouTube has taken some positive steps this past week. Prior to this recording, they took down Nick Fuentes’s account. Nick Fuentes is sort of the next generation of these sorts of, I guess, white nationalists. He’s really good at understanding all of these companies’ terms of service, knowing exactly how to step up to the line and not go over it but still spread this sort of dangerous anti-Semitic ideology. He will code it in irony or code it in jokes, or not exactly say things that will get him banned. And it’s a positive sign that YouTube is recognizing that people flirting with violating their terms of service don’t deserve to be on the platform either. So YouTube is hopefully going in the right direction, but there’s still a lot of other stuff they need to deal with, Facebook included, Twitter included too.
HEFFNER: Ali, it has really been a pleasure hearing from you today. Thanks for your time.
BRELAND: Thank you for having me. I appreciate it.
HEFFNER: And thanks to you in the audience. I hope you join us again next time for a thoughtful excursion into the world of ideas. Until then, keep an open mind. Please visit The Open Mind website at Thirteen.org/OpenMind to view this program online or to access over 1,500 other interviews and do check us out on Twitter and Facebook @OpenMindTV for updates on future programming.