How to Demolish the Disinformation Infrastructure
Air Date: April 5, 2021
HEFFNER: I’m Alexander Heffner, your host on The Open Mind. I’m delighted to welcome our guest today. She is the founding director and co-principal investigator of the University of Washington Center for an Informed Public, Kate Starbird. Thank you so much for your time today.
STARBIRD: Yeah, thanks for having me on.
HEFFNER: Kate, what is the latest you can relay to our audience about the state of disinformation in the wake of Donald Trump being removed from several platforms, and also the QAnon cohort being removed from a number of platforms? We know that the disinformation network still exists, but now that those rather drastic, decisive actions have been taken, years after they were first really warranted, where does that leave us today?
STARBIRD: I think there’s a little bit of uncertainty about what comes next. Certainly we have seen the platforms begin to take action against the super-spreaders of misinformation and especially disinformation, and I think a lot of us applaud those steps. We might think they came a little bit late, and we might have wanted them to apply those steps across a larger range of accounts, but certainly we’ve seen some productive action from the larger social media platforms. However, we’re also seeing some really interesting dynamics where people are moving to the long tail of alt-tech platforms and beginning to re-establish their activities there. And when we think about the spread of misinformation and the growing radicalization of large numbers of people around conspiracy theory ideologies and other things, what we may be witnessing is a shift from that happening on the larger platforms to these alt-tech platforms. So there’s a lot of uncertainty and there are open questions about what comes next, but we do see some positive change in terms of the platforms addressing that part of the disinformation problem.
HEFFNER: So, we saw during the insurrection, during the violence on Capitol Hill, that there were alternative platforms, as you’re mentioning; Gab is one of them, an alternative to your Twitter or your Facebook. And now you see the formation of an app like Clubhouse, which so far has been issuing exclusive invitations to people but not really assessing any kind of credentials about whether they’re engaged in information sharing or disinformation or misinformation. So, on the one hand you have platforms like Gab that are associated with the alt-right, disinformation, extremism, and even domestic terrorism, and then you see something like a Clubhouse come along, which hasn’t delineated those lines yet. How would you compare these two different versions of emerging technology?
STARBIRD: Yeah, there’s always a period when a new technology, especially social media, comes into being, where the initial communities that meet up there, and the norms that form around those communities, shape the future of those platforms and what they look like. And they can go a lot of different directions. It really depends on who shows up on your platform and what you want to do about moderation, so it’s an interplay between the decisions the platforms make and the communities that begin to establish themselves on those platforms. And certainly there’s a lot of opportunity for a platform like Clubhouse; there are a lot of different directions for that to go. In fact, there will probably be many communities establishing themselves there, as we’ve seen in other places. So I think, again, there’s a lot of different possibility for where things are going as these platforms begin to develop.
And certainly there are some things we can learn from the past about how these platforms have been used, and some things we might expect to see. What we continuously see around the radicalization problem and the spread of disinformation is that a lot of organizing happens in these niche areas and more private conversations, where they plan what they’re going to do, and then they bring that over into the more mainstream platforms to reach larger audiences and to recruit new people. And so there’s an interplay. We can’t think about online activity as happening here or there; it’s happening across these different platforms. When we see disinformation campaigns and these efforts to radicalize people, these are cross-platform efforts, and they’re using platforms in complementary ways to achieve their goals.
HEFFNER: Isn’t the problem that as the Parlers and the Gabs and the Clubhouses spring up, there’s still not a regulatory framework in place to address the disinformation crisis of this last decade?
STARBIRD: Yeah. I think there is a growing consensus that we don’t have a set of guidelines or a framework to guide these platforms. In some cases, the platforms have said, we don’t want to be the arbiters of truth, we want some guidance; in other cases, I think they might be resistant to that. So legally, in the United States, there doesn’t seem to be anything to hold these platforms to account. They can make their choices. The fact that Twitter and Facebook are now choosing to act on misinformation and disinformation, that’s their choice. They don’t have to; there’s no legal obligation for them to make any changes. And so if we were just looking at disinformation and radicalization as a singular problem, then a regulatory framework with guidelines on what’s allowed and what’s not allowed might make sense. But we have to balance that against freedom of speech, in terms of not only our values but our legal commitments to it. And I think there are a lot of unanswered questions about how we’re going to bring those together into some set of guidelines that may help us shape healthier information spaces going forward.
HEFFNER: So, from the research that you do at the lab at the University of Washington, what would be the most effective way for this new Congress and administration to approach reforming big tech, specifically developing standards of community and information gathering so that there can be integrity, or at least the credentialing of integrity, to understand whether something has been vetted or not?
STARBIRD: This is a hard and open question, and I’m not going to pretend that I know all the answers in terms of what should come next and how it’s going to work. I think there are value trade-offs that need to be addressed, and there are economic ones. Consider moderation on something like YouTube. YouTube is a cesspool for disinformation. It just goes there, and it contaminates other sites, because it serves as a resource for campaigns that keep leveraging these videos and bringing them back into conversations over and over again. And yet it’s actually a very difficult task for that platform to moderate that content. Even if they had policies they were applying, it would be very hard for them to enforce those policies, simply because there’s too much content, and we don’t have the automated tools that can effectively do that work. So we have to think about the trade-offs with these economic models: currently, these platforms make money because they don’t have to moderate at scale. What would it mean to have guidelines in force that required them to moderate at scale? Perhaps some of the big platforms could pull it off, but it might quash some new platforms from being able to emerge. And so I think there’s a lot of work we need to do to work out what would be fair, not only in terms of holding the big platforms to account, but also in enabling new platforms to emerge, and not just solidifying the power the big platforms already have by making a regulatory framework that pushes out emergent platforms. It’s a very, very difficult problem.
HEFFNER: What’s also a problem, Professor, is that the monopoly players, the Facebooks and Alphabets of the world and of the American digital landscape, think they can just restart political ads after an act of terrorism that could be repeated, because nothing has changed systemically on their platforms to prevent them from being the ground zero of domestic terror mobilization. So, I just think it’s absolutely myopic and regrettable for these platforms to think, oh, the insurrection’s over, we can restart the political ad making on our platforms.
STARBIRD: Yeah. I mean, we can look at a lot of the decisions that have been made by a platform like Facebook over the last four to six years and say, okay, maybe you’re within your legal obligations, but certainly your moral and ethical obligations are not necessarily being met by some of the actions that you’re taking. We can talk about that with political ads. We can talk about it with allowing accounts that they knew were spreading mis- and disinformation and other harmful information in violation of their policies; they didn’t want to be accused of being politically biased, so they allowed those accounts to continue. And there are other cases we can see of repeatedly ignoring or downplaying the impact that they had in 2016, in terms of allowing mis- and disinformation from a foreign power, and then in 2020, allowing a domestic disinformation campaign to just flourish on their public pages and in their groups and further in their ecosystem. So it’s not just the fact that they’re monopolizing; it’s the fact that they have this immense amount of power on a global scale in terms of shaping discourse, and we have pretty good evidence that there’s some relationship between this platform and the rise of right-wing populism and some radicalization and authoritarianism in different kinds of places. I mean, Maria Ressa has been screaming about this since 2014 and 2015, about how Facebook is basically allowing for the establishment of authoritarian governments that use propaganda and disinformation to silence critics. And so they know there’s a problem there.
And I think, if we start thinking about their ethical and moral obligations, if they care about democracy and they care about the values that we share, democratic values, certainly a lot of the actions that they’re taking are not upholding those values.
HEFFNER: Kate, within the very flawed system that we have today, how do you suggest deradicalizing people so that democratic discourse itself can be viable?
STARBIRD: Yeah, this is such a hard, I mean, they’re all hard questions. In terms of what the research says about radicalization online, the algorithms and recommendation systems seem to have something to do with it, and in the networks you can see how people are becoming radicalized. But there’s no established way of deradicalizing people in online spaces. These tools turn out to be really good for radicalization and don’t offer much in terms of helping people deradicalize. So what is it going to mean to start bringing people who are increasingly living in very disparate realities back together into some sort of shared understanding of the world and some kind of common ground? It doesn’t seem like the platforms are designed to do that. They’re designed to have us pull apart and become isolated, focused on ideas that we agree with and yelling at things that we don’t agree with. They don’t really seem to afford a lot of constructive conversation, the building of common ground, or deradicalization, the pulling of people back from the rabbit hole. It’s not something that these platforms seem to be good at. So I really think we’re going to have to look more broadly in society at how we might go about deradicalization, or just bringing people back together. I think it’s going to be in personal relationships. I think it’s going to be in intense conversations with family members and loved ones, where, as frustrated as we might be, we somehow have to keep those pathways open to start building common ground. And I think somehow these platforms are going to have to cut off some of these pathways that are continually bringing people down and pushing them into these echo chambers. I am not particularly confident that something like this is around the corner.
I think we’re in for some hard times in terms of the polarization and the radicalization. However, I’m hopeful. I do think a lot of people are beginning to recognize this as a problem, and I’m hopeful that we can collectively find some solutions.
HEFFNER: If you were just to think of the digital topography, if you will, or infrastructure of web 1.0, 2.0, and let’s say the social media context of today is 3.0: are you someone who would contend that the viciousness, and even the hotbed of bigotry and criminal organizing that is online today, has not necessarily multiplied since the ’90s, but is just more visible, thanks to the Twitters and Facebooks basically making their common denominator whatever the water cooler public arena is interested in and amplifying those voices? Is your sense that there were just as many people using the internet not for good in the ’90s, the 2000s, and the 2010s as there are today, and it’s just more visible to us?
STARBIRD: I mean, as more and more people come online and begin to recognize opportunity, whether it is to do good in the world or to exploit people for their own benefit, we can see an escalation of things. So we have probably more people accessing these tools; they’re sharing techniques; they’re beginning to recognize new opportunities. I would say there’s more exploitation out there. There’s also more of the opposite: my dissertation looked at online volunteerism during disaster events, at helping people, and there’s more of that too, because people can come together in new ways and organize, and they can try to help people halfway across the world from their living room. So yeah, there’s probably more of both. We researchers have known since the 1960s, before I was alive, that online conversations, conversations that are mediated through computers, can go in directions that in-person conversations wouldn’t. Flame wars happen. There can be bullying, these kinds of things. People are not always their best selves when we remove the visual; we think that’s why. Fast forward, and we have so many people doing so much of our interaction in these online spaces, and we can watch online norms change over time, right? So the norms that we can see develop are new. They have to do with how people interact in this world that is often textually mediated, where we’re not getting a lot of visual feedback from the other people there. And so we’re developing new norms of how we interact. And, as we watched on January 6th, those internet norms are now manifesting in physical spaces, in how people present themselves to the world and what actions they take in the world.
And so we have to really think about this as a profound difference. We might call it a problem in society right now, because the ways that we interact are being changed by these tools, and these tools allow for some really dark things.
HEFFNER: How do you think the pandemic has exacerbated that? Or, on the other hand, as you’re pointing out, more people are transforming their lives through the internet and the constructive use of technology; there’s probably a way, statistically, to measure the huge influx of new users since the pandemic started.
STARBIRD: Yeah. Well, my research comes from the crisis space, right? We initially started studying how people organize online during crisis events, and crisis events always occasion people doing new things in new ways, as well as improvising with old tools. Historically, we’d see a crisis event happen and people would adopt a technology for the first time. Here, we have a pandemic that’s preventing us from interacting in person, so we’re doing more and more of our interaction online. In some ways that’s been fantastic, because it would have been very, very isolating otherwise; in fact, our economies would have shut down. We wouldn’t have been able to work in the way that we’ve figured out in this pandemic, so that part’s been great. But it has also brought a lot of people together in online spaces during a time of uncertainty and anxiety, when we’re actually really vulnerable to misinformation and to manipulation. And it allowed people to develop new connections and new affinities: people who are worried about government overreach and people who are worried about vaccines have been able to connect in new ways and recruit new people into their ideologies. In some ways, I think the online activity has limited our response to the pandemic, because it’s become a huge factor for misinformation, for people getting a false understanding of what’s going on and turning away from the official voices that were trying to help us take action that would have been collectively helpful. Instead it has, in some ways, kind of radicalized us, with people rejecting the knowledge that we had about the disease and not necessarily taking the actions that would help themselves and their communities stay safe.
HEFFNER: Kate, what are some transformations within the research that you do that can help better identify the motivations of people when they’re behaving online? You referred to the anonymity problem, which has been illuminated through comment section after comment section of websites. It speaks to the truth, which is that the wild west of the Twitter and Facebook era, unchecked, was those comment sections run amok, often destructively. So, given that we know that polling for our political campaigns in many states and in many elections is not always honest and reflective of the reality on the ground, how have you all modulated or recalibrated to consider the most effective ways to do research on internet users?
STARBIRD: Yeah, I mean, I think when we look at research on internet users, to get a good view you have to look at the phenomenon from multiple perspectives. Whether that’s within one research group, or whether you’re looking at a phenomenon through one paper from this group, one paper from that group, and one paper from another group, because they all use different methods. To really understand what’s going on, we do a lot of digital trace work. We’re actually using the tweets and the posts and the comments and the blogs, and trying to make sense of them both at scale, using quantitative methods and visualizations, and through qualitative research, looking very deeply at the content. Others might want to interview the people producing this content, to get a perspective on how they are approaching things. It’s often really hard to get people to do those interviews, but we want that kind of complementary perspective as well. Even our view of the digital trace data, we know, is extremely biased, because there are people not using these platforms, and people using different platforms that we can’t see; there are all these kinds of invisibilities based on constraints on access to data. And so I think the really important thing is to bring in these complementary perspectives, and not to think we can isolate social media activity as a particular focus, but to understand that it’s blended into mainstream media, cable news, a much larger information ecosystem, and to understand this interconnectedness using diverse perspectives into the phenomenon.
HEFFNER: In your mind, Kate, what has been the most effective study that has intersected the technical with the emotional, the human emotions here, really the social psychology? Because I haven’t seen any landmark work in connection with the current political environment. Kathy Cramer at the University of Wisconsin did some extraordinary work in communities, in focus groups in rural Wisconsin, to understand what motivates folks and how they arrive at the connection between their personal, intimate position or posture and their public policy positions. I’m just wondering, as we conclude here, what’s been the most effective way to get at that, right, to get at the human experience on the other side of the keyboard and the psychology of that person?
STARBIRD: To be honest, right now where I’ve learned the most about that intersection has been through the journalism of a few really great reporters on the mis- and disinformation beat, who have been able to actually delve into these spaces and talk to and interview the people that are experiencing some of this radicalization, or having their family members experience it. The research may be coming, but journalism is able to get out there a little bit faster. Right now, if you think about the field of disinformation studies, it’s very new and very interdisciplinary: we have psychology, sociology, computer science, information science, these different kinds of things. Our group has been particularly successful because we use a mixed-method approach on big social data, so we’ve been able to really look at the trace data and gain some understanding from that. But when I learn a lot outside of our group’s work, it’s often from people that are interviewing the individuals having these experiences. And so I think there’s a lot of work to be done there, but I haven’t seen the landmark study that brings it all together, and I’m not sure we’re going to have one. I think what we’re going to have is a bunch of different kinds of studies that approach the problem from different perspectives, and then we begin to gain this holistic understanding of what the heck is going on.
HEFFNER: Finally for our viewers and listeners who are interested in your space, how can they follow your research?
STARBIRD: You can tune in: the Center for an Informed Public has Twitter accounts and a webpage that we keep updated. And then there’s my Twitter account; I do a lot of tweeting. I’ve been studying Twitter for way too long, and so have spent too much time on the platform, but it’s a place where I make a lot of connections to journalists and academic researchers, and we try to put our work out through that venue when we can. Otherwise, we have a webpage with lots of papers if you want to read them, and we also try to put up blogs and Medium posts to make that work more accessible to a broader public.
HEFFNER: Kate Starbird of the University of Washington, thank you so much for your insight today.
STARBIRD: All right. Thank you again for having me on.
HEFFNER: Of course. Please visit The Open Mind website at Thirteen.org/OpenMind to view this program online or to access over 1,500 other interviews. And do check us out on Twitter and Facebook @OpenMindTV for updates on future programming.