Four More Years of Social Media Failure
Air Date: October 19, 2020
HEFFNER: I’m Alexander Heffner, your host on The Open Mind. I’m delighted to welcome to our broadcast today Sinan Aral. He is author of “The Hype Machine: How Social Media Disrupts Our Elections, Our Economy, and Our Health.” Welcome Sinan.
ARAL: Thanks for having me.
HEFFNER: Sinan, you also teach at MIT and are a scholar of the digital economy. Let me ask you point blank. What are you most concerned about as it relates to the election, in terms of combating disinformation right now?
ARAL: I think that this is probably the most consequential election of our generation, if not of the last hundred years in the United States. I was speaking to someone who is not in or from the United States, actually Maria Ressa, Time's Person of the Year in 2018, and she said it's probably the most consequential election for the world in a long time. And we have foreign governments interfering as we speak, primarily Russia, but also, for the first time, China and Iran. We haven't done much since the 2016 election to deal with that. We also have a tremendous rise in affective political polarization in the United States. We see it in terms of animosity between the parties, and we also see it on the streets of the United States these days. And so the book really covers the hype machine, or social media's role, in terms of the spread of false news, which we've done a lot of the research on, as well as foreign interference, as well as its contribution to the polarization of society, all three of which are tremendously important in this election.
HEFFNER: Sinan, I want to play for you a series of PSAs that the News Literacy Project and The Open Mind Legacy Project developed that are airing nationally and in battleground states and have you respond to them.
VIDEO, PSA NARRATION: “Voting is how you can help preserve and participate in democracy. There are several legal and secure ways to cast your ballot safely during the pandemic. All states allow voting by mail. Studies show it is reliable, and voter fraud is rare. 40 states allow early voting in person too, and you can still go to the polls on Election Day, November 3rd. You cannot vote by text, on social media, or over the Internet, and the election cannot be postponed or canceled. Voting depends on you. Democracy depends on us.”
VIDEO, PSA 2 NARRATION: “With the November election on the horizon we all need to critically examine the information we read and share. Your friends and family may share your views, but we all need to challenge ourselves to break out of our bubbles. Look at a variety of news sources to see if the claim being reported is accurate while always staying alert to misinformation. We tend to lean into what feels right and ignore what doesn’t. Be receptive to news that may challenge your assumptions. Voting depends on you. Democracy depends on us.”
VIDEO, PSA 3 NARRATION: “As the election approaches, we need to be on the lookout for fraudulent content masquerading as the real deal. If you see a damaging post, image, or story about a candidate, make sure to verify whether it’s authentic or not before you share it. Falsehoods spread much faster than facts. Whether it’s a claim of long lines at the polling site to keep us away or fake videos to sway our votes, let’s make sure to double check our facts before we act. Voting depends on you. Democracy depends on us.”
VIDEO, PSA 4 NARRATION: “Deciding who to vote for is important. There are people, organizations, and governments trying to trip us up, manipulate our vote, or keep us from voting altogether. Watch out for phrases that frequently accompany political disinformation, like ‘make this go viral,’ conspiratorial statements like ‘the media won’t cover this,’ or attempts to prey on our emotions like ‘just let that sink in.’ It’s your vote, not theirs. Voting depends on you. Democracy depends on us.”
HEFFNER: Sinan, what do you think?
ARAL: Fantastic. We need more of this. I love the fact that you’re citing our research. We published the research that showed that false news travels farther, faster, deeper, and more broadly than the truth in every category of information, especially political information, on the cover of Science in 2018. I also love the fact that these PSAs are prompting people to be reflective, which has been shown in experimental studies to increase discernment between true and false news and to reduce people’s likelihood to believe and/or share false news. I also think the notion of “just Google it” is a really good PSA, primarily because when I see false news spread across my Facebook or Twitter, it’s usually accompanied by the preamble “I don’t know if this is true, but it’s really interesting if it is.” People have to just stop doing that. And the reason is that the 80/20 rule applies to false news: just a few clicks can typically debunk a good amount of what’s being shared, if we’re just a little bit reflective, a little bit introspective, and do a couple of Google searches. But the other thing I like about this is that it demonstrates voting is a collective action problem. It says voting depends on you, right? So it indicates to people that how they act, their individual decisions about sharing, belief, and voting, matter. And that’s really important to their self-actualization and therefore to changes in their behavior.
HEFFNER: Sinan, you said something interesting, “Google it,” but is that sufficient? I recall in the 2016 cycle, when there were disinformation websites being indexed, you would do a Google query and on that first page you might find something that’s dis- or misinformation. I think Google has improved since ‘16, but how much has it improved?
ARAL: It’s not sufficient. It’s a really good first step. It’s much better than sharing without thinking, which is what a lot of people do today. Now, the list of fake news websites is long, and many of them are essentially written to look like real news websites, especially the URLs and so on. But just being a little bit reflective and doing some Google searching is an important first step. I also think the parts of your PSAs that note that false news is emotionally charged, that it typically involves all caps, sometimes typos, and so on, are also important. It inspires anger and surprise. It’s shocking, it’s salacious. If it is that, and it’s hyping up your emotions, that’s a reason to check yourself and to think about whether it is true or false.
HEFFNER: Right. And of course the social media landscape, from which we will separate Google as a search engine, is totally different. Consider today’s top 10 most shared items on Facebook, if not Twitter as well, and possibly YouTube, which we should note is owned by Google, so YouTube and Google are part of this equation. When it comes to Facebook and Twitter, what is being reported by those who cover disinformation on these platforms every day is that, of the top 10 shared posts, a majority are not just from hyper-partisan news. They are actually debunked myths, big fabrications.
ARAL: Yeah. So the point of the book is to go into the science behind all of this and to reveal how it works, but also what’s true and what’s false about it. And what we realize when we dig into the science is that the business models of the platforms are designed to create short-term engagement. Salacious, shocking news travels farther, faster, deeper, and more broadly than the truth, primarily because the algorithms have objective functions designed to maximize short-term engagement, and that, combined with the human brain’s susceptibility to what is salacious and shocking, creates the spread of falsity online. And it’s important to note that the debunking is never as fast or as broad or as deep as the falsity itself. So even if something is debunked, the debunking never catches up to the falsity.
HEFFNER: So in your book, what is the thesis for how we can adapt? That is part of the subtitle of your book, but adapting to an online culture that favors falsehood is not what we need to do. So isn’t it really these companies that have been challenged to adapt? They were challenged before 2016 and in the four years since, and they’ve failed.
ARAL: Yes. So the book goes into what’s under the hood of the hype machine, the social media industrial complex, and how it works. The last chapter, the longest chapter in the book, describes in detail what we have to do to achieve the promise of social media and avoid the peril. And it really hinges on four levers: money, code, norms, and laws. Money is the business models of the platforms, which set up the incentives for how they behave, and thus how the algorithms, and therefore the users of social media, behave. Code is the design of the algorithms themselves, and I go into exactly how they’re designed, why they’re designed that way, what the outcomes are, and how they should be designed in order to solve some of the problems that we see with social media. Norms is about the fact that we can’t abdicate our own responsibility to be responsible users of technology: how do we establish those norms, and how do we make them pervade our use of social media? And finally, we know that there are a lot of market failures with social media, so there need to be laws, there needs to be regulation. And I go through all of the major regulatory questions in the book, including antitrust, federal privacy legislation, election integrity, free speech versus hate speech, as well as misinformation and what we can do about it.
HEFFNER: That fourth item that you mentioned, legislation or regulation: is that a necessity to prompt these companies to adapt in the three other realms? It’s a chicken-and-egg question, but in all honesty, these past years have proved these companies’ failures to be decisive. You end with the regulation, but don’t we need to begin with a regulatory framework in the United States?
ARAL: Yes. So in fact, the regulation is the first part of the how-we-adapt chapter. It begins with all of the regulatory questions that exist. And the first question that we need to ask is: how do we establish competition in the social media industrial complex? The reason that’s important is that if social media monopolies, if they are monopolies in a true legalistic sense, do not have competition, they have no incentive to reduce the pollution of our information ecosystem or the negative externalities that they create for society. You might be surprised to read how I think we get to competition in this marketplace. It’s about structural reform of the economy. It has more to do with data portability, social network portability, and interoperability, more akin to how we regulated the cell phone market and achieved competition there, than it does with trust busting.
HEFFNER: Can you expand on that for our listeners and viewers?
ARAL: Absolutely. So if you think about whether we should break up Facebook, you have to begin with the first principles of the economics of the marketplace. The social media marketplace is guided and run by network effects, and economies that run on network effects tend towards monopolies regardless of what any particular company is doing. The reason is that the bigger a company gets, the more valuable it becomes to each user, which creates a winner-take-all monopolistic situation. And the reason that’s true is that there’s no legislation forcing these companies to be interoperable with their competitors. So if you were to break up a given monopoly, let’s say, for argument’s sake, that Facebook is a monopoly, the next Facebook would just tip into a monopoly, because you haven’t regulated the underlying market structure of the network effects of the economy itself. So, as I say in the book, breaking up Facebook, if you’re talking about competition, is like putting a band-aid on a tumor. You need to solve the root causes of the tipping towards monopoly in the economy. And that happens through interoperability legislation, like the ACCESS Act, which is in front of Congress now, as well as data portability: people owning their data from their social networks and forcing companies to be interoperable, just like we did with the cell phone market, which created competition there.
HEFFNER: Is part of that legislation users’ ownership of data?
ARAL: In a sense it is. It’s about a social network and identity. Now, ownership is a very tricky concept, but the legal right to take your identity and your social network with you, and to easily interchange messages from one social network to another, is the essence of how you create competition in markets with network effects as strong as those in the social media market. And the same was true of telecommunications and cell phones. If you remember, you used to not be able to take your number with you if you moved from Sprint to Verizon. We legislated that you could do that and that you should be able to call from one network to another, and that created a ton of competition. It’s almost surprising, given our experience with cell phones in the U.S., that that does not exist today in social media.
HEFFNER: And what would be the analogy in terms of mobility?
ARAL: In terms of mobility, the analogy would be that I can take my social network and my data with me from Facebook to some Facebook competitor with a very easy set of clicks, rather than what we have now, which is no interoperability. It also means that I “own” my identity and my social network, that I am able to do what I want with them, and that I can easily exchange messages from one network to the other. If users could switch easily from one social network to another, the social networks would try harder to make their platforms cleaner and better so we stick with them rather than switch to a competitor. Because we can’t switch now, they have no incentive.
HEFFNER: Sinan, that’s such an important concept. And that thesis in the book is one that our viewers I know will grapple with, and it may stimulate them to lobby for reform, which is urgently needed. Let me ask you about the mobility question with respect to privacy. In Europe, they’ve taken more advanced measures in the wake of Cambridge Analytica and other scandals of stolen data. But have they taken the step towards the mobility that you’re describing?
ARAL: Great question. And privacy is such a huge interrelated topic here, because, as I describe in the book, the interdependencies of all of these seemingly separate questions make regulating social media all the more difficult. What do I mean? Well, if you have interoperability, then Facebook has to, by law, give third parties access to users’ data so they can make their services interoperable. But that starts to bring privacy under threat, because that’s how Cambridge Analytica got access to Facebook data: Facebook sharing data with interoperable companies. It’s such a great question, and you’re obviously very well versed in these topics, because people sometimes don’t understand that interrelationship. When it comes to privacy, we have three regimes in the world. We have the Chinese model, which is essentially a surveillance state. We have the European model, which is very, very strongly in favor of privacy. And we have the United States in the middle, which is a hodgepodge of 50 states doing different things, with California leading with the CCPA. We will probably be moving towards federal privacy legislation in the U.S.; the question is what that privacy legislation will look like. As I describe in the book, it should balance many different important goals. For instance, we want privacy from both a utilitarian and a deontological perspective. It’s important for our rights, but it’s also important for free speech, for being able to freely express ideas without being trodden down on the Internet. However, if you protect privacy without regard to other values, you will stifle scientific research. For instance, genetic research is stifled in Europe because of privacy laws that make it difficult to share genetic data in order to do science with it.
The same goes for auditing our elections: if privacy laws force companies like Facebook to expunge data, how can we audit Russia’s attempts to manipulate the election? There are many other important scientific and public policy needs for data traces in order to understand what’s happening under the hood of the hype machine. A good piece of federal privacy legislation in the United States can be written, and it needs to take all of these competing values into consideration as it’s written.
HEFFNER: Does the regulatory environment in Europe nevertheless suggest that those new competitive entities are going to emerge there, rather than here, because countries in Europe are going to force social media’s hand so that that mobility will be required by law? And will the next generation of companies that want to adhere to those values come from outside of the United States?
ARAL: I think it’s yet to be seen. It’s unclear what kind of interoperability we’re going to get in Europe and what kind of interoperability we’re going to get in the United States.
HEFFNER: So there’s nothing mandated by the EU in terms of mobility, no regulatory environment where you can demand from a Twitter or a Facebook that you can move to some fledgling media platform. It may have 10 users, but you’ll bring your data, your friends. That’s not mandated yet.
ARAL: Well, the devil is always in the details of these laws. For instance, when Senator Kennedy asked Mark Zuckerberg in a congressional hearing in the United States whether there was data portability on Facebook, he said, “Yes, Senator, you can download your social network and take it with you now.” But the trouble with that is, when you hit the “download your information” button, what you get is a list of your friends.
HEFFNER: There’s nowhere to go.
ARAL: Exactly so. Well, there’s not only nowhere to go, but it’s not actual interoperability. It’s the illusion of portability.
ARAL: The question is what this actually looks like when Europe or the United States passes and ratifies these laws across all of their jurisdictions, and then you see the attempts at compliance, the courts’ attempts to enforce the actual letter of the law, and what the code looks like as the end result. My answer to your question is: we don’t know yet. The reality that evolves from the laws being discussed and/or passed and ratified, then complied with, and then the case law that determines whether companies are compliant, is still taking shape. The end result is really the goal, and the devil in the details of what ends up happening will determine whether or not we get innovation, competition, and protection of privacy.
HEFFNER: You say we have to start with the regulation. The reality, though, is that even if there’s a change in political control in the White House or in both chambers of Congress, the social media monopolies, and I’ll call them monopolies, exert tremendous control over Capitol Hill and legislators. They’re lobbying constantly. If you thought that Shell was giving money to politicians every minute of every day, know that the new Shell is Facebook or Twitter or Google. And it’s not that new; you and I know that. The point is this: these companies have not self-governed with respect to proper regulations in the four years since 2016. With respect to the algorithms, are we going to see any self-regulation at any point? Because it doesn’t seem like the algorithmic changes are in effect or have been in effect.
ARAL: Well, there is no silver bullet for the social media morass we find ourselves in, but there are a number of solutions that, if implemented together, would work. To answer your question, this is not a book about campaign finance reform, but obviously the regulation of campaign finance, as well as lobbying and so on, is a really important question in the United States and in fact abroad as well. We need a national commission on the relationship between technology and democracy, populated with experts, journalists, activists, and scientists, as well as the platforms and policymakers, because this is such an important topic for the future of our democracy. I also think we need more activism. We’ve seen baby steps: the Delete Facebook movement, and the Stop Hate for Profit movement, which for a moment did have a meaningful impact on Facebook’s bottom line, but it was a drop in the bucket; it lasted only one month. If we have more activation of what I call the norms, that could create more pressure, in combination with a national commission advocating the right regulatory frameworks and more PSAs like the ones you’re putting out, which create awareness on the part of voters of what’s important: innovation and competition, as well as protection of societal values like privacy and free speech and the curtailment of harmful speech. I think we can make meaningful progress. And I describe how all of these things can work together to create…
HEFFNER: In the minute that we have left, and I regret that, but I hope we’ll come back to this discussion in person at some point: what immediate algorithmic changes could these sites make, in the absence of legislation, that would be effective in protecting information integrity?
ARAL: Yeah. Great question. I have so many answers to that one. Labeling of information, to give us provenance for the information that we’re consuming. The food you consume from the grocery store is extensively labeled; the information you consume has no labels as to its veracity, the prior veracity of the source, the number of sources that were quoted in order to reach the conclusion you’re reading, or where the source comes from. Another thing: algorithmic amplification. We know that algorithmic amplification creates a tyranny of trends. The platforms have been successful in demonetizing and reducing the spread of things like anti-vaccine misinformation by reducing how often it comes up in search results, by not allowing people to profit off of ads next to anti-vaccine content, and by reducing the number of shares on WhatsApp, slowing all information down. First they reduced it to five forwards, then they reduced it to one hop. All of these things in combination are technical, design-based solutions to the code, which are all discussed in detail in my book, that can make meaningful, immediate advances in cleaning things up.
HEFFNER: And we do know, Sinan, that when we shared stories that were not sensationalistic on Facebook in the early days of the platform, they still went viral through our friends, and they were more effective. It wasn’t hyper-partisanship, but it was still trending. You can have trending, and you can make money as an imperative, without being the lowest common denominator. I’m so grateful for your insight. Sinan Aral, author of “The Hype Machine,” really delighted to host you today. Please stay healthy, and I hope to meet you in person soon.
ARAL: You too, thank you so much for having me Alexander.
HEFFNER: Please visit The Open Mind website at Thirteen.org/OpenMind to view this program online or to access over 1,500 other interviews and do check us out on Twitter and Facebook @OpenMindTV for updates on future programming.