Renee DiResta

A Union of Concerned Technologists

Air Date: February 24, 2018

Renee DiResta of Data for Democracy discusses how anti-social media and technology companies are threatening society.

HEFFNER: I’m Alexander Heffner, your host on The Open Mind. She warned of peer-to-peer misinformation, and Congress listened. The New York Times last year profiled a digital leader on the battleground against the anti-social media that are increasingly corrupting society. Author of “The Hardware Startup: Building Your Product, Business, and Brand,” Renee DiResta is now head of policy for Data for Democracy, and previously worked at O’Reilly AlphaTech Ventures. DiResta, who wrote her college thesis on the influence of propaganda on the 2004 Russian election, studies the practices of manipulative automation, algorithms, and disinformation campaigns. She advised the United States Congress in anticipation of last year’s testimonies of the social media executives. “We were closely monitoring to see when the companies gave misleading or partial answers, so that we could follow up,” DiResta told the Times. On Wall Street, she observed an overreliance on algorithms to identify bad actors. And later, when her son was born, she saw firsthand Facebook’s viral amplification of conspiratorial anti-vaccine propaganda. So today we explore this dark side of the new web. Renée, I was really heartened to read this New York Times …

DIRESTA: [LAUGHTER] Thank you.

HEFFNER: Piece, a profile of you and of the crucially important work you’re doing. What has been the response since the Times article?

DIRESTA: It’s been very positive. One of the reasons I felt it was OK to be public about it was that we wanted to encourage more people to begin to participate, particularly in the tech community: folks who have worked at platforms, or been involved in technology in some capacity, who have built recommender systems. We feel that there’s a moment now in Silicon Valley to begin to think about the impact that the technology we’ve been building is having on us as a society.

HEFFNER: You and Tristan Harris and others have considered the prospect of a kind of union of concerned technologists.

DIRESTA: Yes.

HEFFNER: How can you take the positive energy around that movement and execute, and get real reforms from these social media platforms?

DIRESTA: So I think there are a couple of different ways you can push reform in an industry, right? One of those is market reform: customers deciding that they’re going to spend their dollars elsewhere, or boycott, or demand change. That’s a little more difficult in the tech industry, because the platforms are so hegemonic. There are other, smaller search engines, perhaps, and other, smaller social networks, but Google and Facebook and Twitter really are the big three. So, absent the opportunity for market-driven reform, the other two choices are legislative reform, which takes a very long time and is often not really effective, and then a self-regulatory model, a kind of third choice. That’s where you encourage the industry to begin to think about steps it should take on its own.

And for the union of concerned technologists, the inspiration really came out of the Pugwash movement, the movement of nuclear scientists to think about regulation, with the Russell-Einstein Manifesto and the other convenings they held to inform policymakers and have discussions as a community. So we think it’s time to begin to have that conversation in tech right now: conversations around design ethics, conversations around misinformation and disinformation, conversations around the way we use our devices. Tristan focuses quite a bit on the physical, individual, and interpersonal implications of the technology. So there are a number of different levels to the problem. There’s the personal, and then there’s the societal. And I think it’s time to really galvanize the energy that’s come about in the Valley.

HEFFNER: We’ve been concerned, though, since the social media executives testified, and again, it was not the CEOs, it was the counsels, the lawyers for these social media companies, that the questions Chairman Burr and Ranking Member Warner and other members of the committees asked went unanswered. Which of the questions, if any, have been answered?

DIRESTA: So, there’s an opportunity, in addition to the public hearings, to submit questions in writing and request responses in writing. They’re called questions for the record,

HEFFNER: Right.

DIRESTA: And so we’re waiting to see what comes back on those now. I think the number one area that has been insufficiently dealt with, for people, is the right to know: the idea that if you were targeted, if you saw misinformation, you should be told. I think it’s very important that people begin to understand that this isn’t something that happens to other people. This isn’t, you know, something that happens only to the unintelligent, or the highly partisan, or the Democrats, or the Republicans. These campaigns targeted everybody. And so I think the first step in people beginning to understand the impact this is having on us as a society is understanding that they themselves are not immune to it, and that they themselves were also potentially a target. Senator Blumenthal is also continuing to advocate for the right to know. We’re waiting to see what the companies do.

HEFFNER: I love that you said that, because I was very interested in Senator Reed of Rhode Island’s exchange, in which the corporate executive really had no idea what he was talking about. He was saying to the executive, this is what a correction is in a newspaper, and you need to issue one to all the people who saw the disinformation in their feeds,

DIRESTA: Yeah.

HEFFNER: And they sat there, dumbfounded. Like they didn’t understand that, yes, we still operate in a society where if you need to correct the record, you tell all your readers. You tell your millions of subscribers. And they were suggesting there’s no mechanism for doing that.

DIRESTA: Right, and that’s just nonsense. And the problem, I think, stems from the companies’ unwillingness to acknowledge that they’re media companies. That was another exchange that came up during the hearings: rather than hiding behind the idea that they’re neutral tech platforms, that anything out there just happens to exist and they have no responsibility for it, they do have responsibility, because the algorithms are designed in such a way as to surface particular types of content over other types. So they are, in fact, making editorial decisions, just not in the way that a traditional newspaper, perhaps, would.

HEFFNER: They’ve resisted any attempt to create an editorial board. How can you get them to frame their decision-making in a way that is consistent with the science, and yet consistent with a value system, so that they’re not profiteering off the propaganda?

DIRESTA: So, I think one of the things we saw was that when Facebook removed all human oversight from trending topics, the top three or four trends were blatantly false stories. I think one was that Megyn Kelly had been fired for supporting Hillary Clinton. And there was a range of other content that never should have been amplified. That by itself should have been a huge red flag indicating that the algorithm, by itself, was being very easily gamed, and being very consistently gamed. As we’ve moved forward from this, one of the most important things we got out of the hearings was the beginning of an acknowledgment. It forced them to do a self-reckoning. It forced them to look internally. You recall, after the election, Mark Zuckerberg’s response was that fake news is such a small percentage of what we see on Facebook that it had no impact at all. Then we went from, it was only a $100,000 ad buy, and only ten million people saw the ads, to, through a very coordinated effort between researchers and media, really opening the kimono and saying, no, it was hundreds of millions of users who saw this stuff. And we can identify some of the people who did. So how can you plausibly make the claim that you can’t? It’s simply not true.

HEFFNER: Renée, what do you say to the people who have withdrawn from Facebook since Zuckerberg’s ineptitude, his incapacity, or unwillingness, to recognize the anti-social effect of his medium? Anti-social may even be putting it lightly. I know I myself stay clear of Facebook in general. We all, as a media complex, have our respective footprints on these social media sites, but one thing I’ve proposed, in sharing this with one of your colleagues on the project you’re working on, is that we each need to buy some shares of these companies and take ownership. They occasionally, and maybe more frequently now, will ask shareholders, even if you just own a couple of shares, for their input. And I think there was one such vote recently at Google, where the shareholders decided not to do a forensic audit of the platform and the impact of the bots and trolls on it. Would buying shares make an impact? What do you say to people who have otherwise withdrawn from these media?

DIRESTA: So, as far as the shares, it really is platform-dependent. I believe with Facebook, Mark Zuckerberg still controls enough shares that I’m not sure to what extent the shareholder votes really have an impact. But I think we’ve started to see people, particularly on Twitter, use their direct channel to the leadership to communicate a very specific point of view. Perhaps you remember that every time Twitter makes a change, we go from stars to hearts, we get 280 characters instead of 140, there’s a litany of people, thousands and thousands of comments back to Jack Dorsey, that say, well, what about the Nazis? It’s incredible to see. [LAUGHTER] It’s usually really funny, too, because it’s users hammering home the extent to which they’re disappointed by the product direction the company is taking, making very small tweaks around the margin but not really dealing with the core central problem Twitter has, which is harassment.

So I’m not sure that buying the shares really makes a difference in this case, but doing things to reach out to board members, I’ve seen people do that too: lobby the board members, say, hey, I have some really serious concerns, and I can reach you on Twitter, so I’m going to tell you exactly what I think about what’s happening at the company whose board you sit on. As far as stepping back, I still use Twitter, Facebook, and Google pretty heavily. I may in fact be a bit unusual among people who believe there are problems but also really love the tools and the promise of the tools. And that’s actually one of the things that motivates me: the feeling that this can be so much better.

We can return to what Twitter was like maybe a few years back, when the bot problem was not quite the epidemic it is today. We can use things like Facebook’s groups, where I have met and made some wonderful friends. I think the core underlying mission of the company is a valuable one. It’s just that the inertia and the lack of response are unfortunately hurting their user base and making people withdraw. I sincerely hope that seeing this consistent withdrawal, the public criticism from folks like Chamath Palihapitiya and Sandy Parakilas, and a number of other former insiders such as Sean Parker who’ve spoken out very unfavorably recently, may do more than anything else to inspire them to take an inventory and have a corporate reckoning.

HEFFNER: You said that you have hope about Twitter returning to its 2012, 2013 state, but,

DIRESTA: [LAUGHTER]

HEFFNER: It’s pretty clear that Facebook will never return to its pre-advertising core. And the inertia that you identify is driving it, leading to the absence of any kind of moral compass,

DIRESTA: Yeah.

HEFFNER: Guiding who buys ads and who doesn’t. In that sense, I don’t think you’ll ever get back to a 2008 Facebook or a 2010 Facebook. And I would be on Facebook more regularly if it were that platform.

DIRESTA: Yeah. I think the challenge, the real hard problem here, is that the tools were not misused by the people who ran influence campaigns. The tools are not misused by propagandists, right? These are tools to reach people. These are tools to monetize people’s attention. And the recommender systems are designed to make sure that you see what you, quote-unquote, should see in order to achieve the highest rate of return for the advertisers, who are the actual customers of Facebook and Twitter. You are not the customer. You are the user. And that’s a subtle distinction, but it’s a very, very important one. The features were built in such a way as to keep people on the platforms for as long as possible, because as long as you’re on the platform they can continue to serve you ads. So the fundamental challenge here is: how do they reconcile the tension between maximizing ad revenue and showing people content that is in fact beneficial for society, or acknowledging that maybe people shouldn’t spend 18 to 24 hours a day glued to their phones? These are the areas where we have to see to what extent the platforms can wean themselves away from the advertising business model.

HEFFNER: That’s exactly what I asked one of the founders of Twitter right here in 2014. Would you ever impose a constraint in order to be a good Samaritan? You believe in 140 characters, or now 280 characters. What about ensuring that there is goodness, or at least that the viability of goodness and knowledge outweighs the viability of hate? And you could maximize people’s attention through a subscription model.

DIRESTA: Right.

HEFFNER: That was not beholden to advertisers. But in that sense, with Facebook and Twitter, it would be unlikely that they would resituate themselves to go subscription-based,

DIRESTA: Yeah. It is unlikely,

HEFFNER: I mean, you know, unless you want to pay for the pre-advertising-bonanza Facebook. And frankly, there’s not very much I want to see in my Facebook feed anymore.

DIRESTA: Right. And some of that is a feedback loop, because people have gotten frustrated and left. I think Twitter is the clearest example of this. Twitter reports monthly active users as one of its KPIs, Key Performance Indicators, for Wall Street. And the unfortunate reality is that many of the most active users are the bots and the so-called cyborgs, people who automate their accounts some percentage of the time and then take the reins just enough that they’re not fully automated accounts. Those folks have a lot of activity. And a lot of the discord, a lot of the fighting, and a lot of the hate that you see on the platform actually engenders a ton of engagement, because people respond to it.

And unfortunately, that looks good for those metrics. And if Twitter were to go and clear the bots off the platform, you know, researchers estimate that about 9 to 15 percent of the accounts are bots. Twitter claims it’s 5 percent. Maybe the truth is somewhere in the middle. But ultimately, I think it’s actually the advertisers that have an incredible amount of power here, because the advertisers can say: you’re showing my ads to an audience that’s 15 percent bots. That’s ad fraud. I shouldn’t be paying to show my ads to fake people. How are you dealing with this? What are you doing to have an impact here? So I think there are opportunities for the advertisers at the big companies to really come to the table as participants in the reformation process, so to speak, and say, we don’t want our ads shown over terrible YouTube videos. We don’t want our ads targeted at fake people.

HEFFNER: I like [LAUGHTER] that you use the term, reformation process,

DIRESTA: [LAUGHTER]

HEFFNER: Right, the reformation of social media, the second coming. I don’t see a change on the horizon as long as those same companies are beholden, ultimately, to their profit-making capacity. And if you can do a forensic audit and show how a company is paying disproportionately to what it’s getting, because bots are on the receiving end of its ads on Twitter or Facebook, well, maybe there’s some there there. But tell us about the solutions, because, you know, I said on a show recently that the white supremacists had essentially hijacked the internet as a eugenics experiment, and Mark Zuckerberg has actually played into that eugenics experiment by allowing people to state their preferences not only for the Yankees or Socrates, but for white people or brown people or purple or yellow people. And that right there is the demise of civilization.

DIRESTA: The, you are talking about the targeting,

HEFFNER: The targeting,

DIRESTA: Yeah.

HEFFNER: Of people based on ethnicity, frankly, because they hate certain people.

DIRESTA: I think there have been a number of instances of people renting out apartments trying not to have their ads shown to minorities, which I’m pretty sure is actually blatantly illegal if done off Facebook.

HEFFNER: And this was even more visceral,

DIRESTA: Yeah, no, no you’re talking about the,

HEFFNER: More generative than that,

DIRESTA: ProPublica, the “Jew hater” category,

HEFFNER: Yeah,

DIRESTA: Yeah.

HEFFNER: Right, the, the,

DIRESTA: In other words, the awful,

HEFFNER: The idea that you’re making money off of ads that are celebrating a vicious and immoral cause, to bring more people to a page that is the Klan.

DIRESTA: In Silicon Valley, we love to think that there’s an algorithm that solves everything. It’s kind of a running joke in the Valley, a punchline: there’s an app for that, right? No matter what it is, there’s an app for it. There’s an algorithm for it. Tech will fix it. We joke about it, mostly self-deprecatingly, but in some cases it’s true. With this particular situation, though, we’re faced with the reckoning that there is currently no app, no algorithmic fix, that can handle the kinds of things we’re seeing on the platforms. And with regard to that case, the way those targeting categories were generated was that a sufficient number of people on the platform had typed them in under “employer.” “School of hard knocks” is something you would see under employer, for instance, so you could ad-target people who went to the school of hard knocks. It was pulling from what users had entered, and there was no oversight. I think this is the problem: there are no people. Nobody looks at that. A human would have seen that in five seconds and said, no, we’re not going to accept that ad spend.

It actually does violate the terms of service to do that sort of thing, but because they make their money on a self-serve, low-friction model, they do everything they can to keep people out of that process, and then you see things like this, where the algorithm doesn’t know what it’s recommending. Another example I talk about a lot of the time: the system takes people who are prone to belief in one conspiracy theory and begins to feed them content related to other conspiracy theories, because it knows that content is going to resonate with that person. This is a dumb algorithmic decision. The algorithm doesn’t actually know what it’s recommending, because Facebook has no editorial oversight saying, you know what, we’re not going to serve anti-government conspiracies to various types of truthers, maybe that’s not something we want to actively suggest. So this is where the conversation about media and editorial oversight comes in. It’s so central to the conversation: at some point you really just need people thinking critically about this.

I think they are so afraid of being seen as partisan in one way or another, so concerned about allegations of censorship, that it’s paralyzed them from taking steps that I think most of mainstream society would actually welcome at this point.

HEFFNER: Very much so. We recently hosted the CEO and lead advisor of the Counter Extremism Project, and discussed the move by YouTube to remove the postings of an Al-Qaeda terrorist and cleric,

DIRESTA: Yeah.

HEFFNER: That was an editorial decision. But is there any way there can be some kind of joint effort across platforms to create that curation?

DIRESTA: So this is something that the Union of Concerned Technologists, which doesn’t exist in a formal sense yet, talks about a lot right now: the idea that this is a system-level problem. It isn’t just Facebook, it isn’t just Google, it isn’t just Twitter, because, particularly when you look at things like the Russian front content, it appeared everywhere, on everything from smaller social networks like Reddit all the way up to multiple properties at Google and Facebook. And if you have a systems-level problem and everyone is taking a piecemeal approach, that doesn’t really solve very much.

HEFFNER: Based on your insight, is there finally a consciousness, a cognizance, on the part of Jack and Mark and company that they need to do something more? That if the FCC is not going to step in, which it may not, ultimately, it threatens to but it probably won’t, they’re going to have to do that,

DIRESTA: Yes.

HEFFNER: This union is gonna have to do that.

DIRESTA: Yes. Look, I think they see that the threat of regulation and the specter of those hearings got more done than anything else. There was a voluntary recognition: what they had been lobbying for for so long, being exempt from ad disclosure rules for political ads, they took that off the table voluntarily. That wasn’t pushed through legislation. That was done voluntarily. The Global Internet Forum to Counter Terrorism, which is a self-regulatory body of all the tech platforms, Snap, Google, Twitter, Reddit, they’re all in there, took several years to come together after the ISIS problem emerged. And it took pressure from Europe as well.

But these are the nascent beginnings of a thing. And so when I called it a reformation, I really do believe there’s an opportunity now to take that momentum. One of the things I always say is, right, we got there on ISIS. It took us three years, but as a [LAUGHTER] community, as an industry, we’re getting there. Why not expand the mandate to include countering propaganda? Why don’t we look at influence operations, at information warfare? Why are we limited to focusing on only one adversary? One thing I’ve seen in my own experience is that there are many, many different types of actors all using the same tools to run disinformation and misinformation campaigns. And there are places where you can intercede in the systemic propagation of that content, if there’s a backbone and a willingness to do that.

HEFFNER: And the companies are not hostile, or unwilling to listen, to the Union of Concerned Technologists members who have spoken out?

DIRESTA: They have not been unwilling, no. There are people within the companies who are making these arguments internally, and they have an incredible degree of clout and influence. That’s one of the reasons we really feel it has to be a community-driven, industry-driven effort. Self-regulation has come to other industries through whistle-blowers speaking up and companies deciding that they’re going to take steps to protect their customers.

HEFFNER: Final question.

DIRESTA: Yeah.

HEFFNER: What is a realistic objective for this year, 2018, so that we can say, at the end of the day, the Russians didn’t interfere in our midterm congressional elections? Is that too high a bar to …

DIRESTA: No, I don’t think so. I think that’s a very reasonable ask. Transparency and accountability, those are the two things we’re going for. On that front, if you have internal teams sharing information across the platforms, I think a task force can accomplish that while we fix the more systemic problems,

HEFFNER: Well, maybe it’s a wish list,

DIRESTA: [LAUGHTER]

HEFFNER: Or wishful thinking that these companies can resolutely determine how they’re going to counteract these malevolent forces, propaganda, bots, and trolls. But even if the Union of Concerned Technologists, in concert with these companies, came out with their own rules of the future, that would be huge progress.

DIRESTA: I think so. I think it’s something that is very important for the industry to recognize that, you know, we’ve been around for a while now. These aren’t start-ups anymore. It’s time to take responsibility.

HEFFNER: Thank you, Renée. And thanks to you in the audience. I hope you join us again next time for a thoughtful excursion into the world of ideas. Until then, keep an open mind. Please visit The Open Mind website at Thirteen.org/OpenMind to view this program online, or to access over 1,500 other interviews. And do check us out on Twitter and Facebook @OpenMindTV for updates on future programming.