At War With Cambridge Analytica
Air Date: May 12, 2018
HEFFNER: I’m Alexander Heffner, your host on The Open Mind. Did they delete the models, the algorithms, the software, the intellectual property that they derived from having the data in the first place? That’s the only question that matters. So asked the digital rights advocate David Carroll, a professor at the Parsons School of Design at the New School in New York, who is suing Cambridge Analytica through the United Kingdom’s Data Protection Act for stealing user data and undertaking vast information warfare against western democracies. At least that is the insinuation. “Technically, I haven’t filed a lawsuit. I’ve filed a claim for pre-action disclosure. So we’re asking the judge to force them to disclose what we can file as a lawsuit. The beauty of that is if the judge forces them, that is Cambridge Analytica, to disclose, then we don’t need a lawsuit. Because we’ll get what we’re after.” Those are the words of my guest today. And since Facebook disclosed that Cambridge Analytica had violated its terms of service to snatch the private data of users without their consent, the latest bombshell in this rapidly-paced story is that the Analytica front, perhaps an espionage operation, assigned non-US citizens to message for the candidates in 2014. There seem to be ever-evolving details every day on this story, David.
CARROLL: It’s exhausting to keep up with it. But I- I’m trying my best.
HEFFNER: And to what do you attribute this high-velocity movement? You know, the fact that Facebook, were they trying to preempt something here by making what was, in their mind if not in the public’s mind, a transparent disclosure? Why do you think the chips have fallen so rapidly now?
CARROLL: Well it’s been a long time in the making. It’s been more than a year in the making; a lot of people knew this more than a year ago. And it’s just been a matter of building up the potential for all this to come out. And for, in particular, the whistleblowers to gain the protections they needed to come out with these stories. But in terms of Facebook jumping ahead of the story, a few reporters have told me this is Facebook’s tendency. That what happens is reporters call Facebook to do fact-checks, which tips Facebook off on these stories, and so then Facebook goes and updates their company blog. And basically scoops the reporters on their own stories. To try to get ahead of the narrative.
HEFFNER: And I actually think, David, they had the idea that this was going to be viewed positively. That this was going to be viewed as openness and disclosure, and that this was not gonna lead to their market value diminishing, their trust ensnared in scandal. And then there’s the reality, which is that there is a true deficit of trust right now.
CARROLL: Yeah, the whole industry, not just Facebook, but I think the entire advertising complex, has not wanted to hear me. For years now I’ve warned them that their whole enterprise is gonna blow up in their faces. And it did.
HEFFNER: What blew up in their faces? There is the expectation, since Facebook became a for-profit entity, that they are going to maximize their profit, their bottom line, as any publicly traded company would, to appease their shareholders. The shareholders seem not to be concerned with the automation of bots and trolls and disinformation, sufficiently to carry out real deletion of bad content. But those of us who subscribe to Facebook as users knew we were entering into this Brave New World. Did we not?
HEFFNER: Right. A new term has been coined, in effect: psychoanalytics, the forensics of understanding the individual data associated with users that was mined. Can you give our viewers the history of Cambridge Analytica? How this company was formed, what its intent was, or what we’re now learning its intent was.
CARROLL: Sure. So the research field, psychometrics, has been around for a while. But it really advanced at Cambridge University, in their psychometrics center. Three scholars: Michal Kosinski, who’s now at Stanford; David Stillwell, who’s still at Cambridge; and Dr. Aleksandr Kogan, who is now this mysterious figure in this whole controversy. The three of them did a lot of the research and published a lot of the early papers on this idea. And the key idea is that we give away a lot more of our personality traits than we realize, simply by clicking “Like” on Facebook. And then they can examine a huge data-set of many millions of people; they can figure out how people are similar to each other. And that’s sort of this idea that you can measure people’s personality against the averages. And they have created a scheme to do this and to sort of map out people’s outlying traits. Separate from this, many years ago, was a company called Strategic Communication Laboratories, which was basically a commercial psychological-operations firm that served militaries and governments, as well as did sort of, quote, election management for politicians around the world. And as we see from the Channel 4 sting videos, you know, this is really unsavory espionage, commercial kompromat kind of work. Which is really unsavory and very unsettling. But back in 2014/2015, a company was formed out of the research at Cambridge University, and as it’s reported by Christopher Wylie, one of the whistleblowers, they basically created this company for Steve Bannon. And Steve Bannon apparently named it Cambridge Analytica, and they set it up near the Cambridge University campus to give it the air of an academic entity. But it was really a commercial attempt to bring data and data operations into the SCL suite of services. And also, to create a brand to enter into American politics and beyond.
HEFFNER: How does this disclosure, with Cambridge Analytica, differentiate itself from the efforts in 2008 and 2012 to attract predominantly young voters to then-candidate, then-President Obama’s campaigns? How is it different in terms of the harvesting of data and the cultivation of a political process?
CARROLL: Yeah, this is important to talk about. So the first thing in the big picture to realize is every election campaign becomes more and more technologically advanced. That there really is an arms race between the parties. So it was inevitable that from the Obama years and beyond we would see an increasingly aggressive set of practices. But what we didn’t realize was that the industry would become internationalized, would become militarized. And would become embroiled in, just, crises of democracy around the world. So it has leapfrogged, you know, what was standard practice in the previous election cycles. So I think, from the big picture, we’re kinda comparing apples to oranges, because during the previous presidential campaigns we were dealing with domestic, civilian voter-analytics operations.
HEFFNER: And also there was no finding that Facebook, which later decertified one of its third-party applications, there was no analogous circumstance in which Facebook was sharing or selling… your data, your individual user profile, in the way that it’s now documented. It’s like an Equifax scenario, where you didn’t just lose your credit card information; what was stolen from you was your personal identity: what you like, videos you post, anything that is part of that profile.
CARROLL: That’s right. That was an era when everyone was really, really promiscuous with data. And we didn’t really know what we were doing. But another key feature came online in 2013, after Obama’s second candidacy. And it’s something called custom audiences. And it allows people to upload data into Facebook, and Facebook encourages political campaigns to upload voter files into Facebook so that voters can be targeted one-on-one by name. So this capability was simply not available during the Obama years, and it was a capability that we are really confident was extensively used during the primary season, and really intensively used in the summer of ’16.
HEFFNER: And abused, in the sense that, if you believe it, Facebook should not have hosted the campaigns of folks who were seeking white supremacist or KKK voters. This is what you’re describing, right? The way in which third parties who were seeking to advertise and build their followings were able to use, sometimes, language and terms associated with hateful, genocidal movements. Is that part of this?
CARROLL: I mean, we need to find out; we need the forensics to prove things. But in terms of what the whistleblowers are saying, we have really serious concerns that continue to escalate. So our sort of initial fears are starting to play out. But the key idea to think about is how Facebook sits at the beginning and the end of this controversy. It is the place where data was harvested in illicit ways to build sophisticated targeting models. Then voter lists were sort of pre-targeted using Cambridge Analytica’s algorithms. Those lists were presumably then re-uploaded back into Facebook in the summer of ’16, after those databases had been enriched for more than a year. And then voters were targeted one-on-one and continuously, and they would respond to messages and their responses would inform the next round of messages. And so this idea of, you know, we don’t know if it works, meaning the psychometric techniques. What we do know is that the campaign was monitoring how it works on a very individualized level. And was continuing to narrow in on the people that were responding to it. So it probably did work for a small number of Americans. Those Americans may be among those who decided the election.
HEFFNER: So you are the complainant today. I think that’s in your lower-third banner right now: Complainant. Where does it go from here, in terms of your process and your would-be suit against Cambridge Analytica? Because the likelihood is they’re not gonna offer you a clear, transparent data-set of all the documentation that you’re seeking. So where do we go from here?
CARROLL: Yeah, it’s perplexing that they wouldn’t disclose the full voter profile, all 5,000 data points that they claim, because if they were interested in being cleared of any wrongdoing, that would be a really easy way to do it. That would be to give me my file, to show me there’s no evidence of Facebook data, that it all came from legally purchased sources, and other, you know, disclosures that would show that it never got into the wrong hands and everything is above-board and compliant. But they’re not interested in that, and they’re making me fight for it.
HEFFNER: Well, you’re also relying upon European laws. It’s not clear to me you’re breaking any American laws, Cambridge Analytica that is, not you.
CARROLL: Yeah, it really illustrates how we need probably a law in the United States that prohibits campaigns from using international firms, and processing US voter data in other countries.
HEFFNER: It is illegal for foreign actors, whether they are clearly identified or not, to donate funds to American political candidates.
CARROLL: Yes, and provide direct support and advice. Strategy and communication-
HEFFNER: Right, and in so many instances, in effect, Cambridge Analytica would be viewed as an in-kind contribution to the Trump campaign, even if consulting services were used in that manner. So in that sense, there could be a federal election law violation. But whether it comes to this issue or the Stormy Daniels issue and Trump’s lawyer, Michael Cohen, who paid her off using a company, essential consulting services, right? Strategic communication and essential consulting services: this is the name of the game. So the reality of the situation is, and I’ve said this on the show ad nauseam, you can have the chairman of the Intelligence Committee, Richard Burr, admit to the American people, get the social media executives to admit, that they broke the law by selling ads to foreign actors for promoting campaigns. And nothing was done about it. So your legal case, I believe, is hoping to set a precedent beyond the UK that could employ legal mechanisms internationally.
CARROLL: Yeah, hopefully it’s a wake-up call to all countries that we have to lock down the influence industry, that there’s this whole shadow industry of commercial election interference, or what they call election management. And how offensive it is that candidates would hire foreign nationals to work on their campaigns. There was just a most unsettling feeling that I had when I got my data from the United Kingdom. Just like, why did it leave my country? What is going on here? And why wasn’t that enough to cause a real re-think even during the primaries, because people knew that Cambridge Analytica was working for the Cruz campaign quite intensively. So we sort of knew this was going on, but why weren’t the alarm bells going off then?
HEFFNER: Maybe the alarm bells were not going off because in this country, if you watch “The Untold History of the United States,” whether you’re an Oliver Stone fan or not, you see the very real stories. One I’ll point to is Chile, where the United States government propped up Augusto Pinochet and did so for two decades plus. So, the reality is the US has interfered in many, many elections over the years, sometimes for the betterment of those constituencies and sometimes not. So isn’t this just our chickens coming home to roost?
CARROLL: We still have to recover and set things, set us on a better path. For-
HEFFNER: Right, but we can’t be so naïve as to refuse to recognize America is non-compliant. What kind of leadership do you hope Facebook or other companies will aspire to in the wake of this scandal?
CARROLL: Yeah, I think the interesting, complicated question that’s maybe unraveling is the relationship between failing to protect people’s privacy and national security. That is, you know, if we just surrender to the business community, which says there’s no harm in collecting people’s data to target them for ads. Well, maybe there is a harm, and maybe the harm is a kind of idea of cognitive security, national security. That if we let foreign entities harvest data about voters, and then create information spaces that sort of envelop them to influence their behavior and thinking, and continually measure their behavior and activities, and continually sort of perform experiments on them without their knowledge, or their ability to sort of escape from it, we’re really not protecting the country in a sort of cognitive way, through data and the sort of automated, algorithmic media environment. So we just never thought this would happen. But I think it really is happening, and I think the advertising industry’s insistence that they not be regulated or limited in any way has made us extremely vulnerable to maybe the practices that we’re guilty of doing to other countries in previous decades. Just in a much more technologically driven manner.
HEFFNER: We talk often on this show about the scourge of misinformation, disinformation, and that being the most corrupting influence, whether it is conspired by foreign actors or domestic actors. To me, that’s the gravest challenge that we face. And the questions that you present are equally important, but in the shorter term not as salient, in the sense that we want, whether it’s being fed by McDonald’s or the US State Department or the government of Chile, we want the information to be sound and accurate and honest and have integrity. And so what if the companies, ultimately, if they’re not gonna fold, what if they wake up and say we need to ensure integrity in our platforms? Can you envision the day where we have that kind of integrity on these platforms?
CARROLL: I’m not sure, because the incentives to tell the truth are competing with the incentives to just capture attention. And those are powerful incentives to countervail. So that’s why I think sort of pushing on data protection and privacy is a way to sort of create a protective layer for people. And then also I think the bigger questions of sort of antitrust, like, have these companies gotten just too big? And do they need to be broken up? Are questions that we’re hearing now for the first time. We’re also hearing lawmakers talk about things like data portability, that is, you know, that you can take your data from one platform and potentially bring it to another. So that these companies don’t have such an iron grip over our lives and our data and our identity, and they give us more sort of freedom in the marketplace to move around. And so there’s the idea that these companies are being considered sort of too powerful for their own good. They’re sort of beyond too big to fail. They’re like too big to, to just [LAUGHS] to maintain democracy.
HEFFNER: Right. Mark Zuckerberg was insufferable when he said that if it’s the right thing to do, he will testify. Giving a complete picture of his amorality and the idea that Facebook is this neutral arbiter: if it’s the right thing to do. Because he can’t stand up, like a mature adult, and say it is the right thing to do. And it’s also the right thing to do, if the US Senate legislation from Amy Klobuchar and Mark Warner is not gonna move forward, for Facebook to require any third-party political activity, political communications, to be transparently identified as such, as our FCC requires television ads to be. None of that has come to fruition; none of that has been acted upon.
HEFFNER: How can your legal challenge, which is now specifically in the UK, bring about reform more globally?
CARROLL: So, the UK law is very similar to the laws across the EU member states. And on May 25, a new EU law comes into enforcement called the General Data Protection Regulation, or as many people say, GDPR. And Sheryl Sandberg announced at Davos that Facebook is adopting the standard globally for all two billion users. And this is very exciting because it shows how the EU standards, which are very strict and powerful, are incentivizing the big companies to just adopt the best standard. And so I’m optimistic about what this signals. That it’s going to give people European-style data rights because it makes sense for Facebook’s business to do so. And then people are going to learn more about the rights that I’ve discovered that we have, such as the right to get your data, the complete data, all the data. To better understand how it’s working and how it’s affecting us. And then more rights to recapture that data and take control over it, take ownership of it. These are things that are enshrined in the European idea, because for them, privacy is about dignity. For Americans, it seems to be about keeping secrets. And these are different ideas. And I’m a big fan of the European way of thinking about this.
HEFFNER: And how does the European way of thinking about this extend into the political norms that we hope to codify?
CARROLL: So part of the precedent too is to demonstrate that political profiling is a particularly sensitive category of data. And when the same tools are being used to sell ski vacations and razor-club subscriptions as are being used to potentially uproot our culture, we need to understand how it works, and we need to give people power to take control of their own identity. And freedom.
HEFFNER: What if these tactics are employed in the context of war-making, which may be the future? It’s not just the information warfare, but it’s the actual warfare born out of that information, or misinformation.
CARROLL: I think there is an invisible war that’s been going on. And maybe…
HEFFNER: And persuading people about Iran and North Korea: the current administration is signaling it’s going to take a more bellicose approach. The same tactics employed by Cambridge Analytica towards the election of Donald Trump could be used towards involvement in military affairs, could be used to start a war, a nuclear war. Because Cambridge Analytica is still operating, business as usual.
CARROLL: Trying to.
HEFFNER: It, it still can though right?
CARROLL: It still can. And,
HEFFNER: Have sanctions been enforced against the company?
CARROLL: Not to my knowledge.
HEFFNER: That’s another approach.
CARROLL: But I mean, Bolton’s super-PAC hired Cambridge Analytica to make war a more favorable kind of concept,
HEFFNER: As a political message.
CARROLL: As a political message.
HEFFNER: And- maybe now is public policy.
CARROLL: I know.
CARROLL: It’s scary.
HEFFNER: Any final thoughts? You know, my final thought is just that Donald Trump complains and complains because it’s convenient for him to complain. He complains about FISA, and I am waiting for Ron Wyden and others, the Democrats, to turn that against him and say, really? Well, what about Cambridge Analytica? You’re concerned about FISA but not Cambridge Analytica?
CARROLL: Yeah, no it’s…
HEFFNER: I refer to the FISA warrants, that he claims he was a victim of surveillance.
CARROLL: That potentially every single US voter was the victim of an incredible, unprecedented surveillance scheme by a foreign military contractor hired by the Trump campaign. And the fact that the President’s data company was raided by the British authorities. Wow.
HEFFNER: So every time he says FISA, you say Cambridge Analytica. David, keep up the good work.
CARROLL: Thanks for having me on.
HEFFNER: And thanks to you in the audience. I hope you join us again for a thoughtful excursion into the world of ideas. Until then, keep an open mind. Please visit The Open Mind website at Thirteen.org/OpenMind to view this program online or to access other interviews. And do check us out on Twitter and Facebook @OpenMindTV for updates on future programming.