The Hacked States of America: A Classic 2019 Episode
Air Date: September 23, 2019
I’m Alexander Heffner, your host on The Open Mind. On his quest for digital justice, Parsons School of Design professor David Carroll joined us last year to discuss his suit against Facebook and Cambridge Analytica for violating his right to personal data. As we filmed our episode, a Netflix crew surrounded us and trailed Carroll’s journey. You can see part of our interview in the groundbreaking documentary directed by my guest today, Karim Amer. The Great Hack is a tour de force of the digital dystopia, law-breaking, and anti-democratic realities of the Internet age. Amer explores how Cambridge Analytica exploited the demons of social media in the Brexit campaign and in the run-up to the 2016 U.S. presidential election. Today we’ll discuss the weaponization of the digital economy, which preys on an increasingly vulnerable, misinformed, and, I think we must say too, disinformed globe, and is exacerbating the greatest inequities society has perhaps witnessed. We are honored to host you, Karim. Congratulations on this mesmerizing international sensation: The Great Hack.
AMER: Thank you for having me.
HEFFNER: It’s a pleasure. You were here trailing David, filming in this very studio, and let me tell you, the response to that episode before The Great Hack made its debut on Netflix was astounding. I mean, people finally got what was going on, and you have now had weeks of seeing the public respond to your work. The reaction, please tell us.
AMER: It’s been extraordinary. I mean, really, the journey to get here took over four years in the making, with an amazing team of people with me in this process: Pedro Kos, who is our writer and producer, and Judy Korin, and Jehane and Mike, who also directed it with me. It’s been a great ensemble project to be a part of. And beyond the team, what’s been so exciting is that the timing couldn’t be better for this conversation. You know, the same day that the film came out on Netflix was the day of the Mueller testimony. It was interesting to see that Facebook was fined $5 billion the same week the film came out, while Mueller was talking about the continued insecurity of our information highways. I think the audience couldn’t be more ripe for this conversation, and it’s great to be back here where so much of it all started with David’s journey, you being one of the first places that was willing to give him a platform.
HEFFNER: One of the only opportunities for justice is David’s journey, and your journey, of not only fining these companies, but ensuring that they adhere to policies that are going to protect the privacy of people and are not going to inspire demagoguery. And that’s where this continues to be adjudicated. Because, you know, the Special Counsel’s office is closed.
HEFFNER: He indicted Russian entities. They’re not facing the music, not facing justice here in this country, but you and David, through the course of this film, along with one of the star whistleblowers of the film, are still demanding justice.
AMER: And we have to. I mean, I think that what David’s quest is really about, to use his own words in the film, is a desire to make the invisible visible, right? And we’re living in a world where so much of our reality is being recorded under the guise of surveillance capitalism, in ways that we really don’t understand and under terms that we may not really have agreed to, even if we clicked that button long ago saying that, yes, hurriedly, we accept these terms and conditions.
We’ve entered into a space where we don’t really understand our rights, and these rights are important to understand because they are fundamental rights. They are not just about privacy. They are not about even one election. These rights are about power, and they are about who determines what we see, and when and how we see it, and who determines what we don’t see. And I think that David is asking the fundamental question of what he has the ability to request about himself. Does he have the right to have his own data? Does he have the right to know what they know about him and how they target him based on that knowledge? And that question, which he asked, led to this incredible journey, which, as you all know, took him from asking Cambridge Analytica for his data to eventually filing a lawsuit against them in the U.K., and then going on a journey to try to find out what happened.
So it’s been a remarkable adventure and, I think, an important civic duty and lesson for us all: that one person can make a difference, and we have the right to step into this battle as citizens and get stuff done.
HEFFNER: Karim, what is the status of David’s suit?
AMER: So David still has not gotten his data back, which is quite a shame to see in 2019, with everything we know now about the importance of data: that we still don’t have the right in the United States as citizens to simply file a request and get an understanding of what our data is and how it’s being used against us. But David has not given up, because he’s a fighter, and he is hoping in the next couple of weeks to hear some updates from the U.K., which I cannot share, but there may be some good news for David around the corner.
HEFFNER: What about other jurisdictions? That’s the jurisdiction that stuck in terms of his capability to fight this out legally. But what about here in the United States?
AMER: Here’s the reality. Facebook has become a crime scene, and Facebook needs to be held accountable, and we need to be able to use the rule of law to do so. The $5 billion fine did nothing to Facebook. In fact, their stock price went up, and they made $6 billion that week. So the market signaled that the fine was quite nothing. I think we have to decide as a society what’s for sale and what’s not for sale, and if the entire democratic process has become commoditized, what are we going to do about it? Are we going to sit back on the sidelines, as Carole Cadwalladr, who’s the other character in our film, says, and let them get away with it?
Or are we going to demand something better? I think it’s time for a new social contract. I think that’s what the clamoring is about. But in 2019, or as we enter 2020, that social contract is no longer between citizens and government. It’s between citizens, government, and tech platforms. And it comes in the name of a user agreement. So I think we should be looking for who the new authors of an equitable user agreement are, and how that agreement could be something that reignites confidence and faith in our ability to use the most important phenomenon of our lives, the Internet, in a safe and clean way. And if that means the Internet needs to come with safety, with seatbelts, maybe it does, but we have to start. We have to start having that conversation, because our democracy is at stake.
You know, when you hear what the official report from Parliament said after doing a multiyear investigation post-Brexit, they said, simply, as we say in the film, that the electoral laws are just not fit for purpose. And they labeled Facebook as digital gangsters. So this is what we’re dealing with.
HEFFNER: When Chairman Richard Burr hosted the social media executives, and that’s how it was, because Zuckerberg wouldn’t come initially, he hosted them, he didn’t interrogate them, but he pointed out that they were violating, over the course of the entire U.S. presidential campaign, FCC and FEC regulations: FEC regulations in terms of not knowing who was buying ads on their platform. That was in violation of the law. And they admitted to it, and there was no punishment; it was impunity. There was no punishment for the fact that these deep, dark centers of disinformation, the Internet Research Agency among them, were able to buy ads in whatever currency, rubles in this case, and Facebook didn’t have to retain the information about who they were selling ads to. And that was in violation of U.S. law. So here’s the question: is there anything stopping Facebook, in effect, from being complicit with the next Cambridge Analytica? Is there anything stopping a third party from complicity with Facebook?
AMER: Facebook has definitely taken some steps since 2016 to clean up their act a bit. However, we still don’t have basic knowledge from Facebook about what did or didn’t happen during Brexit and 2016. They have still refused to turn over the evidence, in full detail, about what ads were run, what ads weren’t, and who paid for them. So we need more transparency, and we need Facebook to do better. I mean, no one believes that Mark Zuckerberg and his team wake up and think, how are we going to wreck democracy today?
AMER: But the reality is that their actions and their negligence are leading to that, across the world, right?
HEFFNER: Negligence is the key word here, because I don’t think they’ve said, in terms of their oversight prerogative, that it is their responsibility that they’re not going to contract with third-party vendors who will potentially exploit data the way Cambridge Analytica did: the psychological analysis of users and their data, the metrics that are being analyzed. I mean, they could just bring in a different third party and say, oh, we want you to do this task, or allow the Trump campaign or other campaigns to run ads based on psychological profiles of their users. There’s nothing stopping that from happening again. Right?
AMER: And the thing is that running ads based off of people’s psychology and personalizing ads is not actually inherently bad, right? These are all technological tools, and tools in their very nature aren’t good or evil, right? I think the problem is that this is happening in what I think is going to be called the wild west of the data world, because there are just no parameters, and there was this swashbuckling attitude of just get in there, grab the data, and exploit it as quickly as you can.
And that is because of an entire business model, fueled by venture-backed Silicon Valley startups, of everybody wanting to be the next great big data company, without any kind of ethical limitations or regulations saying what people can or can’t do. And most importantly, as you said, one word was negligence; another word that we have to talk about is consent. I think the big problem with this conversation is consent. We don’t have a consensual relationship with the way in which our data is used and not used. And that’s something that we need to figure out. You know, if data is simply your recordable human behavior, what aspect of your life are you comfortable having surveilled, and shared, and sold to a whole network of brokers that can then lead to information and profiles being created about you?
Are you okay with your sexual history being fully examined that way? Are you okay with your education history being used that way? Your medical data? So we need to have an understanding of which buckets of our behavior we are consenting to have participate in this kind of data economy, and which aspects of our behavior we are not. And that should be our choice. These technological tools have the ability to create incredible improvements for our lives. But in order to do that, they need to be serving us. And what’s happened is we have become the commodity, and that’s where we have to start. We have to change that relationship, flip that seesaw and that imbalance of power, and demand a better-connected world.
HEFFNER: One example of that is using Mozilla Firefox. I mean, as soon as you use Firefox instead of Safari or Chrome, you’re saying, I’m not going to let my browser make money off of my personal browsing history. Right? Those are choices that we as human beings and responsible citizens have to make too. In this case, though, the for-profit browsers corner the market. I mean, there’s really one viable nonprofit that’s pledging to you: we’re not going to steal or sell your browsing data. Firefox, right?
HEFFNER: And in the social media environment, there are no options, right? I mean, it’s Facebook; Twitter.
AMER: This idea of opting out is not a real choice, right? And, you know, not being able to be a participant in the connected world because you’re rebelling against it isn’t going to get us anywhere.
HEFFNER: But you can potentially opt into nonprofit …
AMER: You can opt into nonprofit entities, and that is an important and noble thing to do. However, I would say that the challenge we are facing in this space of information warfare is at the size and level of something like climate change; we just haven’t had the words for it.
There’s been this deficit of language. You know, if you remember, before films like An Inconvenient Truth, people didn’t use the words global warming with the same literacy and frequency. So it takes films, it takes conversations, to allow people to see this world and create a language and lexicon around it. And I think that’s where we’re at in this conversation. So with that in mind, I think it’s important for us to realize that, yes, if we all recycle and if we all stop using plastic straws, that’s a good step. However, just as it requires a massive societal shift, one that comes in the participation of citizens, governments, and corporations, to begin to have a real effect on climate change, we need that same level of societal shift if we’re going to clean up the Internet highway.
HEFFNER: That’s right: mature sophistication in the way that we reconceive the social media climate and platforms.
HEFFNER: Wired, the leading technology publication of this planet, says about “The Great Hack”: “Netflix brings our data nightmare to life.” You were referring to it before as a horror film, as they do: “the new documentary about Cambridge Analytica uses thoughtful narration and compelling visuals to create a dystopian horror movie for our times.” And they say: “If you’d rather not think about how your life is locked in a dystopian web of your own data, don’t watch the new film on Netflix, ‘The Great Hack.’ But if you want to see, really see, the way data tracking, harvesting and targeting takes strands of information we generate and ties them around us until we are smothered by governments and companies, then don’t miss this film.” What’s next in terms of your pursuit of justice and striving for the goals you just set out, which are in effect the potential solutions to all the damage the film accounts for?
AMER: Well, you know, I think that “The Great Hack” is the beginning of a conversation, in that you’ve opened up this door to a space that a lot of people from around the world want to be talking about. And we are looking now at how we can continue that conversation. We’re working on a couple of other projects, including a series of short films that we hope to be releasing throughout 2019 and 2020, that can allow us to continue to see where some of these record sites are in the area between technology and society, and particularly disinformation tactics being used in that space. What keeps me going on this is that Cambridge Analytica was a behavior change agency. That’s what they bragged about being. And as we’ve unpacked that story, we realized that Facebook is ultimately a behavior change agency, and so is Google in many ways, and so are so many of the companies operating in the new era of surveillance capitalism. And behavior change isn’t implicitly a bad thing. You know, we need advertising; we need marketing to effectively communicate. But I think that there’s a difference between persuasion and manipulation, and we need to start figuring out where those boundaries and borders are in different aspects of our life. Are we entering into a world where algorithms that are amoral in nature are going to be determining who gets certain access to education and who doesn’t? Are we living in a world where criminality scores determined by algorithms are going to be the ultimate arbiter of justice? Are we living in a world where an algorithm is predetermining your ability and your family’s ability to have access to certain financing, and foreclosing certain choices in life?
And if so, who’s determining the ethical boundaries of these algorithms, and who’s watching the watchers? And I think that’s where it takes citizens. That’s where it takes people like David Carroll. That’s where it takes shows like this: people having the conversation and demanding that we do better, demanding that at this critical crossroads in human history we band together and realize that despite our tribal differences, we still have a human connection that should bond us and should allow us to figure it out before we ultimately surrender ourselves to an amoral algorithm that we know very little about.
HEFFNER: Amoral or immoral?
AMER: Well that depends on who’s in control of it.
HEFFNER: Right now, the unifying principle and impetus is greed.
HEFFNER: I mean, in WIRED too, I want to send this to you, and our viewers can check it out: a piece that I wrote on the monetization of disinformation.
HEFFNER: But specifically, the unifying factor here is greed. And that’s why it’s curious to hear you say that you can opt in to the nonprofit, because right now there’s one psychosis, one reality, that unifies the companies you mentioned, Alphabet, Facebook, Twitter, and that is greed. I mean, the incentive is, you know, we want people to share the doctored video of Speaker Pelosi. And what about all of the deepfakes, the videos of folks saying things that they didn’t actually say? Where is the ignition of greed going to be paused so that they can reset principles and go in the direction you’re envisioning?
AMER: Well, I think that has to come from, not to be cliché, but from we the people, right? I mean, we the people need to band together and demand that our government serves us, that our technology platforms and our data serve us. Our data is coming from us, yet it’s going into spaces that we don’t really understand, and it’s not working for us. And that’s by design. That’s not how it has to be. I call upon the engineers of the future in Silicon Valley to remember that this incredible valley, which is in many ways not only a hub of global innovation but of national security, is standing on the shoulders of the open society. These ideals of openness, multiculturalism, and human rights were forged out of the conflagration of the Great War, of genocide, of great human loss.
And we came up with these ideals to serve a better humanity, which allowed for places like Silicon Valley to exist and become a refuge for engineers from all over the world. So their great contributions to this connected world, this great engineering of the future, come with a debt they have to be aware of, and that debt is to the ethics and ideals of openness. And if they’re not willing to protect those ideals, then we’re in trouble, and we have to demand that our government systems, which are supposed to represent us, are there holding that power accountable and forcing them to do the right thing.
HEFFNER: Their definition of openness is different from ours. I think of Jack Dorsey in particular; you know, he has had an appetite for white supremacy, the modern Klan, and all of its manifestations on Twitter. I mean, Sleeping Giants, the campaign to eliminate bigotry in advertising, points out regularly just how much defamatory content lives on and is marketed through algorithmic practices on Twitter.
HEFFNER: Their definition of openness leaves out democracy, or liberal democracy; it leaves out rights-based societies.
AMER: But here’s the problem. Their business model, it seems, is now one in which the polarization of the American people is a built-in growth engine for many of these platforms. If that’s the case, then our biggest concern is not Russian interference or some outside player. Our biggest concern is that this Republic is going to fall on its own sword because of the incredible growth engine of technology platforms pushing the American public further and further into polarization, with that baked into their business model. And if that’s what they want to be remembered as, then that’s a shame. But we don’t have the time to wait for Silicon Valley to do the right thing. We have to demand that these platforms do better, because without us, these platforms don’t work. They need us.
HEFFNER: And we have to be honest, too, about the fact that Cambridge Analytica may be dead, the organization may be dead, but look what they accomplished. I mean, look at the manipulation, in your word, and the forces we were discussing, debating whether they’re immoral or amoral. Well, there were some immoral folks behind the curtains who were intent on that disruption through Brexit and the Trump election. So in that sense, Cambridge Analytica is still living in the polarized, bigoted politics that is fueling new demagoguery and resistance to democratic norms, right?
AMER: Absolutely. And we have to ask ourselves, with another American election coming up quite soon, whether the integrity of the democratic process is worth something to us, and what it is worth. Can we have a democratic election that is free and fair when we have crime scenes in these tech platforms and know very little about how information warfare is being waged on them? Can we have a democratically free and fair election when we can point to, as you’ve said so clearly, a business model incentive for these tech platforms to continue to polarize the American people? Can we have a free and fair election when we’ve allowed for the entire commoditization of the democratic process? We need to decide as a country what’s for sale and what’s not for sale.
HEFFNER: Final question, Karim. Besides watching “The Great Hack” on Netflix, what can folks do to safeguard democracy in 2020?
AMER: What can folks do? Step one, I think, is for people to be aware of their own digital footprint and understand that everything you do and say and act upon in the connected world can be used against you. I think it’s also important for people to realize that we are all much more vulnerable than we may think we are. The film talks about the persuadables, the people who were influenced one way or another during 2016 and Brexit. The reality is that’s not one bucket of people. We’re all persuadable in a different way, shape, or form, and we just have to realize that we’re constantly being targeted, and that the admission fee we’ve accepted to the connected world is this space of targeting that we don’t really understand. So be aware of what you think and how you think, and keep an open mind, because if you’re not, there’s someone out there thinking for you.
HEFFNER: Or, not so open that your brains fall out. But also, Karim, when you were here trailing David, I said you were working on the next “Icarus,” the Oscar-winning documentary from last season. I think you’ve done it.
AMER: Well, thank you. Thank you. We’ll see.
HEFFNER: Pleasure being with you.
AMER: Thank you for having us.
HEFFNER: And thanks to you in the audience. I hope you join us again next time for a thoughtful excursion into the world of ideas. Until then, keep an open mind. Please visit The Open Mind website at Thirteen.org/OpenMind to view this program online or to access other interviews, and do check us out on Twitter and Facebook @OpenMindTV for updates on future programming.