Danielle Citron

How to Preserve Dignity in the Digital Age

Air Date: July 25, 2022

Digital privacy leader Danielle Citron discusses her forthcoming book "The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age."


HEFFNER: I’m Alexander Heffner, your host on The Open Mind. My guest today is Professor Danielle Citron. She is the Jefferson Scholars Foundation Distinguished Professor in Law at the University of Virginia, where she teaches and writes about privacy, free expression, and civil rights, and their connections to the future of democracy. Danielle, a pleasure to host you today.

 

CITRON: It’s a pleasure to be here.

 

HEFFNER: Let me ask you, since you delivered your TED Talk, viewed millions of times over, on deepfakes and their potentially disastrous impact on society and democracy. That was 2019, right?

 

CITRON: Yes, I gave that talk in 2019, mm-hmm…

 

HEFFNER: So where have we gone in terms of fortifying any protection against that threat of deepfakes and, more broadly, the threat of invasions of our privacy, and how they can manipulate real information and the outcomes of, you know, democratic processes?

 

CITRON: You know, just to set the stage: when I first gave that talk, there were something like 15,000 deepfake videos detected online, and 96 percent of them were of women, right, deepfake sex videos, women’s faces being inserted into porn. Today, there are over 50,000 pieces of deepfake imagery, video and audio, being detected online, and again, 96 percent of it is women’s faces being inserted into porn. So I always say that women are the canaries in the coal mine. The kinds of disruptive uses of deepfake imagery just portend problems down the road.

 

And so that’s what I talked about in my TED Talk: the concern that we would see deepfakes used to accomplish election mischief and other democracy-undermining efforts. And we’ve seen some of it, right? We’ve seen deepfakes in India used, actually by a candidate, to make it seem that he could speak languages and dialects that he didn’t speak (laughs), right? So in a way it was mischievously used to deceive, but by the candidate himself. And we’ve also seen, I think it was Pakistan, deepfakes used to undermine a candidate, to show them doing and saying something that they never did. But what lies ahead of us, I fear, is uses of deepfakes that do two things. One is that it’s fake, right?

 

And timed right, the night before an election, you see an election turned. At the same time, and I think we’ve already discovered the other downside, the existence of deepfakery, and its persuasiveness, allows people to say, eh, everything is fake. Don’t believe that video of my actual mischief. Right? My co-author Bobby Chesney and I call that the liar’s dividend. And I fear that we’re here, right? The here and now of trust decay, we are living in it, unfortunately. And deepfakes just make matters worse. Right?

 

HEFFNER: When you see those examples from India and Pakistan and think of the potential threat in the United States, which hasn’t been fully realized in the political impact…

 

CITRON: Right.

 

HEFFNER: You’re talking about the social impact and, you know, the impact of violations of women’s trust and their rights. But when you see those examples from elsewhere around the world, in what way are you most concerned about that manipulation taking place here in the next few years?

 

CITRON: So, sorry, what most worries me, I think, is politicians using deepfakery to show their political opponents doing and saying things that they never did or said, or activists doing and saying things that they never did or said, to discredit real, important rights movements on the ground. And in some respects we’ve seen the liar’s dividend already. Trump, when he was in office, said of the Access Hollywood tape, “Oh, that wasn’t me. That wasn’t my voice.” Right? He tried the gambit of saying it’s all fake, don’t believe it. In many ways it didn’t take off, because it was very difficult to believe anything he said. But people are already living in that world of, you have some proof of my wrongdoing? That’s not real. Right? And every election since I’ve been studying deepfakes, I just pray a little, like two days before, because the sophistication of the kinds of ops that we’ve seen from foreign actors, domestic actors, the use of falsehoods to sway the electorate, is already profoundly afoot. Right? Whether it’s QAnon or falsehoods about Hunter Biden’s laptop, whatever it may be, we see falsehoods perpetrated, or real facts taken out of context and misused. And so I think we have elections before us in which I worry that people of good faith, their candidacies, will be undermined in ways that make it very difficult. Because you can’t redo an election. Right? We have an election, and if the deepfakery comes out like the night or two before, you don’t say, do over, right? It’s just a smashed democracy. And I think what follows from that is that we already have such deep distrust of government. We have massive distrust; I think only something like 30 percent of people say they remotely trust Congress, the President, the Article Three courts. And that’s a dangerous place to be in, right? And the distrust which I think is so upsetting is of the mass media. The fourth estate, they are the guardians at the gate, right, to truth. And the idea that we would have this complete lack of faith in their findings, in real repositories of trust, have we just so eroded that? And so I think we all need a lesson in civic engagement, each and every one of us, not just young people. But I worry about where we could be before an election. That’s some of the work that Rick Hasen is doing in his UCLA project, Safeguarding Democracy: in some sense, to educate us so that we can help ourselves, right?

 

HEFFNER: Yes.

 

CITRON: From the mischief…

 

HEFFNER: Danielle is referring to the UCLA Safeguarding Democracy Project, to which she is an advisor and contributor. I think the ongoing testimony regarding the events of January 6th shows that you can see horrible things that are not fake. For those who are not familiar with deepfakes: they are constructed, manipulated video, taking someone’s real face or presence and giving them words that they didn’t actually say, but making it look like they said those words. With the January 6th footage and the videos that have been exposed by the committee, in a sense, in terms of looking at the deterioration of civil order, you don’t need anything fake or inauthentic to see the tension and the very active threat to the transfer of democratic power in our country. So how does that factor in, that these very awful real events took place? They weren’t manipulated video; they were real. In terms of disbelief and inauthentic content, what do you think about in the context of our democracy and deepfakes when you think of January 6th and those videos?
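Most face-swap deepfakes of the kind Citron’s statistics describe rest on one simple trick: train a single shared encoder with a separate decoder per identity, then encode person A’s face and decode it with person B’s decoder. What follows is a minimal sketch of that architecture, assuming PyTorch; the random tensors are placeholders for aligned face crops, and a production system would use far more data and capacity plus adversarial or perceptual losses.

# Conceptual sketch (assumes PyTorch) of the classic deepfake face-swap
# architecture: one shared encoder, two identity-specific decoders.
# Random tensors stand in for aligned 64x64 face crops.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),                          # latent code
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(), # 32 -> 64
        )
    def forward(self, z):
        return self.net(self.fc(z).view(-1, 64, 16, 16))

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()   # one decoder per identity

faces_a = torch.rand(8, 3, 64, 64)  # placeholder for person A's face crops
faces_b = torch.rand(8, 3, 64, 64)  # placeholder for person B's face crops

params = (list(encoder.parameters()) + list(decoder_a.parameters())
          + list(decoder_b.parameters()))
opt = torch.optim.Adam(params, lr=1e-4)
loss_fn = nn.L1Loss()

for step in range(3):  # real training runs for many thousands of steps
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    opt.zero_grad()
    loss.backward()
    opt.step()

# The "swap": encode person A's face, decode with person B's decoder,
# rendering B's appearance in A's pose and expression.
fake = decoder_b(encoder(faces_a))

The swap line at the end is the whole point: because the encoder is shared, the latent code captures pose and expression, and the identity lives in the decoder.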

 

CITRON: I think it just goes to show that, of course, all of that was premised on a falsehood, right? That the election was stolen, that we had massive voter fraud, that voting machines somehow flipped votes at a massive scale. And so what worries me is someone who has the trust of so many people, if that person is a would-be tyrant. The lies that he told didn’t need us to see them, right? He just said them, like, the election was stolen, and he made up facts at the ready. But what’s even more potent is if he had video. Remember, there was one video clip where they were saying, oh, that video clip shows someone passing, like, a disc or whatever it was, and it was just people exchanging what… they’re doing their jobs. And that was pointed to. It was of course a true video of someone handing someone something else, completely benign, but then used to shore up his fakery. Right. The Big Lie.

 

HEFFNER: Yeah.

 

CITRON: And I think what I worry about, so we did have some audio and video in there, right? It was just misdirection; it was a lie about what that was. And my deep worry is about when you have a charismatic figure whom people believe, who can get people to believe lies that are just, you know, words, right, with a little video, but really words. We know that video and audio are so potent; they hit us in the gut. Recall, when we’re watching these hearings, when we get to see the videotape of someone talking, there’s nothing better, or clips, right, of the events. We rely on video and audio as proof of the world, because we don’t need anything else to bear witness. We just say, we have our eyes and ears. We see it. Right? It’s visceral. It punches us in the gut. We say, of course it’s true, we saw it. So my worry is that if, as President, Trump was so successful telling lies that were just words, then if he points to fake video and audio, that’s a recipe for disaster. Many more people will believe it. And so to me, it’s just preparation for the future. And I’m not saying it’s going to be Trump. It could be somebody else.

HEFFNER: Right. I guess what we have going for us is that the act of depicting a stolen election, you know, in terms of chads and voting machines, it’s just not a very sexy thing. I mean, you started by talking about porn…

 

CITRON: Right.

 

HEFFNER: And the fact that, you know, this was incriminating women, depicting them doing acts that were, you know, considered by some to be unethical or immoral or whatever. In this case, the best you could have, I suppose, is a group of a hundred people who go on camera saying that they tainted election computers or systems, when people weren’t doing that. So I suppose you could invent a 60 Minutes, if you will, a faux 60 Minutes report where all these fake people are saying that this was stolen. Right? I mean, that’s kind of the…

 

CITRON: I have a much more mischievous mind than you, so maybe I shouldn’t give people ideas. But, you know, think of Brad Raffensperger, the Georgia Secretary of State. And imagine a deepfake of him, and it looks like him, it sounds like him, saying something along the lines of reaffirming that it was all stolen and we’re just covering it up.

 

HEFFNER: Right. And preempting…

 

CITRON: Does that make sense? That’s the potent stuff, right? I think we could come up with other things.

 

CITRON: Well, especially preempting him, preempting the real Raffensperger.

 

HEFFNER: Right.

 

CITRON: And say, like, are you lying? We saw you on tape. You know what I mean? Like, you’re just trying to cover your…. Just imagine how a tyrant can spin it.

 

HEFFNER: Right. Right. Because you’re sowing those seeds of doubt. You have a fake Raffensperger come on, and then the real one comes out and says that wasn’t me.

 

CITRON: Yep.

 

HEFFNER: You have those people who wanted to believe the first iteration and now disbelieve the second. So, I hear what you’re saying.

 

CITRON: It’s like chaos, right? And confirmation bias, as you said: when you see something that confirms what you already believe, you believe it. You’re like, that’s proof, I don’t need anything else, right? And so it creates such discord.

 

HEFFNER: Now, let me ask you a broader question about your expertise in cyber-hygiene and, you know, what basically occurs online. We’re behaving like there’s still a pandemic, because there still is a pandemic, and a lot of people in America aren’t right now. But we’re recording this in a climate of still-spreading variants of this disease and other potential diseases. In this climate, many of us have operated online and have a renewed appreciation, or maybe a first-time appreciation, of the sanctity of privacy online. Folks who didn’t think about what browser they used or where they bought their products are now, over these past two and a half, three years, considering that in their decision-making: Who’s tracking me? What is my digital footprint, and does it protect or shield my privacy or not? So have you found that there is a more mature constituency of digital actors than before the pandemic, or not?

 

CITRON: One of the most difficult things is that we think of hygiene as an individual matter, and what any one of us can do is so limited. It’s really a systemic problem. And so that’s why in my new book, “The Fight for Privacy,” I talk about ways in which we need companies, governments, platforms to be the caretakers of our intimate information, right? Some stuff you don’t collect. Some stuff you don’t sell, should never sell: intimate information. There’s only so much that each and every one of us can do, because we see a privacy policy and we think, okay, we’re good (laughs), we click on through. And even if we objected, there’s so little any one of us can do. We have few rights that are guaranteed in law, and we need to change that. So we need to change the architecture of the way these services are built, how these companies collect data, share it, and sell it. And then of course that robust market of data brokers needs to be regulated. So in some sense, it’s bigger than us. It’s centrally about our life opportunities, our ability to love, to have and form relationships, to become our authentic selves, and to enjoy dignity. But in a way it’s almost out of our hands, because it’s so structural.

 

HEFFNER: I hear you.

 

CITRON: I know that’s depressing. (Laughs) I have some tips at the end of my book, but really, I say there’s only so much we can do. I’m going to give you 10 things, but (laughs) it’s with caution.

 

HEFFNER: I don’t mean to make this contentious at all, but I think… I would not like to believe what you’re saying, in terms of these systems being cooked already, or rigged.

 

CITRON: Oh, we could undo it, right? It doesn’t mean that companies can’t change. I work with companies on the inside all the time, right? Twitter, Facebook, Spotify, Bumble, TikTok; I advise these companies. And so they’re building products and services right now, and I’m urging them to build with privacy and safety in mind as they build them, and they can do it. Right? So I’m not saying it’s useless. But it’s difficult for us individually. There are 4,000 data brokers, Alex. Do you have time, or do your friends have time, to go to each and every one of them to say, please delete my data? And you know what they’re going to say to you? Too bad, so sad. (Laughs) I don’t care what you think; you have no right to ask me to delete it. So in a way, I feel overwhelmed myself. Maybe I’m just overcompensating for my own sense of inadequacy about what I can do to protect my own privacy.
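Her arithmetic is worth making concrete. Even if the letters themselves were generated automatically, as in the sketch below, the sending, tracking, and follow-up for 4,000 brokers still falls on the individual. A minimal sketch, assuming Python; the broker names and the letter template are hypothetical placeholders, and absent a deletion right under a law like the CCPA or GDPR, a broker can simply refuse.

# A tiny sketch of the scale problem Citron describes: even with the
# paperwork automated, someone must still send and follow up on thousands
# of requests. Broker names here are hypothetical placeholders.
TEMPLATE = """To: {broker}
Subject: Request to delete my personal data

To whom it may concern,

Please delete all personal data you hold about me and confirm in writing.
Where applicable law (e.g., the CCPA or GDPR) grants me a right to
deletion, treat this as a formal request under that law.

Name: {name}
Email: {email}
"""

def deletion_requests(name, email, brokers):
    """Yield one formatted opt-out letter per data broker."""
    for broker in brokers:
        yield TEMPLATE.format(broker=broker, name=name, email=email)

# With ~4,000 brokers, even at five minutes per request that is roughly
# 4000 * 5 / 60, about 333 hours of work: the "it's bigger than us" point.
brokers = [f"ExampleBroker{i}, Inc." for i in range(4000)]  # placeholders
letters = list(deletion_requests("Alex Example", "alex@example.com", brokers))
print(len(letters), "letters generated; sending and follow-up are the hard part")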

 

HEFFNER: Right. No, I hear you. And for those viewing this, Danielle is really the nation’s, and maybe even the world’s, leading expert on this. Go find her book when it comes out this fall: “The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age,” from W. W. Norton and Penguin Vintage UK.

 

What’s disheartening about that idea, and again, you just said we can change course, the ship hasn’t totally sailed, is the fact that Firefox and Wikipedia have been the exceptions to the rule among institutions online. And when I say that, for those who aren’t familiar: Firefox is the browser that has pledged never to sell your data; Wikipedia is a source of information that is not profit-driven and has a truly democratic and increasingly gender-diverse audience and editing group. But my question to you is, when you talk about systemic flaws and failures, and your work inside these companies to make things better, how are you trying to make things better from where we are now? Because the incentives that drove Firefox and Wikipedia to be the exceptions to the rule, well, it seems like for a decade-plus, whatever Web 2.0 became, it was driven by the opposite incentives.

 

CITRON: That’s right. I mean, on its own, the market incentive is to collect as much personal data as you can, share it, sell it. Right? And so the market incentives are such that there’s little reason for companies to distinguish themselves on privacy, because the real upside, that is, the money, is in the sale of data and the purchase of data. And some collect it for their own purposes; some of these biggest companies, like Amazon, are probably not selling it. They’re amassing it, because they’re thinking someday they’re going into healthcare. Right? So you can’t think to yourself, okay, they’re all selling it. They may not all be selling it, but they’re amassing it, and they can use it against you, ultimately, in ways that aggrandize their power. So how do you get companies, what’s the moral suasion we can bring to bear, so that more of them are Wikipedia, more of them are akin to Mozilla, right, the Firefox approach of not selling data? And I think we just start talking, right? Because these companies will only act if they think we’re walking. It’s only when advertisers, and this is especially true of companies like Facebook, say, we’re not going to use your services, we’re going to move on, we don’t like your platform showing non-consensual pornography, or, you know, rape joke pages. In 2014 we saw Facebook move because advertisers pressured them. Right? So how do we do that? Look, it’s not like we don’t have a voice. The problem is scale. Right? Like, we’d all have to leave Instagram and Facebook to get them to do anything.

 

HEFFNER: Right.

 

CITRON: Right. And maybe, Alex, like, we are here, we are on your show, right? This is the moment in which we tell everyone: vote with your mouths and feet. Like, we’re not into this, Instagram. We don’t like you selling our data, exploiting it, manipulating us. Then we say that, right? But we’d have to walk, too. And given network effects, where am I going to go? You know what I’m saying? All my friends are on Facebook, Twitter, and Instagram. I can’t recreate those communities. I guess I have a few group texts, but there are only, like, 10 of us. (Laughs)

 

HEFFNER: Right.

 

CITRON: And so it’s very hard to convince people to leave. Does that make sense? Like, I can leave, sure. But the everyday person is going to say, but where? How am I going to see my college friends, my high school friends, the photos, right? Because these are networks that we cherish, right? It’s hard to walk away.

 

HEFFNER: Sure. And there was the hope that Facebook and Instagram were not the end game.

 

CITRON: Right.

 

HEFFNER: So you, you…

 

CITRON: I’m still hopeful, right? Look, TikTok is pretty… Like, remember, Myspace was, and then it wasn’t, you know what I mean? Like, Julia Angwin wrote a whole book about Myspace and it’s like…

 

HEFFNER: No, I hear you. But I don’t think it can be dumbed down anymore. Right? I mean, we’re at the dumbest point, with the least amount of context, the least amount of the things that you value. And one of the things you value is the law of this country. And we are where we are, possibly, because the nation has not practiced what it’s preached in terms of antitrust.

 

CITRON: Yes.

 

HEFFNER: And we haven’t really preached much antitrust anyway in the last half century. Right?

 

CITRON: Yeah.

 

HEFFNER: Bottom line is, I don’t think there’s a real dynamic shift here because we’ve reached that dumbest point of no return. Forgive me, but…

 

CITRON: No, I mean we’re kind of dumbing…

 

HEFFNER: Yeah. And so where we are possibly could change as a result of actual antitrust enforcement. But there’s no evidence, and I would ask you and your colleagues at the law school this question, there’s no evidence there ever will be the kind of antitrust movement that would make Facebook have to break up, or that whole network: Alphabet, Facebook, Twitter, others. There doesn’t seem to be any evidence of that…

 

CITRON: Yeah. I mean, the FTC is thinking hard, right? We have Lina Khan, who’s the chair of the FTC. We’ve got new commissioners, a really interesting, thoughtful group of commissioners. They’re sending signals that they’re thinking about privacy and antitrust as one story, which it is, right? What makes these companies powerful is that they’re amassing our personal information, buying it and selling it. But antitrust can only be part of that story; we also need strong privacy laws, because the entire problem is that the business model is totally permissible.

 

HEFFNER: Right.

 

CITRON: Right. Like, as you were saying, we go back to incentives. Their market incentives are all on the side of collect more and more and more, and sell more and more and more. And there’s no law (laughs), or very little; it’s very modest. Right.

 

HEFFNER: The absence, I mean, maybe my imagination is getting away from the reality, but I see those two things, Danielle, as connected.

 

CITRON: Yeah. Of course.

 

HEFFNER: The absence of privacy standards is a pretext, or a legitimate legal motivation, for antitrust to get busy…

 

CITRON: Oh, that’s right. Oh, yes. I mean, they should be levers of power that we can press on to help us. So where we’re inadequate on privacy, let’s work on antitrust. We need to constrain, ultimately constrain, the power of these companies.

 

HEFFNER: Yeah. But I would think, for those colleagues you mentioned, that that would be the grounds, and would’ve been the grounds, in the wake of the Cambridge Analytica scandal.

 

CITRON: Yes.

 

HEFFNER: To say, you know, we have antitrust grounds here because you’re not comporting with the law. Or even take the FEC, which has been largely absent from the whole conversation around anonymous spending in rubles and other currencies during the 2016 election, which was still never resolved.

 

CITRON: Right.

 

HEFFNER: The testimonies of these executives saying we don’t know who bought ads in 2016.

 

CITRON: They just paid in rubles (laughs), right. Whatever total absurdity that is.

 

HEFFNER: So, to close, I’ll ask you a question I asked Carissa Veliz, who’s also in this space.

 

CITRON: Oh, nice. Yes. Terrific.

 

HEFFNER: You know, I was asking her about a way for the citizen to do more than just boycott. Boycotting is one thing. But I thought, and I proposed to Carissa, there ought to be some sort of master class in doing this, and some kind of app you can download that can tell you how much privacy you’re, in effect, losing by using all the rest of the apps on your phone.

 

CITRON: Yeah, wouldn’t that be so great! And I think some of the problem, too, is that these apps will say, it’s our trade secret, you can’t know what we’re doing, even though, of course, advertisers, marketers, and data brokers have an invitation to buy. So it’s this odd posture: we can’t tell you, but we’ll tell everybody else. And wouldn’t it be great if we had, that is, privacy with profit, right? An app I could pay for that would give me some transparency. And I think what we would see is just the sucking of our data everywhere. There are some computer scientists who create visuals for us to show the thousands of parties, right, that your one app is selling to.
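The raw material for visuals like the ones she describes is easy to gather for a website, if not for a closed app. A minimal sketch, assuming Python and a HAR (HTTP Archive) file exported from a browser’s developer tools; the file name “session.har” and the first-party domain are placeholders, and a real tracker census would also match hosts against curated blocklists.

# Minimal sketch of the analysis behind those visuals: count the distinct
# third-party hosts a browsing session talks to, using an HTTP Archive
# (HAR) file exported from the browser's developer tools.
import json
from collections import Counter
from urllib.parse import urlparse

FIRST_PARTY = "example.com"  # placeholder: the site you intended to visit

with open("session.har", encoding="utf-8") as f:
    har = json.load(f)

hosts = Counter()
for entry in har["log"]["entries"]:
    host = urlparse(entry["request"]["url"]).hostname or ""
    # Skip the first party itself and its subdomains; count everything else.
    if host and host != FIRST_PARTY and not host.endswith("." + FIRST_PARTY):
        hosts[host] += 1  # a request that went somewhere you didn't choose

print(f"{len(hosts)} distinct third-party hosts contacted")
for host, n in hosts.most_common(10):
    print(f"{n:5d}  {host}")

Each third-party host is a candidate node in the kind of data-flow graph she mentions; the counts are the edge weights.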

 

HEFFNER: Right.

 

CITRON: And then, right, ad infinitum, selling to thousands more. And it’s really hard for us to grasp. I’m sure Carissa said this as well in her wonderful book: the idea that we can see it viscerally, we can’t. We don’t see data popping out of our hands, out of our fingers, and then streaming, right, into those reservoirs of our information.

 

HEFFNER: Right. Danielle, thank you for your time today.

 

CITRON: Thank you for having me. I so appreciate it.

 

HEFFNER: Please visit The Open Mind website at Thirteen.org/OpenMind to view this program online or to access over 1,500 other interviews. And do check us out on Twitter and Facebook @OpenMindTV for updates on future programming.