Carissa Véliz

The Inhumanity of Data Theft

Air Date: February 21, 2022

Digital ethicist Carissa Véliz discusses corporate invasion of personal data and the continued U.S. failure to regulate Big Tech.


HEFFNER: I’m Alexander Heffner, your host on The Open Mind. I’m delighted to welcome our guest today, Carissa Véliz. She’s the author of “Privacy is Power: Why and How You Should Take Back Control of Your Data.” She’s a digital ethicist and a professor of philosophy at the University of Oxford. Welcome, Carissa, a pleasure to see you.


VELIZ: Thank you so much for having me, Alexander.


HEFFNER: My pleasure. Let me ask you, at this stage in the pandemic, and you are in the UK right now, so you can assess it from both a British and European lens as well as a more global perspective: looking at where we stand now in this pandemic, and thinking about all that we could be achieving, or still need to achieve, in this surveillance-capitalist state to take back ownership of our data, how would you assess how much perspective we’ve gained during this pandemic and how much we’ve actually achieved, if anything at all?


VELIZ: Well, there’s a lot of work to do. And I think the pandemic has helped in making us aware of certain things, for example, that our participation with tech is not as voluntary as it seemed. Today it is very, very clear that to be an active participant in society, you just have to accept interacting with certain kinds of platforms. Also, I think that we have made some improvements in regulation, not so much in passing new laws, although there has been some of that this year, but in enforcing laws that were already there. So the GDPR, the European regulation, even though it’s European, has had effects worldwide, because many times it’s just easier for companies to have one rule for everyone than different rules for different countries. And we’ve seen a sevenfold increase in fines, some of them quite hefty. So we’re finally seeing that it’s beginning to bite, and that’s good. However, there’s so much work to do, because, as you know, every week we see a new privacy scandal, and the pandemic has only made us lose more privacy, because we interact more with digital tech and because we are being asked to give our data in more and more circumstances.


HEFFNER: Right. So I hear you when you say there is more of a cognizance about ownership of that relationship, right? That when you buy a product on a certain website. You know, maybe it’s just the fact that as we’ve been more sedentary and isolated, we’ve had to think about all of our commercial and other relationships online, whether it’s a social posting platform, email, or other commercial uses. But my question to you is: what is the best way right now for the average person, and I don’t say average disparagingly, I mean the real person, the layperson, to consider all of the apps they use on their iPhone or their Android, or all of the websites they visit, whether for socializing or commerce, and to have a sense of what they’re giving away in those relationships? I know my friend Ethan Zuckerman, formerly at MIT and now at UMass, has worked on something like this. But there have to be some universal ways for people, in whatever country they are, to understand what they are giving up as a result of some of those relationships. What’s the best way to do that right now?


VELIZ: It’s very hard. And that’s partly why I wrote the book, because the whole ecosystem of data is so obscure, so opaque. It doesn’t feel like anything to have your data collected. It doesn’t hurt. And it’s not obvious what kind of data that is, how it’s used, and what kind of inferences are made. So part of what I intend to do with the book is to make that apparent. The first chapter goes through the life of just an ordinary person to exemplify just how much data we’re losing and how it’s being used. But something that has helped is that if you have an iPhone, you can now see a privacy label. So if you go to the App Store and look at an app, you can see a list of the kinds of data being collected.


And sometimes it’s not easy to understand exactly what that means and what data there is. But it’s helpful when you compare two apps. So for instance, just as a test, compare the app Signal, which is for messaging, versus the app WhatsApp, also for messaging. Even if you don’t get into the details, just the length of the list will tell you a lot about the difference between those two apps. But part of the challenge, with regulation and with companies innovating, is to make it much more transparent to users how their data is being used. At the moment, if you read a privacy policy, it’s long, it’s complicated, and it’s very vague. In the best-case scenario, they will tell you that they will share your data with third parties to improve their services, and so on. And that means nothing to us. It doesn’t tell us what kind of data, whether it’s going to be sold on, or in whose hands it’s going to end up. So part of the challenge of data is making it less opaque.
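
Her list-length test is easy to make concrete. Below is a minimal sketch that compares two apps by the number of data categories on their privacy labels; the category lists are made-up stand-ins, not the apps’ actual current labels.

```python
# Illustrative sketch of comparing App Store privacy labels by the length
# and contents of their data-collection lists. These category lists are
# made-up stand-ins, not the apps' actual current labels.

labels = {
    "Signal":   [],  # famously collects little to nothing linked to you
    "WhatsApp": ["purchases", "location", "contacts", "identifiers",
                 "usage data", "diagnostics"],
}

for app, categories in labels.items():
    print(f"{app}: {len(categories)} data categories collected")

# Categories one app collects that the other doesn't:
extra = set(labels["WhatsApp"]) - set(labels["Signal"])
print("WhatsApp collects but Signal doesn't:", sorted(extra))
```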

HEFFNER: And those lists continue to be long, the kind of advisories that you don’t get to unless you click on something in very small font. I mean, we know the way the world works, and it continues to work in that fashion. But I do wonder, and I’m thinking about our conversation with Marta Tellado at Consumer Reports as well, about a kind of one-stop shop where at a minimum you could get that safely. You could have an outfit like a Mozilla or a Consumer Reports, a nonprofit, basically look at the apps that you’re using, right? Most of us have iPhones or Androids or iPads, devices of that nature. The same can be done on your computer with the websites you visit, and it would basically give you a report, much like a credit score based on your credit history. Does anything like that safely exist, where it just takes another app to essentially tell you: oh, don’t use WhatsApp or Facebook Messenger, use Signal? Or it will look at what you’re using and tell you what the highest risks are in terms of violations of your privacy?


VELIZ: Not really. There are some startups that are trying to do that. One that is very exciting is a project by Sir Tim Berners-Lee, the creator of the World Wide Web. He’s trying to create something that he calls pods, which would be something like a USB stick, except it needn’t be physical; it could just live on your computer. In it you have all your data stored, and then you can give permission to whomever you want to share that data with. And you can also withdraw permission, which is very important. And you can see who has your data and what they are using it for. And I think part of the solution is there. As a consumer today, you would realize, for instance, that data brokers are tracking whether you sleep well or not, how long you sleep, whether you’re pregnant or not, whether you have loans, what kind of diseases you have, what you search for online, what you buy, your sexual orientation, your political tendencies, your religious affiliation, all kinds of things that I think would spook people. But part of it has to go beyond that. For instance, today I don’t have to, say, go into an airplane and check the engine myself to feel safe, because I’m not an engineer and I wouldn’t be able to do it. I can trust that certain standards are being met. In the same way, I think there are certain things happening with data that are just so dangerous for society that they shouldn’t be allowed. And we shouldn’t place that burden on individuals to go check for themselves whether it’s okay, because regular people have day jobs, and we can’t afford that kind of time or that kind of expertise. So I think it’s going to be a combination of setting high standards, because there are certain things that just shouldn’t be done, and then giving people that extra layer of transparency and control.
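
For readers who want the pod idea in more concrete terms, here is a minimal sketch of the permission model Véliz describes: data stored in one place you control, with access granted, revoked, and audited per recipient. It illustrates the concept only; it is not the API of Berners-Lee’s actual project (Solid), and all names are hypothetical.

```python
# Conceptual sketch of a "pod": your data stays in one store you control,
# and access is granted or revoked per recipient. NOT the real Solid API;
# all names here are hypothetical.

class DataPod:
    def __init__(self):
        self._data = {}        # e.g. {"sleep_log": [...], "purchases": [...]}
        self._grants = {}      # recipient -> set of data keys they may read
        self._audit_log = []   # record of every access, so you can see who used what

    def store(self, key, value):
        self._data[key] = value

    def grant(self, recipient, key):
        self._grants.setdefault(recipient, set()).add(key)

    def revoke(self, recipient, key):
        # Withdrawing permission is the part Véliz calls very important.
        self._grants.get(recipient, set()).discard(key)

    def read(self, recipient, key):
        if key not in self._grants.get(recipient, set()):
            raise PermissionError(f"{recipient} has no access to {key!r}")
        self._audit_log.append((recipient, key))
        return self._data[key]

pod = DataPod()
pod.store("sleep_log", ["7h", "6.5h"])
pod.grant("health_app", "sleep_log")
print(pod.read("health_app", "sleep_log"))  # allowed, and logged
pod.revoke("health_app", "sleep_log")
# pod.read("health_app", "sleep_log") would now raise PermissionError
```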


HEFFNER: Would you say, Carissa, that Mozilla is still the gold standard? We’ve learned about privacy protection from the browser perspective by knowing for some time that Mozilla was the one browser maker, which then became the most popular one, to say: we are not going to sell your data, the cookies and other information from the websites that you visit. Now we know that apps are not making the same amount of money on the App Store, because Apple seemingly has taken a page from Mozilla in the way it operates, in some of its decisions. Of all of the corporate behemoths, Apple is ostensibly the most committed to privacy. But I’m wondering if that conception is at all outdated: essentially, Mozilla and Firefox being that model, and Apple being the single company that’s decided it can work within a business model that is not going to steal your data, sell your data, or invade your privacy.


VELIZ: Well, the truth is always a bit more complicated than those broad brushstrokes. I think that what you said is roughly true, and I think it’s very important that Apple’s business model doesn’t depend on personal data and that they don’t sell personal data. That doesn’t make them a saint, and it doesn’t mean that I’m not worried about certain kinds of data collection or certain kinds of practices. I think Mozilla is a good institution in general that cares about privacy, but there’s no one perfect browser. What I actually recommend is to have different browsers for different things. So it’s really important that you can have, for instance, one browser in which you actually log in to things that identify you, and then a different browser, one that doesn’t identify you in any way, to actually do your searches and so on.


So there are a lot of options out there. One very good option is DuckDuckGo for your mobile; I think they’re getting ready these days to release a version for your laptop. Vivaldi is another option. Opera is another option. Brave is another option. So what I recommend is two things. One is to have different browsers for different things. That will give you an extra layer of security. The other is to install extensions. Firefox is at its best when it works with extensions. DuckDuckGo has extensions, but the Electronic Frontier Foundation also has a few really good extensions, so that your browsing can be encrypted, so that you can block ads that are targeting you, et cetera.


HEFFNER: Carissa, let me ask you this: what is it now that the EU requires from those companies? Apple, but more concerning have been Alphabet and Meta (formerly Facebook), and Twitter too, though not to the extent of Alphabet or Meta. We know, and I think it’s worth repeating as much as possible, that it just means something different to be a European, in effect an EU citizen, a citizen of an EU member state, or for that matter a citizen of other continents and countries, Australia for example. Can you give us and our viewers a global sense, as of right now, of what those companies are required to do by law to continue to operate in places like Europe and Australia that they’re not required to do in the U.S.?


VELIZ: Companies are required, first of all, to know what data they have on you, which is, you know, pretty basic, but you’d be surprised, when it first got implemented, how many companies didn’t even know where to look and didn’t even know what data they had. They’re required to hold personal data very securely. People have a right to know what data is being kept about them. They have a right to correct that data in cases in which it might be incorrect. They have a right to apply for the right to be forgotten, which means that if you have data out there, for instance, when you search for your name on Google and the first thing that comes up is something very negative that is either untrue, or not relevant anymore, or outdated, you can apply for it to be taken off the index.


People have a right to ask for their data to be deleted. They have a right to ask for their data not to be passed on and sold to third parties, et cetera. And the tricky part is that it’s very hard to enforce this law, because data is so opaque, like we’ve talked about, and because there are so many companies dealing with data. So we’re seeing a gradual increase in enforcement that is very interesting. Just in the last few days, for instance, Austria had a ruling in which it decided that a website using Google Analytics was actually in breach of the GDPR, because it sent personal data to the United States, and United States intelligence agencies like the NSA sift through that data. And this could potentially be huge, because even though the ruling is only about this one website, there are of course so many websites using Google Analytics that there’s a lot of talk right now as to how that’s going to change and what companies are going to do about it.
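
To make the rights Véliz lists concrete, here is a minimal sketch of what servicing GDPR data-subject requests could look like inside a company’s systems: access, rectification, erasure, and an objection to onward sharing. The data store and request names are hypothetical; real compliance also involves identity verification, statutory deadlines, and legal review.

```python
# A minimal sketch of servicing GDPR data-subject requests. Store layout,
# field names, and request names are hypothetical, not a real compliance API.

user_store = {
    "alice@example.com": {
        "name": "Alice",
        "purchases": ["book"],
        "shared_with": ["ad_broker_x"],   # third parties the data went to
    },
}

def handle_request(email, request, corrections=None):
    record = user_store.get(email)
    if record is None:
        return "no data held"                 # companies must be able to say this
    if request == "access":                   # right to know what data is kept
        return dict(record)
    if request == "rectify":                  # right to correct incorrect data
        record.update(corrections or {})
        return "corrected"
    if request == "erase":                    # right to be forgotten / deletion
        del user_store[email]
        return "deleted"
    if request == "stop_sharing":             # right to object to selling on
        record["shared_with"].clear()
        return "sharing stopped"
    raise ValueError(f"unknown request: {request!r}")

print(handle_request("alice@example.com", "access"))
print(handle_request("alice@example.com", "rectify", {"name": "Alice B."}))
print(handle_request("alice@example.com", "stop_sharing"))
```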


HEFFNER: One company I didn’t mention is Amazon. Obviously Amazon operates for retail commercial purposes most actively in the U.S. and U.S. territories, but Amazon Web Services is an international behemoth. So Amazon too, like Alphabet and Meta, has to comply with what are seemingly abstract ideas. But in principle there are still ways for these companies to be held accountable, and lawsuits do emerge. And I just want to repeat again, since we air here in these 50 states, that there is a difference in accountability between the EU, and Britain where you are, and here in the States. How much of your work and your book did you want to resonate specifically in the U.S., where these companies are in effect invulnerable?


VELIZ: Yeah. I wrote the book partly thinking about the U.S., partly because these companies are mostly based in the U.S., but also because I think Europeans and people around the world, democracies around the world, are really counting on the U.S. to be an ally here. I think that we are at a crossroads similar to the one after the Second World War, in which either we unite and come up with minimum standards for cybersecurity, the regulation of AI, and privacy, or we are really going to face a very hard time with rivals like China and Russia, who are very good at hacking, who are very interested in our personal data, and who don’t have much respect for democracy. So in my afterword in the new edition of the book, the paperback edition that is coming out this month, the 25th of January, I talk about the importance of a federal privacy law in the United States. And I think there’s a lot of talk about it. I think it’s coming. I hope so. And the important thing is that it be strong enough to actually make a difference and protect citizens. Right now, for instance, there have been a few bills. One of them, which I have supported, is about banning surveillance ads online. That would be a fantastic first step to protect citizens, and ultimately to protect democracy.


HEFFNER: You say you want the U.S. to be an ally, and it’s so hard to hear this, because this has been the chorus for the last many years. I mean, really, I’ve felt this way since we’ve covered this issue on the program, and since disinformation on social platforms really exploded and, long before our 2016 presidential election, started to erode the fabric of civil society and informed communities. So I just hate to say it, and obviously you’re extremely perceptive here about what the need is, but I find it naive to believe that the U.S. is going to be an ally (laugh); it’s rather lofty, if not unrealistic. Forgive me for saying it, but I just can’t see that happening.


VELIZ: No, that’s fair enough. There are a few reasons for optimism, though it’s far from clear that it’s going to go that way. Like I said, we’re at a crossroads, and a crossroads means that you can go either way. Right?

HEFFNER: Right.


VELIZ: But one reason to be optimistic is that we have learned that having so much personal data sloshing around is a huge national security risk, and every country, the U.S. among the main ones, cares about national security. There are a few examples. A couple of years ago, the New York Times published an article in which two journalists who described themselves as not very tech-savvy managed to find the location of the President of the United States. And they managed to identify important lawyers, public officials, and people in the military through data acquired from a data broker, data that anyone can get. And that’s a red flag. That’s a huge red flag, because if the President of the United States can be located by just anyone, that means they’re not safe. And if the president is not safe, the country is not safe. So for instance, China just passed a very strict privacy law, and people have speculated: why would they do that? I think one of the reasons is that they’re worried all that personal data is going to be hacked by the West, just like they’re trying to hack our personal data. So that’s one reason why somebody who might not have very much confidence would still have a reason to be optimistic. Another reason is that, because states have started to pass laws on privacy, it’s becoming a bit of a mess, and it’s very hard for companies to comply with different states. Every time a state passes a privacy law, essentially it makes the standards go up for everyone, or at least it creates pressure in that direction. So the California law, in that sense, is a very positive step in the right direction.
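
The New York Times investigation Véliz cites turned on a simple inference: in supposedly anonymous location data, the place a device spends its nights is almost always a home, and a home address is one lookup away from a name. Here is a minimal sketch of that inference, with made-up pings and a hypothetical device ID.

```python
# Minimal sketch of why "anonymous" location pings identify people: the
# device's most frequent night-time location is almost always a home.
# The pings and device ID below are made up for illustration.

from collections import Counter
from datetime import datetime

pings = [  # (device_id, ISO timestamp, rounded lat/lon)
    ("device_42", "2022-01-10T02:15", (38.8977, -77.0365)),
    ("device_42", "2022-01-11T03:40", (38.8977, -77.0365)),
    ("device_42", "2022-01-11T14:05", (38.8899, -77.0091)),
    ("device_42", "2022-01-12T01:55", (38.8977, -77.0365)),
]

def infer_home(pings, device_id):
    # Count where the device sleeps: pings between midnight and 6 a.m.
    night_spots = Counter(
        coords
        for dev, ts, coords in pings
        if dev == device_id and datetime.fromisoformat(ts).hour < 6
    )
    return night_spots.most_common(1)[0][0]

print(infer_home(pings, "device_42"))  # one address lookup away from a name
```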


HEFFNER: Do you think, if the companies were being honest, they would realize, and maybe they already do realize this, Carissa, that they can operate with a lucrative business model even if they had to emulate the rules and regulations of the EU, Australia, and elsewhere? In other words, that their profit margins aren’t really going to change? And the fact that they realize this, or if in fact they do realize this, I should say, is possibly because they on multiple occasions have been asked to accept regulation, and the U.S. government simply cannot politically navigate a piece of legislation that’s going to work, or that’s going to have enough support from both political parties. I think it’s almost clear from some of their words that they know they can still be lucrative even if they are forced to operate in the U.S. as they do in the EU or elsewhere.


VELIZ: Well, yeah, what you just mentioned reminds me of a third reason to be optimistic, and this is that it’s a bipartisan issue, which is obviously very important for the possibility of a law passing. I don’t know if it’s a question of dishonesty. I often find people in tech who are just so entrenched in their current view that they cannot see a different possibility. And of course, because they have a financial interest in it, they don’t have enough motivation to make an effort to try to see the problem and to see possible alternatives. So many times I don’t perceive them to be dishonest. It’s something similar to that, but not quite.


HEFFNER: Yeah, no, I think that’s right. I was correcting my own language and saying I think they actually do realize, and have even been somewhat direct about the fact, that being regulated in the U.S. is actually not going to drive them out of business, at least if they were regulated in the way that they are where you are.


VELIZ: Yeah. So I think it also depends on the company. A company like Facebook, their whole business model is based on the exploitation of personal data. They don’t have anything else. So for them, it’s either change the business model or possibly go under. And that’s why we see them lobbying so hard against something like strong privacy regulations. So it partly depends on the company.


HEFFNER: But what do you want people to do with their newfound appreciation of privacy that’s going to make a difference? You can tell us in very tangible ways, like you did with the browsers, but you can also tell us from a more philosophical or psychological perspective.


VELIZ: My main message is in the title: “Privacy is Power.” It’s not about having something to hide. It’s not about having done something wrong. It’s not about hiding criminals. It’s about power. The more people know about you, the more you are vulnerable to them, and the more they can interfere with your life and manipulate your behavior. So we each have individual reasons to care about privacy. If you don’t like to be extorted, if you don’t like to be discriminated against, if you don’t like to be exposed, if you don’t like to have something like identity theft committed against you, with somebody using your name to essentially commit fraud, all of those give you reason to protect your privacy. But furthermore, privacy is as collective as it is personal. Privacy is a fundamental pillar of democracy. Without privacy, we don’t have journalism; there’s no such thing, because journalists cannot protect either their own lives or their sources. So what I want people to understand is that privacy is a matter of political power, and all those other framings, well, I’m not very shy, or I have nothing to hide, or I’m not a criminal, those are distractions. They’re distractions from the big picture of democracy. One of the misleading ways to think about privacy is that it’s just a personal preference, that it is very individual. No: actually, every time you expose yourself, you expose others. If you share your genetic data, you’re sharing the genetic data not only of your kids and your siblings and your parents and your cousins, but of very distant kin whom you’ve never met, but who could get deported or be denied life insurance, as an example.


And when you share your location data, you’re sharing data about your neighbors, and so on. And when we have so much personal data sloshing around, it’s only a matter of time until somebody abuses it. It’s something like saying, well, I don’t want to lock my front door, because I just want to allow people to come into my house and leave me gifts. Okay, somebody might do that; that’s not outrageous to think. But it’s more realistic to think that sooner or later, if you don’t close the door to your home, somebody’s going to abuse that. In the same way, if you keep on giving your private data to others, data that is like the key that opens your vulnerability, that shows others where you hurt the most, somebody might use it to help you. That’s why we tell our friends what’s going on in our lives, and so on. But if you give it to just anyone, sooner or later it’s going to be abused, and it’s going to be abused in ways that will hurt you personally, and you might never know about it. So that loan you got rejected for, that job you didn’t get, that apartment you didn’t get: it might have been for something related to your personal data, data that might even be inaccurate, but you have no access to it and no way of changing that information. But also collectively, if you care about your friends, if you care about your family, if you care about other citizens, then you should protect your privacy. And just finally, some people might think that what I propose is extreme.


And I propose that we should not sell or buy personal data, because even in the most capitalist societies, we agree that there are certain things that should be outside of the market, like votes. We don’t sell votes, because that would be a complete distortion of democracy. And for the same reason, we shouldn’t sell personal data, because it ends up being used in exactly the same way. You might think that sounds radical, but when you think about it, what’s really radical is to have a business model that depends on the systematic and mass violation of rights. That’s what’s crazy. We shouldn’t get used to it, and it’s not necessary. We can live without it. There are alternatives. It would be better for the economy, it would be better for international relations, and it would definitely be better for democracy.


HEFFNER: Carissa Véliz, author of “Privacy is Power.” Thank you so much for your insight today. Appreciate your time very much.


VELIZ: Thank you so much, Alexander. It’s been a pleasure.


HEFFNER: Please visit The Open Mind website at Thirteen.org/OpenMind to view this program online or to access over 1,500 other interviews. And do check us out on Twitter and Facebook @OpenMindTV for updates on future programming.