Dragana Kaurin

Technology to Combat Authoritarianism

Air Date: May 13, 2019

Localization Lab founder and Berkman Klein fellow Dragana Kaurin talks about communication tools to protect refugees, journalists, and human rights activists.


HEFFNER: I’m Alexander Heffner, your host on The Open Mind. My guest today is a fellow at the Berkman Klein Center at Harvard University and speaks French, Spanish, Arabic, Serbo-Croatian and yes, English. Dragana Kaurin is a human rights activist, researcher and founder of the Localization Lab. It’s an organization that builds bridges between developers, organizations, end users and communities in need, resulting in more accurate and timely translations and unlocking access to the Internet for users all over the world. Kaurin has helped make open source technology available to underrepresented communities in 220 languages, bringing equal access to information, better representation online, and growing our collective user base. Today she works specifically on civic tech for refugees to ensure that the design of technology can better help secure the rights of those seeking freedom. Thank you so much for being here today.

KAURIN: Thank you so much for having me.

HEFFNER: What was, Dragana, the origin of Localization Lab? How did you come to create it?

KAURIN: Well, I think you gave a really good overview. It’s a nonprofit that facilitates access to public interest technology for communities worldwide. It started out just working on translation, on languages, and it’s really developed into a movement where we offer user feedback and push for co-design. We tend to work on tools that are anti-censorship and anti-surveillance, tools for journalists that allow people to leak information to them, like GlobaLeaks and SecureDrop, as well as technical education tools.

HEFFNER: You’re working a lot now to try to make sure that refugees or potential refugees, those who are escaping dictatorship and authoritarian regimes, have their biometric data protected.

KAURIN: Building on my work at Localization Lab, I’ve done a lot of research into how refugees use technology and what opportunities and challenges it poses for them. Unlike 25 years ago, when Bosnians and Rwandans were fleeing, it’s a much different world for refugees now. For one, we get to watch; we have to witness the refugee crisis happening. We no longer have an excuse to say we didn’t know what was happening. Secondly, refugees have devices on them that give them immediate access to information and let them stay connected to each other and to the world. But these devices also act as very powerful surveillance tools for governments to know where refugees are going at all times. Coming into Europe, many countries have a policy, and UNHCR does as well, of taking biometric data, which means facial characteristics, iris scans, or fingerprints. Coming into Europe nowadays, there’s a policy that if you are an “irregular migrant,” quote unquote, your fingerprints are taken right away and sent to a database in Brussels. And the policy is that from six years old and up, you have to give your fingerprints.

HEFFNER: And that doesn’t matter if you’re from a high-risk country or not. That’s the standard policy.

KAURIN: It doesn’t matter. And what’s most concerning to me is that the legislation says it can be taken by force. So you can imagine: when we don’t have agency even over our own bodies, how can we have agency over our digital selves, over the information that’s out there about us? Not knowing who’s taking your hand and pressing it against a screen to take your fingerprints has a really deep effect on people; it causes distrust. And they’re often fleeing conflict. They’re afraid of people in military outfits pushing them to do it, you know?


KAURIN: And of course, to go back to language: there are so few resources available in Amharic, or in the languages spoken around West Africa, whose speakers are coming into Italy, or in South and Southeast Asia, whose speakers are coming through Turkey into Greece, right?

HEFFNER: So your tools are intended to provide linguistic support so that communication can be transparent. As is so often the case, if you are under arrest or facing legal scrutiny, if you can’t explain yourself, if you can’t convey your meaning so that someone can be on the same page, so to speak, can understand your plight, then that’s the first trap potential refugees fall into if they don’t have a resource, either a human being or a technological solution, that can help convey their experience. Right? So that’s kind of the front lines of the refugee crisis these days.

KAURIN: And you bring up a really important point, and that’s something out of this most recent research that I’ve done at Berkman. I’ve interviewed people who have arrived in the EU over the past four and a half years about their process: what the asylum process is like, what kind of information they give away, how that makes them feel, what they think is happening with this information. Right. And what I found, I think, is really alarming: people are giving away the most important parts of that information, which is surviving sexual violence, torture, witnessing war crimes and crimes against humanity, escaping genocide. These are the things that will make or break your asylum application, and they’re telling a complete stranger this very personal story without knowing what’s going to happen with the information. And secondly, they have to tell that story over and over again without an interpreter, oftentimes not knowing whether they even have a right to an interpreter.

HEFFNER: Do you think the technology can give folks more confidence to tell their stories?

KAURIN: I think information would; transparency about what’s happening with this information, and even about the potential risks. What might happen to it? It might get hacked. I’m not saying we should explain which cloud storage we’re using to store things, but the risks need to be very well explained to people who are so vulnerable, whose wellbeing and data we are responsible for. Right.

HEFFNER: Absolutely. Let me ask you this. Are there countries that are handling refugees or immigrants, those seeking asylum, in a way that respects their human and technological dignity and footprint? Are there examples of countries employing those solutions right now, in a way you can share with our viewers, as a model, or at least an experimental model, for how to address the problem better?

KAURIN: I can say Uganda is an excellent example of a country that, without keeping out certain ethnic groups, has largely been pretty accommodating to refugees, offering both space and access to NGOs and to UNHCR education, which is important. But as for issues specifically around digital rights and data protection, I really don’t know. We…

HEFFNER: I’m sorry, go ahead.

KAURIN: No, I was going to say we’ve seen this trend over the past few years: Australia pushing potential asylum seekers to Manus, the US now keeping everybody at the Mexican border through these really weird loopholes, and the EU keeping people in Turkey and Libya without having to fulfill its international responsibilities of processing their asylum claims. So it depends on how you want to address that question. We can go after the smaller issues, like digital rights and data protection, but when somebody so clearly makes the statement that they don’t want you there, it’s difficult.

HEFFNER: Has Uganda balanced, in the minds of its citizens, the priority of security with wanting to be a home, a welcoming country, to immigrants? It seems to me that that is often the conflict for the native population and among the politicians debating how we actually implement immigration solutions or policy. It is that balance of wanting to give your constituents security and simultaneously wanting to be compassionate human beings.

KAURIN: There are definitely tensions. There are definitely issues around resources, around being able to provide education, non-food items, water, and sanitation, both to local citizens and to large numbers of refugees coming from South Sudan, from DRC, and from other parts of the region.

HEFFNER: One thing I want to get your take on is the technology companies and how they’ve responded to the localization project, and what they’re doing that is either counterproductive or potentially productive down the line, because they represent both the technologies but also the capital…

KAURIN: Definitely

HEFFNER: to help.

KAURIN: So three things come to mind. Palantir, which I’m sure you’re familiar with, has recently partnered with the World Food Program. For everyone in the digital rights space, this is a huge source of worry, because these are extremely vulnerable groups. While the World Food Program makes the argument that this is going to help facilitate handing out resources and manage this data better, it’s not done in a transparent way, so we don’t know that people are being protected. When you are a refugee, you’re not going to have a chance to speak up for yourself, so it’s really important for us to do this kind of advocacy. Refugees and asylum seekers are generally trying to stay out of the spotlight, away from legal structures, from police. They’re usually afraid of being deported or locked up. They have fewer rights than locals do, right? Secondly, there are cash programs like the World Food Program’s. Funny enough, I interviewed one person from Syria who, in talking to me about his journey to Europe, explained; I asked, did you register as a refugee in Lebanon or elsewhere? He said, I did in Lebanon, where someone told me to, because it was important. But at one point online I saw that they were mapping out where we were using our cash cards. And later I found the same study, and Lebanon is a small country, so you can see exactly these dots and movements. And you don’t know whether this is personally identifying information for at-risk people. So of course this person I was interviewing said, I don’t have any trust in the UN.

So one of the big takeaways from this research is that all this innovation we think we’re doing might actually be causing harm, by causing distrust in communities toward these large structures that they rely on. And lastly, the burden of proof is always on refugees when it comes to using biometrics as their ID, because there’s no second factor of authentication, right? As an example, in Mauritania a few years ago the system malfunctioned, and something like 6,000 people who rely on these agencies for shelter, food, and non-food items had to wait until it started back up again to be able to access necessary resources. Irises change; there are always these issues with biometric data. But at the end of the day, for you as somebody who’s coming in to buy water, food, whatever, if you present your fingerprint and it doesn’t show up, the burden is on you to prove who you are without any paperwork.

HEFFNER: What about companies like Apple and Google and Twitter and Facebook; what kind of support do they offer so that, you know, the users of those services can either retain their social network from their home country, or at a minimum secure whatever vital information is there and translate it into the language of whatever country they’re hopefully being adopted into? What are those companies doing now, if anything, to help?

KAURIN: Well, let me answer it this way: the best thing they can do is be transparent about what they’re doing with these people’s information, so that they know what’s happening with it. Is it going to the US government? And the biggest fear for these people: is it going to the government of my home country? Is it being sold? I mean, people see what’s going on. They read about Cambridge Analytica, they read about data,

HEFFNER: Mar-a-Lago, the Chinese happenings at Mar-a-Lago. Yeah. So they’re aware, but they,

KAURIN: Yeah, so they start to wonder what is happening with this data,


KAURIN: Can they actually protect it? And when this information isn’t given to you straight away, you have to start putting the pieces together yourself, like the person I interviewed who was talking about his cash card.

HEFFNER: Presumably in some of these countries, those companies, the Twitters, the Googles, the Apples, can’t operate because they’re suppressed. In China, certainly, there are state-run or state-surveilled services where folks are communicating and merchandising. What has your experience been with the non-US social media conglomerates? Are these folks able to capture their experience, what they’ve shared, on services outside of those? Because there must be home-bred services in some of the countries from which folks are escaping.

KAURIN: You’re talking about Weibo or,

HEFFNER: Sure. Or, or any companies that frankly, myself and our viewers may not know exist in a place, whether that is Syria or an African country or elsewhere.

KAURIN: So the research I’ve done is on how people coming into Europe are using WhatsApp or Facebook, because one of our partners wanted to hire me to see whether there is a need to create an app for refugees to communicate securely. And my research came down to: people are already communicating securely. Let me tell you, you wouldn’t trust somebody online who you wouldn’t trust in person, right, and the other way around. So there are already Facebook groups and WhatsApp groups, as I’m quite sure there would be with Weibo and everything else; people know what they have in front of them, and they use it in exactly those ways. There are chat groups that say, “don’t trust so-and-so when you get to Izmir, because they took our money and didn’t take us across, or they pushed us on the boat.” These are also places where people post photos of their missing loved ones to say, I haven’t seen this person since then.

So people are doing online just what they would do if they met in person. And one interesting bit is how WhatsApp is being used. I say this because of my prior research: everyone up in Silicon Valley decides, I’m going to create an app that’s going to solve refugees. And I’m sure you’ve seen many of these articles, apps that don’t get used, that no one’s ever heard of, that, in my interviews, people respond to with “what?”, that really have no effect. But meanwhile, WhatsApp is being used by communities between Izmir and Lesbos, which is about a two-hour boat ride, with people going early in the morning to avoid detection by the coast guard. Using WhatsApp, they’ll send their location to a person in Lesbos to make sure they’re getting there safely. This one particular group that I interviewed had zero deaths this way, using something that you and I use every day for memes or gifs. They used it as a life-saving mechanism.

HEFFNER: What is your goal for innovation in 2019? If we look back at this year, what will we say Localization Lab accomplished in the refugee crisis? You’re already in so many languages; you provide the technological tools on the front end and back end. But what are you hoping to achieve this year that might actually result in more sustainable progress for refugee communities?

KAURIN: Just to be clear, we work with refugee communities, but we also work with journalists. Yeah. We’re in 220-something languages worldwide. We work with civil society, with human rights groups. So this is kind of a smaller part of


KAURIN: Everyone that we cater to. My biggest goal is to push through this message that technology cannot be the solution; that there has to be co-design: here are the issues, here are the problems, here are the locally driven solutions, and these are the parts where we can use technology to expedite our communication or other work.

HEFFNER: And what about what you’re hoping to achieve for the journalism community that is under greater assault now amidst the rise of authoritarian regimes and the pressure sometimes to censor,

KAURIN: Definitely

HEFFNER: To harass, to intimidate the forces that would assert the truth.

KAURIN: That’s a difficult one. I mean, considering what has been happening in this past year, it’s been one of the most difficult years for journalists,

HEFFNER: You still want to trade seats?

KAURIN: Well, considering what happened with Khashoggi, it scared people. I think people in our community no longer simply want to say, well, we can just provide a secure communication tool. In this case, you know, WhatsApp was used in certain ways. We need to listen intently, to listen very well to the needs of journalists, especially ones in at-risk communities who are living in countries where they’re targeted directly by governments and non-state actors.

HEFFNER: Authoritarianism can be incrementally built, but it can also happen, like you said,

KAURIN: Overnight. It’s not the authoritarian person that one should be afraid of. It’s you and me. It’s us, who think we would be the sane voice in that situation, that we would do the right thing, that we would say no, that we would help. And we’ve seen this throughout history. I think you also enjoy historical memory; I personally have a weird thing for historical memory museums. But you see this over and over, where people tell a story of what happened, and then there were the good people, and that’s just not how it happens. There is that voice, insert blank here, of people pushing xenophobic, straight-up fascist messages, supremacist messages, Islamophobic, anti-Semitic messages. And that’s just one person. It’s the rest of the population that flips so quickly.

HEFFNER: How do you recount your own experience, when you said to me that genocide can happen in an instant?

KAURIN: That’s exactly how it happened. Normal, normal, normal, and all hell breaks loose. That’s the case for so many people, and this is why they’re left without an ID, right? All of a sudden you’re outside of your own country, you’re a refugee, and you weren’t expecting it; of course, you didn’t plan for any of this. Going back to the issues around Localization Lab and why this organization matters so much to me, another part of this is language preservation. With genocide you have, by definition, the erasing, in part or in whole, of an entire people

HEFFNER: People who want to kill off an entire culture or language.

KAURIN: people by race, ethnicity, nationality, and a fourth one; it’s been a long time since human rights class, sorry. But there’s this fear that’s always with you: that not only will the people be erased, but any kind of evidence that we were here is going to be erased. And that’s language, that’s art, that’s buildings, that’s architecture. And for a lot of people, not just the refugees that we work with but the indigenous groups that we work with, they find it so important to be represented, to have their own culture live on in a certain way.

HEFFNER: I don’t know if you saw the Super Bowl ad, I think it was my favorite from this past year, in years that have been declining in Super Bowl ad quality, but it was a Google commercial.

KAURIN: Pepsi commercial?

HEFFNER: Well, I think there was a good Pepsi one this past year too, but the top one in my estimation was a Google ad. It prefaced itself with, you know, of course there’s horror in the world, but then came the inspiring notion that one of the most searched translated phrases is “how are you doing?”, or just “hello,” a greeting. It was really an inspiring commercial that emphasized what you do: the idea that people want to be able to convey an introduction, hopefully a courteous introduction, and the question, “how are you today?” The ad was basically, in motion, all of these people asking that question. That to me is the essence of what you’re doing. How can Google and those companies listen better to people who are actually in dire need of that communication?

KAURIN: Work with us, work with us. Give us digital and physical spaces for co-design, because mistakes are made, and this is proprietary. If we’re talking about the public interest, we should be better. We’re supposed to be doing better; we’re supposed to be coming from a do-no-harm principle, right? Co-design and participatory design offer some economic opportunities. They offer a way for usability and security issues to be addressed. They offer dignity. They offer opportunities for people. So stop creating solutions for us; work with our organization and our partners so we can create solutions together.

HEFFNER: Thank you, Dragana,

KAURIN: Thank you so much.

HEFFNER: And thanks to you in the audience. I hope you join us again next time for a thoughtful excursion into the world of ideas. Until then, keep an open mind. Please visit The Open Mind website at Thirteen.org/OpenMind to view this program online or to access over 1,500 other interviews, and do check us out on Twitter and Facebook @OpenMindTV for updates on future programming.