Silicon Valley's Spy Game

THE OPEN MIND
Host: Richard D. Heffner
Guest: Jeffrey Rosen
Title: “Silicon Valley Spy Games”
VTR: 5/10/02

I’m Richard Heffner, your host on The Open Mind. And I must say that I’ve particularly enjoyed having today’s guest at this table, as he has been several times before, always talking intensely and, I fear, rather grimly about the lot of privacy in America: to be more accurate, about its strength in our past, its threatened and precarious position today, and, if I read him right, its increasingly likely destruction in our future.

Indeed, we began our relationship discussing Jeffrey Rosen’s brilliant book, The Unwanted Gaze: The Destruction of Privacy in America, here on The Open Mind. Mr. Rosen is Legal Affairs Editor of The New Republic magazine and Associate Professor at the George Washington University Law School in the nation’s capital.

And today I want to talk about my guest’s compelling recent article in The New York Times Sunday Magazine, in which he warns us about what he calls “Spy Games Being Played in America’s Silicon Valley”, games that derive, as so much else now does and seems destined even more to do in the future, from our natural concern for national security.

Professor Rosen notes, not too kindly, in his article that the “gonzo entrepreneurs of Silicon Valley like to think of themselves as anti-government libertarians.” Indeed, he writes, “When the e-business technologies of tracking, classifying, profiling and monitoring were used to identify the preferences of American consumers and to mirror back to each of us a market-segmented vision of ourselves, Silicon Valley could argue that it was serving the cause of freedom and individual choice. But when the same software applications are used by the government to track, classify, profile and monitor American citizens,” Professor Rosen insists, “they become not technologies of liberty, but technologies of State surveillance and discrimination.” And he warns us, “They threaten the ability of Americans to define their identity in the future free from government predictions based upon their behavior in the past.”

Indeed, I want to begin today’s program by asking Jeffrey Rosen to develop his rather ominous point about the games Silicon Valley boys and girls play: the information about each and every one of us they are able, and promise, to gather, and the grids they would develop for national security purposes. His point that they threaten the ability of Americans to define their identity in the future, free from government predictions based upon their behavior in the past, is particularly frightening. You want to elaborate upon that for us?

ROSEN: I’m delighted to, because much of this article, you know, arose from our last conversation together. We were talking about Britain’s experience with surveillance cameras, which I’d studied and which we’d talked about quite seriously. And that was a strong example of a technology that is one of classification and exclusion and threatens our ability to change our identity in the future. The surveillance cameras, remember, can both track our movements through public spaces and, linked to databases of our faces, identify us based on our past misdeeds. So every time you walked into a store in Britain, if you’d committed some low-level form of shoplifting in the past, you might set off an alarm and be unable to enter. And that seemed to me, as we discussed it, a pretty tangible example of a technology that brought not a tremendous amount of security, because, as we talked about, these technologies are used mostly to focus on shoplifters and low-level thugs, not on terrorists, but does potentially threaten liberty in a profound way. It’s not an American notion that you should be tracked constantly and held accountable by some database or collection of your past trivial misdeeds.

HEFFNER: Wait a minute, Jeffrey, you say “it’s not an American notion” … wouldn’t it be better to say, “it had not been, or has not been in the past?”

ROSEN: Very much so, indeed. It has not been; it has been a presupposition of the myth of America, the idea of America, the ideal of America, that you can redefine yourself constantly and set your own terms of identity. And I had hoped that it was because of Britain’s experience with hierarchy, their comfort with the idea of a class system, that they would be quicker to embrace those kinds of technologies. And when we talked, you challenged me to go out and learn some more about these technologies and try to find some examples that might more effectively balance liberty and security. So I want to tell you about what I found. One of the distressing things I discovered is that that model, the British model of applying technologies that have been used in the commercial sphere for authentication, identification and, really, discrimination among customers or consumers, is now being applied on a quite widespread basis at the national security level. Technologies that had been developed for business intelligence are now being applied to national intelligence. And what do I mean when I say that, like the cameras, they threaten our ability to define ourselves?

One example is a profiling scheme that’s being tried out in airports by the Federal Aviation Administration. It’s developed by two companies called Accenture and HNC. And they propose to gather a tremendous amount of private information about us. Not only stuff that’s currently available, like our addresses and Social Security numbers, but records of the phone calls we might have made to Afghanistan, or the credit card purchases we’ve made that would indicate that we took flying lessons. And based on this they come up with a pretty complicated algorithm that matches all this personal information to the profile of the suspected terrorists of September 11th. And if you resemble these terrorists in a strong way, then you’ll set off a red alarm; if you just look a little bit like them, a yellow one; and if not, you can go through happily.

Why is this distressing? It’s distressing first because I’ve been persuaded by people who understand these things that it’s not likely to work. Unlike credit card fraud, which is the wrong these technologies were originally designed for, terrorism occurs very seldom. Credit card fraud you can develop a good profile of. It turns out that people who steal credit cards are very likely to go to a self-service station and buy gas and then go to a mall and buy clothes. So if you buy gas and then clothes with a credit card, you’ll get a call from the security company, because you might have stolen the thing. But there are tens of millions of examples of fraud. So the neural network technologies, as they’re called, which can teach themselves to identify unusual patterns, can be pretty accurate in finding these credit card thieves.
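
[Editor’s note: A minimal sketch of the gas-then-clothes fraud pattern Rosen describes. The transaction format, category labels, and two-hour window are illustrative assumptions; the real systems he mentions are neural networks trained on millions of fraud examples, not a single hand-written rule.]

```python
# A sketch of the gas-then-clothes fraud heuristic described above.
# The transaction format, category labels, and two-hour window are
# hypothetical; real issuers use trained models, not one rule.
from datetime import datetime, timedelta

SUSPICIOUS_WINDOW = timedelta(hours=2)  # assumed window, for illustration

def looks_like_stolen_card(transactions):
    """transactions: time-ordered list of (timestamp, category) pairs.
    Flags the pattern Rosen describes: a gas purchase followed
    shortly by a clothing purchase."""
    last_gas_purchase = None
    for when, category in transactions:
        if category == "gas":
            last_gas_purchase = when
        elif category == "clothing" and last_gas_purchase is not None:
            if when - last_gas_purchase <= SUSPICIOUS_WINDOW:
                return True
    return False

history = [(datetime(2002, 5, 10, 14, 5), "gas"),
           (datetime(2002, 5, 10, 14, 40), "clothing")]
print(looks_like_stolen_card(history))  # True -> the issuer calls the cardholder
```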

By contrast, for the terrorists we have nineteen people out of 300 million. You’re looking for a needle in a haystack, but the shape and size of the needle keep changing. You’re likely to get a lot of what are called “false positives”: retired businessmen in Florida who took flight lessons and made a call to a relative in the Middle East. So it’s not likely to be effective. And statistically I’ve been persuaded that that’s a fair criticism of this application.
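
[Editor’s note: Rosen’s needle-in-a-haystack point is the statistical base-rate problem. The arithmetic below uses his figures of nineteen terrorists in a population of 300 million; the one-percent error rate is an assumed, deliberately generous accuracy, chosen purely for illustration.]

```python
# Base-rate arithmetic behind the needle-in-a-haystack objection.
# Population and terrorist counts are Rosen's figures; the one-percent
# error rate is an assumed, deliberately generous accuracy.
population = 300_000_000
terrorists = 19

true_positive_rate = 0.99   # assumed: the profile catches 99% of real terrorists
false_positive_rate = 0.01  # assumed: it wrongly flags 1% of innocent travelers

flagged_innocents = (population - terrorists) * false_positive_rate
flagged_terrorists = terrorists * true_positive_rate

print(f"innocents flagged: {flagged_innocents:,.0f}")   # about 3,000,000
print(f"terrorists flagged: {flagged_terrorists:.0f}")  # about 19
# Only about one flag in 160,000 points at an actual terrorist, which is
# why the retired businessman with flight lessons keeps getting stopped.
```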

HEFFNER: But you’re going to have to persuade others that the statistical argument you offer is compelling. Why is it compelling? I mean, are you saying, “well, it isn’t particularly effective, okay … we’re willing to pay the price of an ineffective device because we’re so concerned about security”? That’s not a good enough argument, is it?

ROSEN: Your challenge, which I took seriously, was to insist that these technologies, if we’re going to adopt them, bring us measurable increases in both liberty and security. And I want to talk with you about technologies that can serve both of those values more effectively than this profiling scheme. But here you have a scheme that’s basically a placebo as a security measure; it’s no more effective than making, you know, grandmothers take off their shoes at the airport. And it’s also likely to pose quite serious threats to privacy. I think this is something we should try to think hard about.

What’s the threat to privacy? The centralization of this personally identifiable information in a place that could be accessible to low-level government officials or airport officials, and linked to me, has the potential to hold me accountable for relatively low-level misdeeds. I don’t want airport authorities to know the magazines I subscribe to, for example. And under one of these schemes they could. Although privacy law now prohibits that, they want to relax these laws to make this information available. I think it’s an indignity to be subjected to a high-level Google search; there are now search engines that can search what’s called the “invisible Web”, not merely what Google, the largest search engine, finds, but archived information that ordinarily would not be easily accessible. And it’s proposed to take technologies that companies like Ford now use to find out what their customers are saying about them in chat rooms … all of these things were developed originally with consumer applications … and apply them at airports. So I’d check in, they’d Google me on speed and find out, you know, all the silly things I said in a chat room when I was a kid. We’ve talked a lot in our conversations about the danger of being judged out of context as one of the things that privacy protects us against. That danger is tremendously high when you’ve got a great deal of intimate, personally identifiable information, collected in disaggregated ways, exposed to strangers who are asked to make spot judgments, like whether or not you get on the plane, that could really affect your life in a profound way.

HEFFNER: But wait a minute, you are making the assumption, aren’t you, that these are really ineffective …

ROSEN: I know that …

HEFFNER: … when it comes to the primary purpose that we agree upon.

ROSEN: I know that this technology is ineffective because statisticians who understand this, and who are also sympathetic to the project, have explained it to me. Now there are more effective technologies, and when we begin to talk about those, there are a series of design choices that we need to discuss. One thing struck me as I traveled out in Silicon Valley, swimming in the Valley, as the entrepreneurs like to say, looking for the killer application and the perfect product, the silver bullet that will bring us liberty and security at the same time: I was so struck by how little attention there is to these Constitutional-level questions that we’ve been discussing. There’s nothing inherently threatening, or ameliorative, about the technologies. They can be designed in ways that favor the values we care about, or threaten them. But the market pressures are all on the side of security and are inattentive to privacy.

So let’s talk a little bit about the consolidation of databases and the construction of national identification cards and my wonderful adventures with the Oracle Corporation, which is the largest database manufacturer in the world. I was interested in Oracle because back in October, Larry Ellison, the head of the Corporation, had proposed a digital identification system. He said that the problem is not that we have too much information; it’s that there are too many databases. We knew that Mohammed Atta was wanted on a warrant in Broward County, Florida, but the INS didn’t know that. So if all of these different law enforcement authorities could share information, and if all the separate criminal databases were consolidated into a single database, said Ellison, September 11th could have been averted.

Now the architecture that Ellison proposed, unsurprisingly, is one that looks a lot like the business model of the Oracle Corporation. He says that he saved a billion dollars and brought a lot of efficiency by consolidating all of Oracle’s databases into a single database, with the software, of course, manufactured by Oracle, which made it easier for the different divisions of the company to share information across the globe … the French division could talk to the German division. And this made it possible for them to discriminate, monitor and distinguish among customers. The best customers could get the best service; the ones less valuable to the company could be shunted off onto the low-level service lines.

So, for Oracle, this consolidation of the information led to classification and exclusion, which was useful to the company. It’s not obvious that the same model would be the one we would choose for a system of digital identification. Indeed, the urge to separate databases, the fear of the single, centralized database, had been the major concern of the privacy movement ever since the 1970s. We have a privacy law of 1974 because Vance Packard, the author of the book The Naked Society, wrote a wonderful article for The New York Times Magazine about the dangers of a government plan to consolidate all databases into a single place. And there was a concern that President Nixon might abuse this information by looking at Vietnam protesters and going after them for their youthful marijuana arrests, so the Privacy Act of 1974 prohibits the sharing of information without good reason.

There’s no security reason or rationale that all the databases have to be consolidated and available to all government officials at the same time. It’s technologically possible, even if the information is stored in different places, to make it accessible to high-level officials if you’re suspected of a serious crime, but to limit access strongly, so that low-level airport officials can’t, you know, check on my youthful tax records or marijuana record without probable cause.
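
[Editor’s note: A sketch of the tiered-access idea Rosen outlines, in which records stay in separate databases and a lookup succeeds only when the official’s seniority, the seriousness of the crime, and a probable-cause showing all match the record’s sensitivity. The levels and names below are hypothetical.]

```python
# Severity-gated access control: deny by default, grant only when the
# requester's seniority, the crime's seriousness, and a probable-cause
# showing match the record's sensitivity. All levels are hypothetical.
from dataclasses import dataclass

@dataclass
class Request:
    official_level: int      # e.g. 1 = airport screener, 3 = federal investigator
    crime_severity: int      # e.g. 1 = petty offense, 3 = terrorism
    has_probable_cause: bool

def may_access(record_sensitivity: int, req: Request) -> bool:
    """Sensitive records need probable cause, a senior requester, and a
    crime at least as serious as the record is sensitive."""
    if record_sensitivity >= 2 and not req.has_probable_cause:
        return False
    if req.official_level < record_sensitivity:
        return False
    return req.crime_severity >= record_sensitivity

# A low-level airport official fishing for old tax or marijuana records:
print(may_access(2, Request(1, 1, False)))  # False
# A senior investigator with probable cause in a terrorism case:
print(may_access(2, Request(3, 3, True)))   # True
```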

HEFFNER: But doesn’t that limit, and limit severely, the effectiveness, the potential use, we became involved with after 9/11? At least the myth is that had there been this coordination, had there been the ability of people at the airlines to tap into some central database, perhaps this would not have happened.

ROSEN: It’s a myth, it’s an American myth, because we have such a powerful faith in technology to provide us perfect security and perfect forms of identification. But it hasn’t been implemented in any country in the world, even in those with far greater forms of social control than ours. There were 200,000 outstanding arrest warrants in Broward County, Florida, I was recently informed. The inefficiencies of our criminal justice system are so great, the impossibility of perfect transparency and perfect coordination so strong, as we learned just a few weeks ago from the INS’s decision to grant the student visa to Atta. Senator Maria Cantwell, who’s just sponsored an interesting new bill requiring biometrics at the border, went out and was shocked by how little the INS and border people knew about the information they had in their databases, even though Congress recently required them to take a fingerprint or iris scan as a form of identification.

So you really do have the sense of the Progressives at the turn of the last century, who had such a powerful faith in technology, and in particular in the ability of experts to regulate its use for good rather than for ill, that they were willing to countenance the adoption of technologies of identification. In particular I’m thinking of the eugenics movement, which was viewed at the time as a liberal way of distinguishing honest Americans from the immigrant stock that was flooding into the country and threatening our values. And the faith in expert regulation was so strong that they were willing to countenance the application of these technologies in ways that we now recognize as a profound threat to liberal values.

HEFFNER: Tell me about your concerns. Do they … and I really want, as I know you’ll give me, a straight answer … do they fall on deaf ears, as I would suspect?

ROSEN: I think there is such a hunger for a silver-bullet solution, and there’s a recognition among thoughtful people that we’re playing with fire, that we really are on the verge of transforming social interactions with technologies that are barely understood, that I’m heartened by the eagerness for serious Constitutional-level thinking about these questions. I was impressed, for example, with Gilman Louie, the head of In-Q-Tel, this interesting new organization, a venture capital firm started up by the CIA to fund creative new technologies of business intelligence that might have applications for national intelligence. And when we talked, he said that he had found interest at the highest levels of the CIA, down through the military, for Constitutional-level thinking about the technologies. He said, “The technology people and the government people want to be told how to implement access controls. We understand that the technology can be designed in ways that favor American values, or not.” But it’s almost a bureaucratic challenge … agencies are suspicious of one another; for the same reasons that they don’t want to share information, they don’t want to think, at a national level, about access controls. So he said that there’s a bureaucratic resistance.

I’ve been struck by the fact that there’s political resistance, because the technologies are so unfamiliar that few politicians are willing to think creatively about new statutory regulations. And there’s Constitutional resistance, just because our Fourth Amendment doctrine, in ways that we’ve discussed … the Fourth Amendment, which protects the right of the people to be secure against “unreasonable” searches and seizures, has evolved in a rather deterministic way that doesn’t encourage judges to balance the seriousness of the search against the seriousness of the crime. If we imagine that ancient Fourth Amendment balance implemented in this new technological world, we could imagine designing databases, for example, that would be accessible to people when the crime was severe and the evidence was strong, but would be carefully protected in other events. What we need is Constitutional-level thinking … and you ask, does it fall on deaf ears? Well, these are complicated design choices, and it’s not easy to make arguments that can’t easily be encapsulated in sound bites.

HEFFNER: But you seem to believe, on the basis of your touring of Silicon Valley and other inquiries, that one can devise equipment, machinery, mechanisms that can enable those who work them to balance the need for information, the presence of information, and these traditional Constitutional liberties.

ROSEN: I do, and in that sense I’m delighted to report to you that I’m increasingly optimistic about that Progressive … I mean that with a large “P” … vision. You chastised me during our first conversation for being too impatient with law, and for having little faith in the ability of regulation meaningfully to regulate information. I now see, based on travels and studies and learning more about the technology, that it is possible, with a complicated mix of statutory regulations, of Constitutional-level thinking and of technological design choices, to construct architectures of identification that might, indeed, balance these values.

It’s technologically complicated and not so interesting to go into the details, but it’s possible to design an identification card, for example. Let’s say that I’ve been checked out in an effective background check and there’s a packet of information that says, “Jeff is okay to pass at the border.” That information could be encrypted using public key infrastructure technology. It could then be locked using my thumbprint as a private key. When I come to the border, I could put my thumb on the card; up would come the packet of information that says, “Jeff can pass.” Nothing more would be known about me. My fingerprint wouldn’t be centrally stored, therefore it wouldn’t be possible to have a centralized fingerprint database, as Larry Ellison proposes, that would allow the government to dust for fingerprints at the scene of a crime and then plug that fingerprint into the national fingerprint database and find out exactly who was, you know, protesting, or who was in a restaurant. That’s the invasive architecture; that’s the privacy-Chernobyl architecture. That’s what Ellison proposes, just because he has no financial incentive to create the protective one.

By contrast, the system that I’ve just described, which was invented by George Tomko, an entrepreneur in Canada, is very sensitive to privacy. Canada, incidentally, is doing a lot better than we are in thinking about these issues at a high level. That system would both create perfect identification, teaching the border people nothing more than they needed to know, and nothing less, and also would not threaten privacy. So how do we choose the protective version rather than the Ellison version? Well, that requires either regulation or some kind of Constitutional mandate that says it’s unreasonable, given two technologies, both of which give you the same amount of security, to choose the more privacy-invasive one.
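
[Editor’s note: A sketch of the thumbprint-locked card Rosen describes, under two loudly labeled assumptions: that a fuzzy extractor can turn every scan of the same thumb into identical stable bytes (a hard problem in practice), and that symmetric Fernet encryption from the third-party Python `cryptography` package can stand in for the public key infrastructure he mentions. This is an illustration of the idea, not the actual Tomko design.]

```python
# Sketch of a thumbprint-locked border credential: the approved packet is
# encrypted under a key derived from the holder's print, so nothing useful
# is stored centrally and the print never leaves the card reader.
# Assumes a fuzzy extractor yields stable template bytes from each scan.
import base64
import hashlib
import os

from cryptography.fernet import Fernet, InvalidToken

def key_from_print(template: bytes, salt: bytes) -> bytes:
    """Derive a Fernet key from a (stable) fingerprint template."""
    raw = hashlib.pbkdf2_hmac("sha256", template, salt, 200_000)
    return base64.urlsafe_b64encode(raw)  # Fernet expects a base64 32-byte key

# Enrollment: after the background check, the credential is sealed to the print.
salt = os.urandom(16)                      # stored on the card, never centrally
template = b"stable-thumbprint-template"   # hypothetical fuzzy-extractor output
card = Fernet(key_from_print(template, salt)).encrypt(b"Jeff is okay to pass")

# At the border: the live thumbprint either unlocks the packet or reveals nothing.
try:
    packet = Fernet(key_from_print(template, salt)).decrypt(card)
    print(packet.decode())  # "Jeff is okay to pass" -- and nothing more
except InvalidToken:
    print("print does not match; no information revealed")
```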

HEFFNER: So we’re back to regulation. You’ve finally come over to where I begin.

ROSEN: Well …

HEFFNER: … that’s not fair, that’s a low shot …

ROSEN: No, it’s a high shot, because I came here … I’m so excited to see you today because I wanted to tell you that I have come to your side, to the degree of recognizing that yes, regulation is theoretically possible, and indeed necessary given this new world, if we’re to balance these two values. Now the question of how optimistic I am, or you are, about the likelihood that meaningful, productive regulation will be adopted is another matter. And in this sense, I wanted to share with you the pessimism of my friend and teacher Lawrence Lessig, who really was as responsible as you for setting me out on this voyage and for teaching me about these technologies. As we talked about last time, he had chastised me for being a Luddite when I criticized the cameras in Britain. He said, “This is so frustrating, because it is possible to design these technologies in ways that balance the values we care about. Go and learn about them.” So I did. But when I reported back to him about what I learned, I found that he’d become pessimistic. He was distressed by the profiling system that we’d discussed. He said it’s just an example of a false solution; it brings you no security and terrible invasions of privacy. And he was despairing: although his important book Code and Other Laws of Cyberspace argues that code is law, and that the design of any of these technologies can determine the degree to which they threaten or protect values, he was pessimistic about who would stand up for the privacy side. He said there were no politicians who would do it. And no judges … and then you have these Ellison types, glowing with the excitement of the market. He said, “He’s like a rich version of a North Korean dictator.”

HEFFNER: You wrote that in the Times piece, and I … we have probably a minute left, or maybe a little less than that. I was quite taken by the fact that you didn’t cotton to Silicon Valley, did you?

ROSEN: Well, I liked their enthusiasm and their almost childlike thought that there’s a technological solution to any social problem. But I did not like the fact that the technologists, when challenged about who would watch the watchers, who would design the access controls, kept saying, “It’s not our department.” I did think of that wonderful Tom Lehrer song that I like so much, about the Nazi rocket scientist who defects to America … “Once the rockets go up, who cares where they come down? That’s not my department,” says Wernher von Braun. And it belongs on a plaque right in Silicon Valley …

HEFFNER: And I love that, and I thought to myself, “ah,” the Times kept that in. I was so pleased that they did, and I’m glad that you remembered to quote it from memory, because I knew I couldn’t get the accent right. Jeffrey Rosen, thank you so much for joining me again tonight …

ROSEN: Thanks …

HEFFNER: … on the Open Mind.

ROSEN: It’s such a pleasure.

HEFFNER: And thanks, too, to you in the audience. I hope you join us again next time. If you would like a transcript of today’s program, please send four dollars in check or money order to: The Open Mind, P. O. Box 7977, F.D.R. Station, New York, New York 10150

Meanwhile, as another old friend used to say, “Good night and good luck”.

N.B. Every effort has been made to ensure the accuracy of this transcript. It may not, however, be a verbatim copy of the program.
