Transcript
Sven Johann: Welcome to a new conversation about software engineering. Actually, today it's not really software engineering, it's privacy, or security in an insecure age. I'm sitting here with Susan Landau. Susan Landau is a cybersecurity expert; she works at Tufts University and is a visiting professor at University College London. She has testified before Congress, writes for the Washington Post and Scientific American, is frequently on NPR and BBC, has been a senior staff privacy analyst at Google, and a distinguished engineer at Sun Microsystems.
Sven Johann: She was inducted into the Cybersecurity Hall of Fame in 2015 and is a Guggenheim fellow and an American Association for the Advancement of Science fellow. So many, many things - I don't know, you must be 500 years old...
Susan Landau: 1007... [laughter]
Sven Johann: I'm really happy to have you here. I've been waiting for this interview a long, long time, and now we've finally made it happen. I just want to start with a simple question... What is privacy?
Susan Landau: Privacy is many different things to many different people. Some people view it as the right to be let alone, some people view it as control of the use of one's data, some people view it as the non-publication of data, some people view it as the right to be forgotten - something quite popular in Europe at the moment, which Europe is trying to impose on the rest of the world.
Susan Landau: I think of it mostly as the control of use of one's data. Largely, I think of it that way because in the modern era it's very hard to prevent the collection, and what you really want to do is prevent the bad effects of the collection. But of course, that does mean that certain things like harm to one's reputation can happen even if you're controlling use.
Sven Johann: Okay. We will dig deeper on the collection of data. What is security?
Susan Landau: Security is protecting against the taking of one's data. One way to think about the distinction is to think for a moment about a particular company - it could be Facebook, it could be Google, it could be LinkedIn. Google is a particularly useful example, because it's one that everybody knows, and it has extremely good security. Google's business model is that they take the information that users supply, and then they do things with it to improve services for the users. But in order to do that, they have to have the trust of the users.
Susan Landau: Beginning probably earlier than 2010, but certainly by 2010, Google really improved its security process. They know that users won't trust the company with their data if Google's security isn't good. So that's security - protecting your data against attacks from the outside. But on the other hand, Google takes the data and then does various things with it. It serves you ads - some people enjoy the ads, some people like the ads a great deal less, some people try to use Google services without being signed in, which is sometimes a much less good experience... But the point is, there is a clear distinction between privacy and security right there. The protection of your data against outside attackers is security, but Google's use of the data is, in many people's minds, an attack on their privacy, even though they've supplied the data to Google.
Sven Johann: In Europe we have a discussion about corporate surveillance - Facebook, Google and others have business models which rely on collecting data. I myself am a little bit scared of what happens to my data if an attacker gets it. Do you think I can trust Google, Facebook and all those companies not to do any harm?
Susan Landau: Again, it depends on what you mean by trust. For one thing, of course, if your data is being held by a third party, then other people may have access to it - for example, law enforcement or national security may have access to it under a court order... You may decide, "No, I don't want to share my data in that case." I, for example, am much more comfortable using a cloud service that, like most cloud services, stores my data in the cloud encrypted, but unlike most cloud services, does not hold the key. So if I lose my copy (or copies) of the key, then I have no way of getting my data back. I can look at it, but it's all encrypted; it doesn't do me any good.
Susan Landau: When you talk about trust, it really depends on what your threat model is. For many of us, the threat model is that we don't really care who has the data, as long as it's not criminals, and then a company like Google or Facebook, because they rely on user trust, goes to great lengths to protect the data. For others - maybe a human rights worker, maybe a journalist - concerned that governments might try to access the data under court orders in their country that the journalist or human rights worker feels are inappropriate, they would choose not to use those services. Maybe they don't want to leave tracks of whom they've communicated with, or tracks of what was actually said. So it all depends on what you're protecting against.
Sven Johann: I'm just wondering, why is it important that not everybody - especially governments - can know what we are doing? With Facebook, Google, and all these sensors on a mobile phone, you can create perfect profiles of a person... I mean, governments sell that as a good thing, because they can protect us from evil.
Susan Landau: It depends how you define evil, and I'll go back to the 1950's to 1970's in the United States, which are examples that I know quite well, and I'll describe two different sets of examples. One is that in the 1950's and 1960's in the United States we had a very active civil rights movement, because black Americans were being prevented from voting, they were being prevented from sitting at lunch counters, and so on. It was very active segregation, but segregation by law, and prevention of their rights by law and by practice; when they would go to register to vote, they would be tested on things that white Americans were not tested on.
Susan Landau: So there were groups active in changing this in the United States in the 1950's and 1960's, and one of them was the National Association for the Advancement of Colored People. In the state of Alabama there was a requirement that any organization active in the state had to register its membership with the state. Well, that was very dangerous in Alabama if you were trying to register black Americans to vote or to exercise other civil rights... So the NAACP fought that, and the case went all the way to the Supreme Court, which said that the amendments that protect individual rights - the Bill of Rights in the U.S. - are applicable to the state governments as well, under what we call the 14th Amendment. So the state of Alabama couldn't require that information. Freedom of association was protected. That was one set of things.
Susan Landau: We discovered in the 1970's, as a result of the Watergate break-in, that in fact the Army and other American military institutions, including the NSA, had been collecting information about Americans that many Americans felt they weren't legally entitled to. There was a committee in Congress, in the Senate, called the Church Committee, that investigated these in depth... And there were many changes put in place about when the NSA, for example, could collect against Americans. That became part of the Foreign Intelligence Surveillance Act.
Susan Landau: So we built in those legal protections, but what we've discovered recently - and this has been in the press in the United States - is that the FBI has decided that black activists are a real threat, and it is collecting against black activists who are involved in peaceful demonstrations... Not violent demonstrations, peaceful demonstrations. There were arrests of Americans protesting Trump's inauguration that the American Civil Liberties Union is fighting, because it appears the arrests happened even though the people were not protesting violently. So now, when you think about all that data collection, you can understand why even two years ago, when we didn't think the government was collecting against us in the United States, there were reasons to say, "Well, there's information that I would want to protect - like who I'm communicating with, or that our phones are in the same area," and so on.
Sven Johann: If I understand you correctly, a society cannot make progress if there is no privacy, because the people who really want to change the state of the art, let's put it this way --
Susan Landau: That's very well said, and I want to go back to the Church Committee, which had a very trenchant observation... It said that when a government collects against its people, what it does is it silences the people in the middle. The people on the extremes are still willing to demonstrate, to speak, to write, but the people in the middle, who are the ones who seek compromise and help move the society forward, rather than having radical outbreaks - the people in the middle get silenced if there is collection against them. They're afraid of losing their homes, they're afraid of losing schooling capabilities, they're afraid of harm to their children, and they fall silent, and that's very unhealthy for a democracy.
Sven Johann: Yes. But still, as the Edward Snowden documents show, this is happening... I mean, it happened in the U.S. and it's probably still happening... In your work on policy and law enforcement... how can you balance--
Susan Landau: Sure. So let me actually address the Snowden disclosures first, because I think there was a lot more heat than there should have been, and there were also places where the NSA was clearly doing things wrong... Two of them stand out to my mind. One was that there was a secret interpretation of the law that allowed bulk collection of communications metadata in the United States, and when you have a secret interpretation of a law, then you have a secret law, which has no place in a democracy.
Susan Landau: The other thing, which came out in September of 2013, is that the NSA had - and I want to put "interfered" in quotes - "interfered" with the process for recommending a cryptographic algorithm. The NSA had encouraged the National Institute of Standards and Technology to issue a recommendation for picking random numbers, Dual EC DRBG, and it seems that the NSA had a backdoor into this algorithm, so that it could actually predict what the random numbers were - and those random numbers were used to generate keys.
Susan Landau: Now, the problem there is that it made NIST look like a bad player in the world of crypto algorithms, and that's very bad for security long-term. It's very hard to come up with strong crypto algorithms; NIST had done an exemplary job from the late 1990's until this incident came to light, and while it looked like NIST was blindsided and probably didn't have an entirely good process in place, it did not look like NIST had done this understanding what was happening - NIST was missing some process, which has since been fixed... But it created distrust in NIST, and therefore distrust in the whole business of picking algorithms, and how do you do that in a way where the algorithms can be tested internationally by a broad community of cryptographers? So those are the things that I think were really problematic. There are some other, smaller things...
Susan Landau: On the other hand, the fact that you had a U.S. signals intelligence agency collecting broadly across the world - that's what signals intelligence agencies do, and so I think a lot was made of that, but some of the objections were more about protectionism than they were about anger over the actual collection of data. Some of the complaints that I've heard in France and in Germany - it turned out that the French government and the German government were actually handing data over at the same time that they were complaining.
Sven Johann: Yes, exactly. That made a lot of people really mad - that the German government actually cooperates with international security agencies and does not protect our data.
Susan Landau: One thing I have to say - and I've many times complained about certain NSA activities - is that the U.S. has a more transparent view of what's happening with collection; that is, the Foreign Intelligence Surveillance Act, parts of what's called USSID 18 about signals intelligence collection abroad - various processes are more public. Certain details are not... Certain details of the process and policy are not; I'm not even talking about methods of collection. But the U.S. has been more transparent, and I think that's a very good thing, and a great thing for other countries to adopt as well.
Sven Johann: How did that happen? Were they forced to be more transparent?
Susan Landau: Part of it was that the Church Committee hearings of the 1970's showed, for example, that the NSA had been collecting copies of all telegrams coming into the United States from the 1940's on; there were other types of collection by the Army, and so on. When these things came out, we got the Foreign Intelligence Surveillance Act, which had certain requirements... The NSA embarked on a process of educating its own people, and following the law - now, they may go right up to the edge of the law, but following the law became very important. There was a change in the early 2000's, after the September 11th attacks, when President Bush ordered bulk metadata collection, and there were various methods under which it was done... That is, various legal methods under which it was done. But as I said, that was a secret interpretation of the law, so I don't want to say that everything is perfect; it's not perfect. And as technology changes, it's sometimes hard to know how to interpret the law.
Sven Johann: Yes, of course. I mean, if you say it's the job of intelligence agencies to do that, right...? That means even if there is a law, they maybe break it... I remember a conference - I don't remember the country, but it wasn't Germany - where there was a representative from the country's cybercrime division, and he said, "Please work for us, because if you work for us, you are allowed to break the law. It's just our job to break the law."
Susan Landau: It's not the job of law enforcement in the United States to break the law; if you bring a case in court and you've broken the law, in most cases the case gets thrown out.
Sven Johann: Okay, okay.
Susan Landau: So it may be that what the person was saying - and it's hard to know, because I wasn't there to ask - is that maybe a warrant allows you to circumvent a different law... For example, the Computer Fraud and Abuse Act, which prohibits breaking into other people's systems; there is a specific exemption for law enforcement, which is appropriate, but that doesn't mean law enforcement doesn't need a search warrant - of course it does.
Sven Johann: Yes, okay.
Susan Landau: I certainly don't want to live in a country that doesn't have a rule of law.
Sven Johann: Now that I know it's the work of an intelligence agency to bulk-collect all kinds of data, how can I protect myself, so that I'm not too open about what I'm doing?
Susan Landau: The question I always have when I'm thinking about my own security protections - and while I'm originally trained as a geek, I'm not a security geek; I'm trained as a theoretical computer scientist, so I can understand the mathematical algorithms better than I can think about system configuration... The thing I always think about is "What's my threat model? Who is it that might be after me, and what resources might they bring to bear?" For a long time I didn't use Tor, for example. Then I was teaching a class on privacy, and I decided I wanted the students to use Tor. Tor anonymizes a web connection; it does so using public-key cryptography and a series of servers that run the Tor software.
Sven Johann: So maybe if you use Tor - you have to use a certain Tor browser, I think...?
Susan Landau: You have to download software onto your device and use the Tor browser, and what it does is encrypt your traffic in three layers, each with the public key of one server, and then you connect to a Tor server... So that first Tor server knows who you are - it knows your IP address; it connects to another Tor server, which only knows that you've connected from a Tor server, but doesn't know your IP address, and it decrypts a second layer. Then there's a third server, which knows where you're exiting to.
Susan Landau: What happens now is if an adversary can watch traffic over the entire network, they can say, "Oh, I'm watching this traffic traverse, and I know the IP address from where it came, and I know the IP address where it exited the Tor network, so I can see." But most adversaries don't have quite that capability... So the entry point knows where you came in, the exit point knows where you're going to, but nobody else has insight into anything.
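To make the layering concrete, here is a minimal Python sketch of onion routing's nested encryption. It is a toy, not the real Tor protocol - Tor negotiates per-hop keys with public-key handshakes, whereas this sketch pretends each relay's symmetric key is already established - but it shows why each relay can peel exactly one layer and no more.

```python
# Toy illustration of onion routing's layered encryption (not real Tor).
# Assumes the relay keys are already shared; real Tor establishes per-hop
# keys with public-key handshakes when it builds a circuit.
from cryptography.fernet import Fernet

# One symmetric key per relay: entry, middle, exit (hypothetical keys).
relays = [Fernet(Fernet.generate_key()) for _ in range(3)]

message = b"GET https://example.com"

# The client wraps the message in three layers, exit layer innermost,
# entry layer outermost, so the entry relay peels the first layer.
onion = message
for relay in reversed(relays):
    onion = relay.encrypt(onion)

# Each relay peels exactly one layer. Only the exit relay sees the
# plaintext request, and only the entry relay ever saw the client.
for relay in relays:
    onion = relay.decrypt(onion)
print(onion == message)  # True
```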
Susan Landau: The onion router (that's what Tor stands for) was developed by the Naval Research Laboratory in the '90s and has had many improvements since then. I wanted my students to use the Tor browser to see how usable it was, what problems it had, and so on. I couldn't give them this assignment without actually using Tor myself, so I began using Tor more often. I feel that two-factor authentication is important, so I use two-factor authentication. I have gone to using a variety of these tools, in part because I teach privacy and security and it's important for me to experience the value and the problems... But in all of this, you need to think, "What's your threat model?" When I'm traveling, I won't order anything online in a hotel, unless I use my phone as a hotspot, because I don't trust the hotel as an ISP. That's because of a criminal threat.
Sven Johann: Can you trust the phone connection?
Susan Landau: Yes, I think we can trust the phone connection. Again, what I have to worry about is am I a specific target or a general target?
Sven Johann: Yes, that's true. Okay. I usually don't use the Starbucks network, or airport, or something... But somehow I trust a hotel network; I don't know why.
Susan Landau: Well, it depends what I'm doing. I'm certainly not going to sit in the hotel or at the airport Starbucks and order items. When I'm sitting in my hotel at night and thinking about my next trip, I might want to book a flight, and then I say, "Okay, I'm going to do this." That's more a matter of -- it's disruptive at a Starbucks, so I don't do those kinds of things there. But if I get to the point that I'm not trusting the phone network, I'm really thinking of myself as a specific target, and at that point I'm probably not doing anything online, except maybe reading the New York Times.
Sven Johann: Okay, good to know. I was a bit scared that I cannot fully trust the phone network.
Susan Landau: Well, again, I'm thinking about it in the context of U.S. law and U.S. requirements, which is what I know thoroughly.
Sven Johann: I talked to Richard Clayton about it; he gave a talk here at GOTO about cybercrime, and we had a conversation about a hacker who got into a Jeep Cherokee via the phone network and took it over, so I thought --
Susan Landau: Yes, but then again, that was a specific attack. So would I do this kind of communication if I were in Russia? Absolutely not. Would I do this kind of communication if I were in Spain or Germany? Sure. I don't see the Spanish or German government as seeing me as a threat or a target.
Sven Johann: Okay. So, two-factor authentication... I rarely used it until I read your book. After about 20% of the book, I enabled it everywhere.
Susan Landau: Good for you.
Sven Johann: So why does that help us?
Susan Landau: Because if you're not using two-factor authentication - if you're using just your password to connect - then somebody takes your password and you don't know it happened... It could happen one day when you're at a cafe, it could happen one day when you're at home, and three weeks later your whole e-mail file is published online... Whereas if you have two-factor authentication, you're securing your account against somebody who has only your password and not your device. The point is that when somebody takes your phone, you know about it.
Susan Landau: Now, if it's a very sophisticated attack, if it's a government-enabled attack where they take your phone, they knock you down, you're completely unconscious, you're in the hospital, they take everything - that's a different kind of threat. I'm not talking about people facing those kinds of threats. They, of course, have to do much fancier things to protect themselves. But if you're the head of personnel at a hospital, if you're the head of security at a company - in fact, even if you're at a low level - two-factor authentication is very important.
Susan Landau: For example, what happened in Ukraine is that three power distribution companies were attacked in December 2015; six months before, hackers had gotten into the business networks, and from the business networks accessed the power distribution networks. So maybe the business network needed two-factor authentication - maybe, or maybe not; I would tend to use it everywhere... But certainly, the power distribution network of each of the three companies needed two-factor authentication.
Susan Landau: What you have there is that once the hackers were in, they were able to figure out all sorts of things about how the three different power distribution companies worked, and then were able to shut all three of them down within a half hour of each other. Two-factor authentication would have prevented that from happening.
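As an illustration of what a second factor adds, here is a minimal sketch of a TOTP code generator - the rolling six-digit numbers, per RFC 6238, that authenticator apps display. It uses only Python's standard library; a real deployment should rely on a vetted library rather than this toy.

```python
# Minimal sketch of TOTP (RFC 6238): a code derived from a shared secret
# and the current time. A stolen password alone is useless without the
# current code from the user's device.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // interval           # current time step
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# The server stores the same secret and computes the same code to verify.
# (Example secret; any base32 string works.)
print(totp("JBSWY3DPEHPK3PXP"))
```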
Sven Johann: Okay, so it's good advice for all kinds of companies to protect mail and all kinds of logins to sensitive company data with two-factor authentication?
Susan Landau: And if you're convinced that the sensitive parts of the company are kept really separate from the non-sensitive parts, then maybe you can get away with passwords. But if there are connections - and there are always connections; nobody maps out everything completely... If there are connections to the sensitive parts of the company, then you want two-factor authentication.
Sven Johann: Good. Let's switch topics... Maybe it's not a big switch, but I was wondering about wiretapping and backdoors. What is wiretapping?
Susan Landau: Wiretapping is simply the ability to listen in on an electronic communication. If you put a bug into a phone receiver - for those of us who remember what phone receivers look like - that is wiretapping, accomplished through a physical bug placed in the device. But wiretapping can happen in many places: you can place a wiretap through software on a smartphone, you can wiretap at the phone switch, you can wiretap in many different places.
Susan Landau: A backdoor relates to what happens when somebody encrypts a communication end-to-end: you and I talking on the phone - the conversation is encrypted on my phone and decrypted on your phone, but anybody else listening in hears an encrypted communication. A backdoor is a way of getting into that communication, even though it's encrypted for anybody else.
Susan Landau: About 20-25 years ago in the United States and in other parts of the world there was a big fight about the whole idea of backdooring encryption systems. One of the things that--
Sven Johann: Before we talk about backdoors, I have one question about wiretapping. Let's assume someone wiretaps my connection - somewhere between my home and a major switch... If I use encrypted communication, what can the attacker--
Susan Landau: If you're end-to-end encrypted and the wiretap is not on your device - if you have an encryption system where you and I have exchanged the key and it's a strong encryption system - then if the wiretap is not on one of the endpoints, there's nothing they can do. All they get is noise.
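A minimal sketch of that point, assuming Alice and Bob have already exchanged a key out of band, and using the symmetric Fernet construction from Python's cryptography package: a tap in the middle captures only opaque bytes.

```python
# Minimal sketch of why a mid-network wiretap on an end-to-end encrypted
# channel yields only noise: the key lives solely at the endpoints.
# Assumes Alice and Bob agreed on a key out of band.
from cryptography.fernet import Fernet

endpoint_key = Fernet.generate_key()   # known only to Alice and Bob
alice = Fernet(endpoint_key)
bob = Fernet(endpoint_key)

ciphertext = alice.encrypt(b"meet at noon")

# The wiretap sits between them and captures everything on the wire...
captured = ciphertext
print(captured[:20])           # opaque bytes; without the key, just noise

# ...but only Bob's endpoint can decrypt.
print(bob.decrypt(captured))   # b'meet at noon'
```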
Sven Johann: Okay.
Susan Landau: But it had previously been the case, before all of us carried computers with us - we call them smartphones - that there were many places along the path to put a wiretap in. The most common place was what was called the phone central office. If you have a wireline phone, there's a wire that goes from your house to an office within a couple of miles, a few kilometers, and the phone lines come in in order - 5129, 5130, 5131 - and the wiretap was put right there, where the communication would be in clear text before moving on.
Susan Landau: You can do a similar thing at a cell tower, because phone communications are encrypted only between you and the cell tower.
Sven Johann: Okay.
Susan Landau: Back about 25 years ago there was a big fight in the United States and in other places about the whole idea of encrypted communications, and law enforcement wanted backdoors. One of the ways this was prevented from happening was that in 2000, for a number of reasons, the United States government changed its controls on the export of encryption. It had been the case that if U.S. companies wanted to sell communication or computer devices abroad with encryption in them, they needed an export license. This was slow and very hard to get, and if the device had strong encryption, you might be told, "Well, we're still working on the export license. Well, we're still working on the export license...", and by the third time you heard that, if you were in Silicon Valley, you'd say "Forget it", put in weak encryption, and sell the device.
Susan Landau: Now, that was for selling abroad, but if you're a company in the United States trying to sell things abroad, you don't want to say, "We're selling you weak encryption; we have strong encryption in the devices we sell in the United States." So you end up with weak encryption everywhere.
Susan Landau: In 2000, both Europe and the United States changed their encryption policies. In part, for the U.S., it had to do with the military buying commercial off-the-shelf equipment. If you think about the kinds of military coalitions we've had in the last 15-20 years - starting, I think, in 1991 - we've had a lot of ad-hoc military coalitions. You have NATO - that is not an ad-hoc military coalition; the countries trust each other, they've been in the coalition for a long time, they have secure communications systems they've developed... But now you put together a coalition to fight the war after Iraq invaded -- I'm blanking... Iraq invaded...
Sven Johann: Kuwait?
Susan Landau: Kuwait, right. When Iraq invaded Kuwait, the U.S. put together an ad-hoc military coalition, and these are partners with whom maybe the U.S. doesn't want to share its secure communications techniques; these are countries that maybe won't be its friends in three years, so being able to buy commercial off-the-shelf equipment is very useful... Commercial off-the-shelf equipment that has strong encryption. This was one of many reasons; the growth of the internet and e-commerce were others. So the U.S. substantially loosened its export controls, and we expected to see lots of devices with strong encryption. It took another 8-10 years before that happened in commercial devices, and during this time the FBI was unhappy - it became quite unhappy beginning in the late 2000's and early 2010's, and began talking about "going dark"... And it knew that "backdoors" was no longer an acceptable term; it was something people thought was really yucky, so it said, "We want front doors", and then it said, "We want exceptional access. We get access only when we have a court order, only when there's a legal authorization to come in, and otherwise the encryption is really strong."
Susan Landau: Then they changed the term to "responsible encryption", but all of it is the same thing - it's a way of getting in even when the device is encrypted.
Sven Johann: So just a nicer term for backdoor.
Susan Landau: Well, I would say it's actually what we call 1984-speak; it's doublespeak... Because what they're calling "responsible encryption" I would call "irresponsible security."
Sven Johann: Why are backdoors problematic? You could think, "Well, if there is a backdoor and only (I don't know) the FBI or whoever has the key to open the backdoor, what could go wrong...?"
Susan Landau: Right, but I am talking on a broadcast where people are actually scientists and engineers, so point one - when you build complexity in, things break down. Point two - and some of this is due to a paper I did with a dozen other colleagues (15 other colleagues) called Keys Under Doormats... Point two is that it breaks all types of security protections we've come up with. One is called forward secrecy, which is the idea that each communication carries with it its own key. That means that if an adversary has been scooping up communications over a long time and wants to break them, it has the job of breaking each key, as opposed to breaking one key once... And that's a very important security protection.
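Here is a minimal sketch of the forward-secrecy idea, using ephemeral X25519 Diffie-Hellman from Python's cryptography package. Each session derives a fresh key that is then thrown away; authentication is omitted for brevity, so this is only the key-agreement piece.

```python
# Minimal sketch of forward secrecy via ephemeral Diffie-Hellman: every
# session gets an unrelated key, so recorded traffic must be broken
# session by session. Omits authentication entirely.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def new_session_key() -> bytes:
    # Fresh ephemeral key pairs for every conversation...
    alice_priv = X25519PrivateKey.generate()
    bob_priv = X25519PrivateKey.generate()
    # ...combined into a shared secret, then discarded.
    shared = alice_priv.exchange(bob_priv.public_key())
    return HKDF(algorithm=hashes.SHA256(), length=32,
                salt=None, info=b"session").derive(shared)

# Two sessions, two unrelated keys: compromising one key reveals nothing
# about traffic recorded from any other session.
print(new_session_key() != new_session_key())  # True
```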
Susan Landau: Another security protection it breaks is authenticated encryption. One of the things we've learned over time is that it's not usually the mathematics that breaks encryption, and it's not usually the protocols; most often it's the implementations, far and away. Sometimes it's the protocols, and very occasionally it's the mathematics... It's really implementation, protocol, mathematics, in that order, with orders of magnitude of difference between them.
Susan Landau: If you think about authenticated encryption: when we talk electronically, as opposed to you and me talking face to face, there's this whole business of authenticating. When you do it over the telephone, often you can recognize the person's voice - although as electronic imitation gets better, that may disappear as well... But you authenticate the person by their voice. What happens when you get an e-mail? Well, you'd like a way to authenticate that it really is coming from the person it claims to come from, as opposed to somebody else. As we heard earlier today in Richard Clayton's talk, when a piece of mail from the company you're buying coal from tells you to switch banks, you want to know that it's really coming from that company, and not from some other company.
Susan Landau: So what we did is put together the ideas of authentication and encryption in one step. We called it authenticated encryption, and used the same key for both. Well, if you take the idea of backdoors, or exceptional access, or responsible encryption, you have to separate the authentication step from the encryption step, because law enforcement will allow authentication, but it doesn't want the confidentiality in the same way... And once you create that separation, you've gotten back to: more complexity, more likelihood of error.
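To illustrate, a minimal sketch of authenticated encryption with AES-GCM from Python's cryptography package: one key, one operation, giving both confidentiality and an integrity check, so that tampering with a single bit makes decryption fail outright.

```python
# Minimal sketch of authenticated encryption (AES-GCM): confidentiality
# and integrity from one key in one operation.
import os
from cryptography.exceptions import InvalidTag
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)
aead = AESGCM(key)
nonce = os.urandom(12)   # must never repeat for the same key

# Hypothetical message echoing the coal-supplier example above.
ct = aead.encrypt(nonce, b"switch banks to account 42", b"from: coal-supplier")

# The receiver verifies integrity and decrypts in the same step.
print(aead.decrypt(nonce, ct, b"from: coal-supplier"))

# An attacker who flips even one bit is caught: decryption raises InvalidTag.
tampered = bytes([ct[0] ^ 1]) + ct[1:]
try:
    aead.decrypt(nonce, tampered, b"from: coal-supplier")
except InvalidTag:
    print("tampering detected")
```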
Sven Johann: I think that's an important sentence, and I want to repeat it - more complexity, more error.
Susan Landau: Every software engineer knows this... Every software engineer knows it.
Sven Johann: The simpler the system, the less likely it breaks or can be hacked...
Susan Landau: So maybe what you're telling me is we should take all the policy folks at the FBI who are arguing for this and put them through a course in software engineering and the problem would go away. I've never thought about that solution, but maybe that's it. There's also the issue of who would hold the keys. Who would hold the keys?
Susan Landau: I take my phone, I travel from the U.S. to Germany for a conference; last week I went to Canada... Does my phone stop working when I go to Germany, because only U.S. law enforcement has access to the keys? Or are we going to have a really fast multilateral legal authorization treaty, in which as soon as the German police decide they need to get into my phone, they get an immediate response from the U.S.? That's not going to happen as a matter of policy. Does that mean I need a different system for traveling in Germany, a different system for traveling in the U.K., let alone in other parts of the world? That complexity is also unreasonable.
Sven Johann: Is that the reason why there are no backdoors implemented?
Susan Landau: Well, there are plenty of backdoors implemented. When you work at a financial firm in the United States there's a requirement that your conversations are taped, and anytime you call a financial firm, a stockbroker and so on, it says "This conversation is being recorded."
Sven Johann: But is the communication itself encrypted?
Susan Landau: Well, it's recorded at the broker's end. You're speaking to the broker, the broker understands you in English and speaks back to you in English, and that English conversation at the broker's end - at the endpoint - is recorded. Financial institutions are one place where this is a requirement. At various companies where you work - if you're working for a defense company, say - it will say that your communications are open to the company, and they will have software on the device and so on to enable that; that's part of their security model. But that's different from everybody's devices being recorded in that way.
Sven Johann: If I have a cell phone, I don't want responsible encryption - irresponsible security. Actually, there was a case - was it the San Bernardino attack, or...?
Susan Landau: Right. There were two attackers... They attacked the San Bernardino Health Department - they killed (I believe) 14 people and injured many more; they fled, and were found by police a few hours later... They had destroyed their computers and their phones, but they'd left behind one locked iPhone, and there was an 18-minute gap where the police didn't know, based on cell phone data and so on, where the attackers had been, and they wanted to know if they'd communicated with anybody, so they wanted the phone opened. Now, the "communicated with anybody" seems a bit weird, because if they had communicated with anybody, it would have been picked up by the towers.
Susan Landau: But putting that aside, they asked Apple to undo the security protections on the phone. This is somewhat different from end-to-end encryption; it's about locked devices. They asked Apple to build software that would undo the protections, which included: you have ten tries to type a PIN into the phone, and if after the ten tries you can't open the phone, the phone destroys all the data. The other protection Apple put in is that each time you tried an incorrect PIN, it would increase the delay until the next time you could try. The FBI said, "Look, we can't get into the phone. Only Apple is able to get into the phone, by undoing the security protections - doing an update specific to the phone." Apple refused, and they went to court...
Sven Johann: So they refused because they thought it would harm their business model, or...?
Susan Landau: I'm going to be very careful here; this is my interpretation of why Apple refused - I'm not speaking for Apple in any way. What Apple said in court, and also in the congressional hearing in which I participated, is that the government doesn't have a legal right to request this. The government was relying on a law from the 1700's about companies providing assistance in these types of cases. Apple said what the FBI was requesting went well above and beyond the type of assistance envisioned.
Susan Landau: Apple also said - and I'm not a lawyer, so I'm not going to comment on the first part... But the second part, which is about security, I very much agree with, and I support Apple. In fact, they used some of my arguments in the briefs to the court - they said it creates a security problem. The FBI originally had been saying they had a single phone they wanted updated in this way; once they went to Congress, they said, "Well, we actually have another eleven or a dozen", and then the state of New York said that it had several hundred, and by now law enforcement has, I think, several thousand.
Susan Landau: Well, when you think about the process of doing those kinds of updates, the phones have to go to Apple, because Apple is not going to mail the software somewhere else...
Sven Johann: Hopefully not...
Susan Landau: Right. They have to be individualized for the phone... Not in terms of new software, but they have to be specific to a phone so that they don't get reused and sent somewhere else. So now you have a lawyer and an engineer looking at each phone as it comes in; the lawyer looks at the legal paperwork, the engineer makes sure it's the right phone and that it hasn't been tampered with in any way, and they apply this update.
Susan Landau: The problem is you've now introduced - when you have several thousand phones over a year and a half of collection - many more people into the process of doing an update... And we all know that in security, the more people you have in such a situation, the less likely it is that you can keep the update system secure.
Sven Johann: Again, something like the more complex it gets, the...
Susan Landau: Yes, but here you're introducing people, which is a completely new variable that's complicated.
Sven Johann: From a software perspective - you have to program it, and....
Susan Landau: Well, you already do updates individualized to a phone, and that's partially because they want to be sure that when they get a request for an update, they are updating the phone correctly, and not rolling back to a previous update.
Sven Johann: Yes, so it's also important that you don't automate it, in a sense...
Susan Landau: That's right, because you want to make sure you're doing it to the right phone when you're doing this insecurity update - it's not a security update, it's an insecurity update... So that's why you have the people involved, but the people create the risk. And now think about the risk that gets created in the update process. When I think about the best things we've done for security over the last decade or so, to me the most important one is automatic updates.
Sven Johann: The best one...?
Susan Landau: Yes, automatic updates.
Sven Johann: I find that totally interesting. I also think "Automatic updates - totally cool", but the security departments of companies I work for as a consultant say, "We don't want automatic updates."
Susan Landau: Because it breaks systems. On the other hand, their job should be to figure out how to deal with the update quickly, and maybe we need better communication from the providers of systems like Apple and Windows and so on - better, earlier communication that says, "This is what this update is going to do. We're not quite ready to roll it out, but here are the things you should be aware of, so that you can update your systems." But yes, automatic updates, especially for consumers, are critically important.
Susan Landau: If you get to the point where automatic updates are not trusted because maybe somebody has corrupted the Apple update system one time, or maybe people are worried that they're going to be snooped on, then we're moving very badly backwards in security at a time when we cannot afford to do so.
Sven Johann: Okay. Talking about end-to-end encryption, I was really surprised when I read in your book that a positive example of end-to-end encryption is WhatsApp. In Germany, WhatsApp and Facebook are viewed very critically; I know people who don't use WhatsApp because they say, "I don't trust it. I don't want to give my data to Facebook." Why do you make a positive case out of WhatsApp?
Susan Landau: Sure. So to my regret, I didn't use WhatsApp before Facebook bought it, because now in order to use WhatsApp, I have to register at Facebook, and I don't have a Facebook account.
Sven Johann: But I used WhatsApp for a long time, and I didn't register at Facebook; I don't use Facebook.
Susan Landau: That's right, but if you had WhatsApp before it was bought by Facebook, you didn't have to register with Facebook when you signed up. So, WhatsApp and Signal differ in one way. They use the same algorithm--
Sven Johann: Signal is a similar app to WhatsApp.
Susan Landau: Yes. They use the same algorithm, with one distinct difference. To communicate securely, you have to do a one-time, out-of-band communication, in which the two users, Alice and Bob, exchange a piece of information to assure themselves there's no man in the middle. If Bob gets a new SIM card, or if Bob gets a new phone, they have to do that one-time exchange of information again. When Alice sends a message to Bob and Bob has updated in some way, Signal requires that they do this one-time exchange before the message is received. WhatsApp doesn't. So that's less secure. That's the one difference. But WhatsApp has a billion users, and Signal doesn't. WhatsApp was optimizing for convenience, and for almost all their users, that's the right choice. They're providing end-to-end encryption in a convenient way, which really matters to their users. Signal is providing end-to-end encryption for users who care more about security than they care about convenience, so Signal made a different choice.
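That out-of-band check can be sketched like this. Real Signal safety numbers are computed differently, and the key values here are hypothetical; the toy only shows the idea that both sides derive the same short fingerprint from the session's public keys and compare it over another channel.

```python
# Toy sketch of an out-of-band verification fingerprint ("safety number"):
# both parties derive the same short number from the session's public keys
# and compare it in person, aloud, or via QR code. A mismatch suggests a
# man in the middle substituted keys. Not the real Signal computation.
import hashlib

def safety_number(alice_pub: bytes, bob_pub: bytes, digits: int = 12) -> str:
    material = b"".join(sorted([alice_pub, bob_pub]))   # order-independent
    digest = hashlib.sha256(material).digest()
    return str(int.from_bytes(digest[:8], "big"))[:digits]

# Hypothetical key values for illustration. If Bob reinstalls or changes
# phones, his key - and therefore the number - changes, forcing re-verification.
alice_key, bob_key = b"alice-public-key", b"bob-public-key"
print(safety_number(alice_key, bob_key) == safety_number(bob_key, alice_key))
```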
Susan Landau: So we now have different things you're asking about - there's the difference between Signal and WhatsApp, and there's the difference between WhatsApp and Facebook. One can communicate via WhatsApp and not share other information with Facebook, and then your communication is end-to-end encrypted. The only thing you've done, if you get a WhatsApp account now, is share a little bit of information with Facebook - perhaps they know how much you used the WhatsApp application - but aside from that, they don't have other data.
Sven Johann: So basically the content - there are no backdoors...
Susan Landau: That's right. It's the same algorithm as Signal, it's just that they made a choice about convenience, which for their user body is probably the appropriate choice.
Sven Johann: Okay. But still - Facebook, and attackers of all sorts, governments - they know the metadata of conversations... What is metadata?
Susan Landau: Sure. Communications metadata - if you go back to phones, the phones that sat on your grandmother's hallway table, those big, black phones with a real, physical bell inside - would say that a phone call occurred between this number and that number, at this time, and lasted this many minutes. Phones didn't move until sometime in the '90s. That was very useful for trying to establish, for example, that two criminals had talked to each other just before they went out and committed a robbery; or they talked to each other, and then you notice that one of them charged the nitric acid they bought for a bomb, and then they talked to each other and the other one went out and bought fuses for the bomb, and so on. That was very useful.
Susan Landau: As people began using cell phones and smartphones, the metadata became much richer, because now you knew not just that this phone line called that phone line, but that this individual called that individual; few people share cell phones. You might share it with your wife, I might share it with my husband when we're out hiking and I didn't bring my phone, but typically a cell phone is tied to an individual.
Sven Johann: And they know where you are...
Susan Landau: They know where you are, and so on... So it's very rich for evidential purposes. Previously, the two people on the phone knew they had talked to each other, but the phone company only knew which phone number talked to which phone number. Now we're in a completely different situation. I talk to you on the phone - I know I'm talking to you, and the phone company knows I'm talking to you, but the phone company also knows where I am and where you are, whereas I might tell you I'm in New York when actually I'm in Berlin and don't want to tell you, because I don't have time to see you, and I'm embarrassed, and all of that.
Susan Landau: So the phone company knows more about certain things than you and I do, and it's a real transfer of richness of information. But that evidentiary value of being able to track a person is really important for law enforcement and it's really important for national security, and very heavily used by both.
Sven Johann: You say it's important... Is there any risk to society or individuals?
Susan Landau: Of course.
Sven Johann: I mean, if they are not a suspect, because... I mean, I understand if there is a crime; I really like the idea that the police have lots of possibilities to access the data to narrow down...
Susan Landau: Sure - so what you want to do here is control the use of the data. Let me give you an example with a different kind of data, and then we can go back to phones for a second. We all use metro cards on the subway, or E-ZPass (or a European version) on toll roads, where we no longer pay the toll by hand - it's automatically deducted from whatever account we set up, by something that reads a device in your car. We all find that very convenient. It saves money, it's convenient, we use less gas because we don't have to slow down as we go through... But suppose a divorce lawyer proves that one member of the couple was always leaving work at four and stopping somewhere along the way at a particular time - simply by tracking the husband's route, let us say - and shows that in fact there was an affair going on. That changes what the wife might collect in alimony, or the wife may get to keep the children more of the time... All of those kinds of issues.
Susan Landau: So you want protection of that data. You want that data to be stored only for a certain amount of time. We never had that data before - is there a reason that data should be kept more than a month, more than three months, and so on? Under what circumstances should it be released?
Susan Landau: Now let's go back to communications, and I want to go to encrypted voice over IP. There's some nice work by Charles Wright and Fabian Monrose and--
Sven Johann: But do lawyers get easy access to that? In one case, I think, somebody came up at the Chaos Computer Club conference and said, "People say metadata doesn't matter, that it's only used when law enforcement asks for it... But maybe it gets into the wrong hands." An insurance company - which is seen as the wrong hands - gets the data that you called your doctor, or that you took an AIDS test: you just called that place, and then you called your wife, and then you called some...
Susan Landau: Hotline, yes.
Sven Johann: ...something, and they may think, "Okay, we'll cancel his contract for some made-up reason."
Susan Landau: Right. So that's again a case where you really want to control use - who gets to use that data, what the controls are on what an insurance company can do... If you go back to the United States, in the 1960's banks and other financial institutions began sharing data about people who were taking out loans, and all of a sudden you had this transfer of power from the individuals - who had a pretty good sense of whether or not they could pay the loan; maybe not a perfect sense, but a reasonable one - to the banks, which all of a sudden knew a lot more about you than you knew about yourself... So we got the Fair Credit Reporting Act, which said, "Alright, you can have these institutions that collect information about your financial situation - how many credit card accounts you have, how high their limits are, how quickly you pay, whether you're paying interest, and so on... You can have this data collection, but that data cannot be shared with a bank without your permission." That's the Fair Credit Reporting Act, and that's an example of putting controls on use - so what you're talking about here is controls on use.
Sven Johann: And if the bank requires me, or asks me, to hand over the data - I can probably sue them for that.
Susan Landau: Well, again, it depends on your country's laws. If your country's laws say the banks aren't allowed to ask that, or the health insurance company cannot discriminate against you on the basis of a genome test you had...
Sven Johann: Yeah, exactly.
Susan Landau: But now, if you go back to communications metadata - if you talk about IP communications, voice over IP - there's some nice work by Fabian Monrose, Charles Wright and a few others showing that encrypted voice over IP is revealing. Because we have a lot of redundancy in speech - it's just the way human beings are built - voice over IP compresses the speech, and if you look at the packet lengths in voice over IP, those are actually very revelatory. They can reveal whether it's a man or a woman speaking, they can reveal what language they're speaking, and in some cases they can reveal the actual words being said - simply from the packet lengths.
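A toy sketch of that packet-length side channel, with entirely made-up numbers, showing how an eavesdropper who sees only encrypted packet sizes could still guess at content:

```python
# Toy illustration of the packet-length side channel in encrypted VoIP:
# variable-bitrate compression makes packet sizes depend on what was said,
# so an observer who sees only lengths can still guess at content.
# The length profiles below are invented, purely to show the principle.
phrase_length_profiles = {
    "hello":          [52, 54, 53],
    "attack at dawn": [61, 75, 68, 71, 64],
}

def guess_phrase(observed: list[int]) -> str:
    # Pick the profile whose length sequence best matches the observation.
    def distance(profile: list[int]) -> float:
        if len(profile) != len(observed):
            return float("inf")
        return sum(abs(a - b) for a, b in zip(profile, observed))
    return min(phrase_length_profiles,
               key=lambda p: distance(phrase_length_profiles[p]))

# The eavesdropper never decrypts a byte, yet the lengths betray the phrase.
print(guess_phrase([60, 74, 69, 70, 65]))  # "attack at dawn"
```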
Susan Landau: Now, who would have thought that packet length is revelatory of content? So all of a sudden, in this world that's much richer with metadata, we have a real mess about how to handle that data. Do we count location, the way we counted location back when phones didn't move, as part of the communications metadata that law enforcement - at least in the United States - can get if it's relevant to an ongoing investigation? Or do we treat it differently, because it really reveals a lot about somebody? You notice that every day someone comes home at five from the factory, and one day they leave the factory at three, they go to a bar, they sit there till nine PM, they get home, and they don't go to the factory again - you know they've been fired.
Sven Johann: Yes, exactly.
Susan Landau: So do we treat location information as a different kind of information, because it's so much richer? We have to figure that out as societies, and we haven't gotten there yet.
Sven Johann: Okay, but how does society figure that out? Are there researchers who have to raise their hands, and activists --
Susan Landau: Absolutely, absolutely. For example, there's a Privacy Law Scholars Conference in the U.S., and I believe there's now one that happens in Holland, which was an outgrowth of the one in the United States. The scholars publish all these theoretical papers about what could happen if, and what could happen later, and you can certainly track certain of the privacy cases that happen in the U.S. We make law in two ways in the U.S. - Congress and state legislatures pass laws, but you also have reactions to the laws, or reactions to cases, happening in the courts, where the courts will say, "This is a violation of due process, this is a violation of rights, the law is too loose, or the police did not act within the law", and you get changes based on that. So sometimes the academic papers influence law, sometimes they influence court cases, and that then changes how the law is interpreted, or new laws are passed.
Sven Johann: Switching again a little bit - it's not actually a switch, but... We can secure ourselves, and there are laws which protect us - but are there attacks we cannot prevent?
Susan Landau: Of course, of course.
Sven Johann: I'm thinking of large-scale attacks, like Stuxnet or something like that.
Susan Landau: There are many kinds -- well, Stuxnet was not large-scale, it was very, very targeted.
Sven Johann: Yes, okay, okay... But nobody can prevent it.
Susan Landau: We could stop using software and hardware, and then you could prevent those kinds of attacks. Short of that, I don't think you can. The analogy that people in security like to use is that you put locks on doors to prevent the common criminal, the street criminal, from breaking into your house, and as long as your lock is at least as good as everybody else's on the street, or maybe a little bit better, you're in good shape.
Susan Landau: If you have a Picasso on the wall, then you don't have big plate-glass windows for everybody to see the Picasso, break the window, and go in and get it. If you practice reasonable security - you don't go to a Wi-Fi network at a cafe or the hotel lobby or the airport lounge and do important financial transactions, you actually look at where the mail came from, you don't open attachments at random - then you are reasonably safe.
Susan Landau: As a society, however, we've increasingly built upon fragile networks, fragile systems. Five years ago I would have said, "Well, we have to worry about a concerted attack against critical infrastructure." But an attack against critical infrastructure, like the attack against the Ukrainian power distribution companies - that's an attack where it might take a certain amount of time to establish attribution, but you can establish that attribution with a pretty high probability of knowing who did it, and then you can respond.
Susan Landau: In the case of the Ukrainian power distribution companies, it has all the hallmarks of a nation-state. There was a lot of practice and effort put into the attack to bring down three companies within a half hour of each other, and you can hold the attacker accountable in certain ways...
Susan Landau: What we've seen in the last couple of years is a real shift to going after the soft underbelly of civil society. In the case of North Korea, it was pretty weird to go after Sony Pictures Entertainment as a way of stopping them from showing a film that the North Korean government objected to.
Sven Johann: You mean the Sony production--
Susan Landau: Of The Interview.
Sven Johann: Yes, exactly. What was the movie like? I actually didn't see the movie.
Susan Landau: It got terrible reviews.
Sven Johann: What was the movie about?
Susan Landau: It mocked Kim Jong-un, the current leader of North Korea, and the North Koreans did not want the movie to be shown, and they threatened Sony Pictures Entertainment... Which for a while thought it wouldn't show the film.
Sven Johann: They threatened, or they actually did something...?
Susan Landau: Well, they did several things. First of all, they sent a threatening mail, which nobody paid attention to. Then they published e-mails, they published human resources information about individuals, they leaked some of the films that Sony Pictures Entertainment had on its internal networks, they destroyed various machines electronically... So they did a number of bad things, and then they said, "If you show the film in movie theaters despite all this, we will bomb the theaters." Now, they didn't say it as the North Korean government, they said it as a group, but it was clear who was behind this.
Susan Landau: At this point, the U.S. government and others pointed out that it's one thing to attack electronically over wires going under the Pacific or in other ways; it's another thing to move actual bombs into movie theaters, and that was not a serious threat. But that was one example of going after civil society, and I think President Obama - it's hard to know exactly his reasons for viewing it as a national security threat, but I think the idea that you can shut down speech in another nation by doing an electronic attack of this sort is a very disturbing one.
Susan Landau: The much more disturbing attacks on civil society are the ones we saw in France, against the Macron campaign, and in the United States, in our presidential election... In Germany there was the incident of the German-Russian teenager who ran away, and then there was all sorts of -- and I don't even want to give it the name "fake news."
Sven Johann: But the Russian teenager - it wasn't a made-up story.
Susan Landau: That's right - well, she ran away, but the rest of the story was completely made up: that she had been raped by immigrants, and so on. She did actually run away; that part, I believe, was real, but the rest of it was not... But I don't even want to call it fake news, I want to call it complete disinformation. It has nothing to do with news. Those types of attacks on civil society --
Sven Johann: So the Macron attack - somebody wanted to influence the election in France, possibly to get the right-wing party into power, to destabilize...
Susan Landau: The French government.
Sven Johann: The French government, and therefore--
Susan Landau: The E.U.
Sven Johann: And then the E.U. In Germany people were also worried that Russian attacks...
Susan Landau: Would do the same thing. And in fact, in the last couple of days the British government has said that there was a concerted effort, in the days just before the Brexit referendum two years ago, to get people who were likely to vote against staying in the European Union to actually vote that way... and that those ads were Russian influence.
Sven Johann: The same is going on in the U.S., right? It's not only the U.K.
Susan Landau: Absolutely. We don't vote on Brexit, but yes, the same type of effort. These attacks on civil society are very scary, and the question is how you react to them, what you do. There are computer security protections you want to put in place - increasing the security of accounts, keeping certain data offline, making certain communications ephemeral; there's often no reason to keep communications at all.
Susan Landau: We have a tendency, as a communicating species, to imagine that when I speak to you, what I say may influence you, but the words drift away, because they were just said between us. When we put them in an email, they don't drift away. The nasty thing I said to you about some third person - I figured you heard it and that was that; now you can prove that I said it, or somebody else can prove that I said it. We don't get that ephemerality back unless we use ephemeral communications, end-to-end encryption, and so on.
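A minimal sketch of the end-to-end encryption primitive Landau refers to, using the third-party PyNaCl library (an illustrative assumption - nothing in the conversation names a specific tool). Only the intended recipient's private key can decrypt; real messengers such as Signal layer forward secrecy and message expiry on top of this primitive:

```python
# pip install pynacl  - illustrative; any NaCl/libsodium binding works similarly
from nacl.public import PrivateKey, Box

# Each party generates a keypair; only the public halves are exchanged.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob: any server relaying this sees only ciphertext.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"said between us, and only us")

# Bob decrypts with his private key plus Alice's public key.
receiver_box = Box(bob_key, alice_key.public_key)
print(receiver_box.decrypt(ciphertext))  # b'said between us, and only us'
```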
Susan Landau: Beyond the security protections we might install, there's resiliency - how different civil society groups can recover once they've been attacked... Not the technical recovery, which is something we do know how to do, but the social recovery: how you establish "Yes, these were our communications, but some false communications were deliberately mixed in so that you can't tell which is which - we know which were which," or whatever it is you do... And Macron's campaign claimed to have done that: they seeded their campaign information with false documents, so that if there was such a theft, they could react against it. It's about establishing resiliency...
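One way to make the seeding tactic Landau describes verifiable after the fact is a honeytoken-style marker: each planted decoy carries a keyed tag that only the defenders can recompute, so after a dump they can prove which documents were theirs to begin with. A minimal sketch - the field names and the scheme are hypothetical, not what the Macron campaign actually did:

```python
import hashlib
import hmac
import secrets

KEY = secrets.token_bytes(32)  # kept offline by the defenders

def make_decoy(doc_id: str, body: str) -> dict:
    """Create a decoy document whose marker only the key holder can verify."""
    tag = hmac.new(KEY, doc_id.encode(), hashlib.sha256).hexdigest()[:16]
    return {"id": doc_id, "marker": tag, "body": body}

def is_our_decoy(doc: dict) -> bool:
    """After a leak, check whether a document was one of our plants."""
    expected = hmac.new(KEY, doc["id"].encode(), hashlib.sha256).hexdigest()[:16]
    return hmac.compare_digest(doc.get("marker", ""), expected)

decoy = make_decoy("memo-4711", "plausible-looking but fabricated memo")
print(is_our_decoy(decoy))  # True - and an attacker cannot forge this tag
```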
Susan Landau: Then there's the whole issue of how you deal with disinformation campaigns on social media, which make accountability very difficult to establish.
Sven Johann: Is it possible to do something against fake news?
Susan Landau: Oh, sure, it's possible. You could do lots of things about news feeds - things that would not be what Facebook wants to do, and that would involve much more human involvement. Whether you could do it legally in the United States, under the First Amendment, depends on how you view social media. There is regulation of speech in the United States on radio and television broadcasts, in the sense that political ads have to be labeled as political ads. There hasn't been similar regulation of social media. That part is easy to do. The more complicated part is what you do about communications that are political in nature but are not political ads - how do you label something as political speech, and what if it's not? That becomes a First Amendment issue. But we haven't even thought about Facebook or Google or other social media as falling within that regulatory framework, and it may be that that's what we need to think about, at least for pieces of what they offer.
Sven Johann: Since you worked for Google and Sun Microsystems - what can we as software developers learn in terms of security and privacy from Google, Sun and those firms?
Susan Landau: Sure. I'm really impressed by both Sun and Google in terms of security. I still remember a weekend when I was working for Sun: we became aware of a vulnerability, and I watched the conversation on our security-interest mailing list as it was getting patched... It still took a day or two after the weekend to roll the patch out, but I was watching people work all weekend, and we didn't have anything like a patch every fourth Thursday of the month, or whatever it might be... We patched when we became aware there was a vulnerability. I was and remain really impressed with Sun's security ethos.
Susan Landau: Google in 2010 was attacked by the Chinese, who took some software and did other things - I'm reporting this not as a Google employee, but as something I read in the press... And Google got very serious about security. It had been serious before, but it got much more security-serious, and you can see it in its security protections - whether it's Chrome, which is probably the most secure browser, or the fact that the company does red teaming internally.
Sven Johann: Why is Chrome probably the most secure browser? What are they doing to make it more secure?
Susan Landau: Now we're getting past my expertise level - beyond the fact that it does sandboxing in the right ways, protects against cross-site scripting, and so on. There are other aspects to using Chrome, because of course the data is shared with Google, and that's a different issue. But on the security side, Chrome appears to be the best.
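For context on the cross-site scripting mentioned above: XSS is an attack in which user-supplied text gets interpreted by the browser as live script. Browsers add their own defenses, but the classic server-side counterpart is escaping untrusted input before embedding it in HTML. A minimal sketch (render_comment is a hypothetical helper, not from the conversation):

```python
import html

def render_comment(user_input: str) -> str:
    """Escape user-supplied text so the browser treats it as text, not script."""
    return "<p>" + html.escape(user_input) + "</p>"

print(render_comment('<script>alert("xss")</script>'))
# -> <p>&lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;</p>
```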
Sven Johann: They also offer money to people to--
Susan Landau: Absolutely. The other part is that Google does a lot of red teaming.
Sven Johann: Red teaming...?
Susan Landau: Red teaming - using Google employees to attempt to attack Google systems to find holes in them, and then improve the protections. Google has Google Authenticator, which is a way of doing two-factor authentication. It's a piece of software you download on your device... I have two different pieces of software on my phone for two-factor authentication - Google Authenticator, which I use for a couple of sites, and Duo, which is from a company in Michigan, which I use for other sites.
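Google Authenticator implements the open TOTP standard (RFC 6238): phone and server share a secret, and each independently derives a short-lived code by HMAC-ing the current 30-second time step. A minimal sketch of that computation using only the Python standard library (the base32 secret below is a made-up example):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    """Compute an RFC 6238 time-based one-time password."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period            # current 30-second step
    msg = struct.pack(">Q", counter)                # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation, RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # both sides compute the same 6-digit code
```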
Susan Landau: On privacy, the issues are somewhat less clear. First of all, Sun was never in the business of collecting consumer information; it wasn't in that business space at all. Scott McNealy famously said, "You have zero privacy anyway. Get over it."
Sven Johann: Scott McNealy?
Susan Landau: Scott McNealy, the CEO of Sun.
Sven Johann: I thought it was Mark Zuckerberg who said that...
Susan Landau: No, it was Scott McNealy - but both his view of privacy and the company's view of privacy were in fact a great deal more nuanced... But Sun was in a different business than Google is. Google views good privacy as good security, and they certainly have a good security story. They do use some privacy-enhancing technologies to protect user data, including multi-party computation, which enables Google to learn certain information in the aggregate - certain very private information about users - while not learning about individuals.
Sven Johann: In the aggregate?
Susan Landau: Learning who is responding to certain types of ads, and what percentage of people are responding - but about a group of them, not about each individual. So aggregate means learning about a group, without learning about individuals. They also use forms of differential privacy to learn how users are responding to URLs and to search items. I think that's a great step in the right direction, but most of Google's response on privacy is really about security, rather than privacy per se.
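Differential privacy, which Landau mentions, works by adding calibrated random noise to aggregate statistics so that any one individual's presence barely changes the released value. A minimal sketch of the classic Laplace mechanism for a count query (illustrative only; Google's production systems, such as RAPPOR for Chrome telemetry, are more elaborate):

```python
import random

def dp_count(true_count: int, epsilon: float) -> float:
    """Release a count under epsilon-differential privacy.

    A count query has sensitivity 1 (one person changes it by at most 1),
    so Laplace noise with scale 1/epsilon suffices.
    """
    # The difference of two Exp(epsilon) draws is Laplace(0, 1/epsilon).
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# e.g. "how many users clicked this class of ad" - release the noisy value,
# never the raw one; smaller epsilon means stronger privacy, more noise
print(dp_count(12345, epsilon=0.5))
```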
Sven Johann: In terms of not learning about individual users - to me that sounds a little bit like anonymization. Is that true?
Susan Landau: Well, Google doesn't do anonymization in the sense that Google is trying to provide you with personalized service, and it's hard to anonymize if you're trying to provide personalized service. So I don't think Google is going in the direction of anonymization, although you certainly can use Google services without being a signed-in user.
Sven Johann: But can a Google employee - I'm not sure if you're allowed to answer the question, but is it possible for a Google employee to see individual...?
Susan Landau: In fact, I'm really allowed to answer that question. Absolutely not! The fastest way to get fired at Google is to try and snoop on an individual Google user, and you are out the door before you finish typing, essentially. So while I'm not allowed to comment on things I learned at Google, that particular thing I'm sure I'm allowed to comment on. That is a complete violation of Google policy.
Susan Landau: What I was going to say about anonymization is if you're trying to provide individualized service - and that's what Google is trying to provide - then you're not going to go down the path of anonymization. That's different from allowing a Google employee to snoop on what a user is doing. That's absolutely not permitted, at all, period. The end.
Sven Johann: The reason I'm asking - I work with insurance companies, health insurers, and this data is really sensitive, of course. As a software developer I need test data to work on (banking is the same, but let's take healthcare) - data that can be easily shared, where it's easy not to make mistakes, and for that reason it needs to be anonymized... But somehow that seems to be a very hard problem.
Susan Landau: An important thing to do is to frame the problem right at the beginning. "What kind of data am I going to be allowed to test on?" "What kind of data do you need to have?" "This is what I need." "Can we anonymize it in this way? Would that suffice for your program?" "No, because I have these constraints." So well before the program is architected, well before you're in final coding stages, well before you're ready to deliver to the company, you have conversations about "Here's what I want to be able to test on." "Well, we want to give you this kind of data. Will it work? This is sufficiently anonymized in these ways."
Susan Landau: It's just the same way we should think about security - from the beginning. We should think about privacy from the beginning, including in terms of what kind of test data I will be able to have.
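As a concrete starting point for the test-data conversation described above, here is a minimal pseudonymization sketch. The record fields are hypothetical, and note the caveat: salted hashing plus generalization only reduces risk; it is not formal anonymization, since quasi-identifiers (age plus zip code plus diagnosis, say) can still re-identify people:

```python
import hashlib
import os

SALT = os.urandom(16)  # per-dataset secret, never shipped with the test data

def pseudonymize(record: dict) -> dict:
    """Strip or transform direct identifiers before data leaves production."""
    out = dict(record)
    out["patient_id"] = hashlib.sha256(
        SALT + record["patient_id"].encode()).hexdigest()[:12]
    out.pop("name", None)                          # drop direct identifiers
    out["birth_year"] = record["birth_date"][:4]   # generalize date to year
    del out["birth_date"]
    return out

print(pseudonymize({"patient_id": "P-1001", "name": "Jane Doe",
                    "birth_date": "1980-06-15", "diagnosis": "J45"}))
```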
Sven Johann: Yes, when you create a software architecture, of course there are those security and privacy concerns you have to take into account - the same with reliability... You cannot build them in afterwards; you have to think about them up-front.
Susan Landau: That's right. But your question really makes me think about how important it is to teach software engineers, within a software engineering course, the kinds of questions they should be asking about the test data they will be receiving - and that they need to ask them early, so there's a good dialogue and they get to the point where they're testing against data that's reasonable, but also privacy-protective for users.
Sven Johann: And usually, we don't think about test data in the beginning, so...
Susan Landau: But if you're thinking about privacy, you have to.
Sven Johann: Okay. Thanks.
Susan Landau: Thanks very much.