“I work on technical aspects of privacy,” says George Danezis, a reader in security and privacy engineering at UCL and part of the Academic Centre of Excellence in Cyber Security Research (ACE-CSR). There are, of course, many other limitations: regulatory, policy, economic. But, he says, “Technology is the enabler for everything else – though you need everything else for it to be useful.” Danezis believes providing privacy at the technology level is particularly important as it seems clear that both regulation and the “moralising” approach (telling people the things they shouldn’t do) have failed.
There are many reasons why someone gets interested in researching technical solutions to intractable problems. Sometimes the motivation is to eliminate a personal frustration; other times it’s simply a fascination with the technology itself. For Danezis, it began with other people.
“I discovered that a lot of the people around me could not use technology out of the box to do things personally or collectively.” For example, he saw NGOs defending human rights worry about sending an email or chatting online, particularly in countries hostile to their work. A second motivation had to do with timing: when he began work it wasn’t yet clear that the Internet would develop into a medium anyone could use freely to publish stories. That particular fear has abated, but other issues such as the need for anonymous communications and private data sharing are still with us.
“Without anonymity we can’t offer strong privacy,” he says.
Unlike many researchers, Danezis did not really grow up with computers. He spent his childhood in Greece and Belgium, and until he got Internet access at 16, “I had access only to the programming books I could find in an average Belgian bookshop. There wasn’t a BBC Micro in every school and it was difficult to find information. I had one teacher who taught me how to program in Logo, and no way of finding more information easily.” Then he arrived at Cambridge in 1997, and “discovered thousands of people who knew how to do crazy stuff with computers.”
Danezis’ key research question is, “What functionality can we achieve while still attaining a degree of hard privacy?” And the corollary: at what cost in complexity of engineering? “We can’t just say, let’s recreate the whole computer environment,” he says. “We need to evolve efficiently out of today’s situation.”
One of Danezis’ first papers was “Towards an Information Theoretic Metric for Anonymity” (PDF), which won an award at the PET Symposium in 2003. “This was the first paper that said we had better start finding ways of measuring anonymity to make sure we get better quality. What doesn’t get measured doesn’t get done,” he says. The research involved analysing anonymising systems to understand their properties, with the goal of establishing methods by which their performance could be measured and compared over time.
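The core idea of that metric is simple enough to sketch: treat the attacker’s uncertainty about who sent a message as the Shannon entropy of a probability distribution over possible senders, so that anonymity can be expressed as an “effective anonymity set size”. The function and example distributions below are an illustrative sketch of that idea, not code from the paper:

```python
import math

def anonymity_entropy(probabilities):
    """Shannon entropy (in bits) of the attacker's probability
    distribution over possible senders; higher means more anonymity."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A uniform distribution over 8 possible senders gives log2(8) = 3 bits:
# an effective anonymity set of 2**3 = 8 users.
uniform = [1 / 8] * 8
print(anonymity_entropy(uniform))  # 3.0

# A skewed distribution leaks information: the same 8 users now provide
# an effective anonymity set of only 2**entropy users.
skewed = [0.5, 0.3] + [0.2 / 6] * 6
h = anonymity_entropy(skewed)
print(h, 2 ** h)
```

The appeal of an entropy metric over a simple head count is visible in the second example: the anonymity set still contains eight users, but because the attacker considers two of them far more likely, the system offers much less protection than the count suggests.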
Danezis moved on to study the premier anonymising system, Tor, publishing the results with Steven J. Murdoch in the 2005 paper “Low-Cost Traffic Analysis of Tor” (PDF). “We were the first to look at how to break Tor given its threat model.” The significance was not just that the paper exposed Tor’s limitations; it also meant that Tor users understood when they were safe and when they weren’t. Accordingly, when it came out this summer that the NSA had targeted Tor, the negative impact was minimal: “The open community was at least as mature as the classified community, so it wasn’t a surprise, and prudent security designers understood exactly what they were getting.”
The core problem, as Danezis has said, is that users do not go online in order to protect their privacy. They go online to share things, to collaborate, to “build together”; threats to privacy scare them away. In the 2009 paper “SybilInfer: Detecting Sybil Nodes Using Social Networks” (PDF), Danezis, working with Prateek Mittal, studied social networks to find ways to distinguish honest users from “Sybils” – that is, nodes controlled by an adversary.
“It’s a big problem with peer-to-peer networks,” he says. “How do you defend against 1 million fake nodes introduced to subvert them?” Earlier solutions were resource-intensive, particularly when faced with a large adversary that owns a botnet. SybilInfer uses a probabilistic model of social networks and Bayesian inference to use the social graph as a resource. For example, fake nodes tend not to be well connected except to each other.
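The full Bayesian machinery of SybilInfer is beyond a short example, but the structural cue it exploits can be sketched: a Sybil region tends to connect to the honest graph through disproportionately few “attack edges”, so the cut around it is sparse relative to its size. The toy graph and conductance measure below illustrate that intuition only; the node names and numbers are invented, not taken from the paper:

```python
def conductance(graph, region):
    """Edges leaving `region` divided by the total degree inside it;
    low values indicate a region only weakly attached to the rest."""
    region = set(region)
    cut = sum(1 for u in region for v in graph[u] if v not in region)
    volume = sum(len(graph[u]) for u in region)
    return cut / volume if volume else 0.0

# Honest users (a-d) are well interconnected; Sybils (x-z) link mostly
# to each other and reach the honest region via a single attack edge.
graph = {
    "a": ["b", "c", "d"], "b": ["a", "c", "d"],
    "c": ["a", "b", "d"], "d": ["a", "b", "c", "x"],
    "x": ["y", "z", "d"], "y": ["x", "z"], "z": ["x", "y"],
}

print(conductance(graph, ["x", "y", "z"]))   # low: candidate Sybil region
print(conductance(graph, ["a", "b", "c"]))   # higher: honest subset
```

In the real system this cue is folded into a probabilistic model: random walks mix quickly within the honest region but escape a Sybil region slowly, and Bayesian inference over walk traces assigns each node a probability of being honest.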
“Privacy isn’t all about hiding things,” says Danezis. “It’s also about being able to have big networks where people can collaborate but are not subject to being subverted by big entities. To me, this is also privacy, though it’s a little controversial because there’s no secret there.”
Most recently, Danezis has been building on the work done for his 2011 paper with Alfredo Rial, “Privacy-Preserving Smart Metering” (PDF).
The security controls on smart meters are aimed at ensuring that customers can’t steal electricity. Current business models expect all of a nation’s consumption data to be delivered automatically to the utility companies, from readings taken as frequently as every half hour. Danezis and Rial propose a system that allows for fine-grained billing without revealing a detailed portrait of every household’s energy usage. One option is to encrypt all the readings, ensure that no one has the decryption key, and allow the meter to provide only specific authorised statistics. A second is similar, but allows the authorities to unlock the results to audit the meters.
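The billing trick rests on homomorphic commitments: the meter commits to each reading, the household reveals only the total bill, and the utility checks that bill against a tariff-weighted product of the commitments without ever seeing the individual readings. The sketch below illustrates that idea with Pedersen-style commitments; the group parameters are tiny and insecure, and the tariffs and readings are invented, so this is an illustration of the principle rather than the scheme from the Danezis–Rial paper:

```python
import secrets

P = 2**127 - 1   # toy prime modulus -- far too small for real use
G, H = 3, 7      # toy generators; in practice log_G(H) must be unknown

def commit(value, blinding):
    """Pedersen-style commitment: G^value * H^blinding mod P."""
    return (pow(G, value, P) * pow(H, blinding, P)) % P

# The meter commits to each half-hourly reading (kWh per tariff period).
readings  = [3, 1, 4, 2]
tariffs   = [1, 2, 2, 1]    # price per kWh in each period
blindings = [secrets.randbelow(P - 1) for _ in readings]
commitments = [commit(r, b) for r, b in zip(readings, blindings)]

# The household computes its bill and the matching blinding factor...
bill = sum(r * t for r, t in zip(readings, tariffs))
bill_blinding = sum(b * t for b, t in zip(blindings, tariffs)) % (P - 1)

# ...and the utility verifies the bill against the tariff-weighted
# product of commitments, never seeing any individual reading.
check = 1
for c, t in zip(commitments, tariffs):
    check = (check * pow(c, t, P)) % P
assert check == commit(bill, bill_blinding)
print("bill verified:", bill)
```

The verification works because the commitments are additively homomorphic in the exponent: raising each commitment to its tariff and multiplying them yields a commitment to the tariff-weighted sum of readings, which is exactly the bill.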
In further work, Danezis would like to build a stack of services that would allow users to access Facebook, instant messaging, and status updates but also provide strong privacy underneath. Often, the reason users do not make good security and privacy decisions is that the systems are not designed to support them – and often software engineers don’t make it easy for users to protect privacy because they themselves find such technologies hard to design and deploy. What’s required is a toolkit: high-quality building blocks that simplify creating complex distributed systems; languages that can express computations on private data; and infrastructures and services on which secure multi-party computation and anonymous communications can be built.