This is a repost of something I wrote in November 2015, following the terrorist attacks in Paris. It was subsequently translated into French and published as an op-ed in Le Monde.
In light of the recent terrorist attacks, things are getting heated for the regular security and encryption software developer. Being one myself, I’ve been on the receiving end of a small avalanche of requests from journalists, political pundits and even law enforcement. I’m also someone who was born and raised in Beirut and who recently immigrated to Paris, both of which were the sites of twin attacks, one day apart from each other.
It seems necessary to share some perspective on what’s going on with encryption software, the terrorists supposedly using it, and what this means for the rights and the security of our global communities.
The encryption software community writes a large variety of software, from secure instant messaging to flight tower communication management to satellite collision prevention. We do this for a number of reasons, but there’s always an underlying shared understanding: that we’re using mathematics and engineering to contribute towards a society that’s safer, more capable and able to communicate with a sense of privacy and dignity inherent to all modern societies.
The premise driving the people writing encryption software is not exactly that we’re giving people new rights or taking some away: it’s the hope that we can enforce existing rights using algorithms that guarantee your right to free speech and to a reasonable expectation of privacy in your daily life. When you make a credit card payment or log into Facebook, you’re using the same fundamental encryption that, on another continent, an activist could be using to organize a protest against a failed regime.
In a way, we’re implementing a fundamental technological advancement not dissimilar from the invention of cars or airplanes. Ford and Toyota build automobiles so that the entire world can have access to faster transportation and a better quality of life. If a terrorist is suspected of using a Toyota as a car bomb, it’s not reasonable to expect Toyota to start screening who it sells cars to, or to stop selling cars altogether.
And yet, this is the line of questioning that has besieged the cryptography community immediately after the Paris attacks. A simple mention of my encryption software in an Arabic-speaking forum is enough to put me on the receiving end of press inquiries such as “are you aware of any terrorists using your software? Do you feel it’s your responsibility to monitor terrorist activity?” Or, more bluntly, do I feel like I’m complicit in aiding terrorists, by the simple fact that I write cryptography software or currently do PhD research in applied cryptography? The brouhaha that has ensued from the press has been extreme. I’ve received calls that bluntly want to interview me regarding “technology used by terrorists, such as yours.” A Wired article, like many alongside it, finds an Arabic PDF guide on encryption and immediately attributes it as an “ISIS encryption training manual” even though it was written years ago by Gaza activists with no affiliation to any jihadist group.
In this rush to blame a field that is largely unknowable to the public, and therefore at once alluring and terrifying, little attention has been paid to facts: The Paris terrorists did not use encryption, but coordinated over SMS, one of the easiest methods of digital communication to monitor. They still were not caught, indicating a failure in human intelligence, not in the capacity for digital surveillance.
But even in light of all the evidence pointing towards a human intelligence failure, cryptography, being to the outsider a scary and mysterious use of secret codes and complicated algorithms, remains an easy target. The press again drives the discussion, each time giving less priority to measured questioning and proper investigation. Why haven’t you inserted back doors into your software? Do you want terrorists to use your tools? The call for backdoors is nothing new. During my career in the private sector, I’ve been asked to backdoor encryption software so as to please potential investors, and have seen colleagues who appeared to stand for secure software cave under the excuse of “if that’s what the customer wants,” even if it results in irreparable security weaknesses. I’ve had intelligence officers ask me informally, out of honest curiosity, why it is that I would refuse to insert backdoors.
The issue is that cryptography depends on a set of mathematical relationships that cannot be subverted selectively. They either hold completely or not at all. It’s not something that we’re not smart enough to do; it’s something that’s mathematically impossible to do. I cannot backdoor software specifically to spy on jihadists without this backdoor applying to every single member of society relying on my software.
And I’ve seen what guarantees secure communication can give a society. I’ve seen my software used in Hong Kong to organize protests against a government otherwise unwilling to give people their rights. I’ve seen my colleagues produce software used by Egyptians rallying for democracy. I’ve had childhood friends call me from Beirut, desperate to know of a way to organize protests against a government that would lock them up were they to use public phone lines. I’ve set up communication lines for LGBTQ organizations so that they can give counsel without fearing ostracization or reprisal. And in the comfort of my new life in France, I’ve also relied on encryption so that I know I’m obtaining my simple right to privacy when discussing my daily life with my friends or with my partner.
I’ve come to see encryption as the natural extension a computer scientist can give a democracy. A permeation of the simple assurance that you can carry out your life freely and privately, as enshrined in the constitutions and charters of France, Lebanon and the United States. To take away these guarantees doesn’t work. It doesn’t produce better intelligence; it is not the reason our intelligence is failing in the first place. But it does help terrorist groups destroy the moral character of our politics from within, when, out of fear, we forsake our principles. If we took every car off the street, every iPhone out of people’s pockets and every single plane out of the sky, it wouldn’t do anything to stop terrorism. Terrorism isn’t about means, but about ends. It’s not about the technology but about the anger, the ignorance that holds a firm grip over the actor’s mind.
I grew up and spent a decade of my childhood in south Beirut and was literally neighbors with the security sector of Hezbollah, a guerilla organization that fights frequent wars with Israel. During the 2006 war, an Israeli fighter jet carpet-bombed the entire neighborhood, razing my home and that of many others to the ground. While walking through a field of rubble and unexploded cluster bombs to try and find my house, I distantly saw a friend of mine, far away on the other side of whatever it was that I was staring across. We locked eyes. Then, we burst out laughing. We laughed for a long time.
In 2008, I got the opportunity to move away from Lebanon and to get an education abroad. This opportunity was rare and unusual. Making encryption software is hard, too: for many of my first years abroad, much of my software was riddled with bugs, and it took practice and feedback to start getting things right. That, in the end, is what my education outside Lebanon was really about.
Visiting south Beirut a few years later, I found that I had changed but that no one else there had. The rubble was mostly but not completely gone. I also found that people were angry, and that Hezbollah had pledged to rebuild their homes. Left without any hope for a good education, for a happy life, with much of their families missing, with their friends dead, many pledged themselves in return.
That’s what’s causing terrorism, not encryption software.