If I used the characters A1 to encode some block of text, would that be steganography? A terrorist recently participated in an attack on the Inland Regional Center in San Bernardino, California, which left 14 victims dead. The terrorist was also killed when the police responded. They recovered his iPhone 5C, which may or may not contain information relevant to their investigation. The data on the phone is protected by encryption, so the FBI contacted Apple and requested their assistance in circumventing that security. Apple refused. The FBI then got a warrant from a federal judge ordering their assistance, and Apple refused again, opting instead to take the FBI to court. They also issued a PR statement to the press, which in Apple's case is kind of like nonchalantly unlocking the door on the T-Rex cage. Your move. In order to understand why Apple would rather fight a federal warrant in court than help the FBI unlock one stupid iPhone, it's useful to understand the details of how encryption works. If you've ever done anything that's supposed to be secure that involves a computer, then you've used encryption. ATMs, credit card terminals, websites that start with HTTPS, anything that takes a username or password: encryption is how they protect that information. All computer data is essentially a sequence of numbers. For example, these binary digits are exactly the same thing as this sequence of numbers, both of which code for exactly the same content in ASCII. Anyone can look at these numbers and know what content they represent. In order to protect sensitive content like our passwords and bank account information, we need to encode those numbers so they look like gibberish to anyone who might be listening, but not to the people who are supposed to have access to them.
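To make that text-is-just-numbers idea concrete, here's a quick sketch (not from the video) using Python's built-in ord() and chr():

```python
# Text, numbers, and binary digits are all the same data.
message = "Hi"
numbers = [ord(c) for c in message]         # ASCII code points
bits = [format(n, "08b") for n in numbers]  # the same values as binary digits

print(numbers)  # [72, 105]
print(bits)     # ['01001000', '01101001']

# Anyone who sees the numbers can map them straight back to the text:
print("".join(chr(n) for n in numbers))  # Hi
```

That last line is the whole problem: without encryption, reading the data is as easy as writing it.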
The most basic sort of encryption uses a set of rules, an algorithm, to add another number to that data, an algorithm which uses some variable that the authorized users know but everyone else doesn't. For example, to encrypt our sequence, I could pick an encryption key or password, like the word brain, convert that to a number, add that number to the original number, and presto: I have a new number that looks like total gibberish. Mission accomplished, I guess. Later, if I want to recover the original number from that gibberish, I just have to remember brain, subtract it from the gibberish number, and I have my original data back. Of course, that's not very secure in an age when computers can perform millions of operations a second to try to reverse that encoding. Modern encryption algorithms use all sorts of different techniques to further convolute that process so it's less obvious what's going on, but they still essentially rely on that principle. So encryption is why the FBI can't get any meaningful data off of this iPhone. Without the proper code plugged into this complex algorithm, any data that you might read off of it is just garbage. The actual FBI request is for Apple to write a patch which will disable some of the security features that make it hard to brute-force guess the password used for encryption. But interestingly, the encryption on the phone itself is not the encryption that Apple is worried about compromising. Apple software, like a lot of modern software, uses something called code signing. It would be relatively easy to slap an Apple logo onto some malicious code and then get users to install it thinking that it's legit. But code signing uses encryption to verify that certain software is authentically Apple's. When Apple releases an important software update, they wrap it in an encryption algorithm.
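A toy version of that add-a-key scheme might look like this in Python. It's purely illustrative, nothing like a real cipher such as AES, but it shows the encrypt-by-adding, decrypt-by-subtracting idea:

```python
# Toy additive cipher in the spirit of the "brain" example.
def text_to_int(s):
    # Pack the ASCII codes of a string into one big integer.
    return int.from_bytes(s.encode("ascii"), "big")

def int_to_text(n, length):
    return n.to_bytes(length, "big").decode("ascii")

plaintext = "HELLO"
key = text_to_int("BRAIN")                 # the shared secret, as a number

ciphertext = text_to_int(plaintext) + key  # encrypt: add the key
recovered = int_to_text(ciphertext - key, len(plaintext))  # decrypt: subtract it

print(ciphertext)  # looks like gibberish
print(recovered)   # HELLO
```

A modern computer could brute-force a five-letter key like this in well under a second, which is exactly the weakness the video points out.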
Instead of the basic encryption we were talking about before, where you just reverse the encoding process to decode something, this encryption is asymmetric. An asymmetric encryption algorithm uses a pair of two linked keys, neither one of which can be derived from the other. When one key encrypts, the other one is needed to decrypt. One of the keys is usually broadcast publicly. It's something that you can just leave out for other people to use. If someone wants to send you some secure information, instead of meeting in a dark alley somewhere to agree on a symmetric encryption password, they can just grab your public key, use it to encrypt the data, and then send it to you. Even if someone else knows your public key, they can't use it to reverse that process. The only way to decrypt that data is to take your own secret private key and then use it to run a decryption algorithm. That's a pretty ingenious way to keep information secure, but for code-signing verification, software companies kind of do the opposite of that process. They use their private key to encrypt the data. At first that seems stupid, because anyone could use their public key to decrypt it. But the thing is, if you can decrypt something using someone's public key, then you've just proven that only they could have sent it. After all, nobody else could have encrypted it using the other half of the pair. Software certificates also reference other useful information in the encryption process. Running a software certificate authentication program doesn't just vet that the software is from who it claims to be from. It also checks things like when it was published, the version number, and even the content of the software itself. If anything doesn't match up, either because it was corrupted in transit or maliciously modified, the certificate authentication will fail and the phone won't install it. The problem, from Apple's perspective, isn't that the FBI is asking for their help in decrypting a phone.
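Here's a minimal sketch of that sign-with-the-private-key idea, using textbook RSA with deliberately tiny primes. Real code signing uses keys thousands of bits long and signs a cryptographic hash of the software rather than the software itself, but the encrypt-with-private, verify-with-public shape is the same:

```python
# Textbook RSA signing demo with toy numbers (never use sizes like this for real).
p, q = 61, 53
n = p * q                   # 3233: the public modulus
phi = (p - 1) * (q - 1)     # 3120
e = 17                      # public exponent (part of the public key)
d = pow(e, -1, phi)         # private exponent: modular inverse, Python 3.8+

update = 1234               # pretend this number stands in for the software's hash

signature = pow(update, d, n)     # the vendor "encrypts" with the private key
verified = pow(signature, e, n)   # anyone can check with the public key

# Matching means only the private-key holder could have produced the signature:
print(verified == update)  # True
```

Tampering with either the update or the signature makes the check fail, which is the "corrupted in transit or maliciously modified" case described above.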
They've done that before. The problem is that they're being asked to create a program which disables some key security features on the phone and then sign that program so the phone will trust it and install it. This is the big deal. Apple is the only entity in the world that can generate code-signing certificates that say, no, really, this is from Apple, you can totally trust it, no questions asked. Apple devices will only install stuff that modifies important parts of the operating system with that sort of certificate attached. A piece of software designed to break the iPhone's security that had that Apple certificate on it would be like a universal passkey. There are conceivably some precautions that Apple might take to prevent it from being used elsewhere, but the act of creating a program which instructs the iPhone to implicitly trust harmful changes to its operating system, that isn't to be taken lightly. It might also set a bad legal precedent if a government agency could order a software company to create this sort of utility. After all, if the FBI goofed and this sort of thing made its way into the wrong hands, every single thing that Apple has ever produced might be compromised. But there is a looming threat to data encryption which has a lot of computer science people really nervous, and it might make this whole court drama thing moot. Most asymmetric encryption algorithms use a particular sort of mathematical function to generate key pairs called a trapdoor function: something that's very easy to calculate in one direction but almost impossible to calculate in reverse without a special piece of information, the trapdoor. A great example of this is prime factorization for very large numbers. A computer can take two large prime numbers and multiply them together to get an even bigger number almost instantly. A computer can also very easily find one prime factor if you give it the other one, the trapdoor value: just divide.
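The easy direction of that trapdoor takes a couple of lines. The primes below are tiny by cryptographic standards and purely illustrative:

```python
# The easy direction: multiply two primes, or divide when you hold the trapdoor.
p = 32416190071
q = 32416187567

n = p * q        # computed essentially instantly, even for much larger primes
print(n)

# With the trapdoor (one of the primes), recovering the other is one division:
print(n // p == q)  # True
```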
But if you were to give a computer a very large number and ask it to find its two prime factors without any help, it's going to take a while. How long? Well, for an average desktop computer, the sun would probably go out first. That's great for encryption. Knowing the two primes makes it easy for your computer to encrypt and decrypt data quickly while making it almost impossible for a random hacker to reverse-calculate the primes and break the encryption. That is, unless they have a quantum computer. In theory, quantum computers just happen to be really, really good at reversing the trapdoor algorithms usually used for encryption. A relatively small quantum computer would have no trouble factoring large numbers in a few seconds. Some algorithms are safer than others, and nobody's really figured out how to put together a practical quantum computer yet. But there are a lot of really smart people working on it, including people at Google and the NSA. As soon as someone manages to build one, it will render the vast majority of encryption and the systems we use for computer security pretty much useless. By calculating the trapdoor algorithm in reverse, a quantum computer could intercept secure online transactions and get credit card details, PIN codes, account numbers, whatever. It would also be relatively trivial to write malicious code that did just about anything, and then sign it and watch it install itself on every computer that it touched. This is why many security agencies and computer science research organizations have started looking into new encryption techniques, ones which include layers of calculations that quantum computers are also bad at. It's important for commerce, but also for the security of anything that has a computer in it. Because, yes, I don't want anyone using my Amazon account to buy themselves a new refrigerator, but also, I kind of have my ringtones set exactly the way that I like them.
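For contrast, here's a hedged sketch of the hard direction: naive trial division. Even a modest semiprime takes visibly more work to factor than it took to create, and the cost grows with the size of the smaller prime factor; this is the step a quantum computer running Shor's algorithm would sidestep entirely:

```python
import time

def factor(n):
    # Trial division over odd candidates; assumes n is a product of two odd primes.
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 2
    return None

p, q = 1000003, 1000033   # small primes; real RSA primes are hundreds of digits
n = p * q

start = time.perf_counter()
print(factor(n))          # (1000003, 1000033)
print(f"{time.perf_counter() - start:.4f} s by trial division")
```

Roughly half a million divisions for a 13-digit number is nothing, but each extra digit on the primes multiplies the work, which is why "the sun would probably go out first" is fair for cryptographic sizes.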
Are you using a different password for every single one of your online accounts? Please leave a message below and let me know what you think. Thank you very much for watching. Don't forget to blah, blah, subscribe, blah, share, and don't stop thunking.