The recent Heartbleed debacle made many people aware of the significant role that network security plays in our lives. Due to a bug in OpenSSL, a standard library of cryptographic methods used in the SSL and TLS communication protocols, countless servers were left vulnerable to attackers who could anonymously read arbitrary blocks of memory. A slight error in the library’s code had jeopardized the security of databases containing millions of users’ private information, including emails, passwords, and credit card numbers. In the aftermath of Heartbleed, it came to the public’s attention that OpenSSL, a foundation of internet security, was maintained by a significantly underfunded team of volunteer developers. In a world where a digital signature is rapidly coming to hold greater value than a written one, many argued that a new foundation for the world’s online matters should be erected to provide secure communications. Since then, the OpenSSL project has received more funding, allowing its developers to better maintain the library and dedicate more time to patching bugs. However, in light of this new mindset dedicated to improving secure communication across the internet, one might look back upon the foundation of all network security: cryptology. Astonishingly, the field itself rests upon mere conjectures which have yet to be proven. As it stands now, a sudden breakthrough in mathematics or the sciences could be the key to every lock on the internet, effectively rendering all communications insecure.
Cryptology can be divided into two distinct branches: cryptography and cryptanalysis. Cryptography is the study of the encryption and transfer of data across an insecure network, while cryptanalysis is the study of breaking encryption schemes. Effectively, cryptanalysis is concerned with undermining the work of a cryptosystem. By demonstrating flaws in a cryptosystem’s design, it can be patched and retested until it is deemed “unbreakable”. However, a truly unbreakable cryptosystem that is feasible to implement for internet communications does not exist in the setting of classical computation. In cryptography, a cryptosystem is said to have perfect security if the ciphertext resulting from an encryption reveals nothing about the original message. One cryptosystem which has perfect security is the One-Time Pad (OTP). The OTP is an encryption algorithm which takes a message and a randomly selected key of the same length, and returns the XOR (bitwise addition modulo 2) of the two values. Taking the XOR of the resulting ciphertext and the key returns the original message. Without the key, it is impossible to recover the original message. In fact, for every message of the same length as the original, there exists a key which will produce that message when XOR’d with the ciphertext. Therefore, no amount of computational power would allow an adversary to break the OTP. However, the OTP is flawed in several respects. To start, it relies upon both parties knowing the secret key; if this is not the case, the OTP becomes useless. Since the message and the key are of the same length, if one party had a method of securely sending the key to the other, it could have used that same method to send the original message. Also, as the name suggests, the OTP can only be used reliably one time.
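The mechanics described above are simple enough to sketch in a few lines of Python. This is an illustrative toy, not production cryptography; the function name `otp_encrypt` is chosen for this example.

```python
import secrets

def otp_encrypt(data: bytes, key: bytes) -> bytes:
    """XOR each byte of the data with the corresponding byte of the key."""
    assert len(key) == len(data), "the key must be exactly as long as the message"
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))  # random key, same length as the message
cipher = otp_encrypt(message, key)

# XOR is its own inverse, so decryption is the same operation with the same key.
recovered = otp_encrypt(cipher, key)
```

Note that without `key`, `cipher` is consistent with every possible fourteen-byte plaintext, which is precisely the perfect-security property discussed above.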
By reusing the same key for multiple encryptions, a cryptanalytic attack becomes possible, in particular by taking the XOR of pairs of intercepted messages. So, in classical cryptology, the OTP is impractical. Unfortunately, any cryptosystem with perfect security has been proven to have key requirements similar to the OTP’s, and thus succumbs to the same problems. Since a practical perfectly secure cryptosystem appears unattainable, cryptosystems have been developed which are inherently breakable, but are believed to be sufficiently complex as to preclude cryptanalysis. One such cryptosystem is RSA. In RSA, one party generates a pair of large (typically hundreds of digits) primes and takes their product. An exponent coprime to the totient of the product is then chosen, and it is published along with the product as the public key. Recovering the plaintext is relatively painless if the two original primes are known. Without knowledge of the primes, however, it becomes a far more computation-intensive problem. In essence, RSA is founded upon the fact that, despite centuries of extensive research, no efficient algorithm is known for finding the factors of a number. In the context of cryptology, an efficient algorithm runs in polynomial time with respect to the size of the input. The most efficient classical factoring algorithms to date run in near-exponential time; a polynomial-time factoring algorithm could finish within minutes on a number that would take those algorithms years. It is conjectured that such an algorithm may not exist within the realm of classical computation, so factoring is considered intractable. However, in a new, nonstandard setting, a polynomial-time solution already exists. That setting is quantum computing.
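The RSA scheme described above can be sketched with deliberately tiny primes; the specific values here (p = 61, q = 53, e = 17) are standard textbook-sized examples, and real deployments use primes hundreds of digits long together with padding schemes omitted here.

```python
p, q = 61, 53                 # the two secret primes
n = p * q                     # public modulus
phi = (p - 1) * (q - 1)       # Euler's totient of n, known only to the key owner
e = 17                        # public exponent, chosen coprime to phi
d = pow(e, -1, phi)           # private exponent: the inverse of e modulo phi

m = 65                        # plaintext, encoded as a number less than n
c = pow(m, e, n)              # encryption: c = m^e mod n
decrypted = pow(c, d, n)      # decryption: m = c^d mod n
```

Anyone who learns p and q can recompute `phi` and hence `d`, which is why RSA’s security rests entirely on the difficulty of factoring `n`.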
The application of the fundamental principles of quantum mechanics to traditional computing theory is known as quantum computing. In a classical Turing machine, a single process runs at any given time and performs actions upon data in memory, which is stored in the form of a binary string. Any given point in memory is known as a bit, and is either in the state 0 or the state 1. In a quantum Turing machine, however, the running head (that which reads and writes values in memory) is able to take advantage of superposition, in which a particle exists in multiple distinct states simultaneously. As opposed to the classical Turing machine, the running head of the quantum Turing machine can access and write to multiple blocks of memory simultaneously. Additionally, the memory of a quantum Turing machine is much more efficient, as each quantum bit, or qubit, occupies a state of the form a|0⟩ + b|1⟩, where |a|² and |b|² represent the respective probabilities of measuring a |0⟩ or a |1⟩. Using this notation, each qubit can be represented as a point on the Bloch sphere, allowing algorithms to be illustrated as rotations of a point about the center of the sphere. Additionally, given that there are infinitely many distinct states a qubit can occupy on the surface of the sphere, multiple values can be encoded into a single qubit. Ultimately, this means that a quantum computer requires far fewer qubits than the number of bits a typical computer requires. In particular, it has been estimated that a quantum computer containing thirty qubits would be capable of operating on par with some of the most sophisticated supercomputers to date. However, many problems which have yet to be resolved in academia currently inhibit the development of such a computer. One such inhibitor is the lack of a sufficient language in which such a computer could be programmed.
In order for quantum computing to gain commercial significance, it must be capable of being programmed. In particular, such a language must contain fundamental algorithms capable of performing simple operations, such as reading and writing to memory, while preserving the state of the computer. However, due to the unstable nature of quantum systems, preserving the machine’s state is a very complex problem. For instance, measuring the value of a qubit collapses it to either |0⟩ or |1⟩. In essence, each qubit would become a standard bit, resulting in the loss of any advantage the quantum computer had over a classical computer. Currently, research is being conducted to determine whether an algorithm capable of indirectly reading the state of a qubit exists. A promising route which researchers are currently exploring is that of entanglement, in which each of two particles mirrors the state of the other, even over a great distance. It is believed that by entangling a qubit with another particle, it would be possible to read its state without disturbing it. Additionally, researchers have yet to create a model of the qubit which scales naturally across great orders of magnitude. For instance, in order for quantum computers to gain a significant role in cloud computing, machines capable of operating on millions of qubits must be developed to provide quantum information processing to a large base of users. No quantum computer to date has contained even one hundred qubits. Ultimately, though, as quantum computers begin to be developed, they will effectively deprecate the majority of existing cryptosystems through their superior computational power. For instance, Shor’s algorithm is a method by which a quantum computer could factor numbers in polynomial time, undermining the security of RSA.
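The connection between Shor’s algorithm and RSA can be illustrated with its classical reduction: factoring N reduces to finding the period r of the sequence a^x mod N. The sketch below finds the period by brute force, which takes exponential time; the quantum portion of Shor’s algorithm is precisely what finds r in polynomial time instead. The function name and the tiny values N = 15, a = 7 are illustrative choices.

```python
from math import gcd

def find_period(a: int, N: int) -> int:
    """Brute-force search for the smallest r with a^r = 1 (mod N)."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7                            # toy example; a must be coprime to N
r = find_period(a, N)                   # 7^4 = 2401 = 1 (mod 15), so r = 4

# When r is even, gcd(a^(r/2) - 1, N) yields a nontrivial factor of N.
factor = gcd(pow(a, r // 2) - 1, N)
```

For a realistically sized N, this loop would run for longer than the age of the universe, which is exactly the gap that a quantum computer closes.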
However, all is not lost, as quantum computing also lends itself to a new branch of cryptology: quantum cryptology.
Quantum cryptology is the quantum analog of classical cryptology, incorporating various principles of quantum mechanics to develop more secure cryptosystems. Interestingly, while the OTP is impractical in classical cryptology, it gains great significance in quantum cryptology through the use of quantum key distribution (QKD). QKD is a process by which two parties can construct a secret key while communicating across an insecure network. The first party, Alice, entangles pairs of photons and transmits them to her own detector and to the second party, Bob. The polarization of the photons is random, yet entangled pairs of photons will be polarized along the same axis. Alice and Bob assign to each photon they detect a value of 0 or 1, depending upon the angle of its polarization. Alice then encodes her values as a quantum state and transmits them to Bob across a quantum channel. Bob compares the states recorded by Alice to his own, assigning a value of N to any pairs which do not match. Of the remaining states which did match, he assigns a value of N to half of them, chosen at random. All remaining recordings are then assigned a value of Y, and the results are transmitted to Alice across a public channel. Now, Alice and Bob can construct a key using only the values to which Bob assigned a Y, and then communicate securely through the use of the OTP. A potential adversary, Eve, would be incapable of constructing their key. If Eve intercepted the photons sent from Alice to Bob, Bob would immediately notice that his and Alice’s key generation rates differed, indicating Eve’s presence. Eve would also be incapable of copying the exact state of the photons she had read; any photons which she could transmit to Bob would have a random polarization [2,7].
So, each photon which Eve retransmits would be incorrect fifty percent of the time, an error rate great enough to reveal her presence; ergo, the QKD method is perfectly secure. However, no scalable method currently exists with which one could perform QKD, although research suggests that optical networking could be the solution to providing QKD to multiple parties simultaneously.
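The eavesdropper-detection argument above can be checked with a simplified classical simulation. This model is an assumption-laden sketch, not a faithful rendering of any one protocol: perfectly correlated pairs stand in for entangled photons, and an intercepting Eve who measures in a randomly chosen basis randomizes Bob’s result whenever her basis is wrong.

```python
import random

def run_qkd(n_pairs: int, eavesdrop: bool) -> float:
    """Return the error rate Alice and Bob observe on their sifted key."""
    errors = kept = 0
    for _ in range(n_pairs):
        basis_a = random.randint(0, 1)        # Alice's random measurement basis
        basis_b = random.randint(0, 1)        # Bob's random measurement basis
        bit_a = random.randint(0, 1)          # outcome of the correlated pair
        bit_b = bit_a                         # same basis would give the same bit
        if eavesdrop and random.randint(0, 1) != basis_b:
            bit_b = random.randint(0, 1)      # wrong-basis interception randomizes Bob's bit
        if basis_a == basis_b:                # sifting: keep only matching bases
            kept += 1
            errors += bit_a != bit_b
    return errors / kept

random.seed(0)
clean_rate = run_qkd(20_000, eavesdrop=False)  # no Eve: errors never occur
noisy_rate = run_qkd(20_000, eavesdrop=True)   # Eve present: roughly a quarter of sifted bits flip
```

In this model Eve picks the wrong basis half the time, and a wrong-basis measurement flips the sifted bit half the time, so the observed error rate settles near twenty-five percent, which is easily detected by comparing a sample of the key.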
As it stands, quantum computers are nowhere near ready for commercial application. However, their significance to the world should not be overlooked. The advent of quantum computing will usher in an age of perfectly secure communications through QKD and the OTP, while providing superior computational power to the masses. Additionally, improved processing power would ultimately allow for vastly more complex simulations of natural systems, advancing the progress of other fields, including biochemistry and physics. In order to further the development of quantum computing, more attention must be brought to the subject so that the researchers working on this bleeding-edge technology may be better funded, potentially allowing quantum computing to become universally available within the next few decades.
- Aaronson, S., & Bacon, D. (2008). Quantum Computing and the Ultimate Limits of Computation: The Case for a National Investment. Computing Community Consortium.
- Bartlett, B. C. (2004, June 30). Securing Key Distribution with Quantum Cryptography. SANS Institute.
- Cecka, C. (2005). Review of Quantum Computing Research. Retrieved from http://crisco.seas.harvard.edu/resources/Papers/QuantumComputingResearchSummary.pdf
- Chapuran, T. E., Toliver, P., Peters, N. A., Jackel, J., Goodman, M. S., Runser, R. J. & Dardy, H. (2009). Optical networking for quantum key distribution and quantum communications. New Journal of Physics, 11(10), 105001.
- Devitt, S. J., Munro, W. J., & Nemoto, K. (2008). High performance quantum computing. arXiv preprint arXiv:0810.2444.
- Garg, P., & Dilawari, J. S. (2012). A Review Paper on Cryptography and Significance of Key Length. International Journal of Computer Science and Communication Engineering.
- Hughes, R. J., et al. (1994). Quantum Cryptography. Retrieved from http://arxiv.org/pdf/quant-ph/9504002.pdf
- Kaliski, B. (2006). The Mathematics of the RSA Public-Key Cryptosystem. RSA Laboratories.
- McConkey, T. (2014, August 27). A leap into quantum computing [Web log post]. Retrieved from http://www.edn.com/electronics-blogs/quantum-leap/4433900/A-leap-into-quantum-computing
- Quantum Computing 101 (n.d.). Retrieved from https://uwaterloo.ca/institute-for-quantum-computing/quantum-computing-101
- Rijmenants, D. (2013, December 24). The Complete Guide to Secure Communications with the One Time Pad Cipher. Retrieved from http://users.telenet.be/d.rijmenants/papers/one_time_pad.pdf