Examination of the technical evolution within several industries reveals an approaching precipice of scientific change. The glacially paced but inevitable convergence of quantum mechanics, nanotechnology, computer science, and applied mathematics will revolutionize modern technology. The implications of such change will be far-reaching, with one of its greatest impacts affecting information security; more specifically, modern cryptography.

The theoretical number theorists responsible for cryptography's algorithmic complexities will be affected by this scientific conglomeration, although numerologists shouldn't be concerned. The subsequent adaptation and remodeling of classical cryptography will be a fascinating, and as yet undetermined, process. Of course, this would all be irrelevant if we could just standardize the use of Arithmancy and its powers of numerical divination. Then again, we live in a society where the mysticism of numbers is left to those practicing the pseudo-science of applied mathematics.

In addition to the Intel 8080 and disco, the 1970s gave us public-key cryptography. Popularized by the RSA cryptosystem, it introduced a revolutionary new method of encryption that overcame the key-exchange security issues of symmetric cryptosystems. This is far better than the 1960s' contribution of "New Math."

Because generating large prime numbers is much easier than factoring their products, public-key systems have proved to be a mathematically strong approach. The need to evaluate the security of factor-based cryptosystems has led to advancements in integer factorization algorithms. From the simplistic use of trial division, to the more advanced techniques of Fermat and Pollard rho factorization, to the sophisticated general number field sieve: improved approaches backed by computer processing have left only the strongest cryptosystems and key sizes intact.
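Of the algorithms above, Pollard rho is compact enough to sketch. A minimal Python version using Floyd cycle detection (the semiprime 8051 is just an illustrative example; the routine assumes its input is composite):

```python
import math
import random

def pollard_rho(n):
    """Find a nontrivial factor of a composite n via Pollard's rho."""
    if n % 2 == 0:
        return 2
    while True:
        x = random.randrange(2, n)
        y = x
        c = random.randrange(1, n)
        d = 1
        while d == 1:
            x = (x * x + c) % n        # "tortoise" takes one step
            y = (y * y + c) % n        # "hare" takes two steps
            y = (y * y + c) % n
            d = math.gcd(abs(x - y), n)
        if d != n:                      # d == n means the walk failed; retry
            return d

print(pollard_rho(8051))  # 8051 = 83 * 97, so this prints 83 or 97
```

The pseudo-random walk x → x² + c (mod n) eventually cycles modulo each prime factor, and the gcd step catches the collision; expected running time is roughly n^{1/4} steps, far better than trial division's n^{1/2}.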

However, the 30-year crypto-evolution of public-key cryptography has shown some remarkably difficult and complex advances in computational number theory. One of the best-known examples is well documented in the results of the RSA Factoring Challenge. Starting in 1991, approximately 50 RSA numbers of increasing length were published in an open factorization contest, with the smallest key, RSA-100 (100 decimal digits), being cracked within the first two weeks. Ending in 2007, with 12 solutions, this event produced numerous mathematical milestones for the crypto community, including the factorization of RSA-640 (193 decimal digits, 640 bits) in 2005. The factors of this 193-digit number distilled down into two 97-digit primes, neither of which were ciphertexts for "The Magic Words are Squeamish Ossifrage."

While this event is considered more successful than the π-eating contest held by the Association of Symmetric Security (they don't use their acronym), private key cryptography is a complementary, efficient, and secure method of encryption. When compared to public-key systems, processing time is reduced by a factor of 10^{2} to 10^{3}, as it avoids the computational overhead of asymmetric key mathematics. Whether using block or stream ciphers, relying on a single key for both encryption and decryption demands a high level of key integrity. Aside from its entropy, secure key transfer, or key distribution, is fundamental to its success, ensuring only the sender and recipient have a copy, without risk of compromise. Even the über-security of the one-time pad is powerless when improperly implemented, or "man-in-the-middled," as a two- or three-time pad.

It is important to note how PCs have evolved alongside cryptography, since all of this takes place on some level of computer hardware. In a general sense, given a strong algorithm, increases in computational power have necessitated the use of larger key sizes. For example, brute force attacks require trying, on average, half of all possible key combinations before achieving success. Therefore, increasing the key size makes the job of brute force attackers exponentially more difficult: each additional bit doubles the key space, thus doubling the work required to crack it (this is an oversimplification of the cryptanalytic process). This battle of computer processing power versus mathematical complexity has been one of the fundamental challenges of maintaining cryptographic security.
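The doubling argument above is simple enough to check directly; a few lines of Python (the key sizes chosen are just illustrative):

```python
def expected_trials(bits):
    """Average brute-force work: half of a 2^bits key space."""
    return 2 ** bits // 2

# one extra bit doubles the expected work
assert expected_trials(57) == 2 * expected_trials(56)

for bits in (56, 64, 128):
    print(bits, expected_trials(bits))
```

This is why the jump from 56-bit DES to 128-bit keys is not "a bit more than twice as hard" but 2^{72} times harder, and why key-size recommendations track hardware growth.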

Examining the landscape of modern computing reveals both legitimate and questionable concerns. Using parallel processing and distributed systems, the time required to break keys can be reduced. Theoretically, with *n* computers, the time required to crack a key is *1/n* of the time required using only one. A practical example was demonstrated when Nick Breese's PS3 Crackstation leveraged its Cell Broadband Engine cluster to process calculations in parallel with vector computing. On the other hand, with the possibility (or the realization) that Moore's Law will soon be no more, it may be that the physical constraints of conventional silicon chips will be outpaced by the conceptual constraints of mathematics.
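The 1/n speedup is just the keyspace split across machines; a toy model in Python (the keyspace size and per-machine rate are made-up numbers for illustration):

```python
def crack_time(keyspace, rate_per_machine, n_machines):
    """Seconds to exhaust a keyspace split evenly across n machines."""
    return keyspace / (rate_per_machine * n_machines)

one = crack_time(2 ** 40, 1e6, 1)      # a single machine at 10^6 keys/sec
many = crack_time(2 ** 40, 1e6, 256)   # 256 machines splitting the work

print(one / many)  # speedup factor: 256
```

Note the limitation: linear speedup shaves a constant factor off the attack, while one extra key bit doubles the work, so parallelism alone never beats exponential key growth.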

Quantum computing is often discussed as the disruptive technology that will transform computer science. The theoretical blueprints for quantum computers were drafted by Richard Feynman over 25 years ago, and several companies currently have prototyped designs, with manufacturing claimed for the near future. However, the optimism of this field and its potential impact is somewhat premature. The scientific media have repeatedly voiced concerns about quantum computers annihilating our public key cryptosystems. According to a news release from Accenture, "A breakthrough in quantum or molecular computing could leave today's computers - and IT security systems - in the dust," and further, "...it could unleash a hacker's 'killer app' that breaks through security defenses." Science Daily describes the arms race that will result from the post-apocalyptic world of post-quantum computing. Why is this technology so threatening to cryptographic security?

Perhaps it's the publicized predictions that quantum cryptanalysis will mark the downfall of classical cryptography. In 1994, while at AT&T Bell Laboratories, mathematician Peter Shor developed a quantum algorithm that can factor large numbers in polynomial time. Running on a quantum rather than a classical computer, Shor's algorithm has the potential to break all of the widely deployed public key schemes used today. In 1996, modest computer scientist Lov Grover created a database search algorithm that achieves its quantum speedup through function inversion. Grover's algorithm provides a quadratic improvement to brute forcing, reducing the search of n entries to only √n steps. Applications of these mathematical attacks are, at best, proofs of concept in the absence of the hardware to run them.
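Shor's quantum speedup comes entirely from period-finding; the rest of the algorithm is classical number theory. A minimal sketch, with the period found by brute force (which is exactly the exponential step a quantum computer replaces), factoring the toy number 15 with base a = 7:

```python
from math import gcd

def find_period(a, n):
    """Smallest r > 0 with a^r = 1 (mod n). Brute force stands in
    for the quantum period-finding subroutine."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n, a):
    """Turn the period of a^x mod n into factors of n (may fail for
    an unlucky choice of a, in which case Shor retries with another a)."""
    r = find_period(a, n)
    if r % 2:
        return None                 # odd period: pick a new a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None                 # trivial square root: pick a new a
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_classical_part(15, 7))   # period of 7 mod 15 is 4 -> (3, 5)
```

The whole threat to RSA lives in `find_period`: classically it takes exponential time in the digit count, while a quantum computer does it in polynomial time.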

Furthermore, there are several cryptosystems thought to be resilient to cryptanalytic attack from both traditional computers *and* quantum computers.

- Hash-based cryptography
- Code-based cryptography (well... McEliece's used to be)
- Lattice-based cryptography
- Multivariate-quadratic-equations cryptography
- Secret-key cryptography

It should be noted that quantum algorithms have important implications in computational complexity theory. Mathematical problems labeled P can be solved in polynomial time, whereas those termed NP can be verified in polynomial time; thus, all problems in P are a subset of NP. Essentially, a problem is in P when an algorithm can solve any input of size n in a number of steps bounded by some polynomial f(n). Problems that are NP can be thought of as having a solution that can be confirmed to work in polynomial time, but not necessarily found in polynomial time. My own efforts to solve the P vs. NP problem, in my free polynomial spare time, have led me to believe that one's attempt to solve the problem, in itself, is of type NP. Since I'm still waiting on TigerDirect to ship me a new motherboard for my nondeterministic computer, my proof remains NP-incomplete.
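The verify-versus-solve gap is easy to see with subset sum, a classic NP-complete problem. A sketch in Python (the numbers are arbitrary illustrative data): checking a proposed certificate is fast, while the only general way we know to find one is to search all 2^n subsets.

```python
from itertools import combinations

def verify(nums, target, certificate):
    """Polynomial-time check of a proposed solution."""
    return sum(certificate) == target and all(c in nums for c in certificate)

def solve(nums, target):
    """Exponential-time search: tries every subset."""
    for k in range(len(nums) + 1):
        for subset in combinations(nums, k):
            if sum(subset) == target:
                return subset
    return None

nums = [3, 34, 4, 12, 5, 2]
cert = solve(nums, 9)       # exponential work to find...
print(verify(nums, 9, cert))  # ...polynomial work to check: True
```

If someone hands you the certificate, you're done in linear time; finding it yourself is the part nobody knows how to do in polynomial time, and that asymmetry is the whole P vs. NP question.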

As I've been quoted in the past, "Experience is no match for brilliance" (of which I have neither), my words clearly defy the common practices of scientific debate. Referencing the writings of technological pioneers and regurgitating the thoughts of industry veterans are some of the basic self-defense techniques of argumentation. Nevertheless, instead of citing the works of David Deutsch, Paul Benioff, or Charles Bennett, I would like to highlight the beliefs of a bright quantum physicist named Stephen Jordan:

"It is now believed that quantum computers can solve certain problems more efficiently than classical computers. For example, quantum algorithms have been discovered which can factor large numbers in an amount of time that varies roughly as the cube of the number of digits. In contrast, the best known classical algorithm for factoring requires time which grows exponentially with the number of digits.

Such results raise many questions. The two main questions I am currently interested in are:

- How can we build a quantum computer?
- What could we do with a quantum computer if we had one?"

When the commercialization of a future technology becomes over-hyped by media buzz, one can always gain perspective from academicians of the fundamental science. It is not a matter of questioning the merits of quantum computing and its impact on classical cryptosystems; it's the media's use of headline-grabbing scare tactics and gross misinformation. Public attention should be directed towards the ongoing efforts of researchers proactively addressing this issue, as demonstrated by PQCrypto, the international workshop on Post-Quantum Cryptography. What we don't need are more headlines claiming that the future of internet security has already been cracked.

Encryption may not be the most glamorous layer of security, but when properly implemented, it's probably the most sophisticated and strongest layer. A majority of real-world defense-in-depth strategies still lack a cryptographic layer. When faced with relevant security concerns, IT managers should be allocating resources to the weak areas of the security chain. For those who've never spent time reading the works of Schneier: not only is security a process, but it should be thought of as a chain only as strong as its weakest link, and encryption is usually one of the strongest. Fix the areas that pose the real threats: social engineering, end-user training, policy enforcement, perimeter security, firewall rules, secure coding, and so on.

How many automated exploit tools exist that allow script-kiddies to launch cryptanalytic attacks? Have there been some sort of underground crypto-cons going on (besides this one) that have eluded my hackercon radar? Or perhaps we should fear the Quantum Hacking group?

It's all ciphertext to me.

Send your questions, qubits or comments using SARG04 to: greyhat@computer.org