Transitioning into the Quantum-Cryptography Era

In this article, I’ll compare the encryption standards we consider secure with current technology against the developments that may undermine them in the years to come. I will also try to analyze how we can prepare for that future in our industry, software development, at different scales.

First, I want to highlight the two main types of encryption on which most of the software we use depends. This overview is based on the status as of today, in 2022, according to the United States National Institute of Standards and Technology (NIST).

Symmetric encryption

On one hand, we have symmetric encryption, which uses a single secret key for both encrypting and decrypting data. The need to share that secret between the parties that want to communicate securely is its main weakness. Still, combined with other measures that ensure a secure key transfer, it can be a good option, since it is cryptographically strong compared to other methods.

The most commonly used algorithm is the Advanced Encryption Standard (AES), which has not yet been cracked and comes in several versions, such as AES-128, AES-192, and AES-256 (the number being the key length in bits). One issue with such algorithms is that they are vulnerable to brute-force attacks, so they rely on long keys to strengthen the encryption against them. Note that, with today’s classical computers, brute-forcing any AES key length would take absurdly long — thousands of trillions of years, longer than the lifetime of the universe.
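To make that claim concrete, here is a back-of-the-envelope estimate of brute-force cost. The attacker’s testing rate of one trillion keys per second is an illustrative assumption, not a measured figure:

```python
# Rough estimate of AES brute-force cost (illustrative arithmetic only).
# Assumption: an attacker can test 10^12 (one trillion) keys per second.

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def brute_force_years(key_bits: int, keys_per_second: float = 1e12) -> float:
    """Expected years to search half of a key space of size 2**key_bits."""
    keyspace = 2 ** key_bits
    return (keyspace / 2) / keys_per_second / SECONDS_PER_YEAR

for bits in (128, 192, 256):
    print(f"AES-{bits}: ~{brute_force_years(bits):.3e} years")
```

Even the smallest key size, AES-128, comes out above 10^18 years with these assumptions — hence the “longer than the lifetime of the universe” framing.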

There have also been other attacks, like meet-in-the-middle (MITM), which compromised algorithms we used (or still use) as standards until recently. An example is the DES (Data Encryption Standard) family: the original DES was crackable by brute force with only 2^56 operations (a crack took 22 hours in 1999, and would take far less today). Its successor, Double DES, tried to fix this by encrypting twice with two different keys, but that construction turned out to be vulnerable to a meet-in-the-middle attack. It was then replaced by Triple DES, which is still exposed to several attacks (including MITM), and the only way to add complexity remains lengthening the keys or performing more operations with them.
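The meet-in-the-middle idea can be sketched in a few lines. The 8-bit “cipher” below is a deliberately weak stand-in (it is not DES); it only illustrates why encrypting twice adds far less security than the combined key size suggests:

```python
# Toy meet-in-the-middle attack on double encryption (NOT a real cipher).

def enc(key: int, block: int) -> int:
    return ((block ^ key) + key) % 256   # toy round function, not secure

def dec(key: int, block: int) -> int:
    return ((block - key) % 256) ^ key   # inverse of enc

def double_enc(k1: int, k2: int, block: int) -> int:
    return enc(k2, enc(k1, block))

def mitm_attack(pairs):
    """Recover a working (k1, k2) from known plaintext/ciphertext pairs with
    about 2 * 256 cipher operations instead of the naive 256 * 256."""
    (p0, c0), rest = pairs[0], pairs[1:]
    forward = {}                          # middle value -> candidate k1 keys
    for k1 in range(256):
        forward.setdefault(enc(k1, p0), []).append(k1)
    for k2 in range(256):                 # meet in the middle from the other side
        for k1 in forward.get(dec(k2, c0), []):
            if all(double_enc(k1, k2, p) == c for p, c in rest):
                return k1, k2
    return None

true_k1, true_k2 = 91, 200
pairs = [(p, double_enc(true_k1, true_k2, p)) for p in (7, 42, 99)]
found = mitm_attack(pairs)
print(found)
```

The attack trades memory (the `forward` table) for time, which is exactly why Double DES offers nowhere near 2^112 security.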

Symmetric encryption is mostly used to encrypt data in operations that need to be fast, such as banking and credit card transactions. It is used across the Internet in the Transport Layer Security (TLS) protocol, in encrypted data storage, and in many other applications.

Asymmetric encryption

On the other hand, we have asymmetric encryption. This type of encryption is based on a key pair: a private key and a public key. The public key can, as its name states, be shared openly and is used to encrypt data; only the holder of the matching private key can decrypt it, so the private key must never be shared — even though the public key does expose some information about the pair.

The most well-known and de facto standard for asymmetric encryption is the RSA (Rivest–Shamir–Adleman) protocol. It uses two large prime numbers to generate a private-public key pair and is commonly used as a key exchange algorithm. Thanks to RSA, we can exchange the private keys used for symmetric encryption in a more secure way, and thus build other complex solutions for various threats and use cases. For example, it has been used in TLS to exchange the symmetric key that AES then uses and to warrant digital integrity. We can also find it in email encryption services, digital signatures, blockchains, and more.
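A toy RSA key generation and round trip fits in a few lines of standard-library Python. The primes here are tiny and purely educational; real RSA uses primes hundreds of digits long:

```python
# Toy RSA with tiny primes (educational only -- never use key sizes like this).
from math import gcd

p, q = 61, 53                 # two small primes (the private factors)
n = p * q                     # modulus, part of the public key
phi = (p - 1) * (q - 1)       # Euler's totient of n
e = 17                        # public exponent, must be coprime with phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)           # private exponent (modular inverse, Python 3.8+)

message = 42
cipher = pow(message, e, n)       # encrypt with the public key (e, n)
recovered = pow(cipher, d, n)     # decrypt with the private key (d, n)
print(cipher, recovered)
```

Note that encryption uses only the public values `(e, n)`, while decryption needs `d`, which can only be computed by whoever knows the prime factors.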

As mentioned above, the security of this method rests on the product of two large prime numbers. The two primes form the private key, and their product is part of the public key. An attacker brute-forcing this factorization with classical computing would need around 300 trillion years to crack a modern RSA key pair. It doesn’t look like a threat, does it? Well, the vulnerability is still there; the 300 trillion years is simply a way of adding up the number of computational operations and the time spent on each of them. And it is based only on the framework of computational power we understand today.
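At toy sizes, the “vulnerability” is plain to see: given only the public modulus, recovering the primes rebuilds the entire private key. Naive trial division works here but scales hopelessly for real 2048-bit moduli — and factoring is precisely the problem Shor’s algorithm speeds up on a quantum computer:

```python
# Breaking a toy RSA key by factoring its public modulus.

def factor(n: int):
    """Naive trial division; cost grows with sqrt(n), hopeless at real sizes."""
    i = 2
    while i * i <= n:
        if n % i == 0:
            return i, n // i
        i += 1
    return None

n = 3233                      # toy public modulus (61 * 53)
e = 17                        # toy public exponent
p, q = factor(n)              # the "hard" step, trivial at this size
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)           # attacker has now rebuilt the private exponent
print(p, q, d)
```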


So far, we can say that our main defense against most attacks on encryption is to use long keys and/or to add computational work (complexity) to the operations involved, so that cracking our systems by brute force, or by the other attacks we know of today, would take an amount of “computation time” we consider impossible. But why do we consider that time impossible? Well, it’s based on Moore’s Law, which observes that computational power doubles roughly every two years — and at that pace, it seems genuinely difficult to reach the power needed to break current cryptography in the near future.
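We can sanity-check that intuition with a simple projection. Even if an attacker’s key-testing rate doubled every two years (a generous reading of Moore’s Law, with an assumed starting rate of 10^12 keys per second), exhausting a 128-bit key space would still take on the order of a century of continuously compounding capacity:

```python
# Illustrative Moore's-law projection (assumptions: 10^12 keys/sec today,
# rate doubling every two years). Not a prediction, just arithmetic.

start_rate = 1e12              # assumed keys tested per second today
target_ops = 2 ** 127          # expected work to find a 128-bit key

years = 0
ops_done = 0.0
while ops_done < target_ops:
    yearly_capacity = start_rate * 2 ** (years / 2) * 31_536_000
    ops_done += yearly_capacity
    years += 1
print(years)
```

The exponential growth of the attacker is what makes the result decades rather than trillions of years — but it still lands far enough away that, under classical assumptions, we feel safe.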

What if I told you that the 300 trillion years required to break RSA would be seconds for a sufficiently powerful quantum computer running Shor’s algorithm? Yes, seconds. That being said, it doesn’t seem too crazy to expect that even symmetric encryption could be weakened in a reasonable amount of time with this new technology. It’s a matter of the field improving at the pace it is moving now — and it could happen even faster if there’s a significant breakthrough. With what we know today, a quantum computer with the power needed to crack RSA is expected by around 2030. That’s not far off, is it?
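It is worth separating the two quantum threats. Shor’s algorithm makes factoring (and thus breaking RSA) efficient outright, while Grover’s search gives “only” a quadratic speedup on brute-force key search — a common rule of thumb is that it roughly halves a symmetric key’s effective strength:

```python
# Rule-of-thumb effect of Grover's quadratic speedup on symmetric keys
# (illustrative arithmetic, not a simulation of the algorithm).

def grover_effective_bits(key_bits: int) -> int:
    """Approximate post-quantum security level of a symmetric key."""
    return key_bits // 2        # sqrt(2**n) == 2**(n/2) search steps

for bits in (128, 192, 256):
    print(f"AES-{bits}: ~{grover_effective_bits(bits)}-bit security vs. Grover")
```

This is why AES-256 is generally considered a reasonable post-quantum choice for symmetric encryption, while RSA has no comparable escape hatch.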

NIST already has a proposed plan for transitioning to new post-quantum standards, but historically some of its standards turned out to be inaccurate or vulnerable (DES, for example, was a standard long before NIST listed it as insecure). Furthermore, we cannot be certain that the new encryption schemes NIST proposes as standards will actually be quantum-proof once the technology is implemented.

I also think it is extremely important to consider what the future of software looks like, because it is common to hear that only the data that will still need protection “when quantum computing is around” requires post-quantum measures today, and similar claims. In a world where everything is moving toward software that operates on harvested data (like artificial intelligence) and centralized services hosting user data that will still be there when quantum computing arrives, there is every reason to worry now and to be as safe as possible for that moment — attackers can already harvest encrypted data today and decrypt it later. Decentralized services built on today’s encryption technology will be just as vulnerable to this threat, and we have better options to prevent the chaos that may arise. Basically, most of the encrypted data we handle today is at risk, and the sooner we start thinking about how to transition to this new cryptographic paradigm, the better.


First, let’s look at ways to deal with this threat.

One approach, which NIST is already tackling, is developing new cryptographic schemes that are quantum-proof, with plans to migrate from the current ones in a (hopefully) timely manner.

Another option would be to build quantum-proof solutions on top of the infrastructure we have today. To give a familiar example, take TLS: a quantum-proof layer can be built on top of TLS to protect against quantum exploits.
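One concrete shape this takes is “hybrid” key establishment: derive the session key from both a classical shared secret and a post-quantum one, so the result stays safe as long as either scheme holds. The sketch below combines two stand-in secrets with an HMAC-based extraction step; the function name, context label, and secret sources are illustrative assumptions, not a real protocol:

```python
# Sketch of hybrid key derivation: the session key depends on BOTH secrets,
# so breaking only the classical exchange (e.g. with Shor) is not enough.
import hashlib
import hmac
import os

def hybrid_session_key(classical_secret: bytes, pq_secret: bytes,
                       context: bytes = b"demo-hybrid-v1") -> bytes:
    """HKDF-extract-style combination of two shared secrets (illustrative)."""
    return hmac.new(context, classical_secret + pq_secret, hashlib.sha256).digest()

# Stand-ins for the outputs of a classical ECDH exchange and a post-quantum
# KEM such as ML-KEM (Kyber); here they are just random bytes.
ecdh_secret = os.urandom(32)
kem_secret = os.urandom(32)
session_key = hybrid_session_key(ecdh_secret, kem_secret)
print(len(session_key))
```

Real hybrid TLS deployments follow the same principle, concatenating both secrets into the protocol’s standard key schedule rather than a hand-rolled KDF like this one.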

But what is certain is that the transition is not going to be easy or short. It looks like we’re headed into an environment where many new technologies will emerge, and we as developers will have to adapt — always keeping in mind that what we build today may not work as intended in the future.

As software developers, we have the tools to be ready for this era. The most important one is good practices. We already know how to build scalable software, and we can start by thinking of this transition as a kind of scaling. So if we apply good practices of abstraction, for example, we’re headed in the right direction. If we make sure we have maintainable code, and actually do maintenance — remove what is no longer needed, update dependencies, and keep track of them (AES-128 and RSA are probably among your dependencies) — then we should be in a good position to build software that is ready for this next phase of digital information.
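What “abstraction as crypto-agility” can look like in practice: application code depends on an interface rather than on AES or RSA directly, so migrating algorithms means registering a new implementation, not rewriting call sites. The interface name and the XOR placeholder below are illustrative; the XOR “cipher” is not real encryption:

```python
# Crypto-agility through abstraction: callers never name a specific algorithm.
from abc import ABC, abstractmethod

class Encryptor(ABC):
    @abstractmethod
    def encrypt(self, key: bytes, data: bytes) -> bytes: ...
    @abstractmethod
    def decrypt(self, key: bytes, data: bytes) -> bytes: ...

class XorPlaceholder(Encryptor):
    """Stand-in implementation: swap in AES today, a post-quantum scheme later."""
    def encrypt(self, key: bytes, data: bytes) -> bytes:
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))
    decrypt = encrypt   # XOR is its own inverse

def store_record(enc: Encryptor, key: bytes, record: bytes) -> bytes:
    # Application code only sees the interface, so the algorithm is a
    # maintainable dependency rather than something baked into call sites.
    return enc.encrypt(key, record)

e = XorPlaceholder()
blob = store_record(e, b"k3y", b"hello")
print(e.decrypt(b"k3y", blob))
```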

On the organizations’ side, the process should also be handled properly, as they are the ones who will fund this development. Organizations are going to put money into the approaches outlined above and will actually implement them. That’s why I think their focus should be on making sure their teams are prepared for what’s to come and that their software is built in a way they can maintain in the future.

All this being said, it seems we should be (or already are) moving toward an industry where companies invest in and focus on quality and flexibility. With good-quality software and well-prepared engineers, the industry can adapt to this rapidly changing environment and prevent as much catastrophe as possible.

With the help of all the entities involved — such as NIST and the many other organizations working toward standardization and cybersecurity — we should, optimistically, have a solid framework to rely on as time passes and the technology improves.
