In today's world, cryptography is critical to protecting data and users, ensuring confidentiality and preventing cyber attackers from intercepting sensitive corporate information. Two families of encryption are in widespread use: symmetric key encryption and asymmetric key encryption. Quantum computers threaten both: quantum algorithms can efficiently solve the hard mathematical problems, such as integer factoring, that underpin asymmetric key cryptography, and they can speed up the exhaustive search for the secret keys used in symmetric ciphers.
Introducing the ‘Inspiration series’ by HCLTech Quantum Labs
To understand the scope and progression of the quantum communications market, HCLTech conducted a series of sessions in collaboration with our global innovation partners. The series aims to provide a comprehensive overview of the quantum communications market from academic and industry standpoints, incorporating the business and technology verticals. HCLTech's Quantum Labs, being at the center of innovation, aims to accelerate quantum exploration activities to help industry leaders capitalize on quantum technologies.
The panel in the third episode included Abhinav Khare (Head of Tech Venturing and Open Innovation Ecosystem, HCLTech), Dr. Sushmita Ruj (Senior Lecturer, UNSW) and Panos Kampanakis (Cybersecurity Engineer, AWS). The session encompassed rich discussions among the key stakeholders, each bringing their respective viewpoints on “Post Quantum Cryptography entering the adoption stage.”
The current state of public key cryptography and the looming quantum threat
Most security protocols today employ a combination of symmetric key encryption and asymmetric key encryption. Asymmetric key algorithms are computationally expensive, but they solve the key-distribution problem: they are the practical way for two nodes to establish the keys needed to initiate a symmetric key cryptographic protocol. Symmetric key algorithms are many times faster than asymmetric key algorithms, but distributing symmetric keys among participants is a challenge. Security protocol designers therefore combine both: symmetric algorithms encrypt the data, while asymmetric key algorithms establish the shared secret (the symmetric key). The public key of the asymmetric key pair is currently distributed using public key infrastructure (PKI).
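The hybrid pattern described above can be sketched in a few lines. The following is a toy, stdlib-only illustration, not production cryptography: it uses classic finite-field Diffie-Hellman over an illustrative prime to establish a shared secret, then a hash-based keystream as a stand-in for a real symmetric cipher such as AES.

```python
import hashlib
import secrets

# Toy hybrid encryption: an asymmetric key exchange establishes a
# shared secret, then a fast symmetric cipher encrypts the bulk data.
# NOT production crypto -- illustrative parameters and primitives only.

P = 2**255 - 19  # a known prime, used here purely as a toy DH modulus
G = 2

def dh_keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(priv, peer_pub):
    # Derive a 32-byte symmetric key from the DH shared secret.
    secret = pow(peer_pub, priv, P)
    return hashlib.sha256(secret.to_bytes(32, "big")).digest()

def stream_xor(key, nonce, data):
    # Hash-based keystream (toy stand-in for AES-CTR or ChaCha20).
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# Alice and Bob each generate a key pair and exchange public values.
a_priv, a_pub = dh_keypair()
b_priv, b_pub = dh_keypair()
k_alice = shared_key(a_priv, b_pub)
k_bob = shared_key(b_priv, a_pub)

nonce = secrets.token_bytes(16)
ciphertext = stream_xor(k_alice, nonce, b"sensitive corporate data")
plaintext = stream_xor(k_bob, nonce, ciphertext)
```

Both sides derive the same symmetric key from the exchanged public values, which is exactly the step that Shor's algorithm would compromise in the schemes discussed below.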
One of the biggest challenges facing asymmetric cryptography is the threat of quantum computing, which could break some of the most widely used algorithms, such as RSA and ECC, by exploiting the mathematical problems they rely on. Quantum computing poses an “existential risk” to classical public key encryption protocols, and cybercriminals are potentially already exfiltrating encrypted data to decrypt it once quantum computers advance. Panos added, “The industry faces threats because of the store-now-decrypt-later policy currently being adopted, as it leads to data theft in many cases.”
Peter Shor showed how a quantum computer could factor a large number, reducing the time required from years to hours. Shor's algorithm targets the asymmetric key algorithms that form the basis of PKI; the most famous such scheme is RSA, whose security rests on the difficulty of factoring. If Shor's algorithm becomes practical, existing keys must be replaced and any data protected by them re-encrypted.
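The number theory behind Shor's algorithm can be demonstrated classically: factoring N reduces to finding the period r of f(x) = a^x mod N. The quantum part of Shor's algorithm finds r efficiently; this sketch finds it by brute force for a tiny N, purely to show the reduction.

```python
from math import gcd

# Classical illustration of the reduction used by Shor's algorithm:
# factoring N reduces to finding the period r of f(x) = a^x mod N.
# A quantum computer finds r efficiently; here we brute-force it.

def find_period(a, N):
    # Smallest r > 0 with a^r = 1 (mod N).
    value, r = a % N, 1
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

def shor_factor(N, a):
    assert gcd(a, N) == 1, "a must be coprime to N"
    r = find_period(a, N)
    if r % 2 != 0:
        return None          # odd period: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None          # trivial square root: retry with another a
    return gcd(y - 1, N), gcd(y + 1, N)

# Example: factor 15 with base 7 (period r = 4, since 7^4 = 2401 = 1 mod 15).
print(shor_factor(15, 7))  # -> (3, 5)
```

With r = 4, y = 7^2 mod 15 = 4, and gcd(4 - 1, 15) and gcd(4 + 1, 15) yield the factors 3 and 5.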
Grover's unstructured search algorithm, on the other hand, could weaken symmetric key encryption. Grover's algorithm uses amplitude amplification to find an item in an unsorted list of N entries. While a classical computer needs about N/2 steps on average (N in the worst case), Grover's algorithm requires only about the square root of N steps. This quadratic speedup is a real time saver when searching a long list, but the iterations must be executed sequentially, so the full speedup cannot be recovered by simply running many machines in parallel.
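Grover's iteration is simple enough to simulate directly on a small state vector. The sketch below, using only the standard library, searches N = 8 items: each round applies the oracle (a phase flip on the marked item) and then the diffusion step (inversion about the mean), and after about √N rounds the marked item's probability dominates.

```python
import math

# Tiny state-vector simulation of Grover's search over N = 8 items.
# One item is "marked"; each iteration applies the oracle (phase flip
# on the marked item) followed by inversion about the mean.

N = 8
marked = 5
amps = [1 / math.sqrt(N)] * N                   # uniform superposition

iterations = round(math.pi / 4 * math.sqrt(N))  # ~sqrt(N) iterations
for _ in range(iterations):
    amps[marked] = -amps[marked]                # oracle: flip marked sign
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]         # diffusion: invert about mean

prob = amps[marked] ** 2
print(f"P(marked) after {iterations} iterations: {prob:.3f}")
```

After two iterations the marked item is measured with probability above 0.94, versus 1/8 for random guessing, which is the quadratic advantage that motivates doubling symmetric key lengths.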
Introduction of Post Quantum Cryptographic (PQC) algorithms
Researchers and developers have been working on new algorithms resistant to quantum attacks, such as lattice-based, code-based, hash-based and multivariate-based cryptography, to counter such threats. These algorithms rely on different types of hard problems that are not easily solvable by quantum computers and offer various performance, security and compatibility trade-offs.
The National Institute of Standards and Technology (NIST) initiated a process in 2016 to solicit, evaluate and standardize one or more quantum-resistant PQC algorithms. The goal is algorithms whose digital signatures and encryption remain hard for quantum computers to break. The candidates submitted to NIST span several families, including lattice-based, code-based, multivariate polynomial and hash-based signature schemes, and most of them have larger key sizes than the algorithms they replace.
“The lattice-based algorithms have been studied for a long time, and these algorithms were chosen because of their security. Moreover, they are very well tested, which increases customers' confidence. The underlying lattice problems are believed to be hard; if these problems turn out to be easy, the entire system crumbles,” emphasized Dr. Ruj.
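To give a feel for the hard problem behind lattice-based schemes, here is a toy, Regev-style encryption of a single bit built on Learning With Errors (LWE): the public key is a set of noisy linear equations in a secret vector, and recovering the secret from them is believed to be hard even for quantum computers. The parameters below are tiny and completely insecure; real schemes such as CRYSTALS-KYBER use structured variants with much larger, carefully chosen parameters.

```python
import random

# Toy Regev-style one-bit encryption from Learning With Errors (LWE).
# Public key: noisy linear equations b = A*s + e (mod q) in secret s.
# Tiny, insecure parameters -- for illustration only.

q, n, m = 97, 8, 16          # modulus, secret length, number of samples
rng = random.Random(0)

s = [rng.randrange(q) for _ in range(n)]                     # secret key
A = [[rng.randrange(q) for _ in range(n)] for _ in range(m)]
e = [rng.choice([-1, 0, 1]) for _ in range(m)]               # small noise
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]
# public key: (A, b); the noise e is what hides s

def encrypt(bit):
    subset = [i for i in range(m) if rng.random() < 0.5]
    c1 = [sum(A[i][j] for i in subset) % q for j in range(n)]
    c2 = (sum(b[i] for i in subset) + bit * (q // 2)) % q
    return c1, c2

def decrypt(c1, c2):
    # Noise is small, so d sits near 0 for bit 0 and near q/2 for bit 1.
    d = (c2 - sum(c1[j] * s[j] for j in range(n))) % q
    return 1 if q // 4 < d < 3 * q // 4 else 0
```

Because the accumulated noise stays below q/4, decryption rounds correctly; breaking the scheme without s amounts to solving the noisy linear system, which is the lattice-style problem Dr. Ruj refers to.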
NIST has identified four candidate algorithms for standardization: CRYSTALS-KYBER (key establishment) and CRYSTALS-Dilithium, FALCON and SPHINCS+ (digital signatures). Dr. Ruj seconded this, stating, “There is a desire to push towards newer protocols and algorithms, and NIST is in the process of standardizing signatures that are not based on lattices. Much work is also happening in post-quantum cryptographic algorithms for zero-knowledge proofs, and various cryptographic primitives for blockchains, etc.”
The code-based schemes HQC and BIKE are also being considered in NIST's fourth round, with just one of the two to be selected, Panos added. The rationale is diversification: if the lattice-based algorithms were ever broken, a code-based algorithm would remain as a fallback, saving the ecosystem from a massive collapse.
Publication of these standards is expected in the coming two years. Since almost all of us rely on the internet security standard TLS, which in turn relies on public key algorithms, TLS will be updated in line with the new standards.
“AWS has been engaging in standardization efforts with NIST and is introducing some of the algorithms, and the focus has been on encryption, especially TLS. This is because TLS is what you use to connect to AWS services and make API calls. Thus, TLS protects important connections, thereby protecting the data,” added Panos.
Post Quantum Migration and its impact
Quantum computing has seen a surge of development in the past decade. Because quantum algorithms can break today's public key schemes, the cryptographic landscape must change in a post-quantum world, and cryptographers worldwide are developing quantum-resilient post-quantum cryptographic schemes. The ongoing NIST PQC standardization process aims to establish standard PQC schemes to integrate into cybersecurity.

Migration is a project in its own right. Its initial scope is to demonstrate discovery tools that can provide automated assistance in identifying where and how public key cryptography is being used in hardware, firmware, operating systems, communication protocols, cryptographic libraries and applications employed in data centers, whether on-premises or in the cloud, and in distributed compute, storage and network infrastructures. Adapting and migrating large software infrastructures to PQC is complex, with several requirements and challenges, so the end objective is to provide systematic approaches for migrating from vulnerable algorithms to quantum-resistant algorithms across the different types of assets and their supporting technology.
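As a hypothetical, minimal sketch of what such discovery tooling does, the snippet below walks a source tree and flags textual references to quantum-vulnerable public key algorithms. Real inventory tools go much further, inspecting binaries, TLS configurations, certificates and key stores; the function and file extensions here are illustrative choices, not any standard tool's interface.

```python
import re
from pathlib import Path

# Minimal sketch of a cryptographic "discovery" scan: walk a source
# tree and flag references to quantum-vulnerable public key algorithms.
# Illustrative only; real tools also inspect binaries, TLS configs,
# certificates and key stores.

VULNERABLE = re.compile(r"\b(RSA|ECDSA|ECDH|DSA|DH)\b")

def scan_tree(root):
    findings = []
    for path in Path(root).rglob("*"):
        if path.suffix not in {".py", ".c", ".java", ".go", ".conf"}:
            continue
        text = path.read_text(errors="ignore")
        for lineno, line in enumerate(text.splitlines(), 1):
            for match in VULNERABLE.finditer(line):
                findings.append((str(path), lineno, match.group(1)))
    return findings

# Example usage:
# for filename, lineno, algo in scan_tree("src/"):
#     print(f"{filename}:{lineno}: uses {algo}")
```

The output of such a scan is the starting inventory for a migration plan: each finding is an asset that must eventually move to a quantum-resistant algorithm.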
Panos believes there will undoubtedly be some financial impact, as companies must invest in upgrades and resources, especially hardware, to make their systems quantum-safe. He also mentioned that governments across the globe are gradually investing in quantum technology and related compliance requirements, and non-compliance leads to further financial impact. In the industry, there are already clear use cases for achieving quantum resistance, mainly involving transfer protocols, PKI and certificates, and much attention is on the standardization efforts in the IETF.
The Public Key Infrastructure Consortium (PKI Consortium) has shown great interest in migrating towards PQC and held its first conference on the topic in March 2023. It focused on the standardization efforts and recommendations of organizations such as the European Telecommunications Standards Institute (ETSI) and the Internet Engineering Task Force (IETF), and on PQC for PKIs and certificates. Additionally, the consortium published a PQC Capabilities Matrix, which provides valuable information on the PQC capabilities of libraries, hardware and applications. “The transition may not happen over a year or two; instead, the developments are expected to take place gradually.” Panos also opined that further developments in this space would undoubtedly lead to software upgrades, and in some cases, we might also witness hardware upgrades.
What's coming next?
It is only a matter of time before quantum computers can break today's gold standard for cryptographic security. Organizations across every industry and sector will need to achieve crypto agility to navigate the rapidly changing realities of the quantum era. To do this, cryptographic protocols must be designed at a sufficiently high level of abstraction that the underlying cryptography can be switched out once the quantum threat becomes real. Abhinav said, “PQC may be effective in maintaining end-to-end quantum-safe communication. The key also lies in crypto agility. It is essential to understand the algorithm's overhead, i.e., the time to encrypt the data. Thus, crypto agility comes into the fray, depending on the use case.”
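One common way to realize crypto agility is to write application code against an abstract key-establishment interface so that the concrete algorithm is selected by configuration. The sketch below is illustrative, with hypothetical names and a deliberately insecure stand-in KEM; in practice the registry would plug in X25519, ML-KEM (Kyber), or a hybrid of both behind the same interface.

```python
import hashlib
import secrets
from abc import ABC, abstractmethod

# Sketch of crypto agility: the application talks to an abstract KEM
# (key encapsulation mechanism) interface, so the algorithm can be
# swapped via configuration rather than a rewrite. Names illustrative.

class KEM(ABC):
    @abstractmethod
    def keygen(self): ...               # -> (public_key, secret_key)
    @abstractmethod
    def encapsulate(self, pk): ...      # -> (ciphertext, shared_secret)
    @abstractmethod
    def decapsulate(self, sk, ct): ...  # -> shared_secret

class ToyKEM(KEM):
    """Insecure stand-in used only to exercise the interface."""
    def keygen(self):
        sk = secrets.token_bytes(32)
        return hashlib.sha256(sk).digest(), sk
    def encapsulate(self, pk):
        ct = secrets.token_bytes(32)
        return ct, hashlib.sha256(pk + ct).digest()
    def decapsulate(self, sk, ct):
        return hashlib.sha256(hashlib.sha256(sk).digest() + ct).digest()

REGISTRY = {"toy-kem": ToyKEM}  # configuration picks the algorithm

def establish_session(algorithm):
    kem = REGISTRY[algorithm]()
    pk, sk = kem.keygen()               # responder publishes pk
    ct, secret_a = kem.encapsulate(pk)  # initiator encapsulates
    secret_b = kem.decapsulate(sk, ct)  # responder recovers the secret
    assert secret_a == secret_b
    return secret_a

session_key = establish_session("toy-kem")
```

Because `establish_session` depends only on the interface, migrating to a post-quantum algorithm means registering a new implementation and changing one configuration string, which is the operational meaning of crypto agility.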
In our next blog post, we will move into the crypto agility space and talk about the unbreakable Quantum Key Distribution (QKD) and its role in quantum-safe communication.