Post-Quantum Cryptography (PQC) refers to new encryption algorithms designed to resist attacks by quantum computers. Advances in quantum computing, notably Shor’s Algorithm, threaten today’s common encryption (like RSA and ECC) by allowing a future large quantum computer to crack those systems quickly.
This report explains what PQC is and why it’s needed, how experts believe it will impact society, the progress of the U.S. National Institute of Standards and Technology (NIST) PQC standardization competition, the leading candidate algorithms (their strengths and challenges), and how everyday digital activities – from ATM use to online credit card payments – may evolve in a post-quantum world.
Why PQC Matters: Encryption underpins nearly all digital infrastructure – securing websites, financial transactions, communications, and more. Modern society relies on cryptographic algorithms to protect private conversations, sensitive data, and digital infrastructure, but quantum computing could render much of today’s cryptography obsolete. Shor’s Algorithm, if run on a sufficiently powerful fault-tolerant quantum computer, could crack RSA and elliptic-curve encryption in a matter of seconds or hours (versus billions of years on a classical computer).
This looming “quantum threat” has prompted a global race to develop PQC algorithms that can secure our data against quantum attacks.
Expected Societal Impact: Experts anticipate that PQC will bring one of the most significant shifts in internet security ever seen.
Digital infrastructure will need updates – for example, web browsers, banking systems, and secure communications protocols must adopt quantum-resistant algorithms. Online transactions and secure communications may undergo changes to incorporate new encryption, potentially using hybrid methods (combining classical and post-quantum algorithms) during the transition. While end-users might not notice a change in their day-to-day use of ATMs or credit cards initially, behind the scenes those systems will be upgraded to new cryptographic standards to maintain security. Failing to do so could leave current methods of digital banking and e-commerce vulnerable to quantum-era criminals.
NIST PQC Competition and Global Effort: In 2016, NIST launched an open competition to identify and standardize quantum-resistant cryptography. This process has been highly collaborative and international, with 82 algorithm submissions from 25 countries.
Over multiple elimination rounds, NIST and experts worldwide scrutinized candidates for security and performance. In 2022, NIST announced the four algorithms it had selected for standardization (one for encryption/key establishment and three for digital signatures). Draft standards were released in 2023, and the first official PQC standards were published in 2024.
Work continues, with an additional code-based encryption scheme (HQC) selected in 2025 to ensure diversity.
Other nations and regions – including the European Union, China, and Japan – are also investing in PQC research and planning, often in coordination with or parallel to NIST’s effort, to secure their own systems. Industry adoption is accelerating, with companies like Google, Cloudflare, IBM, and others beginning to implement PQC in products and services.
Leading Algorithms at a Glance: The frontrunner PQC algorithms each use different hard mathematical problems (lattices, hash functions, error-correcting codes, etc.) believed to resist quantum attacks. For encryption, CRYSTALS-Kyber (lattice-based) is favored for its strong security and efficiency. For digital signatures, CRYSTALS-Dilithium (lattice-based) and Falcon (lattice-based) offer fast signing and verification with relatively small signatures, while SPHINCS+ (hash-based) provides a more conservative approach (no new math assumptions) at the cost of larger signatures.
Another scheme, Classic McEliece (code-based encryption), is renowned for its decades-old security track record, albeit with very large public keys.
Each candidate must balance security, performance, and practicality – for example, lattice-based schemes are fast with small messages but require more computation than current RSA or ECC, and code-based schemes have huge key sizes that could be impractical for memory-constrained devices. Implementing these algorithms securely (avoiding side-channel leaks and integrating with existing protocols) is an ongoing challenge for engineers.
Financial Transactions in a Post-Quantum Era: Banking and payment systems are among the critical infrastructure that must be upgraded before quantum computers arrive. Today’s ATMs, chip-and-PIN credit cards, and online payment platforms often rely on RSA or ECC-based protocols (for PIN encryption, card authentication, or TLS secure connections). Quantum computing could eventually break these, potentially exposing bank account information or allowing fraud if no action is taken.
In a post-quantum world, these traditional activities will either be secured by new PQC algorithms or replaced by new secure methods. In practical terms, the user experience of withdrawing cash or paying online might not drastically change, but the cryptographic handshake between your card/bank and the server will use PQC under the hood. In the long run, some current tools might be phased out as obsolete – for instance, older credit cards or ATMs that cannot be updated might need replacement. We may also see more use of digital wallets or QR-code payments secured by PQC, and possibly new forms of secure digital cash. Financial institutions are already planning for this transition, taking inventory of their cryptographic systems and preparing upgrades now to avoid disruption.
The goal is that consumers continue to trust and use digital payments safely, with PQC ensuring that even the advent of quantum computers won’t compromise the confidentiality of transactions or personal financial data.
The following sections delve into these points in detail, using accessible language to demystify PQC and illustrate the road ahead for our quantum-safe digital future.
Post-Quantum Cryptography refers to cryptographic algorithms (especially for encryption and digital signatures) designed to be secure against attacks by quantum computers. Unlike quantum cryptography (which leverages quantum physics for new ways of transmitting information, such as quantum key distribution, or QKD), PQC algorithms run on conventional computers but are built on mathematical problems that even quantum computers should find intractable. The need for PQC arises from the expected capabilities of future quantum machines. In 1994, mathematician Peter Shor developed Shor’s Algorithm, which showed that a sufficiently powerful quantum computer could factor large integers and solve discrete logarithms exponentially faster than any known classical algorithm.
This is significant because modern encryption schemes like RSA (which relies on the difficulty of factoring) and ECC (elliptic-curve cryptography, relying on the discrete logarithm) would be vulnerable – tasks that would take classical computers billions of years could take a quantum computer mere hours or seconds.
In essence, Shor’s algorithm demonstrated that RSA/ECC “locks” have a “quantum key” that could eventually open them.
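To see what Shor’s insight buys, it helps to look at the classical skeleton of the algorithm: factoring N reduces to finding the order r of a random base a modulo N, and that order-finding step is the only part a quantum computer speeds up. The Python sketch below is a toy illustration under that framing; the brute-force loop stands in for the quantum period-finding step, so it only works for tiny numbers like 15, which is exactly the point.

```python
from math import gcd

def find_order(a, n):
    """Brute-force the multiplicative order of a mod n.
    This is the step Shor's quantum period-finding routine accelerates
    exponentially; classically it can take time proportional to n."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_demo(n, a):
    """Classical post-processing of Shor's algorithm for a toy modulus n."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g                    # lucky: a already shares a factor with n
    r = find_order(a, n)                    # the quantum speedup would happen here
    if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
        return None                         # unlucky choice of a; retry with another
    half = pow(a, r // 2, n)
    return gcd(half - 1, n), gcd(half + 1, n)

print(shor_classical_demo(15, 7))   # (3, 5): the order of 7 mod 15 is 4, and 7**2 = 4 mod 15
```

For a 2048-bit RSA modulus the same loop is hopeless on any classical machine; quantum period finding is precisely what closes that gap.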
Today’s public-key cryptosystems are secure only under the assumption that adversaries lack such quantum capabilities. Experts warn that once “cryptographically relevant” quantum computers exist (machines with enough stable, error-corrected qubits to run Shor’s algorithm on large keys), they could “crack all RSA/ECC cryptography” used in practice.
For example, RSA-2048 (widely used for secure websites and VPNs) could theoretically be broken in around 10 seconds by a perfect quantum computer with about 4,000 logical qubits.
Likewise, elliptic-curve schemes (like the ECDSA signatures securing Bitcoin or your phone’s secure communication apps) would succumb to quantum algorithms even faster (ECC is actually an “easier target” than RSA in terms of required qubits). However, current quantum computers are far from this scale – they have only tens or low hundreds of physical qubits and are noisy (error-prone), nowhere near the thousands of error-corrected qubits needed to threaten RSA/ECC. So the threat is not immediate, but it is anticipated in the future (some optimists project a decade or more, though no one knows for sure). Because of this uncertainty, security experts advocate starting the transition to PQC well before quantum computers arrive. One reason is “harvest now, decrypt later” attacks.
An adversary could record sensitive encrypted data today (for example, intercepting and saving an encrypted financial transaction or government communication) and simply store it. Years later, if they obtain a quantum computer, they can decrypt that saved data. Thus even data exchanged now can be at risk in the future if it needs to remain confidential for a long time. To counter this, new algorithms must be deployed in advance. PQC algorithms are designed around math problems believed to be resistant to both classical and quantum attacks – for example, problems based on lattices (geometric structures in multidimensional grids), error-correcting codes, multivariate equations, or hash functions. Crucially, these new schemes have to be practical for real-world use: that means reasonably fast, with manageable key sizes and message sizes, and able to integrate into existing protocols.
In summary, PQC is a proactive defense: developing and standardizing encryption that can withstand the power of quantum computing. It’s a global effort involving academia, industry, and government agencies. The next sections explore what adopting PQC means for society and what progress has been made so far.
The advent of quantum computers will challenge the security of virtually every sector that relies on digital cryptography. Experts often emphasize that the impact of quantum computing on digital security and privacy could be one of its most immediate societal effects.
As PQC is introduced to counter this threat, we can expect changes across our digital infrastructure.
Despite the need for change, experts reassure that this transition can be managed much like previous crypto migrations (e.g., from weaker algorithms like DES to AES, or from SHA-1 to SHA-256 hashing). It will require careful planning, testing, and widespread software updates. It’s a long-term effort: NIST estimates it can take 10–20 years to fully deploy new cryptographic standards across all systems.
During that time, both old and new algorithms will coexist. Organizations are urged not to “procrastinate” – waiting too long could leave a gap if a breakthrough in quantum computing occurs faster than expected.
The overarching societal impact of PQC will be an increased resilience of our digital world: if done right, users will continue banking, shopping, and communicating with confidence that their privacy and security endure in the quantum era.
One of the most significant initiatives propelling PQC forward has been the NIST Post-Quantum Cryptography Standardization Project. NIST (the U.S. National Institute of Standards and Technology) recognized early on the need to prepare for quantum threats. In 2016, NIST formally launched an open call for quantum-resistant cryptographic algorithms.
This call was global – open to scientists and teams worldwide, emphasizing collaboration over competition.
The response was tremendous: by the end of 2017, NIST had received 69 complete algorithm submissions that met their requirements.
These came from academia, industry, and government research labs around the world (in total, researchers from 25 countries contributed to proposals).
The submissions spanned a variety of mathematical approaches – lattice-based schemes, code-based schemes, hash-based, multivariate polynomial, even exotic ones like isogeny-based encryption. Given the volume, NIST structured the evaluation as a multi-round elimination process, somewhat analogous to a tournament (though NIST carefully calls it a “selection process” rather than a competition).
The goals were to evaluate security (resistance to all known attacks), performance (speed and resource use), and other factors like key sizes, bandwidth, and ease of implementation.
Over the next few years, NIST and the global cryptographic community scrutinized these candidates through workshops and conferences.
In brief, what began with preliminary workshops in 2015–2016 led to a formal call in 2016, successive rounds of narrowing the candidates from 2017 through 2020, and initial standards by 2024. Global collaboration was a hallmark of this project – NIST “rallied the world’s cryptography experts,” and many candidates were joint efforts by international teams. For instance, the CRYSTALS algorithms had contributors from Europe and North America; Classic McEliece is based on an algorithm invented in 1978 by an American researcher but had updated proposals from European researchers; HQC was developed by French researchers; and so on. Throughout the process, cryptographers worldwide participated in the analysis, often publishing research papers that informed NIST’s decision-making. This open process increased confidence that the chosen algorithms have withstood intense scrutiny.
It’s worth noting that PQC standardization is also being pursued outside NIST. The European Telecommunications Standards Institute (ETSI) has a Quantum-Safe Cryptography working group that has held workshops since the mid-2010s. Germany’s BSI, France’s ANSSI, and other national bodies have been tracking NIST’s process closely to align their future standards. China has launched its own PQC standardization effort (separate from but informed by NIST’s results), reportedly favoring some lattice-based and hash-based algorithms in its standards. Japan has supported PQC research through its National Institute of Information and Communications Technology (NICT) – the earlier example of a PQC smart card came from a Japan-led project.
In other words, while NIST’s competition is the de facto focal point (much as earlier NIST competitions produced AES and SHA-3, used globally), it is by no means a purely American effort; it’s a worldwide quest to secure the future internet.
By mid-2024, NIST had released draft standards (FIPS 203, 204, 205) for Kyber (now renamed “ML-KEM”), Dilithium (“ML-DSA”), and SPHINCS+ (“SLH-DSA”).
These were finalized into official standards in August 2024.
With HQC’s selection in 2025, additional standards will follow. Governments and industries now have concrete algorithms to implement, and we’re entering the phase of deploying these in real-world systems, guided by the standards and best practices that come out of this long competition process.
While many algorithms were evaluated, a few stand out as the leading candidates that will form the core of post-quantum cryptography. Each has different strengths and weaknesses. Below is a comparison of some key algorithms and their characteristics (security basis, performance, and implementation considerations):
CRYSTALS-Kyber (KEM) – Lattice-based (Module-LWE); encryption/key establishment. Public key ≈ 1184 bytes, ciphertext ≈ 1088 bytes (openquantumsafe.org).
Pros: Very fast key generation and encryption; relatively small keys and ciphertexts (on par with or smaller than RSA keys); chosen as NIST’s primary PQC encryption standard.
Cons: Relies on newer hardness assumptions (lattice problems) – well studied by now, but not as time-tested as RSA; requires more computing power on low-end devices than ECC (though still efficient).

CRYSTALS-Dilithium (Signature) – Lattice-based (Module-LWE); digital signatures. Public key ≈ 1312 bytes, signature ≈ 2420 bytes.
Pros: Efficient signing and verification; moderate-size keys and signatures (a few kilobytes); chosen as a main signature standard due to strong security and simplicity of implementation (no exotic math).
Cons: Signatures are larger than current ECC signatures (which are ~64 bytes), meaning more bandwidth and storage use – acceptable for most applications, but not ultra-compact.

Falcon (Signature) – Lattice-based (NTRU); digital signatures. Public key ≈ 897 bytes, signature ≈ 666 bytes.
Pros: Compact and fast. Falcon signatures are much smaller than Dilithium’s (under 1 KB), which is useful for bandwidth-sensitive cases (e.g., blockchain transactions, DNSSEC). Verification is very fast. Provides diversity as an alternative lattice approach (NTRU lattices).
Cons: Trickier to implement. Falcon uses complex mathematics (floating-point arithmetic in the Fourier domain), and implementing it securely against side-channel attacks is more challenging. It is considered more “delicate” for programmers, so it may be used where its size advantage is critical, but not everywhere.

SPHINCS+ (Signature) – Hash-based (Merkle trees); digital signatures. Public key ≈ 32 bytes, signature ≈ 7856 bytes (≈7.7 KB).
Pros: Most conservative. Security relies only on well-understood hash functions (like SHA-256); even if quantum algorithms improve, hash functions only lose a modest amount of security (Grover’s algorithm halves the effective security level, which is mitigated by using larger outputs). The stateless design avoids state-management issues. Ideal for high-assurance applications that value long-term security over performance.
Cons: Large signatures and slower speed. Signatures range from about 8 KB at 128-bit security to tens of kilobytes at higher levels – far larger than lattice signatures. Signing and verification are slower, which could be a bottleneck when handling many signatures per second. Thus SPHINCS+ might be used sparingly, where its unique security property is needed (for example, auditing systems or software signing), rather than for every TLS connection.

Classic McEliece (KEM) – Code-based (binary Goppa codes); encryption/key encapsulation. Public key ≈ 0.5–1 MB, ciphertext ≈ 128–240 bytes.
Pros: Highly trusted security. Based on a problem studied since 1978 – in over 40 years, no one has found a viable attack that significantly weakens it. Very small ciphertexts and very fast encryption/decryption. Provides excellent diversity (non-lattice).
Cons: Enormous public keys (hundreds of kilobytes to a megabyte), impractical for many uses such as TLS certificates or memory-constrained IoT devices. This limits McEliece to niche applications unless key size can be reduced or one is willing to accept the storage/transmission cost. Because of this, NIST kept McEliece as an alternate; it might still be standardized for specialized scenarios (e.g., secure backups or as a root-certificate algorithm) where its large key is manageable.
(Sizes above are approximate and refer to the smallest commonly used parameter sets, roughly NIST security levels 1–3. “KEM” refers to a key encapsulation mechanism, used for establishing shared secrets in encryption protocols.)
As the table shows, there is a trade-off triangle between security assumptions, performance, and size. Lattice-based schemes like Kyber and Dilithium are generally favored for broad use: they are fast (on the order of microseconds to milliseconds for operations) and have key/signature sizes in the kilobyte range, which today’s networks and hardware can handle easily. Their security rests on problems like the Learning With Errors (LWE) problem on lattices, which is believed to be hard even for quantum computers. While not proven, lattice problems have been studied for a few decades and withstood all attempts so far, giving confidence. One advantage is that symmetric encryption and hash functions remain secure against quantum attacks (just with larger keys/hashes), and lattice schemes have some connections to those (in that their security reductions often rely on hash functions too), further bolstering trust.
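For readers curious what “Learning With Errors” actually looks like, the toy NumPy sketch below builds an LWE instance in a few lines; the parameters are deliberately tiny and completely insecure, and real schemes such as Kyber use structured module variants, but the core idea – a system of linear equations hidden by small random noise – is the same.

```python
import numpy as np

# Toy Learning With Errors (LWE) instance -- illustrative only, NOT secure parameters.
rng = np.random.default_rng(0)
n, m, q = 8, 16, 97                      # tiny dimension and modulus for readability

A = rng.integers(0, q, size=(m, n))      # public, uniformly random matrix
s = rng.integers(0, q, size=n)           # secret vector
e = rng.integers(-2, 3, size=m)          # small "error" added to every equation

b = (A @ s + e) % q                      # (A, b) is published; s and e stay secret

# Without e, solving A @ s = b mod q for s is simple linear algebra; with the
# noise, recovering s from (A, b) is the LWE problem that schemes like Kyber
# and Dilithium build on.
print("public matrix A:", A.shape)
print("public vector b:", b)
```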
Hash-based signatures (SPHINCS+) have the appeal of using no unproven math assumptions – if our hash functions (the SHA-2 and SHA-3 families) remain secure, so will SPHINCS+. This “insurance” comes at the cost of efficiency. For many consumer applications, an 8 KB signature is actually not a deal-breaker (consider that a single photo on your phone is several megabytes; 8 KB is trivial in comparison), but the slower speed and the bulk do add up if, say, a server must verify thousands of signatures per second.
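The intuition behind hash-based signatures shows up most clearly in the simplest member of the family, a Lamport one-time signature, sketched below in Python. SPHINCS+ is far more elaborate (it arranges many one-time keys into Merkle trees so a single key pair can sign many messages), but its security rests on the same “hard to invert a hash” assumption. This is an illustrative sketch only; note that each Lamport key pair may safely sign just one message.

```python
import os, hashlib

def H(data):
    return hashlib.sha256(data).digest()

def keygen():
    # 256 pairs of random preimages: one per possible bit value of the digest
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]     # public key = hashes of all preimages
    return sk, pk

def digest_bits(msg):
    d = H(msg)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg):
    # Reveal exactly one secret preimage per bit of the message digest
    return [sk[i][bit] for i, bit in enumerate(digest_bits(msg))]

def verify(pk, msg, sig):
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(digest_bits(msg)))

sk, pk = keygen()
sig = sign(sk, b"transfer $100")
print(verify(pk, b"transfer $100", sig))   # True
print(verify(pk, b"transfer $999", sig))   # False (except with negligible probability)
```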
Code-based systems like McEliece and HQC have very large public keys, which historically made people shy away from them. However, HQC (selected in 2025) uses structured codes to shrink keys somewhat (HQC’s public key is on the order of a few thousand bytes, much smaller than McEliece’s) at the expense of a larger ciphertext (also a couple of thousand bytes). This is a pattern in PQC design: we can often trade off key size against ciphertext size by adding structure. Kyber, for example, has much smaller keys than an unstructured lattice scheme would, because it uses algebraic structure (module lattices) – but one must be careful that this structure doesn’t introduce a weakness. NIST was cautious about structured vs. unstructured schemes: it picked mostly structured ones for efficiency, but tried to ensure nothing known compromises their security.
Another aspect is implementation challenges. Many PQC algorithms require careful programming to avoid side-channel leaks (like timing or power consumption variations that could reveal secrets). For instance, lattice algorithms involve lots of matrix arithmetic and random noise sampling; ensuring that is done in constant time and without leakage is an active area of research. The PQC finalists have reference implementations, but developers are now creating optimized versions for specific platforms (CPUs, smart cards, hardware accelerators). Industry consortia and open-source projects (like Open Quantum Safe, and PQCrypto libraries) are working to make adoption easier by providing tested implementations.
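As a concrete example of how those tested implementations look to a developer, the sketch below runs a Kyber key encapsulation through the Open Quantum Safe project’s Python bindings (the oqs module). The algorithm identifier is version-dependent (“Kyber768” in older releases, “ML-KEM-768” after the FIPS 203 rename), so treat the name and the printed sizes as assumptions to check against your installed library.

```python
import oqs  # liboqs-python bindings from the Open Quantum Safe project

# NOTE: the identifier below depends on the installed liboqs version;
# newer releases use the FIPS 203 name "ML-KEM-768" instead of "Kyber768".
ALG = "Kyber768"

with oqs.KeyEncapsulation(ALG) as receiver, oqs.KeyEncapsulation(ALG) as sender:
    public_key = receiver.generate_keypair()                         # ~1184 bytes at this level
    ciphertext, secret_at_sender = sender.encap_secret(public_key)   # ~1088-byte ciphertext
    secret_at_receiver = receiver.decap_secret(ciphertext)
    assert secret_at_sender == secret_at_receiver                    # both sides share a key
    print(len(public_key), len(ciphertext), len(secret_at_sender))
```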
In summary, the leading PQC algorithms are suitable replacements for our current cryptosystems, each with some caveats.
The good news is that in tests and trials so far, these algorithms have shown they can run on everything from cloud servers to smartphones – sometimes with minimal impact. For example, Google and Cloudflare ran an experiment adding Kyber to TLS handshakes and found it workable, with only modest increases in handshake packet sizes and computation.
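Those trials used a hybrid approach: the session key is derived from both a classical shared secret (e.g., from X25519) and a post-quantum shared secret (from Kyber), so the connection stays secure as long as either component remains unbroken. The sketch below shows only that combining step, as a minimal HKDF-style derivation in Python with random stand-ins for the two secrets; real TLS integrations feed the concatenated secrets into the protocol’s own key schedule, and the function and label names here are illustrative only.

```python
import hashlib, hmac, os

def hkdf_sha256(secret, salt, info, length=32):
    """Minimal HKDF (RFC 5869) over SHA-256: extract, then a single expand block."""
    prk = hmac.new(salt, secret, hashlib.sha256).digest()
    okm = hmac.new(prk, info + b"\x01", hashlib.sha256).digest()
    return okm[:length]

# Stand-ins for the two handshake outputs: in a real hybrid handshake the first
# would come from X25519 and the second from a Kyber/ML-KEM encapsulation.
classical_secret = os.urandom(32)
post_quantum_secret = os.urandom(32)

# Concatenating both secrets means the derived session key stays safe as long
# as EITHER the classical or the post-quantum component remains unbroken.
session_key = hkdf_sha256(
    classical_secret + post_quantum_secret,
    salt=b"\x00" * 32,
    info=b"demo hybrid handshake",
)
print(session_key.hex())
```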
As PQC moves from theory to practice, ongoing evaluation will continue, but we now have a toolkit of quantum-safe algorithms that seem suitable for protecting society’s digital infrastructure going forward.
Among the daily activities that could be disrupted by quantum threats are those involving financial transactions – using ATMs, swiping or inserting credit cards, and paying online. These systems rely heavily on cryptography for security, and thus will be significantly affected (or transformed) by the transition to PQC. Let’s consider how these might change:
In all likelihood, financial transactions won’t become obsolete but will undergo an invisible metamorphosis. Much like how we moved from swipe cards to chip-and-PIN for better security, we will move from classical crypto to post-quantum crypto in our payment systems. Chip cards didn’t make credit cards obsolete – they made them more secure. Similarly, PQC will make digital payments more secure against future threats. Consumers might only notice subtle changes: perhaps new cards, maybe slightly longer transaction times in some cases (if an old payment terminal is slow with the new math), or new security policies (like banks encouraging the use of updated mobile banking apps that support PQC). The financial industry is highly motivated to not let quantum computing undermine trust, because trust is the bedrock of banking.
Encouragingly, experts say we are still within a safe time window to do this right: quantum computers are not an immediate threat, which gives the industry a number of years to test and roll out PQC.
But the work must begin now. In the words of one financial cybersecurity expert: “For us, it’s not an option to just wait and see what happens. We want to be ready and implement solutions as soon as possible.”
This proactive stance means that by the time you insert your card at an ATM in the post-quantum 2030s, the transaction will proceed securely – and quantum hackers will be left empty-handed.
The transition to post-quantum cryptography is a global effort, not confined to any one country or sector. Here we highlight some global initiatives and trends in industry adoption.
In summary, the world is mobilizing to meet the quantum challenge. Collaboration is key: nations are sharing research, companies are open-sourcing their implementations, and standards bodies are ensuring interoperability. While there may be geopolitical divergence in exactly which algorithms are adopted (for instance, one country might favor a locally developed algorithm), the underlying mathematical problems tend to be similar, and many PQC algorithms are converging on a few well-vetted families (lattice- and hash-based for now, with code-based as a backup). For the general public, the reassuring message is that PQC is not some theoretical fancy – it is a very active engineering project, with prototypes running today and deployment plans in motion across the globe. The goal is that the average person, wherever they are in the world, will continue to enjoy a secure digital life – chatting, shopping, traveling, banking – without interruption, even as quantum computers rise in the background.
The journey to a post-quantum cryptographic world is underway. PQC provides the tools to ensure that our digital society – built on billions of secure transactions and communications every day – remains safe against the next leap in computing power. We have identified strong candidate algorithms, vetted by years of analysis, and the first standards are in place. The impact will be felt gradually: software updates here, a new card or device there, maybe a news headline about a “quantum-safe” VPN or browser. For most people, PQC will simply become the new normal for security, much like longer passwords or chip cards did – a necessary evolution to keep trust in our systems.
Experts paint a hopeful picture that if we act in time, critical data and infrastructure will transition smoothly, and the quantum computer will be a marvel for science and industry without becoming a catastrophe for cybersecurity. Society’s dependence on digital encryption will only grow (think of expanding IoT, autonomous vehicles, digital currencies), so PQC is arriving not a moment too soon. As individuals, being aware of this change is useful: over the coming years, you might hear terms like “quantum-ready” or “X.509 certificates with Dilithium signatures” – these are indicators that organizations are embracing the future. Governments and businesses are investing now so that, by the time a quantum computer is powerful enough to matter, the world’s secrets and transactions will already be locked by new quantum-proof keys.
In the end, PQC is about preserving the privacy and security principles that underpin the internet and modern life, even in the face of groundbreaking technological shifts. It’s a fascinating convergence of advanced mathematics, computer science, engineering, and policy on a global scale. And while quantum computers promise to solve complex problems in chemistry, optimization, and AI, thanks to PQC they won’t get to solve the problem of undermining our encryption. The collaborative effort of the international cryptographic community is ensuring that when the quantum future arrives, we’ll be ready.