Why Post-Quantum Cryptography Matters Now, Not Later

Godfrey Maiwun  ·  February 2026  ·  Cryptography  ·  11 min read

The quantum threat to encryption is not speculative fiction. It is an engineering timeline — and organisations that begin transitioning now will be the ones that are not scrambling when that timeline closes.

The problem with "later"

One of the most common refrains in security leadership is that quantum computing capable of breaking RSA or elliptic-curve cryptography is "decades away." This framing is technically defensible and strategically dangerous in equal measure. The threat does not begin on the day a quantum computer breaks encryption; it exists today, through a class of attacks known as harvest now, decrypt later.

Nation-state adversaries and sophisticated criminal groups are already collecting encrypted traffic at scale: financial records, state communications, health data, intellectual property. The bet is that decryption capability will arrive before the sensitivity of that data expires. For data with a long sensitivity horizon — classified communications, long-lived credentials, strategic plans — the clock is already running.

The question is not whether you have a quantum problem. It is whether you have enough runway to solve it on your own schedule rather than someone else's.
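That runway question can be made concrete with Mosca's inequality: if x (years the data must stay confidential) plus y (years the migration will take) exceeds z (years until a cryptographically relevant quantum computer exists), the data is already exposed to harvest-now collection. A minimal sketch, with purely illustrative numbers:

```python
# Mosca's inequality: data is at risk when x + y > z, where
#   x = years the data must remain confidential
#   y = years the migration will take
#   z = years until a cryptographically relevant quantum computer
def at_risk(shelf_life_years: float, migration_years: float,
            years_to_quantum: float) -> bool:
    return shelf_life_years + migration_years > years_to_quantum

# Illustrative numbers only -- not predictions.
print(at_risk(shelf_life_years=25, migration_years=7, years_to_quantum=15))  # True: already exposed
print(at_risk(shelf_life_years=2, migration_years=3, years_to_quantum=15))   # False: runway exists
```

The uncomfortable part is that y is the only variable you control, and it is usually larger than anyone expects.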

What NIST has signalled

In August 2024, the National Institute of Standards and Technology finalised its first set of post-quantum cryptographic standards. This is not a theoretical exercise. It is the most consequential cryptographic standardisation event since the adoption of AES in 2001 — an institutional acknowledgement, backed by years of global competition and peer review, that migration needs to begin now and that the window for orderly transition is finite.

The post-quantum transition: a timeline, not a theory.

The primary algorithms are CRYSTALS-Kyber, standardised as ML-KEM (FIPS 203), for key encapsulation, and CRYSTALS-Dilithium, standardised as ML-DSA (FIPS 204), for digital signatures. SPHINCS+ (SLH-DSA, FIPS 205) provides a hash-based signature scheme as a conservative alternative. Each has been publicly scrutinised by the global cryptographic community since NIST opened its competition in 2016. They are not perfect — no cryptographic standard is — but they are ready for production deployment.
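ML-KEM, like any key encapsulation mechanism, exposes three operations: key generation, encapsulation (derive a shared secret plus a ciphertext from a public key), and decapsulation (recover that secret with the private key). The sketch below shows only that interface shape, using classic Diffie-Hellman with toy parameters; it is not post-quantum (Shor's algorithm breaks discrete logs) and not secure. Production code should use a vetted ML-KEM implementation, such as the liboqs bindings.

```python
import hashlib
import secrets

# Interface-shape demo ONLY: classic Diffie-Hellman dressed up as a KEM.
# Toy parameters, not secure, and NOT post-quantum (Shor's algorithm
# breaks discrete logs). Use a vetted ML-KEM implementation in practice.
P = 2**127 - 1   # a Mersenne prime; real groups are chosen far more carefully
G = 3

def keygen() -> tuple[int, int]:
    sk = secrets.randbelow(P - 2) + 1
    return sk, pow(G, sk, P)                  # (private key, public key)

def encapsulate(pk: int) -> tuple[int, bytes]:
    eph = secrets.randbelow(P - 2) + 1
    ct = pow(G, eph, P)                       # ciphertext: ephemeral public value
    ss = hashlib.sha256(str(pow(pk, eph, P)).encode()).digest()
    return ct, ss                             # (ciphertext, shared secret)

def decapsulate(sk: int, ct: int) -> bytes:
    return hashlib.sha256(str(pow(ct, sk, P)).encode()).digest()

sk, pk = keygen()
ct, sender_secret = encapsulate(pk)           # sender side
assert decapsulate(sk, ct) == sender_secret   # receiver derives the same secret
```

Whatever library eventually sits behind these three calls, keeping the interface this narrow is what makes the later swap to ML-KEM tractable.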

Waiting for the "right" quantum-resistant algorithm is a trap. The right algorithm is the one that is standardised, peer-reviewed, and available to your vendors today.

Governments are not waiting. The US Office of Management and Budget issued directives requiring federal agencies to develop quantum-readiness inventories. The UK's NCSC has published migration guidance. The EU's ENISA has followed. If you operate in regulated sectors or supply chains that touch government, compliance timelines will arrive before quantum computers do.

Cryptographic agility — the immediate priority

For most organisations, the immediate priority is not replacing all cryptographic infrastructure overnight. It is cryptographic agility: designing systems so that algorithms can be swapped without requiring wholesale re-architecture. This is a design discipline, not a product you buy.


Cryptographic agility means separating algorithm choice from application logic. It means using abstraction layers — TLS libraries, key management services, certificate authorities — that can be upgraded centrally rather than patched across thousands of endpoints individually. It means documenting where cryptography is used, not just assuming that TLS handles everything.
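One concrete way to achieve that separation is a registry keyed by configuration, so the algorithm in use is a config value rather than a call site. The sketch below is hypothetical and uses HMAC as a stand-in for real signatures, purely to stay self-contained:

```python
import hashlib
import hmac
import os

# Hypothetical signing registry: application code asks for the current
# algorithm, the choice lives in config, and a PQC scheme can later be
# registered without touching any caller. HMAC stands in for real
# signatures here only to keep the sketch self-contained.
_REGISTRY = {
    "hmac-sha256": lambda key, msg: hmac.new(key, msg, hashlib.sha256).digest(),
    "hmac-sha3-256": lambda key, msg: hmac.new(key, msg, hashlib.sha3_256).digest(),
    # "ml-dsa-65": wire in a vetted PQC library here when it is available
}

CONFIG = {"signing_algorithm": "hmac-sha256"}  # central, swappable choice

def sign(key: bytes, message: bytes) -> tuple[str, bytes]:
    alg = CONFIG["signing_algorithm"]
    return alg, _REGISTRY[alg](key, message)   # tag output with its algorithm id

def verify(key: bytes, message: bytes, alg: str, sig: bytes) -> bool:
    return hmac.compare_digest(_REGISTRY[alg](key, message), sig)

key = os.urandom(32)
alg, sig = sign(key, b"release-artifact")
assert verify(key, b"release-artifact", alg, sig)
```

Tagging each signature with its algorithm identifier is the detail that matters: it lets old signatures keep verifying while new ones migrate to a post-quantum scheme.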

Most organisations discover, when they start this inventory, that cryptography is embedded in far more places than their architecture diagrams suggest: in API authentication tokens, in database encryption schemes, in firmware signing pipelines, in backup encryption keys that have not been rotated in years.

The inventory problem

Before you can migrate, you need to know what you are migrating. A cryptographic asset inventory is the foundation of any post-quantum readiness programme — and it is harder to build than it sounds.

The categories to map are:

Algorithms: what is in use (RSA, ECC, AES, SHA-2, and so on).
Locations: where each is used — TLS termination, code signing, database encryption, authentication tokens, backup encryption.
Data: what each protects, and how long that data must remain confidential.
Ownership: which vendors and libraries are responsible for the implementation.

This inventory will surface surprises. Legacy systems using SHA-1 or RSA-1024 that someone assumed were retired. Third-party integrations whose cryptographic choices are opaque. Hardware security modules whose firmware update path is unclear. Vendor SaaS dependencies that have no published PQC roadmap.
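The categories above map naturally onto a structured record, which turns the inventory into something queryable — for example, "every quantum-vulnerable asset protecting data that must stay confidential for a decade or more." A hypothetical sketch; the field names and example entries are illustrative, not a standard schema:

```python
from dataclasses import dataclass

# Public-key algorithms broken by Shor's algorithm on a large quantum computer.
QUANTUM_VULNERABLE = {"RSA", "ECC", "ECDSA", "ECDH", "DH"}

@dataclass
class CryptoAsset:
    name: str                # e.g. "payments API TLS termination"
    algorithm: str           # e.g. "RSA"
    key_size: int
    location: str            # TLS, code signing, database, tokens, backups
    sensitivity_years: int   # how long the protected data must stay secret
    owner: str               # vendor or library responsible

    @property
    def quantum_vulnerable(self) -> bool:
        return self.algorithm in QUANTUM_VULNERABLE

inventory = [
    CryptoAsset("payments API TLS", "RSA", 2048, "TLS termination", 10, "openssl"),
    CryptoAsset("backup archive", "AES", 256, "backup encryption", 25, "internal"),
]

# Long-horizon data behind Shor-breakable algorithms: the harvest-now exposure.
exposed = [a for a in inventory if a.quantum_vulnerable and a.sensitivity_years >= 10]
print([a.name for a in exposed])  # ['payments API TLS']
```

Even a spreadsheet with these six columns is a large step up from architecture diagrams that silently assume "TLS handles it."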

A migration framework

Once the inventory exists, a realistic migration programme follows a triage model: prioritise by risk exposure, not by ease of implementation.

Tier 1 — Highest risk: Data with long-term sensitivity horizons (10+ years), key exchange mechanisms that could be targeted by harvest-now attacks, public-key infrastructure that underpins identity and authentication across the organisation. These move first, even if they are hard.

Tier 2 — Medium risk: Systems using RSA or ECC for authentication or signing, certificate authorities, VPN infrastructure, cloud provider integrations. Plan migration within a 2–4 year horizon.

Tier 3 — Lower urgency: Symmetric encryption (AES-256 is considered quantum-resistant; Grover's algorithm at most halves its effective key strength), hash functions (SHA-256 and above remain acceptable), and short-lived tokens with no long-term sensitivity. These can be addressed as part of normal refresh cycles.
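The triage rules above can be written down directly: long-horizon data and key exchange go to Tier 1, other public-key use to Tier 2, symmetric ciphers and hashes to Tier 3. A hypothetical sketch of that decision logic:

```python
# Public-key algorithms broken by Shor's algorithm; symmetric/hash
# primitives are only weakened (Grover) and stay in Tier 3.
ASYMMETRIC = {"RSA", "ECC", "ECDSA", "ECDH", "DH"}

def triage(algorithm: str, sensitivity_years: int, is_key_exchange: bool) -> int:
    """Assign a migration tier per the risk model above (1 = move first)."""
    if algorithm in ASYMMETRIC and (sensitivity_years >= 10 or is_key_exchange):
        return 1   # harvest-now exposure, or key exchange / foundational PKI
    if algorithm in ASYMMETRIC:
        return 2   # authentication and signing: plan on a 2-4 year horizon
    return 3       # symmetric/hash: fold into normal refresh cycles

assert triage("RSA", 25, False) == 1      # long-lived data behind RSA
assert triage("ECDH", 1, True) == 1       # key exchange, harvest-now target
assert triage("ECDSA", 3, False) == 2     # short-lived signatures
assert triage("AES-256", 25, False) == 3  # symmetric, already quantum-resistant
```

The point of encoding it is consistency: every asset in the inventory gets triaged by the same rule, not by whichever team shouts loudest.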

The migration itself is largely a vendor management exercise — your TLS library, your cloud provider's KMS, your certificate authority, your VPN vendor, your hardware vendor all need to be on post-quantum roadmaps. Begin those conversations now. Vendors who cannot answer questions about their PQC timeline are vendors who will create emergency work for you in three to five years.

The supply chain dimension

Your post-quantum exposure is not limited to the cryptography you directly control. It extends to every vendor and third party whose cryptographic choices affect data you care about. A supplier who encrypts customer exports with RSA-2048 creates risk that belongs to you, not just to them.

Adding PQC readiness to vendor security assessments is one of the highest-leverage things a security team can do right now. It costs almost nothing in effort, and it creates accountability and visibility before the problem becomes urgent.

Starting the conversation internally

Post-quantum cryptography is a topic that sounds esoteric to non-technical leadership. Framing it correctly is a communication problem as much as a technical one. The framing that lands is not "quantum computers will break encryption" — it is "our encryption has an expiry date, and we need to know what it is and plan accordingly." That is a risk management conversation. Every board understands risk management.

The organisations best positioned for the post-quantum era are not those with the biggest budgets. They are those who started the conversation early, built an inventory, and made cryptographic agility a design requirement rather than a retrofit project. The time for that conversation is now.

