Post-quantum cryptography is already important for embedded systems
These days, I am reading more and more about quantum computers. Like any nascent technology, they still raise many open questions. Will QPUs (Quantum Processing Units) replace the high-end CPUs we use today? Will QPUs replace NPUs (Neural Processing Units) and GPGPUs (General-Purpose Graphics Processing Units)? At least to me, this is not yet foreseeable. However, qubit technology is evolving rapidly and is already starting to have a real impact on today’s technology – including new cybersecurity concerns. That extends to embedded computing. Let’s look at the current situation, its impact on embedded computing at the edge, and why post-quantum cryptography needs to be implemented now.
The rise of quantum computing
Today’s QPUs are still large, fragile, and require specialized environments such as cryogenic cooling or ultra-stable photonic systems. The technology is not easy to use, but it is extremely fast: a recent publication shows that Google's Willow QPU performed a specific algorithm 13,000 times faster than a “standard” supercomputer.
This performance level will continue to improve, offering great opportunities for enhanced simulation (molecular structure and reaction simulation, drug discovery, and protein modeling), machine learning, and cryptography.
And that last point – improved cryptography – is where I foresee a potential impact on embedded computing.
Post-quantum cryptography for future-proof embedded systems
The new cryptographic possibilities of QPUs are a good and a bad thing – depending on how they are used. Decrypting stolen encrypted data will become much faster and easier once the “dark forces” have access to QPUs – which might already be the case.
Embedded computer systems have long lifespans and operating times, so they must be hardened against the upcoming risks of quantum computing. This means already equipping them with secure hardware elements, firmware update mechanisms, root-of-trust architectures, and the latest communication protocols. A crucial part of these strategies will be implementing post-quantum cryptography (PQC) algorithms.
Public-key cryptography algorithms like RSA and ECC are particularly vulnerable to future large-scale quantum attacks. To counter this, the National Institute of Standards and Technology (NIST) established a set of post-quantum cryptography standards (FIPS 203, FIPS 204, and FIPS 205) intended to secure a wide range of electronic information against the future threat of quantum computers. The idea is that organizations should already be applying these standards now.
There are already several PQC algorithms, defined in these standards, that could be implemented today to remain future-proof:
| PQC algorithm | Description |
|---|---|
| ML-KEM (formerly CRYSTALS-Kyber) | Standardized in FIPS 203 for key encapsulation / key exchange. Based on lattice problems (learning with errors), which are thought to be difficult for both classical and quantum computers. |
| ML-DSA (formerly CRYSTALS-Dilithium) | Standardized in FIPS 204 for digital signatures. Also lattice-based, offering good security and performance. |
| SLH-DSA (formerly SPHINCS+) | Standardized in FIPS 205. Uses hash-based cryptography, which doesn’t rely on lattice problems. Good as a “backup” in case lattice assumptions are broken. |
| FN-DSA (formerly FALCON) | Expected to become FIPS 206. Also lattice-based, optimized for smaller signatures. |
| HQC (Hamming Quasi-Cyclic) | Recently selected by NIST for standardization (as of March 2025) as a backup KEM. It’s code-based, which provides mathematical diversity (a different hardness assumption than lattices). |
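To give a feel for the lattice assumption behind ML-KEM and ML-DSA, here is a toy single-bit "learning with errors" encryption sketch in Python. This is purely an illustration with deliberately tiny parameters – it is not secure and is not the actual FIPS 203 scheme, which works with structured module lattices and much larger dimensions:

```python
# Toy single-bit LWE encryption (illustration only -- NOT secure).
# Security in real schemes comes from the difficulty of recovering s
# from many noisy samples (a_i, <a_i, s> + e_i mod q).
import random

q = 257   # toy modulus
n = 16    # toy secret dimension
m = 32    # number of public samples
random.seed(0)

# Key generation: secret vector s; public samples (A, b = A*s + e mod q).
s = [random.randrange(q) for _ in range(n)]
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
e = [random.choice([-1, 0, 1]) for _ in range(m)]          # small noise
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]

def encrypt(bit):
    # Sum a random subset of public samples; embed the bit near q/2.
    subset = [i for i in range(m) if random.random() < 0.5]
    u = [sum(A[i][j] for i in subset) % q for j in range(n)]
    v = (sum(b[i] for i in subset) + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    # Removing <u, s> leaves bit*(q//2) plus small accumulated noise.
    d = (v - sum(u[j] * s[j] for j in range(n))) % q
    return 1 if q // 4 < d < 3 * q // 4 else 0

assert all(decrypt(*encrypt(bit)) == bit for bit in (0, 1, 1, 0))
```

Decryption works only because the accumulated noise (at most 32 here) stays well below q/4; an attacker without s faces the full LWE problem, which is believed hard for both classical and quantum computers.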
The requirements for implementing PQC algorithms
PQC algorithms are more computationally demanding, which increases memory, power, and bandwidth requirements – and this means updates will be needed. Older systems handling critical data will likely need to be upgraded soon, and in some cases software patches are only viable if the existing hardware is powerful enough to handle the heavier encryption workload. Any new design must also account for the potential need to upgrade in the future.
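To make that overhead concrete, the sketch below compares approximate signature and ciphertext sizes. The figures are ballpark values drawn from the published parameter sets (and their classical counterparts); exact numbers vary with the parameter choice, so treat them as illustrative rather than normative:

```python
# Approximate sizes in bytes for common parameter sets (ballpark figures,
# not normative values -- consult FIPS 203/204/205 for exact numbers).
sizes = {
    "ECDSA P-256 signature":   64,     # classical baseline
    "RSA-2048 signature":      256,    # classical baseline
    "ML-KEM-768 ciphertext":   1088,   # FIPS 203, security category 3
    "ML-DSA-65 signature":     3309,   # FIPS 204, security category 3
    "SLH-DSA-128s signature":  7856,   # FIPS 205, "small" variant
}

for name, nbytes in sorted(sizes.items(), key=lambda kv: kv[1]):
    print(f"{name:24s} {nbytes:6d} B")
```

An ML-DSA-65 signature is roughly 50 times the size of an ECDSA P-256 signature – a difference that translates directly into flash, RAM, and bandwidth budgets on constrained edge devices.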
This is a straightforward task for applications built on Computer-on-Module standards like SMARC, COM Express, or COM-HPC. These modular building blocks allow for easy upgrades to the latest technology without a full system redesign. And if the COMs selected are IEC 62443-4-1 certified, you can be sure of an even more secure development lifecycle.
What you can do now
Ultimately, it is not clear when quantum computing will become mainstream. What we do know is that static, inflexible hardware will struggle to keep up with the defense mechanisms that will be required. The smartest path forward is to adopt a modular philosophy now. By prioritizing architectures that allow for seamless performance upgrades and certified security, you ensure your embedded systems are not just ready for current threats, but resilient enough for tomorrow’s quantum reality.

