Hardware wallets are constrained embedded devices built to operate in a hostile environment. They sign transactions, keep keys off the host, and remain safe even if the connected computer is compromised. They resist physical attackers through secure elements, hardened boot chains, and side-channel mitigations, and those same mechanisms define what the device can safely do and what it cannot.
NIST finalized its first post-quantum signature standards in August 2024 (ML-DSA FIPS 204 and SLH-DSA FIPS 205). Whether existing hardware wallets can run these schemes safely, inside the trust boundaries they shipped with, depends on where signing executes, how firmware updates are verified, how much resource headroom exists, and how much side-channel risk the platform can absorb.
Wallet hardware takes years to design and ship. Secure elements take years to develop and certify. Major chain migrations take years because they require consensus and coordination. Those timelines stack. If the ecosystem wants widely deployed post-quantum transaction signing in hardware wallets before quantum risk becomes acute, the engineering work has to start while elliptic curves are still the default.
A hardware wallet is a signing appliance. The host constructs an unsigned transaction and sends it to the device over USB or Bluetooth. The device displays the details on its own screen (recipient, amount, fees), the user approves with a physical action, and the device returns a signature. Crucially, the private key never leaves the device.
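Conceptually, the split of responsibilities looks like this. The sketch below is a toy Python model, not any vendor's protocol, and the HMAC stands in for a real signature scheme:

```python
from dataclasses import dataclass
import hashlib
import hmac

@dataclass
class UnsignedTx:
    recipient: str
    amount_sats: int
    fee_sats: int

class HardwareWallet:
    """Toy signing appliance: key material never leaves this object."""

    def __init__(self, seed: bytes):
        self._key = seed  # private key material, device-internal only

    def sign(self, tx: UnsignedTx, user_approved: bool) -> bytes:
        # The real device shows recipient/amount/fee on its own screen
        # and waits for a physical confirmation before signing anything.
        if not user_approved:
            raise PermissionError("user rejected on device")
        digest = hashlib.sha256(
            f"{tx.recipient}|{tx.amount_sats}|{tx.fee_sats}".encode()
        ).digest()
        # HMAC stands in for a real signature scheme in this sketch.
        return hmac.new(self._key, digest, hashlib.sha256).digest()

# Host side: build the transaction, receive only a signature back.
device = HardwareWallet(seed=b"\x01" * 32)
sig = device.sign(UnsignedTx("bc1q...", 50_000, 300), user_approved=True)
print(len(sig))  # 32-byte tag in this toy model
```

The point of the model is the boundary: the host sees `sig` but never `_key`, and nothing is produced without the approval step.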
Where signing runs on the device, and what is allowed to touch key material, determines whether post-quantum schemes can be added later. Most consumer wallets follow one of two designs.
Secure-element-centric. The private key lives inside a secure element (SE) and signing happens inside that same chip. An SE is a small programmable computer with physical countermeasures (tamper sensors, shielding, glitch resistance), a constrained interface, and often a certification process. The rest of the device can request operations from the SE but cannot read raw key material out. The SE's operating system and crypto library define which signature schemes are available. Third-party code can only call what the SE platform exposes.
MCU-centric. The private key lives on the main microcontroller and signing runs on that MCU in firmware. These devices still keep keys off the host and enforce local approval. Algorithm changes can be implemented and shipped without requiring platform-level changes from an SE vendor or new certification cycles. The trade-off is the physical attacker model. A general-purpose MCU has fewer built-in countermeasures than a secure element, so side-channel resistance depends more on constant-time implementation and device design.
Post-quantum cryptography pressure-tests both architectures. In an SE-centric design, new signature schemes require the vendor to add support at the SE platform layer, a process that involves SE firmware development, validation, and potentially recertification. In an MCU-centric design, new schemes can be implemented directly in application firmware, but still have to fit within tight CPU, RAM, and side-channel constraints on a less physically hardened processor.
Ledger and Trezor are the two largest consumer hardware wallet vendors, and they represent these two architectures.
Ledger uses a dual-chip design. A secure element runs Ledger OS and performs all cryptographic operations. A separate MCU manages USB, Bluetooth, and the display. Ledger describes the MCU as a "dumb router" that does not perform application logic or store secrets (Ledger hardware architecture). Signing happens inside the SE through OS syscalls like cx_ecdsa_sign_no_throw, and key derivation runs there via calls like os_perso_derive_node_bip32 (Ledger cryptography API).
The secure elements in recent Ledger devices are the ST33J2M0 (Nano X) and ST33K1M5C (Nano S Plus, Stax, Flex) (Ledger hardware architecture). Public ST documentation lists the ST33J2M0 at 50 KB user RAM and up to 2048 KB user flash, and the ST33K1M5C at 64 KB user RAM and up to 1.5 MB user flash (ST33J2M0, ST33K1M5C). The usable budget for any single signing feature is smaller, as Ledger OS and its isolation mechanisms consume part of those resources.
Adding post-quantum signatures to a Ledger device therefore requires Ledger to add support at the SE platform layer. Running PQ signing on the MCU instead would conflict with Ledger's trust boundary: the MCU is not allowed to handle key material, so moving signing there changes the security model.
Trezor splits trust differently. Recent devices include secure elements, but the SE is not where transaction signatures are computed. Trezor describes the SE as enforcing physical access controls and contributing a secret used to protect private keys on the main MCU (Trezor secure elements). In effect, the SE ensures the keys are encrypted at rest, but signing runs in firmware on the MCU. The firmware is open source, and the ECDSA signing path is implemented as C functions in the crypto codebase (trezor-firmware/crypto/ecdsa.c). The signing path is independently auditable, and algorithm upgrades do not require SE vendor involvement or recertification.
Memory is still a constraint. Bitcoin signing on Trezor uses a streaming protocol because the device cannot store an entire transaction in memory, so data arrives in chunks (Trezor Bitcoin signing). Trezor's documentation notes that transactions can be several hundred kilobytes, while older devices have only 64 KB of memory (Trezor message workflows). Newer Safe devices run STM32U5-series microcontrollers (Trezor docs). The STM32U575/U585 variants in that family offer up to 2 MB flash and 786 KB RAM; higher-end STM32U5 variants go up to 4 MB flash and 2.5 MB RAM (STM32U5 series). Post-quantum signing still competes with transaction parsing, UI, safety checks, and update logic for that space.
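The streaming approach works because hashing is incremental: the device only ever holds a fixed-size hash state plus one chunk, regardless of transaction size. A minimal sketch (the chunk size is a hypothetical value, not Trezor's actual message size):

```python
import hashlib

CHUNK = 2048  # hypothetical per-message chunk size, for illustration

def digest_streamed(tx_bytes: bytes) -> bytes:
    """Hash a transaction chunk by chunk, as a streaming protocol would.

    Peak memory is one chunk plus the hash state, never the whole
    transaction.
    """
    h = hashlib.sha256()
    for i in range(0, len(tx_bytes), CHUNK):
        h.update(tx_bytes[i:i + CHUNK])  # one host->device message per chunk
    return h.digest()

tx = b"\x00" * 300_000  # a several-hundred-kilobyte transaction
assert digest_streamed(tx) == hashlib.sha256(tx).digest()
print("streamed digest matches one-shot digest")
```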
The Trezor architecture makes ML-DSA-class upgrades more plausible as firmware updates. However, the MCU lacks SE-grade physical countermeasures, so side-channel resistance is entirely an implementation responsibility.
The two NIST standards most relevant to hardware wallets are ML-DSA (FIPS 204, lattice-based, formerly Dilithium) and SLH-DSA (FIPS 205, hash-based, formerly SPHINCS+).
Signature size. ECDSA signatures are 64–72 bytes. Ed25519 signatures are 64 bytes. ML-DSA-44 produces a 2,420-byte signature and a 1,312-byte public key. SLH-DSA's smallest parameter set produces a 7,856-byte signature. On a device where Bitcoin signing already requires a streaming protocol to handle transactions within 64 KB, a 30–120x increase in signature size compounds the problem. See our reference howbigistoobig.com for more detail.
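As a quick sanity check on those ratios, using the sizes cited above:

```python
# Signature sizes in bytes, from the figures cited in the text.
sizes = {
    "ECDSA (secp256k1)": 72,   # DER-encoded upper bound
    "Ed25519": 64,
    "ML-DSA-44": 2420,
    "SLH-DSA-128s": 7856,
}

baseline = sizes["Ed25519"]
for name, sig in sizes.items():
    print(f"{name:18s} {sig:5d} B  ({sig / baseline:6.1f}x vs Ed25519)")
```

ML-DSA-44 lands near 38x the Ed25519 baseline and SLH-DSA's smallest set near 123x, which is where the 30–120x range comes from.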
Signing speed. The pqm4 project benchmarks PQC on Cortex-M4 class microcontrollers (pqm4). The original pqm4 paper reported SPHINCS+ signing times from 22 seconds to 88 minutes at 24 MHz (pqm4 paper). ML-DSA is faster but has variability due to rejection sampling, which affects both user experience and side-channel analysis. Trezor's STM32U5 runs at 160 MHz, which helps, but signing still has to complete within a user-acceptable approval window alongside transaction parsing and display.
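A rough way to translate the pqm4 figure to newer silicon is linear clock scaling. This is a back-of-envelope estimate only: it assumes signing is CPU-bound and ignores flash wait states, memory bandwidth, and implementation differences between cores:

```python
def scaled_time(seconds_at_ref: float, ref_mhz: float, target_mhz: float) -> float:
    """Rescale a benchmark time to a different clock, assuming the cycle
    count stays constant (a simplification)."""
    cycles = seconds_at_ref * ref_mhz * 1e6
    return cycles / (target_mhz * 1e6)

# pqm4's reported 22 s best-case SPHINCS+ signing at 24 MHz,
# rescaled to the STM32U5's 160 MHz:
estimate = scaled_time(22, 24, 160)
print(f"~{estimate:.1f} s")  # ~3.3 s, still a long wait at the approval screen
```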
Working memory. RAM is often the binding constraint. Standard pqm4 implementations of Dilithium (now ML-DSA) require 50–100 KiB of memory on Cortex-M4. Memory-optimized implementations exist; compact Dilithium3 has been demonstrated at under 12 KiB and further work has pushed key generation and signing below 7 KiB but these trade signing speed for memory savings. On a Ledger SE with 50–64 KB of total user RAM shared with Ledger OS, the standard 50–100 KiB range may exceed what is available for a single signing operation. On a Trezor MCU with 786 KB, ML-DSA fits more comfortably, but signing still competes with transaction parsing, UI rendering, and safety checks for the same memory.
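Putting the RAM numbers above side by side makes the asymmetry visible. Note that both platform figures are total RAM, so the real headroom after OS, transaction parsing, and UI overhead is smaller on both sides:

```python
# RAM requirements (KiB) from the implementations discussed above.
impl_ram_kib = {
    "ML-DSA standard (pqm4)": 100,   # upper end of the 50-100 KiB range
    "Dilithium3 compact": 12,
    "Dilithium sign, memory-minimized": 7,
}

# Total RAM (KiB) from the platform figures cited in the text;
# actual available headroom is smaller on both platforms.
platform_ram_kib = {
    "Ledger SE (total user RAM)": 64,
    "Trezor STM32U575 (total SRAM)": 786,
}

for plat, total in platform_ram_kib.items():
    for impl, need in impl_ram_kib.items():
        verdict = "fits" if need <= total else "does NOT fit"
        print(f"{impl:34s} on {plat:30s}: {verdict}")
```

The standard pqm4 working set exceeds the Ledger SE's total user RAM before OS overhead is even counted, which is why the memory-optimized variants, and their speed penalties, matter for SE-class targets.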

Hardware wallets assume a physical attacker. Someone with physical access to the device may observe power consumption, timing, or electromagnetic emissions and recover secrets.
Classical ECDSA and EdDSA on embedded targets have had years of hardening: constant-time implementations, scalar blinding, and related countermeasures. Post-quantum schemes have less maturity. ML-DSA's rejection sampling creates observable variability that depends on secret data. Falcon carries different implementation risks, driven by its sampling process and floating-point arithmetic. NIST explicitly listed side-channel resistance as a desirable property during the PQC standardization process (NIST PQC project).
On an SE-centric device, physical countermeasures built into the chip reduce exposure, but the SE has to support the scheme. On an MCU-centric device, the firmware has more flexibility, but the entire hardening burden falls on the implementation.
The question is whether a device can produce post-quantum signatures for on-chain transactions without breaking the security properties it was designed to provide.
MCU-signing devices (Trezor-style) are the most plausible firmware-upgrade candidates for ML-DSA. The signing path already runs on the MCU. STM32U5-class hardware has enough RAM for ML-DSA working sets. Open questions are whether the implementation fits alongside existing features, whether latency is acceptable, and whether side-channel leakage can be controlled without SE-grade countermeasures.
SE-signing devices (Ledger-style) face a harder constraint. If the SE does not expose post-quantum primitives, third-party apps cannot add them. Ledger could add support via SE platform updates, but feasibility depends on the remaining headroom (50–64 KB of user RAM shared with Ledger OS) and on the certification process. Moving signing to the MCU breaks the trust boundary.
Older devices have tighter CPU and RAM budgets and are often near their limits already. Even if an algorithm is theoretically implementable, practical integration may not fit.
A device can upgrade to post-quantum signing only if the new scheme fits both its resource envelope and its existing trust boundary. If either one fails, it requires hardware replacement.
Even if a device has the resources to run post-quantum signing, it needs a secure way to receive the update that enables it.
Hardware wallets verify firmware updates using digital signatures. When you install a firmware update, the device checks that it was signed by the vendor before it runs any new code. On most devices in circulation, that check uses classical elliptic curve signatures (ECDSA or EdDSA). Under a post-quantum attacker model, those signatures are forgeable. An attacker with a sufficiently powerful quantum computer could produce a valid-looking signature on malicious firmware, and the device would accept it.
The verification logic lives in the bootloader, the lowest layer of code on the device, which runs before anything else. On most hardware wallets, this layer is immutable: it is written at the factory and cannot be updated. That is a security feature because it prevents attackers from tampering with the root of trust. But it also means the signature verification algorithm baked into that layer cannot be changed after the device ships.
Trezor Safe 7 was designed with this constraint in mind. Its bootloader uses a hybrid signature scheme: firmware updates must carry both an EdDSA (Ed25519) signature and an SLH-DSA-128 (SPHINCS+) signature, and both must verify (Going Quantum).
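The logic of a hybrid check is simple but the conjunction matters. In the sketch below the two schemes are simulated with HMAC under independent keys; a real bootloader would call Ed25519 and SLH-DSA verifiers:

```python
import hashlib
import hmac

def verify(key: bytes, fw: bytes, sig: bytes) -> bool:
    """Stand-in verifier: HMAC simulates a signature check in this toy."""
    expected = hmac.new(key, fw, hashlib.sha256).digest()
    return hmac.compare_digest(expected, sig)

def firmware_accepted(fw: bytes, ed_sig: bytes, pq_sig: bytes,
                      ed_key: bytes, pq_key: bytes) -> bool:
    # AND, never OR: a hybrid that accepted either signature alone would
    # be only as strong as its weakest scheme. With AND, forging firmware
    # requires breaking both the classical and the post-quantum scheme.
    return verify(ed_key, fw, ed_sig) and verify(pq_key, fw, pq_sig)

ed_key, pq_key = b"k1" * 16, b"k2" * 16
fw = b"firmware image v2"
ed_sig = hmac.new(ed_key, fw, hashlib.sha256).digest()
pq_sig = hmac.new(pq_key, fw, hashlib.sha256).digest()

print(firmware_accepted(fw, ed_sig, pq_sig, ed_key, pq_key))   # True
print(firmware_accepted(fw, ed_sig, b"\x00" * 32, ed_key, pq_key))  # False
```

The AND construction is what preserves today's security guarantees during the transition: if SLH-DSA turns out to have an implementation flaw, the classical check still holds, and vice versa.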
Older Trezor devices (Safe 5, Safe 3, Model T, Model One) and current Ledger devices use classical-only signature verification in their boot chains. Trezor states that legacy devices relying only on classical elliptic curve verification cannot be patched and will need replacement (Going Quantum). The same constraint applies to any hardware wallet whose immutable boot code verifies updates with only classical signatures.
If every blockchain agreed on one post-quantum signature scheme, hardware wallets could treat the transition like a firmware update. That is not happening.
As established above, most consumer hardware wallets support a small set of elliptic curve primitives: secp256k1 for Bitcoin and Ethereum, ed25519 for newer chains like Solana.
Post-quantum is diverging across every dimension. Bitcoin's BIP-360 proposal, merged into the official BIP repository in February 2026, introduces a quantum-resistant output type (Pay-to-Merkle-Root) and names ML-DSA (Dilithium), FN-DSA (Falcon), and SLH-DSA (SPHINCS+) as candidate signature algorithms (BIP-360). Ethereum's approach is different: account abstraction (EIP-4337) lets users choose their own signature algorithm per account.
Even within a single NIST standard, there are multiple parameter sets. ML-DSA has three security categories. SLH-DSA has 12 standardized parameter sets. Some chains will ship hybrid constructions combining classical and post-quantum signatures during migration, or stateful hash-based signatures like XMSS in constrained settings.
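To make the spread concrete, here are the public-key and signature sizes for the ML-DSA security categories and the three "small" SLH-DSA parameter sets, taken from FIPS 204 and FIPS 205. A multi-chain wallet may need to carry several of these side by side:

```python
# (public key bytes, signature bytes) per parameter set, per FIPS 204/205.
ml_dsa = {
    "ML-DSA-44": (1312, 2420),
    "ML-DSA-65": (1952, 3309),
    "ML-DSA-87": (2592, 4627),
}
slh_dsa_small = {
    "SLH-DSA-128s": (32, 7856),
    "SLH-DSA-192s": (48, 16224),
    "SLH-DSA-256s": (64, 29792),
}

for name, (pk, sig) in {**ml_dsa, **slh_dsa_small}.items():
    print(f"{name:14s} pk={pk:5d} B  sig={sig:5d} B")
```

Even staying within one family, stepping up a security category roughly doubles the combined key-plus-signature footprint, and every additional set a wallet supports means more flash for code and test vectors.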
Key derivation also diverges. BIP-32 hierarchical deterministic derivation, the mechanism that makes a single seed phrase usable across accounts and chains, relies on elliptic curve algebraic structure. Lattice-based schemes do not have a widely accepted equivalent (stay tuned for more on this from the Project Eleven Research Team soon). Migration will require hardened-only derivation, new seed handling, or new standards, and different chains will likely make different choices.
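The property BIP-32 leans on is that the map from private scalar to public key is a group homomorphism, which is what lets a watch-only wallet derive child public keys without any private material. The toy below shows the identity in an additive group of integers mod a prime rather than a real elliptic curve, and uses a fixed stand-in for the HMAC-derived tweak:

```python
# Toy demonstration of the BIP-32 homomorphism. "Private keys" are
# scalars k; "public keys" are k*G in a toy additive group (integers
# mod q), standing in for elliptic curve scalar multiplication.
q = 2**31 - 1   # toy group order (a Mersenne prime)
G = 7           # toy generator

def pub(k: int) -> int:
    """Stand-in for scalar multiplication k*G on a curve."""
    return (k * G) % q

k_par = 123456789
IL = 987654  # in real BIP-32, IL comes from HMAC-SHA512(chain_code, ...)

# Private-side derivation: k_child = k_par + IL (mod group order).
k_child = (k_par + IL) % q

# Public-side derivation: K_child = K_par + IL*G, computed WITHOUT k_par.
K_child_public_only = (pub(k_par) + pub(IL)) % q

print(pub(k_child) == K_child_public_only)  # True: the homomorphism holds
```

Non-hardened derivation is exactly this trick, and it is why it breaks for lattice schemes: ML-DSA keys are structured matrices and vectors with no comparable public-side derivation identity, so only hardened (private-key-in-the-loop) derivation carries over directly.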
For a multi-chain hardware wallet, each additional scheme means more code, more test vectors, more memory pressure, and more surface for side-channel issues. On an SE-centric device, each scheme requires SE platform support. On an MCU-centric device, every implementation competes for the same finite flash.
Most hardware wallets in circulation were not designed for post-quantum transaction signing. Whether a given device can be upgraded depends on three questions: does the signing architecture allow new algorithms without breaking the trust boundary, does the device have enough resource headroom to run them, and can the device securely receive the update that enables them. For the majority of deployed devices, at least one of those answers is no.
Some newer MCU-centric devices with post-quantum boot chains, like the Trezor Safe 7, are positioned to add ML-DSA-class signing via firmware updates. For many legacy devices, both SE-centric and older MCU-centric, the path to post-quantum on-chain signing is hardware replacement.
A hardware wallet shipping today has to choose which post-quantum schemes to support before major blockchains have committed to which schemes they will require. A blockchain committing to a scheme has to consider whether the deployed wallet ecosystem can run it. Both decisions operate on multi-year timescales and neither can fully resolve without the other. That circular dependency is the core problem.
Based on what we see across the PQC and blockchain landscape, here is what we think different parts of the ecosystem should be doing now.
- Wallet vendors should ship post-quantum boot chains in every new device. A device that ships today with classical-only firmware verification is already locked out of a secure upgrade path.
- Wallet vendors should design for ML-DSA resource budgets now, even without final chain commitments. ML-DSA-44 is the most likely first candidate across multiple ecosystems. New hardware designs should ensure enough RAM and flash headroom to run ML-DSA signing alongside existing transaction parsing and UI, not as a replacement for classical schemes, but in addition to them as hybrid signing periods will require both.
- SE vendors need to expose post-quantum primitives. SE manufacturers should be adding ML-DSA and SLH-DSA to their crypto libraries and exposing them through the same interfaces that ECDSA and EdDSA use today.
- Chain developers should consider hardware wallet constraints when selecting PQC schemes. A signature scheme that cannot run on the devices where most users keep their keys is not a practical standard; those constraints should factor into each chain's migration plan.
- Users holding significant value in hardware wallets should not assume their current device will carry them through the transition. If your wallet has a classical-only boot chain, it will eventually need to be replaced regardless of what firmware updates are released. That does not mean you need to act today, but it does mean the next hardware wallet you buy should be evaluated on its post-quantum readiness.
The timelines here are long but they are not indefinite. Secure elements take years to develop. Hardware takes years to ship. Chain migrations take years to coordinate. The engineering decisions being made now, by wallet vendors, SE manufacturers, and chain developers, will determine whether post-quantum transaction signing is available on consumer hardware when it is needed.