Quantum Computing, Blockchain, and the Power of Nodes: Securing the Future of Decentralization (Part 2)
Nodes, Quantum Threats, and Resiliency. Explore the critical role of blockchain nodes in decentralization, security, and redundancy, while addressing the challenges posed by quantum computing.
Quantum computers have surprisingly not been a major plot device in most sci-fi movies or TV shows, despite their fascinating and potentially world-changing implications. This is odd, considering I’m constantly asked when quantum computers will eradicate all data privacy and bring about armageddon.
Naturally, I consulted my LLM on this mystery, and the response was, “AI often takes the limelight because it’s easier for audiences to grasp: machines becoming smarter and taking over is a more direct and tangible concept.” A tangible concept! Everyone, be warned. So, back to the question of encryption…
Elliptic Curve Cryptography
Elliptic curves, like those used in cryptography, are described by equations of the form:
y^2≡x^3+ax+b (mod p)
Ethereum uses a specific elliptic curve, secp256k1, with the following parameters:
a=0: This simplifies the equation because 0⋅x=0, so the term is omitted.
b=7: A specific constant chosen for this curve.
p: A very large prime number, p=2^256−2^32−977, which defines the finite field over which the curve is defined. This prime ensures the curve operates in a modular arithmetic space.
2^256: This is a very large number, approximately 1.1579209×10^77.
2^32: This is much smaller, equal to 4.294967×10^9.
Subtracting 2^32 and 977 results in a large prime number p≈1.157×10^77.
Here p is extremely close to 2^256, the upper limit of numbers representable with 256 bits, but not exactly equal to it. Cryptographic operations in ECC require a prime modulus to ensure mathematical properties like unique solutions for point addition and scalar multiplication.
A non-prime modulus like 2^256 would break these properties and make some operations easier to exploit. Subtracting 2^32 and 977 yields a value that is prime and avoids these vulnerabilities.
Thus, the equation for secp256k1 is:
y^2≡x^3+7 (mod p)
This curve is widely used in cryptography, particularly in Bitcoin and Ethereum, for its efficiency and security. Secp256k1 was originally chosen by Satoshi Nakamoto for Bitcoin due to its computational efficiency and security properties. Ethereum inherited this choice for consistency.
Notably, secp256k1 is not one of the curves recommended by the National Institute of Standards and Technology (NIST). Some believe Satoshi avoided NIST curves due to concerns about potential backdoors, though there is no evidence of this.
Unlike the smooth curves of calculus, secp256k1 operates over a finite field. This creates a finite set of points on the curve. You can "add" two points on the curve to produce another point on the curve; point addition is the only basic operation available.
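To see what a finite set of points looks like, here is a toy Python sketch that shrinks the prime to 17 so the whole curve fits on screen (the choice of 17 is mine, purely for illustration):

```python
# Enumerate every point on the toy curve y^2 = x^3 + 7 (mod 17).
# The real secp256k1 uses p = 2**256 - 2**32 - 977; the structure is the same.
p = 17
points = [(x, y) for x in range(p) for y in range(p)
          if (y * y - (x ** 3 + 7)) % p == 0]
print(points)       # a finite, discrete set of points, nothing like a smooth curve
print(len(points))  # plus the "point at infinity", this is the whole group
```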
Why Can’t We Reverse Scalar Multiplication?
A private key k is a randomly chosen number in the range between 1 and the order of the curve, n. This order is another large prime of roughly the same magnitude as p, n≈1.158×10^77. The private key k must be kept secret.
The public key P is derived from the private key k by multiplying a predefined generator point G on the curve: P=k⋅G.
G is a fixed point on the curve defined in the secp256k1 standard. The multiplication k⋅G means that G is added to itself k times. This is called scalar multiplication: the process of multiplying a point G (called the generator point) on an elliptic curve by a scalar k (a regular integer) to get another point Q on the curve. Each private key maps to exactly one public key as a result of using the elliptic curve equation. Scalar multiplication is therefore the foundation of all ECC operations, e.g., key generation. Concretely, it involves repeatedly adding the point G to itself a total of k times. For example:
2G=G+G
3G=G+G+G
kG=G+G+…+G (k times)
RSA uses pure modular arithmetic, not scalar multiplication. The core computations involve exponentiation and division with a modulus (e.g., finding remainders), as explained earlier. Elliptic curve cryptography does not use traditional multiplication as in basic arithmetic. Instead, elliptic curve point addition rules are applied. This is always a bit confusing unless you know that the operations involve the mathematical addition of points. Point addition is defined geometrically and is relatively complex, but the idea is this: if you take two points and draw a line through them, the line intersects the curve at a third point (the sum is defined as the reflection of that third point), and this is guaranteed by the way the curve is defined, i.e., its equation constrains the shape.
Modular arithmetic comes into play because ECC operates over a finite field (e.g., numbers modulo p, where p is a large prime, which was p=2^256−2^32−977). Thus, ECC combines scalar multiplication with modular arithmetic, whereas RSA is purely modular arithmetic.
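To make these rules concrete, here is a minimal Python sketch of secp256k1 point addition and double-and-add scalar multiplication. The helper names point_add and scalar_mult are my own, and real wallets use hardened, constant-time libraries; this is only an illustration of the math above:

```python
# Minimal sketch of secp256k1 arithmetic (illustrative only).

p = 2**256 - 2**32 - 977  # the prime field modulus from above

# The generator point G fixed by the secp256k1 standard:
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def point_add(P, Q):
    """Add two points on y^2 = x^3 + 7 (mod p); None is the point at infinity."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None  # P + (-P) = point at infinity
    if P == Q:
        s = 3 * x1 * x1 * pow(2 * y1, -1, p) % p  # tangent slope (doubling)
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, p) % p   # chord slope through two points
    x3 = (s * s - x1 - x2) % p
    return (x3, (s * (x1 - x3) - y1) % p)

def scalar_mult(k, P):
    """k*P by double-and-add: about log2(k) steps, not k repeated additions."""
    result, addend = None, P
    while k:
        if k & 1:
            result = point_add(result, addend)
        addend = point_add(addend, addend)  # doubling
        k >>= 1
    return result

private_key = 123456789                    # toy value; real keys are ~256-bit random
public_key = scalar_mult(private_key, G)   # P = k*G
```

Note that scalar_mult needs only about log2(k) doublings, which is why computing P from k is fast even for 256-bit k, while going backwards has no comparable shortcut.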
Why Elliptic Curves Are Hard to Break
The security of Ethereum's elliptic curve cryptography relies on the fact that, while the number of points on the curve is finite, it is so vast that guessing or calculating the private key from the public key through trial and error is computationally infeasible—at least for now.
But why? We know the formula P = k ⋅ G, where G is a known constant and k is the private key. The sender’s private key, k, is used to sign transactions, and the cryptographic signature reveals enough information for the network to recover the public key, P, during the signature verification process. The public key itself is not stored on the blockchain; it is recomputed from the signature during transaction processing, and since signed transactions are public, anyone can redo that recovery once a wallet has sent a transaction.
If we know both P and G, why can’t we simply calculate k = P / G? At first glance, this might seem straightforward, but it’s not. Division is not defined in this context because elliptic curve scalar multiplication is a one-way, non-linear function. This means there is no operation equivalent to division that allows us to reverse the process.
To illustrate, imagine G is "red," and k is the number of times you mix it with "blue" to create the final color, P (e.g., purple). Observing purple doesn’t provide a defined method to determine how much blue was used to create it. Similarly, in elliptic curve cryptography, the process of scalar multiplication cannot be reversed to isolate the private key, k.
Another way to think of it is a dartboard with several darts sticking in: If someone shows you a dart on the board (the public key), you have no way of knowing which sequence of throws (private key) landed it there. Trying every possible sequence of "throws" (private key) is not practical given the size of numbers.
Unlike traditional linear equations, where you can isolate a variable (e.g., x=y/z), there’s no equivalent operation to "divide" P by G to recover k. Scalar multiplication doesn’t behave like multiplication in basic arithmetic; it involves repeated, iterative operations based on elliptic curve rules. Unlike RSA, where factoring N=p⋅q is hard but conceptually straightforward, elliptic curve math involves an abstract structure that resists shortcuts. It’s like having one equation with two unknowns, where the operations involved don’t allow division or simplification.
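To make that one-way-ness tangible, here is a toy sketch that swaps curve points for plain modular exponentiation (structurally the same search problem); recovering k means trying every candidate:

```python
# Recovering k from P = g^k mod p by exhaustive search -- the only generic
# approach. Toy numbers; with 256-bit values the loop would never finish.
p, g = 101, 2            # small prime and a generator of the group mod 101
k_secret = 53
P = pow(g, k_secret, p)  # the "public" value; easy to compute forwards

k = next(x for x in range(1, p) if pow(g, x, p) == P)
print(k)  # 53, but found only by trying every candidate in turn
```

With a 256-bit modulus instead of 101, that loop would, for all practical purposes, never finish.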
Ancient mathematicians knew about circles but couldn’t calculate their properties until they discovered the relationship between circumference, radius, and π. Once the formula using π was defined, circles became "computable." And for now, we have no π for reversing scalar multiplications.
Machine learning excels at finding patterns or symmetries in data, even when they are not immediately apparent to humans. Researchers hypothesize that ML might detect hidden relationships in the points on elliptic curves that could reveal shortcuts for reversing scalar multiplication. However, the sequence of points generated by scalar multiplication on an elliptic curve appears pseudorandom, even though it is deterministic, and ML algorithms struggle to learn or predict behavior in such random-looking data. So far, this idea has not led to a breakthrough.
From Public Keys to Wallet Addresses
Ethereum wallet addresses are derived from the public key using the Keccak-256 hash of the public key—i.e., the last 20 bytes of the hash are used as the wallet address. This means that, in theory, multiple private keys can map to the same Ethereum address because the hashing process reduces the space size from 256 bits to 160 bits.
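A sketch of that derivation, assuming pycryptodome's Keccak-256 implementation (the helper name eth_address is invented for illustration):

```python
# Sketch: Ethereum address = last 20 bytes of Keccak-256(public key).
from Crypto.Hash import keccak  # pycryptodome's Keccak-256

def eth_address(pub_x: int, pub_y: int) -> str:
    pub_bytes = pub_x.to_bytes(32, "big") + pub_y.to_bytes(32, "big")  # 64-byte key
    digest = keccak.new(digest_bits=256, data=pub_bytes).digest()      # 32 bytes
    return "0x" + digest[-20:].hex()  # keep only the last 20 bytes (160 bits)
```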
Suppose two different private keys, k1 and k2, produce public keys that hash to the same address. Signing with k1:
When k1 signs a transaction, the signature reveals enough information to compute the corresponding public key P1.
The network checks that P1 hashes to the shared Ethereum address.
Signing with k2:
Similarly, if k2 signs a transaction, the signature reveals P2, and the network verifies that P2 hashes to the same Ethereum address.
In both cases, the transactions would be valid because the public key P1 or P2, derived from the respective private key, correctly hashes to the shared address. However, this is not a practical issue because the probability of two private keys hashing to the same address (a "collision") is astronomically small.
Interestingly, until a wallet sends funds for the first time, the public key is not revealed. A wallet can receive funds against its address without exposing the public key. Without the public key, even a quantum computer would be reduced to trial and error. In other words, your wallet is already quantum-resistant as long as you only receive funds and never send any. Why not adopt that as a security policy? It may have unexpected side effects, like getting rich.
Quantum Computing: A New Challenge
So what is a quantum computer, and how can it calculate the private key by “looking at purple”?
To understand how a quantum computer might "look at purple," we first need to understand how it operates on a fundamentally different set of rules than classical computers.
The concept of a quantum computer was first introduced by Richard Feynman in 1982. He had an illustrious career; for instance, he was awarded the Nobel Prize in Physics in 1965 (shared with Julian Schwinger and Sin-Itiro Tomonaga). He also worked on the Manhattan Project during World War II, helping to develop the atomic bomb. And he was a fun guy. If you’ve read his papers, you’ll know what I mean. To give you an idea, he published books like "Surely You’re Joking, Mr. Feynman!" (a collection of humorous and insightful anecdotes) and "What Do You Care What Other People Think?" (a mix of personal stories and scientific reflections). The latter could have been written by me.
There are several papers where he describes the idea, but the first one, published in 1982, begins as follows and gives you a perfect sense of the kind of character he was:
“On the program it says this is a keynote speech—and I don't know what a keynote speech is. I do not intend in any way to suggest what should be in this meeting as a keynote of the subjects or anything like that. I have my own things to say and to talk about and there's no implication that anybody needs to talk about the same thing or anything like it. So what I want to talk about is what Mike Dertouzos [he was a professor at MIT] suggested that nobody would talk about. I want to talk about the problem of simulating physics with computers and I mean that in a specific way which I am going to explain.”
In quantum mechanics, a "quantum" is the smallest discrete unit of energy or change in a system. This reflects how particles behave at microscopic scales—they can only occupy specific energy levels, rather than a continuous range.
Einstein’s Nobel Prize in 1921 was not for his theory of relativity, but for something much more practical: "his discovery of the law of the photoelectric effect." The photoelectric effect involves the emission of electrons from a material (such as metal) when light shines on it. While Heinrich Rudolf Hertz (a German physicist) discovered the phenomenon in 1887, its behavior mystified scientists for decades.
The puzzle was this: increasing the intensity of light caused more electrons to be emitted, but it did not increase the maximum energy of the electrons. Einstein resolved this by introducing the idea of light quanta (now called photons). He showed that the energy of the emitted electrons depends on the frequency of the light, not its intensity, through the relationship: E=h⋅f
This insight, grounded in quantum mechanics, explained the "unknown quantum condition" that classical physics could not address. The phrase "quantum leap" is used in English as an idiomatic expression, similar to the German "Quantensprung" or "saut quantique" / "bond quantique" in French, to imply a significant breakthrough or a large, transformative change. Like in German, the use of the term is somewhat ironic or misleading from a scientific perspective, as it contradicts the actual meaning of "quantum" in physics. I cannot explain how this came to be, especially since Germans essentially invented the whole concept—but at least we are in good company.
The Cambridge Dictionary translates "quantum leap" in Chinese as "飞跃,重大进展." The word "quantum" in Chinese is translated as 量子 (liàngzǐ). But there’s no liàngzǐ in their "quantum leap." When used metaphorically—say, “my article is a quantum leap in writing on this topic”—Chinese speakers drop 量子 (quantum) and use terms like 重大进展 (significant progress). Chinese avoids unnecessary scientific confusion. What’s the conclusion here?
Chinese: Practical and straightforward, aiming for clarity instead of ‘misusing’ the original scientific term.
English/German: Maybe elegant but potentially misleading, relying on the "cool factor" of quantum to elevate the expression.
Japanese is similar to Chinese in this context. The Japanese language also avoids directly using the scientific term for "quantum" (量子, ryōshi) in the metaphorical expression "quantum leap." Instead, it opts for a more figurative and practical translation, just like Chinese. They have 大きな進歩 (ōkina shinpo), "significant progress" or "great advancement."
While the discrete nature of energy change underpins quantum mechanics, the term "quantum computer" relates more broadly to the use of quantum principles (not just energy quantization) in computation.
The Quantum Threat: Should We Worry?
In classical computers, a bit is either 0 or 1. A quantum computer, on the other hand, is fundamentally different from a Turing machine—a theoretical model of computation introduced by Alan Turing in 1936. A Turing machine follows deterministic rules to manipulate symbols, calculates arithmetically step by step, processes one operation at a time, and doesn’t inherently deal with probabilities, randomness, or parallel computations.
In quantum computing, calculations are based on a qubit (quantum bit), which can exist in a superposition of both 0 and 1 simultaneously. For example, a photon’s polarization or an electron’s spin can exist in a combination of "up" and "down" states at the same time. This property is exploited in quantum computers to perform parallel computations, allowing them to solve certain problems more efficiently than classical computers. However, this efficiency applies only to specific problems, not all. A quantum computer is not universally "faster" than a classical computer—that’s not the best way to think about it. Instead, they solve problems differently because they are:
Not Deterministic: Quantum computers rely on quantum mechanics, which is inherently probabilistic. Their behavior cannot be fully described by deterministic rules.
Not Step-by-Step: Turing machines process inputs sequentially. Quantum computers leverage quantum parallelism, performing many computations simultaneously through superposition.
Quantum Interference: Quantum computers use interference to amplify correct outcomes and suppress incorrect ones—this is not something a Turing machine can do directly.
Richard Feynman pointed out that classical computers (based on Turing machines) struggle to simulate quantum systems because classical rules don’t naturally account for the probabilistic and wave-like behavior of quantum mechanics. Quantum computers, on the other hand, don’t simulate quantum systems—they are quantum systems.
For example:
A classical computer simulates rolling a dice by following step-by-step rules to generate random numbers. It doesn’t inherently "behave like" the dice; it calculates outcomes.
A quantum computer, however, is like the dice itself, embodying its probabilistic behavior. It doesn’t simulate probabilities—it operates within them.
These properties are incredibly useful, especially when attempting to hack RSA or the current elliptic curve encryption used in Ethereum.
How soon can IBM ship us a quantum computer then? Or when will Apple upgrade my Mac into a quantum Mac?
Theoretically, you could have one right now—assuming someone would sell them. But the real question is, would you actually want one right now or even in the future?
The qubits, which exist in a delicate superposition of states, are highly susceptible to disturbances from their environment. This causes something called decoherence, which occurs when a quantum system loses its quantum properties (like superposition and entanglement) due to interactions with the external environment. For example, stray electromagnetic fields, thermal noise, or even vibrations can disturb the quantum state.
The qubits are manipulated for computation using so-called quantum gates (for now, you could think of them as the equivalent of transistors in a traditional computer chip—although I suspect I’ll regret this comparison sooner or later because there’s probably some nuance I’m overlooking that makes this misleading). However, these gates are not perfectly accurate, leading to small but cumulative errors during operations. Every quantum gate operation introduces slight inaccuracies due to hardware imperfections or calibration errors, causing the overall computation to deviate from the intended result.
There are also a couple of other reasons why this setup is prone to generating errors. For example, cross-talk between qubits can occur, meaning they unintentionally influence each other due to electromagnetic or other physical interactions. Office gossip has its equivalent in the quantum world.
Small errors accumulate quickly in quantum computations, especially long computations involving many qubits. Quantum error correction techniques exist, but they require additional qubits to encode information redundantly. Estimates suggest that for every logical qubit, hundreds to thousands of physical qubits may be required to manage errors. Building qubits with sufficiently low error rates remains a hardware challenge. Advancements in error correction and hardware design suggest that these hurdles, while significant, are not insurmountable, but for now they are a major limiting factor.
Nvidia’s CEO recently spoke about when he thinks the technology is ready:
"If you kind of said 15 years... that'd probably be on the early side. If you said 30, it's probably on the late side. But if you picked 20, I think a whole bunch of us would believe it,"
Tasks like adding numbers, running spreadsheets, or executing basic software are faster and more reliable on classical computers. Current quantum computers have limited qubits and storage, making them unsuitable for tasks like managing large databases or training modern machine learning models with massive datasets.
Quantum computers outperform classical computers in problems where superposition, entanglement, and interference can significantly reduce the computational effort. Using a specific algorithm allows quantum computers to factor large numbers exponentially faster than classical computers, threatening traditional cryptography methods like RSA.
The RSA example used the ciphertext C=65^17 mod 3233, with the remainder being 2790. This formula is based on the function f(x)=a^x mod N. To hack RSA encryption using a quantum computer, we focus on something different but related to this function: its periodicity. Once the periodicity is determined, it can be used to calculate the two prime factors (p and q) of N, which are the basis of RSA encryption. A classical computer could also try to find the periodicity, but it would face the same trial-and-error challenges that make the problem computationally infeasible.
Example of Periodicity: Suppose N=15 and we pick a=7. The value a needs to be smaller than N and must not share any common factors with N. For example, if N=15, you wouldn’t choose a=3 or a=5 because they are factors of N. This coprimality is what makes the period-finding arithmetic work.
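A quick way to list the admissible values of a for N=15:

```python
from math import gcd

N = 15
valid_a = [a for a in range(2, N) if gcd(a, N) == 1]
print(valid_a)  # [2, 4, 7, 8, 11, 13, 14]; anything sharing a factor with 15 is excluded
```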
We now compute f(x)=7^x mod 15 for x=1,2,3,…:
f(1) = 7^1 mod 15 = 7. (15 fits zero times into 7, so the remainder is just 7 itself.)
f(2) = 7^2 mod 15 = 4. (7 squared is 49; 49 ÷ 15 = 3 remainder 4.)
f(3) = 7^3 mod 15 = 13. (7 cubed is 343; 343 ÷ 15 = 22 remainder 13.)
f(4) = 7^4 mod 15 = 1. (7 to the fourth power is 2401; 2401 ÷ 15 = 160 remainder 1.)
The sequence of remainders is 7, 4, 13, 1, 7, 4, 13, 1, … and the period r=4 because f(x) repeats every 4 steps; equivalently, 7^4 ≡ 1 (mod 15). This means that after r steps, the function returns to its starting value. This periodicity is guaranteed to continue indefinitely due to the properties of modular arithmetic. This repeating cycle is what helps uncover the factors of N. While periodicity is just a mathematical property of this specific calculation, it exposes the internal structure of the number N, which is otherwise hidden in the encryption process.
When r is even (a requirement for the algorithm to work), we can use the identity a^r − 1 = (a^(r/2)−1)(a^(r/2)+1). Since a^r ≡ 1 (mod N), N divides this product, which means the values a^(r/2)−1 and a^(r/2)+1 share prime factors with N.
So, back to the example: N=15, a=7, and we now know r=4.
Compute a^(r/2):
Since r=4, divide r by 2 to get r/2=2.
a^(r/2) = 7^2 = 7×7 = 49.
Compute a^(r/2)−1 and a^(r/2)+1:
49−1=48
49+1=50
Find the greatest common divisor (GCD) of N=15 with these values:
For 50: The GCD of 50 and 15 is 5, because both 50 and 15 can be divided by 5, and there is no larger number that divides both.
For 48: The GCD of 48 and 15 is 3, because both 48 and 15 can be divided by 3, and there is no larger number that divides both.
Result: Using these GCDs, we’ve factored N=15 into 3×5 (p and q). Voila, we hacked RSA!
Once r is known, the rest of the process is straightforward and can be done quickly using modular arithmetic. Factoring a 2048-bit RSA modulus (finding r) might take classical computers thousands of years, while a sufficiently powerful quantum computer could do it in hours or minutes.
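The whole classical recipe fits in a few lines of Python. The brute-force find_period loop below is exactly the step Shor's algorithm speeds up (the function names are my own):

```python
from math import gcd

def find_period(a: int, N: int) -> int:
    """Smallest r > 0 with a^r = 1 (mod N), by brute force -- the hard part."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor(N: int, a: int):
    assert gcd(a, N) == 1, "a must share no factor with N"
    r = find_period(a, N)     # a quantum computer finds r quickly instead
    if r % 2 != 0:
        return None           # odd period: pick a different a and retry
    half = pow(a, r // 2, N)  # 49 mod 15 = 4; the GCDs give the same factors
    return gcd(half - 1, N), gcd(half + 1, N)

print(factor(15, 7))  # (3, 5) -- the p and q of the example above
```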
Why Can a Quantum Computer Do This Quickly?
A quantum computer uses n qubits to represent all possible values of x in the formula f(x)=a^x mod N. This superposition allows the computer to evaluate f(x)=a^x mod N for all x simultaneously in a single step using quantum gates. The results exist as a superposition of states, which then needs to be analyzed to find patterns. The quantum computer won’t announce “hey, r = 4”; instead, you have to infer r from the quantum state of the particles used as qubits. The quantum system manipulates the superposition so that incorrect guesses for r cancel each other out, amplifying the correct period. The speed thus comes from combining superposition (holding all x values at once) with quantum parallelism, interference, and a mathematical operation called the Quantum Fourier Transform (QFT).
1 qubit can represent 2^1=2 states: ∣0⟩ or ∣1⟩.
2 qubits can represent 2^2=4 states: ∣00⟩,∣01⟩,∣10⟩,∣11⟩.
3 qubits can represent 2^3=8 states: ∣000⟩,∣001⟩,∣010⟩,…,∣111⟩.
With 3 qubits, the quantum computer simultaneously explores all x=0 to x=7 in superposition, calculating f(x)=7^x mod 15 for all those values.
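We can imitate this classically to see what the superposition "contains": evaluate f(x) for all eight values at once and let an ordinary FFT stand in for the Quantum Fourier Transform:

```python
import numpy as np

# Evaluate f(x) = 7^x mod 15 for x = 0..7, then expose the periodicity with an
# ordinary FFT standing in for the QFT (a classical toy, not a quantum step).
xs = np.arange(2 ** 3)                   # all 8 states of 3 qubits
fx = np.array([pow(7, int(x), 15) for x in xs])
print(fx)                                # [ 1  7  4 13  1  7  4 13]
print(np.abs(np.fft.fft(fx)).round(1))   # peaks only at multiples of 8/r = 2
```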
When you measure the quantum state (e.g., ∣010⟩), the superposition collapses to a single outcome, and you can no longer access the original superposition or perform further operations on it after measurement. For example, say we measure ∣010⟩, called k, which corresponds to k = 2 in decimal.
This measurement k is used in a fraction: k divided by the total number of states (careful: this denominator is 2^n, not the N we are factoring). With 3 qubits there are 2^3=8 states, so the fraction is 2/8, which simplifies to 1/4. The denominator 4 suggests that the periodicity r=4.
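In Shor's algorithm this classical post-processing step uses continued fractions; Python's Fraction type gives the flavor of it:

```python
from fractions import Fraction

n_qubits, measured_k = 3, 2   # we measured |010>, i.e. k = 2
states = 2 ** n_qubits        # 8 possible states
guess = Fraction(measured_k, states).limit_denominator(15)  # denominator < N
print(guess, "-> candidate period r =", guess.denominator)  # 1/4 -> r = 4
```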
This process works because the Quantum Fourier Transform (QFT) encodes the periodicity of the function f(x) into the amplitudes of the quantum states. These amplitudes exhibit wave-like behavior, allowing the quantum computer to efficiently extract the periodicity r, which is the key to factoring N. The procedure is Shor’s algorithm, named after Peter Shor, a theoretical computer scientist who discovered it while working at Bell Labs; he is currently a professor at MIT. In 2001, IBM demonstrated Shor’s algorithm on a 7-qubit quantum computer, factoring 15=3×5. So, having worked through the example above, you can basically simulate being a quantum computer.
In elliptic curves, scalar multiplication is analogous to exponentiation in RSA (a^x mod N). Shor's algorithm is adapted to find k by leveraging periodicity in the elliptic curve group.
How problematic would this be for Ethereum?
The algorithm is just as devastating to ECC as it is to RSA once large-scale quantum computers become practical. However, it would require thousands of error-free logical qubits, which are far beyond current quantum capabilities. Estimates suggest that one logical qubit may require 1,000 or more physical qubits, depending on error rates. To break Ethereum’s cryptography, millions of physical qubits would likely be necessary.
The complexity of error correction makes scaling quantum computers far harder than simply adding qubits: the larger the machine, the more overhead is needed just to keep it stable. Therefore, the actual threat could be decades away, or perhaps closer than we think.
The Practicality of a Quantum Threat
Putting technical challenges aside, how practical would it be to exploit this capability if those challenges were resolved? The comments below assume the encryption mechanisms as they are currently designed, without considering future changes that could mitigate these risks now that we know they are coming.
Privacy-Oriented Encryption Risks and Blockchain-Specific Authentication Risks
The purpose of encryption directly influences how quantum risks manifest: traditional systems (e.g., a bank) use cryptography for data privacy and secure communications, but public blockchain systems (e.g., Ethereum) do not. This means banks face a retroactive threat: all previously encrypted data becomes vulnerable. This risk is absent for blockchains because there is no private data on-chain.
I wrote about a report from HSBC covering risks from quantum computing, and their concern was precisely this aspect: the risk of decrypting historic data.
Since blockchain cryptography focuses on authentication and integrity, the quantum risk concerns the theft of assets rather than privacy. But is this a material risk?
Framework for assessing quantum risk
An optimal security and operational strategy is subject to an organization’s objectives, and these decisions must balance risk tolerance, cost, and feasibility. Hence, the extent of security required may differ between two banks due to different business volumes or other business factors. One bank may decide it needs to be secure against quantum threats under any circumstances, whilst the next will accept some risk in exchange for operational efficiency.
To manage these risks effectively, you need to understand the dependencies and interdependencies of a complex system. And this, in my opinion, is the very essence of product management: thinking in terms of interdependencies.
So thinking about quantum risk requires the same precision and understanding of causal relationships as building a new service or product.
Define Objectives: What level of quantum preparedness is needed? What is acceptable risk?
Understand Dependencies: Where are the critical vulnerabilities in your cryptographic and operational infrastructure?
Build Redundancy: Ensure multiple layers of defense to prevent cascading failures.
Accept Trade-Offs: Acknowledge that 100% quantum security is neither feasible nor necessary in all cases.
In order to do that, you need one critical ingredient: detailed knowledge about a system’s behavior, rules, and functions that allow you to identify those interdependencies. It’s a bit like standing in the cockpit of a big Airbus A380 after the pilot has invited you for a visit. I’m not sure this actually happens, but I’ve seen it on TV. So, during the flight, you’re standing there, seeing all the buttons, and you think to yourself: what would happen if I pressed this button or that one? Maybe nothing, or perhaps we eject all the fuel and crash moments later.
In complex systems, there are often indirect relationships and time delays between pushing a button and a catastrophic consequence—or a nice surprise. In other words, understanding how the system will behave if you introduce a change is the key concern.
Why Timing is Critical for a Blockchain Hack
Funds in a blockchain wallet can be moved at any time. If an attacker derives a private key but the funds are already gone, the effort is wasted. So for Ethereum, this can only start after the public key is exposed (e.g., when the wallet sends its first transaction). Attackers with quantum computing capabilities must derive the private key quickly and submit their own transaction to steal the funds.
Attackers would need to continuously monitor the blockchain for:
Public key exposures (e.g., when an address sends a transaction).
Confirmation that the target funds are still in the wallet.
This requires significant computational and network resources, especially if monitoring multiple wallets.
Everyone Can See the Activity: As soon as an attacker broadcasts a "theft transaction," it is visible to the entire network. Wallet owners, exchanges, or validators might notice the malicious transaction and take action (e.g., move funds, blacklist addresses). If the wallet owner initiates a defensive transfer, the attacker may lose the race.
Blockchain nodes prioritize transactions by:
Gas fees: Higher-fee transactions are processed first.
Nonce order: Only valid transactions with the correct nonce are accepted.
The attacker must therefore monitor the mempool (pending transactions) to determine if the wallet owner is sending funds. The decentralized, transparent nature of blockchain inherently tilts the balance in favor of defenders over attackers in scenarios requiring precise timing.
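For illustration, a hedged sketch of what such mempool monitoring could look like with web3.py (the RPC endpoint and the watched address are placeholders):

```python
# Sketch: watching pending transactions for a target wallet with web3.py
# (assumes a node exposing the standard pending-transaction filter RPC).
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("http://localhost:8545"))  # placeholder endpoint
watched = Web3.to_checksum_address("0x0000000000000000000000000000000000000000")

pending = w3.eth.filter("pending")        # filter over new pending tx hashes
for tx_hash in pending.get_new_entries():
    tx = w3.eth.get_transaction(tx_hash)
    if tx is not None and tx["from"] == watched:
        print(f"{watched} is moving funds in tx {tx_hash.hex()}")  # the race begins
```

A defender can run exactly the same loop, which is the point made above: transparency cuts both ways and tends to favor whoever reacts fastest.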
An attacker targeting a multi-signature wallet (multisig) would face significantly more complexity and timing constraints than attacking a standard wallet (Externally Owned Account, or EOA). This is because multisig wallets require multiple actions (instructions) to execute a transaction, and these actions must conform to the logic of the smart contract managing the wallet. Each step takes time, and the attacker must ensure the wallet remains exploitable throughout the process. Since the attacker must meet all conditions required by the multisig wallet, it increases the chance of failure or detection. Wallets like Gnosis Safe can notify co-signers of pending transactions, enabling quick intervention to block malicious actions.
Mitigating Quantum Risks in Blockchain
Multisig wallets require multiple private keys to approve and execute a transaction, and this verification happens on-chain. Multi-Party Computation (MPC), however, keeps the key verification off-chain. In other words, a hack of the off-chain coordination layer can bypass the distributed nature of MPC and execute unauthorized transactions.
Without getting into a debate with myself on the question of MPC vs. multisig, quantum computing could be seen as a bigger threat to MPC than to multisig in this particular respect. However, safety is achieved through a combination of independent and redundant security measures, and as always, the details of the actual implementation need to be considered. The API key and connection credentials are typically managed by the service provider (e.g., Gnosis) and are separate from the user's private keys. If an attacker gains access to the API key, they could potentially issue requests (depending on the level of access granted to that key). However, they cannot sign transactions on behalf of the user unless they also gain access to the user’s private key (or keys, in the case of multisig).
One well-known case involved Bitfinex, which used BitGo's multisig wallet in 2016.
Bitfinex relied heavily on BitGo's API for its operational wallet. The 2-of-3 multisig architecture was structured so that:
Bitfinex held one key.
BitGo held the second key.
The third key was a backup.
For operational efficiency, Bitfinex allowed BitGo’s API to automatically co-sign withdrawals. The attackers gained unauthorized access to Bitfinex's systems. Using the API, they automated transactions that BitGo's system co-signed. This allowed the hackers to drain 120,000 BTC (worth around $4 billion at today’s prices) because the co-signature process relied too heavily on the API without adequate oversight or multi-factor authentication (MFA).
While multisig adds an additional layer of security, the implementation must include operational safeguards. In this case, the system relied on an external API to co-sign, effectively centralizing the risk. What is needed instead is for the API provider (e.g., BitGo) to have its own anomaly detection and safeguards to prevent unauthorized co-signing.
So MPC reduces operational risks (e.g., no physical keys to manage), but it concentrates risk in the software and communication protocols; the best choice for one company could be different for the next, given their particular operational constraints.
The Role of Nodes in Decentralization and Mitigating Systemic Risk
Decentralized systems derive their strength from their architecture—a distributed network of nodes that validate and propagate transactions. Nodes are more than technical components; they are the foundation of blockchain’s integrity, security, and redundancy.
Why Nodes Matter
In a decentralized network, each node acts as a checkpoint, independently validating transactions to ensure compliance with protocol rules. This distributed verification eliminates reliance on centralized entities, reducing vulnerability to tampering and single points of failure. However, the redundancy and security of this system hinge on active, diverse, and independent participation. Without a robust network of independent nodes, decentralization becomes a theoretical ideal rather than a practical reality.
Quantum Resilience: Nodes as Gatekeepers Against Emerging Threats
Quantum computing introduces potential risks to blockchain cryptography, but the role of nodes as gatekeepers provides inherent protections. Nodes propagate transactions in a decentralized, peer-to-peer manner, ensuring that any attempt to inject fraudulent transactions must navigate a web of independent checkpoints. This structure creates significant barriers for attackers, even those leveraging quantum capabilities.
Tracking and Tracing Fraudulent Transactions
Currently, Bitcoin and Ethereum nodes do not explicitly store metadata about where a transaction originated. However, timestamping and network-level observation can provide clues about the propagation path.
Enhancing nodes to track transaction origins could serve as a powerful deterrent against quantum-powered attacks. For instance, nodes could:
Trace Propagation Paths: Identify suspicious activity originating from specific nodes.
Flag Malicious Nodes: Isolate or scrutinize nodes propagating fraudulent transactions.
This added accountability would make large-scale quantum attacks less appealing due to the increased risk of detection. Wallets could maintain a list of "trusted nodes" that have previously handled their transactions. Transfers originating from untrusted nodes might require additional verification or delay. This structure creates plenty of opportunities to make quantum attacks less feasible.
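A hypothetical sketch of that wallet-side policy, with all names invented for illustration:

```python
# Hypothetical wallet-side policy: transfers first seen via an unfamiliar
# node get delayed for extra verification. Purely illustrative.
TRUSTED_NODES = {"node-alpha", "node-beta"}  # nodes that handled past transactions

def needs_extra_verification(first_seen_node: str) -> bool:
    """Flag transactions that entered the network through an untrusted node."""
    return first_seen_node not in TRUSTED_NODES

print(needs_extra_verification("node-gamma"))  # True -> delay and verify
```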
Blockchain: An Unattractive Target for Quantum Theft
Stealing large amounts of crypto using quantum computing is far from practical. Transparency in blockchain networks makes hiding stolen assets difficult, while mixers like Tornado Cash cannot effectively launder significant sums without detection.
The Bitfinex Hack (2016): Even years later, hackers attempting to move stolen Bitcoin were detected, leading to arrests.
Mt. Gox Hack (2014): Most stolen Bitcoin remains unspent because of the difficulty in laundering such a large amount.
Tokenized securities are even less appealing, as custody verification ensures stolen tokens cannot grant legitimate ownership. The high upfront costs of quantum computing and the "one-shot" nature of such attacks make crypto theft unattractive compared to long-term exploits targeting centralized entities like banks.
Nodes and Systemic Resilience
Blockchain nodes also play a critical role in asynchronous updates, creating natural delays that can be fortified to detect and mitigate attacks. If a theft transaction were to succeed, the propagation structure could introduce delays, allowing the network to respond, blacklist addresses, and protect assets. This asynchronous and distributed design makes coordinated attacks exceptionally difficult to execute without detection.
"Systemic risk" with Ethereum's burn address
The burn wallet is designed to be an irretrievable sink for tokens and ETH, relying on the fact that no private key exists for it. If someone could derive the private key for this address, they could recover all burned ETH and tokens. That wallet would need some extra protection.
Redundancy, Staking, and Gas Fee Management
In proof-of-stake systems like Ethereum, staking introduces an economic layer of security. Validators with a vested interest in the network’s success act honestly to avoid losing their stake. Distributed node operation ensures trustlessness and autonomy, reducing reliance on third-party services that could compromise security. It is essential if institutions want to engage meaningfully in digital asset markets.
P.S.
Besides not knowing much about it, I somehow managed to fill 20 pages. But there’s a sincere point behind my repeated comments about product management: debating the risk of something needs to be contextualized. It’s about understanding strategy, constraints, capabilities, and examining problems from multiple perspectives. The key question is: What do I need to believe to be true for a certain outcome to happen? This helps in identifying causalities and dependencies. So no magic trick after all—just the disciplined application of product management principles.
This is a supplement on quantum mechanics.