When someone first described how quantum computers could crack encryption, it sounded almost theatrical: large numbers that would take centuries to factor, abruptly disintegrating in a matter of hours. When Peter Shor unveiled his algorithm in 1994, that was the promise, or the threat. For years it felt far away. Scholarly. For labs, not boardrooms. Now, standing in the glow of security dashboards and server racks, it feels closer than it should.
According to Google’s most recent research, a quantum computer with one million noisy qubits could crack RSA-2048 encryption in roughly a week. That figure matters not because such machines exist today (they don’t) but because the gap between theoretical requirements and engineering reality has narrowed sharply. It’s difficult to ignore how rapidly, almost uncomfortably, the estimates keep falling.
| Category | Details |
|---|---|
| Field | Quantum Computing & Cybersecurity |
| Key Algorithm at Risk | RSA-2048 Encryption |
| Breakthrough Source | Google Quantum AI |
| Key Researchers | Craig Gidney, Sophie Schmieg |
| Estimated Requirement | ~1 million noisy qubits |
| Previous Estimate | ~20 million qubits |
| Risk Timeline | Early 2030s (projected) |
| Security Concern | “Store now, decrypt later” attacks |
| Transition Solution | Post-Quantum Cryptography (PQC) |
| Reference | https://blog.google/technology/research/quantum-computing/ |
In 2012, experts estimated that cracking RSA might require a billion qubits. Later estimates dropped to around 20 million. Now: one million. Watching those numbers fall is like watching a countdown no one formally started.
Inside quantum labs, quiet places of delicate wiring and cooling systems, progress is being made in ways that don’t always make headlines. Researchers have found ingenious ways to extract more performance from unstable qubits: better algorithms, improved error correction, leaner resource accounting. One technique, approximate modular exponentiation, reportedly cut a key computational overhead from roughly 1000 times the baseline to just twice. A leap like that doesn’t feel incremental. Something seems to be tipping.
There is still reason for caution. Today’s systems run on just hundreds of qubits. The difference between hundreds and a million is not insignificant; it is enormous. But it is also shrinking, gradually and unevenly, with each prototype humming a little more steadily than the last.
There’s a feeling that the true story isn’t about when encryption fails, but rather about how unprepared most systems are for it.
Today’s encryption relies on problems that classical computers find intractable. Factoring big numbers, for instance, is excruciatingly slow, and that slowness protects everything from private messages to banking transactions. Those rules don’t bind quantum machines. They attack the problem from a different angle, exploring many possibilities in superposition and using interference to collapse the ambiguity toward an answer. Somewhere in that process, the lock gives way.
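The asymmetry can be made concrete with a toy sketch. The code below is illustrative only, not Shor’s algorithm itself: `trial_division` shows the brute-force classical approach whose cost explodes with key size, and `shor_style_factor` runs, classically and slowly, the number theory behind Shor’s method (finding the multiplicative order of a value modulo n), which is the one step a quantum computer performs exponentially faster.

```python
from math import gcd

def trial_division(n):
    """Classical factoring: check divisors up to sqrt(n).
    Cost grows exponentially in the bit-length of n, which is
    why RSA-2048 is safe against this kind of attack."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n itself is prime

def shor_style_factor(n, a):
    """Classical illustration of the arithmetic behind Shor's
    algorithm: find the order r of a mod n (the step a quantum
    computer accelerates), then derive a factor from
    gcd(a^(r/2) - 1, n)."""
    # Find the smallest r > 0 with a^r = 1 (mod n).
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    if r % 2 != 0:
        return None  # unlucky choice of a; Shor retries with another
    y = pow(a, r // 2, n)
    f = gcd(y - 1, n)
    return f if 1 < f < n else None

print(trial_division(3233))      # 3233 = 53 * 61, so this prints 53
print(shor_style_factor(15, 7))  # order of 7 mod 15 is 4; prints 3
```

For a toy modulus like 15 both routes are instant; for a 2048-bit modulus, only the order-finding step, run on quantum hardware, stays feasible.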
“Store now, decrypt later” is a phrase security experts use often. It sounds abstract until you visualize it: financial records, medical histories, corporate secrets, intercepted, copied, and stored as they move silently across networks. Useless today. But one day, a machine strong enough to unlock them will exist. Some of that data may already be sitting on a server somewhere, unread but not lost.
You see things from a different angle strolling through corporate IT departments. Systems built over many years. Outdated encryption protocols buried deep in the infrastructure. Engineers who understand deployment pipelines better than cryptographic libraries. A quiet complexity lives there, one that makes change gradual and occasionally brittle.
Upgrading encryption is not the same as updating an application. It is more like replacing a building’s foundation while the occupants are still inside.
That is where post-quantum cryptography comes in. New algorithms designed to withstand quantum attacks are already being standardized, and Google, for example, has begun integrating ML-KEM and other quantum-resistant schemes into its own infrastructure. Adoption, however, varies by industry and has only just begun in some. Organizations still underestimate how long these transitions take.
Timelines add to the tension. Some analysts say encryption-breaking quantum computers could appear by the early 2030s; others suggest 2035 as a more realistic cutoff. Either way, migration initiatives frequently take a decade or longer. The true vulnerability lies in the gap between risk and readiness.
There is a cultural lag, too. Many developers don’t understand cryptographic systems well; security is assumed, taken for granted, seldom questioned. Changing that mindset may be as hard as building the technology.
What develops instead of panic is a quiet urgency: a recognition that the threat is real but not imminent, and that preparation matters more than reaction. Still, the pace of advancement keeps raising questions.
What comes next if estimates have already decreased this much? One more innovation. One more optimization. One more sudden leap. The timeline might compress once more.
Outside the labs, life goes on as usual. Messages are sent. Transactions are completed. Data is stored and encrypted. Everything operates according to plan. But beneath that surface something is changing, slowly, steadily, nearly imperceptibly.
And when the moment comes that encryption breaks, it won’t feel abrupt. It will feel like something that had been coming for a very long time.