John Martinis, the physicist who helped steer two major turns in quantum computing, is now pursuing a fresh rethink that aims to push machines to new heights. The effort arrives as companies and labs race to scale fragile qubits into systems that can solve useful problems. His move raises a key question this year: can a third reinvention unlock practical quantum power sooner?
A Track Record of Reinvention
Martinis built his name in superconducting circuits, where tiny electrical loops act as qubits. At the University of California, Santa Barbara, his group pushed coherence times higher and made gates more reliable. Those advances helped turn lab curiosities into engineered devices.
He later led the hardware team at Google that in 2019 reported a milestone often called “quantum supremacy.” The Sycamore processor, with 53 working superconducting qubits, completed a sampling task in about 200 seconds. Google argued a top classical supercomputer would need thousands of years. Critics, notably IBM, disputed that classical runtime estimate, but the demonstration still stood as a public performance milestone.
That period showed Martinis’s approach: reduce noise, design for scale, and measure progress with clear benchmarks. It also exposed the field’s central tension: big demonstrations build momentum, yet long-term value depends on making error rates low enough for fault-tolerant computing.
What a Third Rethink Could Mean
Details of the new push remain limited, but the goals are clear: fewer errors, more qubits, and software that maps problems to hardware with less overhead. Such a rethink could target three pressure points.
- Stability: Extend qubit coherence and make gates more uniform across a chip.
- Error control: Blend hardware-level suppression with smarter error-correction codes.
- System design: Rework chip layout, packaging, and control electronics to scale.
If successful, these steps could cut the resources needed for fault-tolerant operations. That, in turn, would bring tasks in chemistry, materials science, and certain optimization problems closer to reach.
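The logic behind error correction’s overhead can be seen in the simplest possible example, a majority-vote repetition code. This is an illustrative sketch only, not the scheme Martinis or any lab actually uses; real quantum codes (such as surface codes) are far more involved, but the same principle applies: once physical error rates are low enough, redundancy suppresses logical errors faster than it adds cost.

```python
import random

def logical_error_rate(p, n_copies=3, trials=100_000):
    """Monte Carlo estimate for a classical repetition code:
    each of n_copies bits flips independently with probability p,
    and a majority vote decodes the logical bit. Decoding fails
    only when more than half the copies are corrupted."""
    failures = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(n_copies))
        if flips > n_copies // 2:
            failures += 1
    return failures / trials

random.seed(0)
p = 0.01  # assumed physical error rate, purely illustrative
for n in (1, 3, 5):
    print(f"{n} copies -> logical error rate ~ {logical_error_rate(p, n):.5f}")
```

With p = 0.01, three copies push the logical error rate to roughly 3p², about thirty times better, and five copies do better still. The catch is the resource cost: each improvement multiplies the hardware needed, which is why lowering the underlying physical error rate is the highest-leverage goal on the list above.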
Technical and Business Hurdles
Every quantum platform faces stubborn physics. Superconducting qubits are fast, but they are sensitive to tiny defects and microwave cross-talk. Adding more qubits often creates new noise and new failure modes.
There are also costs away from the chip. Cryogenic systems, control racks, and calibration software must scale with the processor. As systems grow, tuning them becomes a major engineering burden. The rethink likely addresses automation, testing, and “design for yield.”
Capital needs are another test. Building and iterating at the hardware level is expensive. Investors now expect clearer timelines and proof that near-term devices can deliver value. A plan that shows measurable gains each year could help keep funding steady.
Competing Paths to Scalable Machines
Martinis’s past work advanced superconducting qubits, but rivals are pushing other routes. Trapped ions offer high-fidelity gates but slower operation. Neutral atoms scale well with optical tweezers. Silicon spin qubits promise compatibility with standard chip fabrication. Photonic systems target room-temperature operation and networking.
Each path trades one strength for another. Superconducting circuits lead in speed and industrial tooling. Ion and atom systems lead in gate quality and qubit uniformity. Silicon and photonics tout manufacturability and connectivity. The winner may be a hybrid, linking different hardware through quantum networks.
For users, the near-term focus is on concrete gains. Lower error rates, deeper circuits, and stable software stacks matter more than showpiece demos. Any new approach will be judged on reproducible data, not promises.
What To Watch Next
Observers will look for signs that the new effort can cut logical error rates by orders of magnitude, not just small steps. They will also watch for cleaner scaling data as device size grows.
Partnerships with chip fabs, control-electronics makers, and national labs could hint at serious scale plans. Early pilot projects with chemical or logistics firms would show a path to applications.
Martinis’s record gives this new push weight. His earlier turns helped define how labs measure progress and build hardware. If this rethink trims the cost of error correction and lifts reliability, it could reset expectations for the decade ahead. The next milestones will tell whether quantum machines can move from impressive experiments to dependable tools.