Princeton, IBM, and Harvard-MIT reveal groundbreaking quantum computing advances in 2025, pushing technology closer to practical reality.

Matic P.

From Theory to Practice: The 3 Quantum Leaps That Defined 2025
November 2025 may be remembered as the month quantum computing transitioned from promising research to practical technology. Within just one week, three separate research teams announced breakthroughs that address the field's most stubborn challenges: qubit stability, error correction, and scalability. Together, these advances from Princeton University, IBM, and a Harvard-MIT collaboration represent the most significant progress in quantum computing in over a decade.
The convergence of these announcements signals something bigger than individual achievements. For the first time, researchers have demonstrated solutions to the core problems that have kept quantum computers confined to laboratories. While fully fault-tolerant quantum systems remain years away, Princeton engineers have built a superconducting qubit that lasts three times longer than today's best versions, IBM has unveiled processors targeting quantum advantage by 2026, and Harvard-MIT researchers have demonstrated the first scalable error-correction architecture—all within a single transformative month.
This article examines each breakthrough in depth, explores what they mean for the future of computing, and explains why industry leaders believe we're witnessing a pivotal moment in technological history.
The quantum computing industry has long struggled with a fundamental problem: qubits don't last long enough to perform useful calculations. Information stored in quantum states typically degrades in microseconds, making complex operations nearly impossible. This limitation, called coherence time, has been the primary barrier to scaling quantum computers beyond laboratory demonstrations.
On November 5, 2025, a Princeton University team published research in Nature that dramatically extends this critical timeframe. The team reported that its new qubit retains its quantum state for more than 1 millisecond, roughly three times longer than the previous best reported in a laboratory setting and nearly 15 times longer than the industry standard for large-scale processors.
While one millisecond sounds brief, it represents a revolutionary improvement in quantum computing terms. That extra time allows quantum processors to execute far more operations before errors accumulate to unusable levels.
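A rough calculation makes the difference concrete. The sketch below assumes a two-qubit gate takes about 25 nanoseconds, a ballpark figure for superconducting hardware rather than a number from the Princeton paper, and derives the three coherence times from the ratios quoted above.

```python
# Back-of-the-envelope gate budget: how many operations fit inside a
# coherence window. The 25 ns gate time is an assumed ballpark figure for
# superconducting hardware; coherence times follow the article's ratios.

GATE_TIME_NS = 25  # assumed two-qubit gate duration

qubits = [
    ("industry standard (~1 ms / 15)", 1_000_000 / 15),
    ("previous lab best (~1 ms / 3)", 1_000_000 / 3),
    ("Princeton qubit (>1 ms)", 1_000_000),
]

for label, coherence_ns in qubits:
    gates = coherence_ns / GATE_TIME_NS
    print(f"{label}: ~{gates:,.0f} gates before decoherence")
```

Under these assumptions, the budget grows from roughly 2,700 gates to 40,000, which is the practical meaning of a longer-lived qubit.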
The breakthrough came from an unconventional collaboration between quantum engineers and materials scientists. Andrew Houck, Princeton's dean of engineering, and Nathalie de Leon, co-director of Princeton's Quantum Initiative, first introduced the use of tantalum for superconducting chips in 2021 in collaboration with Princeton chemist Robert Cava, an expert in superconducting materials who had no background in quantum computing.
Their key insight involved addressing the two main sources of energy loss in quantum circuits: surface defects in the metal and substrate quality. The team took a two-pronged approach: using tantalum metal for the circuits and replacing traditional sapphire substrates with high-purity silicon.
Tantalum offers a crucial advantage over aluminum, the standard material in quantum chips. Tantalum is exceptionally robust and can survive the harsh cleaning needed to remove contamination left by the fabrication process. This durability means fewer microscopic defects that trap and waste energy during quantum operations.
The switch from sapphire to silicon proved equally important. Silicon is widely available with extremely high purity—a property the semiconductor industry has spent decades optimizing. This readily available purity helps minimize energy loss that occurs through the substrate material.
Perhaps most significantly, the improvements don't just add up; they multiply. Because per-qubit gains compound across every qubit in a system, swapping the current industry-best qubits for Princeton's design would let a hypothetical 1,000-qubit computer work roughly 1 billion times better.
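A toy model shows why. If each of n qubits survives a circuit step with probability p, the whole step succeeds with probability p^n, so any per-qubit gain is raised to the n-th power. The error rates below are illustrative assumptions chosen to mirror the reported 15-fold lifetime gain, not values from the paper.

```python
# Why per-qubit improvements multiply across a processor. Assumes a 2.1%
# per-qubit error per circuit step, reduced 15-fold to mirror the reported
# lifetime gain; both numbers are illustrative, not from the paper.

n_qubits = 1000
p_old = 1 - 0.021        # per-qubit survival probability, old design
p_new = 1 - 0.021 / 15   # per-qubit survival probability, improved design

gain = (p_new / p_old) ** n_qubits
print(f"~{gain:.1e}x higher success probability for a {n_qubits}-qubit step")
# prints roughly 4e8, on the order of the billion-fold figure quoted above
```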
The design's compatibility with existing architectures means companies like Google and IBM could potentially integrate Princeton's qubits into their current processors. Michel Devoret, chief scientist for hardware at Google Quantum AI, which partially funded the research, said that the challenge of extending the lifetimes of quantum computing circuits had become a "graveyard" of ideas for many physicists. Google's Willow chip, currently among the industry's best, could theoretically operate 1,000 times more effectively with Princeton's qubit design.
The research received funding from the U.S. Department of Energy National Quantum Information Science Research Centers and the Co-design Center for Quantum Advantage, demonstrating the federal government's continued investment in quantum leadership.
Just one week after Princeton's announcement, IBM unveiled a comprehensive roadmap at its annual Quantum Developer Conference on November 12, 2025. The company introduced two new processors and announced progress toward both near-term quantum advantage and long-term fault tolerance.
IBM unveiled IBM Quantum Nighthawk, its most advanced quantum processor yet, with an architecture designed to complement high-performing quantum software and deliver quantum advantage next year. Quantum advantage represents the milestone where quantum computers can solve specific problems better than any classical computer, a goal the industry has pursued for years.
Nighthawk's specifications reveal IBM's scaling strategy: 120 qubits linked by 218 tunable couplers in a square lattice, expected to support circuits with roughly 5,000 two-qubit gates, about 30 percent more complexity than its predecessor could handle.
The processor's architecture focuses on connectivity. More couplers between qubits allow for more complex quantum circuits while maintaining low error rates. This approach differs from simply adding more qubits; instead, it emphasizes how well those qubits work together, as the routing sketch below illustrates.
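On a sparse coupling map, a two-qubit gate between distant qubits must be routed through intermediaries with SWAP gates, and each SWAP adds error; an extra coupler can remove that overhead entirely. Both layouts below are hypothetical illustrations, not IBM's actual topology.

```python
from collections import deque

def swaps_needed(coupling, a, b):
    """BFS shortest path on a coupling graph; SWAPs needed = path length - 1."""
    seen, queue = {a}, deque([(a, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == b:
            return max(dist - 1, 0)
        for nxt in coupling[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None  # unreachable

# A 1D chain of six qubits, then the same chain with one long-range coupler.
chain = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
extra = {q: list(n) for q, n in chain.items()}
extra[0].append(5)
extra[5].append(0)

print(swaps_needed(chain, 0, 5))  # 4 SWAPs on the bare chain
print(swaps_needed(extra, 0, 5))  # 0 SWAPs once the coupler exists
```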
IBM's roadmap extends beyond Nighthawk's initial capabilities. By 2028, Nighthawk-based systems could support up to 15,000 two-qubit gates enabled by 1,000 or more connected qubits extended through long-range couplers.
Recognizing that quantum advantage claims need rigorous validation, IBM, Algorithmiq, researchers at the Flatiron Institute, and BlueQubit are contributing new results to an open, community-led quantum advantage tracker that systematically monitors and verifies emerging demonstrations of advantage.
This collaborative approach addresses skepticism that has surrounded previous quantum advantage claims. By establishing transparent benchmarks and encouraging independent verification, the quantum community aims to build credibility as systems approach practical utility.
While Nighthawk targets near-term quantum advantage, IBM's experimental Loon processor addresses the long-term challenge of fault tolerance. IBM Quantum Loon is the company's experimental processor that demonstrates, for the first time, all the key processor components needed for fault-tolerant quantum computing.
Loon incorporates several innovations, most notably long-range couplers that connect distant qubits on the same chip, providing the connectivity that qLDPC error-correcting codes demand.
Complementing the hardware advances, IBM achieved a critical software milestone a full year ahead of schedule: it has proven that classical computing hardware can accurately decode errors in real time, in under 480 nanoseconds, using qLDPC codes. This speed is essential for error correction to work faster than new errors accumulate.
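Why does sub-microsecond decoding matter? Each correction must be computed before the next round of syndrome measurements arrives. One standard way to hit such budgets is a precomputed lookup table; the toy below decodes the two syndrome bits of a three-bit repetition code that way. It illustrates the real-time constraint only and is not IBM's qLDPC decoder.

```python
import time

# Toy real-time decoder: a precomputed lookup table for the 3-bit repetition
# code. The syndrome bits are the parities of neighboring data bits, and the
# table maps each syndrome to the single data bit to flip.

LOOKUP = {
    (0, 0): None,  # no error detected
    (1, 0): 0,     # flip data bit 0
    (1, 1): 1,     # flip data bit 1
    (0, 1): 2,     # flip data bit 2
}

def decode(syndrome):
    return LOOKUP[syndrome]

n = 1_000_000
start = time.perf_counter()
for _ in range(n):
    decode((1, 1))
ns_per_decode = (time.perf_counter() - start) / n * 1e9
print(f"~{ns_per_decode:.0f} ns per decode in Python; dedicated hardware is far faster")
```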
IBM is also addressing manufacturing challenges. The primary fabrication of IBM's quantum processor wafers is being undertaken at an advanced 300mm wafer fabrication facility at NY Creates' Albany NanoTech Complex in New York.
The move to industry-standard 300mm wafers integrates quantum computing with established semiconductor manufacturing infrastructure, reducing costs, improving reproducibility, and marking a necessary step toward eventual mass production.
On November 11, 2025, a Harvard-MIT collaboration published research in Nature demonstrating the first truly scalable quantum error correction architecture. While Princeton focused on making individual qubits more stable and IBM showcased processor architectures, the Harvard-MIT team proved they could detect and correct errors across a large quantum system.
The same quantum properties that give quantum computers their extraordinary power also make them fundamentally fragile. Qubits are inherently susceptible to slipping out of their quantum states and losing their encoded information, making error correction a core prerequisite for achieving large quantum machines.
Classical computers handle errors through redundancy—storing the same information multiple times. Quantum computers can't simply copy quantum states due to fundamental physics principles. Instead, they require sophisticated error correction schemes that encode information across multiple physical qubits to create logical qubits that maintain coherence.
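The classical three-bit repetition code is the simplest way to see the idea: spread one logical bit across several physical carriers, then vote. Real quantum codes must also handle phase errors without directly measuring the data, but the sketch below captures the core move.

```python
import random

# Minimal classical analogy for error correction: the 3-bit repetition code.
# One logical bit is encoded into three physical bits; a majority vote
# corrects any single bit-flip.

def encode(bit):
    return [bit, bit, bit]

def noisy_channel(bits, p_flip=0.05):
    return [b ^ (random.random() < p_flip) for b in bits]

def decode(bits):
    return int(sum(bits) >= 2)  # majority vote

random.seed(0)
trials = 100_000
failures = sum(decode(noisy_channel(encode(1))) != 1 for _ in range(trials))
print(f"logical error rate: {failures / trials:.4f} vs. physical rate 0.05")
```

With a 5 percent physical flip rate, the logical rate lands near 0.7 percent, because failure now requires at least two simultaneous flips.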
The Harvard-MIT breakthrough demonstrates this process at unprecedented scale and reliability.
In the new paper, the team demonstrated a fault-tolerant system using 448 atomic quantum bits manipulated with an intricate sequence of techniques to detect and correct errors. The system employs several sophisticated mechanisms:
Quantum teleportation: Transferring quantum states between atoms without physical contact
Physical and logical entanglement: Creating correlations between qubits at different levels
Entropy removal: Eliminating accumulated errors from the system
Most importantly, the team combined these methods to create complex circuits with dozens of error-correction layers, and the system suppresses errors below a critical threshold: the point at which adding more qubits reduces errors rather than increasing them.
This threshold represents a fundamental milestone. Below it, larger quantum computers become more reliable rather than less, opening a scaling path to practical machines.
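A standard heuristic makes the threshold concrete: below it, the logical error rate falls roughly as p_L ≈ A(p/p_th)^((d+1)/2), where d is the code distance, p the physical error rate, and p_th the threshold. The constants below are illustrative, not fitted values from the Nature paper.

```python
# Below-threshold scaling heuristic: p_logical ~ A * (p / p_th) ** ((d+1)/2).
# A, p, and p_th are illustrative assumptions, not values from the paper.

A, p_th = 0.1, 0.01

for p in (0.005, 0.02):  # one physical rate below threshold, one above
    trend = "shrinks" if p < p_th else "grows"
    print(f"p = {p} (logical error {trend} with distance):")
    for d in (3, 5, 7, 9):
        p_logical = A * (p / p_th) ** ((d + 1) / 2)
        print(f"  distance {d}: ~{p_logical:.2e}")
```

Below threshold, each increase in code distance cuts the logical error rate; above it, growing the code makes things worse.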
Mikhail Lukin, co-director of the Quantum Science and Engineering Initiative and senior author of the paper, said, "For the first time, we combined all essential elements for a scalable, error-corrected quantum computation in an integrated architecture." Lead author Dolev Bluvstein, now an assistant professor at Caltech, noted that while technical challenges remain for million-qubit systems, the team now has an architecture that is conceptually scalable.
The Harvard-MIT team uses neutral rubidium atoms as qubits, manipulated with lasers. This approach differs from the superconducting circuits used by IBM, Google, and Princeton, demonstrating that multiple qubit platforms are advancing simultaneously. Hartmut Neven, vice president of engineering at the Google Quantum AI team, said the new paper came amid an "incredibly exciting" race between qubit platforms.
This research received federal funding from multiple agencies including DARPA, the Department of Energy, and the National Science Foundation, reflecting broad government support for quantum research across different technological approaches.
The convergence of these three advances addresses quantum computing's core challenges from complementary angles. Princeton improved the fundamental quality of individual qubits. IBM demonstrated manufacturing scalability and practical processor architectures. Harvard-MIT proved that error correction can work at the system level.
Several industries are positioned to benefit from near-term quantum computers:
Drug Discovery and Materials Science: Quantum computers excel at simulating molecular behavior—a task where classical computers struggle. Pharmaceutical companies could design new drugs by accurately modeling how molecules interact with biological systems. Materials scientists could discover new compounds with specific properties for batteries, semiconductors, or superconductors.
Financial Modeling: Complex financial systems with many interdependent variables represent ideal quantum computing applications. Portfolio optimization, risk analysis, and derivative pricing could all benefit from quantum approaches.
Cryptography and Security: Quantum computers pose both challenges and opportunities for cryptography. While they threaten current encryption methods, they also enable quantum key distribution for unhackable communications.
Artificial Intelligence and Machine Learning: Certain machine learning algorithms could see dramatic speedups on quantum hardware, particularly for training large models or optimizing complex neural network architectures.
IBM anticipates that the first cases of verified quantum advantage will be confirmed by the wider community by the end of 2026. This aggressive timeline reflects confidence that hardware, software, and error correction are maturing simultaneously.
For fault-tolerant systems capable of running any quantum algorithm reliably, timelines extend to 2029-2035. IBM targets 2029 for its first large-scale fault-tolerant system. Independent analysts suggest 2035 as a conservative estimate for fully error-corrected quantum computers.
The November 2025 breakthroughs don't guarantee these timelines will hold—quantum computing has seen optimistic predictions before. However, the simultaneous progress across materials science, processor architecture, and error correction suggests the field has moved beyond isolated laboratory achievements to coordinated engineering challenges.
Success requires more than just hardware breakthroughs. The quantum computing ecosystem encompasses:
Software and Algorithms: IBM's Qiskit framework and similar tools must become more accessible to developers outside quantum physics. New quantum algorithms need development for practical problems.
Manufacturing Infrastructure: Princeton's use of silicon substrates and IBM's 300mm wafer fabrication demonstrate integration with semiconductor manufacturing. This connection to established industry reduces costs and increases reproducibility.
Workforce Development: Universities and companies must train quantum engineers, algorithm developers, and end-users. The field needs specialists who understand both quantum mechanics and practical computing applications.
Standards and Benchmarking: The quantum advantage tracker IBM supports represents necessary infrastructure for validating claims and comparing systems. Standardized benchmarks help users understand what quantum computers can actually deliver.
The three breakthroughs highlighted here represent just a portion of quantum computing research. Multiple qubit technologies compete for dominance:
Superconducting Qubits (IBM, Google, Princeton): Currently most mature, operating at near absolute zero temperatures. Princeton's tantalum-silicon innovation advances this approach.
Trapped Ions (Quantinuum, IonQ): Uses individual ions held by electromagnetic fields. Quantinuum recently launched its Helios system with 98 qubits and claims the highest accuracy commercially available.
Neutral Atoms (Harvard-MIT, QuEra): Uses laser-trapped atoms like the Harvard-MIT error correction system. Promises excellent connectivity and scalability.
Photonic Qubits (Xanadu, PsiQuantum): Encodes information in light particles, potentially operating at room temperature.
Topological Qubits (Microsoft): Theoretical approach using exotic quantum states that resist errors inherently. Microsoft announced its Majorana 1 chip in February 2025.
This diversity suggests no single approach has emerged as definitively superior. Different applications might favor different qubit types, similar to how classical computing uses various processor architectures for different tasks.
Despite November's progress, significant obstacles persist:
Temperature Requirements: Most approaches require operation at temperatures near absolute zero, demanding expensive cryogenic systems. Scaling to thousands or millions of qubits while maintaining these temperatures poses engineering challenges.
Error Rates: While improving, error rates remain far higher than in classical computers. Current systems require extensive error-correction overhead, meaning hundreds or thousands of physical qubits might be needed for each logical qubit; a rough estimate of that overhead follows this list.
Connectivity: Quantum computers need qubits to interact with each other. As systems scale, maintaining connectivity becomes increasingly difficult. IBM's focus on tunable couplers addresses this, but it remains a bottleneck.
Algorithm Development: Many potential quantum algorithms remain theoretical. Translating mathematical ideas into working code for noisy, error-prone hardware requires significant development.
Cost and Accessibility: Current quantum computers cost tens of millions of dollars and require specialized facilities. Cloud access has improved availability, but costs remain prohibitive for many potential users.
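As noted under Error Rates above, here is a rough overhead estimate. It assumes surface-code-style scaling with about 2d^2 physical qubits per logical qubit; all constants are illustrative assumptions, and codes such as qLDPC aim to cut this overhead substantially.

```python
# Rough error-correction overhead under surface-code-style assumptions:
# p_logical ~ A * (p / p_th) ** ((d + 1) / 2), with ~2*d*d physical qubits
# (data plus measurement) per logical qubit. All constants are illustrative.

def physical_per_logical(p=0.001, p_th=0.01, target=1e-12, A=0.1):
    d = 3
    while A * (p / p_th) ** ((d + 1) / 2) > target:
        d += 2  # code distance stays odd
    return d, 2 * d * d

d, n_phys = physical_per_logical()
print(f"distance {d} -> ~{n_phys} physical qubits per logical qubit")
# prints: distance 21 -> ~882 physical qubits per logical qubit
```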
How do quantum computers work?
Quantum computers use qubits that leverage quantum mechanical properties like superposition and entanglement. Unlike classical bits that are either 0 or 1, qubits can exist in multiple states simultaneously. This enables quantum computers to explore many solutions at once, potentially solving certain problems exponentially faster than classical computers.
When will quantum computers become practical?
The timeline depends on the application. For specialized scientific problems, limited quantum advantage may arrive by 2026-2027. For broad commercial applications requiring fault-tolerant systems, 2029-2035 represents the realistic timeframe. Consumer quantum computers comparable to laptops are not expected in the foreseeable future.
Will quantum computers replace classical computers?
No. Quantum computers excel at specific types of problems: simulating quantum systems, certain optimization tasks, and cryptographic applications. For most everyday computing tasks like word processing, web browsing, and running business software, classical computers remain far more practical and cost-effective.
What is quantum advantage?
Quantum advantage (sometimes called quantum supremacy) occurs when a quantum computer solves a problem better than any classical computer, whether faster, more accurately, or by enabling previously impossible calculations. It doesn't mean quantum computers are better at all tasks, just that they've proven useful for at least some real-world problems.
What makes Princeton's new qubit significant?
Princeton's tantalum-silicon qubit lasts 15 times longer than current industry standards. This extended coherence time allows quantum processors to perform more operations before errors accumulate, enabling more complex calculations and better error correction. Because the improvement compounds across qubits, it could make 1,000-qubit systems roughly a billion times more effective.
What is quantum error correction and why does it matter?
Quantum error correction detects and fixes errors that occur during quantum computations without destroying the quantum information. It's essential because qubits are extremely fragile and error-prone. The Harvard-MIT breakthrough demonstrated error correction working below the critical threshold where adding more qubits actually reduces overall errors, a crucial milestone for scalable quantum computing.
Which companies and institutions lead in quantum computing?
IBM, Google, and Microsoft lead among large tech companies. Specialized startups include Quantinuum (a Honeywell spinoff), IonQ, Rigetti, D-Wave, and QuEra. Academic institutions like Princeton, Harvard, MIT, and Caltech also drive fundamental research. The field remains competitive with no single dominant player.
Which industries will adopt quantum computing first?
Pharmaceutical and biotechnology companies will likely be early adopters for drug discovery and protein folding. Financial services firms are exploring portfolio optimization and risk modeling. Materials science, cryptography, artificial intelligence, and supply chain optimization represent other near-term applications. Use cases requiring simulation of quantum mechanical systems benefit most.
November 2025's convergence of breakthroughs from Princeton, IBM, and Harvard-MIT represents more than incremental progress. For the first time, solutions to quantum computing's fundamental challenges—qubit stability, processor scalability, and error correction—have advanced simultaneously.
Princeton demonstrated that materials innovation can dramatically extend qubit coherence times while maintaining compatibility with industry-standard manufacturing. IBM showed that practical quantum processors targeting verified quantum advantage are within reach by 2026, while simultaneously building the foundation for fault-tolerant systems by decade's end. Harvard-MIT proved that quantum error correction can suppress errors below the critical threshold necessary for scalable systems.
These advances don't guarantee quantum computing will soon replace classical systems for everyday tasks. Instead, they signal that quantum computers are transitioning from research curiosities to specialized tools for specific problems where they offer genuine advantages.
The quantum computing industry now faces primarily engineering challenges rather than fundamental physics questions. Manufacturing must scale from laboratory wafers to mass production. Software frameworks must become accessible to developers outside quantum physics. Applications must mature from proof-of-concept demonstrations to commercial products.
If current momentum continues, 2026 could see the first verified instances of quantum advantage in practical applications. By 2030, fault-tolerant quantum computers might tackle problems that would take classical supercomputers millennia to solve. November 2025 may be remembered as the month this future shifted from aspiration to expectation.
For scientists, engineers, and business leaders watching the quantum revolution unfold, the message is clear: prepare now. The companies and industries that understand quantum computing's capabilities—and limitations—will be positioned to capitalize when these systems mature into indispensable tools.