Quantum photonics roadmap — how Xanadu and PsiQuantum are looking to turn beams of light into qubits

Xanadu Lab wide shot (Image credit: Xanadu)

This article is part of a series documenting quantum computing technologies and their ecosystem – the differing approaches, the players behind them, and the technologies driving us towards a quantum future. Part one looked at superconducting qubits (materialized by industry giants such as IBM and Google) and trapped-ion qubits (through IonQ and Quantinuum).

In this second part, we’ll be looking at quantum photonics – a light-based technique for defining the quantum unit of computation, the qubit. We’ll take a brief look at the what and the why of quantum photonics, and then materialize it by focusing on two particular companies, their roadmaps, and their technologies: Toronto-based Xanadu Quantum Technologies (which is making a play for a public Nasdaq listing in the first quarter of 2026, at an estimated $3.6 billion enterprise valuation, through a SPAC deal); and the Palo Alto, California-headquartered PsiQuantum (PSIQ.PVT, with an estimated $7 billion valuation buoyed by a $1 billion Series E funding round in late 2025).

As with our previous roadmap analysis, this won’t be a deeply technical article; it’s a technology and roadmap overview that serves up digestible bites on the underlying technologies, their evolution, current state, and expected next steps. For a better understanding of what quantum computing is all about, Tom’s Hardware has a more explanatory quantum computing article you can familiarize yourself with first.


What is Quantum Photonics?

To answer what quantum photonics actually is, we have to start with the basics: photonics is the use of light to transmit encoded information. The most widespread application of photonics that’s already part of our infrastructure today materializes through fiber optic cables: within them, light travels at a substantial fraction of its vacuum speed (which matters for latency) and, crucially, without energy losses to electrical resistance.

Because light can carry multiple wavelengths (think colors, ranging through the visible spectrum and beyond), information in fiber optic cables can be encoded in multiple channels within the same beam (a technique known as wavelength-division multiplexing) for increased bandwidth.

This classical approach to photonics uses billions of photons (the essential unit of light) in coherent beams, with other properties such as phase and polarization also serving as data carriers. Classical photonics is already a well-known quantity, with applications spanning intercontinental information transit, data center interconnects, and, more specifically, inter-chip communication.

The transition towards the quantum realm occurs when you stop looking at light as a beam and focus on the singular elements that compose it: photons. Quantum photonics, then, makes use of single-photon sources and single-photon detectors to encode and decode information through the specific strengths of quantum properties: entanglement (where two photons behave as a single, correlated system) and superposition (where the whole universe of possible information values can be contained in a single qubit until it is measured).

An IBM Quantum Nighthawk chip held by a gloved hand.

(Image credit: IBM)

This brings us to the great differentiator in current quantum photonics: the way operations are run on individual photons, and how information is encoded within them. PsiQuantum uses what’s known as a dual-rail encoding approach: informational states are derived from looking at a photon’s “choice” between path A (0) and path B (1) (these paths being known as waveguides). Xanadu approaches it through the lens of continuous-variable encoding: instead of looking at the photon itself, it looks at the photon’s light field and how it’s distributed (across properties like amplitude and phase), ‘squeezing’ them (reducing uncertainty in the amplitude variable at the cost of increased uncertainty in phase) to encode data.
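To make ‘squeezing’ slightly more concrete, here’s a minimal sketch using Strawberry Fields, Xanadu’s open-source simulation library for continuous-variable photonics. The library choice and the parameter values below are our illustrative assumptions, not details of Xanadu’s hardware:

```python
import strawberryfields as sf
from strawberryfields.ops import Sgate

def quad_variances(r):
    """Simulate one optical mode and return the noise (variance) in its
    amplitude (x) and phase (p) quadratures after squeezing by r."""
    prog = sf.Program(1)
    with prog.context as q:
        if r:
            Sgate(r) | q[0]  # squeeze the vacuum state
    state = sf.Engine("gaussian").run(prog).state
    cov = state.cov()        # 2x2 covariance matrix over [x, p]
    return cov[0, 0], cov[1, 1]

# Vacuum: equal uncertainty in both quadratures. Squeezed: amplitude
# uncertainty shrinks while phase uncertainty grows to compensate --
# the trade-off Xanadu exploits to encode data in the light field.
print("vacuum   (var_x, var_p):", quad_variances(0.0))
print("squeezed (var_x, var_p):", quad_variances(0.6))
```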

These are two fundamentally different routes towards the same photonics-based, large-scale, error-corrected quantum computer, each with its own set of engineering problems. The end goal, however, is the same: when you can generate, manipulate, and measure individual photons, light stops being a mere transmission medium, and individual particles become the computational substrate itself.

Advantages, challenges, and the mechanics of photonic qubits

Quantum photonics is claimed to have some operational advantages over other approaches: unlike superconducting qubits, photons can be operated on at room temperature, theoretically reducing installation, running, and maintenance costs alike.

The natural physical makeup of photons also means that photonic qubits are less susceptible to environmental interference, such as electromagnetic noise and thermal fluctuations. Scaling-wise, photonics-based chips can leverage existing semiconductor manufacturing infrastructure, and the natural speed of light means that gate operations (the inter-qubit operations that advance a computation towards a useful result) should have a far higher speed ceiling than in other approaches, such as trapped ions.

There’s always an opportunity cost to each quantum approach, however. In PsiQuantum’s dual-rail approach, the identical photons required for reliable entanglement are very hard to generate: minute differences in wavelength, polarization, and spatial mode destroy the indistinguishability the system depends on. Photon generation (usually accomplished by shining a laser through a nonlinear crystal) is also a probabilistic operation: sometimes no photon is generated; sometimes, one is; and sometimes, more than one.
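As a rough illustration of how punishing that probabilistic generation is, here’s a toy model in Python. It assumes (our assumption, for illustration only) a spontaneous pair source whose photon-pair count per laser pulse follows a thermal distribution, as is typical for such sources:

```python
import numpy as np

rng = np.random.default_rng(0)
mean_pairs = 0.1  # weak pump keeps multi-photon events rare (illustrative)

# Thermal photon-number statistics: P(n) = (1 - p) * p^n, with the
# parameter p chosen so the distribution has the desired mean.
p = mean_pairs / (1 + mean_pairs)
pairs = rng.geometric(1 - p, size=1_000_000) - 1  # shift support to {0, 1, ...}

print("no photons :", (pairs == 0).mean())  # most pulses produce nothing
print("one pair   :", (pairs == 1).mean())  # the useful, heralded events
print("multi-pair :", (pairs >= 2).mean())  # noise that corrupts computations
```

Cranking up the pump to get more single pairs also inflates the multi-pair events, which is exactly the trade-off that makes these sources so hard to engineer.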

All of this leads us to the harsh truth that in quantum photonics – particularly in its dual-rail design – it’s easy to lose well over 90% of generated photonic qubits (at generation or collection) before they ever get a chance to perform a useful computation. At those rates, building a 100-qubit photonic system can require generating upwards of 10,000 photons. Everything else is lost.
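The arithmetic behind those figures is straightforward, and worth spelling out: 100 surviving qubits out of 10,000 generated photons implies a survival rate closer to 1% than 10%, consistent with the ‘more than 90%’ framing above. A quick sketch:

```python
# Back-of-the-envelope: photons that must be generated so that `need`
# usable qubits survive, given a per-photon survival probability.
def photons_required(need: int, survival: float) -> int:
    return round(need / survival)

print(photons_required(100, 0.10))  # 90% loss -> 1,000 photons generated
print(photons_required(100, 0.01))  # 99% loss -> 10,000 photons generated
```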

PsiQuantum’s way of operating on individual photons means there’s no informational backup of the kind you get when operating on classical light beams: when the photon is lost, everything is. You can amplify a beam of billions of photons, but you can’t do the same for a single one (a quirk of quantum mechanics known as the no-cloning theorem). And because photons are such vanishingly small particles, a minute error in directionality means the emitted particle can easily fail to be detected at the other end (think of how a small angular difference at a bullet’s exit compounds into missing the bullseye).

Xanadu Lab

(Image credit: Xanadu)

Xanadu’s approach, on the other hand, sidesteps the requirement for photonic “perfection” at generation and is more tolerant to photon loss (the light fields don’t completely vanish on individual photon loss). But it does introduce different error correction challenges – errors are continuous (noise is present in amplitude and phase measurements), while PsiQuantum’s issues are discrete (photon present vs photon absent, resulting in discrete bit flips in calculations).
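A toy numerical contrast makes the distinction between the two error models clearer (the loss probability and noise level below are invented purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)

# Dual-rail (PsiQuantum): errors are discrete. A photon either arrives
# or it doesn't, so a loss wipes out the whole qubit value at once.
sent = np.ones(12, dtype=int)               # twelve '1' qubits
lost = rng.random(sent.size) < 0.2          # assumed 20% loss rate
received = np.where(lost, -1, sent)         # -1 marks "no photon detected"

# Continuous-variable (Xanadu): errors are analog. Every amplitude or
# phase reading carries small Gaussian jitter that error correction
# must round back towards the nearest valid encoded value.
ideal = np.zeros(12)                        # ideal quadrature readings
measured = ideal + rng.normal(0, 0.15, 12)  # assumed noise level

print("dual-rail:", received)
print("CV noise :", np.round(measured, 3))
```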

Clearly, the base technology of photonics can serve very different approaches. PsiQuantum bets that silicon photonics manufacturing can overcome the drawbacks of its dual-rail approach, using scale and engineering precision to reduce errors and improve photon measurement reliability; Xanadu bets that its intrinsically higher tolerance to process imperfections enables a faster timeline to quantum advantage. Or so each hopes.

Xanadu's approach

Founded in 2016, Xanadu’s declared mission is to build a fault-tolerant photonic quantum computing datacenter in the early 2030s. To do that, the company has been developing a particular qubit concept pioneered as early as 2001 – GKP (Gottesman-Kitaev-Preskill) qubits. Xanadu is seemingly keeping its cards close to its chest, however, declining to materialize its expectations in roadmap form.

Instead, Xanadu declares its innovations through scientific publications and post-facto announcements of executed milestones, having defined its fault-tolerant target architecture as early as 2020. That blueprint for a fault-tolerant quantum future arrived in tandem with the company’s first quantum device demonstration, in Fall 2020, with its X8 photonic chip – a 4mm x 100mm, 8-qubit device fabricated on a silicon nitride process.

By June 2022, the company had introduced Borealis – its first fully programmable photonic processor (across 1,200 parameters), leveraging 216 squeezed-state photonic qubits and enabling the company to claim quantum advantage through a peer-reviewed paper published in Nature.

The problem this advantage was demonstrated on is a very specific application – namely, Gaussian Boson Sampling (GBS). Xanadu claimed that top-of-the-line supercomputers running the available state-of-the-art algorithms would take around 9,000 years to work through that problem space – Borealis did it in 36 microseconds. Alongside this scientific success, Xanadu also managed to offer the first photonic quantum computer available on the cloud, through Amazon Web Services’ Braket, with quantum operations handled through Xanadu’s PennyLane open-source, quantum hardware-agnostic software stack.
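For a flavor of what a GBS program looks like in code, here’s a minimal sketch written against Strawberry Fields; Borealis itself used 216 squeezed modes, so the mode count and squeezing values below are purely illustrative:

```python
import strawberryfields as sf
from strawberryfields.ops import Sgate, Interferometer, MeasureFock
from scipy.stats import unitary_group

modes = 4
U = unitary_group.rvs(modes, random_state=42)  # random linear-optics network

prog = sf.Program(modes)
with prog.context as q:
    for mode in q:
        Sgate(0.4) | mode    # inject squeezed vacuum into every mode
    Interferometer(U) | q    # interfere all the modes with each other
    MeasureFock() | q        # count photons at each output port

result = sf.Engine("gaussian").run(prog)
print(result.samples)  # one photon-number pattern -- a single GBS 'sample'
```

Simulating these samples classically requires computing hafnians of large matrices, a cost that explodes with mode count – which is where the 9,000-year classical estimate comes from.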

In early 2025 (again through a peer-reviewed, Nature-published paper), Xanadu demonstrated its progress towards its fault-tolerant computing datacenter with Aurora – a modular scaling vehicle operating at room temperature (barring the cryogenic photon detection system) and harnessing 12 physical qubits across 35 integrated photonics chips. These were integrated across four modular server racks with fiber optic interconnects and over 13km of optical fiber running between components (including the loops required for photon timing matching).

Aurora is the company’s milestone in demonstrating all the required architectural elements of its 2020 fault-tolerant blueprint operating together. If X8 was a proof of concept and Borealis the demonstration of achievable quantum advantage through Xanadu’s approach, Aurora is the vehicle that proved its modular integration aspirations achievable.

Progress has accelerated since then: by June 2025, Xanadu was demonstrating the world’s first on-chip generation of GKP states (its error-resistant photonic qubits), with silicon manufacturing processes handling the required silicon nitride waveguides on 300mm wafers. Perhaps even more impressively, the company demonstrated its ability to integrate error correction at the chip level, while significantly improving its photon detection efficiency.

People working at desks in a wide shot of Xanadu's lab

(Image credit: Xanadu)

The principal issue still to be solved, as the company identifies it, relates to the quality of the GKP-state photonic qubits themselves, which materializes as optical loss. That identified bottleneck directly explains Xanadu’s July 2025 announcement of a strategic partnership with HyperLight and its TFLN (thin-film lithium niobate) chiplet platform technology, which replaces the silicon nitride waveguide design with HyperLight’s lithium niobate solution, significantly reducing waveguide and electro-optic chip losses while retaining high-volume production through semiconductor manufacturing technologies. A second angle is the strategic collaboration announced in August 2025 with DISCO Corporation, a developer of ultra-precision grinding and polishing machinery for photonic components, aiming to improve the quality of GKP photon generation from laser interactions.

Looking to the future, the company’s goal is to achieve up to 1,000 logical qubits by 2029; barring unexpected breakthroughs, it expects to get there at a 100:1 physical-to-logical ratio, requiring around 100,000 physical qubits. Beyond pure qubit count, applications are the name of the game: in December 2025, Xanadu announced a breakthrough application in photodynamic cancer therapy, a medical effort that joins its ongoing partnership with AstraZeneca (molecular simulations, protein folding, drug-protein binding affinity, and optimization of molecular conformations). Additionally, the company has developed quantum applications for machine learning (including classification and neural network implementations).

PsiQuantum

Founded in 2016 (Palo Alto, California), PsiQuantum has grown in scale in the intervening nine years, reaching its $7 billion valuation while expanding its facilities across Chicago, Australia, and the United Kingdom. The company hit the ground running with a particular vision: to skip the current era of Noisy Intermediate-Scale Quantum (NISQ) computers and focus its funding and development efforts on tackling the architectural, error-correction, and manufacturing problems of its chosen quantum computing architecture. The goal: to deliver a 1 million-plus qubit design as soon as feasible.

This decision flies in the face of most other quantum industry players, who have elected to iterate from proof-of-concept vehicles through platform development and incremental, step-by-step scaling.

PsiQuantum’s ethos informed its choice of a photonic quantum architecture, which, as we’ve seen, shares important common ground with semiconductor techniques, leveraging decades of already-funded, already-solved manufacturing research and development.

PsiQuantum worked in the shadows between its 2016 founding and 2021 – the moment the company materialized a very public partnership with GlobalFoundries’ Fab 8, one of the world’s leading CMOS and – yes – photonics manufacturing operations. And even that early, PsiQuantum knew exactly what it required of GlobalFoundries’ facilities: manufacturing of its Omega quantum devices.

One year later, the company was already validating GlobalFoundries’ output, testing single-photon sources, photonic switches, and waveguide-integrated on-chip photon detectors, and demonstrating quantum entanglement.

PsiQuantum test assembly facility

(Image credit: PsiQuantum)

Then, silence – up until 2025, when PsiQuantum finally revealed its work through the publication of the paper “A manufacturable platform for photonic quantum computing” in Nature. It finally shed light on what Omega is all about, and on the engineering systems designed to scale it up to 1 million-plus qubits: compatibility with 300mm wafer platforms; silicon nitride waveguides; telecom-band (1550nm) single-photon sources, guaranteeing compatibility with existing fiber infrastructure; and, arguably its most important development, the world’s first Barium Titanate (BTO) manufacturing process on 300mm wafers – BTO being demonstrably the highest-performance electro-optic material known, and a breakthrough for switching performance.

The paper claimed state-of-the-art performance in key metrics, conditional (as we’ve discussed before) on photon detection: if there’s no photon to detect, there’s no fidelity to measure. It’s an interesting way to expound on a quantum system and its components reaching “beyond state-of-the-art performance”, as PsiQuantum put it, even if it leaves open the question of whether the photon hit rates necessary for actual computational work are being achieved.

The confidence and planning are there: PsiQuantum places its achievement of a large-scale, error-corrected quantum computer somewhere in the 2027-2029 timeframe, ahead of most other quantum players, who tend to settle expectations around 2029 and beyond.

But first, PsiQuantum still needs to showcase actual full-system integration through its Alpha system program (whose housing, a 120,000-square-foot manufacturing and testing facility in Milpitas, California, is still under construction). Only then should the company be able to execute on the next phase: its 1 million-plus qubit system depends not only on technological development but also on a relatively more mundane requirement – finishing the facilities where the system is to be housed, which broke ground at the Illinois Quantum and Microelectronics Park (IQMP) in Chicago in late 2025.

What lies ahead?

The complex reality of quantum mechanics means there are two severe bottlenecks any company must face. The first is intellectual: very few people in the world are capable of designing and working on such systems. The second is economic: research, development, and manufacturing of quantum-related technologies are simply very, very capital-intensive.

Xanadu’s lack of an official, public roadmap seems to be a strategic, science-first choice (compare it to IBM’s extremely detailed roadmap for its superconducting qubits, which we explored in our previous article) – especially considering the way Xanadu has announced and executed on its plans for a large-scale, fault-tolerant quantum computer.

The one-two combo of announcing key milestones as they are executed and backing them with peer-reviewed scientific publications shows a company confident in its planned architecture, and its strategic partnership announcements align well with its identified bottlenecks.

Across the board, quantum is still a bet: no current quantum-related revenue can sustain development costs for pure-play quantum companies (something Google, Microsoft, and IBM don’t have to contend with), which helps explain the decisiveness of funding rounds and is perhaps a measure of their behind-closed-doors progress.

The bet is that when the tomorrow of quantum advantage comes, the investment will prove justified. Like IBM, IonQ, and the other companies in our previous roadmap article, both PsiQuantum and Xanadu are looking beyond the 2029 timeframe towards delivering large-scale, error-corrected quantum computers. And like IBM, Quantinuum, and IonQ, Xanadu has made it to DARPA’s Quantum Breakthrough Initiative (QBI) Stage B.

PsiQuantum hasn’t been a part of DARPA’s QBI specifically, but is still involved with DARPA in a different capacity, being one of two companies (the other is Microsoft) to qualify for the agency’s Underexplored Systems for Utility-Scale Quantum Computing (US2QC) Stage C program in February 2025. Beyond that, the company has seen both Australian and U.S. government backing; perhaps these government-corporation programs are among the best ways to evaluate the feasibility of a given quantum solution, considering the validation work required for inclusion.

Francisco Pires is a freelance news writer for Tom's Hardware with a soft side for quantum computing.
