April 9, 2026

Researchers show some quantum learning models are classically simulable

Fix for quantum ‘curse of dimensionality’ may undercut advantage over classical computing


Variational quantum computing is a hybrid quantum-classical approach that has emerged as one of the most promising applications for quantum devices. But this approach is hindered by the “barren plateau” phenomenon, which undermines its machine learning training capabilities. As a team of Los Alamos researchers suggests in a recent perspective piece in Nature Communications, and as they go on to demonstrate with a simulated quantum neural network, the architectures and techniques proposed to mitigate or altogether avoid barren plateaus make those models classically simulable.

“Barren plateaus typically result from what is known in the field as the ‘curse of dimensionality,’ where models need to navigate very big spaces, and finding the solution is like finding a needle in a haystack,” said Marco Cerezo, Los Alamos physicist and lead author on the perspective. “One avoids barren plateaus and circumvents the curse of dimensionality by restricting the model to a small subspace. But that solution might mean that the model can just as efficiently be simulated classically.”

If the connection between the absence of barren plateaus (for example, by restricting models to small subspaces) and classical simulability holds, the remedy for barren plateaus may prove worse than the problem. Any advantage quantum computers might have in solving machine learning tasks faster than classical supercomputers would be confined to barren-plateau-free models, and if those models can be simulated classically, that advantage evaporates.

Subspaces and classical simulability

Variational quantum computing’s hybrid approach aims to solve tasks by classically optimizing the parameters of a quantum circuit, thus extending the success of classical neural networks to the quantum realm. However, the exponentially large space of possible quantum states (the “curse of dimensionality”) leads to an extremely flat optimization landscape, a barren plateau on which the approach’s algorithms fail.
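The flat-landscape effect can be seen numerically. The sketch below is an illustration, not the team's code: it simulates a small hardware-efficient ansatz of RY rotations and CZ gates in plain NumPy and uses the parameter-shift rule to estimate how the variance of a cost gradient shrinks as qubits are added. The qubit counts, circuit depth, and sample size are arbitrary choices for demonstration.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix (real-valued)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def layer_unitary(angles):
    """Tensor product of per-qubit RY gates for one ansatz layer."""
    U = np.array([[1.0]])
    for a in angles:
        U = np.kron(U, ry(a))
    return U

def cz_chain_diag(n):
    """Diagonal of CZ gates applied along a line of n qubits."""
    d = np.ones(2 ** n)
    for q in range(n - 1):
        for b in range(2 ** n):
            if (b >> (n - 1 - q)) & 1 and (b >> (n - 2 - q)) & 1:
                d[b] = -d[b]
    return d

def cost(thetas, n, layers, cz, obs):
    """Expectation of Z on qubit 0 after the layered ansatz, from |0...0>."""
    psi = np.zeros(2 ** n)
    psi[0] = 1.0
    for l in range(layers):
        psi = cz * (layer_unitary(thetas[l * n:(l + 1) * n]) @ psi)
    return float(psi @ (obs * psi))

def grad_variance(n, samples=200, seed=0):
    """Variance over random angles of dE/d(theta_0), via parameter shift."""
    layers = n                      # depth grows with width (arbitrary choice)
    rng = np.random.default_rng(seed)
    cz = cz_chain_diag(n)
    # diagonal of the Z observable on qubit 0 (leftmost in the kron ordering)
    obs = np.concatenate([np.ones(2 ** (n - 1)), -np.ones(2 ** (n - 1))])
    grads = []
    for _ in range(samples):
        t = rng.uniform(0, 2 * np.pi, size=layers * n)
        tp, tm = t.copy(), t.copy()
        tp[0] += np.pi / 2
        tm[0] -= np.pi / 2
        grads.append((cost(tp, n, layers, cz, obs)
                      - cost(tm, n, layers, cz, obs)) / 2)
    return float(np.var(grads))

variances = {n: grad_variance(n) for n in (2, 4, 6, 8)}
for n, v in variances.items():
    print(f"{n} qubits: Var[dE/dtheta_0] ~ {v:.5f}")
```

On typical seeded runs the estimated gradient variance drops sharply between 2 and 8 qubits, the signature of an emerging barren plateau: as the accessible state space grows, the cost landscape flattens and gradients concentrate around zero.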

The team’s recent research showed that the presence or absence of barren plateaus is closely linked to whether the algorithm operates within a small subspace. A case-by-case analysis of all known models and techniques revealed a common pattern hidden in plain sight: once the effective small subspace is pinpointed, one need only emulate what the quantum computer does within it.

That is, the quantum computer’s approach may be simulable by classical computing. For every known barren-plateau-free model, the team’s study found that a classical computer could do the same thing the quantum computer does, a surprising result that undermines the quantum-only case for these promising quantum machine learning architectures and techniques. The Los Alamos team recently carried out a concrete demonstration with a specific architecture, with results published in PRX Quantum pointing to end-to-end simulability.
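The small-subspace idea can be made concrete with a toy example (our own illustration, not the construction in the papers): if a circuit is built only from global X and Z rotations, the measured observable never leaves the 3-dimensional operator space spanned by the collective operators X_tot, Y_tot and Z_tot, so a classical computer can track just three coefficients no matter how many qubits the circuit has.

```python
import numpy as np

def adjoint_step(coeffs, axis, theta):
    """Update (x, y, z), the coefficients of x*X_tot + y*Y_tot + z*Z_tot,
    under Heisenberg conjugation by a global exp(-i theta/2 sum_i P_i) gate."""
    x, y, z = coeffs
    c, s = np.cos(theta), np.sin(theta)
    if axis == "x":   # Z -> c*Z + s*Y,  Y -> c*Y - s*Z
        return (x, c * y + s * z, c * z - s * y)
    if axis == "z":   # X -> c*X - s*Y,  Y -> c*Y + s*X
        return (c * x + s * y, c * y - s * x, z)
    raise ValueError(axis)

def subspace_expectation(gates, n):
    """<0...0| U^dag Z_tot U |0...0>, tracked with only three numbers."""
    coeffs = (0.0, 0.0, 1.0)              # observable starts as Z_tot
    for axis, theta in reversed(gates):   # Heisenberg picture: last gate first
        coeffs = adjoint_step(coeffs, axis, theta)
    # on |0...0>:  <X_tot> = <Y_tot> = 0  and  <Z_tot> = n
    return coeffs[2] * n

def brute_force_expectation(gates, n):
    """Full 2^n statevector simulation, for cross-checking at small n."""
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Z = np.diag([1.0 + 0j, -1.0])
    psi = np.zeros(2 ** n, dtype=complex)
    psi[0] = 1.0
    for axis, theta in gates:
        P = X if axis == "x" else Z
        g = np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * P
        U = np.array([[1.0 + 0j]])
        for _ in range(n):
            U = np.kron(U, g)             # same rotation applied to every qubit
        psi = U @ psi
    # diagonal of Z_tot = sum_i Z_i in the computational basis
    diag = np.array([sum(1 - 2 * ((b >> k) & 1) for k in range(n))
                     for b in range(2 ** n)])
    return float(np.real(psi.conj() @ (diag * psi)))

gates = [("x", 0.7), ("z", 1.3), ("x", -0.4), ("z", 2.1)]
small = subspace_expectation(gates, 4)
check = brute_force_expectation(gates, 4)
huge = subspace_expectation(gates, 1000)  # far beyond brute force, still instant
print(small, check, huge)
```

The subspace calculation agrees with the full statevector simulation at 4 qubits, then handles 1,000 qubits at the same negligible cost, which is precisely why such a model offers no quantum advantage.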

“It really seems the math is out to get us,” Cerezo said. “If you want to make an entirely quantum architecture capable of processing information, you have barren plateaus and the curse of dimensionality, and if you force the model to work in a subspace, then the subspace is always sufficiently small that it can be classically simulated. There seems to be no in-between or intermediate space.”

End-to-end simulability for quantum convolutional neural networks

To test their understanding of simulability, the team analyzed widely used variants of quantum convolutional neural networks, an architecture considered by many as one of the most promising models for quantum machine learning.

By identifying the correct subspace where the model acts, the team constructed and trained a purely classical surrogate for quantum convolutional neural networks. The surrogate matched or outperformed standard quantum convolutional neural networks on all benchmark datasets, in simulations with as many as 1,024 qubits. The test suggests that the success of the quantum networks could be attributed to their being benchmarked on simple problems, and the insight the team gleaned indicates the need for non-trivial datasets to move forward with quantum machine learning.

The caveats and the silver lining

The team’s study does not imply that quantum computers cannot operate in large spaces. Indeed, successful quantum algorithms, such as those that simulate quantum systems, avoid the curse of dimensionality by being extremely structured and carefully navigating the large quantum spaces.

“Unlike standard quantum algorithms, where every logical operation has a specific purpose, quantum machine learning algorithms follow the learning methodology of classical neural networks, where one seeks to find the right sequence of logical operations by training the algorithm based on data,” said Laboratory postdoctoral researcher Nahuel Diaz. “This means that, by design, they are unstructured and can get lost in the large spaces.”

The researchers show a path forward to beating barren plateaus without classical simulability by emulating how standard quantum algorithms work. The team’s example of a model that is trainable but not classically simulable may offer inspiration for building new quantum learning algorithms.

Finally, the team highlighted that a quantum computer might well be needed to initialize the classical simulation. In this case, the authors proposed a new hybrid paradigm, where quantum devices are used not to train a model, but to acquire data to build an efficient classical algorithm.

Paper: “Does provable absence of barren plateaus imply classical simulability? Or, why we need to rethink variational quantum computing.” Nature Communications. DOI: 10.1038/s41467-025-63099-6

Funding: The work was supported by Los Alamos’ Laboratory Directed Research and Development Program, the Center for Nonlinear Studies at Los Alamos, and the Laboratory’s ASC Beyond Moore’s Law project.

Paper: “Quantum Convolutional Neural Networks are Effectively Classically Simulable.” PRX Quantum. DOI: 10.1103/8qt9-72ts

Funding: The work was supported by Los Alamos’ Laboratory Directed Research and Development Program and the Laboratory’s ASC Beyond Moore’s Law project.

LA-UR-26-22854

Contact

Media Relations | media_relations@lanl.gov

