Today, the word "quantum" is everywhere: in company names, movie titles, even theaters. But at its core, the concept of a quantum, the tiniest discrete amount of something, was first developed to explain the behavior of the smallest bits of matter and energy.
Over the last century, scientists have developed mathematical descriptions of how these particles and packets of energy interact and used their understanding of "quantum mechanics" to design an array of amazing technologies, from computers and cell phones to telescopes and spacecraft.
New applications, such as powerful quantum computers and quantum communications networks, are just over the horizon. But even before these applications reach the mainstream, scientists are developing quantum code to perform quantum calculations, and using it to track complex quantum systems.
In a recent example, theorists and computational scientists at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory and Stony Brook University (SBU) ran a series of quantum simulations to explore one of the quirkiest features of the quantum realm: entanglement. The study takes quantum back to its roots in seeking to explain the behavior of subatomic particles.
"The essential idea behind entanglement is that two quantum objects -; say, two particles -; can be correlated, or aware of one another, even if they are separated by very large distances," explained Brookhaven Lab/SBU theorist Dmitri Kharzeev, who led the research. Einstein called it "spooky action at a distance." But countless experiments have shown that the spooky effect is real.
To take it one step further, Kharzeev and his colleagues wanted to see if entanglement persists in jets of secondary particles: cascades of particles produced by the fragmentation of supposedly entangled particles emitted in high-energy collisions. They developed simulations to look for correlations between particles in one jet and those in a jet produced back-to-back by the same initial event. Their simulations, described in Physical Review Letters, revealed persistent strong entanglement, at least over short distances.
The results provide a foundation for testing these predictions in nuclear physics experiments at the Relativistic Heavy Ion Collider (RHIC) at Brookhaven Lab, the Large Hadron Collider (LHC) at Europe's CERN laboratory, and the future Electron-Ion Collider (EIC), now in the design stage at Brookhaven. In addition, the method, which used quantum code run on a classical supercomputer, offers insights into ways to retrofit and leverage existing computing assets for running quantum calculations until more practical quantum computers come along.
Detecting Secondary Entanglement
"If you produce a quark and antiquark back-to-back in a high energy collision, you expect these two particles to be entangled because they were produced in same interaction," said study co-author Adrien Florio, a Goldhaber Fellow working with Kharzeev in Brookhaven Lab's Physics Department. "But detecting this entanglement is not easy because we cannot observe quarks directly. Quarks and antiquarks must always be 'confined' -; paired or tripled up to form composite particles called hadrons."
The confinement conundrum means that, as soon as the quark and antiquark emerge from the collision, they immediately start giving up their energy to the surrounding vacuum. That energy generates new quark-antiquark pairs: a cascade, or jet, of bound hadrons for each initial particle.
Traditional models of jet production give probabilistic descriptions of the particles that make up the jets in three dimensions. Looking for one-to-one correlations of a particular particle in one jet with a particle in the other would be enormously challenging.
"Before quantum computing, we did not even know how to address this," Florio said.
But by simulating the particles using qubits, the fundamental units of quantum computing, the scientists could test whether the qubits representing individual points in space and time were entangled. In addition, they used a simpler theoretical framework that reduced the complexity of the jets to just two dimensions: one spatial dimension plus time.
"Since the quark and antiquark are produced at very high energies, they move like bullets in the quantum vacuum along a straight line," Florio said. "We just look for correlations among qubits that represent particles along that straight-line trajectory over time."
Entanglement Entropy
The calculations were designed in collaboration with Kwang Min Yu of Brookhaven Lab's Computational Science Initiative (CSI) to show whether the "entanglement entropy" of a hadron at a particular point in one jet's trajectory was correlated with the entanglement entropy of a hadron at the corresponding point in the opposite jet.
"Entropy is a measure of uncertainty," Kharzeev explained. "When you have a lot of chaos and uncertainty in your life, your life has a high amount of entropy." Pure quantum states, in contrast, have zero entanglement entropy. "In such states, everything is under control. You know exactly what state you are in, so there is no uncertainty," he said.
But if two pure quantum states -; particles or qubits -; are entangled, "if you do something in one, then something is going to happen in the other," he explained. "This means that if I measure only one, I don't possess complete information about it because part of its state is controlled by another quantum state to which I have no access. There will be some uncertainty over its properties and behavior." The entropy value will not be zero.
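The idea can be illustrated with a small calculation (our illustration, not the study's code): a pure, unentangled qubit has zero entanglement entropy, while either half of a maximally entangled Bell pair carries a full bit of uncertainty.

```python
# Entanglement entropy of one half of a two-qubit pure state, via the
# reduced density matrix. A Bell pair gives 1 bit; a product state gives 0.
import numpy as np

def entanglement_entropy(state, n_left, n_right):
    """Von Neumann entropy (in bits) of the left block of a pure two-block state."""
    psi = state.reshape(2 ** n_left, 2 ** n_right)
    rho_left = psi @ psi.conj().T          # partial trace over the right block
    evals = np.linalg.eigvalsh(rho_left)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)
product = np.kron([1, 0], [0, 1]).astype(complex)           # |0>|1>, unentangled

print(entanglement_entropy(bell, 1, 1))     # ~1.0 bit: maximal uncertainty about one half
print(entanglement_entropy(product, 1, 1))  # ~0.0: a pure, unentangled qubit
```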
"It is like you are in a close relationship with someone, and whatever this person does affects you and vice versa. So this means you are not in complete control of what is going on. This is the same thing on the quantum level," Kharzeev said.
To detect these entanglements, the scientists looked for correlations between qubits representing particles at various distances away from the collision point. Kharzeev likened the calculations to throwing dice and measuring the probability that rolling a certain number on one would produce the same number on the other.
"With the particles, you determine whether a particle produced at one point in space corresponds to one at the same point in space on the opposite side of the collision. If they match up once, it could be a coincidence. But if you throw the 'dice' a million times by studying millions of events, and they always show you identical results, then you know that these particles are correlated, or entangled," he said.
The scientists found that the quantum correlations among simulated hadrons exist and are quite strong. "But in our simulations, we see that the correlations die off if the separation between secondary particles is large," he said.
The findings provide a foundation for testing whether entanglement persists, and how it dies off with increasing distance, in experiments at RHIC, the LHC, and the future EIC.
Leveraging Computing Assets
Even though the scientists wrote their simulations using quantum code, they ran the calculations on a classical supercomputer at the National Energy Research Scientific Computing Center (NERSC) at DOE's Lawrence Berkeley National Laboratory.
"For now, you can get very meaningful results for a small number of qubits, simulating their behavior on a classical computer," CSI's Yu explained.
Kharzeev and Yu are working with collaborators at NVIDIA, the company that originally developed the graphics processing units (GPUs) used in today's most powerful supercomputers, to make classical computers even more suitable for running quantum simulations.
"You can rearrange the quantum gates to optimize them for performing quantum simulations," Yu said.
But even these optimized classical computers will eventually top out as the number of qubits needed for simulations grows, as it must, for example, to track the evolution of jets over longer times and greater distances.
Many efforts are underway to improve the performance of quantum computers, particularly to improve error mitigation. Kharzeev is participating in this work as part of the Co-design Center for Quantum Advantage (C2QA), a National Quantum Information Science (QIS) research center led by Brookhaven Lab and funded by DOE.
"Many people are working to solve the challenges of building quantum computers," Kharzeev said. "I'm confident that, in the near future, we will be able to run a wide variety of more complex quantum simulations on these next-generation machines, using the knowledge we've already gained about quantum interactions to further explore the behavior of the quantum particles that make up our world."
This research was funded by the DOE Office of Science. RHIC and NERSC are DOE Office of Science user facilities. Additional co-authors on the published paper include David Frenklakh (SBU), Kazuki Ikeda (SBU, C2QA), Vladimir Korepin (SBU), and Shuzhe Shi (Tsinghua University, SBU).