Scientists from Staffordshire University, the Massachusetts Institute of Technology, and CERN have put a pioneering algorithm into operation at the Large Hadron Collider.
The Large Hadron Collider (LHC), the most powerful particle accelerator ever built, is located at CERN, the European Organization for Nuclear Research, near Geneva, Switzerland. Housed in a tunnel 100 m below ground, it is the site of ongoing experiments that help physicists worldwide understand the nature of the universe.
The project is a part of the Compact Muon Solenoid (CMS) experiment, one of seven operational experiments that employ detectors to examine the particles generated by collisions in the accelerator.
The study was completed in advance of the Large Hadron Collider’s high-luminosity upgrade, and the new academic paper was published in The European Physical Journal C.
The High Luminosity Large Hadron Collider (HL-LHC) project seeks to boost the LHC’s performance to maximize the likelihood of scientific breakthroughs after 2029. With the HL-LHC, the number of simultaneous proton-proton interactions per event will rise from 40 to 200.
Limiting the increase of computing resource consumption at large pileups is a necessary step for the success of the HL-LHC physics program, and we are advocating the use of modern machine learning techniques to perform particle reconstruction as a possible solution to this problem.
Professor Raheel Nawaz, Pro Vice-Chancellor, Digital Transformation, Staffordshire University
He added, “This project has been both a joy and a privilege to work on and is likely to dictate the future direction of research on particle reconstruction by using a more advanced AI-based solution.”
This is the first single-shot reconstruction of about 1000 particles in an unprecedentedly challenging environment with 200 simultaneous interactions per proton-proton collision.
Dr. Jan Kieseler, Experimental Physics Department, CERN
Dr. Kieseler added, “Showing that this novel approach, combining dedicated graph neural network layers (GravNet) and training methods (Object Condensation), can be extended to such challenging tasks while staying within resource constraints represents an important milestone towards future particle reconstruction.”
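For readers unfamiliar with the approach, the short PyTorch sketch below illustrates the general idea behind a GravNet-style graph neural network layer: each detector hit is projected into a learned low-dimensional coordinate space, its nearest neighbours are found there, and their learned features are aggregated with distance-based weights. This is a minimal illustration only; all class names, dimensions, and aggregation details are assumptions for exposition, not the CMS implementation or the Object Condensation training described in the paper.

import torch
import torch.nn as nn


class GravNetLayerSketch(nn.Module):
    """Illustrative GravNet-style layer: learn a coordinate space,
    gather k nearest neighbours there, and aggregate their learned
    features with distance-based weights."""

    def __init__(self, in_dim, coord_dim=4, feat_dim=22, out_dim=48, k=16):
        super().__init__()
        self.k = k
        self.to_coords = nn.Linear(in_dim, coord_dim)   # learned "spatial" coordinates
        self.to_feats = nn.Linear(in_dim, feat_dim)     # features exchanged between hits
        # input features + (mean, max) of aggregated neighbour features -> output
        self.out = nn.Linear(in_dim + 2 * feat_dim, out_dim)

    def forward(self, x):
        # x: (num_hits, in_dim) -- one event's calorimeter hits
        coords = self.to_coords(x)                      # (N, coord_dim)
        feats = self.to_feats(x)                        # (N, feat_dim)

        # pairwise distances in the learned coordinate space
        dist = torch.cdist(coords, coords)              # (N, N)
        knn_dist, knn_idx = dist.topk(self.k + 1, largest=False)
        knn_dist, knn_idx = knn_dist[:, 1:], knn_idx[:, 1:]   # drop self-match

        # gather neighbour features and weight them by a Gaussian of the distance
        neigh = feats[knn_idx]                          # (N, k, feat_dim)
        weights = torch.exp(-10.0 * knn_dist ** 2).unsqueeze(-1)
        weighted = neigh * weights

        aggregated = torch.cat(
            [weighted.mean(dim=1), weighted.max(dim=1).values], dim=-1
        )
        return torch.relu(self.out(torch.cat([x, aggregated], dim=-1)))


# Purely illustrative usage with random "hits":
hits = torch.randn(1000, 10)          # e.g. 1000 hits with 10 raw features each
layer = GravNetLayerSketch(in_dim=10)
out = layer(hits)                     # (1000, 48)

In the published work, several such layers are stacked and trained with the Object Condensation method so that the network reconstructs all particles in an event in a single pass; the sketch above is only meant to convey the neighbour-aggregation idea.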
Shah Rukh Qasim, who leads this study as part of his doctoral work at CERN and Manchester Metropolitan University, said: “The amount of progress we have made on this project in the last three years is truly remarkable. It was hard to imagine we would reach this milestone when we started!”
The vice chancellor and chief executive of Staffordshire University, Professor Martin Jones, added: “CERN is one of the world’s most respected centers for scientific research and I congratulate the researchers on this project, which is effectively paving the way for even greater discoveries in years to come.”
He further added, “Artificial Intelligence is continuously evolving to benefit many different industries and to know that academics at Staffordshire University and elsewhere are contributing to the research behind such advancements is both exciting and significant.”
Journal Reference:
Qasim, S. R., et al. (2022) End-to-end multiple-particle reconstruction in high occupancy imaging calorimeters with graph neural networks. The European Physical Journal C. doi:10.1140/epjc/s10052-022-10665-7.