Decoding high energy physics with AI and machine learning


Decoding the fundamental forces of the universe
A high-energy collision probes the internal structure of subatomic particles, depicted as a neural-network-like web of quantum connections. This graphic highlights how physicists use AI/ML to map the quark-gluon structure inside particles and search for new physics beyond the standard model. Credit: Brandon Kriesten/Argonne National Laboratory.

In the world of particle physics, where scientists unravel the mysteries of the universe, artificial intelligence (AI) and machine learning (ML) are transforming how researchers understand the most fundamental particles. Central to this exploration are parton distribution functions (PDFs). These complex mathematical models are crucial for predicting outcomes of high energy physics experiments that test the Standard Model of particle physics.

PDFs are mathematical models that help scientists understand the inner workings of protons, which are particles found in the nucleus of an atom. Protons are made up of even smaller particles called quarks and gluons, collectively known as partons. PDFs describe how these partons are distributed within a proton, essentially providing a map of where these tiny particles are likely to be found and what fraction of the proton's momentum they carry.
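The idea of a PDF as a momentum map can be illustrated with a toy example. The parametrization below, f(x) = x^a (1 - x)^b, is a common textbook-style stand-in (the specific exponents here are illustrative, not from the Argonne work); integrating x·f(x) over the momentum fraction x gives the share of the proton's momentum carried by that parton species.

```python
import numpy as np

def toy_pdf(x, a=-0.5, b=3.0):
    """Toy valence-like parton distribution, f(x) = x^a * (1 - x)^b.
    x is the fraction of the proton's momentum carried by the parton."""
    return x**a * (1.0 - x)**b

def integrate(y, x):
    """Trapezoidal rule, written out explicitly for portability."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

x = np.linspace(1e-4, 1.0 - 1e-4, 20_000)
f = toy_pdf(x)

# Fraction of the proton's momentum carried by this toy parton species:
momentum_fraction = integrate(x * f, x)  # integral of x * f(x) over [0, 1]
```

For these illustrative exponents the momentum fraction comes out to roughly 0.10, i.e. this toy parton species carries about a tenth of the proton's momentum.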

This information helps scientists predict the outcomes of high energy physics experiments, such as those conducted at the Large Hadron Collider, where protons are smashed together to explore fundamental forces and particles.

Modeling these functions is difficult due to their complexity and the limited availability of experimental data. However, AI and ML offer new ways to analyze and understand these complex functions by processing large sets of data gathered by collider facilities.

At the U.S. Department of Energy’s (DOE) Argonne National Laboratory, theoretical physicists Tim Hobbs and Brandon Kriesten are pioneering the use of AI/ML to tackle the challenges of modeling PDFs, improving both the accuracy of the PDFs derived from data and the interpretability of the ML models used to do so. This means that scientists can more easily identify patterns, relationships and underlying principles within PDFs and the techniques used to extract them, leading to more informed and reliable conclusions.

“Particle physics deals with elementary or fundamental particles,” Hobbs explained. “The current focus is on finding cracks in the Standard Model, which was completed in the 1970s. Despite its strength, we know it’s incomplete due to hints like dark matter from cosmology.”

PDFdecoder: Bridging theory and data

A recent study by Hobbs and Kriesten, published in Physical Review D, introduced the “PDFdecoder” framework. It uses encoder-decoder models, which are types of neural network architectures. These models simplify complex data into a more manageable form and then reconstruct the original data from this simplified version.
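The encoder-decoder idea can be sketched with the simplest possible instance: a linear encoder-decoder built from a singular value decomposition. This is a toy stand-in, not the PDFdecoder architecture itself (which uses trained neural networks), and the PDF-like curves below are synthetic; but it shows the same pattern of compressing each curve to a few latent numbers and reconstructing it.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.01, 0.99, 50)

# Synthetic data set: 200 toy PDF-like curves x^a * (1-x)^b with varied shapes.
a = rng.uniform(-0.5, 0.5, size=(200, 1))
b = rng.uniform(2.0, 5.0, size=(200, 1))
X = x**a * (1.0 - x)**b          # shape (200, 50)

# Optimal *linear* encoder-decoder via SVD: the encoder compresses each
# 50-point curve to k latent variables; the decoder reconstructs the curve.
mean = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
k = 4
encode = lambda curves: (curves - mean) @ Vt[:k].T   # 50 -> k
decode = lambda latent: latent @ Vt[:k] + mean       # k -> 50

X_rec = decode(encode(X))
rel_err = np.linalg.norm(X - X_rec) / np.linalg.norm(X)
```

Because the curve family here is smooth and governed by only two shape parameters, four latent variables reconstruct it to within a few percent; a neural encoder-decoder generalizes this compress-then-reconstruct pattern to nonlinear mappings.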

The reconstruction of PDFs is crucial because it allows scientists to predict the behavior of particles in high energy physics experiments. The key properties of a PDF are captured through “Mellin moments,” which are mathematical expressions that summarize the distribution of these particles.
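Concretely, the n-th Mellin moment of a distribution f(x) is the integral of x^(n-1) f(x) over the momentum fraction x. The sketch below computes the first few moments of an illustrative toy distribution (the exponents are assumptions for the example, not values from the paper); note that the n = 2 moment is exactly the momentum fraction.

```python
import numpy as np

def mellin_moment(n, a=-0.5, b=3.0, npts=20_000):
    """n-th Mellin moment M(n) = integral of x^(n-1) * f(x) over [0, 1],
    for the toy distribution f(x) = x^a * (1 - x)^b."""
    x = np.linspace(1e-4, 1.0 - 1e-4, npts)
    y = x**(n - 1) * x**a * (1.0 - x)**b
    # Trapezoidal rule, written out explicitly for portability:
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# A handful of moments summarizes the full x-dependence of the distribution;
# n = 2 gives the momentum fraction carried by the parton.
moments = [mellin_moment(n) for n in (1, 2, 3)]
```

Reconstructing f(x) from a finite set of such moments is an inverse problem with many candidate solutions, which is the gap the PDFdecoder framework's generative models are designed to fill.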

“The model uses generative AI to fill in gaps and recreate initial conditions,” said Kriesten. In this context, “initial conditions” refer to the starting parameters or configurations needed to accurately model the distribution of quarks and gluons within protons.

“We looked at how PDFs can be decoded from Mellin moments and explored different possible solutions,” he added.

This approach enhances the accuracy of predictions in particle physics by ensuring that reconstructed PDFs closely match real-world data. It can make PDF models more precise, particularly in lattice gauge calculations—a computational technique that delves into the complexities of quantum chromodynamics, the theory describing the strong force binding quarks and gluons.

By incorporating Mellin moment data, the PDFdecoder framework provides a new way to integrate lattice information into PDF studies, strengthening the connection between theoretical models and experimental findings.

Decoding the fundamental forces of the universe
High energy physics experiments seek to reveal signs of unknown particles or interactions beyond the Standard Model (BSM). However, persistent uncertainties in theoretical predictions, experimental data and parton distribution functions can obscure these findings. AI/ML methods offer solutions to untangle these complexities in particle physics. Credit: Tim Hobbs/Argonne National Laboratory.

Unraveling AI’s decision-making in theoretical models

In another study, published in the Journal of High Energy Physics, Hobbs and Kriesten unveiled another framework called “XAI4PDF.” This framework uses explainable AI techniques, which are methods designed to make AI models’ decision-making processes more transparent and understandable.

The XAI4PDF framework uses ResNet architectures—a type of neural network built around shortcut, or "skip," connections that improve training efficiency. These connections let information bypass certain layers, making it easier to train deep networks without losing important information.
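The essence of a residual (ResNet) block is that the input is added back to the output of a few layers, so the block only has to learn a correction to the identity. The sketch below is a minimal illustration with made-up weights, not the XAI4PDF network itself.

```python
import numpy as np

def relu(z):
    """Standard rectified-linear activation."""
    return np.maximum(0.0, z)

def residual_block(x, W1, W2):
    """y = x + F(x): the skip connection adds the input straight through,
    so the stacked layers only learn the residual correction F."""
    return x + W2 @ relu(W1 @ x)

rng = np.random.default_rng(1)
d = 16
x = rng.normal(size=d)
W1 = 0.1 * rng.normal(size=(d, d))
W2 = 0.1 * rng.normal(size=(d, d))
y = residual_block(x, W1, W2)
```

With all weights set to zero the block reduces exactly to the identity map, which is what makes very deep stacks of such blocks trainable: each layer starts from "change nothing" and learns only the needed adjustment.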

This framework classifies PDFs based on their underlying theoretical assumptions. It not only determines which theoretical model best fits a given set of PDFs but also traces how specific assumptions affect the behavior of these PDFs. This provides valuable insights into the factors that guide AI decisions, helping researchers understand the influence of different theoretical parameters.

By adapting techniques originally developed for image recognition, the researchers have created a powerful tool for analyzing complex theoretical models in particle physics.

“We repurposed tools from computer vision,” Kriesten explained. “This helps us understand how different theoretical assumptions change the features of PDFs.”

Together, these frameworks represent a significant step forward in the application of AI/ML in particle theory.

Transforming the future of high energy physics

“Our work focuses on using AI/ML to unravel complex problems in particle physics,” Hobbs said. “By enhancing the understanding and accuracy of PDFs, we’re paving the way for more precise predictions in high energy physics experiments.”

As AI continues to advance, its role in particle physics is expected to deepen, potentially revealing more of the universe’s secrets. Hobbs and Kriesten are optimistic about AI/ML’s transformative potential in theoretical physics. They plan to expand their frameworks to encompass a wider range of particle interactions and explore foundation models to fully capture the complexity of particle physics.

By pushing the boundaries of AI/ML applications, they are not only advancing high energy physics but also setting the stage for future discoveries that could redefine our understanding of the universe.

Jonathon Gomprecht of the University of Arizona also contributed to this work.

More information:
Brandon Kriesten et al, Learning PDFs through interpretable latent representations in Mellin space, Physical Review D (2025). DOI: 10.1103/PhysRevD.111.014028

Brandon Kriesten et al, Explainable AI classification for parton density theory, Journal of High Energy Physics (2024). DOI: 10.1007/JHEP11(2024)007

Citation:
Decoding high energy physics with AI and machine learning (2025, June 13)
retrieved 13 June 2025
from https://phys.org/news/2025-06-decoding-high-energy-physics-ai.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.




