AI big data generated by particle accelerators

Big data generated by particle accelerators: AI is the necessary and sufficient condition for its analysis

DOE Announces $5.7 Million for Six Artificial Intelligence Projects in Nuclear Physics Research
Linking Particle Physics and Artificial Intelligence enables surprising discoveries without humans
“Can we see the moment of the birth of the universe?” It may seem impossible at first glance, but in the world of particle physics it is possible, thanks to the Large Hadron Collider (LHC), which can create conditions similar to those of the Big Bang, the moment it all began.

The LHC, built by the European Organization for Nuclear Research (CERN), can reproduce conditions from just after the Big Bang by colliding particles traveling at nearly the speed of light.

In general, a particle accelerator is a device that accelerates charged particles such as electrons or protons with a strong electric field. By colliding these high-speed particles with each other and analyzing the particles produced in the collisions, researchers study the structure of the universe and the forces that govern it.

However, there is one problem: processing the huge volume of data from LHC experiments. According to researchers, a year of collisions generates close to a million petabytes of raw data.

That is enough to fill about one billion moderately sized hard drives; the machine produces roughly one million gigabytes of data every second.

Particle physicists introduced machine learning techniques early on to process all this data. For example, to cope with the flood of data from LHC experiments, researchers apply an algorithm called a 'trigger', which decides in real time which data to keep for analysis and which to discard.

These machine learning algorithms have proven very successful in particle physics analysis, making at least 70 percent of the keep-or-discard decisions.
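The trigger idea described above can be sketched in a few lines. This is a toy illustration, not CERN's actual trigger system: the event features (`total_energy_gev`, `n_high_pt_tracks`) and the cut values are invented for the example, and real triggers run in specialized hardware and software over many more detector quantities.

```python
import random

random.seed(0)

# Toy events: each collision summarized by two illustrative features
# (real triggers use many detector-level quantities; these are made up).
def make_event():
    return {
        "total_energy_gev": random.expovariate(1 / 50.0),  # steeply falling spectrum
        "n_high_pt_tracks": random.randint(0, 5),
    }

def trigger_keep(event, energy_cut=100.0, track_cut=2):
    """Real-time keep/discard decision: retain only events that look
    interesting enough to be worth storing for offline analysis."""
    return (event["total_energy_gev"] > energy_cut
            or event["n_high_pt_tracks"] >= track_cut)

events = [make_event() for _ in range(10_000)]
kept = [e for e in events if trigger_keep(e)]
print(f"kept {len(kept)} of {len(events)} events "
      f"({100 * len(kept) / len(events):.1f}%)")
```

The essential point is that the decision must be cheap enough to run on every event as it arrives, since discarded data is gone for good.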

$5.7 million for AI projects

The U.S. Department of Energy has announced $5.7 million in grants for artificial intelligence (AI) research on nuclear physics accelerators and detectors.

According to the US news service Newswise on the 2nd, the Department of Energy announced $5.7 million in grants to six projects that will implement artificial intelligence methods to accelerate scientific discovery in nuclear physics research.

The six projects are said to aim to optimize the overall performance of complex accelerator and detector systems using advanced computational methods.

"Artificial intelligence has the potential to shorten the timetable for experimental discovery in nuclear physics," said Timothy Holman, deputy director of the Nuclear Physics Institute.

He added that particle accelerator facilities and nuclear physics instruments currently face technical challenges in simulation and control, data acquisition, and analysis that artificial intelligence can help solve.

The six projects, carried out by nuclear physics researchers from five national laboratories and four universities, include the development of deep learning algorithms to identify the unique signal of a hypothesized and extremely slow nuclear process known as neutrinoless double-beta decay.

If observed, this decay, which would be at least 10,000 times rarer than the rarest known nuclear decay, could help explain why the universe came to be dominated by matter rather than antimatter, the researchers said.

In addition, the grants support AI-based detector design for the Electron-Ion Collider project under construction at Brookhaven National Laboratory.

The projects, supported by the U.S. Department of Energy's (DOE) Nuclear Physics program, were selected through competitive peer review. Total planned funding is $5.7 million, with $3.2 million in fiscal year 2021 and outyear funding contingent on congressional appropriations.

AI also analyzes blurry images

On the 13th of last month, Hussain Kanchwala, an electronics engineer and analyst from the University of Mumbai, India, published a column asking how artificial intelligence is helping physicists who work on particle accelerators.

According to Kanchwala, the world's largest particle accelerator, the Large Hadron Collider (LHC) at CERN, generates more than one million gigabytes of data every second.

This huge amount of data, he argued, is too much for ordinary researchers to store and study.

So how can physicists possibly cope with this flood of accelerator data? The answer, Kanchwala said, is artificial intelligence.

Physicists working on particle accelerators, such as the Hadron Collider (LHC), have introduced artificial intelligence (AI) to address the flood of LHC data.

A machine running an artificial intelligence program teaches itself activities such as speech recognition, planning, problem solving, and perception, so it can operate efficiently without getting lost in the labyrinth of data.

If the partnership between particle physicists and AI researchers goes well, he claimed, the next generation of particle collision experiments will involve some of the world's most intelligent thinking machines, capable of making surprising discoveries with little or no human input.

The relationship between particle physics and artificial intelligence is nothing new: ATLAS and CMS, the two LHC experiments that paved the way for the discovery of the Higgs boson a few years ago, are examples.

The Higgs boson is a particle that can be observed only in large particle accelerators, because it has a very large mass and a short decay time compared with other particles.

On July 4, 2012, CERN announced that the ATLAS and CMS teams at the Large Hadron Collider had found a Higgs-boson-like particle with a confidence of 4.9σ.

Their success, Kanchwala argued, was due to machine learning techniques that train algorithms to recognize patterns in data and draw meaningful conclusions from those patterns.

Using simulations of the debris from particle collisions, the AI algorithms are said to have learned to accurately pick out the pattern of rare Higgs decays from among thousands of insignificant events.
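The approach described here, training a classifier on simulated signal and background events, can be sketched minimally. This is a toy stand-in for the far larger models used on real LHC data: the single "invariant-mass-like" feature, the Gaussian distributions, and all the numbers are invented for illustration.

```python
import math
import random

random.seed(1)

# Simulated labelled events: "signal" (Higgs-like, narrow peak) vs
# "background" (broad). The feature stands in for something like a
# reconstructed invariant mass; the values are invented.
def simulate(n, mean, spread, label):
    return [(random.gauss(mean, spread), label) for _ in range(n)]

data = simulate(500, 125.0, 3.0, 1) + simulate(500, 110.0, 10.0, 0)

# Minimal logistic-regression classifier trained by gradient descent.
w, b = 0.0, 0.0
lr = 0.01
for _ in range(2000):
    gw = gb = 0.0
    for x, y in data:
        p = 1 / (1 + math.exp(-(w * (x - 120) + b)))  # centred feature
        gw += (p - y) * (x - 120)
        gb += (p - y)
    w -= lr * gw / len(data)
    b -= lr * gb / len(data)

def predict(x):
    """Classify an event as signal (True) or background (False)."""
    return 1 / (1 + math.exp(-(w * (x - 120) + b))) > 0.5

correct = sum(predict(x) == y for x, y in data)
print(f"training accuracy: {correct / len(data):.2f}")
```

Because the simulation provides labels for free, the classifier can be trained before it ever sees real detector data, which is exactly why simulated collisions are so valuable in this workflow.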

Kanchwala predicts that recent advances in artificial intelligence will further expand its application to particle accelerators: as the algorithms are fine-tuned day by day, they can take on more problems in particle physics.

Many of the new tasks AI programs are used for fall under computer vision, which deals with the automatic extraction, analysis, and detection of relevant information from a single image or a series of images.

It is similar to the face recognition found in most high-end camera phones these days, except that image features in particle physics are much more abstract than simple facial features such as eyes, ears, and nose.

However, particle accelerator experiments first require reconstructing images from a heterogeneous data pool generated by millions of sensor elements.

"Even if the data doesn't look like an image, if physicists can process it the right way, they can still use computer vision programs," said machine learning researcher Alexander Radovich.

According to Radovic, one area where this approach could have great results is the analysis of the particle jets produced in abundance in accelerator experiments such as the LHC. Particle jets are narrow sprays of particles whose individual tracks are very difficult to disentangle, and computer vision algorithms can help identify a jet's characteristics.
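The first step in treating jets as a vision problem is turning detector hits into a "jet image". Here is a minimal sketch of that idea under invented assumptions: the hit list (eta, phi, energy), the 8×8 grid, and the "core fraction" feature are all illustrative, not part of any experiment's actual pipeline.

```python
import random

random.seed(2)

GRID = 8  # pixels per side of the toy jet image

# Invented detector hits: (eta, phi, energy) for a narrow spray of
# particles clustered around the jet axis at (0, 0).
hits = [(random.gauss(0.0, 0.15), random.gauss(0.0, 0.15),
         random.expovariate(1 / 5.0))
        for _ in range(200)]

# Pixelate the hits: map the (-0.5, 0.5) window around the jet axis
# onto grid cells and accumulate energy per pixel.
image = [[0.0] * GRID for _ in range(GRID)]
for eta, phi, energy in hits:
    i = min(GRID - 1, max(0, int((eta + 0.5) * GRID)))
    j = min(GRID - 1, max(0, int((phi + 0.5) * GRID)))
    image[i][j] += energy

total = sum(sum(row) for row in image)
peak = max(max(row) for row in image)
# "Core fraction": how concentrated the jet's energy is in its hottest
# pixel -- a crude, hand-built stand-in for the features a computer
# vision model would learn automatically from such images.
print(f"core fraction: {peak / total:.2f}")
```

Once hits are in this grid form, the data really does look like a (low-resolution) image, which is what lets image-classification techniques be carried over.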

Today, particle physicists are using artificial intelligence to answer the biggest questions about particle physics, primarily from large data pools generated from particle acceleration experiments.

In the next decade, Kanchwala argued, AI algorithms could independently ask questions and inform researchers when they make groundbreaking new discoveries in physics.

Using AI for collision simulation analysis

On September 20 this year, the website of the Weizmann Institute of Science in Israel reported that scientists at the institute were using artificial intelligence to unravel the mysteries of colliding particles.

In the article, PhD student Jonathan Shlomi describes the experiments as "similar to examining the wreckage of a plane crash to reconstruct what color pants the passenger in seat A17 was wearing."

Between 2011 and 2013, Gross, a researcher at the Weizmann Institute, led a team searching for the Higgs particle with the ATLAS detector. The Higgs particle, dubbed the 'God particle', was a decades-old physics puzzle.

The puzzle: how does a particle gain mass? Predicted by Peter Higgs in the 1960s, the particle remained theoretical until 2012, when it was finally discovered and the mystery solved.

Particle physics is, at its heart, the discovery of unknown particles. After the 'God particle' was finally found, CERN focused on testing other theoretical models, such as supersymmetry.

However, as these efforts reached a dead end, Gross realized it was time to take a new path: refining and fine-tuning current data analysis methods, both for extracting more from existing data and for searching for new particles in current and future accelerators.

To achieve this goal, a new research group using machine learning approaches was established at the Weizmann Institute.

According to the researchers, although current techniques have reached their sensitivity limits, high-resolution simulations of particle collisions provide a new framework for answering the most fundamental questions in physics.

As particles collide inside the ATLAS detector, the instrument records energy measurements that scientists must decipher. Combined with the fact that over a billion collisions occur every few seconds while the accelerator is running, this setup raises two issues.

On the one hand, these enormous amounts of data cannot be analyzed manually; on the other, because the events are microscopic and evolve rapidly, the detector cannot collect every result with the same level of precision.

Deep learning techniques, however, can be trained on collision simulations, allowing the computer-generated data to be analyzed as accurately as data recorded by the most sensitive detectors.