Our research covers the development of new algorithms for physics research based on machine learning and artificial intelligence as well as their application to searches for new physical phenomena in experimental particle physics at the LHC.
See below for more details or look at the publications and recent talks. If you are a student at Hamburg University, contact me (gregor.kasieczka"AT"uni-hamburg.de) for possible BSc and MSc projects in these areas. Open PhD/postdoc positions are listed here.
While there are many experimental and theoretical results pointing towards physical phenomena beyond the Standard Model, no signs of such new physics have been found so far. Unsupervised machine learning methods such as autoencoders (arxiv:1808.08979) or density estimation (arxiv:2109.00546) allow searching for new physics relatively independently of specific model assumptions (LHC Olympics). Our group works on the development of new anomaly detection methods and their application to data collected by the CMS experiment.
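As a toy illustration of the autoencoder idea (a minimal sketch, not the group's actual method or architecture): a model is fitted to reconstruct background-like events, and the reconstruction error serves as an anomaly score. Here a simple linear autoencoder (equivalent to PCA) stands in for a neural network; all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Background" training events: 4 correlated features that live close
# to a 2-dimensional subspace, plus a little noise.
latent = rng.normal(size=(1000, 2))
mixing = np.array([[1.0, 0.5, -0.3, 0.2],
                   [0.2, -0.4, 0.8, 0.6]])
background = latent @ mixing + 0.05 * rng.normal(size=(1000, 4))

# Fit a linear autoencoder via SVD: keep the top-2 principal directions.
mean = background.mean(axis=0)
_, _, vt = np.linalg.svd(background - mean, full_matrices=False)
encoder = vt[:2].T   # project 4 features down to a 2D latent space
decoder = vt[:2]     # map the latent space back to 4 features

def anomaly_score(events):
    """Mean squared reconstruction error per event."""
    recon = (events - mean) @ encoder @ decoder + mean
    return ((events - recon) ** 2).mean(axis=1)

# An event off the background manifold reconstructs poorly and scores high.
signal = np.array([[3.0, -3.0, 3.0, -3.0]])
print(anomaly_score(background).mean() < anomaly_score(signal)[0])  # True
```

Events similar to the training data reconstruct well (small score), while anomalous events do not; cutting on the score selects model-independently anomalous events.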
Another promising direction for the discovery of new physics is exotic long-lived decays. Here, new particles fly undetected over macroscopic distances before decaying inside the CMS detector. These atypical decays offer a clear signature but require dedicated reconstruction techniques, as standard algorithms are often insensitive to long-lived decays or discard them as noise. We search for long-lived decays over a wide range of possible lifetimes with the CMS experiment (analysis summary).
The simulation of particle physics data using traditional techniques is computationally very expensive. Generative models such as Generative Adversarial Networks (GANs), Variational Autoencoders (VAEs), and autoregressive flows offer a potential way to increase the usable statistics and reduce the computing cost. We focus on simulating the interactions of particles with highly granular calorimeter detectors such as the ILD and the CMS HGCal (arxiv:2005.05334). We also study the quality and amplification properties of generative models (arxiv:2008.06545).
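The amplification idea can be sketched in a toy setup (an illustrative assumption, not the actual study in arxiv:2008.06545): a generative model, here a simple Gaussian fit standing in for a GAN/VAE/flow, is trained on a small number of expensive simulated events and then used to generate many more at negligible cost.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "shower energies" (GeV) from an expensive full simulation;
# only 100 training events are affordable.
true_mean, true_std = 50.0, 5.0
training = rng.normal(true_mean, true_std, size=100)

# "Train" the generative model: a maximum-likelihood Gaussian fit
# (stand-in for training a GAN, VAE, or autoregressive flow).
fit_mean, fit_std = training.mean(), training.std()

# Cheaply sample 100x more events than were used for training.
generated = rng.normal(fit_mean, fit_std, size=10_000)
```

The interesting question, studied in the paper, is how much statistical information such amplified samples actually carry beyond the original training set, since the generated events inherit the fit's imperfections.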
With 40 million collisions per second, the data rate at the Large Hadron Collider (LHC) far exceeds our ability to store data. Therefore a complex multi-stage trigger system selects which events to store for further offline analysis. Decisions at the first stage of this trigger need to be made within microseconds and therefore use dedicated hardware such as field-programmable gate arrays (FPGAs). Using complex machine learning algorithms for these decisions improves the selection quality, but conforming with the tight timing and other constraints is technically challenging. We work on developing jet signature and anomaly triggers for upcoming data-taking runs of the CMS experiment.
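One of the technical constraints can be illustrated with a small sketch (an assumption for illustration, not the actual CMS trigger firmware): on an FPGA, network weights and activations are typically quantized to fixed-point integers so that each multiply-accumulate fits into fast integer arithmetic. The example emulates an 8-bit fixed-point matrix-vector product and compares it to the floating-point result.

```python
import numpy as np

SCALE = 32  # fixed-point scale factor: value = integer / SCALE

def quantize(x, bits=8, scale=SCALE):
    """Round to signed fixed point with `bits` total bits."""
    q = np.clip(np.round(x * scale), -2**(bits - 1), 2**(bits - 1) - 1)
    return q.astype(np.int32)

# A tiny layer: weights and an input vector (toy values).
w = np.array([[0.5, -0.25], [0.125, 0.75]])
x = np.array([1.0, -0.5])

# Integer multiply-accumulate, as FPGA logic would compute it, then
# undo both scale factors to recover the approximate real-valued output.
qy = quantize(w) @ quantize(x)
y = qy / (SCALE * SCALE)

print(np.allclose(y, w @ x, atol=0.05))  # True
```

Choosing the bit widths is the trade-off at the heart of such designs: fewer bits mean less FPGA area and lower latency, at the cost of quantization error in the selection.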
Automated machine learning for physics data:
Data in fundamental physics is often collected by complex sensors arranged in irregular geometries. This makes standard machine learning architectures impractical and often leads to custom architectures for physics data. To simplify the adoption of advanced algorithms in physics research, we curate a collection of datasets and algorithms with the goal of providing automated machine learning for physics data (github).
Robustness and uncertainties:
The precise quantification of uncertainties is a crucial pillar of scientific data analysis. Similarly, decision algorithms need to be robust against systematic shifts in data or similar effects. We investigate methods to quantify the uncertainty of machine learning algorithms (arxiv:2003.11099) and to decorrelate their output against arbitrary features (arxiv:2001.05310).
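A minimal sketch of one common way to attach uncertainties to a learned quantity (an ensemble/bootstrap approach chosen for illustration; the papers above study Bayesian networks and decorrelation methods in detail): refit a simple model on bootstrap resamples of the data and take the spread of the fitted parameter as its uncertainty.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: a line through the origin with slope 2, plus noise.
x = np.linspace(0, 1, 50)
y = 2.0 * x + rng.normal(0, 0.1, size=50)

# Bootstrap ensemble: refit the slope on 100 resampled datasets.
slopes = []
for _ in range(100):
    idx = rng.integers(0, 50, size=50)                    # resample with replacement
    slopes.append((x[idx] @ y[idx]) / (x[idx] @ x[idx]))  # least-squares slope

mean_slope = np.mean(slopes)   # central value
slope_unc = np.std(slopes)     # spread of the ensemble = uncertainty estimate
```

The same logic carries over to neural networks: training an ensemble and reading off the spread of predictions gives a (frequentist) uncertainty, which can then be compared against Bayesian approaches.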