Improving parametric neural networks for high-energy physics (and beyond)

Anzalone, Luca and Diotalevi, Tommaso and Bonacorsi, Daniele (2022) Improving parametric neural networks for high-energy physics (and beyond). Machine Learning: Science and Technology, 3 (3). 035017. ISSN 2632-2153

Anzalone_2022_Mach._Learn.__Sci._Technol._3_035017.pdf - Published Version

Abstract

Signal-background classification is a central problem in high-energy physics, which plays a major role in the discovery of new fundamental particles. A recent method—the parametric neural network (pNN)—leverages multiple signal mass hypotheses as an additional input feature to effectively replace a whole set of individual classifiers, each providing (in principle) the best response for the corresponding mass hypothesis. In this work we aim to deepen the understanding of pNNs in light of real-world usage. We discovered several peculiarities of parametric networks, and provide intuition, metrics, and guidelines for them. We further propose an alternative parametrization scheme, resulting in a new parametrized neural network architecture—the AffinePNN—along with other generally applicable improvements, such as a balanced training procedure. Finally, we extensively and empirically evaluate our models on the HEPMASS dataset, along with its imbalanced version (called HEPMASS-IMB), which we provide here for the first time, to further validate our approach. Results are reported in terms of the impact of the proposed design decisions, classification performance, and interpolation capability.

Item Type: Article
Subjects: Scholar Eprints > Social Sciences and Humanities
Depositing User: Managing Editor
Date Deposited: 07 Jul 2023 03:32
Last Modified: 03 Jun 2024 12:28
URI: http://repository.stmscientificarchives.com/id/eprint/2239
