First IACT Waveform Analysis Based on Deep Convolutional Neural Networks Using CTLearn
T. Miener, L. Burmistrov, B. Lacave, A. Cerviño
Published: 2025-09-18
Abstract
Imaging atmospheric Cherenkov telescopes (IACTs) detect extended air showers (EASs) generated when very-high-energy (VHE) gamma rays or cosmic rays interact with the Earth's atmosphere. Cherenkov photons produced during an EAS are captured by fast-imaging cameras, which record both the spatial and temporal development of the shower, along with calorimetric data. By analyzing these recordings, the properties of the original VHE particle, such as its type, energy, and direction of arrival, can be reconstructed through machine learning techniques. This contribution focuses on the Large-Sized Telescopes (LSTs) of the Cherenkov Telescope Array Observatory, a next-generation ground-based gamma-ray observatory. LSTs are responsible for reconstructing lower-energy gamma rays in the tens of GeV range. We explore a novel event reconstruction technique based on deep convolutional neural networks (CNNs) applied to calibrated and cleaned waveforms of the IACT camera pixels using CTLearn. Our approach explicitly incorporates the time development of the shower, enabling a more accurate reconstruction of the event. This method eliminates the need for charge integration or handcrafted feature extraction, allowing the model to directly learn from waveform data.
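To make the distinction concrete, the sketch below contrasts the classical charge-integration input with the full waveform tensor that a CNN would consume directly. This is a minimal illustration, not the actual CTLearn pipeline; the pixel and sample counts are assumed placeholder values, and the data are random stand-ins for calibrated, cleaned waveforms.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical camera geometry and readout depth (assumed values,
# not taken from the paper).
num_pixels = 1855      # pixels in an LST-like camera
num_samples = 40       # time samples recorded per pixel

# Stand-in for calibrated, cleaned waveforms: one time trace per pixel.
waveforms = rng.normal(0.0, 1.0, size=(num_pixels, num_samples))

# Classical approach: integrate each trace into a single charge value,
# discarding the temporal development of the shower.
charges = waveforms.sum(axis=1)            # shape: (num_pixels,)

# Waveform-based approach: keep the full (pixel, time) tensor so a CNN
# can learn spatio-temporal features end to end, e.g. with the time
# samples treated as input channels.
cnn_input = waveforms[np.newaxis, :, :]    # shape: (1, num_pixels, num_samples)

print(charges.shape)
print(cnn_input.shape)
```

The key point is that the second representation preserves the per-pixel time structure the abstract refers to, whereas the first collapses it before the model ever sees it.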