
Released

Journal Article

Deep learning with photosensor timing information as a background rejection method for the Cherenkov Telescope Array

MPS-Authors
/persons/resource/persons271570

Watson, J.
Division Prof. Dr. James A. Hinton, MPI for Nuclear Physics, Max Planck Society;

Citation

Spencer, S., Armstrong, T., Watson, J., Mangano, S., Renier, Y., & Cotter, G. (2021). Deep learning with photosensor timing information as a background rejection method for the Cherenkov Telescope Array. Astroparticle Physics, 129: 102579. doi:10.1016/j.astropartphys.2021.102579.


Cite as: https://hdl.handle.net/21.11116/0000-000A-375C-2
Abstract
New deep learning techniques present promising new analysis methods for Imaging Atmospheric Cherenkov Telescopes (IACTs) such as the upcoming Cherenkov Telescope Array (CTA). In particular, the use of Convolutional Neural Networks (CNNs) could provide a direct event classification method that uses all of the information contained within the Cherenkov shower image, bypassing the need to Hillas parameterise the image and allowing fast processing of the data.
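
A rough illustration (not taken from the article) of what such a direct CNN classifier could look like, assuming the camera image has been resampled onto a square pixel grid; the input shape and layer sizes below are purely illustrative assumptions:

# Minimal sketch of a direct CNN gamma/background classifier acting on a
# Cherenkov camera image; shapes, layer sizes and the square-grid
# resampling are assumptions, not the authors' implementation.
from tensorflow.keras import layers, models

def build_classifier(image_shape=(56, 56, 1)):
    """Binary classifier: integrated-charge image in, P(gamma) out."""
    model = models.Sequential([
        layers.Input(shape=image_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # gamma vs. proton/electron background
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model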
Existing work in this field has utilised images of the integrated charge from IACT camera photomultipliers; however, the majority of current- and upcoming-generation IACT cameras have the capacity to read out the entire photosensor waveform following a trigger. As the arrival times of Cherenkov photons from Extensive Air Showers (EAS) at the camera plane depend upon the altitude of their emission and the impact distance from the telescope, these waveforms contain information potentially useful for IACT event classification.
In this test-of-concept simulation study, we investigate the potential for using these camera pixel waveforms with new deep learning techniques as a background rejection method, against both proton- and electron-induced EAS. We find that one way of utilising their information is to create a set of seven additional 2-dimensional pixel maps of waveform parameters, to be fed into the machine learning algorithm along with the integrated charge image. Whilst we ultimately find that the only classification power against electrons is based upon event direction, methods based upon timing information appear to outperform similar charge-based methods for gamma/hadron separation. We also review existing methods of event classification using a combination of deep learning and timing information in other astroparticle physics experiments.
(c) 2021 Elsevier B.V. All rights reserved.
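
The seven waveform parameters are not listed in the abstract, so the sketch below uses generic per-pixel pulse statistics (peak amplitude, time of maximum, charge-weighted mean time, pulse width) purely to illustrate the idea of stacking waveform-derived 2-dimensional pixel maps with the integrated charge image as additional CNN input channels; the chosen quantities are assumptions, not the paper's parameter set.

# Illustrative only: build 2-D pixel maps of per-pixel waveform parameters
# and stack them with the integrated-charge image as CNN input channels.
# The parameters computed here are stand-ins for the paper's seven maps.
import numpy as np

def waveform_parameter_maps(waveforms, charge_image):
    # waveforms    : (n_rows, n_cols, n_samples) per-pixel traces,
    #                assumed resampled onto a square pixel grid
    # charge_image : (n_rows, n_cols) integrated-charge image
    # returns      : (n_rows, n_cols, n_channels) image stack for a CNN
    samples = np.arange(waveforms.shape[-1], dtype=float)

    peak_amp = waveforms.max(axis=-1)                    # pulse height
    peak_time = waveforms.argmax(axis=-1).astype(float)  # sample index of maximum

    # Charge-weighted mean arrival time, guarding against empty pixels.
    total = waveforms.sum(axis=-1)
    safe_total = np.where(total > 0, total, 1.0)
    mean_time = (waveforms * samples).sum(axis=-1) / safe_total

    # Crude pulse width: number of samples above half the peak amplitude.
    width = (waveforms > 0.5 * peak_amp[..., None]).sum(axis=-1).astype(float)

    return np.stack([charge_image, peak_amp, peak_time, mean_time, width],
                    axis=-1)

The resulting stack could then be fed to a multi-channel CNN in place of the charge-only image.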