A framework is presented to extract and understand decision-making
information from a deep neural network (DNN) classifier for jet substructure
tagging. The general method studied is to provide expert variables
that augment the inputs ("eXpert AUGmented" variables, or XAUG variables),
and then to apply layerwise relevance propagation (LRP) to networks both with and without
XAUG variables. The XAUG variables are concatenated with the intermediate
layers after network-specific operations (such as convolution or recurrence)
and are used in the final layers of the network, as sketched in the first code
example below. The results of comparing networks
with and without the addition of XAUG variables show that XAUG variables can be
used to interpret classifier behavior, increase discrimination ability when
combined with low-level features, and in some cases capture the behavior of the
classifier completely. The LRP technique can be used to find the relevant
information the network is using and, when combined with the XAUG variables,
to rank features, allowing one to find a reduced set of features that captures
part of the network's performance (an illustrative LRP rule is sketched
below). In the studies presented, adding
XAUG variables to low-level DNNs increased the efficiency of classifiers by as
much as 30--40\%. In addition to performance improvements, an approach to
quantify numerical uncertainties in the training of these DNNs is presented.
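
To make the XAUG architecture described above concrete, the following is a
minimal sketch in PyTorch, assuming a one-dimensional convolutional backbone
over low-level per-particle features. The class name \texttt{ConvXAUGNet},
the layer sizes, and the input dimensions are illustrative assumptions, not
the configuration used in the paper; the only point being illustrated is the
concatenation of the XAUG variables after the network-specific
(convolutional) stage and before the final dense layers.

\begin{verbatim}
import torch
import torch.nn as nn

class ConvXAUGNet(nn.Module):
    """Toy jet tagger: a convolutional stage over low-level inputs,
    with expert (XAUG) variables concatenated after that stage and
    fed to the final dense layers. All sizes are illustrative."""

    def __init__(self, n_channels=4, n_xaug=3):
        super().__init__()
        # Network-specific stage: 1D convolutions over the particles.
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # pool over the particle axis
        )
        # Final dense layers act on conv features + XAUG variables.
        self.head = nn.Sequential(
            nn.Linear(32 + n_xaug, 64),
            nn.ReLU(),
            nn.Linear(64, 1),  # signal-vs-background logit
        )

    def forward(self, low_level, xaug):
        # low_level: (batch, channels, particles); xaug: (batch, n_xaug)
        feats = self.conv(low_level).squeeze(-1)    # (batch, 32)
        combined = torch.cat([feats, xaug], dim=1)  # append XAUG vars
        return self.head(combined)

model = ConvXAUGNet()
logit = model(torch.randn(8, 4, 30), torch.randn(8, 3))  # dummy batch
\end{verbatim}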
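
Likewise, the LRP backward pass can be illustrated with the generic
LRP-$\epsilon$ rule for fully connected layers. This is a textbook
formulation, not the paper's specific implementation, and the tiny two-layer
network is an assumption made purely for demonstration.

\begin{verbatim}
import numpy as np

def lrp_epsilon(weights, biases, x, eps=1e-6):
    """Generic LRP-epsilon for a stack of dense layers with ReLU
    on the hidden layers. weights[l] has shape (n_in, n_out).
    Returns one relevance score per input feature."""
    # Forward pass, storing the activation entering each layer.
    activations = [x]
    for i, (W, b) in enumerate(zip(weights, biases)):
        x = x @ W + b
        if i < len(weights) - 1:
            x = np.maximum(0.0, x)  # ReLU on hidden layers only
        activations.append(x)

    # Initialize relevance with the network output itself.
    R = activations[-1]
    # Backward pass: redistribute relevance layer by layer.
    for W, b, a in zip(reversed(weights), reversed(biases),
                       reversed(activations[:-1])):
        z = a @ W + b                              # pre-activations
        z = z + eps * np.where(z >= 0, 1.0, -1.0)  # stabilizer
        s = R / z
        R = a * (s @ W.T)                          # input relevances
    return R

# Tiny example: 3 input features -> 4 hidden units -> 1 output.
rng = np.random.default_rng(0)
Ws = [rng.normal(size=(3, 4)), rng.normal(size=(4, 1))]
bs = [np.zeros(4), np.zeros(1)]
print(lrp_epsilon(Ws, bs, rng.normal(size=3)))
\end{verbatim}

Averaging the magnitude of these per-feature relevance scores over a
validation sample is one simple way to produce the feature ranking described
above.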