Optimal statistical inference in the presence of systematic uncertainties using neural network optimization based on binned Poisson likelihoods with nuisance parameters
Data analysis in science, e.g., high-energy particle physics, is often
subject to an intractable likelihood if the observables and observations span a
high-dimensional input space. Typically the problem is solved by reducing the
dimensionality using feature engineering and histograms, whereby the latter
technique allows to build the likelihood using Poisson statistics. However, in
the presence of systematic uncertainties represented by nuisance parameters in
the likelihood, the optimal dimensionality reduction with a minimal loss of
information about the parameters of interest is not known. This work presents a
novel strategy to construct the dimensionality reduction with neural networks
for feature engineering and a differential formulation of histograms so that
the full workflow can be optimized with the result of the statistical
inference, e.g., the variance of a parameter of interest, as objective. We
discuss how this approach yields an estimate of the parameters of interest
that is close to optimal, and we demonstrate the applicability of the technique
with a simple example based on pseudo-experiments and a more complex example
from high-energy particle physics.
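The key ingredient of the workflow above is a histogram that stays differentiable, so the binning can sit inside an optimization loop whose objective comes from the binned Poisson likelihood. The following is a minimal sketch of that idea (a sigmoid-smoothed bin membership and a Poisson negative log-likelihood); the function names, bandwidth, and binning are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def soft_histogram(x, edges, bandwidth=0.1):
    """Differentiable histogram: each event contributes to each bin via a
    sigmoid-smoothed indicator instead of a hard cut, so gradients can flow
    through the binning step."""
    lo, hi = edges[:-1], edges[1:]
    def sig(z):
        return 1.0 / (1.0 + np.exp(-z / bandwidth))
    # smooth membership of event x_i in bin [lo_j, hi_j)
    w = sig(x[:, None] - lo[None, :]) - sig(x[:, None] - hi[None, :])
    return w.sum(axis=0)

def poisson_nll(observed, expected):
    """Binned Poisson negative log-likelihood (constant terms dropped)."""
    expected = np.clip(expected, 1e-9, None)
    return np.sum(expected - observed * np.log(expected))

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=10_000)
edges = np.linspace(-3.0, 3.0, 13)
counts = soft_histogram(x, edges)        # differentiable counts
hard = np.histogram(x, bins=edges)[0]    # ordinary hard-binned counts
```

In a full setup, `x` would be the output of the feature-engineering network and the bandwidth a hyperparameter; as the bandwidth shrinks, the soft counts approach the hard histogram.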
Adaptive Technomythography: The Apotheosis Of Machine And Development Of Legend In A System Of Dynamic Technology
Human beings will effectively deify any suitably complex system that cannot be explained through basic haptic interaction. Our culture loves technology; these days it seems we need it to feel whole. In an effort to explore the development of mythology and modular aesthetic in a technological age, I have designed and constructed a number of interactive robotic 'organisms' to engage in arbitrary movement in geometric enclosures. Through observation and dialog, I seek to assess the extent to which people assign human characteristics to the random and ofttimes aberrant mechanical behavior. To supplement this endeavor, a fictional astrological system that proposes logical (albeit mythological) explanations for the peculiarities in these relationships has been created.
Measurement of diffractive D*± meson production in deep-inelastic ep scattering and photoproduction at HERA
Measurements of the production of diffractive D* mesons in deep-inelastic scattering (DIS) and photoproduction at HERA are presented. The event topology is given by ep → eXY, where the central system X contains at least one D* meson and is clearly separated by a rapidity gap from the forward system Y of the scattered proton. The analysed data were recorded in the years 1999 and 2000 with the H1 detector and correspond to an integrated luminosity of 47.0 pb^{-1}. The measurements are compared with predictions of perturbative QCD at next-to-leading order (NLO). These predictions are based on diffractive parton density functions previously determined by a QCD analysis of the diffractive structure function F_{2}^{D(3)} measured at H1. The QCD predictions agree well with the measured cross sections, which supports the validity of QCD factorisation in DIS and photoproduction.
Reducing the dependence of the neural network function on systematic uncertainties in the input space
Applications of neural networks to data analyses in natural sciences are
complicated by the fact that many inputs are subject to systematic
uncertainties. To control the dependence of the neural network function on
variations of the input space within these systematic uncertainties, several
methods have been proposed. In this work, we propose a new approach to training
the neural network by introducing penalties on the variation of the neural
network output directly in the loss function. This is achieved at the cost of
only a small number of additional hyperparameters. It can also be pursued by
treating all systematic variations in the form of statistical weights. The
proposed method is demonstrated with a simple example, based on
pseudo-experiments, and with a more complex example from high-energy particle
physics.
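The core idea above, adding a penalty on the variation of the network output directly to the loss, can be sketched as follows. The tiny linear-sigmoid model, the shift vector, and the single penalty strength `lam` are illustrative assumptions; the abstract only specifies that the penalty term enters the loss at the cost of few extra hyperparameters.

```python
import numpy as np

def forward(w, x):
    # tiny linear-sigmoid classifier as a stand-in for a neural network
    return 1.0 / (1.0 + np.exp(-(x @ w)))

def penalized_loss(w, x, y, shift, lam=1.0):
    """Binary cross-entropy plus a penalty on how much the network output
    moves under the up/down systematic variation of the inputs; `lam` is
    the single additional hyperparameter."""
    p = forward(w, x)
    ce = -np.mean(y * np.log(p + 1e-9) + (1.0 - y) * np.log(1.0 - p + 1e-9))
    p_up = forward(w, x + shift)   # inputs shifted up by one systematic
    p_dn = forward(w, x - shift)   # inputs shifted down
    penalty = np.mean((p_up - p) ** 2 + (p_dn - p) ** 2)
    return ce + lam * penalty

rng = np.random.default_rng(1)
x = rng.normal(size=(256, 2))
y = (x[:, 0] > 0).astype(float)
w = np.array([2.0, 0.0])
shift = np.array([0.3, 0.0])       # systematic uncertainty on the first input
base = penalized_loss(w, x, y, shift, lam=0.0)   # plain cross-entropy
full = penalized_loss(w, x, y, shift, lam=1.0)   # with variation penalty
```

Minimizing `full` instead of `base` trades a little classification power for an output that is flatter across the systematic variation of the inputs.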
Identifying the relevant dependencies of the neural network response on characteristics of the input space
The relation between the input and output spaces of neural networks (NNs) is
investigated to identify those characteristics of the input space that have a
large influence on the output for a given task. For this purpose, the NN
function is decomposed into a Taylor expansion in each element of the input
space. The Taylor coefficients contain information about the sensitivity of the
NN response to the inputs. A metric is introduced that allows for the
identification of the characteristics that most strongly determine the
performance of the NN in solving a given task. Finally, the capability of this
metric to analyze the performance of the NN is evaluated based on a task common
to data analyses in high-energy particle physics experiments.
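A first-order version of the sensitivity metric described above can be sketched as the sample average of the absolute Taylor coefficients, ⟨|∂f/∂x_i|⟩, one number per input. The finite-difference estimate and the toy "network" below are illustrative assumptions, not the paper's procedure.

```python
import numpy as np

def taylor_sensitivity(f, x, eps=1e-4):
    """First-order Taylor coefficients <|df/dx_i|>, estimated with central
    differences and averaged over the sample; larger values mark inputs
    that the NN response is more sensitive to."""
    n, d = x.shape
    t = np.zeros(d)
    for i in range(d):
        dx = np.zeros(d)
        dx[i] = eps
        t[i] = np.mean(np.abs((f(x + dx) - f(x - dx)) / (2.0 * eps)))
    return t

# toy stand-in network: strong dependence on input 0, weak on input 1,
# none at all on input 2
def f(x):
    return np.tanh(3.0 * x[:, 0] + 0.1 * x[:, 1])

rng = np.random.default_rng(2)
x = rng.normal(size=(1000, 3))
t = taylor_sensitivity(f, x)
```

The resulting ranking (input 0 above input 1 above input 2) is the kind of information the metric provides; higher-order Taylor coefficients would additionally capture sensitivities to pairs of inputs.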
UNLV Horn Day Recital
Program listing performers and works performed
Performance of the bwHPC cluster in the production of μ → τ embedded events used for the prediction of background for H → ττ analyses
In high energy physics, a main challenge is the accurate prediction of background
events at a particle detector. These events are usually estimated by simulation.
As an alternative, data-driven methods use observed events to derive a background
prediction and are often less computationally expensive than simulation.
The lepton embedding method provides a data-driven way to estimate the
background from Z → ττ events for Higgs boson analyses in the same final state.
Z → μμ events recorded by the CMS experiment are selected, the muons are
removed from the event and replaced with simulated τ leptons with the same
kinematic properties as the removed muons. The resulting hybrid event provides
an improved description of pile-up and the underlying event compared to the simulation
of the full proton-proton collision. In this paper the production of these
hybrid events used by the CMS collaboration is described. The production relies
on the resources made available by the bwHPC project. The data used for this
purpose correspond to 65 million di-muon events collected in 2017 by CMS.
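The replacement step described above, keeping the recorded event but swapping the selected muons for simulated τ leptons with identical kinematics, can be sketched schematically. Everything here (the event as a dictionary, the `embed` and `fake_tau` names) is an illustrative assumption, not the CMS embedding software.

```python
def embed(event, simulate_tau):
    """Build a hybrid event: data-driven pile-up and underlying event from
    the recorded Z -> mumu event, simulated tau decays carrying the removed
    muons' kinematics."""
    hybrid = dict(event)                 # keep the recorded event content
    muons = hybrid.pop("muons")          # remove the selected di-muon pair
    hybrid["taus"] = [                   # re-insert simulated taus in their place
        simulate_tau(pt=m["pt"], eta=m["eta"], phi=m["phi"]) for m in muons
    ]
    return hybrid

# toy stand-in for the tau simulation step
def fake_tau(pt, eta, phi):
    return {"pt": pt, "eta": eta, "phi": phi, "type": "tau"}

event = {
    "muons": [{"pt": 45.0, "eta": 0.3, "phi": 1.2},
              {"pt": 38.0, "eta": -1.1, "phi": -2.0}],
    "pileup": "recorded detector activity",
}
hybrid = embed(event, fake_tau)
```

Because only the two lepton legs are simulated, the rest of the hybrid event (pile-up, underlying event) remains exactly as recorded, which is the source of the improved description mentioned above.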