Search for rare b to open-charm two-body decays of baryons at LHCb
A search for the rare two-body decays Λb → D0Λ and Ξb0 → D0Λ is performed with proton-proton collision data collected by the LHCb experiment at a center-of-mass energy of 13 TeV. The decay Λb → D0Λ is seen with a statistical significance of 5.5 standard deviations and constitutes the discovery of this decay. An excess of Ξb0 → D0Λ candidates with respect to the background is observed with a statistical significance of 1.8 standard deviations.
PyTorch and automatic differentiation
I will explain automatic differentiation for the gifted amateur and how to use it in the PyTorch framework. You do not have to know the PyTorch framework, but some experience with NumPy could help you follow the tutorial session. I would like to keep the session interactive, but you will not need to run code on your laptops. Instead, I will present code snippets, discuss them, and thus introduce some common pitfalls when using PyTorch. In the end I hope that you will better understand what happens behind the curtain of PyTorch (and other autodiff libraries), helping you debug your everyday code more effectively.
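As a flavor of the kind of snippet such a session might discuss, here is a minimal reverse-mode autodiff example; the function, values, and the gradient-accumulation pitfall shown are illustrative, not taken from the talk itself:

```python
import torch

# Forward pass: builds a computation graph for y = x^2 + 2x.
x = torch.tensor(3.0, requires_grad=True)
y = x ** 2 + 2.0 * x

# Reverse pass: accumulates dy/dx into x.grad.
y.backward()
print(x.grad)  # dy/dx = 2*x + 2 = 8 at x = 3

# Common pitfall: gradients ACCUMULATE across backward() calls.
# Without zeroing, a second backward pass would leave x.grad = 16.
x.grad.zero_()
y2 = x ** 2 + 2.0 * x
y2.backward()
grad = float(x.grad)  # 8.0 again, thanks to the zero_() call
```

The accumulation behavior is deliberate (it enables gradient summation across mini-batches), which is exactly why it trips up newcomers who expect `backward()` to overwrite.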
The Unreasonable Effectiveness of Deep Evidential Regression
There is a significant need for principled uncertainty reasoning in machine
learning systems as they are increasingly deployed in safety-critical domains.
A new approach with uncertainty-aware regression-based neural networks (NNs),
based on learning evidential distributions for aleatoric and epistemic
uncertainties, shows promise over traditional deterministic methods and typical
Bayesian NNs, notably with the capabilities to disentangle aleatoric and
epistemic uncertainties. Despite some empirical success of Deep Evidential
Regression (DER), there are important gaps in the mathematical foundation that
raise the question of why the proposed technique seemingly works. We detail the
theoretical shortcomings and analyze the performance on synthetic and
real-world data sets, showing that Deep Evidential Regression is a heuristic
rather than an exact uncertainty quantification. We go on to propose
corrections and redefinitions of how aleatoric and epistemic uncertainties
should be extracted from NNs.
Comment: 11 pages, 25 figures
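For context, the quantities under debate can be made concrete. In the original DER formulation, the network predicts Normal-Inverse-Gamma parameters (γ, ν, α, β), and the two uncertainties are read off with closed-form expressions. The sketch below shows those standard extraction formulas only, not the corrections this work proposes:

```python
def aleatoric(nu, alpha, beta):
    # E[sigma^2] under the Normal-Inverse-Gamma prior: data noise,
    # independent of the amount of evidence nu.
    return beta / (alpha - 1.0)

def epistemic(nu, alpha, beta):
    # Var[mu]: the aleatoric term scaled down by the evidence nu,
    # so it shrinks as the model sees more supporting data.
    return beta / (nu * (alpha - 1.0))

a = aleatoric(5.0, 2.0, 1.0)  # 1.0
e = epistemic(5.0, 2.0, 1.0)  # 0.2
```

Note how both expressions share the factor β/(α−1); this coupling is one reason the claimed disentanglement of aleatoric and epistemic uncertainty deserves the scrutiny the abstract describes.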
Inverted CERN School of Computing 2020
From a syntactic point of view, the lambda expression of C++ is nothing but syntactic sugar for a struct with an appropriate call-operator overload. On the other hand, this simple syntax is shockingly flexible and allows powerful abstractions in a functional style, while providing elegant and easy-to-read code in a language that is notorious for being unnecessarily clunky and verbose.
I will give an overview of the basic syntax and best practices. I will then talk about stateful lambdas, lambda inheritance, and their real-world applications.
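The talk's central claim — a stateful lambda is just an object with an overloaded call operator carrying its captured state — can be mirrored outside C++. The following Python sketch is my own analogy of that equivalence, not the talk's C++ material:

```python
# Illustrative analogy (the talk itself covers C++ lambdas): a stateful
# lambda is equivalent to an object whose call operator is overloaded
# and which stores its captured state as a member.
class Counter:
    """The explicit 'struct with a call operator'."""
    def __init__(self):
        self.n = 0  # the 'captured' state

    def __call__(self):
        self.n += 1
        return self.n

def make_counter():
    """The 'lambda' version: a closure capturing mutable state."""
    n = 0
    def counter():
        nonlocal n
        n += 1
        return n
    return counter

explicit = Counter()
sugar = make_counter()
print(explicit(), explicit())  # 1 2
print(sugar(), sugar())        # 1 2
```

Both forms behave identically from the caller's side, which is precisely the "syntactic sugar" point: the closure is the convenient spelling, the class is what it desugars to.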
Search for Rare b to Open-Charm Two-Body Decays of Baryons at LHCb
A search for the rare two-body decays $\Lambda_b \rightarrow D^0\Lambda$ and $\Xi^0_b \rightarrow D^0\Lambda$ is performed with proton-proton collision data, corresponding to an integrated luminosity of 6 fb$^{-1}$, collected by the LHCb experiment at a center-of-mass energy of 13 TeV. The decay $\Lambda_b \rightarrow D^0\Lambda$ is seen with a statistical significance of 5.5 standard deviations and constitutes the discovery of this decay. The branching fraction, measured using a normalization decay, is \begin{equation*} B(\Lambda_b\rightarrow D^0 \Lambda) = (9.9 \pm 2.3 \pm 1.6 \pm 1.1) \times 10^{-6} \,, \end{equation*} where the uncertainties are statistical, systematic, and external, respectively. An excess of $\Xi^0_b \rightarrow D^0\Lambda$ candidates with respect to the background is observed with a statistical significance of 1.8 standard deviations and is used to estimate the upper limit \begin{equation*} \frac{f_{\Xi^0_b}}{f_{\Lambda_b}} \times \frac{B(\Xi^0_b \rightarrow D^0\Lambda)}{B(\Lambda_b\rightarrow D^0\Lambda)} < 0.5 \quad (\text{CL}\,=\,95\,\%) \,, \end{equation*} where $f_{\Xi^0_b}/f_{\Lambda_b}$ is the ratio of the fragmentation fractions of $b$ quarks into $\Xi^0_b$ and $\Lambda_b$ baryons.
Hunter Power Plant, Emery County, Utah [1632]
Scan of a transparency showing sunflowers in the foreground of the Hunter plant.
Calculating Lower Bounds within the PyTorch Framework
Lower estimation bounds are an important tool in the development of parametric estimators, which form the basis for a large number of navigation and positioning solutions. The well-known Cramér-Rao bound (CRB) is such a bound and gives a lower bound on the mean squared error of locally unbiased estimators based on a signal model. If the model depends on a random variable, the bound depends on the realization of this variable.
We consider the R-Mode navigation system as a case study in this paper. In this case, the signal is influenced by a modulated signal where, in general, the transmitted bit sequence is unknown. Therefore, it becomes difficult to derive and evaluate the performance bound, as the complexity of the computation increases. To overcome this challenge, we suggest utilizing PyTorch and its automatic-differentiation framework to calculate the bound for each realization, thus leveraging fast calculation for each given scenario.
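The approach can be sketched on a toy signal model — a single-tone amplitude estimate in white Gaussian noise, an illustrative stand-in rather than the R-Mode signal model. For Gaussian noise the Fisher information is built from the derivatives of the signal samples with respect to the parameter, which autograd supplies without any hand-derived expression:

```python
import torch

# Toy setup (illustrative, not the R-Mode model): estimate the amplitude a
# of s_i = a * sin(2*pi*f*t_i), observed in white Gaussian noise of known
# variance sigma^2.
sigma2 = 0.1
t = torch.linspace(0.0, 1.0, 100)

def signal(a):
    return a * torch.sin(2.0 * torch.pi * 5.0 * t)

a0 = torch.tensor(2.0)

# Per-sample derivatives ds_i/da via automatic differentiation.
jac = torch.autograd.functional.jacobian(signal, a0)

# For Gaussian noise: FIM = (1/sigma^2) * sum_i (ds_i/da)^2, CRB = 1/FIM.
fim = float((jac ** 2).sum() / sigma2)
crb = 1.0 / fim
```

For a more realistic model with an unknown bit sequence, only `signal()` changes; the autograd-based Fisher information computation stays the same for each realization, which is the point of the proposed approach.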
- …