
    Boosted Decision Trees as an Alternative to Artificial Neural Networks for Particle Identification

    The efficacy of particle identification is compared using artificial neural networks and boosted decision trees. The comparison is performed in the context of MiniBooNE, an experiment at Fermilab searching for neutrino oscillations. Based on studies of Monte Carlo samples of simulated data, particle identification with boosting algorithms performs better than that with artificial neural networks for the MiniBooNE experiment. Although the tests in this paper were for one experiment, it is expected that boosting algorithms will find wide application in physics.
    Comment: 6 pages, 5 figures; accepted for publication in Nucl. Inst. & Meth.
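The boosted decision trees compared above can be illustrated with a minimal sketch of AdaBoost over depth-1 trees (decision stumps). The dataset, round count, and threshold search here are illustrative toys, not the MiniBooNE setup or the paper's actual boosting configuration.

```python
# Minimal sketch of boosted decision stumps (AdaBoost), the classifier
# family the abstract compares against neural networks. All data and
# hyperparameters are toy choices for illustration only.
import math

def stump_predict(x, feature, threshold, polarity):
    """A depth-1 decision tree: +1 on one side of the threshold, -1 on the other."""
    return polarity if x[feature] >= threshold else -polarity

def train_adaboost(X, y, n_rounds=10):
    n = len(X)
    w = [1.0 / n] * n               # example weights, uniform at the start
    ensemble = []                   # list of (alpha, feature, threshold, polarity)
    for _ in range(n_rounds):
        best = None
        # exhaustive search over stumps (feature x threshold x polarity)
        for f in range(len(X[0])):
            for t in sorted({x[f] for x in X}):
                for pol in (1, -1):
                    err = sum(wi for xi, yi, wi in zip(X, y, w)
                              if stump_predict(xi, f, t, pol) != yi)
                    if best is None or err < best[0]:
                        best = (err, f, t, pol)
        err, f, t, pol = best
        err = max(err, 1e-10)       # avoid division by zero on a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, f, t, pol))
        # reweight: emphasize the examples this stump got wrong
        w = [wi * math.exp(-alpha * yi * stump_predict(xi, f, t, pol))
             for xi, yi, wi in zip(X, y, w)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(alpha * stump_predict(x, f, t, pol) for alpha, f, t, pol in ensemble)
    return 1 if score >= 0 else -1

# toy "signal vs background": the label depends on a sum of two features,
# which a weighted combination of axis-aligned stumps can represent
X = [[i % 5, (i * 7) % 11] for i in range(40)]
y = [1 if x[0] + x[1] > 6 else -1 for x in X]
model = train_adaboost(X, y, n_rounds=20)
accuracy = sum(predict(model, xi) == yi for xi, yi in zip(X, y)) / len(X)
```

Each round fits one weak stump and reweights the training set toward its mistakes; the final classifier is the sign of the weighted vote.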

    Efficient Monte Carlo Integration Using Boosted Decision Trees and Generative Deep Neural Networks

    New machine-learning-based algorithms for Monte Carlo integration, built on generative boosted decision trees and deep neural networks, have been developed and tested. Both algorithms achieve substantially better integration precision for non-factorizable integrands than existing algorithms, for a given number of target-function evaluations. Once their robustness is demonstrated and their performance validated for the relevant classes of matrix elements, implementing these algorithms in commonly used matrix-element Monte Carlo generators could make large-scale Monte Carlo generation of complex collider physics processes more efficient.
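The core idea above, drawing samples from a distribution that resembles the integrand, can be sketched with plain importance sampling. Here a hand-picked proposal density stands in for the learned generative model of the paper, and the integrand is a toy function, not a matrix element.

```python
# Hedged sketch of generative-model-assisted Monte Carlo integration:
# sampling from a proposal shaped like the integrand reduces variance
# relative to uniform sampling. The proposal p(x) = 2x is hand-tuned
# here, standing in for a learned generative model.
import math
import random

random.seed(0)

def integrand(x):
    # a peaked, non-uniform target on [0, 1]; its exact integral is 1
    return 3.0 * x * x

N = 10_000

# (a) plain Monte Carlo: uniform samples on [0, 1]
uniform_est = sum(integrand(random.random()) for _ in range(N)) / N

# (b) importance sampling: draw x with density p(x) = 2x via the inverse
#     CDF (x = sqrt(u)), then weight each sample by integrand(x) / p(x)
def sample_proposal():
    return math.sqrt(random.random())

is_est = sum(integrand(x) / (2.0 * x)
             for x in (sample_proposal() for _ in range(N))) / N
```

Both estimators are unbiased for the true value 1, but the weighted ratio `integrand(x) / p(x) = 1.5x` varies far less than the raw integrand under uniform sampling, which is exactly the precision gain the abstract describes.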

    Testing the Martingale Difference Hypothesis Using Neural Network Approximations

    The martingale difference restriction is an outcome of many theoretical analyses in economics and finance. A large body of econometric literature deals with tests of that restriction. We provide new tests based on radial basis function neural networks. Our work is based on the test design of Blake and Kapetanios (2000, 2003a,b). However, unlike that work we can provide a formal theoretical justification for the validity of these tests using approximation results from Kapetanios and Blake (2007). These results take advantage of the link between the algorithms of Blake and Kapetanios (2000, 2003a,b) and boosting. We carry out a Monte Carlo study of the properties of the new tests and find that they have superior power to all existing tests of the martingale difference hypothesis we consider. An empirical application to the S&P500 constituents illustrates the usefulness of our new test.
    Keywords: martingale difference hypothesis, neural networks, boosting
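The hypothesis being tested says that past values carry no predictive information: E[x_t | past] = 0. A much-simplified sketch of the idea, not the paper's actual test statistic, regresses x_t on a single radial basis function of x_{t-1}; under the martingale difference hypothesis the fitted coefficient should be near zero, while for a predictable series it should not. The RBF center, the AR(1) alternative, and the sample sizes are all illustrative choices.

```python
# Hedged, simplified illustration of an RBF-based martingale difference
# check: the slope of x_t on rbf(x_{t-1}) is near zero for i.i.d. noise
# (which satisfies the MDH) but large for a predictable AR(1) series.
import math
import random

random.seed(1)

def rbf(u, center=1.0, width=1.0):
    """A single radial basis function feature of the lagged value."""
    return math.exp(-((u - center) / width) ** 2)

def mdh_score(series):
    """Least-squares slope of x_t on rbf(x_{t-1}); near zero under the MDH."""
    z = [rbf(x) for x in series[:-1]]
    y = series[1:]
    zbar = sum(z) / len(z)
    ybar = sum(y) / len(y)
    num = sum((zi - zbar) * (yi - ybar) for zi, yi in zip(z, y))
    den = sum((zi - zbar) ** 2 for zi in z)
    return num / den

# a martingale difference sequence (i.i.d. noise) vs. a predictable AR(1)
noise = [random.gauss(0, 1) for _ in range(5000)]
ar = [0.0]
for _ in range(4999):
    ar.append(0.8 * ar[-1] + random.gauss(0, 1))

score_noise = abs(mdh_score(noise))
score_ar = abs(mdh_score(ar))
```

The paper's tests use many RBF units with data-driven centers and a formal null distribution; this sketch only shows why a nonlinear function of the past can detect departures from the martingale difference restriction.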

    Cascade Training Technique for Particle Identification

    The cascade training technique developed during our work on MiniBooNE particle identification has proven to be a very efficient way to improve selection performance, especially when very low background-contamination levels are desired. A detailed description of the technique is presented here, based on MiniBooNE detector Monte Carlo simulations, using both artificial neural networks and boosted decision trees as examples.
    Comment: 12 pages and 4 EPS figures
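The cascade idea can be sketched in miniature: a first classifier makes a loose cut, and a second classifier is trained only on the events that survive it, so it specializes on the harder, signal-like backgrounds. The stages here are simple one-dimensional threshold cuts on toy Gaussian events, not the paper's neural networks or boosted decision trees.

```python
# Hedged sketch of cascade training with two threshold-cut "classifiers".
# Event distributions, cut grids, and the looseness offset are toy choices.
import random

random.seed(2)

# toy events: (feature0, feature1, is_signal)
def make_event():
    if random.random() < 0.5:    # signal: both features centered at 2
        return (random.gauss(2, 1), random.gauss(2, 1), True)
    else:                        # background: both features centered at 0
        return (random.gauss(0, 1), random.gauss(0, 1), False)

events = [make_event() for _ in range(4000)]

def best_cut(sample, feature):
    """Pick the threshold on one feature that maximizes accuracy on `sample`."""
    best_t, best_acc = 0.0, 0.0
    for t in [i / 10 for i in range(-20, 41)]:
        acc = sum((e[feature] > t) == e[2] for e in sample) / len(sample)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

# stage 1: a loose cut on feature 0 (shifted down to keep signal efficiency high)
t1 = best_cut(events, 0) - 1.0
survivors = [e for e in events if e[0] > t1]

# stage 2: trained only on the survivors, using the other feature,
# so its threshold adapts to the background that passed stage 1
t2 = best_cut(survivors, 1)

selected = [e for e in events if e[0] > t1 and e[1] > t2]
purity = sum(e[2] for e in selected) / len(selected)
```

The point of the cascade is that stage 2 sees a background-depleted, harder sample, which is what makes very low contamination levels reachable without sacrificing signal efficiency in the first stage.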