
    Statistical Challenges for Searches for New Physics at the LHC

    Because the emphasis of the LHC is on 5 sigma discoveries and the LHC environment induces high systematic errors, many of the common statistical procedures used in High Energy Physics are not adequate. I review the basic ingredients of LHC searches, the sources of systematics, and the performance of several methods. Finally, I indicate the methods that seem most promising for the LHC and areas that are in need of further study. Comment: 12 pages, 7 figures, proceedings of PhyStat2005, Oxford. To be published by Imperial College Press. See http://www.physics.ox.ac.uk/phystat05/index.ht
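    As an illustration of the statistical machinery behind such searches, here is a minimal Python sketch (ours, not taken from the paper): the one-sided p-value corresponding to a 5 sigma discovery, and the standard median-significance approximation for a counting experiment with a perfectly known background, which is precisely the idealization that large systematic errors break.

```python
import numpy as np
from scipy.stats import norm

def z_to_p(z):
    """One-sided tail probability corresponding to a Z-sigma significance."""
    return norm.sf(z)

def counting_significance(s, b):
    """Median discovery significance for a counting experiment with
    expected signal s on a perfectly known background b (no systematics)."""
    return np.sqrt(2.0 * ((s + b) * np.log(1.0 + s / b) - s))

print(z_to_p(5.0))                    # ~2.87e-7: the p-value behind "5 sigma"
print(counting_significance(30, 100)) # ~2.9 sigma if the background were exact
```

    Systematic uncertainty on the background dilutes this significance, which is why procedures that ignore it are inadequate in the LHC environment.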

    On the (In)feasibility of ML Backdoor Detection as an Hypothesis Testing Problem

    We introduce a formal statistical definition for the problem of backdoor detection in machine learning systems and use it to analyze the feasibility of such problems, providing evidence for the utility and applicability of our definition. The main contributions of this work are an impossibility result and an achievability result for backdoor detection. We show a no-free-lunch theorem, proving that universal (adversary-unaware) backdoor detection is impossible, except for very small alphabet sizes. Thus, we argue that backdoor detection methods need to be either explicitly or implicitly adversary-aware. However, our work does not imply that backdoor detection cannot work in specific scenarios, as evidenced by successful backdoor detection methods in the scientific literature. Furthermore, we connect our definition to the probably approximately correct (PAC) learnability of the out-of-distribution detection problem.
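    To make the hypothesis-testing framing concrete, here is a toy sketch (our illustration, not the paper's construction): when the clean and backdoored output distributions p and q over a small alphabet are both known, detection reduces to a simple-vs-simple Neyman-Pearson test on the log-likelihood ratio. The paper's no-free-lunch result concerns the universal setting, where q is adversarially chosen and unknown.

```python
import numpy as np

rng = np.random.default_rng(0)

p = np.array([0.4, 0.3, 0.2, 0.1])  # clean output distribution (assumed known)
q = np.array([0.1, 0.2, 0.3, 0.4])  # backdoored distribution (assumed known)

def log_lr(samples):
    """Sum of per-sample log(q/p); decide 'backdoored' when positive."""
    return np.sum(np.log(q[samples]) - np.log(p[samples]))

n = 200
clean = rng.choice(4, size=n, p=p)
poisoned = rng.choice(4, size=n, p=q)

print(log_lr(clean) < 0, log_lr(poisoned) > 0)  # typically: True True
```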

    End-to-end Learning of Waveform Generation and Detection for Radar Systems

    An end-to-end learning approach is proposed for the joint design of the transmitted waveform and the detector in a radar system. The detector and transmitted waveform are trained alternately: for a fixed transmitted waveform, the detector is trained using supervised learning so as to approximate the Neyman-Pearson detector; and for a fixed detector, the transmitted waveform is trained using reinforcement learning based on feedback from the receiver. No prior knowledge is assumed about the target and clutter models. Both transmitter and receiver are implemented as feedforward neural networks. Numerical results show that the proposed end-to-end learning approach achieves more robust radar performance in clutter and colored noise of arbitrary probability density functions than conventional methods, and successfully adapts the transmitted waveform to environmental conditions. Comment: Presented at the 2019 Asilomar Conference on Signals, Systems, and Computers
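    A minimal sketch of the alternating training loop described above, under simplifying assumptions of ours (a linear logistic detector instead of a neural network, a Gaussian waveform-perturbation policy with a REINFORCE update, and an additive-noise channel standing in for target and clutter models):

```python
import numpy as np

rng = np.random.default_rng(1)
L = 8                     # waveform length
sigma_n = 1.0             # channel noise level
sigma_w = 0.1             # exploration noise of the waveform policy
lr_d, lr_w = 0.1, 0.05    # learning rates
baseline = 0.5            # running reward baseline for REINFORCE

mu = rng.normal(size=L)
mu *= np.sqrt(L) / np.linalg.norm(mu)   # unit-average-power waveform
theta, b = np.zeros(L), 0.0             # logistic-regression detector

def batch(w, n=256):
    """Half H1 samples (echo plus noise), half H0 samples (noise only)."""
    h1 = w + sigma_n * rng.normal(size=(n // 2, L))
    h0 = sigma_n * rng.normal(size=(n // 2, L))
    return np.vstack([h1, h0]), np.concatenate([np.ones(n // 2), np.zeros(n // 2)])

for _ in range(500):
    # Detector step: supervised cross-entropy with the waveform frozen.
    y, t = batch(mu)
    p = 1.0 / (1.0 + np.exp(-(y @ theta + b)))
    theta -= lr_d * (y.T @ (p - t)) / len(t)
    b -= lr_d * np.mean(p - t)

    # Waveform step: REINFORCE with the detector frozen; the reward is the
    # receiver's detection accuracy, i.e. feedback from the receiver.
    w = mu + sigma_w * rng.normal(size=L)
    y, t = batch(w)
    p = 1.0 / (1.0 + np.exp(-(y @ theta + b)))
    reward = np.mean((p > 0.5) == t)
    mu += lr_w * (reward - baseline) * (w - mu) / sigma_w**2
    baseline = 0.9 * baseline + 0.1 * reward
    mu *= np.sqrt(L) / np.linalg.norm(mu)   # enforce the power constraint
```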

    CFARnet: deep learning for target detection with constant false alarm rate

    We consider the problem of learning detectors with a Constant False Alarm Rate (CFAR). Classical model-based solutions to composite hypothesis testing are sensitive to imperfect models and are often computationally expensive. In contrast, data-driven machine learning is often more robust and yields classifiers with fixed computational complexity. Learned detectors, however, usually do not have the CFAR property required in many applications. To close this gap, we introduce CFARnet, in which the loss function is penalized to promote similar distributions of the detector output under any null-hypothesis scenario. Asymptotic analysis in the case of linear models with general Gaussian noise reveals that the classical generalized likelihood ratio test (GLRT) is actually a minimizer of the CFAR-constrained Bayes risk. Experiments on both synthetic data and real hyperspectral images show that CFARnet leads to near-CFAR detectors with accuracy similar to that of their competitors. Comment: arXiv admin note: substantial text overlap with arXiv:2206.0574
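    One plausible way to write down such a penalized loss (a sketch of our reading, not the paper's exact objective): cross-entropy plus a term that pushes the detector's score distribution to agree across different null scenarios, here via a simple first-two-moments mismatch. All names are hypothetical.

```python
import numpy as np

def bce(p, t, eps=1e-9):
    """Standard binary cross-entropy between predictions p and labels t."""
    return -np.mean(t * np.log(p + eps) + (1 - t) * np.log(1 - p + eps))

def cfar_penalty(scores_by_null):
    """Mismatch of score mean/std across null scenarios; zero when the
    first two moments of the null score distributions coincide."""
    means = np.array([s.mean() for s in scores_by_null])
    stds = np.array([s.std() for s in scores_by_null])
    return means.var() + stds.var()

def cfarnet_loss(p, t, scores_null_a, scores_null_b, lam=1.0):
    """Detection loss penalized toward a constant false alarm rate."""
    return bce(p, t) + lam * cfar_penalty([scores_null_a, scores_null_b])
```

    A detector whose null score distribution is the same under every null scenario can use a single threshold for a fixed false alarm rate, which is the CFAR property the penalty promotes.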

    Noise-Enhanced Information Systems

    Noise, traditionally defined as an unwanted signal or disturbance, has been shown to play an important constructive role in many information processing systems and algorithms. This noise enhancement has been observed and employed in many physical, biological, and engineered systems. Indeed, stochastic facilitation (SF) has been found critical for certain biological information functions, such as the detection of weak, subthreshold stimuli or suprathreshold signals, through both experimental verification and analytical model simulations. In this paper, we present a systematic noise-enhanced information processing framework to analyze and optimize the performance of engineered systems. System performance is evaluated not only in terms of signal-to-noise ratio but also in terms of other, more relevant metrics such as probability of error for signal detection or mean square error for parameter estimation. As an important new instance of SF, we also discuss the constructive effect of noise in associative memory recall. Potential enhancement of image processing systems via the addition of noise is discussed, with important applications in biomedical image enhancement, image denoising, and classification.
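    A self-contained demonstration of the effect (our illustration, not from the paper): a subthreshold sine wave is invisible to a hard-threshold detector at zero noise, becomes detectable at moderate noise, and is drowned out again when noise dominates.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 5000)
signal = 0.5 * np.sin(2 * np.pi * t)   # peak 0.5: below the threshold of 1.0
threshold = 1.0

for noise_std in (0.0, 0.3, 0.7, 2.0):
    out = (signal + noise_std * rng.normal(size=t.size) > threshold).astype(float)
    corr = np.corrcoef(out, signal)[0, 1] if out.std() > 0 else 0.0
    print(f"noise_std={noise_std:.1f}  output/signal correlation={corr:.3f}")

# Correlation is 0 with no noise, rises at moderate noise, and degrades again
# when noise dominates: the signature of stochastic facilitation.
```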

    A detection-based pattern recognition framework and its applications

    The objective of this dissertation is to present a detection-based pattern recognition framework and demonstrate its applications in automatic speech recognition and broadcast news video story segmentation. Inspired by studies in modern cognitive psychology and by real-world pattern recognition systems, a detection-based pattern recognition framework is proposed to provide an alternative solution for some complicated pattern recognition problems. The primitive features are first detected and the task-specific knowledge hierarchy is constructed level by level; then a variety of heterogeneous information sources are combined, and high-level context is incorporated as additional information at certain stages. A detection-based framework is a "divide-and-conquer" design paradigm for pattern recognition problems, which decomposes a conceptually difficult problem into many elementary sub-problems that can be handled directly and reliably. Information fusion strategies are employed to integrate the evidence from a lower level to form the evidence at a higher level; this fusion procedure continues until reaching the top level. Generally, a detection-based framework has many advantages: (1) more flexibility in both detector design and fusion strategies, as these two parts can be optimized separately; (2) parallel and distributed computational components in primitive feature detection: in such a component-based framework, any primitive component can be replaced by a new one while other components remain unchanged; (3) incremental information integration; (4) high-level context information as an additional information source, which can be combined with bottom-up processing at any stage. This dissertation presents the basic principles, criteria, and techniques for detector design and hypothesis verification based on statistical detection and decision theory. In addition, evidence fusion strategies were investigated. Several novel detection algorithms and evidence fusion methods were proposed, and their effectiveness was demonstrated in automatic speech recognition and broadcast news video story segmentation systems. We believe such a detection-based framework can be employed in more applications in the future.
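    To illustrate the bottom-up evidence fusion step (a minimal sketch of the generic idea, not the dissertation's specific algorithms): for conditionally independent detectors with known operating points, evidence combines additively as log-likelihood ratios.

```python
import numpy as np

def llr_from_score(fired, pd, pfa):
    """Turn a binary detector output into a log-likelihood ratio, given its
    detection probability pd and false alarm probability pfa."""
    p1 = pd if fired else 1 - pd    # P(output | event present)
    p0 = pfa if fired else 1 - pfa  # P(output | event absent)
    return np.log(p1 / p0)

def fuse_llrs(llrs):
    """Conditionally independent evidence combines additively in the log domain."""
    return np.sum(llrs)

# Three primitive detectors with known operating points all fire:
llrs = [llr_from_score(1, pd, pfa) for pd, pfa in [(0.9, 0.1), (0.7, 0.2), (0.8, 0.3)]]
print(fuse_llrs(llrs) > 0)   # accept the higher-level hypothesis -> True
```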