674 research outputs found

    Statistical Challenges for Searches for New Physics at the LHC

    Because the emphasis of the LHC is on 5-sigma discoveries and the LHC environment induces large systematic errors, many of the statistical procedures commonly used in High Energy Physics are not adequate. I review the basic ingredients of LHC searches, the sources of systematics, and the performance of several methods. Finally, I indicate the methods that seem most promising for the LHC and the areas in need of further study.
    Comment: 12 pages, 7 figures, proceedings of PhyStat2005, Oxford. To be published by Imperial College Press. See http://www.physics.ox.ac.uk/phystat05/index.ht
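As a quick illustration of the 5-sigma convention this abstract refers to, the sketch below (a toy counting experiment with no systematic uncertainties, not taken from the paper) converts a Poisson tail probability into the equivalent one-sided Gaussian significance:

```python
from math import exp, factorial
from statistics import NormalDist

def poisson_pvalue(n_obs: int, b: float) -> float:
    """One-sided p-value: probability of observing at least n_obs events
    when the expected background is b (no systematic uncertainties)."""
    # P(N >= n_obs) = 1 - sum_{k < n_obs} Poisson(k; b)
    return 1.0 - sum(exp(-b) * b**k / factorial(k) for k in range(n_obs))

def significance(p: float) -> float:
    """Equivalent one-sided Gaussian significance Z, in sigmas."""
    return NormalDist().inv_cdf(1.0 - p)
```

A 5-sigma discovery corresponds to a one-sided p-value of about 2.9e-7; systematic errors enter in practice by broadening the background model, which this toy deliberately ignores.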

    End-to-end Learning of Waveform Generation and Detection for Radar Systems

    An end-to-end learning approach is proposed for the joint design of the transmitted waveform and the detector in a radar system. The detector and the transmitted waveform are trained alternately: for a fixed transmitted waveform, the detector is trained using supervised learning to approximate the Neyman-Pearson detector; and for a fixed detector, the transmitted waveform is trained using reinforcement learning based on feedback from the receiver. No prior knowledge is assumed about the target and clutter models. Both transmitter and receiver are implemented as feedforward neural networks. Numerical results show that the proposed end-to-end learning approach obtains more robust radar performance in clutter and in colored noise of arbitrary probability density function than conventional methods, and successfully adapts the transmitted waveform to environmental conditions.
    Comment: Presented at the 2019 Asilomar Conference on Signals, Systems, and Computer
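The supervised half of the alternation can be illustrated with a deliberately tiny stand-in (a one-weight logistic "detector" on a Gaussian mean-shift channel; the echo amplitude 1.5 and all other details are illustrative assumptions, not the paper's setup). Because cross-entropy training recovers the class posterior, and the log-likelihood ratio is linear in the sample here, the learned score is a monotone function of the Neyman-Pearson statistic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Labeled training returns: 1 = target present, 0 = noise only.
n = 4000
labels = rng.integers(0, 2, n)
y = labels * 1.5 + rng.normal(size=n)   # assumed echo amplitude 1.5, unit noise

# One-parameter logistic detector sigmoid(w*y + b), trained with
# cross-entropy via plain gradient descent.
w, b = 0.0, 0.0
for _ in range(3000):
    p = 1.0 / (1.0 + np.exp(-(w * y + b)))
    w -= 0.1 * np.mean((p - labels) * y)
    b -= 0.1 * np.mean(p - labels)

# For this Gaussian shift the exact log-likelihood ratio is linear in y,
# so w*y + b approximates the Neyman-Pearson test statistic.
accuracy = np.mean((w * y + b > 0).astype(int) == labels)
```

The paper's second half, training the waveform with reinforcement learning for a fixed detector, is omitted here; this sketch only shows why cross-entropy supervision approximates the Neyman-Pearson detector.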

    CFARnet: deep learning for target detection with constant false alarm rate

    We consider the problem of learning detectors with a constant false alarm rate (CFAR). Classical model-based solutions to composite hypothesis testing are sensitive to imperfect models and are often computationally expensive. In contrast, data-driven machine learning is often more robust and yields classifiers with fixed computational complexity. Learned detectors, however, usually do not have the CFAR property required in many applications. To close this gap, we introduce CFARnet, in which the loss function is penalized to promote similar distributions of the detector output under any null-hypothesis scenario. Asymptotic analysis in the case of linear models with general Gaussian noise reveals that the classical generalized likelihood ratio test (GLRT) is in fact a minimizer of the CFAR-constrained Bayes risk. Experiments on both synthetic data and real hyperspectral images show that CFARnet leads to near-CFAR detectors with accuracy similar to that of their competitors.
    Comment: arXiv admin note: substantial text overlap with arXiv:2206.0574
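One hedged sketch of the penalization idea (a hypothetical moment-matching form, not necessarily the paper's exact objective): add to the detection loss a term that grows when the detector's score distribution differs between two noise-only scenarios, so that a single threshold yields a similar false-alarm rate in both:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cfar_penalized_loss(s_h1, s_h0_a, s_h0_b, lam=10.0):
    """Cross-entropy detection loss plus a moment-matching penalty that
    pushes the detector scores under two different null (noise-only)
    scenarios, a and b, toward the same distribution."""
    p1 = sigmoid(np.asarray(s_h1))
    p0 = sigmoid(np.concatenate([s_h0_a, s_h0_b]))
    bce = -np.mean(np.log(p1 + 1e-12)) - np.mean(np.log(1.0 - p0 + 1e-12))
    gap = (np.mean(s_h0_a) - np.mean(s_h0_b)) ** 2 \
        + (np.std(s_h0_a) - np.std(s_h0_b)) ** 2
    return bce + lam * gap
```

Matching only the first two moments is a crude surrogate for matching full distributions, but it shows how a CFAR-promoting term can be bolted onto an ordinary detection loss.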

    Noise-Enhanced Information Systems

    Noise, traditionally defined as an unwanted signal or disturbance, has been shown to play an important constructive role in many information processing systems and algorithms. This noise enhancement has been observed and exploited in many physical, biological, and engineered systems. Indeed, stochastic facilitation (SF) has been found critical for certain biological information functions, such as the detection of weak subthreshold stimuli or suprathreshold signals, through both experimental verification and analytical model simulations. In this paper, we present a systematic noise-enhanced information processing framework to analyze and optimize the performance of engineered systems. System performance is evaluated not only in terms of signal-to-noise ratio but also in terms of more relevant metrics such as the probability of error for signal detection or the mean square error for parameter estimation. As an important new instance of SF, we also discuss the constructive effect of noise in associative memory recall. Potential enhancement of image processing systems via the addition of noise is discussed, with important applications in biomedical image enhancement, image denoising, and classification.
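A minimal numerical illustration of stochastic facilitation (a toy hard-threshold detector, not a model from the paper): a constant signal sits below the threshold, so without noise it is never detected; a moderate amount of added noise maximizes the gap between the hit rate and the false-alarm rate, and excessive noise washes it out again:

```python
import numpy as np

rng = np.random.default_rng(1)

AMP, THR = 0.5, 1.0   # subthreshold signal amplitude and hard threshold

def hit_minus_fa(noise_std, trials=20000):
    """Crude detectability: threshold-crossing rate with the signal
    present minus the rate with noise alone (false alarms)."""
    hits = np.mean(AMP + rng.normal(0.0, noise_std, trials) > THR)
    fas = np.mean(rng.normal(0.0, noise_std, trials) > THR)
    return hits - fas

# Detectability is ~0 with almost no noise, peaks at moderate noise,
# and vanishes again when the noise dominates both hypotheses.
```

This inverted-U dependence on noise intensity is the signature of stochastic resonance; the paper's point is that the same effect can be engineered deliberately, with metrics such as error probability rather than SNR.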

    A detection-based pattern recognition framework and its applications

    The objective of this dissertation is to present a detection-based pattern recognition framework and demonstrate its applications in automatic speech recognition and broadcast news video story segmentation. Inspired by studies in modern cognitive psychology and by real-world pattern recognition systems, a detection-based pattern recognition framework is proposed as an alternative solution for complicated pattern recognition problems. Primitive features are first detected and the task-specific knowledge hierarchy is constructed level by level; a variety of heterogeneous information sources are then combined, and high-level context is incorporated as additional information at certain stages. A detection-based framework is a "divide-and-conquer" design paradigm for pattern recognition problems, which decomposes a conceptually difficult problem into many elementary sub-problems that can be handled directly and reliably. Information fusion strategies are employed to integrate the evidence from a lower level into evidence at a higher level, and this fusion procedure continues until the top level is reached. In general, a detection-based framework has several advantages: (1) more flexibility in both detector design and fusion strategies, since the two parts can be optimized separately; (2) parallel and distributed computation in primitive feature detection; in such a component-based framework, any primitive component can be replaced by a new one while the other components remain unchanged; (3) incremental information integration; and (4) high-level context as an additional information source that can be combined with bottom-up processing at any stage. This dissertation presents the basic principles, criteria, and techniques for detector design and hypothesis verification based on statistical detection and decision theory. In addition, evidence fusion strategies were investigated.
    Several novel detection algorithms and evidence fusion methods were proposed, and their effectiveness was demonstrated in automatic speech recognition and broadcast news video segmentation systems. We believe such a detection-based framework can be employed in more applications in the future.
    Ph.D. Committee Chair: Lee, Chin-Hui; Committee Member: Clements, Mark; Committee Member: Ghovanloo, Maysam; Committee Member: Romberg, Justin; Committee Member: Yuan, Min
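The divide-and-conquer structure described above can be sketched in a few lines (toy detectors and a weighted-average fusion rule chosen purely for illustration; the dissertation's actual detectors and fusion strategies are more elaborate):

```python
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class Detector:
    name: str
    score: Callable[[dict], float]   # raw observation -> evidence in [0, 1]

def fuse(scores: Sequence[float], weights: Sequence[float]) -> float:
    """One elementary fusion strategy (weighted averaging). Any rule can
    be swapped in here without touching the individual detectors."""
    return sum(w * s for w, s in zip(weights, scores)) / sum(weights)

# Level 1: primitive feature detectors (toy stand-ins for, e.g., speech cues).
energy = Detector("energy", lambda obs: min(obs["energy"] / 10.0, 1.0))
voicing = Detector("voicing", lambda obs: 1.0 if obs["voiced"] else 0.0)

obs = {"energy": 6.0, "voiced": True}
level1 = [d.score(obs) for d in (energy, voicing)]
speech_evidence = fuse(level1, weights=[1.0, 2.0])   # level-2 evidence
```

Because each detector is an independent component behind a fixed interface, replacing one (or the fusion rule) leaves the rest of the hierarchy unchanged, which is the modularity the dissertation argues for.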

    Decision Fusion in Non-stationary Environments

    A parallel distributed detection system consists of multiple local sensors/detectors that observe a phenomenon and process the gathered observations using built-in processing capabilities. The output of the local processing is transmitted from each sensor/detector to a centrally located data fusion center for integration and decision making. The fusion center uses a specific optimization criterion to reach global decisions about the environment seen by the sensors/detectors. In this study, the overall objective is to make a globally optimal binary (target/non-target) decision with respect to a Bayesian cost, or to satisfy the Neyman-Pearson criterion. We also note that in some cases a globally optimal Bayesian decision is either undesirable or impractical, in which case other criteria or localized decisions are used. In this thesis, we investigate the development of several fusion algorithms under different constraints, including sequential availability of data and a dearth of statistical information. The main contributions of this study are: (1) an algorithm that provides a globally optimal solution for local detector design satisfying a Neyman-Pearson criterion for systems with identical local sensors; (2) an adaptive fusion algorithm that fuses local decisions without prior knowledge of the local sensor performance; and (3) a fusion rule that applies a genetic algorithm. In addition, we develop a parallel decision fusion system in which each local sensor is a sequential decision maker that implements the modified Wald's sequential probability ratio test (SPRT) proposed by Lee and Thomas (1984).
    Ph.D., Electrical Engineering -- Drexel University, 201
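The classical fusion rule for this setting, due to Chair and Varshney, combines the local binary decisions with weights derived from each sensor's detection and false-alarm probabilities. A minimal sketch, assuming the Pd_i and Pf_i are known and the local decisions are conditionally independent:

```python
import math

def chair_varshney(decisions, pd, pf, prior_h1=0.5):
    """Chair-Varshney fusion of local binary decisions u_i (1 = target):
    a "1" from sensor i contributes log(Pd_i / Pf_i) and a "0"
    contributes log((1 - Pd_i) / (1 - Pf_i)); declare a target when the
    total log-likelihood ratio plus the prior term is positive."""
    llr = math.log(prior_h1 / (1.0 - prior_h1))
    for u, d, f in zip(decisions, pd, pf):
        llr += math.log(d / f) if u == 1 else math.log((1.0 - d) / (1.0 - f))
    return 1 if llr > 0 else 0
```

Note how the weights let one highly reliable sensor outvote several unreliable ones, which a simple majority rule cannot do; the adaptive algorithm in contribution (2) targets exactly the case where these Pd_i and Pf_i are not known in advance.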