    A bayesian approach to adaptive detection in nonhomogeneous environments

    We consider the adaptive detection of a signal of interest embedded in colored noise when the environment is nonhomogeneous, i.e., when the training samples used for adaptation do not share the same covariance matrix as the vector under test. A Bayesian framework is proposed in which the covariance matrices of the primary and secondary data are assumed to be random, with an appropriate joint distribution. The prior distributions of these matrices require only rough knowledge of the environment, providing a flexible yet simple knowledge-aided model in which the degree of nonhomogeneity can be tuned through a few scalar variables. Within this framework, an approximate generalized likelihood ratio test is formulated. Accordingly, two Bayesian versions of the adaptive matched filter are presented, in which the conventional maximum likelihood estimate of the primary data covariance matrix is replaced by either its minimum mean-square error (MMSE) estimate or its maximum a posteriori (MAP) estimate. Both detectors require generating samples distributed according to the joint posterior distribution of the primary and secondary data covariance matrices, which is achieved through a Gibbs sampling strategy. Numerical simulations illustrate the performance of these detectors and compare them with the conventional adaptive matched filter.
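    The Gibbs-based MMSE detector described above can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the two conditional samplers (`sample_primary_cov`, `sample_secondary_cov`) are hypothetical stand-ins for the paper's model-specific full conditionals, and the test statistic is the standard adaptive matched filter form |s^H R^{-1} x|^2 / (s^H R^{-1} s).

```python
import numpy as np

def amf_statistic(x, s, R):
    """Adaptive matched filter statistic |s^H R^{-1} x|^2 / (s^H R^{-1} s)."""
    Rinv_x = np.linalg.solve(R, x)
    Rinv_s = np.linalg.solve(R, s)
    return np.abs(s.conj() @ Rinv_x) ** 2 / np.real(s.conj() @ Rinv_s)

def bayesian_amf_mmse(x, s, secondary, sample_primary_cov, sample_secondary_cov,
                      n_iter=1000, burn_in=200):
    """Gibbs loop: alternate draws from the two full conditionals, then plug the
    posterior-mean (MMSE) estimate of the primary covariance into the AMF."""
    N = x.shape[0]
    Mp = np.eye(N, dtype=complex)    # primary covariance, initial guess
    Ms = np.eye(N, dtype=complex)    # secondary covariance, initial guess
    draws = []
    for it in range(n_iter):
        Mp = sample_primary_cov(Ms, x)            # draw Mp | Ms, x (model-specific)
        Ms = sample_secondary_cov(Mp, secondary)  # draw Ms | Mp, training data
        if it >= burn_in:
            draws.append(Mp)
    Mp_mmse = np.mean(draws, axis=0)              # MMSE estimate from the chain
    return amf_statistic(x, s, Mp_mmse)
```

    The MAP variant of the abstract would replace the posterior mean with a posterior-mode estimate computed from the same chain.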

    Using epidemic prevalence data to jointly estimate reproduction and removal

    This study proposes a nonhomogeneous birth-death model which captures the dynamics of a directly transmitted infectious disease. Our model accounts for an important aspect of observed epidemic data: only symptomatic infecteds are observed. The nonhomogeneous birth-death process depends on survival distributions of reproduction and removal, which jointly yield an estimate of the effective reproduction number R(t) as a function of epidemic time. We employ the Burr distribution family for the survival functions; as special cases, proportional rate and accelerated event-time models are also employed in the parameter estimation procedure. As an example, our model is applied to an outbreak of avian influenza (H7N7) in the Netherlands in 2003, confirming that the conditional estimate of R(t) declined below unity for the first time on day 23 after the detection of the index case. Comment: Published at http://dx.doi.org/10.1214/09-AOAS270 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
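    As a rough illustration of the survival-model ingredients named above, here is a sketch of the Burr type-XII survival function together with proportional-rate and accelerated event-time modifications. All parameter names and values are hypothetical, and how the paper combines the reproduction and removal survival functions into R(t) is not reproduced here.

```python
import numpy as np

def burr_survival(t, c, k, lam=1.0):
    """Burr type-XII survival function S(t) = [1 + (t/lam)^c]^(-k)."""
    return (1.0 + (np.asarray(t, dtype=float) / lam) ** c) ** (-k)

def proportional_rate(t, theta, c, k, lam=1.0):
    """Proportional-rate (proportional-hazards) special case: S(t)^theta."""
    return burr_survival(t, c, k, lam) ** theta

def accelerated_time(t, phi, c, k, lam=1.0):
    """Accelerated event-time special case: the time axis is rescaled, S(t/phi)."""
    return burr_survival(np.asarray(t, dtype=float) / phi, c, k, lam)

# Illustrative evaluation over epidemic time (days); parameters are made up.
days = np.arange(0, 30)
print(burr_survival(days, c=2.0, k=1.5, lam=7.0))
```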

    Nonlinear analysis of spacecraft thermal models

    We study the differential equations of lumped-parameter models of spacecraft thermal control. First, we consider a satellite model consisting of two isothermal parts (nodes): an outer part that absorbs heat from the environment as radiation of various types and radiates heat as a black body, and an inner part that simply dissipates heat at a constant rate. The resulting system of two nonlinear ordinary differential equations for the satellite's temperatures is analyzed with various methods, which prove that the temperatures approach a steady state if the heat input is constant, whereas they approach a limit cycle if it varies periodically. Second, we generalize those methods to study a many-node thermal model of a spacecraft: this model also has a stable steady state under constant heat inputs, which becomes a limit cycle if the inputs vary periodically. Finally, we propose new numerical analyses of spacecraft thermal models based on our results, to complement the analyses normally carried out with commercial software packages. Comment: 29 pages, 4 figures
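    A minimal sketch of the kind of two-node model described above, assuming a standard lumped-parameter energy balance: grey-body radiation from the outer node, constant dissipation in the inner node, and conductive coupling between them. All numerical values are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

# Illustrative parameters (hypothetical):
C1, C2 = 5.0e4, 2.0e4    # heat capacities of outer/inner node, J/K
K12 = 20.0               # conductive coupling between the nodes, W/K
AREA, EMISS = 10.0, 0.8  # radiating area (m^2) and emissivity of the outer node
Q_INT = 300.0            # constant internal dissipation, W

def q_ext(t, mean=1000.0, amp=400.0, period=5400.0):
    """Periodic absorbed heat input (e.g. an orbital eclipse cycle), illustrative."""
    return mean + amp * np.sin(2.0 * np.pi * t / period)

def rhs(t, T):
    """Two-node balance: the outer node absorbs q_ext and radiates ~ T^4,
    the inner node dissipates at a constant rate; nodes exchange heat by conduction."""
    T1, T2 = T
    dT1 = (q_ext(t) - EMISS * SIGMA * AREA * T1**4 + K12 * (T2 - T1)) / C1
    dT2 = (Q_INT + K12 * (T1 - T2)) / C2
    return [dT1, dT2]

sol = solve_ivp(rhs, (0.0, 20 * 5400.0), [290.0, 300.0], rtol=1e-8)
print(sol.y[:, -1])  # temperatures after 20 forcing periods
```

    With a constant q_ext the trajectories settle to a steady state; with the periodic input above they approach a limit cycle, consistent with the behavior the abstract proves.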

    Note on Logarithmic Switchback Terms in Regular and Singular Perturbation Expansions

    The occurrence of logarithmic switchback is studied for ordinary differential equations containing a parameter k, which is allowed to take any value in a continuum of real numbers, and with boundary conditions imposed at x = Δ and x = ∞. Classical theory tells us that if the equation has a regular singular point at the origin, there is a family of solutions which varies continuously with k, and the expansion around the origin has log x terms for a discrete set of values of k. It is shown here how nonlinearity enlarges this set so that it may even be dense in some interval of the real numbers. A log x term in the expansion in x leads to expansion coefficients containing log Δ (switchback) in the perturbation expansion. If logarithmic terms in x and Δ occur for a given value of k, they may be obtained by continuity from neighboring values of k. Switchback terms have occurred conspicuously in singular-perturbation solutions of problems posed on the semi-infinite domain x ≥ Δ; this connection is historical rather than logical. In particular, we study switchback terms for a specific example using methods of both singular and regular perturbations
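    A schematic of what "switchback" means here, written in the abstract's notation but not reproducing the paper's specific equations: a log x term in the local expansion near the origin forces log Δ factors into the coefficients of the perturbation expansion once the boundary condition at x = Δ is imposed.

```latex
% Schematic only: suppose the local expansion near the origin contains a log term,
\[
  u(x) \sim c_0 + c_1 x^{k} + c_2\, x^{k}\log x + \cdots \qquad (x \to 0).
\]
% Imposing the boundary condition at x = \Delta then produces perturbation
% coefficients containing \log\Delta ("switchback"):
\[
  u(x;\Delta) \sim u_0(x) + \Delta^{k}\log\Delta\, u_1(x) + \Delta^{k}\, u_2(x) + \cdots .
\]
```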

    Binary Biometrics: An Analytic Framework to Estimate the Performance Curves Under Gaussian Assumption

    In recent years, the protection of biometric data has gained increased interest from the scientific community. Methods such as the fuzzy commitment scheme, helper-data system, fuzzy extractors, fuzzy vault, and cancelable biometrics have been proposed for protecting biometric data. Most of these methods use cryptographic primitives or error-correcting codes (ECCs) and use a binary representation of the real-valued biometric data. Hence, the difference between two biometric samples is given by the Hamming distance (HD), or number of bit errors, between the binary vectors obtained from the enrollment and verification phases, respectively. If the HD is smaller (larger) than the decision threshold, then the subject is accepted (rejected) as genuine. Because of the use of ECCs, this decision threshold is limited to the maximum error-correcting capacity of the code, consequently limiting the tradeoff between the false rejection rate (FRR) and the false acceptance rate (FAR). A method to improve the FRR consists of using multiple biometric samples in either the enrollment or verification phase: the noise is suppressed, reducing the number of bit errors and decreasing the HD. In practice, the number of samples is chosen empirically, without fully considering its fundamental impact. In this paper, we present a Gaussian analytical framework for estimating the performance of a binary biometric system given the number of samples used in the enrollment and verification phases. The detection error tradeoff (DET) curve, which combines the false acceptance and false rejection rates, is estimated to assess the system performance. The analytic expressions are validated using the Face Recognition Grand Challenge v2 and Fingerprint Verification Competition 2000 biometric databases.
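    The accept/reject rule described above (accept iff HD ≤ threshold) is easy to make concrete. The sketch below uses a binomial bit-error model, i.e., i.i.d. bit errors with probability `p_genuine` for genuine comparisons and `p_impostor` ≈ 0.5 for impostor comparisons; the i.i.d. assumption and all parameter values are ours, not the paper's (which derives its curves under a Gaussian model).

```python
from scipy.stats import binom

def far_frr(n_bits, threshold, p_genuine, p_impostor=0.5):
    """Bit errors modeled as i.i.d. Bernoulli, so the Hamming distance is
    binomial. Accept iff HD <= threshold:
      FRR = P(HD > threshold | genuine), FAR = P(HD <= threshold | impostor)."""
    frr = binom.sf(threshold, n_bits, p_genuine)    # P(HD > t) for a genuine user
    far = binom.cdf(threshold, n_bits, p_impostor)  # P(HD <= t) for an impostor
    return far, frr

# Sweep the threshold to trace a DET-style curve (parameters hypothetical):
n = 511
for t in range(0, n + 1, 64):
    far, frr = far_frr(n, t, p_genuine=0.15)
    print(f"t={t:3d}  FAR={far:.3e}  FRR={frr:.3e}")
```

    Sweeping the threshold traces out the tradeoff curve; averaging multiple enrollment or verification samples would enter this picture by lowering the effective genuine bit-error probability.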