    A subpixel target detection algorithm for hyperspectral imagery

    The goal of this research is to develop a new algorithm for the detection of subpixel-scale target materials in hyperspectral imagery. Signal decision theory is typically used to decide whether a target signal is present in random noise, so the detection problem can be formalized mathematically as a statistical hypothesis test. In particular, since any target signature provided by airborne/spaceborne sensors is embedded in structured noise, such as background or clutter signatures, as well as broadband unstructured noise, the problem becomes more complicated, especially when the noise structure is unknown. The approach is based on the statistical hypothesis testing method known as the Generalized Likelihood Ratio Test (GLRT). The GLRT requires estimating the unknown parameters and assumes prior knowledge of two subspaces describing target variation and background variation, respectively. Therefore, this research consists of two parts: the implementation of the GLRT and the characterization of the two subspaces through new approaches. Results obtained from computer simulations, a HYDICE image, and an AVIRIS image show that this approach is feasible.
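
    For context, the detection problem described here is commonly posed as a two-hypothesis test over a target subspace and a background subspace. The following is a generic sketch of such a GLRT formulation; the symbols S, B, a, b, n, and the noise model are illustrative assumptions, not notation taken from the paper:

        H_0:\ \mathbf{x} = \mathbf{B}\mathbf{b} + \mathbf{n},
        \qquad
        H_1:\ \mathbf{x} = \mathbf{S}\mathbf{a} + \mathbf{B}\mathbf{b} + \mathbf{n}

        \Lambda(\mathbf{x}) =
          \frac{\max_{\mathbf{a},\,\mathbf{b},\,\sigma^{2}} p(\mathbf{x}\mid H_1)}
               {\max_{\mathbf{b},\,\sigma^{2}} p(\mathbf{x}\mid H_0)}
          \;\underset{H_0}{\overset{H_1}{\gtrless}}\; \eta

    Here S spans the target-variation subspace, B the background/clutter subspace, and n the unstructured noise; the unknown parameters are replaced by their maximum-likelihood estimates under each hypothesis before the ratio is compared to a threshold.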

    A Robotic Test of Proprioception within the Hemiparetic Arm Post-stroke

    Background: Proprioception plays important roles in the planning and control of limb posture and movement. The impact of proprioceptive deficits on motor function post-stroke has been difficult to elucidate due to limitations in current tests of arm proprioception. Common clinical tests provide only an ordinal assessment of proprioceptive integrity (e.g., intact, impaired, or absent). We introduce a standardized, quantitative method for evaluating proprioception within the arm on a continuous, ratio scale. We demonstrate the approach, which is based on the signal detection theory of sensory psychophysics, in two tasks used to characterize motor function after stroke. Methods: Hemiparetic stroke survivors and neurologically intact participants attempted to detect displacement or force perturbations robotically applied to their arm in a two-interval, two-alternative forced-choice test. A logistic psychometric function parameterized detection of limb perturbations. The shape of this function is determined by two parameters: one corresponds to a signal detection threshold and the other to the variability of responses about that threshold. These two parameters define a space in which proprioceptive sensation post-stroke can be compared to that of neurologically intact people. We used an auditory tone discrimination task to control for potential comprehension, attention, and memory deficits. Results: All but one stroke survivor demonstrated competence in performing two-alternative discrimination in the auditory training test. Among the remaining stroke survivors, those with clinically identified proprioceptive deficits in the hemiparetic arm or hand had higher detection thresholds and exhibited greater response variability than individuals without proprioceptive deficits. We then identified a normative parameter space determined by the threshold and response variability data collected from neurologically intact participants. By plotting displacement detection performance within this normative space, stroke survivors with and without intact proprioception could be discriminated on a continuous scale that was sensitive to small performance variations, e.g., practice effects across days. Conclusions: The proposed method uses robotic perturbations similar to those used in ongoing studies of motor function post-stroke. The approach is sensitive to small changes in the proprioceptive detection of hand motions. We expect this new robotic assessment will empower future studies to characterize how proprioceptive deficits compromise limb posture and movement control in stroke survivors.
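
    To make the two-parameter description concrete, the sketch below fits a two-alternative forced-choice logistic psychometric function in Python. The data values and the exact parameterization are illustrative assumptions, not the authors' model or measurements.

        # Minimal sketch of a 2AFC logistic psychometric function:
        # alpha plays the role of a detection threshold, beta the response
        # variability (spread of responses about that threshold).
        import numpy as np
        from scipy.optimize import curve_fit

        def psychometric(stimulus, alpha, beta):
            # 0.5 guessing floor for two-alternative forced choice,
            # rising toward 1.0 as the perturbation grows.
            return 0.5 + 0.5 / (1.0 + np.exp(-(stimulus - alpha) / beta))

        # Illustrative (made-up) perturbation magnitudes and proportions correct.
        magnitude = np.array([0.5, 1.0, 1.5, 2.0, 3.0, 4.0])
        p_correct = np.array([0.50, 0.55, 0.70, 0.85, 0.95, 1.00])

        (alpha_hat, beta_hat), _ = curve_fit(psychometric, magnitude, p_correct, p0=[2.0, 0.5])
        print(f"threshold ~ {alpha_hat:.2f}, variability ~ {beta_hat:.2f}")

    Plotting each participant's fitted (threshold, variability) pair against the cloud of values from neurologically intact participants is the kind of normative-space comparison the abstract describes.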

    Test signal generation for analog circuits

    In this paper, a new test signal generation approach for general analog circuits, based on variational calculus and modern control theory, is presented. The computed transient test signals, also called test stimuli, are optimal with respect to the detection of a given fault set, as measured by a predefined merit functional representing a fault detection criterion. The test signal generation problem of finding optimal test stimuli that detect all faults from the fault set is formulated as an optimal control problem. The solution of this optimal control problem, which represents the test stimuli, is computed using an optimization procedure based on the necessary conditions for optimality, namely Pontryagin's maximum principle and the adjoint circuit equations.
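
    As a point of reference, an optimal-control formulation of this kind can be sketched as follows; the notation (state x, stimulus u, merit integrand Phi) is generic and illustrative, not the paper's exact merit functional:

        \max_{u(t)}\; J(u) = \int_{0}^{T} \Phi\bigl(x(t), u(t)\bigr)\, dt
        \quad \text{subject to} \quad \dot{x} = f(x, u),\; x(0) = x_{0}

        H(x, \lambda, u) = \Phi(x, u) + \lambda^{\mathsf{T}} f(x, u),
        \qquad
        \dot{\lambda} = -\,\frac{\partial H}{\partial x},\quad \lambda(T) = 0,
        \qquad
        u^{*}(t) = \arg\max_{u}\, H\bigl(x^{*}(t), \lambda(t), u\bigr)

    The costate equation for lambda corresponds to the adjoint circuit equations mentioned above, and the pointwise maximization of the Hamiltonian is Pontryagin's maximum principle.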

    BAYESIAN APPROACH TO THE MIXTURE OF GAUSSIAN RANDOM FIELDS AND ITS APPLICATION TO A FUNCTIONAL MAGNETIC RESONANCE IMAGING STUDY

    Due to the functional nature of fMRI data, random field theory is used as a remedy for the multiple comparisons problem in brain signal detection. Traditionally, a Gaussian random field model is fitted to the functional data using this approach. However, fMRI data are not homogeneous, and multiple underlying classes exist in functional data, so traditional inferential methods may fail. Here, we propose a new model for signal detection in fMRI data that addresses the heterogeneity in such data. The proposed model is a mixture of two Gaussian random fields. We develop a Bayesian approach for hypothesis testing using the notion of the Bayes factor in infinite-dimensional parameter spaces. For such spaces, the Bayes factor is defined via the Radon-Nikodym derivative. In our model, the Bayes factor is interpreted as the inverse of the expected value of a likelihood ratio with respect to the prior density of the model parameters. Obtaining the Bayes factor in infinite-dimensional parameter spaces is not analytically tractable, so we compute it through numerical methods. Our methodology is empirically justified by Monte Carlo simulations and illustrated by an analysis of a simulated dataset.
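
    The "inverse of an expected likelihood ratio" reading lends itself to a simple Monte Carlo estimate. The sketch below is schematic, with sample_prior and likelihood_ratio as hypothetical placeholders rather than the authors' implementation:

        # Schematic Monte Carlo estimate of a Bayes factor of the form
        #   BF ~= 1 / E_prior[ likelihood_ratio(theta) ],
        # approximating the expectation by averaging over prior draws.
        import numpy as np

        def estimate_bayes_factor(sample_prior, likelihood_ratio, n_draws=10_000, seed=0):
            rng = np.random.default_rng(seed)
            draws = (sample_prior(rng) for _ in range(n_draws))        # theta ~ prior
            ratios = np.fromiter((likelihood_ratio(t) for t in draws), dtype=float)
            return 1.0 / ratios.mean()                                 # inverse of E[LR]

    In practice, the likelihood ratio would compare the two competing random-field models at each prior draw, and many more draws (or variance-reduction techniques) would be needed to stabilize the estimate.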