
    Utilising temporal signal features in adverse noise conditions: Detection, estimation, and the reassigned spectrogram

    Visual displays in passive sonar based on the Fourier spectrogram are underpinned by detection models that rely on signal and noise power statistics. Time-frequency representations specialised for sparse signals achieve a sharper signal representation, either by reassigning signal energy based on temporal structure or by conveying temporal structure directly. However, temporal representations involve nonlinear transformations that make it difficult to reason about how they respond to additive noise. This article analyses the effect of noise on temporal fine structure measurements such as zero crossings and instantaneous frequency. Detectors that rely on zero crossing intervals, on intervals and peak amplitudes, and on instantaneous frequency measurements are developed and evaluated for the detection of a sinusoid in Gaussian noise, using the power detector as a baseline. Detectors that rely on fine structure outperform the power detector under certain circumstances, and detectors that rely on both fine structure and power measurements are superior. Reassigned spectrograms assume that the statistics used to reassign energy are reliable, but the derivation of the fine structure detectors indicates the opposite. The article closes by proposing and demonstrating the concept of a doubly reassigned spectrogram, wherein temporal measurements are reassigned according to a statistical model of the noise background.
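    As a rough illustration of the kind of comparison the article describes, the sketch below runs a small Monte Carlo experiment that pits a block power detector against a zero-crossing-interval detector for a sinusoid in white Gaussian noise. The interval statistic (the variance of zero-crossing spacings), the SNR, the block length and the thresholds are assumptions made for this example; they are not the detectors derived in the article.

```python
# Minimal Monte Carlo sketch (not from the article): a block power detector
# versus a zero-crossing-interval detector for a sinusoid in white Gaussian
# noise. All parameters and the interval statistic are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
fs, f0, n, snr_db, trials = 1000.0, 50.0, 256, -3.0, 2000
t = np.arange(n) / fs
amp = np.sqrt(2.0 * 10.0 ** (snr_db / 10.0))   # unit-variance noise assumed

def power_stat(x):
    """Average power of the block (large under H1)."""
    return np.mean(x ** 2)

def zc_stat(x):
    """Variance of spacings between zero crossings (small when a tone dominates)."""
    idx = np.flatnonzero(np.diff(np.sign(x)))
    if idx.size < 3:
        return np.inf
    return np.var(np.diff(idx))

def trial(signal_present):
    x = rng.standard_normal(n)
    if signal_present:
        x += amp * np.sin(2 * np.pi * f0 * t)
    return power_stat(x), zc_stat(x)

h0 = np.array([trial(False) for _ in range(trials)])
h1 = np.array([trial(True) for _ in range(trials)])

# Fix the false-alarm rate at 10% and compare detection probabilities.
for name, col, larger_is_signal in [("power", 0, True), ("zero-crossing", 1, False)]:
    if larger_is_signal:
        thresh = np.quantile(h0[:, col], 0.9)
        pd = np.mean(h1[:, col] > thresh)
    else:
        thresh = np.quantile(h0[:, col], 0.1)
        pd = np.mean(h1[:, col] < thresh)
    print(f"{name} detector: Pd = {pd:.2f} at Pfa = 0.10")
```

    Holding the false-alarm rate fixed and comparing detection probabilities is the baseline-versus-fine-structure comparison framework the abstract describes.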

    Time domain threshold crossing for signals in noise

    This work investigates the discrimination of times between threshold crossings for deterministic periodic signals with added band-limited noise. The methods apply at very low signal-to-noise ratios (one or less). The investigation concentrates on the theory of double threshold crossings, with particular care taken over correlations in the noise and their effect on the probability of detecting double crossings. A computer program has been written to evaluate these probabilities for a wide range of signal-to-noise ratios, a wide range of signal-to-bandwidth ratios, and times between crossings of up to two signal periods. Correlations due to the extreme cases of a brick-wall filter and a second-order Butterworth filter have been included; other filters can easily be added to the program. The method is demonstrated by implementation on a digital signal processor (DSP), a TMS32020, and results from the DSP implementation agree with the theoretical evaluations. The probability results could be used to determine optimum time thresholds and windows for signal detection and frequency discrimination, to determine the signal length needed for adequate discrimination, and to evaluate channel capacities. The ability to treat high noise, including the exact effects of time correlations, promises new applications in electronic signal detection, communications, and pulse-discrimination neural networks.
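    The following sketch is a rough simulation in the spirit of the DSP demonstration mentioned above: a sinusoid at a signal-to-noise ratio of one in noise band-limited by a second-order Butterworth filter, with the times between successive upward threshold crossings collected and compared against the signal period. All parameter values (sample rate, threshold, cutoff frequency) are illustrative assumptions, not those used in the thesis.

```python
# Illustrative simulation (assumed parameters, not the thesis code): times
# between upward threshold crossings for a sinusoid in band-limited Gaussian
# noise, with the noise shaped by a second-order Butterworth low-pass filter.
import numpy as np
from scipy.signal import butter, lfilter

rng = np.random.default_rng(1)
fs, f0, duration, snr = 10_000.0, 100.0, 5.0, 1.0      # SNR of one (0 dB)
t = np.arange(int(fs * duration)) / fs

b, a = butter(2, 500.0, btype="low", fs=fs)             # band-limit the noise
noise = lfilter(b, a, rng.standard_normal(t.size))
noise *= 1.0 / np.std(noise)                            # unit noise power
amp = np.sqrt(2.0 * snr)                                # sinusoid amplitude for this SNR
x = amp * np.sin(2 * np.pi * f0 * t) + noise

threshold = 0.5 * amp
above = x > threshold
up = np.flatnonzero(~above[:-1] & above[1:])            # upward crossing indices
intervals = np.diff(up) / fs                            # times between crossings

period = 1.0 / f0
near_period = np.mean(np.abs(intervals - period) < 0.1 * period)
print(f"{intervals.size} crossing intervals; "
      f"{near_period:.0%} fall within 10% of the signal period")
```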

    Development of reliability methodology for systems engineering. Volume III - Theoretical investigations - An approach to a class of reliability problems Final report

    Random quantities from a continuous-time stochastic process, with application to reliability and probability

    A study of speech probability distributions, by W.B. Davenport, Jr.

    "August 25, 1950."Bibliography: p. 75-76.Army Signal Corps Contract No. W-36-039-sc-32037 Project No. 102B. Dept. of the Army Project No. 3-99-10-022

    The BayesWave analysis pipeline in the era of gravitational wave observations

    We describe updates and improvements to the BayesWave gravitational wave transient analysis pipeline, and provide examples of how the algorithm is used to analyze data from ground-based gravitational wave detectors. BayesWave models gravitational wave signals in a morphology-independent manner through a sum of frame functions, such as Morlet-Gabor wavelets or chirplets. BayesWave models the instrument noise using a combination of a parametrized Gaussian noise component and non-stationary and non-Gaussian noise transients. Both the signal model and noise model employ trans-dimensional sampling, with the complexity of the model adapting to the requirements of the data. The flexibility of the algorithm makes it suitable for a variety of analyses, including reconstructing generic unmodeled signals, cross-checking modeled analyses for compact binaries, and separating coherent signals from incoherent instrumental noise transients (glitches). The BayesWave model has been extended to account for gravitational wave signals with generic polarization content and the simultaneous presence of signals and glitches in the data. We describe updates in the BayesWave prior distributions, sampling proposals, and burn-in stage that provide significantly improved sampling efficiency. We present standard review checks indicating the robustness and convergence of the BayesWave trans-dimensional sampler.
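    To make the signal model concrete, the sketch below builds a waveform as a fixed sum of Morlet-Gabor wavelets of the kind the abstract mentions. It is not BayesWave code: the parameterisation and the values are assumptions for illustration, and in the real pipeline a trans-dimensional sampler chooses the number of wavelets and their parameters rather than a hand-written list.

```python
# A minimal sketch (not BayesWave code) of a morphology-independent signal
# model: a waveform represented as a sum of Morlet-Gabor wavelets, each
# described by a small set of parameters. Names and values are illustrative.
import numpy as np

def morlet_gabor(t, t0, f0, q, amp, phi):
    """One Morlet-Gabor wavelet: Gaussian envelope times a sinusoid."""
    tau = q / (2 * np.pi * f0)                # envelope width from quality factor
    return amp * np.exp(-((t - t0) ** 2) / (2 * tau ** 2)) * np.cos(
        2 * np.pi * f0 * (t - t0) + phi)

fs = 4096.0
t = np.arange(int(fs)) / fs                   # one second of data

# A trans-dimensional sampler would vary the number of wavelets and their
# parameters; here the list is simply fixed by hand for illustration.
wavelets = [
    dict(t0=0.40, f0=150.0, q=8.0, amp=1.0, phi=0.0),
    dict(t0=0.45, f0=220.0, q=6.0, amp=0.6, phi=1.2),
    dict(t0=0.50, f0=300.0, q=5.0, amp=0.3, phi=2.5),
]
signal_model = sum(morlet_gabor(t, **w) for w in wavelets)
print(f"peak amplitude of the summed model: {signal_model.max():.3f}")
```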

    Investigations into the Uncertainties of Interferometric Measurements of Linear and Circular Vibrations


    Electromyogram (EMG) Signal Analysis: Extraction of a Novel EMG Feature and Optimal Root Difference of Squares (RDS) Processing in Additive Noise

    Electromyogram (EMG) signals generated by human muscles can be measured on the surface of the skin and then processed for use in applications such as prosthesis control, kinesiology and diagnostic medicine. Most EMG applications extract an estimate of the EMG amplitude, defined as the time-varying standard deviation of the EMG, EMGσ. To improve the quality of EMGσ, additional signal processing techniques, such as whitening, noise reduction and additional signal features, can be incorporated into the EMGσ processing. Implementing these additional techniques improves the quality of the processed signal, but at the cost of increased computational complexity and required calibration contractions. Whitening filters are employed to temporally decorrelate the data so that the samples are statistically independent. Different types of whitening filters, linear and adaptive, and their performance have been studied previously (Clancy and Hogan; Clancy and Farry): the linear filter fails at low effort levels, and the adaptive filter requires calibration every time electrodes are removed and reapplied. To avoid the disadvantages of these earlier whitening approaches, the first signal processing technique studied herein was a universal fixed whitening filter, built from the ensemble mean of the EMG power spectral density across the 64 subjects in an existing data set. The error of the EMG-to-torque model with the universal fixed whitening filter was 4.8% maximum voluntary contraction (MVC), comparable to the 4.84% MVC error of the adaptive whitening filter; the universal fixed whitening filter thus preserves the performance of the adaptive filter but need not be calibrated for each electrode application. To optimize noise reduction, the second signal processing technique derived analytical models from resting EMG data. The probability density function of the rest contractions was very close to Gaussian, differing from a Gaussian distribution by only 1.6%. These models were then used to show that the optimal way to subtract the noise variance is to take the root of the difference between the squared signal and the noise variance (root difference of squares, RDS); if the difference is negative it is set to zero, since EMGσ cannot contain negative components. The RDS processing was applied to 0% MVC and 50% MVC data and, as expected, has a considerable impact on low-level contractions (0% MVC) but not on high-level contractions (50% MVC). The third signal processing technique was the creation of a new EMG feature from four individual signal features: EMGσ, zero crossings (ZC), slope sign changes (SSC) and waveform length (WL) were combined into a single feature intended for an end application such as modeling torque about the elbow or prosthesis control. The new feature was designed to reduce the variance of the traditional EMGσ-only feature and to eliminate the need for calibration contractions. Five different methods of combination were attempted, but none of the new EMG features improved performance in the EMG-to-torque model.
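    The root-difference-of-squares step described above amounts to subtracting the rest-noise variance from a smoothed estimate of the squared signal and clipping negative results to zero before taking the square root. The sketch below illustrates this on a simulated Gaussian EMG record; the signal model, smoothing window and noise level are assumptions for the example, not the thesis's processing chain.

```python
# Hedged sketch of the root-difference-of-squares (RDS) step on a simulated
# surface EMG amplitude estimate. The Gaussian signal model, smoothing choice
# and noise level are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(2)
fs, duration = 2000.0, 2.0
t = np.arange(int(fs * duration)) / fs

true_sigma = 0.05 + 0.5 * (t > 1.0)                  # EMG std: rest, then contraction
emg = true_sigma * rng.standard_normal(t.size)       # Gaussian EMG model
noise_sigma = 0.05                                   # measured from rest data in practice
measured = emg + noise_sigma * rng.standard_normal(t.size)

def moving_mean(x, win):
    return np.convolve(x, np.ones(win) / win, mode="same")

raw_power = moving_mean(measured ** 2, 200)          # smoothed squared signal
rds = np.sqrt(np.maximum(raw_power - noise_sigma ** 2, 0.0))   # RDS: clip negatives to zero
naive = np.sqrt(raw_power)                           # no noise subtraction

rest = t < 1.0
print("rest-level estimate without RDS:", naive[rest].mean().round(3))
print("rest-level estimate with RDS:   ", rds[rest].mean().round(3))
```

    On the rest segment the naive estimate is inflated by the additive noise, while the RDS estimate recovers a value close to the true resting EMGσ, which is the low-contraction-level effect the abstract reports.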