125 research outputs found

    Time-Frequency Analysis of the Auditory Brainstem Response

    This thesis is about time-frequency analysis of the brainstem auditory evoked potential (BAEP). The work can be divided into two parts. In the first part, a model is built up from a very simple example to a more complex one, resulting in a model consisting of a sum of sinusoids with stochastic starting points and amplitudes. Different time-frequency methods have been evaluated on these models, and the multi-window spectrogram with Hermite basis functions performs best in a realistic situation with more than one component and a high level of noise. The second part investigates real BAEP data from five patients. Each patient has two data sets: one recorded while the patient is awake and one while the patient is asleep. A hypothesis is that there exists some sort of difference between these two data sets, and it turns out that there does: the earlier peaks differ slightly in latency, and the later peaks seem to disappear in the sleeping data. This result is obtained with different time-frequency methods, of which the spectrogram and the multi-window spectrogram are the most successful. A bootstrap simulation is also attempted for one data set in order to estimate the mean and confidence bounds of each peak.
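    The multi-window spectrogram described above can be sketched as follows. This is an illustrative implementation, not the thesis code; the window length, the number of Hermite windows, and the time support `t_max` are arbitrary assumptions:

```python
import numpy as np
from scipy.signal import spectrogram

def hermite_windows(n_len, n_windows, t_max=4.0):
    """Orthonormal Hermite-function windows h_k(t) = H_k(t) exp(-t^2/2)."""
    t = np.linspace(-t_max, t_max, n_len)
    wins = []
    for k in range(n_windows):
        coef = np.zeros(k + 1)
        coef[k] = 1.0  # select the k-th Hermite polynomial H_k
        h = np.polynomial.hermite.hermval(t, coef) * np.exp(-t**2 / 2)
        wins.append(h / np.linalg.norm(h))
    return np.array(wins)

def multiwindow_spectrogram(x, fs, n_len=128, n_windows=3):
    """Average the spectrograms obtained with the first few Hermite windows."""
    S = None
    for w in hermite_windows(n_len, n_windows):
        f, t, Sxx = spectrogram(x, fs=fs, window=w, nperseg=n_len,
                                noverlap=n_len // 2)
        S = Sxx if S is None else S + Sxx
    return f, t, S / n_windows
```

Averaging over several orthogonal windows trades some time-frequency resolution for a lower-variance estimate, which is what makes the method robust at high noise levels.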

    Multi-signal Anomaly Detection for Real-Time Embedded Systems

    This thesis presents MuSADET, an anomaly detection framework targeting timing anomalies found in event traces from real-time embedded systems. The method leverages stationary event generators, signal processing, and distance metrics to classify inter-arrival time sequences as normal or anomalous. Experimental evaluation of traces collected from two real-time embedded systems provides empirical evidence of MuSADET’s anomaly detection performance. MuSADET is appropriate for embedded systems, where many event generators are intrinsically recurrent and generate stationary sequences of timestamps. To find timing anomalies, MuSADET compares the frequency-domain features of an unknown trace to a normal model trained from well-behaved executions of the system. Each signal in the analysis trace receives a normal/anomalous score, which can help engineers isolate the source of the anomaly. Empirical evidence of anomaly detection performed on traces collected from an industry-grade hexacopter and the Controller Area Network (CAN) bus deployed in a real vehicle demonstrates the feasibility of the proposed method. In all case studies, anomaly detection did not require an anomaly model while achieving high detection rates. For some of the studied scenarios, the true-positive detection rate exceeds 99%, with false-positive rates below 1%. The visualization of classification scores shows that some timing anomalies can propagate to multiple signals within the system. Comparison with a similar method, Signal Processing for Trace Analysis (SiPTA), indicates that MuSADET is superior in detection performance and provides complementary information that can help link anomalies to the process where they occurred.
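    As a rough sketch of the frequency-domain scoring idea only (this is not the MuSADET implementation; the periodogram length and the Euclidean distance metric are illustrative assumptions):

```python
import numpy as np

def periodogram_features(inter_arrivals, n_bins=64):
    """Normalized periodogram of a mean-removed inter-arrival time sequence."""
    x = np.asarray(inter_arrivals, dtype=float)
    x = x - x.mean()
    p = np.abs(np.fft.rfft(x, n=2 * n_bins))**2
    return p / (p.sum() + 1e-12)

def anomaly_score(trace, normal_traces):
    """Distance from the mean spectrum of well-behaved executions.

    A larger score suggests the signal's timing deviates from the
    normal model; a threshold on this score yields the
    normal/anomalous classification.
    """
    ref = np.mean([periodogram_features(t) for t in normal_traces], axis=0)
    return np.linalg.norm(periodogram_features(trace) - ref)
```

Because the score is computed per signal, no anomaly model is needed: only traces from well-behaved executions are required for training, consistent with the evaluation described above.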

    Uncertainty modelling in power spectrum estimation of environmental processes

    For efficient reliability analysis of buildings and structures, robust load models are required in stochastic dynamics, which can be estimated in particular from environmental processes such as earthquakes or wind loads. To determine the response behaviour of a dynamic system under such loads, the power spectral density (PSD) function is a widely used tool for identifying the frequency components and corresponding amplitudes of environmental processes. Since the real data records required for this purpose are often subject to aleatory and epistemic uncertainties, and the PSD estimation process itself can induce further uncertainties, a rigorous quantification of these is essential, as otherwise a highly inaccurate load model could be generated which may yield misleading simulation results. A system behaviour that is actually catastrophic can thus be shifted into an acceptable range, classifying the system as safe even though it is exposed to a high risk of damage or collapse. To address these issues, alternative loading models are proposed using probabilistic and non-deterministic models that are able to efficiently account for these uncertainties and to model the loadings accordingly. Various methods are used in the generation of these load models, which are selected in particular according to the characteristics of the data and the number of available records. In case multiple data records are available, reliable statistical information can be extracted from a set of similar PSD functions that differ, for instance, only slightly in shape and peak frequency. Based on these statistics, a PSD function model is derived utilising subjective probabilities to capture the epistemic uncertainties and represent this information effectively.
The spectral densities are characterised as random variables instead of employing discrete values, and thus the PSD function itself represents a non-stationary random process comprising a range of possible valid PSD functions for a given data set. If only a limited number of data records is available, it is not possible to derive such reliable statistical information. Therefore, an interval-based approach is proposed that determines only an upper and lower bound and does not rely on any distribution within these bounds. A set of discrete-valued PSD functions is transformed into an interval-valued PSD function by optimising the weights of pre-derived basis functions from a Radial Basis Function Network such that they compose an upper and lower bound that encompasses the data set. In this way, a range of possible values and system responses is identified rather than discrete values, which makes it possible to quantify the epistemic uncertainties. When generating such a load model using real data records, the problem can arise that the individual records exhibit a high spectral variance in the frequency domain and therefore differ too much from each other, although they appear to be similar in the time domain. A load model derived from these data may not cover the entire spectral range and is therefore not representative. The data are therefore grouped according to their similarity using the Bhattacharyya distance and the k-means algorithm, which may generate two or more load models from the entire data set. These can be applied separately to the structure under investigation, leading to more accurate simulation results. This approach can also be used to estimate the spectral similarity of individual data sets in the frequency domain, which is particularly relevant for the load models mentioned above. If the uncertainties are modelled directly in the time signal, it can be a challenging task to transform them efficiently into the frequency domain.
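The similarity-based grouping step can be sketched as follows. This is not the thesis implementation; the sketch exploits the fact that Euclidean k-means on square-rooted, normalized spectra corresponds to the Hellinger distance, which is monotonically related to the Bhattacharyya coefficient:

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def bhattacharyya_distance(p, q):
    """D_B = -ln(sum_i sqrt(p_i q_i)) for spectra normalized to unit mass."""
    p = p / p.sum()
    q = q / q.sum()
    return -np.log(np.sum(np.sqrt(p * q)) + 1e-12)

def group_psds(psds, k=2, seed=0):
    """Group PSD estimates by spectral similarity.

    Euclidean distance between sqrt-spectra equals (up to a constant)
    the Hellinger distance sqrt(1 - BC), so standard k-means on the
    square roots respects Bhattacharyya-style similarity.
    """
    P = np.array([p / p.sum() for p in psds])
    _, labels = kmeans2(np.sqrt(P), k, minit='++', seed=seed)
    return labels
```

Each resulting group can then be used to derive a separate, more representative load model, as described above.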
Such a signal may consist only of reliable bounds in which the actual signal lies. A method is presented that can automatically propagate this interval uncertainty through the discrete Fourier transform, obtaining the exact bounds on the Fourier amplitude and an estimate of the PSD function. The method allows such an interval signal to be propagated without making assumptions about the dependence and distribution of the error over the time steps. These novel representations of load models are able to quantify epistemic uncertainties inherent in real data records and induced by the PSD estimation process. The strengths and advantages of these approaches in practice are demonstrated by means of several numerical examples concentrated in the field of stochastic dynamics.
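The exact interval propagation through the DFT is the thesis's own contribution; as a naive point of comparison only, a Monte Carlo inner approximation of the Fourier-amplitude bounds (which generally under-estimates the exact bounds, since random sampling rarely hits the extremal signals) might look like:

```python
import numpy as np

def interval_dft_bounds_mc(lo, hi, n_samples=2000, seed=0):
    """Monte Carlo inner approximation of Fourier-amplitude bounds.

    lo, hi : arrays with lo <= x <= hi bounding the unknown time signal.
    Samples signals uniformly inside the box and records, per frequency
    bin, the smallest and largest amplitude seen. The true bounds
    contain these (inner) estimates.
    """
    rng = np.random.default_rng(seed)
    lo = np.asarray(lo, float)
    hi = np.asarray(hi, float)
    samples = rng.uniform(lo, hi, size=(n_samples, lo.size))
    amps = np.abs(np.fft.rfft(samples, axis=1))
    return amps.min(axis=0), amps.max(axis=0)
```

The gap between such sampled bounds and the exact ones is precisely what motivates an analytical propagation method that makes no assumptions about the error distribution across time steps.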

    Electric load information system based on non-intrusive power monitoring

    Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2003. Includes bibliographical references (leaves 192-200). Obtaining high-quality information economically and reliably is always a difficult objective to achieve. The electric power industry and consumers currently face many challenges, such as deregulation, autonomous power systems and power quality. Knowledge of the nature and state of power systems will undoubtedly be key to meeting these challenges. The Non-Intrusive Power Monitor is a novel attempt to collect such information with minimal physical installation. Raw voltage and current are measured at a single location to yield harmonic power signals. These typically carry the fingerprints of the electric loads present in a system, and their analysis can produce information such as the operational and diagnostic status of the loads. The power signals can also be used for system identification, parameter estimation and energy-consumption optimization studies. In this research, the power signals are mostly modeled as stochastic processes, and various detection, estimation and pattern-recognition algorithms are developed to extract the desired information. A constant-load status identifier is developed in this thesis which can identify the ON and OFF status of electric loads, both from their steady-state power consumption and from their transient patterns. The identifier can also classify multiple load events occurring at the same time and estimate states without load events. The power consumed by a variable-speed drive is also estimated using the correlations between the fundamental powers and higher harmonic powers. The harmonic signal generated by the imbalance of a rotating machine is estimated to monitor the drive, i.e. its speed and the magnitude of the imbalance. The algorithms are thoroughly tested using data collected at real buildings, and some of them are implemented on-line.
    This thesis focuses on developing mathematical models and signal-processing algorithms for customers at the end of the AC distribution system. Its results will directly benefit the development of a ubiquitous electric meter in a deregulated market, a diagnostic or prognostic tool for mission-critical systems, and an intelligent power-quality monitor. By Kwangduk Douglas Lee. Ph.D.
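    A minimal sketch of the steady-state side of load-event identification (illustrative only; the simple step detector and its threshold are assumptions, not the thesis's stochastic-process algorithms):

```python
import numpy as np

def detect_on_off_events(power, threshold):
    """Flag ON/OFF events as step changes in an aggregate real-power signal.

    power     : sequence of real-power samples measured at one location.
    threshold : minimum absolute step (in the same units as `power`)
                to count as a load switching event.
    Returns a list of (sample_index, 'ON'|'OFF') tuples.
    """
    dp = np.diff(np.asarray(power, dtype=float))
    idx = np.where(np.abs(dp) > threshold)[0] + 1
    return [(int(i), 'ON' if dp[i - 1] > 0 else 'OFF') for i in idx]
```

Matching the size of each detected step against the known steady-state consumption of candidate loads is what allows a single-point measurement to attribute events to individual appliances.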

    Statistical signal processing for echo signals from ultrasound linear and nonlinear scatterers


    EEG Connectivity Analysis Using Denoising Autoencoders for the Detection of Dyslexia

    The Temporal Sampling Framework (TSF) theorizes that the characteristic phonological difficulties of dyslexia are caused by atypical oscillatory sampling at one or more temporal rates. The LEEDUCA study conducted a series of Electroencephalography (EEG) experiments on children listening to amplitude-modulated (AM) noise at slow-rhythmic prosodic (0.5–1 Hz), syllabic (4–8 Hz) or phonemic (12–40 Hz) rates, aimed at detecting differences in the perception of oscillatory sampling that could be associated with dyslexia. The purpose of this work is to check whether these differences exist and how they are related to children’s performance in different language and cognitive tasks commonly used to detect dyslexia. To this purpose, temporal and spectral inter-channel EEG connectivity was estimated, and a denoising autoencoder (DAE) was trained to learn a low-dimensional representation of the connectivity matrices. This representation was studied via correlation and classification analysis, which revealed the ability to detect dyslexic subjects with an accuracy higher than 0.8 and a balanced accuracy around 0.7. Some features of the DAE representation were significantly correlated (p < 0.005) with children’s performance in language and cognitive tasks of the phonological-hypothesis category, such as phonological awareness and rapid symbolic naming, as well as reading efficiency and reading comprehension. Finally, a deeper analysis of the adjacency matrix revealed a reduced bilateral connection between electrodes of the temporal lobe (roughly the primary auditory cortex) in subjects with developmental dyslexia (DD), as well as increased connectivity of the F7 electrode, placed roughly over Broca’s area. These results pave the way for a complementary assessment of dyslexia using more objective methodologies such as EEG.
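    The spectral inter-channel connectivity step might be estimated as band-averaged coherence, sketched below; the frequency band, sampling rate and estimator settings are illustrative assumptions, not those of the LEEDUCA study:

```python
import numpy as np
from scipy.signal import coherence

def connectivity_matrix(eeg, fs, band=(4.0, 8.0)):
    """Pairwise mean magnitude-squared coherence in a frequency band
    (here the syllabic 4-8 Hz range), returned as a symmetric
    channel-by-channel connectivity matrix with a unit diagonal."""
    n_ch = eeg.shape[0]
    C = np.eye(n_ch)
    for i in range(n_ch):
        for j in range(i + 1, n_ch):
            f, coh = coherence(eeg[i], eeg[j], fs=fs,
                               nperseg=min(256, eeg.shape[1]))
            mask = (f >= band[0]) & (f <= band[1])
            C[i, j] = C[j, i] = coh[mask].mean()
    return C
```

Matrices like this, computed per subject and condition, are the inputs the denoising autoencoder compresses into the low-dimensional representation studied above.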

    A new recursive high-resolution parametric method for power spectral density estimation

    Thesis (M.Eng.Sc.)--University of Adelaide, Dept. of Electrical and Electronic Engineering, 199

    Unsupervised Classification of Uterine Contractions Recorded Using Electrohysterography

    Pregnancy still poses health risks that are not attended to by current clinical monitoring procedures. Electrohysterography (EHG) records are analyzed in the course of this thesis in an effort to evaluate their suitability for pregnancy monitoring. The presented work contributes an unsupervised classification solution for uterine contractile segments to FCT’s Uterine Explorer (UEX) project, which explores analysis procedures for EHG records. The first part presents the applied processing procedures and briefly explores best practices for them. These include procedures to enhance the representation of characteristics relevant to uterine events, to reduce further computational requirements, to extract contractile segments, and to estimate spectra. Particular attention is given to which characteristics should represent uterine events in the classification process and to feature selection methods. To this end, principal component analysis (PCA) is applied to three sets: interpolated contractile events, the power spectral densities of contractions, and a number of computed features intended to capture the time-domain, spectral and non-linear characteristics usually used in EHG-related studies. Subsequently, a wrapper-model approach is presented as a means to optimize the feature set by cyclically attempting the removal and re-addition of features based on clustering results. This approach takes advantage of the fact that one class is known beforehand, using its classification accuracy as the criterion that decides whether a modification to the feature set should be kept. Furthermore, this work also includes the implementation of a visualization tool that allows inspecting the effect of each processing procedure, the uterine events detected by different methods, and the clusters they were assigned to in the final iteration of the wrapper model.
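    A simplified, removal-only variant of the wrapper idea can be sketched as follows (illustrative assumptions: k-means clustering, a single known class, and accuracy measured as the fraction of known-class samples landing in one cluster; the thesis also re-adds features cyclically):

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def known_class_accuracy(features, known_mask, k=2, seed=0):
    """Fraction of the known-class samples assigned to a single cluster."""
    _, labels = kmeans2(features, k, minit='++', seed=seed)
    counts = np.bincount(labels[known_mask], minlength=k)
    return counts.max() / known_mask.sum()

def wrapper_select(X, known_mask, seed=0):
    """Greedy wrapper: try dropping each feature in turn and keep the
    drop if the known class still clusters together at least as well."""
    keep = list(range(X.shape[1]))
    best = known_class_accuracy(X[:, keep], known_mask, seed=seed)
    for f in list(keep):
        trial = [c for c in keep if c != f]
        if not trial:
            break
        acc = known_class_accuracy(X[:, trial], known_mask, seed=seed)
        if acc >= best:
            keep, best = trial, acc
    return keep, best
```

Using the known class's clustering accuracy as the acceptance criterion is what lets an otherwise unsupervised pipeline steer its own feature selection.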