9 research outputs found

    High-accuracy real-time microseismic analysis platform: case study based on the Super-Sauze mud-based landslide

    Understanding the evolution of landslides and other subsurface processes via microseismic monitoring and analysis is of paramount importance in predicting, or even avoiding, an imminent slope failure (via an early warning system). Microseismic monitoring recordings are often continuous, noisy and consist of signals emitted by various sources. Automated analysis of landslide processes comprises detection, localization and classification of microseismic events (with magnitude < 2 on the Richter scale). Previous research has mainly focused on manually tuned signal processing methods for detecting and classifying microseismic signals based on the signal waveform and its spectrum, which is time-consuming, especially for long-term monitoring and big datasets. This paper proposes an automatic analysis platform that performs event detection and classification, after suitable feature selection, in near real time. The platform is evaluated using seismology data from the Super-Sauze mud-based landslide, which is located in the southwestern French Alps, and features earthquake, slidequake and tremor type events
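    The classification stage described above can be illustrated with a minimal, hypothetical sketch: generic waveform and spectral features are computed for each detected window and fed to an off-the-shelf classifier (here scikit-learn's RandomForestClassifier). The feature set, window handling and classifier choice are illustrative assumptions, not the feature selection or model used in the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def waveform_features(trace, fs):
    """A few generic waveform/spectral features (illustrative, not the
    paper's selected features): RMS amplitude, a kurtosis proxy,
    dominant frequency and spectral centroid."""
    x = trace - trace.mean()
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    rms = np.sqrt(np.mean(x**2))
    kurt = np.mean(x**4) / (np.mean(x**2)**2 + 1e-20)
    f_dom = freqs[np.argmax(spec)]
    f_cent = np.sum(freqs * spec) / (np.sum(spec) + 1e-20)
    return np.array([rms, kurt, f_dom, f_cent])

def train_classifier(windows, labels, fs):
    """Train on labelled event windows (e.g. earthquake / slidequake / tremor /
    noise) so that new detections can be classified in near real time."""
    X = np.vstack([waveform_features(w, fs) for w in windows])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X, labels)
    return clf
```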

    Variational mode decomposition denoising combined with the Euclidean distance for diesel engine vibration signal

    Variational mode decomposition (VMD) is a recently introduced adaptive signal decomposition algorithm with a solid theoretical foundation and good noise robustness compared with empirical mode decomposition (EMD). Diesel engine vibration signals contain a large amount of background noise. To address this problem, a denoising algorithm based on VMD and the Euclidean distance is proposed. Firstly, a multi-component, non-Gaussian, noisy simulation signal is established and decomposed into a given number K of band-limited intrinsic mode functions by VMD. Then the Euclidean distance between the probability density function of each mode and that of the simulation signal is calculated. The signal is reconstructed using the relevant modes, which are selected on the basis of noticeable similarity between the probability density function of the simulation signal and that of each mode. Finally, vibration signals from diesel engine connecting-rod bearing faults are analyzed by the proposed method. The results show that, compared with other denoising algorithms, the proposed method has a better denoising effect, and the fault characteristics of the vibration signals of diesel engine connecting-rod bearings can be effectively enhanced
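    The mode-selection step can be sketched as follows, assuming the band-limited modes have already been obtained from a VMD implementation (a (K, N) array of intrinsic mode functions). The histogram-based PDF estimate and the mean-distance relevance threshold are illustrative choices; the abstract does not specify them.

```python
import numpy as np

def empirical_pdf(x, bins, support):
    """Histogram-based estimate of the probability density over a fixed support."""
    pdf, _ = np.histogram(x, bins=bins, range=support, density=True)
    return pdf

def reconstruct_by_pdf_distance(signal, modes, bins=64):
    """Select the modes whose amplitude PDF is closest (Euclidean distance)
    to the PDF of the noisy signal and reconstruct the denoised signal by
    summing them. `modes` is a (K, N) array of band-limited IMFs from VMD."""
    support = (signal.min(), signal.max())
    pdf_sig = empirical_pdf(signal, bins, support)
    dists = np.array([np.linalg.norm(empirical_pdf(m, bins, support) - pdf_sig)
                      for m in modes])
    keep = dists < dists.mean()            # heuristic relevance threshold
    return modes[keep].sum(axis=0), dists
```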

    A seismologic study of the northern Mississippi Embayment

    Part 1: Crustal structure in the New Madrid Seismic Zone (NMSZ) is investigated through a detailed study of explosion data obtained from the Embayment Seismic Excitation Experiment. The data show a distinct anisotropy in distance attenuation for both P and S waves in the range from 0 to 200 km. Waves that propagate northward from the 1,134 kg Marked Tree, Arkansas, explosion attenuate quickly with distance until a range of about 100 km from the source, where high-amplitude, high-phase-velocity critical reflections from the boundary between the middle crust and the rift pillow structure produce high-amplitude waves. Propagation southward from the 2,268 kg Mooring, Tennessee, blast shows less distance attenuation than northward propagation. Reflections from the middle crust-lower crust boundary occur but do not significantly increase in amplitude with distance and travel with a slower apparent phase velocity than observed for the northward propagation data set. A smooth velocity model is developed using a stabilized Wiechert-Herglotz travel-time inversion of first-arrival travel times. An inversion using the travel times of both direct and middle-crust reflected waves is then performed to obtain a 2D inhomogeneous, layered, isotropic crustal model. The result reveals a significant southwestward dip of the top of the middle crust interface in the vicinity of the NMSZ, consistent with previously inferred changes in the thickness of the rift pillow. This 2D feature characterizes the local wave propagation along the Reelfoot Rift and demonstrates the need for an improvement of the current Central United States velocity model.

    Part 2: Obtaining reliable empirical Green's functions (EGFs) from ambient noise by seismic interferometry requires homogeneously distributed noise sources. However, it is difficult to attain this condition since ambient noise data usually contain highly correlated signals from earthquakes or other transient sources related to human activities. Removing these transient signals is one of the most essential steps in the data processing flow for obtaining EGFs. We propose a denoising method based on the continuous wavelet transform to achieve this goal. The noise level is estimated in the wavelet domain for each scale by determining the 99% confidence level of the empirical probability density function of the noise wavelet coefficients. The correlated signals are then removed by an efficient soft-thresholding method. The same denoising algorithm is also applied to remove the noise in the final stacked cross-correlogram. A complete data processing workflow is provided, with the overall procedure divided into four stages: (1) single-station data preparation, (2) removal of earthquakes and other transient signals from the seismic record, (3) spectral whitening, cross-correlation and temporal stacking, and (4) removal of the noise in the stacked cross-correlogram to deliver the final EGF. The whole process is automated to make it practical for large datasets. Synthetic data constructed from a recorded earthquake and recorded ambient noise are used to test the denoising method. We then apply the new processing workflow to data recorded by the USArray Transportable Array stations near the New Madrid Seismic Zone, where many seismic events and transient signals are observed. We compare the EGFs calculated from our workflow with those from the commonly used time-domain normalization method, and our results show improved signal-to-noise ratios. The new workflow can deliver reliable EGFs for further studies.

    Part 3: We incorporate seismic ambient noise data recorded by different temporary and permanent broadband stations around the northern Mississippi Embayment from 1990 to 2018 to develop a crustal shear wave velocity (Vs) model for this area with full-waveform ambient noise tomography. Empirical Green's functions at periods between 8 and 40 s for all possible pairs of stations are extracted using a new seismic ambient noise data processing flow based on the continuous wavelet transform. Synthetic waveforms are then calculated through a heterogeneous Earth model using a GPU-enabled collocated finite-difference code. The cross-correlation time shifts between the synthetic waveforms and the extracted empirical Green's functions are used to construct the velocity update kernel with the adjoint method. Starting from the Central United States Velocity Model, the shear wave velocity model is then iteratively updated with the Vs kernel calculated in each iteration. Checkerboard tests show that perturbations in the top 30 km of the crust are well recovered, but the amplitude recovery gradually decreases for deeper structure. We find that velocity lows characterize the Reelfoot Rift Graben and the Rough Creek Graben, separated by high-velocity crust. High-velocity anomalies are observed under the Ozark Uplift and the Paducah Gravity Lineament. A low-velocity area previously interpreted as the Missouri Batholith is observed between them. A massive high-velocity body in the southeastern Mississippi Embayment is observed and is explained by faulting as well as partly by mafic intrusion. The Ouachita-Appalachian Thrust Front is clearly observed, with a thinner crustal layer underneath. The rift pillow is well observed in the final tomography model along the Reelfoot Rift in the lower crust. The final inverted velocity model is consistent with local geological features and can be used for other seismological studies such as earthquake source determination and earthquake hazard assessment
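    The per-scale thresholding step in Part 2 can be sketched roughly as follows, using PyWavelets for the continuous wavelet transform. The percentile-based noise level follows the 99% confidence idea described above, but the clipping rule, wavelet and scale range are illustrative assumptions, and reconstruction via an inverse CWT is omitted.

```python
import numpy as np
import pywt

def suppress_transients(trace, scales=None, wavelet='morl', conf=99.0):
    """Per-scale thresholding of CWT coefficients: the noise level at each
    scale is taken as the `conf` percentile of |coefficient|, and any
    coefficient exceeding it is shrunk back to that level, which attenuates
    high-amplitude transients (earthquakes) while retaining the ambient noise."""
    if scales is None:
        scales = np.arange(1, 129)              # illustrative scale range
    coefs, _ = pywt.cwt(np.asarray(trace, dtype=float), scales, wavelet)
    out = np.empty_like(coefs)
    for i, c in enumerate(coefs):               # one row per scale
        thr = np.percentile(np.abs(c), conf)    # empirical 99% noise level
        out[i] = np.sign(c) * np.minimum(np.abs(c), thr)
    return out                                  # inverse CWT / reconstruction omitted
```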

    A sign-preserving filter for signal decomposition

    There are optimization problems in which an improvement in performance or a reduction in cost can be attained if the input signal of the system is split into multiple components. Splitting the signal allows the system’s hardware to be customized for a narrower range of frequencies, which in turn allows better use of its physical properties. Some applications have very specific signal-splitting requirements, such as ‘counter-flow avoidance’, that conventional signal processing tools cannot meet. Accordingly, a novel ‘Sign-Preserving’ filter has been developed and is presented in this article. The underlying algorithm of the filter is comprehensively explained with the aim of facilitating its reproduction, and the aspects of its operation are thoroughly discussed. The filter has two key features: (1) it separates a discrete signal a into two components, a mostly low-frequency signal b and a predominantly high-frequency signal c, such that the sum of b and c exactly replicates the original signal a and, more importantly, (2) the signs of the two output signals are equal to the sign of a at all times. The article presents two case studies which demonstrate the use of the Sign-Preserving filter for the optimization of real-life applications in which counter-flow must be avoided: the hybridization of the battery pack of an electric vehicle and the parallelization of a packed-bed thermal energy store
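    One possible (hypothetical) realization of the two stated properties is sketched below: a moving-average low-frequency estimate is clamped so that it never crosses zero or overshoots the input, which forces both outputs to keep the sign of the input while summing exactly to it. This illustrates the splitting requirements only; it is not the published filter algorithm.

```python
import numpy as np

def sign_preserving_split(a, window=51):
    """Split a discrete signal `a` into a slowly varying component `b` and a
    residual `c` such that b + c == a and both components share the sign of
    `a` at every sample. Simple clamped moving-average sketch."""
    a = np.asarray(a, dtype=float)
    kernel = np.ones(window) / window
    low = np.convolve(a, kernel, mode='same')          # low-frequency estimate
    # Clamp the low-frequency part so it never crosses zero or overshoots `a`
    b = np.where(a >= 0, np.clip(low, 0.0, a), np.clip(low, a, 0.0))
    c = a - b                                          # residual keeps the sign of `a`
    return b, c
```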

    A new signal processing method for acoustic emission/microseismic data analysis

    The acoustic emission/microseismic (AE/MS) technique has emerged as one of the most important techniques in recent decades and has found wide application in different fields. Extraction of seismic events with precise timing is the first step and also the foundation for processing AE/MS signals. However, this process remains a challenging task for most AE/MS applications. The process has generally been performed by human analysts, but manual processing is time-consuming and subjective. These challenges continue to provide motivation for the search for new and innovative ways to improve the signal processing needs of the AE/MS technique. This research has developed a highly efficient method to resolve the problems of background noise and outburst activity characteristic of AE/MS data and to enhance the picking of the P-phase onset time. The method is a hybrid technique comprising a characteristic function (CF), high-order statistics, the stationary discrete wavelet transform (SDWT), and a phase association theory. The performance of the algorithm has been evaluated with data from a coal mine and from a 3-D concrete pile laboratory experiment. The picking accuracy was found to be highly dependent on the choice of wavelet function, the decomposition scale, the CF, and the window size. The performance of the algorithm has been compared with that of a human expert and the following pickers: the short-term average to long-term average (STA/LTA), the Baer and Kradolfer, the modified energy ratio, and the short-term to long-term kurtosis. The results show that the proposed method has better picking accuracy than the STA/LTA (84% versus 78% on data from a coal mine). The introduction of the phase association theory and the SDWT method in this research provides a novelty not seen in any of the previous algorithms --Abstract, page iii
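    For reference, the baseline STA/LTA detector mentioned above can be written compactly as below. The window lengths, the squared-amplitude characteristic function and the trigger threshold are illustrative values; this sketch is the classic baseline picker, not the hybrid CF/SDWT method proposed in the work.

```python
import numpy as np

def sta_lta_pick(trace, fs, sta_win=0.05, lta_win=0.5, threshold=4.0):
    """Classic STA/LTA onset detector: the P-phase onset is declared at the
    first sample where the ratio of the short-term to long-term average of
    the squared signal exceeds `threshold`. Windows are in seconds."""
    energy = np.asarray(trace, dtype=float) ** 2
    n_sta = max(int(sta_win * fs), 1)
    n_lta = max(int(lta_win * fs), 1)
    csum = np.cumsum(np.insert(energy, 0, 0.0))
    sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta       # short-term average
    lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta       # long-term average
    ratio = sta[n_lta - n_sta:] / np.maximum(lta, 1e-12)
    onsets = np.nonzero(ratio > threshold)[0]
    return (onsets[0] + n_lta) if onsets.size else None  # approximate pick sample
```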

    Advanced Techniques and Efficiency Assessment of Mechanical Processing

    Mechanical processing is just one step in the value chain of metal production, but to some extent it determines the effectiveness of separation through suitable preparation of the raw material for beneficiation processes, namely through production of the required particle size composition and useful mineral liberation. The topic is mostly related to techniques of comminution and size classification, but it also concerns methods of gravity separation, as well as modeling and optimization. Technological and economic assessment supplements these subjects

    Magnetic Particle Imaging - Applications of Magnetic Nanoparticles in Analytics and Imaging

    Magnetic Particle Imaging (MPI) is a new imaging modality that delivers tracer-based volume images with high spatial and temporal resolution. The properties of the nanoparticulate tracer, which needs to be present in the imaging volume for MPI to render image contrast, have a direct impact on MPI performance. The magnetization dynamics of the superparamagnetic nanoparticles are a critical factor in MPI system design. However, once understood and numerically modelled, the particle's magnetization dynamics are key to enabling functional imaging with MPI based on potential particle functionalization. This thesis describes the development of a magnetic particle imaging scanner and its accompanying particle characterization technique, magnetic particle spectroscopy (MPS). The devices have been designed, built and tested to deliver insights into particle dynamics and to function as a prototype platform for MPI research. That includes the scanner hardware as well as the software for modelling the particle's magnetization response and for image reconstruction. The main focus is on the development and evolution of the so-called 'Mobility MPI' (mMPI), which promises to provide an estimate of the particle mobility, including the hydrodynamic diameter of the particles and the viscosity of the surrounding medium, in addition to the standard concentration-weighted MPI image. By allowing a discrimination between NĂ©el and Brownian contributions, mMPI in conjunction with a suitable tracer enables binding detection in the imaging volume. The harmonic spectrum connected with the dynamic magnetization response of the tracer is studied in MPS. The ability to conduct bio-assays with MPS is explored and the results are evaluated in the context of appropriate numerical models. Furthermore, the effect of viscosity on the MPI system matrix is studied and different approaches for deducing mobility information from an MPI experiment are investigated
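    As a rough illustration of the kind of magnetization-response model behind MPS signal modelling, the sketch below computes the harmonic spectrum of the equilibrium (Langevin) response of a superparamagnetic particle to a sinusoidal drive field. The Langevin model ignores relaxation (NĂ©el/Brownian dynamics), and all parameter values are illustrative assumptions rather than numbers from the thesis.

```python
import numpy as np

MU0 = 4e-7 * np.pi        # vacuum permeability [T*m/A]
KB = 1.380649e-23         # Boltzmann constant [J/K]

def langevin(x):
    """Langevin function L(x) = coth(x) - 1/x, with the small-argument limit x/3."""
    x = np.asarray(x, dtype=float)
    small = np.abs(x) < 1e-6
    safe = np.where(small, 1.0, x)
    return np.where(small, x / 3.0, 1.0 / np.tanh(safe) - 1.0 / safe)

def mps_harmonics(d_core=25e-9, Ms=4.74e5, T=300.0, f0=25e3, H_amp=16e3,
                  n_harm=15, n=4096):
    """Harmonic magnitudes of the equilibrium magnetization response to a
    sinusoidal drive field H(t) = H_amp*cos(2*pi*f0*t). Illustrative values:
    25 nm magnetite-like core, Ms = 474 kA/m, 25 kHz drive, 16 kA/m amplitude."""
    V = np.pi / 6.0 * d_core**3               # magnetic core volume [m^3]
    m = Ms * V                                # single-particle moment [A*m^2]
    t = np.arange(n) / (n * f0)               # exactly one drive period
    H = H_amp * np.cos(2.0 * np.pi * f0 * t)  # drive field [A/m]
    M = Ms * langevin(MU0 * m * H / (KB * T)) # equilibrium magnetization [A/m]
    spec = np.abs(np.fft.rfft(M)) / n
    return spec[1:n_harm + 1]                 # magnitudes at f0, 2*f0, ..., n_harm*f0
```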