38 research outputs found

    Smart FRP Composite Sandwich Bridge Decks in Cold Regions

    INE/AUTC 12.0

    Seismic Correction in the Wavelet Domain

    This thesis summarises novel wavelet-domain approaches, developed and published by the author, for the correction and processing of time-series data recorded by strong-motion accelerographs during seismic events. Historically, the research first developed methods to de-convolve the instrument response from legacy analogue strong-motion instruments, of which there are a large number, in order to obtain better estimates of the ground acceleration before tackling the more problematic task of recovering ground velocities and displacements. The characteristics of legacy analogue strong-motion instruments are unfortunately in most cases unavailable, making it difficult to de-couple the instrument response. This is essentially a system-identification problem, presented and summarised herein with solutions that are transparent to the lack of instrument data.
    The more fundamental and problematic part of the research then addressed recovering velocity and displacement from the recorded data. In all cases the instruments are tri-axial, i.e. they measure translation only. This is a limiting factor and leads to distortions, manifest as DC shifts in the recorded data, caused by the instrument pitching, rolling and yawing during seismic events. These distortions are embedded in the translational acceleration time-series, their contributions having been recorded by the same tri-axial sensors. In the literature this is termed 'baseline error', and it effectively prevents meaningful integration to velocity and displacement. Sophisticated methods exist that recover estimates of velocity and displacement, but these require a good measure of expertise and do not recover all the information present in the recorded data. A novel, automated wavelet-transform method developed by the author and published in the earthquake-engineering literature is presented.
    This method surmounts the problem of obtaining velocity and displacement and, in addition, recovers a low-frequency acceleration pulse called the 'fling', the displacement 'fling-step', and the form of the baseline error, all inferred in the literature but hitherto never recovered. Once the acceleration fling pulse is recovered, meaningful integration becomes a reality. However, the necessity of developing novel algorithms in order to recover this important information emphasises a weakness of modern digital instruments: they are all tri-axial rather than six-axial instruments.
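The core idea, removing the low-frequency baseline band in the wavelet domain before integrating acceleration to velocity, can be illustrated with a toy sketch. This is a minimal NumPy-only example using a plain Haar DWT, standing in for the author's published method; all function names are hypothetical.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar DWT: approximation and detail coefficients."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def haar_idwt(a, d):
    """Invert one Haar DWT level."""
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def remove_baseline(accel, levels=6):
    """Suppress low-frequency baseline drift by zeroing the coarsest
    approximation band, then reconstructing. Signal length must be
    divisible by 2**levels."""
    details, a = [], accel
    for _ in range(levels):
        a, d = haar_dwt(a)
        details.append(d)
    a = np.zeros_like(a)  # discard the baseline (low-frequency) band
    for d in reversed(details):
        a = haar_idwt(a, d)
    return a

def integrate(x, dt):
    """Cumulative trapezoidal integration (e.g. acceleration -> velocity)."""
    v = np.zeros_like(x)
    v[1:] = np.cumsum(0.5 * (x[1:] + x[:-1])) * dt
    return v
```

With the baseline band zeroed, the corrected record has zero mean by construction, so cumulative integration no longer accumulates a spurious drift.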

    DATA COMPRESSION OVER SEISMIC SENSOR NETWORKS


    A New Algorithm for Compression of High-Amplitude-Resolution Seismic Data (Novi algoritam za kompresiju seizmičkih podataka velike amplitudske rezolucije)

    Renewable sources cannot meet the energy demand of a growing global market, so oil and gas are expected to remain substantial sources of energy in the coming years. To find new oil and gas deposits that would satisfy growing global energy demands, significant effort is continually invested in increasing the efficiency of seismic surveys. In the initial phase of exploration and production of new fields, high-resolution, high-quality images of the subsurface are of great importance. As one step in the seismic data-processing chain, efficient management and delivery of the large data sets produced by the industry during seismic surveys becomes extremely important in order to facilitate further seismic data processing and interpretation. In this respect, efficiency relies to a large extent on the compression scheme, which is often required to enable faster transfer of and access to data, as well as efficient data storage. Motivated by the superior performance of High Efficiency Video Coding (HEVC), and driven by the rapid growth in data volume produced by seismic surveys, this work explores a 32 bits-per-pixel (b/p) extension of the HEVC codec for compression of seismic data. It proposes reassembling seismic slices into a format that corresponds to a video signal, benefiting from the coding gain of the HEVC inter mode in addition to the possible advantages of the (still-image) HEVC intra mode. To this end, this work modifies almost all components of the original HEVC codec to cater for high bit-depth coding of seismic data: the Lagrange multiplier used in optimization of the coding parameters is adapted to the new data statistics, the core transform and quantization are reimplemented to handle the increased bit-depth range, and a modified adaptive binary arithmetic coder is employed for efficient entropy coding.
    In addition, optimized block selection, reduced intra-prediction modes, and flexible motion estimation are tested to adapt to the structure of seismic data. Although the new codec, after the proposed modifications, goes beyond the standardized HEVC, it still maintains a generic HEVC structure and is developed under the general HEVC framework. There is no similar work in the field of seismic data compression that uses HEVC as the base codec. The tailored codec design, when compared to JPEG-XR and a commercial wavelet-based codec, significantly improves the peak signal-to-noise ratio (PSNR) versus compression-ratio performance for 32 b/p seismic data: depending on the proposed configuration, the PSNR gain ranges from 3.39 dB to 9.48 dB. In addition, relying on the specific characteristics of seismic data, an optimized encoder is proposed that reduces encoding time by 67.17% for the All-I configuration on the trace-image dataset, and by 67.39% (All-I), 97.96% (P2 configuration) and 98.64% (B configuration) on the 3D wavefield dataset, with negligible coding-performance losses. As a side contribution of this work, HEVC is analyzed within all of its functional units, so the presented work can itself serve as an overview of the methods incorporated into the standard.
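The slice-to-frames reassembly and the source of the inter-mode coding gain can be illustrated with a toy sketch. This is plain frame differencing in NumPy, not the modified HEVC codec; all function names are hypothetical.

```python
import numpy as np

def slices_to_frames(volume):
    """Treat a 3D seismic volume (inline, crossline, depth) as a video:
    each inline slice becomes one frame."""
    return [volume[i] for i in range(volume.shape[0])]

def encode_residuals(frames):
    """'Inter'-style prediction: keep the first frame as-is, then code
    only frame-to-frame differences, which are small for spatially
    coherent seismic data and therefore cheap to entropy-code."""
    residuals = [frames[0].copy()]
    for prev, cur in zip(frames, frames[1:]):
        residuals.append(cur - prev)
    return residuals

def decode_residuals(residuals):
    """Invert the prediction: accumulate residuals back into frames."""
    frames = [residuals[0].copy()]
    for r in residuals[1:]:
        frames.append(frames[-1] + r)
    return frames
```

For integer-valued data the round trip is lossless; the point is that the residual frames have much lower magnitude (and entropy) than the raw slices, which is where a real inter-mode codec gains.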

    Proceedings of the second "international Traveling Workshop on Interactions between Sparse models and Technology" (iTWIST'14)

    The implicit objective of the biennial "international Traveling Workshop on Interactions between Sparse models and Technology" (iTWIST) is to foster collaboration between international scientific teams by disseminating ideas through both specific oral/poster presentations and free discussions. For its second edition, the iTWIST workshop took place in the medieval and picturesque town of Namur in Belgium, from Wednesday August 27th till Friday August 29th, 2014. The workshop was conveniently located in "The Arsenal" building, within walking distance of both hotels and the town center. iTWIST'14 gathered about 70 international participants and featured 9 invited talks, 10 oral presentations, and 14 posters on the following themes, all related to the theory, application and generalization of the "sparsity paradigm": sparsity-driven data sensing and processing; union of low-dimensional subspaces; beyond linear and convex inverse problems; matrix/manifold/graph sensing/processing; blind inverse problems and dictionary learning; sparsity and computational neuroscience; information theory, geometry and randomness; complexity/accuracy tradeoffs in numerical methods; sparsity: what's next?; sparse machine learning and inference.
    Comment: 69 pages, 24 extended abstracts, iTWIST'14 website: http://sites.google.com/site/itwist1

    Advanced Techniques and Efficiency Assessment of Mechanical Processing

    Mechanical processing is just one step in the value chain of metal production, but to some extent it determines the effectiveness of separation, through suitable preparation of the raw material for beneficiation processes, i.e. through production of the required particle size composition and useful mineral liberation. The issue mostly concerns techniques of comminution and size classification, but it also involves methods of gravity separation, as well as modeling and optimization. Technological and economic assessment supplements the issue.

    Data-driven methods for analyzing ballistocardiograms in longitudinal cardiovascular monitoring

    Cardiovascular disease (CVD) is the leading cause of death in the US; about 48% of American adults have one or more types of CVD. The literature has shown that continuous monitoring of the older population, for early detection of changes in health conditions, is key to successful clinical intervention. We have been investigating environmentally embedded in-home networks of non-invasive sensing modalities. This dissertation concentrates on the signal-processing techniques required for robust extraction of morphological features from the ballistocardiogram (BCG), and on machine-learning approaches that utilize these features for non-invasive monitoring of cardiovascular conditions. First, enhancements in time-domain detection of the cardiac cycle are addressed, owing to its importance in the estimation of heart rate variability (HRV) and sleep stages. The proposed enhancements to the energy-based algorithm for BCG beat detection have shown at least a 50% improvement in the root mean square error (RMSE) of beat-to-beat heart rate estimates compared to reference estimates from electrocardiogram (ECG) R-to-R intervals. These results remain subject to some errors, primarily due to contamination by noise and motion artifacts caused by floor vibration, unconstrained subject movements, or even respiratory activity. Aging, disease, breathing, and sleep disorders can also affect the quality of estimation, as they slightly modify the morphology of the BCG waveform. Includes bibliographical references.
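The flavor of an energy-based beat detector, and the beat-to-beat heart-rate RMSE used as the evaluation metric, can be sketched as follows. This is a minimal illustration on synthetic data with hypothetical function names, not the dissertation's actual algorithm.

```python
import numpy as np

def short_time_energy(x, win):
    """Moving-window energy envelope of the signal."""
    return np.convolve(x * x, np.ones(win) / win, mode="same")

def detect_beats(x, fs, win_s=0.3, refractory_s=0.5):
    """Pick rising-edge local maxima of the energy envelope above a
    threshold, enforcing a refractory period so each cardiac cycle
    yields a single beat."""
    env = short_time_energy(x, int(win_s * fs))
    thresh = 0.5 * env.max()
    min_gap = int(refractory_s * fs)
    beats, last = [], -min_gap
    for i in range(1, len(env) - 1):
        if (env[i] >= thresh and env[i] > env[i - 1]
                and env[i] >= env[i + 1] and i - last >= min_gap):
            beats.append(i)
            last = i
    return np.array(beats)

def hr_rmse(est_beats, ref_beats, fs):
    """RMSE between estimated and reference beat-to-beat heart rates (bpm)."""
    est_hr = 60.0 * fs / np.diff(est_beats)
    ref_hr = 60.0 * fs / np.diff(ref_beats)
    n = min(len(est_hr), len(ref_hr))
    return np.sqrt(np.mean((est_hr[:n] - ref_hr[:n]) ** 2))
```

On a clean synthetic record the detector recovers every cycle exactly; the noise, motion artifacts, and morphology changes discussed above are precisely what makes the real problem harder than this sketch.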