
    Low bit-rate image sequence coding


    Data Hiding and Its Applications

    Data hiding techniques have been widely used to provide copyright protection, data integrity, covert communication, non-repudiation, and authentication, among other applications. In the context of the increased dissemination and distribution of multimedia content over the internet, data hiding methods, such as digital watermarking and steganography, are becoming increasingly relevant in providing multimedia security. The goal of this book is to focus on the improvement of data hiding algorithms and their different applications (both traditional and emerging), bringing together researchers and practitioners from different research fields, including data hiding, signal processing, cryptography, and information theory, among others.
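    One of the simplest data hiding techniques the abstract alludes to is least-significant-bit (LSB) steganography. The sketch below is purely illustrative and is not drawn from the book: the "cover image" is just a list of hypothetical 8-bit grey-level pixel values.

    ```python
    def embed_lsb(pixels, message_bits):
        """Hide one message bit per pixel in the least significant bit."""
        if len(message_bits) > len(pixels):
            raise ValueError("cover too small for message")
        stego = list(pixels)
        for i, bit in enumerate(message_bits):
            stego[i] = (stego[i] & ~1) | bit  # clear the LSB, then set it to the message bit
        return stego

    def extract_lsb(pixels, n_bits):
        """Read the hidden bits back out of the LSBs."""
        return [p & 1 for p in pixels[:n_bits]]

    cover = [203, 17, 88, 240, 65, 129, 54, 77]   # hypothetical grey-level pixels
    secret = [1, 0, 1, 1, 0, 1, 0, 0]
    stego = embed_lsb(cover, secret)
    assert extract_lsb(stego, len(secret)) == secret
    # Each pixel changes by at most 1, so the distortion is visually negligible.
    assert all(abs(a - b) <= 1 for a, b in zip(cover, stego))
    ```

    The same change-one-bit idea is also why plain LSB embedding is fragile: any recompression or filtering of the stego image destroys the message, which is what motivates the more robust watermarking schemes the book covers.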

    UQ and AI: data fusion, inverse identification, and multiscale uncertainty propagation in aerospace components

    A key requirement for engineering designs is that they offer good performance across a range of uncertain conditions while exhibiting an admissibly low probability of failure. In order to design components that offer good performance across a range of uncertain conditions, it is necessary to take account of the effect of the uncertainties associated with a candidate design. Uncertainty Quantification (UQ) methods are statistical methods that may be used to quantify the effect of the uncertainties inherent in a system on its performance. This thesis expands the envelope of UQ methods for the design of aerospace components, supporting the integration of UQ methods in product development by addressing four industrial challenges. Firstly, a method for propagating uncertainty through computational models in a hierarchy of scales is described that is based on probabilistic equivalence and Non-Intrusive Polynomial Chaos (NIPC). This problem is relevant to the design of aerospace components as the computational models used to evaluate candidate designs are typically multiscale. This method was then extended to develop a formulation for inverse identification, where the probability distributions for the material properties of a coupon are deduced from measurements of its response. We demonstrate how probabilistic equivalence and the Maximum Entropy Principle (MEP) may be used to combine simulation data with scarce experimental data, with the intention of making this stage of product design less expensive and time-consuming. The third contribution of this thesis is to develop two novel meta-modelling strategies to promote the wider exploration of the design space during the conceptual design phase. Design Space Exploration (DSE) in this phase is crucial as decisions made at the early, conceptual stages of an aircraft design can restrict the range of alternative designs available at later stages in the design process, despite limited quantitative knowledge of the interaction between requirements being available at this stage. A histogram interpolation algorithm is presented that allows the designer to interactively explore the design space with a model-free formulation, while a meta-model based on Knowledge Based Neural Networks (KBaNNs) is proposed in which the outputs of a high-level, inexpensive computer code are informed by the outputs of a neural network, in this way addressing the criticism of neural networks that they are purely data-driven and operate as black boxes. The final challenge addressed by this thesis is how to iteratively improve a meta-model by expanding the dataset used to train it. Given the reliance of UQ methods on meta-models this is an important challenge. This thesis proposes an adaptive learning algorithm for Support Vector Machine (SVM) metamodels, which are used to approximate an unknown function. In particular, we apply the adaptive learning algorithm to test cases in reliability analysis.
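    The "non-intrusive" idea behind NIPC can be sketched in a few lines: output statistics are recovered from a handful of deterministic runs of the model at quadrature nodes, with no modification of the model itself. The sketch below uses a standard three-point Gauss-Hermite rule for a single standard-normal input; the model `f` is a stand-in, not the thesis's multiscale solver.

    ```python
    import math

    # 3-point Gauss-Hermite rule for a standard normal input X ~ N(0, 1);
    # this rule integrates polynomials up to degree 5 exactly.
    NODES = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
    WEIGHTS = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

    def propagate(f):
        """Return (mean, variance) of f(X) from three deterministic runs of f."""
        mean = sum(w * f(x) for w, x in zip(WEIGHTS, NODES))
        second = sum(w * f(x) ** 2 for w, x in zip(WEIGHTS, NODES))
        return mean, second - mean ** 2

    # Stand-in "computational model": for f(x) = x**2 with X ~ N(0, 1),
    # the exact answers are mean = 1 and variance = 2.
    mean, var = propagate(lambda x: x ** 2)
    assert abs(mean - 1.0) < 1e-12
    assert abs(var - 2.0) < 1e-12
    ```

    In practice the nodes and weights are tensorised or sparsified over many input dimensions, and the model runs are the expensive part, which is why so few of them are used.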

    Simulation of wireless communication system using OFDM principle

    FDMA, TDMA and CDMA are the well-known multiplexing techniques used in wireless communication systems. Systems built on these techniques encounter various problems: (1) multi-path fading, (2) time dispersion, which leads to intersymbol interference (ISI), (3) low bit-rate capacity, (4) the need for large transmit power at high bit rates, and (5) poor spectral efficiency. The use of the orthogonal frequency division multiplexing (OFDM) technique provides a better solution to the above-mentioned problems. The benefits of OFDM are high spectral efficiency, resilience to RF interference, and lower multi-path distortion. OFDM is a powerful modulation technique that is capable of high data rates and is able to eliminate ISI. The use of the FFT to implement the modulation and demodulation functions makes it computationally efficient.
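    The core of OFDM modulation and demodulation really is just an inverse/forward discrete Fourier transform, as the abstract notes. A minimal sketch, using a naive DFT for clarity (real systems use the FFT, plus a cyclic prefix, channel estimation, and equalisation, none of which appear here):

    ```python
    import cmath

    def idft(symbols):
        """OFDM modulation: map one symbol per subcarrier to time-domain samples."""
        n = len(symbols)
        return [sum(s * cmath.exp(2j * cmath.pi * k * t / n)
                    for k, s in enumerate(symbols)) / n
                for t in range(n)]

    def dft(samples):
        """OFDM demodulation: recover the per-subcarrier symbols."""
        n = len(samples)
        return [sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t, x in enumerate(samples))
                for k in range(n)]

    # QPSK symbols on 4 orthogonal subcarriers (illustrative only).
    tx = [1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]
    rx = dft(idft(tx))
    assert all(abs(a - b) < 1e-9 for a, b in zip(tx, rx))
    ```

    Orthogonality of the subcarriers is exactly the orthogonality of the DFT basis, which is why the round trip recovers the transmitted symbols without inter-carrier interference.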

    Fractal Analysis and Chaos in Geosciences

    Fractal analysis is becoming a very useful tool for processing data obtained from chaotic systems in geosciences, and it can be used to resolve many ambiguities in this domain. This book contains eight chapters showing recent applications of fractal/multifractal analysis in geosciences. Two chapters are devoted to applications of fractal analysis in climatology, two of them to cosmic and solar geomagnetic data from observatories. Four chapters of the book contain applications of (multi-)fractal analysis in exploration geophysics. I believe that the current book is an important source for researchers and students from universities.
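    A basic tool underlying much of this kind of analysis is the box-counting estimate of fractal dimension: cover the data with boxes of shrinking size eps and fit the slope of log N(eps) against log(1/eps). The sketch below is generic, not taken from the book, and uses a toy point set whose dimension is known to be 1.

    ```python
    import math

    def box_count_dimension(points, scales):
        """Estimate fractal dimension as the least-squares slope of
        log N(eps) versus log(1/eps) over the given box sizes."""
        xs, ys = [], []
        for eps in scales:
            boxes = {(int(px / eps), int(py / eps)) for px, py in points}
            xs.append(math.log(1.0 / eps))
            ys.append(math.log(len(boxes)))
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                / sum((x - mx) ** 2 for x in xs))

    # Toy data: 4096 points on a straight line, whose dimension is 1.
    line = [(i / 4096.0, i / 4096.0) for i in range(4096)]
    d = box_count_dimension(line, [2.0 ** -k for k in range(2, 8)])
    assert abs(d - 1.0) < 0.05
    ```

    For a genuinely fractal set (a coastline trace, a fault network) the same fit yields a non-integer slope, which is the quantity the book's chapters put to work.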

    Scalable video compression with optimized visual performance and random accessibility

    This thesis is concerned with maximizing the coding efficiency, random accessibility and visual performance of scalable compressed video. The unifying theme behind this work is the use of finely embedded localized coding structures, which govern the extent to which these goals may be jointly achieved. The first part focuses on scalable volumetric image compression. We investigate 3D transform and coding techniques which exploit inter-slice statistical redundancies without compromising slice accessibility. Our study shows that the motion-compensated temporal discrete wavelet transform (MC-TDWT) practically achieves an upper bound to the compression efficiency of slice transforms. From a video coding perspective, we find that most of the coding gain is attributed to offsetting the learning penalty in adaptive arithmetic coding through 3D code-block extension, rather than inter-frame context modelling. The second aspect of this thesis examines random accessibility. Accessibility refers to the ease with which a region of interest is accessed (subband samples needed for reconstruction are retrieved) from a compressed video bitstream, subject to spatiotemporal code-block constraints. We investigate the fundamental implications of motion compensation for random access efficiency and the compression performance of scalable interactive video. We demonstrate that inclusion of motion compensation operators within the lifting steps of a temporal subband transform incurs a random access penalty which depends on the characteristics of the motion field. The final aspect of this thesis aims to minimize the perceptual impact of visible distortion in scalable reconstructed video. We present a visual optimization strategy based on distortion scaling which raises the distortion-length slope of perceptually significant samples. 
This alters the codestream embedding order during post-compression rate-distortion optimization, thus allowing visually sensitive sites to be encoded with higher fidelity at a given bit-rate. For visual sensitivity analysis, we propose a contrast perception model that incorporates an adaptive masking slope. This versatile feature provides a context which models perceptual significance. It enables scene structures that otherwise suffer significant degradation to be preserved at lower bit-rates. The novelty in our approach derives from a set of "perceptual mappings" which account for quantization noise shaping effects induced by motion-compensated temporal synthesis. The proposed technique reduces wavelet compression artefacts and improves the perceptual quality of video.
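    The "lifting steps of a temporal subband transform" mentioned above can be sketched, without motion compensation, as the Haar lifting scheme: split the samples into even and odd sets, predict the odds from the evens, then update the evens. This is a generic illustration, not the thesis's MC-TDWT; the lifting structure is what guarantees perfect invertibility even when the predict step is made motion-compensated.

    ```python
    def haar_lift_forward(x):
        """One level of the lifting-scheme Haar transform on an even-length signal."""
        even, odd = x[0::2], x[1::2]
        detail = [o - e for o, e in zip(odd, even)]           # predict step
        approx = [e + d / 2.0 for e, d in zip(even, detail)]  # update step
        return approx, detail

    def haar_lift_inverse(approx, detail):
        """Undo the lifting steps in reverse order with inverted signs."""
        even = [a - d / 2.0 for a, d in zip(approx, detail)]
        odd = [d + e for d, e in zip(detail, even)]
        out = []
        for e, o in zip(even, odd):
            out.extend([e, o])
        return out

    frames = [10.0, 12.0, 11.0, 7.0, 8.0, 8.0]  # stand-in 1-D "temporal" signal
    a, d = haar_lift_forward(frames)
    assert haar_lift_inverse(a, d) == frames  # lifting gives perfect reconstruction
    ```

    Inserting a motion-compensation operator inside the predict step preserves this invertibility but, as the thesis argues, couples random access cost to the structure of the motion field.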

    A picture is worth a thousand words : content-based image retrieval techniques

    In my dissertation I investigate techniques for improving the state of the art in content-based image retrieval. To place my work into context, I highlight the current trends and challenges in my field by analyzing over 200 recent articles. Next, I propose a novel paradigm called "artificial imagination", which gives the retrieval system the power to imagine and think along with the user in terms of what she is looking for. I then introduce a new user interface for visualizing and exploring image collections, empowering the user to navigate large collections based on her own needs and preferences, while simultaneously providing her with an accurate sense of what the database has to offer. In the later chapters I present work dealing with millions of images and focus in particular on high-performance techniques that minimize memory and computational use for both near-duplicate image detection and web search. Finally, I show early work on a scene completion-based image retrieval engine, which synthesizes realistic imagery that matches what the user has in mind.
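    One common memory-efficient approach to near-duplicate detection, offered here only as a generic illustration rather than the dissertation's specific method, is average hashing: reduce each image to a 64-bit fingerprint and compare fingerprints by Hamming distance. The "images" below are small hypothetical brightness grids.

    ```python
    def average_hash(grid):
        """64-bit perceptual hash of an 8x8 brightness grid:
        each bit records whether that pixel is above the mean brightness."""
        flat = [v for row in grid for v in row]
        mean = sum(flat) / len(flat)
        bits = 0
        for v in flat:
            bits = (bits << 1) | (1 if v > mean else 0)
        return bits

    def hamming(a, b):
        """Number of differing bits; a small distance suggests a near-duplicate."""
        return bin(a ^ b).count("1")

    img = [[(r * 8 + c) % 17 * 15 for c in range(8)] for r in range(8)]
    # A near-duplicate: the same image with one pixel slightly brightened.
    dup = [row[:] for row in img]
    dup[0][0] += 3
    assert hamming(average_hash(img), average_hash(dup)) <= 4
    ```

    Because the fingerprint is 8 bytes regardless of image size, millions of images fit in memory and candidate duplicates can be found with cheap bitwise comparisons.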

    Stochastic representation of material heterogeneity and its effects on flow: applications in soils of mixed wettabilities

    This work presents an investigation into the modelling of the hydraulic behaviour of heterogeneous soils of mixed wettabilities. The model represents moisture transfer of a liquid phase, and can account for the highly non-uniform nature of unsaturated flow in soil due to the presence of strong heterogeneity. This is accounted for with Gaussian random fields to represent an arbitrary number of spatially varying material properties. The theoretical formulations are presented to model this complex behaviour, as well as their numerical solutions which form the foundation of the developed model. The chosen method of random field generation is also investigated in terms of reducing the error in the correlated structures near the domain boundaries, as well as removing the need to solve over an extended domain. Methods to account for water repellency in soil are also given, such that the exaggerated flow behaviour it exhibits in relation to wettable unsaturated soil can be represented. Validation of the model was conducted through representing field tracer experiments to assess the ability of the model to predict suitable vertical profiles of dye coverage. This was conducted for both wettable and water repellent soil, and quantified through confidence intervals such that a very low number of simulations was able to describe the overall model response to a high level of confidence. The following conclusions can be drawn. The inclusion of material heterogeneity is crucial in representing complex unstable flow processes in soil of any wettability. The non-linear constitutive behaviour of the material that was predicted by the model simulations would be difficult to account for without spatial variability of material parameters. Similarly, the applied field generation method is a fast way to introduce this, and the proposed method of error reduction in the correlation structure is suitable for complex domains. 
The proposed methods to account for hydrophobicity based on the soil water retention curve are sufficient to allow unstable flow to develop. The similarity in finger characteristics of both the wettable and water repellent cases with the experimental observations suggests the model is more than sufficient in representing the complex flow behaviour that both can exhibit.
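    A minimal sketch of how a Gaussian random field for a spatially varying material property can be generated, assuming a 1-D grid, an exponential covariance, and factorisation of the covariance matrix by Cholesky decomposition; the thesis's own generation method and covariance choice may differ.

    ```python
    import math
    import random

    def exp_covariance(xs, corr_len):
        """Exponential covariance matrix C_ij = exp(-|x_i - x_j| / L)."""
        return [[math.exp(-abs(a - b) / corr_len) for b in xs] for a in xs]

    def cholesky(C):
        """Lower-triangular L with L L^T = C (standard Cholesky algorithm)."""
        n = len(C)
        L = [[0.0] * n for _ in range(n)]
        for i in range(n):
            for j in range(i + 1):
                s = sum(L[i][k] * L[j][k] for k in range(j))
                if i == j:
                    L[i][j] = math.sqrt(C[i][i] - s)
                else:
                    L[i][j] = (C[i][j] - s) / L[j][j]
        return L

    def sample_field(xs, corr_len, rng):
        """Correlate i.i.d. standard normals with the Cholesky factor."""
        L = cholesky(exp_covariance(xs, corr_len))
        z = [rng.gauss(0.0, 1.0) for _ in xs]
        return [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(len(xs))]

    rng = random.Random(0)
    grid = [i / 10.0 for i in range(20)]
    field = sample_field(grid, corr_len=0.5, rng=rng)
    assert len(field) == len(grid)  # one correlated Gaussian value per grid point
    ```

    Cholesky sampling scales poorly with grid size, which is one reason spectral or turning-bands generators are preferred for large domains; the boundary-error issue discussed in the abstract arises with such periodic spectral methods.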