    Wavelet Domain Watermark Detection and Extraction using the Vector-based Hidden Markov Model

    Multimedia data piracy is a growing problem in view of the ease and simplicity provided by the internet in transmitting and receiving such data. A possible solution to preclude unauthorized duplication or distribution of digital data is watermarking: the embedding of an identifiable piece of information that provides security against multimedia piracy. This thesis is concerned with the investigation of various image watermarking schemes in the wavelet domain using the statistical properties of the wavelet coefficients. The wavelet subband coefficients of natural images have significantly non-Gaussian, heavy-tailed features that are best described by heavy-tailed distributions. Moreover, the wavelet coefficients of images have strong inter-scale and inter-orientation dependencies. In view of this, the vector-based hidden Markov model is found to be best suited to characterize the wavelet coefficients. In this thesis, this model is used to develop new digital image watermarking schemes. Additive and multiplicative watermarking schemes in the wavelet domain are developed in order to provide improved detection and extraction of the watermark. Blind watermark detectors using the log-likelihood ratio test, and watermark decoders using the maximum likelihood criterion to blindly extract the embedded watermark bits from the observation data, are designed. Extensive experiments are conducted throughout this thesis using a number of databases selected from a wide variety of natural images. Simulation results are presented to demonstrate the effectiveness of the proposed image watermarking schemes and their superiority over some of the state-of-the-art techniques. It is shown that, owing to the use of the hidden Markov model to characterize the distributions of the wavelet coefficients of images, the proposed watermarking algorithms result in higher detection and decoding rates both before and after subjecting the watermarked image to various kinds of attacks.
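
    The abstract describes blind detection via a log-likelihood ratio test over wavelet coefficients. As a minimal sketch of that test structure (not the thesis's vector-based HMM detector), the code below swaps in a simpler i.i.d. generalized Gaussian model for the host coefficients; all names, parameter values, and the synthetic data are illustrative assumptions.

```python
import numpy as np

def gg_loglik(x, alpha, beta):
    """Log-density of a zero-mean generalized Gaussian, up to an additive constant."""
    return -np.abs(x / alpha) ** beta

def llr_detect(y, w, gamma, alpha, beta, threshold):
    """Blind log-likelihood-ratio test for an additive watermark.

    H0: y = x            (no watermark present)
    H1: y = x + gamma*w  (watermark w embedded with strength gamma)
    The host x is modeled as i.i.d. generalized Gaussian (scale alpha, shape beta).
    """
    llr = np.sum(gg_loglik(y - gamma * w, alpha, beta) - gg_loglik(y, alpha, beta))
    return llr, llr > threshold

# Toy usage: heavy-tailed synthetic "wavelet coefficients" and a bipolar watermark.
rng = np.random.default_rng(0)
x = rng.laplace(scale=5.0, size=4096)      # heavy-tailed host coefficients
w = rng.choice([-1.0, 1.0], size=x.size)   # spread-spectrum watermark sequence
y = x + 0.8 * w                            # watermarked observation
print(llr_detect(y, w, gamma=0.8, alpha=5.0, beta=1.0, threshold=0.0))
```

    In the thesis's setting, the i.i.d. generalized Gaussian log-density would be replaced by the vector-based HMM likelihood, and the threshold chosen to achieve a target false-alarm rate.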

    Probabilistic modeling of wavelet coefficients for processing of image and video signals

    Statistical estimation and detection techniques are widely used in signal processing, including wavelet-based image and video processing. The probability density function (PDF) of the wavelet coefficients of image and video signals plays a key role in the development of techniques for such processing. Due to their fixed number of parameters, the conventional PDFs used in estimators and detectors usually ignore higher-order moments. Consequently, estimators and detectors designed using such PDFs do not provide a satisfactory performance. This thesis is concerned with first developing a probabilistic model that is capable of incorporating an appropriate number of parameters that depend on higher-order moments of the wavelet coefficients. This model is then used as the prior to propose certain estimation and detection techniques for denoising and watermarking of image and video signals. Towards developing the probabilistic model, the Gauss-Hermite series expansion is chosen, since the wavelet coefficients have non-compact support and their empirical density function shows a resemblance to the standard Gaussian function. A modification is introduced in the series expansion so that only a finite number of terms can be used for modeling the wavelet coefficients without rendering the resulting PDF negative. The parameters of the resulting PDF, called the modified Gauss-Hermite (MGH) PDF, are evaluated in terms of the higher-order sample moments. It is shown that the MGH PDF fits the empirical density function better than the existing PDFs that use a limited number of parameters. The proposed MGH PDF is used as the prior of image and video signals in designing maximum a posteriori and minimum mean squared error-based estimators for denoising of image and video signals, and a log-likelihood ratio-based detector for watermarking of image signals. The performance of the estimation and detection techniques is then evaluated in terms of the commonly used metrics. It is shown through extensive experimentation that the estimation and detection techniques developed utilizing the proposed MGH PDF perform substantially better than those that utilize the conventional PDFs. These results confirm that the superior fit of the MGH PDF to the empirical density function, resulting from its flexibility in choosing the number of parameters, which are functions of the higher-order moments of the data, leads to better performance. Thus, the proposed MGH PDF should play a significant role in wavelet-based image and video signal processing.
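
    The abstract does not give the exact form of the MGH PDF. As a hedged sketch of the general idea, the code below builds the classical Gram-Charlier (Gauss-Hermite series) expansion around a Gaussian, with coefficients computed from higher-order sample moments; the crude clipping step stands in for the thesis's modification that keeps the PDF non-negative.

```python
import numpy as np

def gram_charlier_pdf(x, data):
    """Gauss-Hermite (Gram-Charlier A) series density estimate built from
    sample moments of `data`, evaluated at the points `x`. Truncated after
    the fourth-order term; negative values are clipped, which is one crude
    way to keep the PDF valid (the thesis modifies the expansion instead)."""
    mu, sigma = data.mean(), data.std()
    z = (x - mu) / sigma
    m3 = np.mean(((data - mu) / sigma) ** 3)        # sample skewness
    m4 = np.mean(((data - mu) / sigma) ** 4) - 3.0  # sample excess kurtosis
    he3 = z**3 - 3*z                                # probabilists' Hermite He3
    he4 = z**4 - 6*z**2 + 3                         # probabilists' Hermite He4
    phi = np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)    # standard Gaussian kernel
    f = phi * (1 + m3 / 6 * he3 + m4 / 24 * he4) / sigma
    return np.clip(f, 0, None)

# Toy usage: fit the series density to heavy-tailed synthetic coefficients.
rng = np.random.default_rng(1)
coeffs = rng.laplace(scale=2.0, size=10_000)
grid = np.linspace(-10, 10, 201)
pdf = gram_charlier_pdf(grid, coeffs)
```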

    Recent Advances in Signal Processing

    Signal processing is a critical task in the majority of new technological inventions and challenges, across a variety of applications in both science and engineering. Classical signal processing techniques have largely worked with mathematical models that are linear, local, stationary, and Gaussian, and have always favored closed-form tractability over real-world accuracy. These constraints were imposed by the lack of powerful computing tools. During the last few decades, signal processing theories, developments, and applications have matured rapidly and now include tools from many areas of mathematics, computer science, physics, and engineering. This book is targeted primarily toward students and researchers who want to be exposed to a wide variety of signal processing techniques and algorithms. It includes 27 chapters that can be categorized into five different areas depending on the application at hand; these five categories address, in order, image processing, speech processing, communication systems, time-series analysis, and educational packages. The book has the advantage of providing a collection of applications that are completely independent and self-contained, so the interested reader can choose any chapter and skip to another without losing continuity.

    Minimum spanning tree reconstruction using autoencoders

    Minimum spanning trees are widely used graph structures for finding relationships between data organized in a graph. This work proposes a new approach based on neural networks, in particular autoencoders, to extract this tree from the dissimilarity representation of the data, which is viewed as a noisy version of the minimum spanning tree. After extensive tests on different networks, the results confirm the validity of the idea.
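
    As context for the approach, the baseline step of recovering a minimum spanning tree from a dissimilarity matrix can be sketched as below; in the paper's pipeline the matrix would first be denoised by a trained autoencoder, a step omitted here (the data and setup are illustrative assumptions).

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

# Pairwise dissimilarities for five points on a line; in the paper's setting
# this matrix would first be passed through a trained (denoising) autoencoder.
pts = np.array([0.0, 1.0, 2.5, 2.6, 5.0])
D = np.abs(pts[:, None] - pts[None, :])   # symmetric dissimilarity matrix

mst = minimum_spanning_tree(D)            # sparse matrix; nonzeros are tree edges
edges = np.transpose(mst.nonzero())       # (i, j) index pairs of the MST edges
print(edges, mst.sum())                   # edge list and total tree weight
```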

    Dynamical Systems

    Complex systems are pervasive in many areas of science and are integrated into our daily lives. Examples include financial markets, highway transportation networks, telecommunication networks, world and country economies, social networks, immunological systems, living organisms, computational systems, and electrical and mechanical structures. Complex systems are often composed of a large number of interconnected and interacting entities, exhibiting much richer global-scale dynamics than the properties and behavior of the individual entities. Complex systems are studied in many areas of the natural sciences, social sciences, engineering, and mathematical sciences. This special issue therefore intends to contribute towards the dissemination of the multifaceted concepts in accepted use by the scientific community. We hope readers enjoy this pertinent selection of papers, which represents relevant examples of the state of the art in present-day research. [...]

    Applications of MATLAB in Science and Engineering

    The book consists of 24 chapters illustrating a wide range of areas where MATLAB tools are applied. These areas include mathematics, physics, chemistry and chemical engineering, mechanical engineering, biological (molecular biology) and medical sciences, communication and control systems, digital signal, image and video processing, and system modeling and simulation. Many interesting problems have been included throughout the book, and its contents will be beneficial for students and professionals across a wide range of areas of interest.

    Parameter Estimation and Uncertainty Analysis of Contaminant First Arrival Times at Household Drinking Water Wells

    Exposure assessment, which is an investigation of the extent of human exposure to a specific contaminant, must include estimates of the duration and frequency of exposure. For a groundwater system, the duration of exposure is controlled largely by the arrival time of the contaminant of concern at a drinking water well. This arrival time, which is normally estimated by using groundwater flow and transport models, can have a range of possible values due to the uncertainties that are typically present in real problems. Earlier arrival times generally represent low-likelihood events, but they play a crucial role in a decision-making process that must be conservative and precautionary, especially when evaluating the potential for adverse health impacts. Therefore, an emphasis must be placed on the accuracy of the leading tail region of the likelihood distribution of possible arrival times. To demonstrate an approach to quantifying the uncertainty of arrival times, a real contaminant transport problem involving TCE contamination due to releases from the Lockformer Company Facility in Lisle, Illinois, is used. The approach used in this research consists of two major components: inverse modelling, or parameter estimation, and uncertainty analysis. The parameter estimation process for this case study was selected in light of insufficiencies in the model and observational data arising from errors, biases, and limitations. Its purpose, which is to aid in characterising uncertainty, was also considered by including many possible variations in an attempt to minimize assumptions. A preliminary investigation was conducted using a well-accepted parameter estimation method, PEST, and the corresponding findings were used to define the characteristics of the parameter estimation process applied to this case study. Numerous objective functions, including the well-known L2-estimator, robust estimators (L1-estimators and M-estimators), penalty functions, and deadzones, were incorporated in the parameter estimation process to treat specific insufficiencies. The concept of equifinality was adopted, and multiple maximum likelihood parameter sets were accepted if pre-defined physical criteria were met. For each objective function, three procedures were implemented as part of the parameter estimation approach for the given case study: a multistart procedure, a stochastic search using the Dynamically-Dimensioned Search (DDS), and a test for acceptance based on predefined physical criteria. The best performance, in terms of the ability of parameter sets to satisfy the physical criteria, was achieved using a Cauchy M-estimator that was modified for this study and designated the LRS1 M-estimator. Due to uncertainties, multiple parameter sets obtained with the LRS1 M-estimator, the L1-estimator, and the L2-estimator are recommended for use in uncertainty analysis. Penalty functions had to be incorporated into the objective function definitions to generate a sufficient number of acceptable parameter sets; in contrast, deadzones proved to produce negligible benefits. The characteristics of the parameter sets were examined in terms of frequency histograms and plots of parameter value versus objective function value to infer the nature of the likelihood distributions of the parameters. The correlation structure was estimated using Pearson's product-moment correlation coefficient.
    The parameters are generally distributed uniformly or appear to follow a random pattern, with few correlations in the parameter space that results after the implementation of the multistart procedure. The execution of the search procedure introduces many correlations and yields parameter distributions that appear to be lognormal, normal, or uniform. The application of the physical criteria refines the parameter characteristics in the parameter space resulting from the search procedure by reducing anomalies. The combined effect of optimization and the application of the physical criteria performs the function of behavioural thresholds by removing parameter sets with high objective function values. Uncertainty analysis is performed with parameter sets obtained through two different sampling methodologies: the Monte Carlo sampling methodology, which randomly and independently samples from user-defined distributions, and the physically-based DDS-AU (P-DDS-AU) sampling methodology, which is developed based on the multiple parameter sets acquired during the parameter estimation process. Monte Carlo samples are found to be inadequate for uncertainty analysis of this case study due to the method's inability to find parameter sets that meet the predefined physical criteria. Successful results are achieved using the P-DDS-AU sampling methodology, which inherently accounts for parameter correlations and does not require assumptions regarding parameter distributions. For the P-DDS-AU samples, uncertainty representation is performed using four definitions based on pseudo-likelihoods: two based on the Nash–Sutcliffe efficiency criterion, and two based on inverse error or residual variance. The definitions contain shaping factors that strongly affect the resulting likelihood distribution, and some definitions are also affected by the objective function definition. Therefore, all variations are considered in the development of likelihood distribution envelopes, which are designed to maximize the amount of information available to decision-makers. The considerations that are important to the creation of an uncertainty envelope are outlined in this thesis. In general, greater uncertainty appears to be present at the tails of the distribution. For a refinement of the uncertainty envelopes, the application of additional physical criteria is recommended. The selection of likelihood and objective function definitions and their properties should be based on the needs of the problem; therefore, preliminary investigations should always be conducted to provide a basis for selecting appropriate methods and definitions. It is imperative to remember that the communication of the assumptions and definitions used in both parameter estimation and uncertainty analysis is crucial in decision-making scenarios.
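
    To make the role of the robust objective functions concrete, the sketch below compares the L2, L1, and a standard Cauchy M-estimator loss on residuals containing one gross outlier. The thesis's LRS1 M-estimator is a modified Cauchy form whose exact definition is not given in the abstract, so the standard Cauchy loss and the tuning constant are used here as assumptions.

```python
import numpy as np

def l2_obj(r):
    """Classical least-squares objective; dominated by large residuals."""
    return np.sum(r ** 2)

def l1_obj(r):
    """Robust L1 objective; grows only linearly with outlier size."""
    return np.sum(np.abs(r))

def cauchy_obj(r, c=2.385):
    """Standard Cauchy M-estimator loss (c is a common tuning constant);
    the thesis's LRS1 variant modifies this form."""
    return np.sum((c ** 2 / 2) * np.log1p((r / c) ** 2))

# Residuals between simulated and observed arrival times, with one gross outlier.
r = np.array([0.3, -0.5, 0.1, 8.0, -0.2])
print(l2_obj(r), l1_obj(r), cauchy_obj(r))  # L2 is dominated by the outlier
```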

    Cryptography and Its Applications in Information Security

    Nowadays, mankind is living in a cyber world. Modern technologies involve fast communication links between potentially billions of devices through complex networks (satellite, mobile phone, Internet, Internet of Things (IoT), etc.). The main concern posed by these entangled complex networks is their protection against passive and active attacks that could compromise public security (sabotage, espionage, cyber-terrorism) and privacy. This Special Issue, "Cryptography and Its Applications in Information Security", addresses a range of problems related to the security of information in networks and multimedia communications, and aims to bring together researchers, practitioners, and industry professionals interested in such questions. It consists of eight peer-reviewed papers that, while easily understandable, cover a range of subjects and applications related to information security.

    Intelligent Circuits and Systems

    ICICS-2020 is the third conference initiated by the School of Electronics and Electrical Engineering at Lovely Professional University; it explored recent innovations by researchers working on the development of smart and green technologies in the fields of Energy, Electronics, Communications, Computers, and Control. ICICS enables innovators to identify new opportunities for the social and economic benefit of society. The conference bridges the gap between academia, R&D institutions, social visionaries, and experts from all strata of society, allowing them to present their ongoing research activities and fostering research relations among them. It provides opportunities for the exchange of new ideas, applications, and experiences in the field of smart technologies, and for finding global partners for future collaboration. ICICS-2020 was conducted in two broad categories: Intelligent Circuits & Intelligent Systems, and Emerging Technologies in Electrical Engineering.

    Applied Metaheuristic Computing

    For decades, Applied Metaheuristic Computing (AMC) has been a prevailing optimization technique for tackling complex engineering and business problems, such as scheduling, routing, ordering, bin packing, assignment, and facility layout planning, among others. This is partly because classic exact methods are constrained by prior assumptions, and partly because heuristics are problem-dependent and lack generalization. AMC, in contrast, guides the course of low-level heuristics to search beyond the local optimality that limits traditional computational methods. This topic series has collected quality papers proposing cutting-edge methodologies and innovative applications that drive the advances of AMC.
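
    As a minimal illustration of how a metaheuristic guides a low-level heuristic past local optima, the sketch below applies simulated annealing (one representative AMC technique, chosen here as an assumption) to a toy two-machine scheduling problem; all names and parameter values are illustrative.

```python
import numpy as np

def makespan(assign, jobs):
    """Makespan of a two-machine schedule given a 0/1 assignment vector."""
    return max(jobs[assign == 0].sum(), jobs[assign == 1].sum())

def anneal(jobs, steps=20_000, t0=10.0, seed=0):
    """Simulated annealing: a low-level move (flip one job's machine) is
    guided past local optima by occasionally accepting worse solutions."""
    rng = np.random.default_rng(seed)
    assign = rng.integers(0, 2, size=jobs.size)
    cost = makespan(assign, jobs)
    best, best_cost = assign.copy(), cost
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9           # linear cooling schedule
        cand = assign.copy()
        cand[rng.integers(jobs.size)] ^= 1        # flip one job's machine
        c = makespan(cand, jobs)
        # Accept improvements always, and worse moves with Boltzmann probability.
        if c < cost or rng.random() < np.exp((cost - c) / t):
            assign, cost = cand, c
            if cost < best_cost:
                best, best_cost = assign.copy(), cost
    return best, best_cost

jobs = np.random.default_rng(42).integers(1, 100, size=30).astype(float)
print(anneal(jobs)[1], jobs.sum() / 2)  # best makespan vs. ideal lower bound
```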