
    Application of Stochastic Diffusion for Hiding High Fidelity Encrypted Images

    Cryptography coupled with information hiding has received increased attention in recent years and has become a major research theme, because of the importance of protecting encrypted information in any Electronic Data Interchange system in a way that is both discreet and covert. One of the essential limitations of any cryptography system is that the encrypted data provides an indication of its importance, which arouses suspicion and makes it vulnerable to attack. Information hiding, or Steganography, provides a potential solution to this issue by making the data imperceptible, the security of the hidden information being threatened only if its existence is detected through Steganalysis. This paper focuses on methods for hiding encrypted information, specifically, methods that encrypt data before embedding it in host data, where the ‘data’ is a full colour digital image. Such methods provide a greater level of data security, especially when the information is to be transmitted over the Internet, since a potential attacker must first detect, then extract and then decrypt the embedded data in order to recover the original information. After providing an extensive survey of the current methods available, we present a new method of encrypting and then hiding full colour images in three full colour host images without loss of fidelity following data extraction and decryption. The applications of this technique, which is based on ‘Stochastic Diffusion’, are wide-ranging and include covert image information interchange, digital image authentication, video authentication, copyright protection and digital rights management of image data in general.
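    The encrypt-then-embed pipeline the abstract describes can be illustrated with a minimal sketch. Here the secret bit-plane is XORed with a key-seeded noise field, a simplified stand-in for the paper's actual Stochastic Diffusion cipher, and the ciphertext is written into the least significant bits of the host image; the function names and the single-channel LSB embedding are illustrative assumptions, not the paper's scheme.

```python
import numpy as np

def embed(host, secret_bits, key):
    # Encrypt the secret bit-plane by XOR with a key-seeded noise field
    # (a simplified stand-in for the paper's Stochastic Diffusion cipher),
    # then hide the ciphertext in the least significant bit of the host.
    rng = np.random.default_rng(key)
    noise = rng.integers(0, 2, size=secret_bits.shape, dtype=np.uint8)
    cipher = secret_bits ^ noise
    return (host & 0xFE) | cipher

def extract(stego, key):
    # Recover the ciphertext from the LSB plane and decrypt with the same key.
    rng = np.random.default_rng(key)
    noise = rng.integers(0, 2, size=stego.shape, dtype=np.uint8)
    return (stego & 1) ^ noise

host = np.random.default_rng(0).integers(0, 256, size=(64, 64), dtype=np.uint8)
secret = np.random.default_rng(1).integers(0, 2, size=(64, 64), dtype=np.uint8)
stego = embed(host, secret, key=42)
recovered = extract(stego, key=42)
assert np.array_equal(recovered, secret)  # lossless after extraction + decryption
assert np.max(np.abs(stego.astype(int) - host.astype(int))) <= 1  # host barely changed
```

    Because only the LSB plane is touched, extraction followed by decryption recovers the secret exactly, matching the "without loss of fidelity" property the abstract claims for the full-colour method.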

    Discrete Wavelet Transforms

    Discrete wavelet transform (DWT) algorithms have a firm position in signal processing across several areas of research and industry. Because the DWT provides both octave-scale frequency and spatial timing of the analyzed signal, it is increasingly used to solve ever more advanced problems. The present book, Discrete Wavelet Transforms: Algorithms and Applications, reviews recent progress in DWT algorithms and applications. The book covers a wide range of methods (e.g. lifting, shift invariance, multi-scale analysis) for constructing DWTs. The chapters are organized into four major parts. Part I describes progress in hardware implementations of DWT algorithms. Applications include multitone modulation for ADSL and equalization techniques, a scalable architecture for FPGA implementation, a lifting-based algorithm for VLSI implementation, a comparison between DWT- and FFT-based OFDM, and a modified SPIHT codec. Part II addresses image processing algorithms such as a multiresolution approach for edge detection, low-bit-rate image compression, low-complexity implementation of CQF wavelets and compression of multi-component images. Part III focuses on watermarking DWT algorithms. Finally, Part IV describes shift-invariant DWTs, the DC lossless property, DWT-based analysis and estimation of colored noise, and an application of the wavelet Galerkin method. The chapters consist of both tutorial and highly advanced material; the book is therefore intended as a reference text for graduate students and researchers seeking state-of-the-art knowledge on specific applications.
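    The lifting construction mentioned above can be sketched for one level of the Haar DWT: a predict step forms the detail (high-pass) band from even/odd samples, and an update step forms the approximation (low-pass) band. The function names are illustrative, not from the book.

```python
import numpy as np

def haar_dwt(x):
    # One level of the (unnormalised) Haar DWT via lifting.
    even, odd = x[0::2].astype(float), x[1::2].astype(float)
    detail = odd - even          # predict: difference of each pair
    approx = even + detail / 2   # update: pairwise means
    return approx, detail

def haar_idwt(approx, detail):
    # Invert the lifting steps in reverse order.
    even = approx - detail / 2
    odd = even + detail
    x = np.empty(2 * len(approx))
    x[0::2], x[1::2] = even, odd
    return x

x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
a, d = haar_dwt(x)
assert np.allclose(a, [5.0, 11.0, 7.0, 5.0])  # pairwise averages
assert np.allclose(haar_idwt(a, d), x)        # perfect reconstruction
```

    A key attraction of lifting, as the hardware chapters exploit, is that both steps are in-place and trivially invertible, so the inverse transform reuses the same arithmetic in reverse order.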

    BioClimate: a Science Gateway for Climate Change and Biodiversity research in the EUBrazilCloudConnect project

    [EN] Climate and biodiversity systems are closely linked across a wide range of scales. To better understand the mutual interaction between climate change and biodiversity there is a strong need for multidisciplinary skills, scientific tools, and access to a large variety of heterogeneous, often distributed, data sources. To this end, the EUBrazilCloudConnect project provides a user-oriented research environment, built on top of a federated cloud infrastructure across Europe and Brazil, to serve key needs in different scientific domains, validated through a set of use cases. Among them, the most data-centric one is focused on climate change and biodiversity research. As part of this use case, the BioClimate Science Gateway has been implemented to provide end-users transparent access to (i) a highly integrated, user-friendly environment, (ii) a large variety of data sources, and (iii) different analytics & visualization tools serving a wide spectrum of users' needs and requirements. This paper presents a complete overview of BioClimate and the related scientific environment, in particular its Science Gateway, delivered to the end-user community at the end of the project.
    This work was supported by the EU FP7 EUBrazilCloudConnect Project (Grant Agreement 614048) and CNPq/Brazil (Grant Agreement no. 490115/2013-6).
    Fiore, S.; Elia, D.; Blanquer Espert, I.; Brasileiro, FV.; Nuzzo, A.; Nassisi, P.; Rufino, LAA.... (2019). BioClimate: a Science Gateway for Climate Change and Biodiversity research in the EUBrazilCloudConnect project. Future Generation Computer Systems. 94:895-909. https://doi.org/10.1016/j.future.2017.11.034

    Micro Signal Extraction and Analytics

    This dissertation studies the extraction of signals that have smaller magnitudes—typically one order of magnitude or more—than the dominating signals, or the extraction of signals that have a smaller topological scale than what conventional algorithms resolve. We name such a problem the micro signal extraction problem. The micro signal extraction problem is challenging due to the relatively low signal strength. In terms of relative magnitude, the micro signal of interest may well be considered as one signal within a group of many types of tiny, nuisance signals, such as sensor noise and quantization noise. This group of nuisance signals is usually considered the “noisy,” unwanted component, in contrast to the “signal” component dominating the multimedia content. To extract the micro signal that has a much smaller magnitude than the dominating signal, and simultaneously to protect it from being corrupted by other nuisance signals, one usually has to tackle the problem with extra caution: the modeling assumptions behind a proposed extraction algorithm need to be closely calibrated with the behavior of the multimedia data. In this dissertation, we tackle three micro signal extraction problems by synergistically applying and adapting signal processing theories and techniques. In the first part of the dissertation, we use mobile imaging to extract a collection of directions of microscopic surfaces as a unique identifier for authentication and counterfeit detection purposes. This is the first work showing that the 3-D structure at the microscopic level can be precisely estimated using techniques related to photometric stereo. By enabling the mobile imaging paradigm, we have significantly reduced the barriers to extending the counterfeit detection system to end users. In the second part of the dissertation, we explore the possibility of extracting the Electric Network Frequency (ENF) signal from a single image. This problem is much more challenging than its audio and video counterparts, as the duration and the magnitude of the embedded signal are both very small. We investigate and show how the detectability of the ENF signal changes as a function of the magnitude of the embedded ENF signal. In the last part of the dissertation, we study the problem of heart-rate extraction from fitness exercise videos, which is challenging due to the presence of fitness motions. We show that a highly precise motion compensation scheme is the key to a reliable heart-rate extraction system.
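    The photometric-stereo estimation underlying the first part can be sketched per pixel: under a Lambertian model, intensities observed under k known light directions L satisfy I = L·g, where g is the albedo-scaled surface normal, recovered by least squares. The setup below is illustrative, not the dissertation's actual pipeline.

```python
import numpy as np

# Known light directions, one per row (k x 3), and a flat upward-facing facet.
L = np.array([[0.0, 0.0, 1.0],
              [0.7, 0.0, 0.7],
              [0.0, 0.7, 0.7]])
true_n = np.array([0.0, 0.0, 1.0])
albedo = 0.8
I = albedo * L @ true_n  # ideal noise-free intensity observations

# Solve I = L g in the least-squares sense; the norm of g is the albedo,
# its direction is the surface normal.
g, *_ = np.linalg.lstsq(L, I, rcond=None)
est_albedo = np.linalg.norm(g)
est_n = g / est_albedo
assert np.allclose(est_n, true_n, atol=1e-6)
assert abs(est_albedo - albedo) < 1e-6
```

    Repeating this per pixel over a microscopic surface patch yields the collection of surface directions that serves as the unique identifier.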

    Biometrics

    Biometrics uses methods for the unique recognition of humans based upon one or more intrinsic physical or behavioral traits. In computer science in particular, biometrics is used as a form of identity access management and access control. It is also used to identify individuals in groups that are under surveillance. The book consists of 13 chapters, each focusing on a certain aspect of the problem. The chapters are divided into three sections: physical biometrics, behavioral biometrics and medical biometrics. The key objective of the book is to provide a comprehensive reference and text on human authentication and identity verification from physiological, behavioural and other points of view. It aims to publish new insights into current innovations in computer systems and technology for biometrics development and its applications. The book was reviewed by the editor, Dr. Jucheng Yang, and by many of the guest editors, including Dr. Girija Chetty, Dr. Norman Poh, Dr. Loris Nanni, Dr. Jianjiang Feng, Dr. Dongsun Park and Dr. Sook Yoon, who also made significant contributions to the book.

    Data hiding in multimedia - theory and applications

    Multimedia data hiding, or steganography, is a means of communication using subliminal channels. The resource for the subliminal communication scheme is the distortion of the original content that can be tolerated. This thesis addresses two main issues of steganographic communication schemes: 1. How does one maximize the distortion introduced without affecting the fidelity of the content? 2. How does one efficiently utilize the resource (the distortion introduced) for communicating as many bits of information as possible? In other words, what is a good signaling strategy for the subliminal communication scheme? Close-to-optimal solutions for both issues are analyzed. Many techniques for maximizing the resource, viz. the distortion that can be introduced imperceptibly in images and video frames, are proposed. Different signaling strategies for steganographic communication are explored, and a novel signaling technique employing a floating signal constellation is proposed. Algorithms for optimal choices of the parameters of the signaling technique are presented. Other application-specific issues, such as the type of robustness needed, are taken into consideration, along with the established theoretical background, to design optimal data hiding schemes. In particular, two very important applications of data hiding are addressed: data hiding for multimedia content delivery, and data hiding for watermarking (for proving ownership). A robust watermarking protocol for unambiguous resolution of ownership is proposed.
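    One standard signaling strategy of the kind the thesis analyzes is Quantization Index Modulation (QIM): each host sample is quantized onto one of two interleaved lattices, selected by the message bit, trading a bounded distortion for one embedded bit per sample. This is a textbook scheme shown for illustration, not the thesis's floating signal constellation.

```python
import numpy as np

def qim_embed(x, bits, delta=8.0):
    # Quantize each sample to the lattice {k*delta} for bit 0,
    # or to the shifted lattice {k*delta + delta/2} for bit 1.
    offsets = bits * (delta / 2)
    return np.round((x - offsets) / delta) * delta + offsets

def qim_detect(y, delta=8.0):
    # Decode by deciding which of the two lattices each sample is nearest to.
    d0 = np.abs(y - np.round(y / delta) * delta)
    d1 = np.abs(y - (np.round((y - delta / 2) / delta) * delta + delta / 2))
    return (d1 < d0).astype(int)

rng = np.random.default_rng(0)
x = rng.uniform(0, 255, size=100)          # host samples
bits = rng.integers(0, 2, size=100)        # message to hide
y = qim_embed(x, bits)
assert np.array_equal(qim_detect(y), bits)  # all bits recovered
assert np.max(np.abs(y - x)) <= 4.0 + 1e-9  # distortion bounded by delta/2
```

    The step size delta is exactly the "resource" trade-off the thesis formalizes: larger delta buys robustness to noise at the cost of more visible distortion.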

    Recent Advances in Signal Processing

    Signal processing is a critical task in the majority of new technological inventions and challenges, across a variety of applications in both science and engineering. Classical signal processing techniques have largely worked with mathematical models that are linear, local, stationary, and Gaussian, and have always favored closed-form tractability over real-world accuracy. These constraints were imposed by the lack of powerful computing tools. During the last few decades, signal processing theories, developments, and applications have matured rapidly and now include tools from many areas of mathematics, computer science, physics, and engineering. This book is targeted primarily toward students and researchers who want to be exposed to a wide variety of signal processing techniques and algorithms. It includes 27 chapters that can be categorized into five different areas depending on the application at hand. These five categories address, in order: image processing, speech processing, communication systems, time-series analysis, and educational packages. The book has the advantage of providing a collection of applications that are completely independent and self-contained; the interested reader can therefore choose any chapter and skip to another without losing continuity.

    Feature Selection and Classifier Development for Radio Frequency Device Identification

    The proliferation of simple and low-cost devices, such as IEEE 802.15.4 ZigBee and Z-Wave, in Critical Infrastructure (CI) increases security concerns. Radio Frequency Distinct Native Attribute (RF-DNA) Fingerprinting facilitates biometric-like identification of electronic devices from variances in their hardware-induced emissions. Developing reliable classifier models using RF-DNA fingerprints is thus important for device discrimination, to enable reliable Device Classification (a one-to-many "looks most like" assessment) and Device ID Verification (a one-to-one "looks how much like" assessment). AFIT's prior RF-DNA work focused on Multiple Discriminant Analysis/Maximum Likelihood (MDA/ML) and Generalized Relevance Learning Vector Quantized Improved (GRLVQI) classifiers. This work 1) introduces a new GRLVQI-Distance (GRLVQI-D) classifier that extends prior GRLVQI work by supporting alternative distance measures, 2) formalizes a framework for selecting competing distance measures for GRLVQI-D, 3) introduces response surface methods for optimizing GRLVQI and GRLVQI-D algorithm settings, 4) develops an MDA-based Loadings Fusion (MLF) Dimensional Reduction Analysis (DRA) method for improved classifier-based feature selection, 5) introduces the F-test as a DRA method for RF-DNA fingerprints, 6) provides a phenomenological understanding of test statistics and p-values, with KS-test and F-test statistic values shown to be superior to p-values for DRA, and 7) introduces quantitative dimensionality assessment methods for DRA subset selection.
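    A common RF-DNA fingerprint construction, on which classifiers such as MDA/ML and GRLVQI then operate, splits a signal response (e.g. the instantaneous amplitude of a burst's preamble) into subregions and computes a few statistics per region. The sketch below follows that general recipe; the region count and statistics chosen are illustrative, not this work's exact configuration.

```python
import numpy as np

def rf_dna_fingerprint(signal, n_regions=4):
    # Per subregion, compute variance, skewness and kurtosis; concatenating
    # them over all regions yields the fingerprint vector that a classifier
    # (e.g. MDA/ML or GRLVQI) would consume.
    feats = []
    for region in np.array_split(signal, n_regions):
        mu, sigma = region.mean(), region.std()
        z = (region - mu) / sigma
        feats += [sigma**2, np.mean(z**3), np.mean(z**4)]
    return np.array(feats)

rng = np.random.default_rng(0)
burst = rng.normal(0.0, 1.0, 400)   # stand-in for a captured burst response
fp = rf_dna_fingerprint(burst)
assert fp.shape == (12,)            # 4 regions x 3 statistics per region
```

    The DRA methods the abstract lists (MLF, F-test, KS-test ranking) then select the most discriminating subset of these fingerprint dimensions before classification.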

    Detection of network anomalies and novel attacks in the internet via statistical network traffic separation and normality prediction

    With the advent and explosive growth of the global Internet and the electronic commerce environment, adaptive/automatic network and service anomaly detection is fast gaining critical research and practical importance. If the next generation of network technology is to operate beyond the levels of current networks, it will require a set of well-designed management tools that provide the capability of dynamically and reliably identifying network anomalies. Early detection of network anomalies and performance degradations is key to rapid fault recovery and robust networking, and has been receiving increasing attention lately. In this dissertation we present a network anomaly detection methodology, which relies on the analysis of network traffic and the characterization of the dynamic statistical properties of traffic normality, in order to detect network anomalies accurately and in a timely manner. Anomaly detection is based on the concept that perturbations of normal behavior suggest the presence of anomalies, faults, attacks etc. This methodology can be uniformly applied to detect network attacks, especially in cases where novel attacks are present and the nature of the intrusion is unknown. Specifically, in order to provide an accurate identification of normal network traffic behavior, we first develop an anomaly-tolerant non-stationary traffic prediction technique, which is capable of removing both pulse and continuous anomalies. Furthermore, we introduce and design dynamic thresholds, and based on them we define adaptive anomaly violation conditions, as a combined function of both the magnitude and duration of the traffic deviations. Numerical results are presented that demonstrate the operational effectiveness and efficiency of the proposed approach under different anomaly traffic scenarios and attacks, such as mail-bombing and UDP flooding attacks.
In order to improve the prediction accuracy of the statistical network traffic normality, especially in cases where high burstiness is present, we propose, study and analyze a new network traffic prediction methodology, based on frequency-domain traffic analysis and filtering, with the objective of enhancing the network anomaly detection capabilities. Our approach is based on the observation that the various network traffic components are better identified, represented and isolated in the frequency domain. As a result, the traffic can be effectively separated into a baseline component, which includes most of the low-frequency traffic and presents low burstiness, and the short-term traffic, which includes the most dynamic part. The baseline traffic is a mean non-stationary periodic time series, and the Extended Resource-Allocating Network (BRAN) methodology is used for its accurate prediction. The short-term traffic is shown to be a time-dependent series, and the Autoregressive Moving Average (ARMA) model is proposed for the accurate prediction of this component. Furthermore, it is demonstrated that the proposed enhanced traffic prediction strategy can be combined with the use of dynamic thresholds and adaptive anomaly violation conditions, in order to improve the network anomaly detection effectiveness. The performance evaluation of the proposed overall strategy, in terms of the achievable network traffic prediction accuracy and anomaly detection capability, and the corresponding numerical results demonstrate and quantify the significant improvements that can be achieved.