
    Using Quaternion Fourier Transform in Steganography Systems

    Steganography is the discipline of exchanging messages in such a way that no one, other than the intended recipient, suspects the existence of the message. The transmitted message can be textual or multimedia (audio, image or video) and can be hidden within cover media; moreover, the hidden message can be in either plain or cipher form. In steganography, the majority of hiding techniques operate either in the spatial domain or in the frequency domain of the cover media. The current contribution introduces a new steganography technique for hiding a textual message within a cover image. Both the message and the cover image are converted to quaternion form, and then only the quaternion message is converted to the frequency domain using the Quaternion Fast Fourier Discrete Transform (QFFDT). Simple quaternion mathematics is used to combine the message (in the quaternion frequency domain) with the cover image (in quaternion form). Conversely, the hidden message can be revealed at the receiver using simple quaternion mathematics in the presence of the original cover image. The proposed method allows hiding a large amount of data and is considerably harder to defeat by steganalysis than traditional methods. The method is assessed using well-known performance metrics, and the obtained results show that it is robust and more secure against steganalysis attacks without affecting the consumed bandwidth of the communication channel.
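    The additive embed/extract step described above can be sketched in plain Python, with each quaternion reduced to a 4-tuple of components; the QFFDT stage is omitted, and the function names and the combine rule (stego = cover + alpha * message) are illustrative assumptions rather than the paper's exact scheme.

    ```python
    # Toy quaternion-domain embedding sketch. Quaternions are 4-tuples
    # (w, x, y, z); the combine rule and alpha are illustrative assumptions.

    def qadd(p, q):
        return tuple(a + b for a, b in zip(p, q))

    def qsub(p, q):
        return tuple(a - b for a, b in zip(p, q))

    def qscale(s, q):
        return tuple(s * a for a in q)

    def embed(cover, msg, alpha=0.01):
        # stego = cover + alpha * message, applied per quaternion pixel
        return [qadd(c, qscale(alpha, m)) for c, m in zip(cover, msg)]

    def extract(stego, cover, alpha=0.01):
        # message = (stego - cover) / alpha; requires the original cover,
        # matching the non-blind recovery described in the abstract.
        return [qscale(1.0 / alpha, qsub(s, c)) for s, c in zip(stego, cover)]

    cover = [(0.5, 0.2, 0.1, 0.9), (0.3, 0.8, 0.4, 0.6)]
    msg   = [(1.0, 0.0, 1.0, 1.0), (0.0, 1.0, 0.0, 1.0)]
    stego = embed(cover, msg)
    recovered = extract(stego, cover)
    ```

    A small alpha keeps the stego image visually close to the cover, which is the usual imperceptibility trade-off in additive embedding.
    
    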

    A Study on Visually Encrypted Images for Rights Protection and Authentication

    Tokyo Metropolitan University, 2014-03-25, Doctor of Engineering, Kō No. 444

    Digital watermarking methods for data security and authentication

    Philosophiae Doctor - PhD. Cryptology is the study of systems that typically originate from a consideration of the ideal circumstances under which secure information exchange is to take place. It involves the study of cryptographic and other processes that might be introduced for breaking the output of such systems - cryptanalysis. This includes the introduction of formal mathematical methods for the design of a cryptosystem and for estimating its theoretical level of security.

    User-controlled cyber-security using automated key generation

    Traditionally, several different methods are fully capable of providing an adequate degree of security against the threats and attacks that exist for revealing different keys. Though almost all the traditional methods give a good level of immunity to any possible breach of security keys, the biggest issue with these methods is their dependence on third-party applications. The use of third-party applications is therefore not acceptable for high-security applications. For high-security applications, it is more secure for the key generation process to be in the hands of the end users rather than a third party. Giving third parties access to high-security applications can also make the applications more vulnerable to data theft, security breaches or even a loss of integrity. In this research, the evolutionary computing tool Eureqa is used for the generation of encryption keys obtained by modelling pseudo-random input data. Previous approaches using this tool have required a calculation time too long for practical use, and addressing this drawback is the main focus of the research. The work proposes a number of new approaches to the generation of secret keys for the encryption and decryption of data files, and they are compared in their ability to operate in a secure manner using a range of statistical tests and in their ability to reduce calculation time using realistic practical assessments. Common tests of performance include throughput, chi-square, histogram, time for encryption and decryption, key sensitivity and entropy analysis. From the results of the statistical tests, it can be concluded that the proposed data encryption and decryption algorithms are both reliable and secure, which eliminates the dependence on third-party applications for the security keys. It also takes users less time to generate highly secure keys compared to previously known techniques. The keys generated via Eureqa also have great potential to be adapted to data communication applications which require high security.
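    Two of the statistical tests named above, entropy and chi-square, can be sketched in a few lines of stdlib Python. Here os.urandom stands in for the Eureqa-generated key material, which is an assumption made purely so the sketch is runnable.

    ```python
    # Entropy and chi-square checks for a candidate key stream.
    # os.urandom is a stand-in for the Eureqa-generated data (assumption).
    import math
    import os
    from collections import Counter

    def byte_entropy(data: bytes) -> float:
        """Shannon entropy in bits per byte (8.0 is the uniform maximum)."""
        n = len(data)
        counts = Counter(data)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    def chi_square(data: bytes) -> float:
        """Chi-square statistic against a uniform byte distribution
        (255 degrees of freedom; values near 255 indicate uniformity)."""
        expected = len(data) / 256
        counts = Counter(data)
        return sum((counts.get(b, 0) - expected) ** 2 / expected
                   for b in range(256))

    key = os.urandom(65536)
    print(f"entropy    = {byte_entropy(key):.4f} bits/byte")
    print(f"chi-square = {chi_square(key):.1f}")
    ```

    A well-behaved key stream should score close to 8 bits/byte of entropy and a chi-square statistic near its 255 degrees of freedom; large deviations in either test flag a biased generator.
    
    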

    Buyer-seller watermarking protocol in digital cinema

    Master's thesis - Master of Science.

    Design and Analysis of Fair Content Tracing Protocols

    The work in this thesis examines protocols designed to address the issues of tracing illegal distribution of digital content in a fair manner. In digital content distribution, a client requests content from a distributor, and the distributor sends content to the client. The main concern is misuse of content by the client, such as illegal distribution. As a result, digital watermarking schemes that enable the distributor to trace copies of content and identify the perpetrator were proposed. However, such schemes do not provide a mechanism for the distributor to prove to a third party that a client illegally distributed copies of content. Furthermore, it is possible that the distributor falsely accuses a client, as he has total control of the tracing mechanisms. Fair content tracing (FaCT) protocols were thus proposed to allow tracing of content that discriminates against neither the distributor nor the client. Many FaCT protocols have been proposed, mostly without an appropriate design framework, so there is no obvious and systematic way to evaluate them. Therefore, we propose a framework that provides a definition of security and enables classification of FaCT protocols so that they can be analysed in a systematic manner. We define, based on our framework, four main categories of FaCT protocols and propose new approaches to designing them. The first category is protocols without trusted third parties. As the name suggests, these protocols do not rely on a central trusted party for fair tracing of content. It is difficult to design such a protocol without drawing on extra measures that increase communication and computation costs. We show this is the case by demonstrating flaws in two recent proposals. We also illustrate a possible repair based on relaxing the assumption of trust on the distributor. The second category is protocols with online trusted third parties, where a central online trusted party is deployed.
    This means a trusted party must always be available during content distribution between the distributor and the client. While the availability of a trusted third party may simplify the design of such protocols, efficiency may suffer due to the need to communicate with this third party. The third category is protocols with offline trusted third parties, where a central offline trusted party is deployed. The difference between the offline and the online trusted party is that the offline trusted party need not be available during content distribution. It only needs to be available during the initial setup and when there is a dispute between the distributor and the client. This reduces the communication requirements compared to using an online trusted party. Using a symmetric cryptographic primitive known as Chameleon encryption, we propose a new approach to designing such protocols. The fourth category is protocols with trusted hardware. Previous protocols proposed in this category have abstracted away from a practical choice of the underlying trusted hardware. We propose new protocols based on a Trusted Platform Module (TPM). Finally, we examine the inclusion of payment in a FaCT protocol, and how adding payment motivates the requirement for fair exchange when buying and selling digital content.
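    The fairness requirement above, that neither party alone can fabricate or repudiate evidence, can be illustrated with a toy Python sketch: the client commits to a secret, the mark embedded in the content is derived from that commitment, and an offline arbiter checks both during a dispute. All names, the LSB embedding, and the hash-based commitment are illustrative assumptions; this is not any of the protocols analysed in the thesis.

    ```python
    # Toy dispute-resolution sketch for fair content tracing.
    # Hash commitment + LSB marking are illustrative choices only.
    import hashlib
    import os

    def commit(secret: bytes) -> bytes:
        # Client publishes H(secret) before distribution.
        return hashlib.sha256(secret).digest()

    def mark_bits(commitment: bytes, n: int):
        # Derive n watermark bits deterministically from the commitment.
        stream = hashlib.sha256(b"mark" + commitment).digest()
        return [(stream[i // 8] >> (i % 8)) & 1 for i in range(n)]

    def embed(content, bits):
        # One bit in the least significant bit of each content sample.
        return [(s & ~1) | b for s, b in zip(content, bits)]

    def arbitrate(suspect_copy, secret, claimed_commitment) -> bool:
        # The arbiter verifies the commitment opens correctly AND that
        # the suspect copy carries the mark derived from it, so neither
        # the distributor nor the client can forge the evidence alone.
        if commit(secret) != claimed_commitment:
            return False
        expected = mark_bits(claimed_commitment, len(suspect_copy))
        return all((s & 1) == b for s, b in zip(suspect_copy, expected))

    secret = os.urandom(16)
    c = commit(secret)
    content = list(range(100, 132))  # toy "content" samples
    marked = embed(content, mark_bits(c, len(content)))
    ```
    
    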

    On the Chirp Function, the Chirplet Transform and the Optimal Communication of Information

    The purpose of this extended paper is to provide a review of the chirp function and the chirplet transform and to investigate the application of chirplet modulation for digital communications, in particular the transmission of binary strings. The significance of the chirp function in the solution to a range of fundamental problems in physics is revisited to provide a background to the case and to present the context in which the chirp function plays a central role, the material presented being designed to show a variety of problems with solutions and applications that are characterized by a chirp function in a fundamental way. A study is then provided whose aim is to investigate the uniqueness of the chirp function in regard to its use for convolutional coding and decoding, the latter case (i.e. decoding) being related to the autocorrelation of the chirp function, which provides a unique solution to the deconvolution problem. Complementary material in regard to the uniqueness of a chirp is addressed through an investigation into the self-characterization of the chirp function upon Fourier transformation. This includes a short study on the eigenfunctions of the Fourier transform, leading to a uniqueness conjecture which is based on an application of the Bluestein decomposition of a Fourier transform. The conjecture states that the chirp function is the only phase-only function to have a self-characteristic Fourier transform and, for a specific scaling constant, a conjugate eigenfunction. In the context of this conjecture, we consider the transmission of information through a channel characterized by additive noise and the detection of signals with very low signal-to-noise ratios.
    It is shown that application of chirplet modulation can provide a simple and optimal solution to the problem of transmitting binary strings through noisy communication channels, a result which suggests that all digital communication systems should ideally be predicated on the application of chirplet modulation. In the latter part of the paper, a method is proposed for securing the communication of information (in the form of a binary string) through chirplet modulation that is based on prime-number factorization of the chirplet (angular) bandwidth. Coupled with a quantum computer for factorizing very large numbers using Shor's algorithm, the method has the potential for designing a communications protocol specifically for users with access to quantum computing when the factorization of very large numbers is required. In this respect, and in the final part of the paper, we investigate the application of chirplet modulation for communicating through the 'Water-Hole'. This includes the introduction of a method for distinguishing genuine 'intelligible' binary strings through the Kullback-Leibler divergence, which is shown to be statistically significant for a number of natural languages.
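    The core idea of transmitting binary strings by chirp modulation and recovering them by correlation can be sketched in stdlib Python: a '1' is sent as an up-chirp, a '0' as a down-chirp, and the receiver correlates each symbol against both references. The symbol length, sweep range and noise level are illustrative assumptions, not the paper's design.

    ```python
    # Toy binary chirp modulation with matched-filter detection.
    # All parameters are illustrative; this is not the paper's scheme.
    import math
    import random

    N = 256  # samples per binary symbol

    def chirp(up: bool):
        # Linear chirp: instantaneous frequency sweeps 0 -> Nyquist for
        # a '1' (up-chirp) and Nyquist -> 0 for a '0' (down-chirp).
        if up:
            return [math.cos(math.pi * n * n / (2 * N)) for n in range(N)]
        return [math.cos(math.pi * n - math.pi * n * n / (2 * N))
                for n in range(N)]

    UP, DOWN = chirp(True), chirp(False)

    def modulate(bits):
        out = []
        for b in bits:
            out.extend(UP if b else DOWN)
        return out

    def demodulate(signal):
        # Matched filter: correlate each received symbol with both
        # reference chirps and pick the larger response.
        bits = []
        for i in range(0, len(signal), N):
            seg = signal[i:i + N]
            up = sum(s * r for s, r in zip(seg, UP))
            dn = sum(s * r for s, r in zip(seg, DOWN))
            bits.append(1 if up > dn else 0)
        return bits

    random.seed(7)
    bits = [random.getrandbits(1) for _ in range(64)]
    # Additive Gaussian noise at roughly -3 dB per-sample SNR.
    rx = [s + random.gauss(0.0, 1.0) for s in modulate(bits)]
    ```

    The correlation gain grows with the symbol length N, which is why chirp signalling remains detectable at per-sample SNRs well below 0 dB.
    
    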

    Discrete Wavelet Transforms

    Discrete wavelet transform (DWT) algorithms have a firm position in signal processing across several areas of research and industry. As the DWT provides both octave-scale frequency and spatial timing of the analyzed signal, it is constantly used to solve and treat ever more advanced problems. The present book, Discrete Wavelet Transforms: Algorithms and Applications, reviews recent progress in discrete wavelet transform algorithms and applications. The book covers a wide range of methods (e.g. lifting, shift invariance, multi-scale analysis) for constructing DWTs. The book chapters are organized into four major parts. Part I describes progress in hardware implementations of DWT algorithms. Applications include multitone modulation for ADSL and equalization techniques, a scalable architecture for FPGA implementation, a lifting-based algorithm for VLSI implementation, a comparison between DWT- and FFT-based OFDM, and a modified SPIHT codec. Part II addresses image processing algorithms such as a multiresolution approach for edge detection, low-bit-rate image compression, low-complexity implementation of CQF wavelets, and compression of multi-component images. Part III focuses on watermarking DWT algorithms. Finally, Part IV describes shift-invariant DWTs, the DC lossless property, DWT-based analysis and estimation of colored noise, and an application of the wavelet Galerkin method. The chapters consist of both tutorial and highly advanced material. Therefore, the book is intended as a reference text for graduate students and researchers seeking state-of-the-art knowledge on specific applications.
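    As a concrete example of the octave-scale decomposition the book builds on, a one-level Haar DWT (the simplest orthogonal wavelet) fits in a few lines of Python; this is generic textbook material, not code taken from the book.

    ```python
    # One-level orthonormal Haar DWT and its exact inverse.
    import math

    S = math.sqrt(2.0)

    def haar_dwt(x):
        """Split an even-length signal into approximation (low-pass)
        and detail (high-pass) coefficients."""
        half = len(x) // 2
        approx = [(x[2 * i] + x[2 * i + 1]) / S for i in range(half)]
        detail = [(x[2 * i] - x[2 * i + 1]) / S for i in range(half)]
        return approx, detail

    def haar_idwt(approx, detail):
        """Perfect reconstruction from the two coefficient bands."""
        x = []
        for a, d in zip(approx, detail):
            x.append((a + d) / S)
            x.append((a - d) / S)
        return x

    signal = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
    a, d = haar_dwt(signal)
    ```

    Applying haar_dwt recursively to the approximation band yields the multi-scale (octave-band) decomposition the abstract refers to.
    
    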

    Cybersecurity: Past, Present and Future

    The digital transformation has created a new digital space known as cyberspace. This new cyberspace has improved the workings of businesses, organizations, governments, society as a whole, and the day-to-day life of individuals. With these improvements come new challenges, and one of the main challenges is security. The security of the new cyberspace is called cybersecurity. Cyberspace has created new technologies and environments such as cloud computing, smart devices, the IoT, and several others. To keep pace with these advancements in cyber technologies there is a need to expand research and develop new cybersecurity methods and tools to secure these domains and environments. This book is an effort to introduce the reader to the field of cybersecurity, highlight current issues and challenges, and provide future directions to mitigate or resolve them. The main specializations of cybersecurity covered in this book are software security, hardware security, the evolution of malware, biometrics, cyber intelligence, and cyber forensics. We must learn from the past, evolve our present and improve the future. Based on this objective, the book covers the past, present, and future of these main specializations of cybersecurity. The book also examines upcoming areas of research in cyber intelligence, such as hybrid augmented and explainable artificial intelligence (AI). Human-AI collaboration can significantly increase the performance of a cybersecurity system. Interpreting and explaining machine learning models, i.e. explainable AI, is an emerging field of study with much potential to improve the role of AI in cybersecurity.
    Comment: Author's copy of the book published under ISBN: 978-620-4-74421-

    Deterministic, Efficient Variation of Circuit Components to Improve Resistance to Reverse Engineering

    This research proposes two alternative methods for generating semantically equivalent circuit variants which leave the circuit's internal structure pseudo-randomly determined. Component fusion deterministically selects subcircuits using a component identification algorithm and replaces them using a deterministic algorithm that generates canonical logic forms. Component encryption seeks to alter the semantics of individual circuit components using an encoding function, but preserves the overall circuit semantics by decoding signal values later in the circuit. Experiments were conducted to examine the performance of component fusion and component encryption against representative trials of subcircuit selection-and-replacement and Boundary Blurring, two previously defined methods for circuit obfuscation. Overall, the results support the conclusion that both component fusion and component encryption generate more secure variants than previous methods, and that these variants are more efficient in terms of required circuit delay and the power and area required for their implementation.
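    The encode-then-decode idea behind component encryption can be illustrated with a minimal Python sketch of a two-gate circuit: an internal wire is XOR-masked with a key bit and unmasked just before it is consumed, so the variant computes the same function while its internal signal values differ. The gate choice and netlist style are illustrative assumptions, not the dissertation's algorithms.

    ```python
    # Toy "component encryption": mask an internal wire, decode it later,
    # preserving overall circuit semantics. Illustrative example only.
    import itertools

    def original(a, b, c):
        # Reference circuit: out = (a AND b) OR c
        t = a & b
        return t | c

    def obfuscated(a, b, c, k=1):
        # Encode: the internal wire t is XOR-masked with key bit k ...
        t_enc = (a & b) ^ k
        # ... and decoded immediately before the consuming OR gate.
        t = t_enc ^ k
        return t | c

    # Exhaustive check over all inputs shows semantic equivalence even
    # though the masked wire t_enc carries inverted values when k = 1.
    for a, b, c in itertools.product((0, 1), repeat=3):
        assert original(a, b, c) == obfuscated(a, b, c)
    ```

    In a real netlist the encode and decode stages would be absorbed into neighbouring gates by resynthesis, which is what changes the observable internal structure while leaving the input-output behaviour intact.
    
    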