
    A Digital Watermarking Approach Based on DCT Domain Combining QR Code and Chaotic Theory

    This paper proposes a robust watermarking approach based on the Discrete Cosine Transform domain that combines a Quick Response code with a chaotic system. Comment: 7 pages, 6 figures
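    As a rough illustration of the embedding stage that DCT-domain watermarking schemes of this kind share, the sketch below writes watermark bits (for example, QR-code modules) into mid-band DCT coefficients of an 8x8 image block. The coefficient positions, embedding strength and function names are assumptions for illustration only, and the paper's chaotic scrambling of the QR code before embedding is omitted.

```python
import numpy as np
from scipy.fft import dctn, idctn

def embed_dct_watermark(block, bits, strength=8.0):
    """Embed up to four watermark bits (e.g. QR-code modules) into the
    mid-band DCT coefficients of one 8x8 greyscale block."""
    C = dctn(block.astype(float), norm="ortho")
    midband = [(3, 4), (4, 3), (2, 5), (5, 2)]   # assumed mid-frequency slots
    for (u, v), b in zip(midband, bits):
        C[u, v] = strength if b else -strength   # coefficient sign carries the bit
    return idctn(C, norm="ortho")

def extract_dct_watermark(block, n_bits=4):
    """Recover the embedded bits from the signs of the same coefficients."""
    C = dctn(block.astype(float), norm="ortho")
    midband = [(3, 4), (4, 3), (2, 5), (5, 2)]
    return [int(C[u, v] > 0) for (u, v), _ in zip(midband, range(n_bits))]
```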

    Computational Intelligence and Complexity Measures for Chaotic Information Processing

    This dissertation investigates the application of computational intelligence methods to the analysis of nonlinear chaotic systems, in the framework of many known and newly designed complex systems. Parallel comparisons are made between these methods. This provides insight into the difficult challenges facing nonlinear systems characterization and aids in developing a generalized algorithm for computing algorithmic complexity measures, Lyapunov exponents, information dimension and topological entropy. These metrics are implemented to characterize the dynamic patterns of discrete and continuous systems, and they make it possible to distinguish order from disorder in these systems. Steps required for computing Lyapunov exponents with a reorthonormalization method and a group-theory approach are formalized. Procedures for implementing computational algorithms are designed and numerical results for each system are presented. The advance-time sampling technique is designed to overcome the scarcity of phase-space samples and the buffer-overflow problem in estimating algorithmic complexity measures for slow-dynamics feedback-controlled systems. It is proved analytically and tested numerically that for a quasiperiodic system like a Fibonacci map, complexity grows logarithmically with the evolutionary length of the data block. It is concluded that a normalized algorithmic complexity measure can be used as a system classifier: this quantity turns out to be one for random sequences and a non-zero value less than one for chaotic sequences. For periodic and quasi-periodic responses, the normalized complexity approaches zero as the data strings grow, with a faster rate of decrease observed for periodic responses. Algorithmic complexity analysis is also performed on a class of fixed-rate convolutional encoders, and the degree of diffusion in their random-like output patterns is measured. Simulation evidence indicates that the algorithmic complexity associated with a particular class of rate-1/n codes increases with the encoder constraint length, in parallel with the increase in the error-correcting capacity of the decoder. Comparing groups of rate-1/n convolutional encoders, it is observed that as the encoder rate decreases from 1/2 to 1/7, the encoded data sequence exhibits smaller algorithmic complexity together with a larger free-distance value
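    The abstract does not reproduce the dissertation's exact estimator, but the normalization it describes can be illustrated with the classic Lempel-Ziv (1976) phrase count: for a binary string of length n, c(n) x log2(n)/n tends toward one for random data and toward zero for periodic data. A minimal sketch under those assumptions:

```python
import math, random

def normalized_lz_complexity(s: str) -> float:
    """Lempel-Ziv (1976) production complexity: scan left to right,
    extending each phrase while it still occurs in the preceding text,
    then normalize the phrase count c by n / log2(n)."""
    n, i, c = len(s), 0, 0
    while i < n:
        l = 1
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1       # one new phrase found
        i += l
    return c * math.log2(n) / n

random_bits = "".join(random.choice("01") for _ in range(5000))
periodic_bits = "01" * 2500
print(normalized_lz_complexity(random_bits))    # close to 1 (random-like)
print(normalized_lz_complexity(periodic_bits))  # near 0 (ordered)
```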

    Secrecy and Randomness: Encoding Cloud data Locally using a One-Time Pad

    There is no secrecy without randomness, and we address poor cloud security using an analogue chaotic one-time pad encryption system to achieve perfect secrecy. Local encoding returns control to the client and makes stored cloud data unreadable to an adversary. Most cloud service providers encode client data using public encryption algorithms, but ultimately businesses and organisations are responsible for encoding data locally before uploading to the Cloud. As recommended by the Cloud Security Alliance, companies employing authentication and local encryption will reduce or eliminate EU fines for late data-breach discoveries when the EU implements the new General Data Protection Regulation in 2018. Companies failing to detect data breaches within the 72-hour limit will be fined up to four percent of their global annual turnover, and based on the present EU average breach-discovery time of 146 days, total fines of several hundred billion euros could be levied. The proposed localised encryption system is additional to public encryption, and obeying the rules of one-time pad encryption means intercepted encrypted data will be meaningless to an adversary. Furthermore, the encoder has no key distribution problem because its applications are of the “one-to-cloud” type
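    For reference, the one-time pad construction underlying the secrecy claim is just a byte-wise XOR with a truly random key of the same length as the message, used once and kept secret. A minimal sketch, with os.urandom standing in for the paper's chaos-derived pad:

```python
import os

def otp_xor(data: bytes, pad: bytes) -> bytes:
    """XOR data with a one-time pad; the same call decrypts."""
    if len(pad) != len(data):
        raise ValueError("pad must match the message length exactly")
    return bytes(d ^ p for d, p in zip(data, pad))

message = b"client record"
pad = os.urandom(len(message))       # stand-in for the chaos-derived pad
ciphertext = otp_xor(message, pad)   # meaningless without the pad
assert otp_xor(ciphertext, pad) == message
```

    Perfect secrecy holds only while the pad is truly random, never reused and never disclosed, which is why the paper stresses obeying the one-time pad rules.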

    One-to-Cloud One-Time Pad Data Encryption: Introducing Virtual Prototyping with PSpice

    In this paper, we examine the design and application of a one-time pad encryption system for protecting data stored in the Cloud. Personalising security using a one-time pad generator at the client-end protects data from break-ins, side-channel attacks and backdoors in public encryption algorithms. The one-time pad binary sequences were obtained from modified analogue chaos oscillators initiated by noise and encoded client data locally. Specific “one-to-Cloud” storage applications returned control back to the end user but without the key distribution problem normally associated with one-time pad encryption. Development of the prototype was aided by “Virtual Prototyping” in the latest version of Cadence OrCAD PSpice©. This addition allows the prototype simulation schematic to be connected to an actual microcontroller in real time using device model interfacing for bi-directional communication
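    The oscillator circuits themselves live in the PSpice schematics, but the principle of drawing pad bits from a noise-seeded chaotic source can be sketched in software. The logistic map, seeding and threshold below are illustrative assumptions, not the authors' analogue design:

```python
import os

def chaos_pad_bits(n_bits: int, r: float = 4.0) -> list[int]:
    """Toy chaos-derived bit source: seed a logistic map from hardware
    noise, discard the transient, then threshold the state at 0.5."""
    seed = int.from_bytes(os.urandom(4), "big")
    x = (seed % 999_983 + 1) / 999_985          # map the seed into (0, 1)
    for _ in range(100):                        # let the transient die out
        x = r * x * (1.0 - x)
    bits = []
    for _ in range(n_bits):
        x = r * x * (1.0 - x)
        bits.append(1 if x > 0.5 else 0)
    return bits

# A real design must still screen for bias, correlation and
# finite-precision artefacts, as the paper does with statistical tests.
print(chaos_pad_bits(32))
```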

    On rate capacity and signature sequence adaptation in downlink of MC-CDMA system

    This dissertation addresses two topics in the MC-CDMA system: rate capacity and the adaptation of users' signature sequences. Both are studied for the downlink communication scenario with a multi-code scheme. The purpose of studying rate capacity is to understand the potential of applying the MC-CDMA technique to high-speed wireless data communications. It is shown that, to maintain high-speed data transmission with a multi-code scheme, each mobile should cooperatively decode its desired user's encoded data symbols, which are spread with different signature sequences simultaneously. A higher data rate can be achieved by implementing dirty paper coding (DPC) to cooperatively encode all users' data symbols at the base station; however, the complexity of realizing DPC is prohibitively high. Moreover, it is found that the resource allocation policy has a profound impact on the rate capacity that can be maintained in the system. The widely adopted proportional resource allocation policy is only suitable for communication scenarios in which the disparity of users' channel qualities is small; when the difference between users' channel qualities is large, one must resort to non-proportional assignment of power and signature sequences. Both centralized and distributed schemes are proposed to adapt users' signature sequences in the downlink of an MC-CDMA system. With the former, the base station collects complete channel state information and iteratively adapts all users' signature sequences to optimize an overall system performance objective function, e.g. the weighted total mean square error (WTMSE). Since the proposed centralized scheme is designed such that each iteration of signature sequence adaptation decreases the WTMSE, which is lower bounded, the convergence of the proposed centralized scheme is guaranteed. With distributed signature sequence adaptation, each user's signature sequences are independently adapted to optimize that user's individual performance objective function with no regard to the performance of other users in the system. Two distributed adaptation schemes are developed. In one scheme, each user adapts its signature sequences under a pre-assigned power constraint which remains unchanged during the process of adaptation. In the other scheme, a pricing methodology is applied so that the transmission power at the base station is properly distributed among users as their signature sequences are adapted. The stability of these distributed adaptation schemes is analyzed using a game-theoretic framework. It is proven that there always exists a set of signature sequences at which no user can unilaterally adapt its signature sequences to further improve its individual performance, given the signature sequences chosen by the other users in the system
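    The centralized scheme's convergence argument (each adaptation step lowers the WTMSE, which is bounded below) is characteristic of MMSE interference-avoidance updates. The sketch below illustrates that family of algorithms, not the dissertation's exact update; the dimensions, noise level and sweep count are arbitrary choices:

```python
import numpy as np

def mmse_interference_avoidance(S, noise_var=0.1, sweeps=50):
    """Centralized signature adaptation: replace each user's signature by
    its normalized MMSE receive filter.  Each replacement can only lower
    the total MSE, which is bounded below, so the sweeps converge."""
    N, K = S.shape
    for _ in range(sweeps):
        for k in range(K):
            R = S @ S.T + noise_var * np.eye(N)  # signal-plus-noise covariance
            c = np.linalg.solve(R, S[:, k])      # MMSE filter for user k
            S[:, k] = c / np.linalg.norm(c)      # adopt it as the new signature
    return S

rng = np.random.default_rng(0)
S = rng.standard_normal((8, 12))                 # N = 8 chips, K = 12 users
S /= np.linalg.norm(S, axis=0)
S = mmse_interference_avoidance(S)
# After convergence the cross-correlations settle toward an optimal
# (generalized Welch-bound-equality) signature set.
print(np.round(S.T @ S, 3))
```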

    On the Application of PSpice for Localised Cloud Security

    The work reported in this thesis commenced with a review of methods for creating random binary sequences for encoding data locally by the client before storing in the Cloud. The first method reviewed investigated evolutionary computing software which generated noise-producing functions from natural noise, a highly speculative novel idea since noise is stochastic. Nevertheless, a function was created which generated noise to seed chaos oscillators that produced random binary sequences, and this research led to a circuit-based one-time pad key chaos encoder for encrypting data. Circuit-based delay chaos oscillators, initialised with sampled electronic noise, were simulated in the circuit simulator PSpice. Many simulation problems were encountered because of the nonlinear nature of chaos, but these were solved by creating new simulation parts, tools and simulation paradigms. Simulation data from a range of chaos sources was exported and analysed using Lyapunov analysis, which identified two sources that produced one-time pad sequences with maximum entropy. This led to an encoding system which generated unlimited, unique, random one-time pad encryption keys of effectively infinite period, matched to the plaintext data length. The keys were studied for maximum entropy and passed a suite of stringent internationally accepted statistical tests for randomness. A prototype containing two delay chaos sources initialised by electronic noise was produced on a double-sided printed circuit board and produced more than 200 Mbits of one-time pads (OTPs). According to Vladimir Kotelnikov in 1941 and Claude Shannon in 1945, one-time pad sequences are theoretically perfect and unbreakable, provided specific rules are adhered to. Two other techniques for generating random binary sequences were researched: a Chua chaos oscillator incorporating memristance, a new circuit element, and a fractional-order Lorenz chaos system with order less than three. Quantum computing will present many problems for cryptographic system security when existing systems are upgraded in the near future, and the only existing encoding system that will resist such cryptanalysis is the unconditionally-secure one-time pad
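    The Lyapunov analysis used above to select maximum-entropy sources amounts to estimating the average exponential divergence rate of nearby orbits; for a one-dimensional map this is the orbit average of ln|f'(x)|. A minimal discrete-map illustration (the thesis analysed circuit-level delay chaos oscillators, not this map):

```python
import math

def logistic_lyapunov(r: float = 4.0, x0: float = 0.3,
                      n: int = 100_000, burn: int = 1_000) -> float:
    """Estimate the Lyapunov exponent of x -> r*x*(1-x) by averaging
    ln|f'(x)| = ln|r*(1 - 2x)| along the orbit."""
    x = x0
    for _ in range(burn):              # discard the transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return total / n

print(logistic_lyapunov())  # about ln 2 = 0.693 at r = 4: positive, so chaotic
```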

    Application of Stochastic Diffusion for Hiding High Fidelity Encrypted Images

    Cryptography coupled with information hiding has received increased attention in recent years and has become a major research theme because of the importance of protecting encrypted information in any Electronic Data Interchange system in a way that is both discreet and covert. One of the essential limitations of any cryptography system is that the encrypted data provides an indication of its importance, which arouses suspicion and makes it vulnerable to attack. Information hiding, or Steganography, provides a potential solution to this issue by making the data imperceptible, the security of the hidden information being threatened only if its existence is detected through Steganalysis. This paper focuses on a study of methods for hiding encrypted information, specifically, methods that encrypt data before embedding it in host data, where the ‘data’ is in the form of a full colour digital image. Such methods provide a greater level of data security, especially when the information is to be transmitted over the Internet, since a potential attacker needs to first detect, then extract and then decrypt the embedded data in order to recover the original information. After providing an extensive survey of the current methods available, we present a new method of encrypting and then hiding full colour images in three full colour host images without loss of fidelity following data extraction and decryption. The applications of this technique, which is based on an approach called ‘Stochastic Diffusion’, are wide-ranging and include covert image information interchange, digital image authentication, video authentication, copyright protection and digital rights management of image data in general
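    To make the encrypt-then-embed pipeline concrete, the sketch below substitutes a simple keyed XOR stream for the paper's Stochastic Diffusion cipher (which diffuses the plaintext with a noise field) and hides the cipher bits in host-image least-significant bits. A greyscale secret, a single host at least eight times its size, and all names here are assumptions for illustration:

```python
import numpy as np

def encrypt_then_embed(secret, host, seed=1234):
    """Encrypt a uint8 greyscale image with a keyed XOR stream (a crude
    stand-in for Stochastic Diffusion), then hide the cipher bits in the
    least-significant bits of a larger uint8 host image."""
    rng = np.random.default_rng(seed)
    key = rng.integers(0, 256, secret.shape, dtype=np.uint8)
    cipher_bits = np.unpackbits(secret ^ key)        # 8 bits per secret byte
    flat = host.reshape(-1).copy()
    assert flat.size >= cipher_bits.size, "host too small to carry the secret"
    flat[:cipher_bits.size] = (flat[:cipher_bits.size] & 0xFE) | cipher_bits
    return flat.reshape(host.shape)

def extract_and_decrypt(stego, secret_shape, seed=1234):
    """Read the LSBs back, repack them into bytes, and undo the XOR."""
    rng = np.random.default_rng(seed)
    key = rng.integers(0, 256, secret_shape, dtype=np.uint8)
    bits = stego.reshape(-1)[:key.size * 8] & 1
    return np.packbits(bits).reshape(secret_shape) ^ key
```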

    Signal processing techniques for mobile multimedia systems

    Recent trends in wireless communication systems show a significant demand for the delivery of multimedia services and applications over mobile networks - mobile multimedia - like video telephony, multimedia messaging, mobile gaming, and interactive and streaming video. However, despite the ongoing development of key communication technologies that support these applications, the communication resources and bandwidth available to wireless/mobile radio systems are often severely limited. These bottlenecks are inherently due to the processing capabilities of mobile transmission systems and the time-varying nature of wireless channel conditions and propagation environments. Therefore, new ways of processing and transmitting multimedia data over mobile radio channels have become essential, which is the principal focus of this thesis. In this work, the performance and suitability of various signal processing techniques and transmission strategies for the delivery of multimedia data over wireless/mobile radio links are investigated. The proposed transmission systems for multimedia communication employ different data encoding schemes which include source coding in the wavelet domain, transmit diversity coding (space-time coding), and adaptive antenna beamforming (eigenbeamforming). By integrating these techniques into a robust communication system, the quality (SNR, etc.) of multimedia signals received on mobile devices is maximised while mitigating the fast-fading and multi-path effects of mobile channels. To support the transmission of high data-rate multimedia applications, the well-known multi-carrier transmission technology Orthogonal Frequency Division Multiplexing (OFDM) has been implemented. As shown in this study, this results in significant performance gains when combined with other signal processing techniques such as space-time block coding (STBC). To optimise signal transmission, a novel unequal adaptive modulation scheme for the communication of multimedia data over MIMO-OFDM systems has been proposed. In this system, discrete wavelet transform/subband coding is used to decompose data into their respective low-frequency and high-frequency components. Unlike traditional methods, however, the low-frequency components are processed and modulated separately as they are more sensitive to the distortion effects of mobile radio channels. To exploit favourable subchannel states, such that the quality (SNR) of the multimedia data recovered at the receiver is optimised, we employ a lookup matrix-adaptive bit and power allocation (LM-ABPA) algorithm. Apart from improving the spectral efficiency of OFDM, the modified LM-ABPA scheme sorts and allocates the subcarriers with the highest SNR to low-frequency data and the remaining subcarriers to the least important data. To maintain a target system SNR, the LM-ABPA loading scheme assigns appropriate signal constellation sizes and transmit power levels (modulation type) across all subcarriers and is adapted to the varying channel conditions such that the average system error rate (SER/BER) is minimised. When configured for a constant data-rate load, simulation results show significant performance gains over non-adaptive systems.
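    The STBC ingredient can be made concrete with the classic two-antenna Alamouti code, shown below over a single flat-fading subcarrier; this is the textbook scheme only, with the OFDM, wavelet and adaptive-loading stages of the thesis omitted:

```python
import numpy as np

def alamouti_encode(sym):
    """Map symbol pairs (s1, s2) to the 2x2 Alamouti block
    [[s1, s2], [-s2*, s1*]] (rows = time slots, columns = antennas)."""
    s = np.asarray(sym, dtype=complex).reshape(-1, 2)
    out = np.empty((2 * len(s), 2), dtype=complex)
    out[0::2, 0], out[0::2, 1] = s[:, 0], s[:, 1]
    out[1::2, 0], out[1::2, 1] = -np.conj(s[:, 1]), np.conj(s[:, 0])
    return out

def alamouti_combine(r1, r2, h1, h2):
    """Linear combining that restores orthogonality on a flat channel:
    s1_hat = h1* r1 + h2 r2*,  s2_hat = h2* r1 - h1 r2*."""
    g = abs(h1) ** 2 + abs(h2) ** 2
    s1 = (np.conj(h1) * r1 + h2 * np.conj(r2)) / g
    s2 = (np.conj(h2) * r1 - h1 * np.conj(r2)) / g
    return s1, s2

# one QPSK pair through a fixed flat channel (noise omitted for clarity)
h1, h2 = 0.7 + 0.4j, -0.3 + 0.9j
X = alamouti_encode([1 + 1j, -1 + 1j])
r1 = h1 * X[0, 0] + h2 * X[0, 1]        # received in slot 1
r2 = h1 * X[1, 0] + h2 * X[1, 1]        # received in slot 2
print(alamouti_combine(r1, r2, h1, h2)) # recovers (1+1j, -1+1j)
```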
In addition to the above studies, the simulation framework developed in this work is applied to investigate the performance of other signal processing techniques for multimedia communication such as blind channel equalization, and to examine the effectiveness of a secure communication system based on a logistic chaotic generator (LCG) for chaos shift-keying (CSK)
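    The abstract gives no detail of the LCG-based CSK system, so the sketch below shows a generic antipodal chaos shift-keying link with a logistic-map reference and a coherent correlation receiver; the spreading factor, map parameter and initial condition are illustrative assumptions:

```python
import numpy as np

def logistic_ref(x, n, r=3.99):
    """Generate n zero-mean chaotic reference samples from the logistic map."""
    seg = np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x)
        seg[i] = x - 0.5
    return seg, x

def csk_modulate(bits, spread=64, x0=0.37):
    """Antipodal CSK: bit 1 sends a fresh chaotic segment, bit 0 its negation."""
    x, out = x0, []
    for b in bits:
        seg, x = logistic_ref(x, spread)
        out.append(seg if b else -seg)
    return np.concatenate(out)

def csk_demodulate(rx, n_bits, spread=64, x0=0.37):
    """Coherent receiver: regenerate the same reference and correlate."""
    x, bits = x0, []
    for i in range(n_bits):
        ref, x = logistic_ref(x, spread)
        bits.append(int(rx[i * spread:(i + 1) * spread] @ ref > 0))
    return bits

bits = [1, 0, 1, 1, 0]
rx = csk_modulate(bits) + 0.1 * np.random.default_rng(7).standard_normal(5 * 64)
assert csk_demodulate(rx, len(bits)) == bits   # recovered despite channel noise
```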