10 research outputs found

    Computational Intelligence and Complexity Measures for Chaotic Information Processing

    Get PDF
    This dissertation investigates the application of computational intelligence methods to the analysis of nonlinear chaotic systems, in the framework of many known and newly designed complex systems. Parallel comparisons are made between these methods, providing insight into the difficult challenges facing nonlinear systems characterization and aiding the development of a generalized algorithm for computing algorithmic complexity measures, Lyapunov exponents, information dimension, and topological entropy. These metrics are implemented to characterize the dynamic patterns of discrete and continuous systems and make it possible to distinguish order from disorder in them. Steps required for computing Lyapunov exponents with a reorthonormalization method and a group-theory approach are formalized. Procedures for implementing the computational algorithms are designed, and numerical results for each system are presented. An advance-time sampling technique is designed to overcome the scarcity of phase-space samples and the buffer-overflow problem in algorithmic complexity measure estimation for slow-dynamics feedback-controlled systems. It is proved analytically and tested numerically that for a quasiperiodic system such as a Fibonacci map, complexity grows logarithmically with the evolutionary length of the data block. It is concluded that a normalized algorithmic complexity measure can be used as a system classifier: this quantity turns out to be one for random sequences and a non-zero value less than one for chaotic sequences. For periodic and quasi-periodic responses, the normalized complexity approaches zero as the data strings grow, with a faster decreasing rate observed for periodic responses. Algorithmic complexity analysis is performed on a class of fixed-rate convolutional encoders, and the degree of diffusion in the resulting random-like patterns is measured.
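For a one-dimensional map, the Lyapunov-exponent computation reduces to an orbit average of log|f'(x)|, with no reorthonormalization step needed. A minimal sketch for the logistic map; the map choice, parameters, and function name are illustrative, not taken from the dissertation:

```python
import math

def logistic_lyapunov(r=4.0, x0=0.3, n=100_000, burn=1_000):
    """Estimate the largest Lyapunov exponent of x -> r*x*(1-x)
    as the orbit average of log|f'(x)| = log|r*(1 - 2x)|."""
    x = x0
    for _ in range(burn):  # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        x = r * x * (1 - x)
        # Guard against floating-point absorption at the fixed points 0 and 1.
        x = min(max(x, 1e-12), 1 - 1e-12)
        total += math.log(abs(r * (1 - 2 * x)))
    return total / n
```

At r = 4 the exact value is ln 2 ≈ 0.693, a positive exponent signalling chaos, while a periodic regime such as r = 3.2 yields a negative estimate.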
    Simulation evidence indicates that the algorithmic complexity associated with a particular class of rate-1/n codes increases with the encoder constraint length, in parallel with the increase in the error-correcting capacity of the decoder. Comparing groups of rate-1/n convolutional encoders, it is observed that as the encoder rate decreases from 1/2 to 1/7, the encoded data sequence exhibits smaller algorithmic complexity together with a larger free-distance value.

    Entropy in Image Analysis II

    Get PDF
    Image analysis is a fundamental task for any application where extracting information from images is required. The analysis requires highly sophisticated numerical and analytical methods, particularly for those applications in medicine, security, and other fields where the results of the processing consist of data of vital importance. This fact is evident from all the articles composing the Special Issue "Entropy in Image Analysis II", in which the authors used widely tested methods to verify their results. In the process of reading the present volume, the reader will appreciate the richness of their methods and applications, in particular for medical imaging and image security, and a remarkable cross-fertilization among the proposed research areas.
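As a concrete instance of the entropy measures that recur throughout the volume, the Shannon entropy of an image's grey-level histogram is 8 bits/pixel for a uniform 8-bit histogram (the ideal for an encrypted image) and 0 for a constant image. A minimal sketch, with the function name being illustrative:

```python
import math
from collections import Counter

def shannon_entropy(pixels) -> float:
    """Shannon entropy (bits/pixel) of a sequence of 8-bit grey levels."""
    n = len(pixels)
    counts = Counter(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

A uniform histogram over all 256 grey levels attains the maximum of 8 bits/pixel; a single-valued image gives 0.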

    Entropy in Image Analysis III

    Get PDF
    Image analysis can be applied to rich and assorted scenarios; therefore, the aim of this recent research field is not only to mimic the human vision system. Image analysis is one of the main methods that computers use today, and there is a body of knowledge that they will be able to manage in a totally unsupervised manner in the future, thanks to artificial intelligence. The articles published in the book clearly show such a future.

    Lightweight image encryption algorithms: design and evaluation

    Get PDF
    Doctor of Philosophy, Department of Computer Science, Arslan Munir. In an era dominated by the increasing use of multimedia data such as images and videos, ensuring the security and confidentiality of images with real-time encryption is of the greatest importance. Traditional encryption algorithms are secure, widely used, and recommended, yet they are neither suitable nor computationally efficient for encrypting multimedia data, due to the large size and high redundancy inherent in such data. Thus, specialized algorithms for multimedia data encryption are needed. This dissertation explores lightweight image encryption algorithms, specifically designed to address the time and resource constraints of real-time image encryption while maintaining the confidentiality and integrity of the multimedia data. The dissertation classifies image encryption into seven different approaches based on the techniques used and analyzes the strengths and weaknesses of each approach. It subsequently introduces and evaluates three novel algorithms designed to encrypt images with low complexity, high efficiency, and reliable security. These algorithms rely on a combination of permutation, substitution, and pseudorandom keystreams to ensure the security of the encrypted images. The first algorithm is based on chaotic systems and is implemented using a logistic map, permutations, the AES S-box, and a plaintext-related SHA-2 hash. The second algorithm is based on the Trivium cipher and is implemented to work on multiple rounds of encryption using pixel-based row and column permutations and bit-level substitution. For the third algorithm, the Ascon algorithm, selected by the National Institute of Standards and Technology (NIST) to standardize lightweight cryptography applications, is evaluated for image encryption.
    To evaluate the proposed algorithms, a comprehensive set of security, quality, and efficiency evaluation metrics is utilized to assess them and compare them to contemporary image encryption algorithms.
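A minimal sketch of the chaos-based keystream idea behind the first algorithm, assuming a logistic map seeded from a SHA-256 hash of the key. Function names and parameters here are illustrative; the dissertation's actual design additionally involves permutations, the AES S-box, and a plaintext-related hash:

```python
import hashlib

def logistic_keystream(key: bytes, n: int) -> bytes:
    """Derive a keystream of n bytes from a logistic-map orbit
    whose initial condition comes from SHA-256(key)."""
    h = hashlib.sha256(key).digest()
    x = int.from_bytes(h[:8], "big") / 2**64  # initial condition in [0, 1)
    x = min(max(x, 1e-9), 1 - 1e-9)           # keep strictly inside (0, 1)
    r = 3.99                                  # chaotic regime of the logistic map
    out = bytearray()
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR with the chaotic keystream; applying it twice decrypts."""
    ks = logistic_keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))
```

Being a plain XOR stream, the same call encrypts and decrypts; on its own this provides no diffusion, which is why the full algorithm adds permutation and substitution stages on top of the keystream.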

    Dynamical Systems

    Get PDF
    Complex systems are pervasive in many areas of science and are integrated into our daily lives. Examples include financial markets, highway transportation networks, telecommunication networks, world and country economies, social networks, immunological systems, living organisms, computational systems, and electrical and mechanical structures. Complex systems are often composed of a large number of interconnected and interacting entities, exhibiting much richer global-scale dynamics than the properties and behavior of the individual entities. Complex systems are studied in many areas of the natural sciences, social sciences, engineering, and mathematical sciences. This special issue therefore intends to contribute towards the dissemination of the multifaceted concepts in accepted use by the scientific community. We hope readers enjoy this pertinent selection of papers, which represents relevant examples of the state of the art in present-day research. [...]

    5th EUROMECH nonlinear dynamics conference, August 7-12, 2005 Eindhoven : book of abstracts

    Get PDF


    A New Image Encryption Algorithm Based on Composite Chaos and Hyperchaos Combined with DNA Coding

    No full text
    In order to obtain chaos with a wider chaotic scope and better chaotic behavior, this paper combines several existing one-dimensional chaotic maps through a modular operation to form a new one-dimensional chaotic map, named the LLS system (abbreviated LLSS). To achieve a better encryption effect, a new image encryption method based on double chaos and DNA coding technology is proposed: the new one-dimensional chaotic map is combined with a hyperchaotic Qi system to encrypt using DNA coding. The first stage involves three rounds of scrambling; a diffusion algorithm is then applied to the plaintext image, and the intermediate ciphertext image is partitioned. The final encrypted image is formed by using DNA operations. Experimental simulation and security analysis show that this algorithm increases the key space, has high sensitivity, and can resist several common attacks. At the same time, the algorithm reduces the correlation between adjacent pixels, making it close to 0, and increases the information entropy, making it close to the ideal value and achieving a good encryption effect.