
    Encryption and Decryption of Images with Pixel Data Modification Using Hand Gesture Passcodes

    To ensure data security and safeguard sensitive information, image encryption and decryption, as well as pixel data modification, are essential. To avoid misuse and preserve trust in our digital environment, it is crucial to use these technologies responsibly and ethically. To address some of these issues, the authors designed a way to modify pixel data so that it holds hidden information. The objective of this work is to change pixel values in a way that can store information about black-and-white image pixel data. Prior to encryption and decryption, a passcode is constructed in Python from hand gestures drawn in the air, and the image is then encrypted without any data loss. The method concentrates on tracking just two pixel values, which are changed only slightly so that the masked image is not misleading. Because the masking test cases cover RGB values at their border values of 254 and 255, the scheme overcomes susceptibility at these corner values.
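The border-value handling described above can be illustrated with a minimal sketch (this is not the authors' actual masking algorithm, and `embed_bit` is a hypothetical helper): nudging a pixel by one intensity level while clamping at the 8-bit limits keeps values such as 254 and 255 from wrapping around to 0.

```python
# Minimal illustration of border-safe pixel modification, assuming an
# 8-bit grayscale channel. Not the paper's algorithm; embed_bit is a
# hypothetical helper introduced only for this sketch.

def embed_bit(pixel: int, bit: int) -> int:
    """Shift a pixel by +/-1 to encode one bit, clamped to [0, 255]."""
    if bit:
        return min(pixel + 1, 255)
    return max(pixel - 1, 0)

print(embed_bit(254, 1))  # 255 - stays in range instead of wrapping
print(embed_bit(255, 1))  # 255 - border value clamps safely
print(embed_bit(0, 0))    # 0   - lower border clamps the same way
```

Clamping (rather than modular arithmetic) is what prevents a +1 on 255 from producing the visually jarring wrap to 0 that the abstract calls corner-value susceptibility.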

    A Novel Hybrid Secure Image Encryption Based on Julia Set of Fractals and 3D Lorenz Chaotic Map

    Chaos-based encryption schemes have attracted many researchers around the world in the digital image security domain. Digital images can be secured using existing chaotic maps, multiple chaotic maps, and several other hybrid dynamic systems that enhance the non-linearity of digital images. The combined property of confusion and diffusion, introduced by Claude Shannon, can be employed for digital image security. In this paper, we propose a novel system that is computationally less expensive and provides a higher level of security. The system is based on a shuffling process with a fractal key along with a three-dimensional Lorenz chaotic map. The shuffling process adds the confusion property, and the pixels of the standard image are shuffled. The three-dimensional Lorenz chaotic map is used for a diffusion process, which distorts all pixels of the image. In the statistical security tests, the mean square error (MSE) was greater than the average value of 10000 for all standard images. The peak signal-to-noise ratio (PSNR) was 7.69 dB for the test image. Moreover, the calculated correlation coefficient values for each direction of the encrypted images were less than zero, with a number of pixel change rate (NPCR) higher than 99%. During the security tests, the entropy values were more than 7.9 for each grey channel, which is almost equal to the ideal value of 8 for an 8-bit system. Numerous security tests and low computational complexity validate the security, robustness, and real-time applicability of the presented scheme.
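The entropy figure quoted above (more than 7.9 against an ideal of 8) is the Shannon entropy of an 8-bit channel. A minimal sketch of that computation (the function name is illustrative, not taken from the paper):

```python
# Shannon entropy of an 8-bit channel, in bits per symbol.
# A well-encrypted image should approach the ideal value of 8.
import math
from collections import Counter

def entropy_8bit(pixels):
    """Shannon entropy H = -sum(p * log2(p)) over the pixel histogram."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A uniform distribution over all 256 intensity levels reaches 8 bits.
uniform = list(range(256)) * 4
print(round(entropy_8bit(uniform), 2))  # 8.0
```

A constant image, by contrast, has entropy 0, which is why entropy near 8 per grey channel is treated as evidence of good diffusion.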

    Entropy in Image Analysis II

    Image analysis is a fundamental task for any application where extracting information from images is required. The analysis requires highly sophisticated numerical and analytical methods, particularly for those applications in medicine, security, and other fields where the results of the processing consist of data of vital importance. This fact is evident from all the articles composing the Special Issue "Entropy in Image Analysis II", in which the authors used widely tested methods to verify their results. In reading the present volume, the reader will appreciate the richness of the methods and applications, in particular for medical imaging and image security, and a remarkable cross-fertilization among the proposed research areas.

    Recent Advances in Signal Processing

    Signal processing is a critical task in the majority of new technological inventions and challenges across a variety of applications in both science and engineering. Classical signal processing techniques have largely worked with mathematical models that are linear, local, stationary, and Gaussian, and have always favored closed-form tractability over real-world accuracy. These constraints were imposed by the lack of powerful computing tools. During the last few decades, signal processing theories, developments, and applications have matured rapidly and now include tools from many areas of mathematics, computer science, physics, and engineering. This book is targeted primarily toward students and researchers who want to be exposed to a wide variety of signal processing techniques and algorithms. It includes 27 chapters that can be categorized into five different areas depending on the application at hand. These five categories address image processing, speech processing, communication systems, time-series analysis, and educational packages, respectively. The book has the advantage of providing a collection of applications that are completely independent and self-contained; thus, the interested reader can choose any chapter and skip to another without losing continuity.

    Data Hiding and Its Applications

    Data hiding techniques have been widely used to provide copyright protection, data integrity, covert communication, non-repudiation, and authentication, among other applications. In the context of the increased dissemination and distribution of multimedia content over the internet, data hiding methods, such as digital watermarking and steganography, are becoming increasingly relevant in providing multimedia security. The goal of this book is to focus on the improvement of data hiding algorithms and their different applications (both traditional and emerging), bringing together researchers and practitioners from different research fields, including data hiding, signal processing, cryptography, and information theory, among others.

    High imperceptibility and robustness watermarking scheme for brain MRI using Slantlet transform coupled with enhanced knight tour algorithm

    This research introduces a novel and robust watermarking scheme for medical brain MRI DICOM images, addressing the challenge of maintaining high imperceptibility and robustness simultaneously. The scheme ensures privacy control, content authentication, and protection against the detachment of vital Electronic Patient Record information. To enhance imperceptibility, a Dynamic Visibility Threshold parameter leveraging the Human Visual System is introduced. Embeddable Zones and Non-Embeddable Zones are defined to enhance robustness, and an enhanced Knight Tour algorithm based on the Slantlet Transform shuffles the embedding sequence for added security. The scheme achieves remarkable results, with a Peak Signal-to-Noise Ratio (PSNR) surpassing contemporary techniques. Extensive experimentation demonstrates resilience to various attacks, with low Bit Error Rate (BER) and high Normalized Cross-Correlation (NCC) values. The proposed technique outperforms existing methods, emphasizing its superior performance and effectiveness in medical image watermarking.
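The NCC metric cited above measures how closely an extracted watermark matches the original, with values near 1.0 indicating good recovery. A sketch using one common mean-subtracted definition (an assumption, since the paper's exact formula is not given here):

```python
# Normalized cross-correlation between two equal-length signals,
# using the common mean-subtracted (Pearson-style) definition.
# This is a generic textbook formula, not the paper's implementation.
import math

def ncc(a, b):
    """NCC near 1.0 means the extracted watermark matches the original."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den

watermark = [0, 1, 1, 0, 1, 0, 0, 1]
print(ncc(watermark, watermark))  # 1.0 for a perfectly recovered watermark
```

An attack that flips watermark bits pulls the NCC toward 0, which is why high NCC after attacks is reported as evidence of robustness.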

    Applied Metaheuristic Computing

    For decades, Applied Metaheuristic Computing (AMC) has been a prevailing optimization technique for tackling perplexing engineering and business problems, such as scheduling, routing, ordering, bin packing, assignment, and facility layout planning, among others. This is partly because classic exact methods are constrained by prior assumptions, and partly because heuristics are problem-dependent and lack generalization. AMC, on the contrary, guides the course of low-level heuristics to search beyond the local optimality that impairs the capability of traditional computation methods. This topic series has collected quality papers proposing cutting-edge methodology and innovative applications that drive the advances of AMC.

    Development Of A High Performance Mosaicing And Super-Resolution Algorithm

    In this dissertation, a high-performance mosaicing and super-resolution algorithm is described. The scale-invariant feature transform (SIFT)-based mosaicing algorithm builds an initial mosaic, which is iteratively updated by the robust super-resolution algorithm to achieve the final high-resolution mosaic. Two different types of datasets are used for testing: high-altitude balloon data and unmanned aerial vehicle data. To evaluate the algorithm, five performance metrics are employed: mean square error, peak signal-to-noise ratio, singular value decomposition, slope of the reciprocal singular value curve, and cumulative probability of blur detection. Extensive testing shows that the proposed algorithm is effective in improving the captured aerial data and that the performance metrics provide an accurate quantitative evaluation of the algorithm.
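Two of the metrics listed above, mean square error and peak signal-to-noise ratio, follow standard textbook definitions. A minimal sketch (a generic formulation under the usual 8-bit peak of 255, not the dissertation's implementation):

```python
# Generic MSE and PSNR for equal-length pixel sequences, assuming an
# 8-bit peak of 255. Textbook definitions, not the dissertation's code.
import math

def mse(orig, recon):
    """Mean square error between two pixel sequences."""
    return sum((a - b) ** 2 for a, b in zip(orig, recon)) / len(orig)

def psnr(orig, recon, peak=255.0):
    """PSNR in dB; higher means the reconstruction is closer to the original."""
    e = mse(orig, recon)
    if e == 0:
        return float("inf")  # identical images have infinite PSNR
    return 10 * math.log10(peak ** 2 / e)

a = [0, 255, 255, 0]
b = [1, 254, 255, 0]
print(psnr(a, b))  # high PSNR: the two sequences differ by at most 1 level
```

In super-resolution evaluation, a rising PSNR across iterations indicates the refined mosaic is converging toward the reference imagery.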

    Development of an Advanced Radioxenon Detector for Nuclear Explosion Monitoring

    The Comprehensive Nuclear-Test-Ban Treaty was opened for signature in 1996 and seeks to ban nuclear weapons testing worldwide. The International Monitoring System (IMS) was established to verify treaty compliance, and consists of four technologies: seismic, infrasound, hydroacoustic, and radionuclide. The radionuclide component of the IMS conducts atmospheric monitoring to identify radioactive particles and gases associated with nuclear testing, such as radioxenon. As a noble gas, the radioxenon produced in an underground nuclear explosion can be released into the atmosphere, for subsequent detection by the IMS. Radioxenon is also produced by fission-based civilian processes, such as nuclear reactors and medical isotope production facilities, requiring discrimination between these sources. The focus of this work is to improve the resolution and sensitivity of radioxenon monitoring systems. Radioxenon is measured using beta-gamma coincidence techniques, typically with scintillating plastic and NaI(Tl) detectors; however, the poor energy resolution of the plastic results in isotopic interference, complicating the analysis. Additionally, radon emits decay energies that interfere with those from radioxenon, requiring complex gas-processing systems to filter it from the sample. Furthermore, radioxenon diffuses into the plastic detectors, which increases the background of subsequent measurements; this phenomenon is known as the memory effect. To mitigate these issues, this thesis demonstrated 1) an anticoincidence analysis method to better identify metastable isotopes, 2) a validated MCNPX-PoliMi simulation tool to analyze new detector systems and produce training spectra for analysis testing, and 3) a prototype radioxenon detector system based on stilbene. Stilbene cell prototypes have been developed, tested, and compared with a traditional plastic scintillator cell. 
The results show that the stilbene cell has a similar response to the plastic cell with improved energy resolution: the full-width at half-maximum decreased by 2.2 keV at 129 keV. The stilbene cell is capable of pulse shape discrimination, allowing for radon mitigation through alpha identification. The analysis presented reduced the minimum detectable concentration of Xe-135 by 1% and could be used for environmental monitoring. The stilbene cell was shown to have 0.043% residual activity compared to 4.5% for the plastic cell, demonstrating a significantly reduced memory effect. The results presented in this thesis allow for better identification of metastable isotopes, improved simulation techniques, and improved detection sensitivity, which could lead to improved source discrimination, strengthening the Comprehensive Nuclear-Test-Ban Treaty verification regime.
    PhD thesis, Nuclear Engineering & Radiological Sciences, University of Michigan, Horace H. Rackham School of Graduate Studies.
    https://deepblue.lib.umich.edu/bitstream/2027.42/147543/1/csivels_1.pd