    Taming Reversible Halftoning via Predictive Luminance

    Traditional halftoning usually discards color information when dithering an image with binary dots, which makes it difficult to recover the original colors. We propose a novel halftoning technique that converts a color image into a binary halftone that is fully restorable to its original version. Our base halftoning technique consists of two convolutional neural networks (CNNs) that produce the reversible halftone patterns, and a noise incentive block (NIB) that mitigates the flatness degradation issue of CNNs. Furthermore, to resolve the conflict between blue-noise quality and restoration accuracy in the base method, we propose a predictor-embedded approach that offloads predictable information from the network, which in our case is the luminance information resembling the halftone pattern. This gives the network more flexibility to produce halftones with better blue-noise quality without compromising restoration quality. Detailed studies of the multiple-stage training method and the loss weightings are provided. We compare the predictor-embedded method and the base method with respect to spectrum analysis of the halftones, halftone accuracy, restoration accuracy, and data embedding. Our entropy evaluation shows that our halftones contain less encoding information than those of the base method. The experiments show that the predictor-embedded method gains more flexibility to improve the blue-noise quality of halftones while maintaining comparable restoration quality with a higher tolerance for disturbances. Comment: to be published in IEEE Transactions on Visualization and Computer Graphics
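
    The sketch below illustrates, in PyTorch, the predictor-embedded idea described in the abstract: one CNN produces the halftone from the color image plus a noise input, a predictor CNN estimates the (predictable) luminance from the halftone, and a decoder restores the color image from the halftone and the predicted luminance. All module names, layer sizes, and the plain-noise stand-in for the noise incentive block are illustrative assumptions, not the authors' implementation.

# Minimal PyTorch sketch of predictor-embedded reversible halftoning.
# The encoder turns a colour image (plus noise) into a binary-like halftone,
# the predictor offloads the luminance that is already implied by the
# halftone, and the decoder restores colour from halftone + luminance.
# Layer sizes and the plain-noise stand-in for the noise incentive block
# are illustrative assumptions, not the paper's architecture.
import torch
import torch.nn as nn


def small_cnn(in_ch, out_ch):
    # Stand-in for the paper's CNNs; a real model would be much deeper.
    return nn.Sequential(
        nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, out_ch, 3, padding=1),
    )


class ReversibleHalftoner(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = small_cnn(3 + 1, 1)    # colour + noise -> halftone
        self.predictor = small_cnn(1, 1)      # halftone -> predicted luminance
        self.decoder = small_cnn(1 + 1, 3)    # halftone + luminance -> colour

    def forward(self, rgb):
        noise = torch.rand_like(rgb[:, :1])   # noise incentive input (simplified)
        halftone = torch.sigmoid(self.encoder(torch.cat([rgb, noise], dim=1)))
        luminance = self.predictor(halftone)  # offloaded, predictable information
        restored = self.decoder(torch.cat([halftone, luminance], dim=1))
        return halftone, luminance, restored


if __name__ == "__main__":
    model = ReversibleHalftoner()
    halftone, lum, restored = model(torch.rand(1, 3, 64, 64))
    print(halftone.shape, lum.shape, restored.shape)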

    Near-Lossless Bitonal Image Compression System

    The main purpose of this thesis is to develop an efficient near-lossless bitonal compression algorithm and to implement that algorithm on a hardware platform. The current methods for compression of bitonal images include the JBIG and JBIG2 algorithms; however, both JBIG and JBIG2 have disadvantages. Both algorithms are covered by patents filed by IBM, making them costly to implement commercially. Also, JBIG provides only lossless compression, while JBIG2 provides lossy methods only for document-type images. For these reasons, a new method for introducing loss, and for controlling this loss to sustain quality, is developed. The lossless bitonal image compression algorithm used for this thesis is called Block Arithmetic Coder for Image Compression (BACIC), which can efficiently compress bitonal images. In this thesis, loss is introduced for cases where better compression efficiency is needed. However, introducing loss in bitonal images is especially difficult, because pixels undergo such a drastic change, flipping either from white to black or from black to white. Such pixel flipping introduces salt-and-pepper noise, which can be very distracting when viewing an image. Two methods are used in combination to control the visual distortion introduced into the image. The first is to keep track of the error created by flipping pixels and to use this error to decide whether flipping another pixel will cause the visual distortion to exceed a predefined threshold. The second method is region-of-interest consideration: little or no loss is introduced into the important parts of an image, and higher loss is introduced into the less important parts. This allows for a good-quality image while increasing the compression efficiency. Also, the ability of BACIC to compress grayscale images is studied, and BACICm, a multiplanar BACIC algorithm, is created. A hardware implementation of the BACIC lossless bitonal image compression algorithm is also designed. The hardware implementation is written in VHDL targeting a Xilinx FPGA, which is very useful because of its flexibility. The programmed FPGA could be included in a product of the facsimile or printing industry to handle compression or decompression internal to the unit, giving it an advantage in the marketplace.
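
    As a rough illustration of the two loss-control ideas described above (bounding the distortion introduced by flipped pixels, and keeping that bound tighter inside regions of interest), here is a minimal NumPy sketch. The per-block budgets and the majority-value flipping rule are illustrative placeholders, not BACIC's actual criteria.

# Sketch of controlled lossy pixel flipping in a bitonal image: within each
# block, pixels may be flipped toward the block's majority value (which tends
# to help a context-based coder), but only while the number of flips stays
# under a distortion budget, and the budget is tighter inside the region of
# interest. Budgets and the flipping rule are illustrative, not BACIC's.
import numpy as np


def near_lossless_flip(image, roi_mask, budget_roi=1, budget_bg=4, block_size=8):
    """image: 2-D array of 0/1 pixels; roi_mask: same shape, True = important."""
    out = image.copy()
    height, width = image.shape
    for by in range(0, height, block_size):
        for bx in range(0, width, block_size):
            block = out[by:by + block_size, bx:bx + block_size]   # view into `out`
            roi = roi_mask[by:by + block_size, bx:bx + block_size]
            budget = budget_roi if roi.any() else budget_bg
            majority = 1 if block.mean() > 0.5 else 0
            flipped = 0
            for y in range(block.shape[0]):
                for x in range(block.shape[1]):
                    if flipped >= budget:
                        break
                    if block[y, x] != majority:
                        block[y, x] = majority                    # flip one pixel
                        flipped += 1
    return out


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = (rng.random((32, 32)) > 0.9).astype(np.uint8)   # sparse "salt" on white paper
    roi = np.zeros_like(img, dtype=bool)
    roi[8:24, 8:24] = True                                 # pretend this area is important
    smoothed = near_lossless_flip(img, roi)
    print("pixels changed:", int((smoothed != img).sum()))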

    High Capacity Analog Channels for Smart Documents

    Widely used, valuable hardcopy documents such as passports, visas, driving licenses, educational certificates, and entrance passes for entertainment events are conventionally protected against counterfeiting and data-tampering attacks by analog security technologies (e.g. KINEGRAMS®, holograms, micro-printing, UV/IR inks). However, easy access to high-quality, low-price modern desktop publishing technology has rendered most of these technologies ineffective, giving rise to high-quality forged documents. The higher price and restricted usage are further drawbacks of analog document protection techniques. Digital watermarking and high-capacity storage media such as IC chips and optical data stripes are the modern technologies used in new machine-readable identity verification documents to ensure content integrity; however, these technologies are either expensive or do not satisfy the application needs, creating a demand for more efficient document protection technologies. In this research, three different high-capacity analog channels are investigated for hidden communication, along with their applications in smart documents: a high-density data stripe (HD-DataStripe), data hiding in printed halftone images (watermarking), and a superposed constant background grayscale image (CBGI). To develop these high-capacity analog channels, the noise introduced by the printing and scanning (PS) process is investigated with the objective of recovering the digital information encoded at nearly maximum channel utilization, and countermeasures against this noise are incorporated into the data recovery process accordingly. The HD-DataStripe is a printed binary image similar to conventional 2-D barcodes (e.g. PDF417), but it offers much higher data storage capacity and is intended for machine-readable identity verification documents. The capacity offered by the HD-DataStripe is sufficient to store high-quality biometric characteristics rather than extracted templates, in addition to the conventional bearer-related data contained in a smart ID card. It also eliminates the need for a central database system (except for backup records) and other expensive storage media currently in use. In developing a novel data-reading technique for the HD-DataStripe, and to account for the unavoidable geometrical distortions, the registration-mark pattern is chosen so that it yields accurate sampling points (a necessary condition for reliable data recovery at higher data encoding rates). For the more sophisticated distortions caused by physical dot-gain effects (intersymbol interference), countermeasures such as application of the sampling theorem, adaptive binarization, and post-processing of the data are given, each of them providing only a necessary condition for reliable data recovery. Finally, by combining the filters corresponding to these countermeasures, a novel data-reading technique for the HD-DataStripe is obtained, which performs better than existing techniques intended for data recovery from printed media. In another scenario, a small HD-DataStripe with maximum entropy is used as a copy detection pattern by exploiting the information loss encountered at nearly maximum channel capacity.
    When the HD-DataStripe is applied to hardcopy documents (contracts, official letters, etc.), unlike existing work [Zha04], it allows one-to-one content matching and does not depend on hash functions or OCR technology, constraints mainly imposed by the low data storage capacity of existing analog media. For printed halftone images carrying hidden information, the higher capacity is mainly attributed to the HD-DataStripe data-reading technique, which allows data recovery at higher printing resolution, a key requirement for a high-quality watermarking technique in the spatial domain. Digital halftoning and data encoding techniques are the other factors that contribute to the data hiding technique given in this research. Regarding security, the new technique allows content integrity and authenticity verification in a setting where a certain amount of error is unavoidable, which restricts the use of existing techniques designed for digital content. Finally, a superposed constant background grayscale image, obtained by repeated application of a specially designed small binary pattern, is used as a channel for hidden communication; it allows up to 33 pages of A4-size foreground text to be encoded in one CBGI. The higher capacity is contributed by the data encoding symbols and the data-reading technique.
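
    Of the countermeasures listed above, adaptive binarization is the easiest to show in isolation. Below is a small NumPy sketch that thresholds a scanned grayscale stripe against the local mean of each pixel's neighbourhood, which copes with uneven illumination and dot gain better than a single global threshold. The window size, bias factor, and the crude print-and-scan model in the demo are illustrative assumptions; the actual HD-DataStripe reader combines several such filters with registration-mark resampling and post-processing.

# Adaptive binarization sketch: classify a pixel as ink if it is darker than
# a fraction of the mean of its local window, computed with an integral image.
# Window size and bias are illustrative; the demo's print+scan model is crude.
import numpy as np


def adaptive_binarize(gray, window=15, bias=0.95):
    """gray: 2-D float array in [0, 1]; returns a 0/1 array (1 = ink)."""
    pad = window // 2
    padded = np.pad(gray, pad, mode="edge")
    # Integral image, padded with a zero row/column so window sums are easy.
    integral = np.pad(padded.cumsum(0).cumsum(1), ((1, 0), (1, 0)))
    h, w = gray.shape
    y, x = np.mgrid[0:h, 0:w]
    local_sum = (integral[y + window, x + window] - integral[y, x + window]
                 - integral[y + window, x] + integral[y, x])
    local_mean = local_sum / (window * window)
    return (gray < bias * local_mean).astype(np.uint8)   # darker than neighbourhood -> ink


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    bits = rng.integers(0, 2, (64, 64)).astype(float)                  # the encoded data dots
    scanned = 1.0 - 0.7 * bits + 0.05 * rng.standard_normal((64, 64))  # crude print+scan model
    recovered = adaptive_binarize(np.clip(scanned, 0.0, 1.0))
    print("bit agreement:", float((recovered == bits).mean()))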

    Data Hiding and Its Applications

    Data hiding techniques have been widely used to provide copyright protection, data integrity, covert communication, non-repudiation, and authentication, among other applications. In the context of the increased dissemination and distribution of multimedia content over the internet, data hiding methods, such as digital watermarking and steganography, are becoming increasingly relevant in providing multimedia security. The goal of this book is to focus on the improvement of data hiding algorithms and their different applications (both traditional and emerging), bringing together researchers and practitioners from different research fields, including data hiding, signal processing, cryptography, and information theory, among others.

    Applied Metaheuristic Computing

    For decades, Applied Metaheuristic Computing (AMC) has been a prevailing optimization technique for tackling perplexing engineering and business problems, such as scheduling, routing, ordering, bin packing, assignment, and facility layout planning, among others. This is partly because the classic exact methods are constrained by prior assumptions, and partly because heuristics are problem-dependent and lack generalization. AMC, on the contrary, guides the course of low-level heuristics to search beyond the local optimality that impairs the capability of traditional computation methods. This topic series has collected quality papers proposing cutting-edge methodology and innovative applications that drive the advances of AMC.
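
    As a toy illustration of the point about escaping local optimality, the sketch below runs simulated annealing on a deliberately bumpy one-dimensional function: the metaheuristic occasionally accepts worse moves from a simple low-level perturbation heuristic, which is what lets it get past local minima that pure greedy descent would be trapped in. The objective function, cooling schedule, and step size are arbitrary choices made only for the example.

# Toy simulated-annealing example: a metaheuristic steering a low-level
# "perturb x a little" heuristic past local minima by sometimes accepting
# worse moves. The objective, step size, and cooling schedule are arbitrary.
import math
import random


def bumpy(x):
    # Many local minima; the global minimum is near x ~ -0.5.
    return x * x + 10 * math.sin(3 * x)


def simulated_annealing(start, steps=5000, temp0=5.0):
    x, best = start, start
    for t in range(1, steps + 1):
        temp = temp0 / t                                   # simple cooling schedule
        candidate = x + random.uniform(-0.5, 0.5)          # low-level move
        delta = bumpy(candidate) - bumpy(x)
        # Always accept improvements; accept worse moves with probability exp(-delta/temp).
        if delta < 0 or random.random() < math.exp(-delta / max(temp, 1e-9)):
            x = candidate
        if bumpy(x) < bumpy(best):
            best = x
    return best


if __name__ == "__main__":
    random.seed(0)
    found = simulated_annealing(start=8.0)
    print(f"best x = {found:.3f}, f(x) = {bumpy(found):.3f}")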

    Using MapReduce Streaming for Distributed Life Simulation on the Cloud

    Distributed software simulations are indispensable in the study of large-scale life models but often require the use of technically complex lower-level distributed computing frameworks, such as MPI. We propose to overcome the complexity challenge by applying the emerging MapReduce (MR) model to distributed life simulations and by running such simulations on the cloud. Technically, we design optimized MR streaming algorithms for discrete and continuous versions of Conway’s Life according to a general MR streaming pattern. We chose Life because it is simple enough as a testbed for MR’s applicability to a-life simulations and general enough to make our results applicable to various lattice-based a-life models. We implement and empirically evaluate our algorithms’ performance on Amazon’s Elastic MR cloud. Our experiments demonstrate that a single MR optimization technique called strip partitioning can reduce the execution time of continuous life simulations by 64%. To the best of our knowledge, we are the first to propose and evaluate MR streaming algorithms for lattice-based simulations. Our algorithms can serve as prototypes in the development of novel MR simulation algorithms for large-scale lattice-based a-life models.
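
    To make the MR streaming pattern concrete, here is a minimal sketch of a single generation of discrete Conway's Life written as a map step and a reduce step, the shape such logic takes under Hadoop Streaming. The in-process shuffle below simply groups values by key so the example runs locally; on a real cluster the mapper and reducer would be separate scripts reading stdin and writing stdout, and the paper's strip-partitioning optimization and continuous-Life variant are not shown.

# One generation of Conway's Life written as a map and a reduce step, the
# shape such logic takes under MapReduce streaming. The local "shuffle" here
# just groups mapper output by key; strip partitioning is not shown.
from collections import defaultdict

NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]


def mapper(live_cells):
    """Emit (cell, 0) for each live cell and (neighbour, 1) per live neighbour."""
    for (r, c) in live_cells:
        yield (r, c), 0                        # marker: this cell is currently alive
        for dr, dc in NEIGHBOURS:
            yield (r + dr, c + dc), 1          # contributes one live neighbour


def reducer(key, values):
    """Apply the B3/S23 rules from the liveness marker and neighbour count."""
    alive = 0 in values
    neighbour_count = sum(values)              # the 0 marker does not change the sum
    if neighbour_count == 3 or (alive and neighbour_count == 2):
        yield key


def life_step(live_cells):
    groups = defaultdict(list)                 # stand-in for the MR shuffle phase
    for key, value in mapper(live_cells):
        groups[key].append(value)
    next_gen = set()
    for key, values in groups.items():
        next_gen.update(reducer(key, values))
    return next_gen


if __name__ == "__main__":
    blinker = {(1, 0), (1, 1), (1, 2)}         # horizontal blinker, period-2 oscillator
    print(sorted(life_step(blinker)))          # expect the vertical blinker [(0,1), (1,1), (2,1)]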