
    Dual-image-based reversible data hiding scheme with integrity verification using exploiting modification direction

    In this paper, a novel dual-image-based reversible data hiding scheme using exploiting modification direction (EMD) is proposed. This scheme embeds two 5-base secret digits into each pixel pair of the cover image simultaneously according to the EMD matrix to generate two stego-pixel pairs. By shifting these stego-pixel pairs to the appropriate locations in some cases, two meaningful shadows are produced. The secret data can be extracted accurately, and the cover image can be reconstructed completely in the data extraction and the image reconstruction procedure, respectively. Experimental results show that our scheme outperforms the comparative methods in terms of image quality and embedding ratio. Pixel-value differencing (PVD) histogram analysis reveals that our scheme..
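The basic EMD embedding rule that such schemes build on can be sketched as follows. This is a minimal illustration of the classic EMD extraction function for a single pixel pair (n = 2, base 5), not the dual-image extension described in the abstract; the function and variable names are illustrative.

```python
def emd_extract(p1, p2):
    """Extraction function f for a pixel pair (n = 2, base-5 digits)."""
    return (p1 * 1 + p2 * 2) % 5

def emd_embed(p1, p2, digit):
    """Embed one 5-ary secret digit by changing at most one pixel by +/- 1."""
    s = (digit - emd_extract(p1, p2)) % 5
    if s == 0:
        return p1, p2          # digit already carried, no change needed
    elif s == 1:
        return p1 + 1, p2      # weight of p1 is 1, so +1 raises f by 1
    elif s == 2:
        return p1, p2 + 1      # weight of p2 is 2, so +1 raises f by 2
    elif s == 3:
        return p1, p2 - 1      # -2 mod 5 == 3
    else:                      # s == 4
        return p1 - 1, p2      # -1 mod 5 == 4
```

Each 5-ary digit is carried by changing at most one pixel of the pair by a single grey level, which is what keeps distortion, and hence stego-image quality, high in EMD-based schemes.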

    Exclusive-or preprocessing and dictionary coding of continuous-tone images.

    The field of lossless image compression studies the various ways to represent image data in the most compact and efficient manner possible that also allows the image to be reproduced without any loss. One of the most efficient strategies used in lossless compression is to introduce entropy reduction through decorrelation. This study focuses on using the exclusive-or logic operator in a decorrelation filter as the preprocessing phase of lossless image compression of continuous-tone images. The exclusive-or logic operator is simply and reversibly applied to continuous-tone images for the purpose of extracting differences between neighboring pixels. Implementation of the exclusive-or operator also does not introduce data expansion. Traditional as well as innovative prediction methods are included for the creation of inputs for the exclusive-or logic based decorrelation filter. The results of the filter are then encoded by a variation of the Lempel-Ziv-Welch dictionary coder. Dictionary coding is selected for the coding phase of the algorithm because it does not require the storage of code tables or probabilities and because it is lower in complexity than other popular options such as Huffman or Arithmetic coding. The first modification of the Lempel-Ziv-Welch dictionary coder is that image data can be read in a sequence that is linear, 2-dimensional, or an adaptive combination of both. The second modification of the dictionary coder is that the coder can instead include multiple, dynamically chosen dictionaries. Experiments indicate that the exclusive-or operator based decorrelation filter when combined with a modified Lempel-Ziv-Welch dictionary coder provides compression comparable to algorithms that represent the current standard in lossless compression. 
The proposed algorithm provides compression performance that is 23% below the Context-Based, Adaptive, Lossless Image Compression (CALIC) algorithm, 19% below the Low Complexity Lossless Compression for Images (LOCO-I) algorithm, and 7% below the Portable Network Graphics implementation of the Deflate algorithm, but 24% above the Zip implementation of the Deflate algorithm. The proposed algorithm uses the exclusive-or operator in the modeling phase and modified Lempel-Ziv-Welch dictionary coding in the coding phase to form a low-complexity, reversible, and dynamic method of lossless image compression.
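The exclusive-or decorrelation step can be illustrated with a minimal sketch. This assumes a simple previous-pixel predictor and a 1-D scan; the study's actual predictors and scan orders (linear, 2-dimensional, or adaptive) may differ.

```python
def xor_filter(pixels):
    """XOR each pixel with its left neighbour (first pixel kept as-is)."""
    out = [pixels[0]]
    for i in range(1, len(pixels)):
        out.append(pixels[i] ^ pixels[i - 1])
    return out

def xor_unfilter(filtered):
    """Exact inverse: rebuild the pixels by cumulative XOR."""
    out = [filtered[0]]
    for i in range(1, len(filtered)):
        out.append(filtered[i] ^ out[-1])
    return out
```

Because XOR is its own inverse and maps 8-bit values to 8-bit values, the filter is exactly reversible and introduces no data expansion, which is the property the abstract highlights.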

    Study and Implementation of Watermarking Algorithms

    Watermarking is the process of embedding data, called a watermark, into a multimedia object such that the watermark can be detected or extracted later to make an assertion about the object. The object may be an audio, image or video. A copy of a digital image is identical to the original. This has, in many instances, led to the use of digital content with malicious intent. One way to protect multimedia data against illegal recording and retransmission is to embed a signal, called a digital signature, copyright label or watermark, that authenticates the owner of the data. Data hiding, the embedding of secondary data in digital media, has made considerable progress in recent years and attracted attention from both academia and industry. Techniques have been proposed for a variety of applications, including ownership protection, authentication and access control. Imperceptibility, robustness against moderate processing such as compression, and the ability to hide many bits are the basic but rat..
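As a minimal illustration of this kind of data hiding, the classic least-significant-bit (LSB) substitution scheme embeds watermark bits directly into pixel LSBs. This is a generic textbook example, not necessarily one of the algorithms implemented in this study.

```python
def embed_lsb(pixels, bits):
    """Replace the least significant bit of each leading pixel with a watermark bit."""
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b   # clear the LSB, then set it to the bit
    return out

def extract_lsb(pixels, n):
    """Read back the first n watermark bits."""
    return [p & 1 for p in pixels[:n]]
```

LSB substitution is imperceptible but fragile; robust schemes of the kind surveyed above instead embed in transform coefficients so that the watermark survives moderate processing such as compression.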

    Reversible and imperceptible watermarking approach for ensuring the integrity and authenticity of brain MR images

    The digital medical workflow has many circumstances in which the image data can be manipulated both within the secured Hospital Information Systems (HIS) and outside, as images are viewed, extracted and exchanged. This potentially raises ethical and legal concerns regarding the modification of image details that are crucial in medical examinations. Digital watermarking is recognised as a robust technique for enhancing trust within medical imaging by detecting alterations applied to medical images. Despite its efficiency, digital watermarking has not been widely used in medical imaging. Existing watermarking approaches often lack validation of their appropriateness to medical domains. In particular, several research gaps have been identified: (i) essential requirements for the watermarking of medical images are not well defined; (ii) no standard approach can be found in the literature to evaluate the imperceptibility of watermarked images; and (iii) no study has been conducted before to test digital watermarking in a medical imaging workflow. This research aims to investigate digital watermarking by designing, analysing and applying it to medical images to confirm that manipulations can be detected and tracked. In addressing these gaps, a number of original contributions are presented. A new reversible and imperceptible watermarking approach is presented to detect manipulations of brain Magnetic Resonance (MR) images based on the Difference Expansion (DE) technique. Experimental results show that the proposed method, whilst fully reversible, can also realise a watermarked image with low degradation for a reasonable and controllable embedding capacity.
This is fulfilled by encoding the data into smooth regions (blocks with the least differences between their pixel values) inside the Region of Interest (ROI) part of medical images, and also through the elimination of the large location map (the locations of pixels used for encoding the data) required at extraction to retrieve the encoded data. This compares favourably to outcomes reported for current state-of-the-art techniques in terms of the visual quality of watermarked images. This was also evaluated by conducting a novel visual assessment based on relative Visual Grading Analysis (relative VGA) to define a perceptual threshold at which modifications become noticeable to radiographers. The proposed approach is then integrated into medical systems to verify its validity and applicability in a real application scenario of medical imaging, where medical images are generated, exchanged and archived. This enhanced security measure therefore enables the detection of image manipulations by an imperceptible and reversible watermarking approach, which may establish increased trust in the digital medical imaging workflow.
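The Difference Expansion step underlying such an approach can be sketched for a single pixel pair. This follows Tian's classic DE formulation and omits the overflow checks and location-map handling that the thesis addresses.

```python
def de_embed(x, y, bit):
    """Embed one bit into pixel pair (x, y) by expanding their difference."""
    l = (x + y) // 2        # integer average, preserved by the embedding
    h = x - y               # difference between the pair
    h2 = 2 * h + bit        # expanded difference carries the bit in its LSB
    return l + (h2 + 1) // 2, l - h2 // 2

def de_extract(x2, y2):
    """Recover the bit and restore the original pixel pair exactly."""
    l = (x2 + y2) // 2
    h2 = x2 - y2
    bit = h2 % 2            # LSB of the expanded difference
    h = h2 // 2             # floor division also handles negative differences
    return bit, l + (h + 1) // 2, l - h // 2
```

In practice only "expandable" pairs whose new values stay within the valid grey-level range are used; tracking those pairs is exactly what the location map does, and embedding in smooth ROI blocks is the thesis's route to eliminating it.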

    Flowing matter

    This open access book, published in the Soft and Biological Matter series, presents an introduction to selected research topics in the broad field of flowing matter, including the dynamics of fluids with a complex internal structure, from nematic fluids to soft glasses, as well as active matter and turbulent phenomena. Flowing matter is a subject at the crossroads between physics, mathematics, chemistry, engineering, biology and earth sciences, and relies on a multidisciplinary approach to describe the emergence of macroscopic behaviours in a system from the coordinated dynamics of its microscopic constituents. Depending on the microscopic interactions, an assembly of molecules or of mesoscopic particles can flow like a simple Newtonian fluid, deform elastically like a solid or behave in a complex manner. When the internal constituents are active, as for biological entities, one generally observes complex large-scale collective motions. The phenomenology is further complicated by the invariable tendency of fluids to display chaos at large scales or when stirred strongly enough. This volume presents several research topics that address these phenomena, encompassing the traditional micro-, meso- and macro-scale descriptions, and contributes to our understanding of the fundamentals of flowing matter. This book is the legacy of the COST Action MP1305 “Flowing Matter”.

    Spline wavelet image coding and synthesis for a VLSI based difference engine

    Bibliography: leaves 142-146. The efficiency of an image compression/synthesis system based on a spline multi-resolution analysis (MRA) is investigated. The proposed system uses a quadratic spline wavelet transform combined with minimum-mean-squared-error vector quantization to achieve image compression. Image synthesis is accomplished by utilizing the properties of the MRA and the architecture of a custom-designed display processor, the Difference Engine. The latter is ideally suited to rendering images with polynomial intensity profiles, such as those generated by the proposed spline MRA. Based on these properties, an adaptive image synthesis system is developed which enables one to reduce the number of instruction cycles required to reproduce images compressed using the quadratic spline wavelet transform. This adaptive approach is computationally simple and fairly robust. In addition, there is little overhead involved in its implementation.
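The minimum-MSE vector quantization step can be sketched as a nearest-codeword search; the codebook and block shapes here are made up for illustration and are much smaller than anything a real coder would train.

```python
def nearest_codeword(block, codebook):
    """Return the index of the codeword with minimum squared error to the block."""
    best, best_err = 0, float("inf")
    for i, cw in enumerate(codebook):
        err = sum((a - b) ** 2 for a, b in zip(block, cw))
        if err < best_err:
            best, best_err = i, err
    return best

def vq_encode(blocks, codebook):
    """Map each image block to the index of its nearest codeword."""
    return [nearest_codeword(b, codebook) for b in blocks]

def vq_decode(indices, codebook):
    """Reconstruction is a simple codebook lookup per index."""
    return [codebook[i] for i in indices]
```

Compression comes from transmitting only the indices; the decoder reconstructs each block by table lookup, so the achievable quality is governed by how well the codebook was trained.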

    NASA Tech Briefs, July 1992

    Topics include: New Product Ideas; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences

    Teaching and Learning of Fluid Mechanics

    This book contains research on the pedagogical aspects of fluid mechanics and includes case studies, lesson plans, articles on historical aspects of fluid mechanics, and novel and interesting experiments and theoretical calculations that convey complex ideas in creative ways. The current volume showcases the teaching practices of fluid dynamicists from different disciplines, ranging from mathematics, physics, mechanical engineering and environmental engineering to chemical engineering. The articles are suitable for courses from the early undergraduate to the graduate level and can be read by faculty and students alike. We hope this collection will encourage cross-disciplinary pedagogical practices and give students a glimpse of the wide range of applications of fluid dynamics.

    Experimental and Data-driven Workflows for Microstructure-based Damage Prediction

    Material fatigue is the most common cause of mechanical failure. The degradation mechanisms that determine component lifetime under comparatively pronounced cyclic loads are well understood. Under loads in the macroscopically elastic regime, however, that is, under (very) high cycle fatigue, the internal structure of a material and the interaction of crystallographic defects govern the lifetime. Under these circumstances, the internal degradation phenomena at the microscopic scale are largely reversible and do not lead to the formation of critical damage that can grow continuously. However, depending on the local microstructural conditions, some grain ensembles in polycrystalline metals are prone to damage initiation, crack formation and crack growth, and therefore act as weak points. Components subjected to such loads consequently often exhibit pronounced scatter in lifetime. Because a comprehensive mechanistic understanding of these degradation processes in different materials is lacking, current modelling efforts typically predict the mean lifetime and its variance only with unsatisfactory accuracy. This, in turn, complicates component design and necessitates the use of safety factors during dimensioning. A remedy is to collect extensive data on influencing factors and their effect on the formation of initial fatigue damage. Data scarcity continues to hamper data scientists and modelling experts who attempt to infer microstructural dependencies, train data-driven predictive models, or parameterise physics-based, rule-based models despite small sample sizes and incomplete feature spaces.
The fact that only a few critical damage sites occur relative to the total specimen volume, and that high cycle fatigue exhibits a multitude of different dependencies, imposes several requirements on data acquisition and processing. Most importantly, the measurement techniques must be sensitive enough to capture nuanced variations in the specimen state, the entire routine must be efficient, and correlative microscopy must link spatial information from the different measurements. The main objective of this work is to establish a workflow that remedies the data shortage, so that the future virtual design of components can be made more efficient, reliable and sustainable. To this end, this work proposes a combined experimental and data-processing workflow for generating multimodal fatigue damage data sets. The focus lies on the occurrence of local slip bands, crack initiation, and the growth of microstructurally short cracks. The workflow unites fatigue testing of mesoscale specimens to increase the sensitivity of damage detection, complementary characterisation, multimodal registration and data fusion of the heterogeneous data, and image-processing-based damage localisation and assessment. Mesoscale bending resonance testing makes it possible to reach the high cycle fatigue regime in comparatively short time spans while simultaneously improving the resolution of the damage evolution. Depending on the complexity of the individual image-processing tasks and on data availability, either rule-based image processing or representation learning is applied in a targeted manner. For example, semantic segmentation of damage sites ensures that important fatigue features can be extracted from microscopic images.
A high degree of automation is emphasised throughout the workflow. Wherever possible, the generalisability of individual workflow elements was investigated. The workflow is applied to a ferritic steel (EN 1.4003). The resulting data set links, among other things, large distortion-corrected microstructure data with damage localisation and its cyclic evolution. In the course of this work, the data set is examined with regard to its information content by conducting detailed analytical studies of individual damage formation. In this way, novel quantitative insights into microstructure-induced plastic deformation and crack arrest mechanisms were obtained, among other findings. In addition, grain-wise feature vectors and binary damage categories derived from the data set are used to train a random forest classifier and to evaluate its predictive performance. The proposed workflow has the potential to lay the foundation for future data mining and data-driven modelling of microstructure-sensitive fatigue. It enables the efficient acquisition of statistically representative data sets with high information content and can be extended to a wide range of materials.