    Implementation SHA512 Hash Function And Boyer-Moore String Matching Algorithm For Jpeg/exif Message Digest Compilation

    Information security methods for JPEG/Exif documents generally aim to prevent attacks by protecting documents with passwords and watermarks. Neither method can determine the condition of data integrity at the detection stage of the information security cycle. A message digest is the essence of a file, serving as a digital fingerprint that represents data integrity. This study aims to compile digital fingerprints to detect changes in JPEG/Exif documents. The research consists of five stages. In the first stage, the JPEG/Exif document structure is identified using the Boyer-Moore string matching algorithm to locate the JPEG/Exif segments. In the second stage, segment contents are acquired based on the segment locations and lengths obtained. In the third stage, a message digest is computed for each segment using the SHA512 hash function. In the fourth stage, JPEG/Exif document modification experiments are carried out to identify the affected segments. In the fifth stage, the hash values of the affected segments are selected and combined into the message digest. The results show that the message digest for JPEG/Exif documents is composed of three hash values: the SOI segment hash value detects JPEG-to-PNG conversion and image editing, the APP1 hash value detects metadata editing, and the SOF0 hash value detects image recoloring, cropping, and resizing. The combination of the three hash values constitutes the JPEG/Exif message digest.
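
    The segment-location and hashing steps can be sketched briefly. The Python fragment below is a minimal illustration, not the authors' implementation: it assumes the standard JPEG marker layout (SOI = 0xFFD8 with no payload; APP1 = 0xFFE1 and SOF0 = 0xFFC0 each followed by a 2-byte big-endian length), uses Python's built-in bytes.find() in place of Boyer-Moore, and the file name is hypothetical.

        import hashlib

        # Standard JPEG markers for the three segments the abstract combines
        # into the message digest.
        MARKERS = {
            "SOI":  b"\xff\xd8",   # start of image, no payload
            "APP1": b"\xff\xe1",   # Exif metadata segment
            "SOF0": b"\xff\xc0",   # baseline frame header
        }

        def segment_bytes(data: bytes, marker: bytes) -> bytes:
            # bytes.find() stands in for the Boyer-Moore search used in the
            # study; only the matching strategy differs, not the bytes returned.
            pos = data.find(marker)
            if pos < 0:
                return b""
            if marker == MARKERS["SOI"]:          # SOI carries no length field
                return data[pos:pos + 2]
            # Other segments store a 2-byte big-endian length after the marker.
            length = int.from_bytes(data[pos + 2:pos + 4], "big")
            return data[pos:pos + 2 + length]

        def message_digest(path: str) -> dict:
            # One SHA-512 value per segment; together they form the fingerprint.
            data = open(path, "rb").read()
            return {name: hashlib.sha512(segment_bytes(data, m)).hexdigest()
                    for name, m in MARKERS.items()}

        # print(message_digest("photo.jpg"))      # hypothetical file name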

    Specification and implementation of metadata for secure image provenance information

    The boom in AI tools capable of modifying images has equipped fake-media producers with powerful tools. Complementary to efforts to implement fake-media detectors, research organizations are designing a standardized way of describing the modification history of digital media in a cryptographically secure way, ensuring that this information cannot be tampered with. This thesis proposes a specification that focuses on JPEG images and specifies a data model based on the JPEG Universal Metadata Box Format (JUMBF) standard. Furthermore, it proposes the encryption of a subset of provenance metadata that could pose privacy-related risks to users. Along with the specification, a library has been developed to manage the provenance information of JPEG images. To that end, a set of libraries that handle JUMBF information also had to be implemented. These libraries have been submitted as proposed reference software contributing to the JUMBF standard.
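
    As a rough illustration of the container format the specification builds on, the sketch below walks ISO BMFF-style boxes (a 4-byte big-endian size covering the whole box, followed by a 4-byte type), the layout that JUMBF superboxes reuse. It is a simplification made under that assumption; extended sizes and the nested description boxes defined by the JUMBF standard are not handled, and jumbf_bytes is a hypothetical buffer.

        import struct

        def iter_boxes(data: bytes, offset: int = 0, end: int = None):
            # ISO BMFF-style box: 4-byte big-endian size covering the whole box,
            # followed by a 4-byte ASCII type; JUMBF superboxes reuse this layout.
            end = len(data) if end is None else end
            while offset + 8 <= end:
                size, box_type = struct.unpack_from(">I4s", data, offset)
                if size < 8:                  # extended/zero box sizes not handled
                    break
                yield box_type.decode("ascii", "replace"), data[offset + 8:offset + size]
                offset += size

        # for box_type, payload in iter_boxes(jumbf_bytes):
        #     print(box_type, len(payload))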

    Boyer-Moore String Matching Algorithm and SHA512 Implementation for Jpeg/exif File Fingerprint Compilation in DSA

    JPEG/Exif is the file format for images produced by digital cameras, such as those in smartphones. The security methods currently applied to JPEG/Exif files in digital communication fulfill only the prevention aspect of the three aspects of information security: prevention, detection, and response. The Digital Signature Algorithm (DSA) is a cryptographic method that provides the detection aspect by using a hash value as the fingerprint of a digital document. The purpose of this research is to compile a JPEG/Exif file data fingerprint using hash values from DSA. The research was conducted in four stages. The first stage is the identification of the JPEG/Exif file structure using the Boyer-Moore string matching algorithm to locate the positions of the file's segments. The second stage is the acquisition of segment contents. The third stage consists of image file modification experiments to select the suitable elements of the JPEG/Exif file data fingerprint. The fourth stage is the compilation of hash values to form the data fingerprint. The results show that the JPEG/Exif file fingerprint comprises three hash values, from the SOI, APP1, and SOF0 segments. The fingerprint can be used to detect modified images across six types of modification: image resizing, text addition, metadata modification, image recoloring, image cropping, and file type conversion.
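
    For the segment-location step itself, the Boyer-Moore family of searches skips ahead using a precomputed shift table rather than testing every offset. The sketch below implements the simplified Boyer-Moore-Horspool variant (bad-character rule only) in Python as an illustrative stand-in, not the code used in this research; jpeg_bytes is a hypothetical buffer.

        def horspool_find(haystack: bytes, needle: bytes) -> int:
            # Boyer-Moore-Horspool: slide the pattern by a shift keyed on the
            # haystack byte aligned with the pattern's last position.
            m, n = len(needle), len(haystack)
            if m == 0:
                return 0
            if m > n:
                return -1
            shift = {needle[i]: m - 1 - i for i in range(m - 1)}
            i = 0
            while i <= n - m:
                if haystack[i:i + m] == needle:
                    return i
                i += shift.get(haystack[i + m - 1], m)
            return -1

        # Locate the Exif APP1 marker inside a JPEG byte string:
        # pos = horspool_find(jpeg_bytes, b"\xff\xe1")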

    The Most Moral of Rights: The Right to be Recognized as the Author of One's Work

    The U.S. Constitution authorizes Congress to secure for limited times the exclusive right of authors to their writings. Curiously, those rights, as enacted in our copyright laws, have not included a general right to be recognized as the author of one's writings. Yet the interest in being identified with one's work is fundamental, whatever the conception of the philosophical or policy basis for copyright. The basic fairness of giving credit where it is due advances both the author-regarding and the public-regarding aspects of copyright. Most national copyright laws guarantee the right of attribution (or "paternity"); the leading international copyright treaty, the Berne Convention, requires that Member States protect other Members' authors' right to claim authorship. But, apart from an infinitesimal (and badly drafted) recognition of the right in the 1990 Visual Artists Rights Act, and an uncertain and indirect route through protection of copyright management information, the U.S. has not implemented that obligation. Perpetuating that omission not only allows a source of international embarrassment to continue to fester; it also belittles our own creators. Copyright not only protects the economic interests in a work of authorship; it also secures (or should secure) the dignitary interests that for many authors precede monetary gain. Without established and enforceable attribution rights, U.S. copyright neither meets international norms nor fulfills the aspirations of the constitutional copyright clause. This article will analyze the bases and enforceability of attribution rights within international norms. It will review the sources of attribution rights in current U.S. copyright law, particularly the Visual Artists Rights Act and section 1202's coverage of copyright management information. It will explore the extent to which removal of author-identifying information might violate section 1202 and/or disqualify an online service provider from the section 512 safe harbors. Finally, it will consider how our law might be interpreted or amended to provide for authorship attribution. Non-legislative measures include making authorship attribution a consideration under the first factor of the fair use defense.

    Exploration of media blockchain technologies for JPEG privacy and security

    Privacy and security, copyright violations, and fake news are emerging challenges in digital media. Social media and data leaks increase the risks to user privacy. Creative media, particularly images, are often susceptible to copyright violations, which poses a serious problem for the industry. On the other hand, images doctored with photo-editing tools and computer-generated images may give a false impression of reality and add to the problem of fake news. These problems demand solutions that protect images and their associated metadata, as well as methods that can prove the integrity of digital media. For these reasons, the JPEG standardization committee has been working on a new Privacy and Security standard that provides solutions to support privacy- and security-focused workflows. The standard defines tools to support protection and integrity across the wide range of JPEG image standards. Related to image integrity, blockchain technology provides a solution for creating tamper-proof distributed ledgers. However, adopting blockchain technology for digital image integrity poses several challenges, both at the technology level and at the level of privacy legislation. In addition, if blockchain technology is adopted to support media applications, it needs to be closely integrated with a widely adopted standard to ensure broad interoperability. Therefore, the JPEG committee initiated an activity to explore standardization needs related to media blockchain and distributed ledger technologies (DLT). This paper explains the scope and implementation of the JPEG Privacy and Security standard and presents the current state of the exploration of standardization needs related to media blockchain applications.
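
    The tamper-evidence that makes distributed ledgers attractive for image integrity comes from hash chaining: each entry commits to the hash of the previous one, so altering an earlier record invalidates everything after it. The Python sketch below shows only that chaining idea with hypothetical record fields; it is not part of the JPEG exploration nor an implementation of any real blockchain.

        import hashlib, json, time

        def _digest(fields: dict) -> str:
            return hashlib.sha256(json.dumps(fields, sort_keys=True).encode()).hexdigest()

        def append_record(chain: list, image_sha256: str, note: str) -> dict:
            # Each entry commits to the previous entry's hash.
            prev = chain[-1]["entry_hash"] if chain else "0" * 64
            entry = {"prev": prev, "image_sha256": image_sha256,
                     "note": note, "timestamp": time.time()}
            entry["entry_hash"] = _digest(entry)
            chain.append(entry)
            return entry

        def verify(chain: list) -> bool:
            # Recompute every hash and check the back-links.
            prev = "0" * 64
            for entry in chain:
                body = {k: v for k, v in entry.items() if k != "entry_hash"}
                if entry["prev"] != prev or _digest(body) != entry["entry_hash"]:
                    return False
                prev = entry["entry_hash"]
            return True

        # ledger = []
        # append_record(ledger, hashlib.sha256(b"<image bytes>").hexdigest(), "original capture")
        # assert verify(ledger)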

    Digital lifecycles and file types: final report

    The Rights and Rewards in Blended Institutional Repositories Project is funded by the Joint Information Systems Committee (JISC) under the Digital Repositories Programme. It is a cooperative venture between the Department of Information Science (DIS), the Engineering Centre for Excellence in Teaching and Learning (engCETL) and the University Library. The two-year project aims to establish a single Blended repository to meet the teaching and research needs of the institution. It will address the motivational issues facing depositors of teaching materials, with a focus on the associated Rights and Rewards. This digital lifecycles study will identify the most appropriate materials for submission to the project's demonstrator repository, taking into account factors such as granularity, persistence and the multimedia types that can be supported for both teaching and research materials. It also documents the existing lifecycles of these items and the tools and specifications needed within a repository framework to support those lifecycles. For example, it will identify the appropriate granularity of teaching resources and appropriate methods for content packaging. The results of the study will help to identify which types of files are currently in use and which formats should be supported by the repository system ultimately selected for the demonstrator repository. This information is likely to be of benefit to other projects and institutions in the process of setting up an Institutional Repository (IR).

    The use of technical metadata in still digital imaging by the newspaper industry

    Newspapers are increasingly capturing images digitally. Included with these digital files is technical information about the conditions of the image and the conditions surrounding image capture. Technical metadata has the potential to be a valuable resource in image reproduction, management, and archiving. Nevertheless, even though digital devices capture a large amount of technical metadata, the use of such data in the digital imaging workflow is not widespread. The use of technical metadata requires a uniform set of technical metadata standards and an open encoding scheme for embedding data. From their inception, image file formats such as TIFF and JPEG have allowed the inclusion of technical metadata tags. The Exif schema has extended the metadata inclusion capabilities of both of these formats. Additionally, XML has emerged as a standard for users to add metadata to image files; consequently, organizations such as the World Wide Web Consortium and Adobe Systems support XML. Moreover, organizations such as the Digital Imaging Group (DIG35) and the National Information Standards Organization (NISO) are defining standards for technical metadata inclusion. The purpose of this study was to answer two fundamental questions about technical metadata in the newspaper industry: first, it assessed the ability of technical metadata to improve the newspaper digital imaging workflow; second, it determined how technical metadata could be used to preserve the integrity of newspaper digital images. The study examined five large newspaper organizations: The Chicago Tribune, The New York Times, The Rochester Democrat & Chronicle, USA Today, and The Washington Post. Based on interviews and questionnaire responses, each organization's use of technical metadata in the digital imaging workflow was examined through case studies. Interviews were conducted with the individuals responsible for image capture, adjustment, database management, and output. Furthermore, participants were asked to rank the importance of selected fields of technical metadata through a questionnaire. It was found that the use of technical metadata classified by NISO as Basic Image Parameters, which includes file size, type, and compression, was universal in newspaper organizations. The use of Image Creation metadata was not widespread, with the exception of two fields that established the date and time of capture and assigned each image a unique identifier. Image Performance Assessment metadata, such as test targets, was not widely used except by The Rochester Democrat & Chronicle. Change History fell victim to the short cycle time in the newspaper industry, and for the most part a history of change was kept at various handoffs in the digital workflow through versioning. The use of technical metadata to improve the digital workflow was, to an extent, at cross purposes with newspapers' need to visually examine each image to determine its usefulness. However, software designed to visually present technical metadata through a well-designed graphical user interface was popular. It appeared that technical metadata had the potential to benefit newspapers when repurposing images for other media. Additionally, large newspaper organizations were creating their own image databases; while the use of technical metadata in these archives was unclear, it would be prudent to include too much technical metadata rather than too little. The foremost concern of all organizations was preservation of the editorial integrity of the image.
    Newspapers defined editorial integrity as the ability to capture as much detail of an event as possible and then present that information to their readers in a truthful, unambiguous way. Research pointed out that image reproduction quality was only one of a series of variables that determined newspaper image quality. With the advent of digital photography, photographers are editing more in the field, and as a result they are making decisions about image content. The use of technical metadata has the potential to provide greater traceability of these outtakes. Additionally, the industry is moving toward the Camera Raw file format to acquire image data that is unprocessed by camera software. The adjustment of Camera Raw files through a GUI, and their subsequent conversion to another file format, represented a de facto use of technical metadata to preserve editorial integrity.
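
    As a concrete example of how such fields are read in practice, the sketch below extracts a few Exif tags of the "Image Creation" kind discussed in the study (capture date/time, camera make and model) using the Pillow library; the file name is hypothetical and the tag selection is illustrative only.

        from PIL import Image, ExifTags   # Pillow is assumed to be installed

        def basic_exif(path: str) -> dict:
            # Pull a few 'Image Creation' style fields (capture date/time,
            # camera make and model) from the Exif block of a JPEG file.
            exif = Image.open(path).getexif()
            wanted = {"DateTime", "Make", "Model", "Software"}
            return {ExifTags.TAGS.get(tag_id, str(tag_id)): value
                    for tag_id, value in exif.items()
                    if ExifTags.TAGS.get(tag_id) in wanted}

        # print(basic_exif("frontpage_photo.jpg"))   # hypothetical file name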