
    Cognitive computation of compressed sensing for watermark signal measurement

    As an important tool for protecting multimedia content, scrambling and randomizing of the original message is used when generating a digital watermark to satisfy security requirements. Based on the neural perception of high-dimensional data, compressed sensing (CS) is proposed as a new watermarking technique for improved security and reduced computational complexity. In the proposed methodology, the watermark signal is extracted from CS measurements taken with a Hadamard measurement matrix. By constructing a scrambled block Hadamard matrix with a cryptographic key, the watermark signal is encrypted in the CS domain without any additional computation. Extensive experiments show that the neurally inspired CS mechanism generates a watermark signal of higher security while maintaining a better trade-off between transparency and robustness.
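    As a rough illustration of the keyed-measurement idea, the sketch below builds a Hadamard matrix, scrambles its rows and columns with a key-seeded permutation, and takes CS measurements. The permutation-based scrambling, function names, and parameters are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of 2)."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def scrambled_measurement_matrix(n, m, key):
    """Keyed scrambling (illustrative): permute rows/columns of the Hadamard
    matrix with a PRNG seeded by the secret key, then keep m rows as the
    CS sensing matrix."""
    rng = np.random.default_rng(key)   # key acts as the scrambling seed
    H = hadamard(n)
    rows = rng.permutation(n)[:m]      # keyed row selection
    cols = rng.permutation(n)          # keyed column permutation
    return H[np.ix_(rows, cols)] / np.sqrt(n)

def measure(signal, key, m):
    """CS measurement y = Phi @ x; the key-dependent Phi conceals the
    watermark signal at no extra computational cost beyond sensing."""
    Phi = scrambled_measurement_matrix(len(signal), m, key)
    return Phi @ signal

x = np.random.default_rng(0).standard_normal(64)   # toy watermark signal
y = measure(x, key=1234, m=16)   # only key holders can form the same Phi
```

A receiver without the key obtains a different, uncorrelated measurement matrix, so the measurements alone do not reveal the watermark.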

    Binary Biometrics: An Analytic Framework to Estimate the Performance Curves Under Gaussian Assumption

    In recent years, the protection of biometric data has gained increased interest from the scientific community. Methods such as the fuzzy commitment scheme, helper-data system, fuzzy extractors, fuzzy vault, and cancelable biometrics have been proposed for protecting biometric data. Most of these methods use cryptographic primitives or error-correcting codes (ECCs) and use a binary representation of the real-valued biometric data. Hence, the difference between two biometric samples is given by the Hamming distance (HD) or bit errors between the binary vectors obtained from the enrollment and verification phases, respectively. If the HD is smaller (larger) than the decision threshold, then the subject is accepted (rejected) as genuine. Because of the use of ECCs, this decision threshold is limited to the maximum error-correcting capacity of the code, consequently limiting the false rejection rate (FRR) and false acceptance rate tradeoff. A method to improve the FRR consists of using multiple biometric samples in either the enrollment or verification phase. The noise is suppressed, hence reducing the number of bit errors and decreasing the HD. In practice, the number of samples is empirically chosen without fully considering its fundamental impact. In this paper, we present a Gaussian analytical framework for estimating the performance of a binary biometric system given the number of samples being used in the enrollment and the verification phase. The detection error tradeoff (DET) curve that combines the false acceptance and false rejection rates is estimated to assess the system performance. The analytic expressions are validated using the Face Recognition Grand Challenge v2 and Fingerprint Verification Competition 2000 biometric databases.
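    The accept/reject rule, and the effect of averaging multiple enrollment samples before binarization, can be sketched as follows. The mean-based binarization, dimensions, noise levels, and threshold are illustrative assumptions, not the paper's model:

```python
import numpy as np

def binarize(features, population_mean):
    """Binary representation: bit i is 1 iff feature i exceeds the
    population mean (a common, simple binarization rule)."""
    return (features > population_mean).astype(np.uint8)

def decide(enroll_bits, verify_bits, threshold):
    """Accept as genuine iff the Hamming distance is at most the threshold.
    In an ECC-based scheme the threshold is capped by the code's
    error-correcting capacity."""
    hd = np.count_nonzero(enroll_bits != verify_bits)
    return hd <= threshold

rng = np.random.default_rng(0)
population_mean = np.zeros(128)
template = rng.standard_normal(128)   # a subject's "true" feature vector

# Averaging N noisy enrollment samples suppresses the additive noise
# (variance shrinks by 1/N) before binarization, reducing bit errors.
N = 5
samples = template + 0.5 * rng.standard_normal((N, 128))
enroll_bits = binarize(samples.mean(axis=0), population_mean)

probe = template + 0.5 * rng.standard_normal(128)
verify_bits = binarize(probe, population_mean)
genuine = decide(enroll_bits, verify_bits, threshold=32)
```

With more enrollment samples the enrollment bits approach the noise-free template's bits, which is the effect the paper's Gaussian framework quantifies analytically.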

    Perceptual Video Hashing for Content Identification and Authentication

    Perceptual hashing has been broadly used in the literature to identify similar contents for video copy detection. It has also been adopted to detect malicious manipulations for video authentication. However, targeting both applications with a single system using the same hash would be highly desirable, as this saves storage space and reduces computational complexity. This paper proposes a perceptual video hashing system for content identification and authentication. The objective is to design a hash extraction technique that can withstand signal processing operations on the one hand and detect malicious attacks on the other. The proposed system relies on a new signal calibration technique for extracting the hash using the discrete cosine transform (DCT) and the discrete sine transform (DST). This consists of determining the number of samples, called the normalizing shift, that is required for shifting a digital signal so that the shifted version matches a certain pattern according to DCT/DST coefficients. The rationale for the calibration idea is that the normalizing shift resists signal processing operations while it exhibits sensitivity to local tampering (i.e., replacing a small portion of the signal with a different one). While the same hash serves both applications, two different similarity measures have been proposed for video identification and authentication, respectively. Through intensive experiments with various types of video distortions and manipulations, the proposed system has been shown to outperform related state-of-the-art video hashing techniques in terms of identification and authentication, with the advantageous ability to locate tampered regions.
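    A much-simplified 1-D sketch of the normalizing-shift idea: search over circular shifts for the one that best matches a DCT-based criterion. Here the "pattern" is just maximizing a single DCT-II coefficient, a stand-in assumption for the paper's actual DCT/DST matching rule:

```python
import numpy as np

def dct2_coeffs(x):
    """Naive DCT-II from its cosine definition (numpy only, for illustration)."""
    n = len(x)
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    return np.cos(np.pi * k * (2 * i + 1) / (2 * n)) @ x

def normalizing_shift(x, coeff=1):
    """Return the circular shift that maximizes the chosen DCT coefficient
    (a toy stand-in for matching a pattern of DCT/DST coefficients)."""
    scores = [dct2_coeffs(np.roll(x, s))[coeff] for s in range(len(x))]
    return int(np.argmax(scores))

rng = np.random.default_rng(0)
x = rng.standard_normal(32)
s = normalizing_shift(x)
# Because all circular shifts are searched, circularly shifting the input
# merely offsets the answer: the calibrated (shifted) signal is unchanged.
s2 = normalizing_shift(np.roll(x, 3))
```

This shift-search is what makes the calibrated signal stable under global shifts while a locally tampered segment changes the coefficients, and hence the chosen shift.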

    Effective Image Fingerprint Extraction Based on Random Bubble Sampling

    In this paper, we propose an algorithm for image fingerprint extraction based on random selection of circular bubbles on the considered image. In more detail, a fingerprint vector is associated with the image, the components of which are the variances of pixel luminance values in randomly selected circular zones of the image. The positions and radii of these bubbles result from a random selection, whose parameters are user-defined. The obtained fingerprint has then been used for content-based image retrieval, using the standard Euclidean distance as the similarity metric between the extracted features. Experiments based on the detection of various linearly and nonlinearly distorted versions of a test image in a large database show very promising results.
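    The extraction and comparison steps can be sketched as follows; the bubble count, radius range, and seeded placement are illustrative stand-ins for the user-defined parameters:

```python
import numpy as np

def bubble_fingerprint(image, n_bubbles=32, min_r=3, max_r=10, seed=42):
    """Fingerprint = variances of pixel luminance inside randomly placed
    circular bubbles; the seed (user-defined) fixes positions and radii
    so all images are sampled at the same bubbles."""
    rng = np.random.default_rng(seed)
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    fp = np.empty(n_bubbles)
    for i in range(n_bubbles):
        r = rng.integers(min_r, max_r + 1)
        cy = rng.integers(r, h - r)
        cx = rng.integers(r, w - r)
        mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2   # circular zone
        fp[i] = image[mask].var()                          # luminance variance
    return fp

def similarity(fp_a, fp_b):
    """Standard Euclidean distance between fingerprints (lower = more similar)."""
    return float(np.linalg.norm(fp_a - fp_b))

rng = np.random.default_rng(0)
img = rng.random((64, 64))
noisy = img + 0.01 * rng.standard_normal((64, 64))   # mild distortion
other = rng.random((64, 64))                          # unrelated image
```

For retrieval, the query fingerprint is compared against the stored fingerprints and the nearest images are returned; a mildly distorted copy stays much closer than an unrelated image.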

    Privacy-preserving techniques for computer and network forensics

    Clients, administrators, and law enforcement personnel have many privacy concerns when it comes to network forensics. Clients would like to use network services in a freedom-friendly environment that protects their privacy and personal data. Administrators would like to monitor their network and audit its behavior and functionality for debugging and statistical purposes (which could involve invading the privacy of its network users). Finally, members of law enforcement would like to track and identify any type of digital crime that occurs on the network, and charge the suspects with the appropriate crimes. Members of law enforcement could use some security back doors made available by network administrators, or other forensic tools, that could potentially invade the privacy of network users. In my dissertation, I will be identifying and implementing techniques that each of these entities could use to achieve their goals while preserving the privacy of users on the network. I will show a privacy-preserving implementation of network flow recording that can allow administrators to monitor and audit their network behavior and functionality for debugging and statistical purposes without having this data contain any private information about its users. This implementation is based on identity-based encryption and differential privacy. I will also be showing how law enforcement could use timing channel techniques to fingerprint anonymous servers that are running websites with illegal content and services. Finally, I will show the results from a thought experiment about how network administrators can identify pattern-like software that is running on clients' machines remotely without any administrative privileges.
The goal of my work is to understand what privileges administrators or law enforcement need to achieve their goals, and the privacy issues inherent in this, and to develop technologies that help administrators and law enforcement achieve their goals while preserving the privacy of network users.
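The differential-privacy ingredient of the flow-recording scheme can be illustrated with the standard Laplace mechanism. The epsilon value and per-port counts below are hypothetical, and the dissertation's actual system also involves identity-based encryption, which is not shown:

```python
import numpy as np

def dp_flow_count(true_count, epsilon, rng):
    """Laplace mechanism: release a flow count plus Laplace noise with
    scale = sensitivity / epsilon. Sensitivity is 1 here because one
    user's flows change any single count by at most 1."""
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

rng = np.random.default_rng(0)
# Per-port connection counts observed by the monitor (hypothetical numbers).
flows = {"80": 1520, "443": 4031, "22": 87}
released = {port: dp_flow_count(c, epsilon=0.5, rng=rng)
            for port, c in flows.items()}
```

The released counts remain accurate enough for debugging and traffic statistics, while the calibrated noise bounds what they reveal about any individual user's activity.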