
    Image authentication using LBP-based perceptual image hashing

    Feature extraction is a key step in every perceptual image hashing scheme: more robust features lead to better perceptual robustness. Simplicity, discriminative power, computational efficiency, and robustness to illumination changes are distinguishing properties of Local Binary Pattern (LBP) features. In this paper, we investigate the use of local binary patterns for perceptual image hashing. For feature extraction, we propose to use both the sign and the magnitude information of local differences, so the algorithm combines gradient-based and LBP-based descriptors. To meet security requirements, two secret keys are incorporated in the feature extraction and hash generation steps. The performance of the proposed hashing method is evaluated on an important application of perceptual image hashing: image authentication. Experiments show that the method has acceptable robustness against content-preserving manipulations. Moreover, the proposed method can localize tampered regions, which is not possible in all hashing schemes.
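
    The following is a minimal sketch, in Python with NumPy, of how the sign and the magnitude of local differences can both be turned into LBP-style codes (in the spirit of completed LBP). It only illustrates the idea, not the paper's exact descriptor or its key-dependent steps; array shapes, the mean-magnitude threshold, and the histogram feature are assumptions.

```python
# Sketch of sign/magnitude LBP features (NumPy only); illustrative, not the paper's pipeline.
import numpy as np

def sign_magnitude_lbp(block):
    """block: 2-D float array (grayscale image block).
    Returns per-pixel LBP-sign codes and LBP-magnitude codes."""
    # 8-neighbour offsets (row, col), clockwise from the top-left neighbour
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    center = block[1:-1, 1:-1]
    diffs = np.stack([block[1 + dr:block.shape[0] - 1 + dr,
                            1 + dc:block.shape[1] - 1 + dc] - center
                      for dr, dc in offsets])              # local differences d_p
    signs = (diffs >= 0).astype(np.uint8)                  # sign component s_p
    mags = np.abs(diffs)                                   # magnitude component m_p
    mag_bits = (mags >= mags.mean()).astype(np.uint8)      # threshold by the mean magnitude
    weights = (1 << np.arange(8)).reshape(8, 1, 1)
    lbp_s = (signs * weights).sum(axis=0)                  # classic LBP code (sign part)
    lbp_m = (mag_bits * weights).sum(axis=0)               # magnitude-based code
    return lbp_s, lbp_m

# Example: histograms of both codes form the feature vector of a block.
img = np.random.rand(64, 64)
s_code, m_code = sign_magnitude_lbp(img)
feature = np.concatenate([np.bincount(s_code.ravel(), minlength=256),
                          np.bincount(m_code.ravel(), minlength=256)])
```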

    Privacy-Preserving Outsourced Media Search

    This work proposes a privacy-protection framework for an important application called outsourced media search. This scenario involves a data owner, a client, and an untrusted server, where the owner outsources a search service to the server. Due to the lack of trust, the privacy of both the client and the owner should be protected. The framework relies on multimedia hashing and symmetric encryption. It requires the involved parties to participate in a privacy-enhancing protocol. Additional processing steps are carried out by the owner and the client: (i) before outsourcing low-level media features to the server, the owner one-way hashes them and partially encrypts each hash value; (ii) the client completes the similarity search by re-ranking the most similar candidates received from the server. One-way hashing and encryption add ambiguity to the data and make it difficult for the server to infer contents from database items and queries, so the privacy of both the owner and the client is enforced. The proposed framework realizes trade-offs among the strength of privacy enforcement, the quality of search, and complexity, because the information loss can be tuned during hashing and encryption. Extensive experiments demonstrate the effectiveness and the flexibility of the framework.
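
    The sketch below illustrates the three roles with toy stand-ins: the owner hashes features with random-hyperplane LSH, keeps a prefix of each hash in the clear for the server's coarse Hamming search, and hides the rest behind a keyed mask; the client unmasks the candidates and re-ranks them with the full hash. The XOR mask is only a placeholder for the symmetric encryption the paper relies on, and every name and parameter here is illustrative.

```python
# Toy owner/client/server sketch; the keyed XOR mask stands in for real symmetric encryption.
import hashlib
import numpy as np

rng = np.random.default_rng(0)
PROJ = rng.standard_normal((256, 64))          # LSH projection shared by owner and client

def lsh_hash(feature):
    """64-bit binary hash of a feature vector via random hyperplanes."""
    return (feature @ PROJ >= 0).astype(np.uint8)

def mask_bits(bits, key, item_id):
    """Keyed mask standing in for symmetric encryption of the hidden hash part."""
    stream = hashlib.sha256(key + int(item_id).to_bytes(4, "big")).digest()
    mask = np.unpackbits(np.frombuffer(stream, np.uint8))[:len(bits)]
    return bits ^ mask

# Owner: outsource only (clear prefix, masked suffix) per database item.
key = b"owner-client-secret"
db_features = rng.standard_normal((1000, 256))
db = []
for i, f in enumerate(db_features):
    h = lsh_hash(f)
    db.append((h[:32], mask_bits(h[32:], key, i)))   # the server never sees a full hash

# Server: coarse candidate search on the clear prefix only (Hamming distance).
query = db_features[42] + 0.05 * rng.standard_normal(256)
qh = lsh_hash(query)
dists = [np.count_nonzero(prefix ^ qh[:32]) for prefix, _ in db]
candidates = np.argsort(dists)[:20]

# Client: unmask the suffixes of the candidates and re-rank with the full hash.
best = min(candidates,
           key=lambda i: np.count_nonzero(np.concatenate(
               [db[i][0], mask_bits(db[i][1], key, i)]) ^ qh))
print("top match:", best)
```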

    A Quantized Johnson Lindenstrauss Lemma: The Finding of Buffon's Needle

    In 1733, Georges-Louis Leclerc, Comte de Buffon, in France, set the ground of geometric probability theory by defining an enlightening problem: what is the probability that a needle thrown randomly on a floor made of equispaced parallel strips lies on two of them? In this work, we show that the solution to this problem, and its generalization to $N$ dimensions, allows us to discover a quantized form of the Johnson-Lindenstrauss (JL) Lemma, i.e., one that combines a linear dimensionality reduction procedure with a uniform quantization of precision $\delta > 0$. In particular, given a finite set $\mathcal{S} \subset \mathbb{R}^N$ of $S$ points and a distortion level $\epsilon > 0$, as soon as $M > M_0 = O(\epsilon^{-2} \log S)$, we can (randomly) construct a mapping from $(\mathcal{S}, \ell_2)$ to $(\delta \mathbb{Z}^M, \ell_1)$ that approximately preserves the pairwise distances between the points of $\mathcal{S}$. Interestingly, compared to the common JL Lemma, the mapping is quasi-isometric, and we observe both an additive and a multiplicative distortion on the embedded distances. These two distortions, however, decay as $O(\sqrt{(\log S)/M})$ as $M$ increases. Moreover, for coarse quantization, i.e., for $\delta$ large compared to the set radius, the distortion is mainly additive, while for small $\delta$ we tend to a Lipschitz isometric embedding. Finally, we prove the existence of a "nearly" quasi-isometric embedding of $(\mathcal{S}, \ell_2)$ into $(\delta \mathbb{Z}^M, \ell_2)$. This one involves a non-linear distortion of the $\ell_2$-distance in $\mathcal{S}$ that vanishes for distant points in this set. Noticeably, the additive distortion in this case decays more slowly, as $O(\sqrt[4]{(\log S)/M})$.
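
    Below is a small numerical sketch of one natural instantiation of the kind of mapping described: a Gaussian projection followed by dithered uniform quantization of step $\delta$, i.e. $f(x) = \delta \lfloor (Ax + \xi)/\delta \rfloor$, whose values lie in $\delta \mathbb{Z}^M$ and whose $\ell_1$ distances are compared to the original $\ell_2$ distances. The dimensions and the quantization step are illustrative; the $\sqrt{\pi/2}/M$ factor is the standard normalization relating the $\ell_1$ norm of a Gaussian projection to the $\ell_2$ distance it estimates.

```python
# Dithered, uniformly quantized random projection f(x) = delta * floor((A x + xi) / delta);
# parameters below are illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(1)
N, M, delta = 512, 256, 0.5

A = rng.standard_normal((M, N))                 # Gaussian projection
xi = rng.uniform(0.0, delta, size=M)            # uniform dither

def embed(x):
    return delta * np.floor((A @ x + xi) / delta)   # values in delta * Z^M

# Compare pairwise l2 distances in R^N with (scaled) l1 distances between embeddings.
S = rng.standard_normal((20, N))
codes = np.array([embed(x) for x in S])
for i in range(3):
    for j in range(i + 1, 4):
        d2 = np.linalg.norm(S[i] - S[j])                            # ||x - y||_2
        d1 = np.sum(np.abs(codes[i] - codes[j])) * np.sqrt(np.pi / 2) / M
        print(f"pair ({i},{j}): l2 = {d2:.2f}, scaled quantized l1 = {d1:.2f}")
```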

    ID Photograph hashing: a global approach

    This thesis addresses the question of the authenticity of identity photographs, which are part of the documents required for controlled access. Since sophisticated means of reproduction are publicly available, new methods and techniques are needed to prevent tampering and unauthorized reproduction of the photograph. This thesis proposes a hashing method for the authentication of identity photographs that is robust to print-and-scan. The study also focuses on the effects of digitization at the hash level. The developed algorithm performs a dimension reduction based on independent component analysis (ICA). In the learning stage, the projection subspace is obtained by applying ICA and then reduced according to an original entropic selection strategy. In the extraction stage, the coefficients obtained after projecting the identity image onto the subspace are quantized and binarized to obtain the hash value. The study reveals the effects of scanning noise on the hash values of identity photographs and shows that the proposed method is robust to the print-and-scan attack. By focusing on robust hashing of a restricted class of images (identity photographs), the approach differs from classical approaches that address arbitrary images.
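
    A rough sketch of the two stages (learning an ICA subspace, then projecting, quantizing, and binarizing) is given below using scikit-learn's FastICA. The thesis's entropic selection strategy and key-dependent steps are only approximated here by a simple histogram-entropy score; the image size, the number of components, and the thresholds are assumptions.

```python
# Sketch of ICA-based hashing: learn a subspace, keep high-entropy components,
# then project, quantize, and binarize; an approximation of the described method.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
train = rng.random((200, 32 * 32))          # stand-in for vectorized ID photographs

# Learning stage: ICA subspace, then keep the components with the highest entropy.
ica = FastICA(n_components=64, random_state=0)
coeffs = ica.fit_transform(train)           # training coefficients, shape (200, 64)

def hist_entropy(v, bins=16):
    p, _ = np.histogram(v, bins=bins)
    p = p[p > 0] / p.sum()
    return -(p * np.log2(p)).sum()

scores = np.array([hist_entropy(coeffs[:, k]) for k in range(coeffs.shape[1])])
keep = np.argsort(scores)[-32:]             # reduced subspace: 32 components kept

# Extraction stage: project a test image, then quantize/binarize the coefficients.
def hash_photo(photo_vec):
    c = ica.transform(photo_vec.reshape(1, -1))[0][keep]
    return (c >= np.median(c)).astype(np.uint8)   # 32-bit binary hash

h1 = hash_photo(train[0])
h2 = hash_photo(train[0] + 0.01 * rng.standard_normal(32 * 32))  # print-and-scan-like noise
print("Hamming distance:", int(np.count_nonzero(h1 ^ h2)))
```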

    A hybrid scheme for authenticating scalable video codestreams


    An Overview on Privacy Preserving Biometrics

    The Internet has consolidated itself as a very powerful platform that has changed the way we communicate and do business. The number of Internet users is now about 1,552 million according to Internet World Stats. This large audience demands online commerce, e-government, knowledge sharing, social networks, online gaming, and more, all of which have grown exponentially over the past few years. The security of these transactions is very important given the amount of information that could be intercepted by an attacker. Within this context, authentication is one of the most important challenges in computer security. Indeed, the authentication step is often considered the weakest link in the security of electronic transactions. In general, the protection of message content is achieved using cryptographic protocols that are well known and established. The familiar ID/password scheme is by far the most used authentication method; it is widely deployed despite its obvious lack of security. This is mainly due to its ease of implementation and its ergonomics: users are accustomed to this system, which enhances its acceptance and deployment. Many more sophisticated solutions exist in the state of the art to secure logical access control (one-time password tokens, certificates, etc.), but none of them are used by a large community of users because they lack simplicity of use (O'Gorman, 2003).