Image authentication using LBP-based perceptual image hashing
Feature extraction is a main step in all perceptual image hashing schemes: robust features lead to better perceptual robustness. Simplicity, discriminative power, computational efficiency, and robustness to illumination changes are distinguishing properties of Local Binary Pattern (LBP) features. In this paper, we investigate the use of local binary patterns for perceptual image hashing. For feature extraction, we propose to use both the sign and the magnitude information of local differences, so the algorithm combines gradient-based and LBP-based descriptors. To meet security needs, two secret keys are incorporated in the feature extraction and hash generation steps. The performance of the proposed hashing method is evaluated on an important application of perceptual image hashing: image authentication. Experiments show that the method has acceptable robustness against content-preserving manipulations. Moreover, the proposed method can localize the tampered area, which is not possible in all hashing schemes.
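The sign/magnitude split of local differences described above can be sketched as follows. This is a minimal, hypothetical illustration in the spirit of completed-LBP descriptors; the paper's exact descriptor, gradient combination, and secret-key steps are not reproduced here.

```python
# Sketch of sign/magnitude local binary patterns over a 3x3
# neighbourhood. Helper names and the thresholding rule are
# illustrative assumptions, not the paper's exact method.

def lbp_sign_magnitude(img, y, x, threshold):
    """Return (sign_code, magnitude_code) for the 3x3 neighbourhood
    centred at (y, x). `threshold` is a global magnitude threshold,
    e.g. the mean absolute local difference over the image."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    centre = img[y][x]
    sign_code = 0
    mag_code = 0
    for bit, (dy, dx) in enumerate(offsets):
        diff = img[y + dy][x + dx] - centre
        if diff >= 0:                      # sign of the local difference
            sign_code |= 1 << bit
        if abs(diff) >= threshold:         # magnitude of the local difference
            mag_code |= 1 << bit
    return sign_code, mag_code

img = [[10, 10, 10],
       [10,  9, 12],
       [10, 10, 10]]
codes = lbp_sign_magnitude(img, 1, 1, 2)   # -> (255, 8)
```

Concatenating histograms of the two codes over the image gives a feature vector carrying both information channels the abstract refers to.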
Privacy-Preserving Outsourced Media Search
This work proposes a privacy-protection framework for an important application called outsourced media search. This scenario involves a data owner, a client, and an untrusted server, where the owner outsources a search service to the server. Due to the lack of trust, the privacy of both the client and the owner should be protected. The framework relies on multimedia hashing and symmetric encryption. It requires the involved parties to participate in a privacy-enhancing protocol. Additional processing steps are carried out by the owner and the client: (i) before outsourcing low-level media features to the server, the owner one-way hashes them and partially encrypts each hash value; (ii) the client completes the similarity search by re-ranking the most similar candidates received from the server. One-way hashing and encryption add ambiguity to the data and make it difficult for the server to infer content from database items and queries, so the privacy of both the owner and the client is enforced. The proposed framework realizes trade-offs among strength of privacy enforcement, quality of search, and complexity, because the information loss can be tuned during hashing and encryption. Extensive experiments demonstrate the effectiveness and the flexibility of the framework.
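The owner-side steps (i) above can be sketched as follows: a one-way binary hash of a feature vector via random projections, followed by partial encryption of the hash bits. All parameter names and the choice of keystream are illustrative assumptions; the paper's actual hashing and encryption schemes may differ.

```python
# Minimal sketch, assuming (a) sign-of-random-projection hashing and
# (b) partial encryption by XOR-ing a keyed keystream onto a subset of
# hash bits. Both choices are stand-ins, not the framework's own design.
import hashlib
import random

def binary_hash(features, n_bits, seed):
    """One-way binary hash: sign bits of Gaussian random projections.
    `seed` fixes the projection matrix shared by owner and client."""
    rng = random.Random(seed)
    bits = []
    for _ in range(n_bits):
        proj = [rng.gauss(0, 1) for _ in features]
        dot = sum(p * f for p, f in zip(proj, features))
        bits.append(1 if dot >= 0 else 0)
    return bits

def partially_encrypt(bits, key, n_enc):
    """Encrypt only the first n_enc bits with a keystream derived from
    `key`; the remaining bits stay in the clear so the server can still
    shortlist candidates by Hamming distance."""
    stream = hashlib.sha256(key).digest()
    out = list(bits)
    for i in range(n_enc):
        out[i] ^= (stream[i // 8] >> (i % 8)) & 1
    return out

h = binary_hash([0.2, -1.3, 0.7, 0.5], n_bits=16, seed=42)
c = partially_encrypt(h, b"owner-secret", n_enc=8)
```

Tuning `n_bits` (information loss in hashing) and `n_enc` (fraction of bits hidden from the server) corresponds to the trade-off among privacy, search quality, and complexity that the abstract describes.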
Fast embedding for image classification & retrieval and its application to the hostel industry
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London. Content-based image classification and retrieval are the automatic processes of taking an unseen input image and extracting the features that represent it. For the classification task, this mathematically measured input is then categorized according to criteria established on the server, and the output is returned as a result. For the retrieval task, on the other hand, the extracted features of an unseen query image are sent to the server to search for the images most visually similar to the given image, and these images are retrieved as a result. Although image features can be represented by classical descriptors, artificial-intelligence-based features, Convolutional Neural Network (CNN) features to be precise, have become powerful tools in the field. Nonetheless, high-dimensional CNN features remain a challenge, in particular for applications on mobile or Internet of Things devices. Therefore, in this thesis, several fast embeddings are explored and proposed to overcome the constraints of low memory, bandwidth, and power. Furthermore, the first hostel image database is created, comprising three datasets: a hostel image dataset containing 13,908 interior and exterior images of hostels across the world, and the Hostels-900 and Hostels-2K datasets containing 972 and 2,380 images, respectively, of 20 London hostel buildings. The results demonstrate that the proposed fast embeddings, such as the application of the GHM-Rand operator, the GHM-Fix operator, and binary feature vectors, are able to outperform or give competitive results against state-of-the-art methods with far less computational resource. Additionally, the findings from a ten-year literature review of CBIR studies in the tourism industry depict the relevant research activities of the past decade, which are beneficial not only to the hostel industry and tourism sector but also to the computer science and engineering research communities for potential real-life applications of existing and developing technologies in the field.
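The memory and bandwidth saving behind binary feature vectors can be illustrated with a small sketch: sign-binarise a high-dimensional descriptor (e.g. a CNN feature) into one bit per dimension and rank candidates by Hamming distance instead of Euclidean distance. This is a generic illustration; the thesis's GHM-Rand and GHM-Fix operators are not reproduced here.

```python
# Illustrative sketch only: sign binarisation plus Hamming ranking,
# a common scheme for compact retrieval on constrained devices.

def binarise(vec):
    """Pack the sign bits of `vec` into an int (1 bit per dimension)."""
    code = 0
    for i, v in enumerate(vec):
        if v >= 0:
            code |= 1 << i
    return code

def hamming(a, b):
    """Number of differing bits between two binary codes."""
    return bin(a ^ b).count("1")

q  = binarise([0.9, -0.1, 0.4, -2.0])      # query descriptor
db = [binarise([1.0, -0.2, 0.3, -1.5]),    # near duplicate
      binarise([-0.5, 0.8, -0.7, 1.1])]    # unrelated image
ranked = sorted(range(len(db)), key=lambda i: hamming(q, db[i]))
# ranked -> [0, 1]: the near duplicate comes first
```

A 4,096-dimensional float descriptor (16 KB) shrinks to 512 bytes this way, and Hamming distance reduces to XOR plus popcount, which is why such embeddings suit mobile and IoT settings.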
A Quantized Johnson Lindenstrauss Lemma: The Finding of Buffon's Needle
In 1733, Georges-Louis Leclerc, Comte de Buffon in France, set the ground of geometric probability theory by defining an enlightening problem: What is the probability that a needle thrown randomly on a ground made of equispaced parallel strips lies on two of them? In this work, we show that the solution to this problem, and its generalization to N dimensions, allows us to discover a quantized form of the Johnson-Lindenstrauss (JL) Lemma, i.e., one that combines a linear dimensionality reduction procedure with a uniform quantization of precision δ > 0. In particular, given a finite set S ⊂ R^N of S points and a distortion level ε > 0, as soon as M ≥ M_0 = O(ε^{-2} log S), we can (randomly) construct a mapping from (S, ℓ_2) to ((δZ)^M, ℓ_1) that approximately preserves the pairwise distances between the points of S. Interestingly, compared to the common JL Lemma, the mapping is quasi-isometric and we observe both an additive and a multiplicative distortion on the embedded distances. These two distortions, however, decay as O(√(log S / M)) when M increases. Moreover, for coarse quantization, i.e., for δ high compared to the set radius, the distortion is mainly additive, while for small δ we tend to a Lipschitz isometric embedding. Finally, we prove the existence of a "nearly" quasi-isometric embedding of (S, ℓ_2) into ((δZ)^M, ℓ_2). This one involves a non-linear distortion of the ℓ_2-distance in S that vanishes for distant points in this set. Noticeably, the additive distortion in this case is slower, and decays as O((log S / M)^{1/4}). Comment: 27 pages, 2 figures (note: this version corrects a few typos in the abstract).
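A numerical sketch of the kind of mapping the lemma analyses: a Gaussian random projection followed by dithered uniform quantization of precision δ, i.e. ψ_i(x) = δ⌊(⟨g_i, x⟩ + u_i)/δ⌋. Constants and the empirical check below are illustrative; the lemma's exact statement and constants are in the paper.

```python
# Sketch, assuming Gaussian projections g_i and uniform dither
# u_i ~ U[0, delta), both fixed by `seed` so that every point is
# embedded by the same random map.
import math
import random

def quantised_jl_map(x, m, delta, seed=0):
    """psi(x)_i = delta * floor((<g_i, x> + u_i) / delta), i = 1..m."""
    rng = random.Random(seed)
    out = []
    for _ in range(m):
        proj = sum(rng.gauss(0, 1) * xj for xj in x)
        dither = rng.uniform(0, delta)
        out.append(delta * math.floor((proj + dither) / delta))
    return out

# Per-coordinate l1 difference of two embedded points concentrates
# around sqrt(2/pi) * ||x - y||_2, up to the quantization distortions
# discussed above. Here ||x - y||_2 = sqrt(2), so about 1.13.
a = quantised_jl_map([1.0, 0.0], m=3000, delta=0.1)
b = quantised_jl_map([0.0, 1.0], m=3000, delta=0.1)
est = sum(abs(u - v) for u, v in zip(a, b)) / len(a)
```

Raising `delta` relative to the set radius makes the additive distortion dominate, matching the coarse-quantization regime described in the abstract.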
ID Photograph hashing : a global approach
This thesis addresses the question of the authenticity of identity photographs, part of the documents required in access control. Since sophisticated means of reproduction are publicly available, new methods and techniques should prevent tampering and unauthorized reproduction of the photograph. This thesis proposes a hashing method for the authentication of identity photographs that is robust to print-and-scan. The study also focuses on the effects of digitization at the hash level. The developed algorithm performs a dimension reduction based on independent component analysis (ICA). In the learning stage, the subspace projection is obtained by applying ICA and then reduced according to an original entropic selection strategy. In the extraction stage, the coefficients obtained after projecting the identity image onto the subspace are quantized and binarized to obtain the hash value. The study reveals the effects of scanning noise on the hash values of identity photographs and shows that the proposed method is robust to the print-and-scan attack. This approach, focusing on robust hashing of a restricted class of images (identity photographs), differs from classical approaches that address any image.
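The extraction stage described above can be sketched as follows. The ICA learning stage and the entropic selection are omitted; a hand-fixed basis stands in for the learned subspace, and simple sign binarization stands in for the thesis's quantize-and-binarize step.

```python
# Minimal sketch of the extraction stage, under the stated
# simplifications; `basis` would be the ICA-learned, entropically
# selected subspace in the actual method.

def extract_hash(image_vec, basis, threshold=0.0):
    """Project `image_vec` onto each basis component, then binarise
    the coefficient against `threshold` to get one hash bit."""
    bits = []
    for component in basis:
        coeff = sum(c * v for c, v in zip(component, image_vec))
        bits.append(1 if coeff >= threshold else 0)
    return bits

# Toy 2-D example with a hand-fixed orthogonal basis.
basis = [[1, 0], [0, -1]]
h = extract_hash([0.5, 0.7], basis)   # -> [1, 0]
```

Robustness to print-and-scan then amounts to the projection coefficients staying on the same side of the threshold under scanning noise, so the hash bits are unchanged.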
An Overview on Privacy Preserving Biometrics
The Internet has consolidated itself as a very powerful platform that has changed the way we communicate and do business. Nowadays, the number of users navigating the Internet is about 1,552 million according to Internet World Stats. This large audience demands online commerce, e-government, knowledge sharing, social networks, online gaming, and so on, all of which have grown exponentially over the past few years. The security of these transactions is very important considering the amount of information that could be intercepted by an attacker. Within this context, authentication is one of the most important challenges in computer security. Indeed, the authentication step is often considered the weakest link in the security of electronic transactions. In general, the protection of message content is achieved by using cryptographic protocols that are well known and established. The familiar ID/password is by far the most used authentication method; it is widely deployed despite its obvious lack of security. This is mainly due to its ease of implementation and its ergonomic quality: users are accustomed to this system, which enhances its acceptance and deployment. Many more sophisticated solutions exist in the state of the art to secure logical access control (one-time password tokens, certificates, and so on), but none of them are used by a large community of users, owing to a lack of usability (O'Gorman, 2003).