    TTP-free Asymmetric Fingerprinting based on Client Side Embedding

    In this paper, we propose a solution for implementing an asymmetric fingerprinting protocol within a client-side embedding distribution framework. The scheme is based on two novel client-side embedding techniques that are able to reliably transmit a binary fingerprint. The first relies on standard spread-spectrum-like client-side embedding, while the second is based on an innovative client-side informed embedding technique. The proposed techniques enable secure distribution of personalized decryption keys containing the Buyer's fingerprint by means of existing asymmetric protocols, without using a trusted third party. Simulation results show that the fingerprint can be reliably recovered using either non-blind decoding with standard embedding or blind decoding with informed embedding, and that in both cases it is robust to common attacks. To the best of our knowledge, the proposed scheme is the first solution addressing asymmetric fingerprinting within a client-side framework, representing a valid answer to both the customer's rights and scalability issues in multimedia content distribution.
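
    The abstract does not detail the embedding itself; as a rough illustration of the standard spread-spectrum technique it builds on, the sketch below embeds a binary fingerprint by adding one pseudo-random carrier per bit and recovers it by non-blind correlation decoding. The function names, carrier construction, and strength parameter are illustrative assumptions, not the paper's actual scheme.

```python
import numpy as np

def embed(host, bits, strength=0.5, key=0):
    """Spread-spectrum embedding: each fingerprint bit modulates a
    pseudo-random carrier that is added to the host signal."""
    rng = np.random.default_rng(key)          # key doubles as the secret seed
    carriers = rng.standard_normal((bits.size, host.size)) / np.sqrt(host.size)
    symbols = 2.0 * bits - 1.0                # map bits {0,1} -> {-1,+1}
    return host + strength * symbols @ carriers

def decode(marked, host, num_bits, key=0):
    """Non-blind decoding: subtract the host, correlate with each
    carrier, and take the sign of the correlation."""
    rng = np.random.default_rng(key)
    carriers = rng.standard_normal((num_bits, host.size)) / np.sqrt(host.size)
    return (carriers @ (marked - host) > 0).astype(int)

host = np.random.default_rng(1).standard_normal(4096)
bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])
assert np.array_equal(decode(embed(host, bits), host, bits.size), bits)
```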

    Security of Lattice-Based Data Hiding Against the Watermarked-Only Attack

    Watermarking security

    This chapter deals with applications where watermarking is a security primitive included in a larger system protecting the value of multimedia content. In this context, there may exist dishonest users, in the sequel called attackers, willing to read or overwrite hidden messages, or simply to remove the watermark signal. The goal of this chapter is to play the role of the attacker: we analyze means of deducing information about the watermarking technique that will later ease the forgery of attacked copies. The chapter first proposes a topology of the threats in Section 6.1, introducing three different concepts: robustness, worst-case attacks, and security. The previous chapter has already discussed watermark robustness. We focus on worst-case attacks in Section 6.2, on how to measure watermarking security in Section 6.3, and on the classical tools used to break a watermarking scheme in Section 6.4. This tour of watermarking security concludes with a summary of what we know and still do not know about it (Section 6.5) and a review of oracle attacks (Section 6.6). Last, Section 6.7 deals with protocol attacks, a notion which underlines the illusion of security that a watermarking primitive might give when not properly used in some applications.

    Coding with side information

    Source coding and channel coding are two important problems in communications. Although side information exists in many everyday scenarios, its effect is not taken into account in conventional setups. In this thesis, we focus on the practical design of two interesting coding problems with side information: Wyner-Ziv coding (WZC; source coding with side information at the decoder) and Gelfand-Pinsker coding (GPC; channel coding with side information at the encoder). For WZC, we split the design problem into two cases: when the distortion of the reconstructed source is zero, and when it is not. We review how the first case, commonly called Slepian-Wolf coding (SWC), can be implemented using conventional channel coding. We then detail SWC design using low-density parity-check (LDPC) codes. To facilitate SWC design, we justify a necessary requirement that SWC performance should be independent of the input source. We show that a sufficient condition for this requirement is that the hypothetical channel between the source and the side information satisfies a symmetry condition dubbed dual symmetry. Furthermore, under that dual symmetry condition, the SWC design problem can simply be treated as an LDPC code design problem over the hypothetical channel. When the distortion of the reconstructed source is non-zero, we propose a practical WZC paradigm called Slepian-Wolf coded quantization (SWCQ), which combines SWC and nested lattice quantization. We point out an interesting analogy between SWCQ and entropy-coded quantization in classic source coding. Furthermore, a practical SWCQ scheme using 1-D nested lattice quantization and LDPC codes is implemented. For GPC, since the actual design procedure relies on the precise setting of the problem, we choose to investigate GPC design in the form of a digital watermarking problem, as digital watermarking is the precise dual of WZC. We then introduce an enhanced version of the well-known spread-spectrum watermarking technique. Two applications related to digital watermarking are presented.
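
    The idea that lossless source coding with decoder side information reduces to channel coding can be made concrete. The following toy sketch (my illustration, not the thesis's LDPC design) compresses a 7-bit block to its 3-bit syndrome under a (7,4) Hamming code; the decoder corrects its side information, assumed to differ in at most one bit, toward the transmitted coset.

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code. Column j is the binary
# representation of j, so a single-bit error's syndrome names its position.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def sw_encode(x):
    """Slepian-Wolf encoder: send only the syndrome (7 bits -> 3 bits)."""
    return H @ x % 2

def sw_decode(s, y):
    """Decoder knows y, which differs from x in at most one bit. The
    error pattern e = x XOR y has syndrome (H@y) XOR s, which equals
    the 1-based position of the flipped bit (0 means no flip)."""
    se = (H @ y + s) % 2
    pos = 4 * se[0] + 2 * se[1] + se[2]
    e = np.zeros(7, dtype=int)
    if pos:
        e[pos - 1] = 1
    return (y + e) % 2

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 7)                 # source block
y = x.copy(); y[rng.integers(7)] ^= 1     # correlated side information
assert np.array_equal(sw_decode(sw_encode(x), y), x)
```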

    Perfectly Secure Steganography: Capacity, Error Exponents, and Code Constructions

    An analysis of steganographic systems subject to the following perfect undetectability condition is presented in this paper: following embedding of the message into the covertext, the resulting stegotext is required to have exactly the same probability distribution as the covertext. Then no statistical test can reliably detect the presence of the hidden message. We refer to such steganographic schemes as perfectly secure. A few such schemes have been proposed in the recent literature, but they have vanishing rate. We prove that communication performance can potentially be vastly improved; specifically, our basic setup assumes independently and identically distributed (i.i.d.) covertext, and we construct perfectly secure steganographic codes from public watermarking codes using binning methods and randomized permutations of the code. The permutation is a secret key shared between encoder and decoder. We derive (positive) capacity and random-coding exponents for perfectly secure steganographic systems. The error exponents provide estimates of the code length required to achieve a target low error probability. We address the potential loss in communication performance due to the perfect-security requirement. This loss is the same as the loss obtained under a weaker order-1 steganographic requirement that would just require matching of first-order marginals of the covertext and stegotext distributions. Furthermore, no loss occurs if the covertext distribution is uniform and the distortion metric is cyclically symmetric; steganographic capacity is then achieved by randomized linear codes. Our framework may also be useful for developing computationally secure steganographic systems that have near-optimal communication performance. (To appear in IEEE Transactions on Information Theory, June 2008.)
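
    The uniform-covertext case the abstract singles out can be illustrated with a tiny linear-code construction (my sketch, not the paper's codes): Hamming matrix embedding hides 3 message bits in 7 uniform cover bits by flipping at most one of them, and a keyed permutation plays the role of the shared secret key. If the embedded message is itself uniform (e.g., compressed or encrypted beforehand), the stegotext is exactly uniform like the covertext.

```python
import numpy as np

H = np.array([[0, 0, 0, 1, 1, 1, 1],      # (7,4) Hamming parity-check
              [0, 1, 1, 0, 0, 1, 1],      # matrix; column j encodes j
              [1, 0, 1, 0, 1, 0, 1]])

def hide(cover, msg, key):
    """Matrix embedding: permute with the secret key, then flip at most
    one bit so the block's syndrome equals the 3-bit message."""
    perm = np.random.default_rng(key).permutation(cover.size)
    x = cover[perm].copy()
    se = (H @ x + msg) % 2                # syndrome mismatch to cancel
    pos = 4 * se[0] + 2 * se[1] + se[2]   # column of H equal to se
    if pos:
        x[pos - 1] ^= 1
    stego = np.empty_like(cover)
    stego[perm] = x                       # undo the permutation
    return stego

def reveal(stego, key):
    perm = np.random.default_rng(key).permutation(stego.size)
    return H @ stego[perm] % 2            # message = syndrome

rng = np.random.default_rng(7)
cover = rng.integers(0, 2, 7)             # uniform i.i.d. covertext
msg = rng.integers(0, 2, 3)               # uniform (e.g., encrypted) message
assert np.array_equal(reveal(hide(cover, msg, key=99), key=99), msg)
```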

    Systematic hybrid analog/digital signal coding

    Ph.D. thesis by Richard J. Barron, Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2000. This thesis develops low-latency, low-complexity signal processing solutions for systematic source coding, or source coding with side information at the decoder. We consider an analog source signal transmitted through a hybrid channel that is the composition of two channels: a noisy analog channel through which the source is sent unprocessed, and a secondary rate-constrained digital channel; the source is processed prior to transmission through the digital channel. The challenge is to design a digital encoder and decoder that provide a minimum-distortion reconstruction of the source at the decoder, which observes both the analog and digital channel outputs. The methods described in this thesis are important to a wide array of applications. For example, in the case of in-band on-channel (IBOC) digital audio broadcast (DAB), an existing noisy analog communications infrastructure may be augmented by a low-bandwidth digital side channel for improved fidelity, while compatibility with existing analog receivers is preserved. Another application is a source coding scheme that devotes a fraction of the available bandwidth to the analog source and the rest to a digital representation. This scheme is applicable in a wireless communications environment (or any environment with unknown SNR), where analog transmission has the advantage of a gentle roll-off of fidelity with SNR.

    A very general paradigm for low-latency, low-complexity source coding is composed of three basic cascaded elements: 1) a space rotation, or transformation, 2) quantization, and 3) lossless bitstream coding. The paradigm has been applied with great success to conventional source coding, and it applies equally well to systematic source coding. Focusing on the case of a Gaussian source, a Gaussian channel, and mean-squared distortion, we determine optimal or near-optimal components for each of the three elements, each of which has analogous components in conventional source coding. The space rotation can take many forms, such as linear block transforms, lapped transforms, or subband decomposition, for all of which we derive conditions of optimality. For a very general case we develop algorithms for the design of locally optimal quantizers. For the Gaussian case, we describe a low-complexity scalar quantizer, the nested lattice scalar quantizer, whose performance is very near that of the optimal systematic scalar quantizer. Analogous to entropy coding in conventional source coding, Slepian-Wolf coding is shown to be an effective lossless bitstream coding stage for systematic source coding.
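
    The nested lattice scalar quantizer admits a very short sketch. In the toy version below (an illustration under assumed parameters, not the thesis design), the encoder quantizes finely but transmits only the fine cell index modulo N; the decoder resolves the remaining ambiguity by picking the coset point nearest to the noisy analog observation.

```python
import numpy as np

DELTA = 0.1     # fine quantizer step (assumed)
N = 8           # cosets; encoder spends log2(N) = 3 bits per sample

def nlsq_encode(x):
    """Quantize finely, keep only the coset index; side information at
    the decoder will resolve which coset member was meant."""
    return np.round(x / DELTA).astype(int) % N

def nlsq_decode(idx, y):
    """Among the reconstruction points DELTA*(idx + N*k), return the one
    closest to the analog observation y."""
    k = np.round((y / DELTA - idx) / N)
    return DELTA * (idx + N * k)

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)                 # analog source
y = x + 0.05 * rng.standard_normal(10_000)      # noisy analog channel output
xhat = nlsq_decode(nlsq_encode(x), y)
# Decoding succeeds while |y - x| stays well below N*DELTA/2 = 0.4
assert np.max(np.abs(xhat - x)) <= DELTA / 2 + 1e-9
```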

    Contribution des filtres LPTV et des techniques d'interpolation au tatouage numérique (Contribution of LPTV filters and interpolation techniques to digital watermarking)

    Periodic Clock Changes (PCC) and Linear Periodically Time-Varying (LPTV) filters have previously been applied to multi-user telecommunications in the Signal and Communications group of the IRIT laboratory. In this thesis, we show that in any spread-spectrum watermarking scheme they can be substituted for modulation by a pseudo-random sequence. The additional steps of optimal decoding, resynchronization, interference pre-cancellation, and spread-transform quantization also apply to PCCs and LPTV filters. For white Gaussian stationary signals, these techniques perform identically to classical Direct-Sequence (DS) spreading. However, we show that for locally correlated signals, notably the luminance of a natural image, the periodicity of PCCs and LPTV filters combined with a Peano-Hilbert image scan leads to better performance. LPTV filters are, moreover, a more powerful tool than simple DS modulation: we use them to perform spectral masking simultaneously with spreading, as well as rejection of image interference in the spectral domain. The latter technique achieves very good decoding performance.

    The second axis of this thesis is the study of the links between interpolation and digital watermarking. We first stress the role of interpolation in attacks on watermark robustness. We then construct watermarking techniques that benefit from the perceptual properties of interpolation. The first consists of perceptual masks built from the interpolation noise. In the second, an informed watermarking scheme is built around interpolation. This algorithm, which can be related to random binning techniques, uses original insertion and decoding rules, including an intrinsic perceptual mask. Besides these good perceptual properties, it exhibits host-interference rejection and robustness to various attacks such as valumetric transforms. Its security level is assessed with practical attack algorithms.
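
    The Peano-Hilbert scan mentioned above turns the 2-D local correlation of image luminance into 1-D local correlation, which is what the periodic structures exploit. Below is a standard Hilbert-curve indexing routine (a generic textbook version, not the thesis code) used to flatten a square image along the curve.

```python
import numpy as np

def d2xy(n, d):
    """Map step index d along the Hilbert curve to (x, y) coordinates on
    an n x n grid (n a power of two); classic bit-twiddling algorithm."""
    x = y = 0
    s, t = 1, d
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                       # rotate the quadrant if needed
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x, y = x + s * rx, y + s * ry
        t //= 4
        s *= 2
    return x, y

def hilbert_scan(img):
    """Flatten a square image so that samples adjacent in the 1-D stream
    are also neighbours in the 2-D image, preserving local correlation."""
    n = img.shape[0]                      # assumes n x n with n = 2**k
    return np.array([img[y, x] for x, y in (d2xy(n, d) for d in range(n * n))])

luma = np.arange(64, dtype=float).reshape(8, 8)   # stand-in luminance block
stream = hilbert_scan(luma)               # 1-D signal fed to the spreader
```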

    Recent Advances in Signal Processing

    Signal processing is a critical task in the majority of new technological inventions and in a wide variety of applications across science and engineering. Classical signal processing techniques have largely worked with mathematical models that are linear, local, stationary, and Gaussian, favoring closed-form tractability over real-world accuracy; these constraints were imposed by the lack of powerful computing tools. During the last few decades, signal processing theories, developments, and applications have matured rapidly and now include tools from many areas of mathematics, computer science, physics, and engineering. This book is targeted primarily at students and researchers who want to be exposed to a wide variety of signal processing techniques and algorithms. It includes 27 chapters that can be grouped into five areas depending on the application at hand: image processing, speech processing, communication systems, time-series analysis, and educational packages. The book has the advantage of providing a collection of applications that are completely independent and self-contained; the interested reader can therefore choose any chapter and skip to another without losing continuity.

    Systèmes de cryptocalculs, compilation et support d'exécution (Cryptocomputing systems, compilation and runtime support)

    Our approach in this thesis was to identify where fully homomorphic encryption (FHE) could be used in computer science and to build an experimental platform that allows us to test real-life algorithms running on homomorphically encrypted data. The first part of this thesis is dedicated to the state of the art. We first present homomorphic encryption schemes designed before 2008 and then move to the fully homomorphic encryption period. We describe several schemes of interest for this thesis and discuss FHE implementations. Finally, we present Yao's garbled circuits, since they can solve problems similar to those addressed by FHE, and briefly discuss functional encryption (FE). The second part of this thesis presents our contributions. We begin by explaining how FHE can be useful in various scenarios and provide practical use cases identified during the thesis. Then, we describe our approach to performing computations on encrypted data using FHE, and explain how, building on just homomorphic addition and multiplication, we developed a platform for executing a wide range of algorithms in the encrypted domain. We then detail our solution for performing private queries on an encrypted database using homomorphic encryption. In a final chapter, we present our experimental results.
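
    The claim that addition and multiplication suffice can be made concrete: every Boolean gate is a polynomial in its inputs, so any algorithm expressible as a circuit can be evaluated with just those two homomorphic operations. The sketch below shows only this gate algebra; enc/dec are identity placeholders standing in for a real FHE scheme, not an actual implementation.

```python
# Boolean logic from + and * alone, over bits encoded as integers 0/1.
def enc(bit): return bit                    # placeholder for FHE encryption
def dec(ct):  return ct                     # placeholder for FHE decryption

def he_and(a, b): return a * b              # AND(a, b) = a*b
def he_xor(a, b): return a + b - 2 * a * b  # XOR(a, b) = a + b - 2ab
def he_not(a):    return 1 - a              # NOT(a)    = 1 - a
def he_mux(s, a, b):                        # s ? a : b = s*a + (1-s)*b
    return s * a + (1 - s) * b

# Encrypted 1-bit equality test, evaluated without ever decrypting a, b.
a, b = enc(1), enc(1)
assert dec(he_not(he_xor(a, b))) == 1
```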