
    Authentication with Distortion Criteria

    In a variety of applications, there is a need to authenticate content that has experienced legitimate editing in addition to potential tampering attacks. We develop one formulation of this problem based on a strict notion of security, and characterize and interpret the associated information-theoretic performance limits. The results can be viewed as a natural generalization of classical approaches to traditional authentication. Additional insights into the structure of such systems and their behavior are obtained by further specializing the results to Bernoulli and Gaussian cases. The associated systems are shown to be substantially better in terms of performance and/or security than commonly advocated approaches based on data hiding and digital watermarking. Finally, the formulation is extended to obtain efficient layered authentication system constructions.
    Comment: 22 pages, 10 figures

    Perfectly Secure Steganography: Capacity, Error Exponents, and Code Constructions

    An analysis of steganographic systems subject to the following perfect undetectability condition is presented in this paper. Following embedding of the message into the covertext, the resulting stegotext is required to have exactly the same probability distribution as the covertext. Then no statistical test can reliably detect the presence of the hidden message. We refer to such steganographic schemes as perfectly secure. A few such schemes have been proposed in recent literature, but they have vanishing rate. We prove that communication performance can potentially be vastly improved; specifically, our basic setup assumes independently and identically distributed (i.i.d.) covertext, and we construct perfectly secure steganographic codes from public watermarking codes using binning methods and randomized permutations of the code. The permutation is a secret key shared between encoder and decoder. We derive (positive) capacity and random-coding exponents for perfectly-secure steganographic systems. The error exponents provide estimates of the code length required to achieve a target low error probability. We address the potential loss in communication performance due to the perfect-security requirement. This loss is the same as the loss obtained under a weaker order-1 steganographic requirement that would just require matching of first-order marginals of the covertext and stegotext distributions. Furthermore, no loss occurs if the covertext distribution is uniform and the distortion metric is cyclically symmetric; steganographic capacity is then achieved by randomized linear codes. Our framework may also be useful for developing computationally secure steganographic systems that have near-optimal communication performance.
    Comment: To appear in IEEE Trans. on Information Theory, June 2008; ignore Version 2 as the file was corrupted
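The special case noted at the end of the abstract, a uniform covertext with a cyclically symmetric distortion metric, can be illustrated with a minimal sketch (not the paper's binning construction). Over GF(2), XORing a uniform i.i.d. covertext with any key-dependent codeword is a bijection on the set of bit strings, so the stegotext distribution exactly matches the covertext distribution. The `embed` helper and bit layout below are illustrative assumptions.

```python
import itertools

def embed(cover_bits, message_bits, key_bits):
    # Toy "randomized linear code" over GF(2): mask the message with the
    # shared secret key, then XOR the resulting codeword into the covertext.
    codeword = [m ^ k for m, k in zip(message_bits, key_bits)]
    return [c ^ w for c, w in zip(cover_bits, codeword)]

n = 4
msg = [1, 0, 1, 1]
key = [0, 1, 1, 0]

# XOR with a fixed codeword permutes {0,1}^n, so a uniform covertext yields
# a uniform stegotext: no statistical test can detect that embedding occurred.
stegotexts = {tuple(embed(list(c), msg, key))
              for c in itertools.product([0, 1], repeat=n)}
assert len(stegotexts) == 2 ** n  # bijection: uniform in, uniform out
```

This sketch demonstrates only the undetectability property; reliable decoding against distortion constraints is what the paper's binning codes and secret permutations provide.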

    A roadside units positioning framework in the context of vehicle-to-infrastructure based on integrated AHP-entropy and group-VIKOR

    The positioning of roadside units (RSUs) in a vehicle-to-infrastructure (V2I) communication system may have an impact on network performance. Optimal RSU positioning is required to reduce cost and maintain quality of service. However, RSU positioning is considered a difficult task because numerous criteria, such as the cost of RSUs, the intersection area and communication strength, affect the positioning process and must be considered. Furthermore, the conflicts and trade-offs amongst these criteria, and the significance of each criterion, are reflected in the RSU positioning process. Towards this end, a four-stage methodology for a new RSU positioning framework using multi-criteria decision-making (MCDM) in the V2I communication system context was designed. In the first phase, real-time V2I hardware for data collection was developed; it consisted of multiple mobile nodes (in the cars) and RSUs, each connected via an nRF24L01+ PA/LNA transceiver module to a microcontroller. In the second phase, different testing scenarios were identified to acquire the required data from the V2I devices. These scenarios were evaluated against three evaluation attributes, and a decision matrix consisting of the scenarios as alternatives and their assessments per criterion was constructed. In the third phase, the alternatives were ranked using a hybrid of MCDM techniques, specifically the Analytic Hierarchy Process (AHP), Entropy and Vlsekriterijumska Optimizacija I Kompromisno Resenje (VIKOR). The rankings of the individual decision methods were aggregated into a final group ranking using the Borda voting approach. Finally, a validation process was carried out to ensure that the ranking was systematic and valid.
The results indicate the following: (1) the ranking of scenarios obtained from group VIKOR suggested that the second scenario, with four RSUs, a maximum distance of 200 meters between RSUs and an antenna height of two meters, is the best positioning scenario; and (2) in the objective validation, the study reported significant differences between the scores of the groups, indicating that the ranking results are valid. Finally, the integration of AHP, Entropy and VIKOR effectively solved the RSU positioning problem.
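The group-ranking step described above, aggregating the per-method rankings with Borda voting, can be sketched as follows. The scenario labels and the two input rankings are hypothetical examples, not the study's data.

```python
def borda_aggregate(rankings):
    """Combine several rank lists (best first) into one group ranking via
    Borda count: an alternative earns (n - position) points per list."""
    n = len(rankings[0])
    scores = {}
    for ranking in rankings:
        for pos, alt in enumerate(ranking):
            scores[alt] = scores.get(alt, 0) + (n - pos)
    # Highest total score first; Python's sort is stable on ties.
    return sorted(scores, key=lambda alt: -scores[alt])

# Hypothetical rankings of four RSU-positioning scenarios, e.g. one from an
# AHP-weighted VIKOR run and one from an Entropy-weighted VIKOR run.
r1 = ["S2", "S1", "S4", "S3"]
r2 = ["S2", "S4", "S1", "S3"]
print(borda_aggregate([r1, r2]))  # -> ['S2', 'S1', 'S4', 'S3']
```

S2 tops both input lists, so it tops the aggregate, mirroring how the second scenario emerged as the best positioning scenario in the study.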