
    Privacy-Constrained Remote Source Coding

    We consider the problem of revealing/sharing data in an efficient and secure way via a compact representation. The representation should ensure reliable reconstruction of the desired features/attributes while still preserving privacy of the secret parts of the data. The problem is formulated as remote lossy source coding with a privacy constraint, where the remote source consists of public and secret parts. Inner and outer bounds for the optimal tradeoff region of compression rate, distortion, and privacy leakage rate are given and shown to coincide for some special cases. When specializing the distortion measure to a logarithmic loss function, the resulting rate-distortion-leakage tradeoff for the case of identical side information forms an optimization problem which corresponds to the "secure" version of the so-called information bottleneck.
    Comment: 10 pages, 1 figure, to be presented at ISIT 201
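    Under logarithmic loss the distortion constraint becomes a conditional-entropy constraint, which is what links the tradeoff to the information bottleneck. Schematically (a hedged sketch of the shape of the problem; the paper's exact constraint set may differ), with X the observation, Y the public part, S the secret part, and W the compact representation:

```latex
% Log-loss distortion D on Y turns E[d(Y,\hat{Y})] \le D into
% H(Y \mid W) \le D, i.e. I(Y;W) \ge H(Y) - D, so the
% rate-distortion-leakage tradeoff reads, schematically,
\min_{p(w \mid x)} \; I(X;W)
\quad \text{s.t.} \quad I(Y;W) \ge H(Y) - D,
\qquad I(S;W) \le L .
```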

    Cellular Automata Based Image Authentication Scheme Using Extended Visual Cryptography

    Most Visual Cryptography-based image authentication schemes hide the share and authentication data in cover images by using an additional data hiding process. This process increases the computational cost of the schemes. Pixel expansion, meaningless shares and the use of a codebook are other challenges in these schemes. To overcome these issues, an authentication scheme is proposed in which no embedding into the cover images is performed and meaningful authentication shares are created using the watermark and cover images. This makes the scheme completely imperceptible. The watermark can be retrieved just by superimposing these authentication shares, thus reducing the computational complexity at the receiver's side. A cellular automaton is used to construct the master share, which provides self-construction ability to the shares. The meaningful authentication shares help in enhancing the security of the scheme, while size invariance saves transmission and storage cost. The scheme possesses the ability of tamper detection. Experimental results demonstrate the improved security and quality of the generated shares of the proposed scheme as compared to existing schemes.
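    Superimposing shares in such schemes amounts to a pixelwise OR of transparencies. As a point of reference only (the classical size-invariant random-grid split, not the paper's cellular-automata construction; function names and the 0 = white / 1 = black convention are illustrative assumptions):

```python
import random

def encrypt_random_grid(secret):
    # Split a binary image (0 = white, 1 = black) into two noise-like shares.
    # For a white secret pixel the shares agree; for a black pixel they differ,
    # so stacking (OR) always yields black exactly where the secret is black.
    share1 = [[random.randint(0, 1) for _ in row] for row in secret]
    share2 = [[s if p == 0 else 1 - s for p, s in zip(row, srow)]
              for row, srow in zip(secret, share1)]
    return share1, share2

def superimpose(a, b):
    # Stacking two transparencies: a pixel is black if either share is black.
    return [[x | y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

secret = [[0, 1], [1, 0]]
s1, s2 = encrypt_random_grid(secret)
stacked = superimpose(s1, s2)
```

    Each share alone is uniformly random noise; only the stack carries the watermark, which is what makes the decryption step computation-free.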

    Error Function Attack of chaos synchronization based encryption schemes

    Different chaos synchronization based encryption schemes are reviewed and compared from the practical point of view. As an efficient cryptanalysis tool for chaos encryption, a proposal based on the Error Function Attack is presented systematically and used to evaluate system security. We define a quantitative measure (Quality Factor) of the effective applicability of a chaos encryption scheme, which takes into account the security, the encryption speed, and the robustness against channel noise. A comparison is made of several encryption schemes, and it is found that a scheme based on one-way coupled chaotic map lattices performs outstandingly well, as judged by its Quality Factor.
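    The attack's core idea can be illustrated on a toy chaotic stream cipher: decrypt with a trial key and measure the average reconstruction error, which drops to zero only at the true key. The logistic-map cipher below is a hypothetical stand-in for the schemes surveyed, assuming one known plaintext segment:

```python
def logistic_stream(r, n, x0=0.3):
    # Keystream from the logistic map x <- r*x*(1-x); r plays the role of the key.
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

def encrypt(plaintext, key):
    # Additive masking with the chaotic keystream.
    return [p + k for p, k in zip(plaintext, logistic_stream(key, len(plaintext)))]

def error_function(trial_key, ciphertext, known_plaintext):
    # Average decryption error e(k'); minimized (here: exactly zero) at the true key,
    # and large for nearby wrong keys because chaotic trajectories diverge quickly.
    stream = logistic_stream(trial_key, len(ciphertext))
    return sum(abs(c - k - p) for c, k, p in
               zip(ciphertext, stream, known_plaintext)) / len(ciphertext)

plaintext = [0.1 * i for i in range(50)]
ciphertext = encrypt(plaintext, 3.91)
```

    Scanning `error_function` over the key space and taking the argmin is the attack; a scheme's resistance is judged by how flat that error landscape is away from the true key.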

    Improved Deep Hiding/Extraction Algorithm to Enhance the Payload Capacity and Security Level of Hidden Information

    Steganography algorithms have become a significant technique for preventing illegal users from obtaining secret data. In this paper, an improved deep hiding/extraction algorithm (IDHEA) is proposed to hide a secret message in colour images, enhancing the payload capacity and reducing the time complexity. A modified LSB method (MLSB), based on disseminating secret data randomly over a cover image, is proposed to replace a number of bits per byte (Nbpb), up to 4 bits, to increase payload capacity and make it difficult to access the hidden data. The number of levels of the IDHEA algorithm is specified randomly; each level uses a colour image, and from one level to the next the image size is expanded: the algorithm starts with a small cover image and increases the image size gradually or suddenly at the next level, according to an enlargement ratio. Lossless image compression based on the run-length encoding algorithm and Gzip is applied so that the data fit within the hiding capacity of the next level, and data encryption using the Advanced Encryption Standard (AES) algorithm is introduced at each level to enhance the security level. The effectiveness of the proposed IDHEA algorithm is measured at the last level, and the performance of the proposed hiding algorithm is checked by many statistical and visual measures in terms of embedding capacity and imperceptibility. Comparisons between the proposed approach and previous work show that the intended approach outperforms previously modified LSB algorithms, and that it withstands visual and statistical attacks, with excellent performance achieved using the detection error (PE). Furthermore, the results confirm that the stego-image retains high imperceptibility even at a large payload capacity, replacing up to twelve bits per pixel (12-bpp). Moreover, testing confirms that the proposed algorithm can embed secret data efficiently with good visual quality.
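    The MLSB substitution step itself is easy to picture: each cover byte gives up its Nbpb least-significant bits to the payload. The sketch below shows plain sequential substitution only; the paper's random dissemination, multi-level expansion, compression and AES stages are omitted, and the function names are my own:

```python
def mlsb_embed(cover, bits, nbpb=4):
    # Replace the nbpb least-significant bits of each cover byte with payload bits.
    mask = (1 << nbpb) - 1
    stego, i = [], 0
    for byte in cover:
        chunk = bits[i:i + nbpb]
        i += nbpb
        val = 0
        for b in chunk:
            val = (val << 1) | b
        val <<= nbpb - len(chunk)          # left-align a short final chunk
        stego.append((byte & ~mask & 0xFF) | val)
    return stego

def mlsb_extract(stego, nbits, nbpb=4):
    # Read the nbpb low bits of each stego byte back out, MSB-first per byte.
    bits = []
    for byte in stego:
        for shift in range(nbpb - 1, -1, -1):
            bits.append((byte >> shift) & 1)
    return bits[:nbits]

cover = [200, 17, 89, 254]
payload = [1, 0, 1, 1, 0, 0, 1, 0]
stego = mlsb_embed(cover, payload)
```

    With nbpb = 4 each byte changes by at most 15, which is why the distortion stays visually small while the capacity reaches 4 bits per channel byte (12 bpp for RGB).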

    IDENTITY PRIVACY AND PRESERVING DATA IN MULTI-ATTORNEY MANNER USING CLOUD

    Cloud computing provides an economical and efficient solution for sharing data among cloud users in a group. However, sharing data in a multi-attorney manner while preserving data and identity privacy from an untrusted cloud is still a challenging issue, due to frequent changes of membership in the group. In this paper, we propose a multi-attorney data sharing scheme for dynamic groups in the cloud. By combining group signature and dynamic broadcast encryption techniques, any cloud user can anonymously share data with others. In addition, we analyze the security of our scheme with rigorous proofs, and demonstrate the efficiency of our scheme in experiments.

    A multi-candidate electronic voting scheme with unlimited participants

    In this paper a new multi-candidate electronic voting scheme with unlimited participants is constructed. The main idea is to express a ballot so as to allow voting for up to k out of the m candidates, with an unlimited number of participants. The purpose of the vote is to select more than one winner among the m candidates. Our result is complementary to Sun Peiyong's scheme, in the sense that their scheme is not amenable to large-scale electronic voting due to a flaw in its ballot structure. In our scheme the vote is split and hidden, tallying is performed via Gödel encoding in decimal base without any trusted third party, and the result does not rely on any traditional cryptography or computationally intractable assumption. Thus the proposed scheme not only solves the problem of ballot structure, but also achieves security properties including perfect ballot secrecy, receipt-freeness, robustness, fairness and dispute-freeness.
    Comment: 6 pages
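    One common way to realize homomorphic tallying with a Gödel-style encoding is to assign each candidate a prime and multiply ballots together, so candidate totals fall out as prime exponents. This is a hedged illustration of that arithmetic only; the paper's decimal-base variant and its vote-splitting/hiding layers are not reproduced here:

```python
PRIMES = [2, 3, 5, 7, 11]  # one prime per candidate (illustrative)

def encode_ballot(choices):
    # Godel-style encoding: a ballot is the product of the chosen candidates' primes.
    ballot = 1
    for c in choices:
        ballot *= PRIMES[c]
    return ballot

def tally(ballots, num_candidates):
    # Multiplying ballots adds prime exponents, so factoring the combined
    # product recovers each candidate's vote count.
    combined = 1
    for b in ballots:
        combined *= b
    counts = []
    for p in PRIMES[:num_candidates]:
        n, c = combined, 0
        while n % p == 0:
            n //= p
            c += 1
        counts.append(c)
    return counts

# Three voters, up to k=2 choices each among m=3 candidates.
votes = [encode_ballot([0, 2]), encode_ballot([2]), encode_ballot([1, 2])]
```

    Because only the product of all ballots is ever factored, no single ballot needs to be opened, which is the property the splitting-and-hiding layer builds on.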

    Secret-key rates and privacy leakage in biometric systems

    In this thesis both the generation of secret keys from biometric data and the binding of secret keys to biometric data are investigated. These secret keys can be used to regulate access to sensitive data, services, and environments. In a biometric secrecy system a secret key is generated or chosen during an enrollment procedure in which biometric data are observed for the first time. This key is to be reconstructed after these biometric data are observed for the second time, when authentication is required. Since biometric measurements are typically noisy, reliable biometric secrecy systems also extract so-called helper data from the biometric observation at the time of enrollment. These helper data facilitate reliable reconstruction of the secret key in the authentication process. Since the helper data are assumed to be public, they should not contain information about the secret key; we say that the secrecy leakage should be negligible. Important parameters of biometric key-generation and key-binding systems include the size of the generated or chosen secret key and the information that the helper data contain (leak) about the biometric observation. This latter parameter is called privacy leakage. Ideally the privacy leakage should be small, to prevent the biometric data of an individual from being compromised. Moreover, the secret-key length (also characterized by the secret-key rate) should be large to minimize the probability that the secret key is guessed and unauthorized access is granted.
    The first part of this thesis mainly focuses on the fundamental trade-off between the secret-key rate and the privacy-leakage rate in biometric secret-generation and secret-binding systems. This trade-off is studied from an information-theoretical perspective for four biometric settings. The first setting is the classical secret-generation setting as proposed by Maurer [1993] and Ahlswede and Csiszár [1993]. For this setting the achievable secret-key vs. privacy-leakage rate region is determined in this thesis. In the second setting the secret key is not generated by the terminals, but independently chosen during enrollment (key binding). Also for this setting the region of achievable secret-key vs. privacy-leakage rate pairs is determined. In settings three and four zero-leakage systems are considered. In these systems the public message should contain only a negligible amount of information about both the secret key and the biometric enrollment sequence. To achieve this, a private key is needed, which can be observed only by the two terminals. Again both the secret-generation setting and the chosen-secret setting are considered. For these two cases the regions of achievable secret-key vs. private-key rate pairs are determined. For all four settings two notions of leakage are considered. Depending on whether one looks at secrecy and privacy leakage separately or in combination, unconditional or conditional privacy leakage is considered. Here unconditional leakage corresponds to the mutual information between the helper data and the biometric enrollment sequence, while the conditional leakage relates to the conditional version of this mutual information, given the secret.
    The second part of the thesis focuses on the privacy- and secrecy-leakage analysis of the fuzzy commitment scheme. Fuzzy commitment, proposed by Juels and Wattenberg [1999], is in fact a particular realization of a binary biometric secrecy system with a chosen secret key. In this scheme the helper data are constructed as a codeword from an error-correcting code, used to encode a chosen secret, masked with the biometric sequence that has been observed during enrollment. Since this scheme is not privacy preserving in the conditional privacy-leakage sense, the unconditional privacy-leakage case is investigated. Four cases of biometric sources are considered: memoryless and totally-symmetric biometric sources, memoryless and input-symmetric biometric sources, memoryless biometric sources, and stationary and ergodic biometric sources. For the first two cases the achievable rate-leakage regions are determined; in these cases the secrecy leakage rate need not be positive. For the other two cases only outer bounds on achievable rate-leakage regions are found. These bounds, moreover, are sharpened for fuzzy commitment based on systematic parity-check codes. Using the fundamental trade-offs found in the first part of this thesis, it is shown that fuzzy commitment is only optimal for memoryless totally-symmetric biometric sources, and only at the maximum secret-key rate. Moreover, it is demonstrated that for memoryless and stationary ergodic biometric sources which are not input-symmetric, the fuzzy commitment scheme leaks information on both the secret key and the biometric data.
    Biometric sequences have an often unknown statistical structure (model) that can be quite complex. The last part of this dissertation addresses the problem of finding the maximum a posteriori (MAP) model for a pair of observed biometric sequences and the problem of estimating the maximum secret-key rate from these sequences. A universal source coding procedure called the Context-Tree Weighting (CTW) method [1995] can be used to find this MAP model. In this thesis a procedure that determines the MAP model, based on the so-called beta-implementation of the CTW method, is proposed. Moreover, CTW methods are used to compress the biometric sequences and sequence pairs in order to estimate the mutual information between the sequences. However, CTW methods were primarily developed for compressing one-dimensional sources, while biometric data are often modeled as two-dimensional processes. Therefore it is proved here that the entropy of a stationary two-dimensional source can be expressed as a limit of a series of conditional entropies. This result is also extended to the conditional entropy of one two-dimensional source given another one. As a consequence, entropy and mutual information estimates can be obtained from CTW methods using properly-chosen templates. Using such techniques, estimates of the maximum secret-key rate for physical unclonable functions (PUFs) are determined from a data set of observed sequences. PUFs can be regarded as inanimate analogues of biometrics.
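    The fuzzy commitment construction analyzed in the second part is compact enough to sketch: the helper data are the XOR of a codeword encoding the chosen secret with the enrollment biometric, and a noisy re-measurement plus error correction recovers the secret. A minimal sketch, with a 3-fold repetition code standing in for the error-correcting code (the thesis itself does not fix a particular code):

```python
import random

def rep_encode(bits, n=3):
    # Toy error-correcting code: repeat each secret bit n times.
    return [b for b in bits for _ in range(n)]

def rep_decode(bits, n=3):
    # Majority vote inside each block of n corrects up to (n-1)//2 bit flips.
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

def enroll(secret_bits, biometric):
    # Helper data = codeword XOR enrollment biometric (the "commitment").
    return [c ^ b for c, b in zip(rep_encode(secret_bits), biometric)]

def authenticate(helper, biometric_noisy):
    # XOR with a noisy re-measurement leaves codeword + measurement noise,
    # which the decoder then corrects to recover the chosen secret.
    return rep_decode([h ^ b for h, b in zip(helper, biometric_noisy)])

secret = [1, 0, 1]
biometric = [random.randint(0, 1) for _ in range(9)]
helper = enroll(secret, biometric)
noisy = list(biometric)
noisy[0] ^= 1  # one measurement error, within the code's correction radius
```

    The privacy question studied in the thesis is exactly how much this public `helper` list reveals about `biometric`: for a biased (non-symmetric) biometric source the XOR mask does not fully hide it, which is the leakage the unconditional analysis quantifies.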

    Fully leakage-resilient signatures revisited: Graceful degradation, noisy leakage, and construction in the bounded-retrieval model

    We construct new leakage-resilient signature schemes. Our schemes remain unforgeable against an adversary leaking arbitrary (yet bounded) information on the entire state of the signer (sometimes known as fully leakage resilience), including the random coin tosses of the signing algorithm. The main feature of our constructions is that they offer a graceful degradation of security in situations where standard existential unforgeability is impossible.