
    Securing Cloud Storage by Transparent Biometric Cryptography

    With the capability of storing huge volumes of data over the Internet, cloud storage has become a popular and desirable service for individuals and enterprises. Security, nevertheless, has been the subject of intense debate within the cloud community. Significant attacks can take place, the most common being the guessing of (poor) passwords. Given the weaknesses of such verification credentials, malicious attacks have occurred across a variety of well-known storage services (e.g. Dropbox and Google Drive), resulting in the loss of privacy and confidentiality of files. Whilst today's third-party cryptographic applications can independently encrypt data, they arguably place a significant burden upon the user, who must manually cipher/decipher each file and administer numerous keys in addition to the login password. The field of biometric cryptography applies biometric modalities within cryptography to produce robust bio-crypto keys that do not have to be remembered. There are, nonetheless, specific flaws associated with the security of the established bio-crypto key and with its usability. Users currently have to present their biometric modalities intrusively each time a file needs to be encrypted/decrypted, making everyday use cumbersome and inconvenient. Transparent biometrics seeks to eliminate the explicit interaction required for verification and thereby remove the user inconvenience. However, applying transparent biometrics within bio-cryptography can increase the variability of the biometric sample, creating further challenges in reproducing the bio-crypto key. An innovative bio-cryptographic approach is developed to non-intrusively encrypt/decrypt data with a bio-crypto key established on the fly from transparent biometrics, without storing the key anywhere, using a backpropagation neural network. This approach seeks to address the shortcomings of the password login and concurrently removes the usability issues of third-party cryptographic applications, thus enabling a more secure and usable user-oriented level of encryption to reinforce the security controls within cloud-based storage. The challenge lies in the ability of this bio-cryptographic approach to generate a reproducible bio-crypto key from selected transparent biometric modalities, including fingerprint, face and keystrokes, which are inherently noisier than their traditional counterparts. Accordingly, sets of experiments using functional and practical datasets reflecting transparent and unconstrained sample collection are conducted to determine the reliability of creating a non-intrusive and repeatable 256-bit bio-crypto key. With numerous samples acquired in a non-intrusive fashion, the system is able to capture 6 samples spontaneously within a one-minute window. It is then possible to trade off the false rejection rate against the false acceptance rate to tackle the high error rate, as long as the correct key can be generated from at least one successful sample. As such, the experiments demonstrate that a correct key can be generated for the genuine user once a minute, with an average FAR of 0.9%, 0.06%, and 0.06% for fingerprint, face, and keystrokes respectively. To further reinforce the effectiveness of the key generation approach, other sets of experiments are also conducted to determine what impact a multibiometric approach would have upon performance at the feature level versus the matching level.
    Holistically, the multibiometric key generation approach demonstrates its superiority over the single-biometric approach in generating a 256-bit bio-crypto key. In particular, feature-level fusion outperforms matching-level fusion in producing the valid correct key while limiting illegitimate attempts to compromise it, with an overall FAR of 0.02%. Accordingly, the thesis proposes an innovative bio-cryptosystem architecture by which cloud-independent encryption is provided to protect users' personal data in a more reliable and usable fashion using non-intrusive multimodal biometrics.
    Higher Committee of Education Development in Iraq (HCED)
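To make the key-generation idea concrete, the following is a minimal sketch under stated assumptions, not the thesis's method: it binarises a biometric feature vector against per-feature thresholds learned at enrolment and hashes the bit string into a 256-bit key, accepting a session if at least one transparently captured sample reproduces the enrolled key. The thresholding scheme, the function names and the use of SHA-256 are illustrative assumptions; the thesis itself regenerates the key with a backpropagation neural network.

```python
# Illustrative sketch only (assumed scheme, not the thesis's neural-network method):
# derive a 256-bit key by binarising biometric features against enrolment thresholds
# and hashing the resulting bit string.
import hashlib
import numpy as np

def enrol(samples: np.ndarray) -> np.ndarray:
    """Learn per-feature thresholds (medians) from enrolment samples (n_samples x n_features)."""
    return np.median(samples, axis=0)

def derive_key(sample: np.ndarray, thresholds: np.ndarray) -> bytes:
    """Binarise one transparently captured sample and hash it to a 256-bit key."""
    bits = (sample > thresholds).astype(np.uint8)
    return hashlib.sha256(bits.tobytes()).digest()  # 32 bytes = 256 bits

def try_unlock(samples, thresholds, enrolled_key) -> bool:
    """Accept if any of the non-intrusively captured samples reproduces the enrolled key."""
    return any(derive_key(s, thresholds) == enrolled_key for s in samples)
```

With several samples captured per minute, `try_unlock` reflects the trade-off described in the abstract: more attempts lower the effective false rejection at the cost of more opportunities for a false acceptance.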

    Error-Correction Coding and Decoding: Bounds, Codes, Decoders, Analysis and Applications

    Coding; Communications; Engineering; Networks; Information Theory; Algorithm

    A STUDY OF ERASURE CORRECTING CODES

    This work focuses on erasure codes, particularly high-performance ones, and on the related decoding algorithms, especially those of low computational complexity. The work is composed of different pieces, but the main components are developed within the following two main themes. Ideas of message passing are applied to solve the erasures remaining after transmission. An efficient matrix representation of the belief propagation (BP) decoding algorithm on the binary erasure channel (BEC) is introduced as the recovery algorithm. Gallager's bit-flipping algorithm is further developed into the guess and multi-guess algorithms, especially for recovering the erasures left unsolved by the recovery algorithm. A novel maximum-likelihood decoding algorithm, the In-place algorithm, is proposed with reduced computational complexity. A further study on the marginal number of erasures correctable by the In-place algorithm determines a lower bound on the average number of correctable erasures. Following the spirit of searching for the most likely codeword based on the received vector, we propose a new branch-evaluation-search-on-the-code-tree (BESOT) algorithm, which is powerful enough to approach ML performance for all linear block codes. To maximise the recovery capability of the In-place algorithm in network transmissions, we propose the product packetisation structure to contain the computational complexity of the In-place algorithm. Combined with the proposed product packetisation structure, the computational complexity is less than the quadratic complexity bound. We then extend this to the Rayleigh fading channel to solve both errors and erasures. By concatenating an outer code, such as a BCH code, the product-packetised RS codes achieve hard-decision In-place decoding performance significantly better than that of soft-decision iterative algorithms on optimally designed LDPC codes
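As a point of reference for the erasure-decoding theme, below is a minimal sketch of the standard maximum-likelihood erasure decoder for a binary linear code: the parity-check equations are restricted to the erased positions and solved by Gaussian elimination over GF(2). It is not the reduced-complexity In-place algorithm described above; the function name and the 0/1/None word representation are assumptions.

```python
# Standard ML erasure decoding for a binary linear code: solve the parity checks
# restricted to the erased positions over GF(2). Sketch for illustration only.
import numpy as np

def decode_erasures(H: np.ndarray, received: list):
    """received holds 0/1 for known bits and None for erasures; H is the binary parity-check matrix."""
    erased = [i for i, b in enumerate(received) if b is None]
    known = [i for i, b in enumerate(received) if b is not None]
    # Syndrome contribution of the known bits: s = H_known . r_known (mod 2)
    r_known = np.array([received[i] for i in known], dtype=np.uint8)
    s = H[:, known].dot(r_known) % 2
    # Augmented system [H_erased | s], solved over GF(2) by Gaussian elimination.
    A = np.concatenate([H[:, erased], s.reshape(-1, 1)], axis=1).astype(np.uint8)
    pivot_row, pivots = 0, {}
    for col in range(len(erased)):
        pivot = next((r for r in range(pivot_row, A.shape[0]) if A[r, col]), None)
        if pivot is None:
            continue  # this erasure cannot be resolved from the parity checks
        A[[pivot_row, pivot]] = A[[pivot, pivot_row]]
        for r in range(A.shape[0]):
            if r != pivot_row and A[r, col]:
                A[r] ^= A[pivot_row]
        pivots[col] = pivot_row
        pivot_row += 1
    decoded = list(received)
    for col, row in pivots.items():
        decoded[erased[col]] = int(A[row, -1])
    return decoded  # unrecoverable erasures remain None

# Example: (7,4) Hamming code with bits 1 and 5 erased.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
word = [0, None, 1, 0, 1, None, 0]
print(decode_erasures(H, word))  # -> [0, 1, 1, 0, 1, 1, 0]
```

The ML decoder above is cubic in the number of erasures in the worst case; the abstract's point is that packetisation plus the In-place algorithm brings the cost below the quadratic bound.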

    Maximum entropy based analysis of a DS/SSMA diversity system

    D.Ing. This thesis sets out to propose and analyze a cellular Direct Sequence Spread Spectrum Multiple Access (DS/SSMA) system for the Indoor Wireless Communication (IWC) Nakagami fading channel. The up- and downlink of the system implement Differential Phase Shift Keying (DPSK) and Coherent Phase Shift Keying (CPSK) as modulation schemes respectively, and are analyzed using Maximum Entropy (MaxEnt) principles owing to their reliability and accuracy. As a means to enhance system capacity and performance, different forms of diversity are investigated; for the up- and downlink, respectively, RAKE reception and Maximum Ratio Combining (MRC) diversity together with Forward Error Control (FEC) coding are assumed. Further, the validity of the Gaussian Assumption (GA) is quantified and investigated under fading and non-fading conditions by calculating the missing information, using Minimum Relative Entropy (MRE) principles, between the Inter-User Interference (IUI) distribution and a Gaussian distribution of equal variance
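A rough illustration of the Gaussian Assumption check mentioned above: estimate the relative entropy (the "missing information") between an empirical inter-user interference distribution and a Gaussian of equal mean and variance. The interference model here is a toy stand-in, not the thesis's DS/SSMA system model, and the bin count is an arbitrary choice.

```python
# Toy illustration (assumed interference model): measure how far an empirical
# inter-user interference (IUI) distribution is from a Gaussian of equal variance
# via the relative entropy D(p || q) over a common histogram grid.
import numpy as np

rng = np.random.default_rng(0)
K = 3  # a few interferers, so the sum is visibly non-Gaussian
iui = (rng.choice([-1.0, 1.0], size=(100_000, K))
       * rng.uniform(0.0, 1.0, size=(100_000, K))).sum(axis=1)

counts, edges = np.histogram(iui, bins=100)
p = counts / counts.sum()
centres = 0.5 * (edges[:-1] + edges[1:])

mu, var = iui.mean(), iui.var()
q = np.exp(-(centres - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
q = q / q.sum()

# Relative entropy in nats; the smaller it is, the better the Gaussian Assumption holds.
mask = p > 0
kl = np.sum(p[mask] * np.log(p[mask] / q[mask]))
print(f"D(IUI || Gaussian) ~ {kl:.4f} nats")
```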