
    Optimization and Applications of Modern Wireless Networks and Symmetry

    Motivated by the future demands of wireless communications, this book focuses on channel coding, multiple access, network protocols, and related techniques for IoT/5G. Channel coding is widely used to enhance reliability and spectral efficiency; in particular, low-density parity-check (LDPC) codes and polar codes are being optimized for the next wireless standard. Moreover, advanced network protocols are developed to improve wireless throughput. These topics have attracted a great deal of attention in modern communications.

    Bandwidth-efficient communication systems based on finite-length low density parity check codes

    Low-density parity-check (LDPC) codes are linear block codes constructed from pseudo-random parity-check matrices. These codes are powerful in terms of error performance and, notably, have low decoding complexity. While infinite-length LDPC codes approach the capacity of communication channels, finite-length LDPC codes also perform well and simultaneously meet the delay requirements of many communication applications, such as voice and backbone transmissions. Finite-length LDPC codes are therefore attractive for low-latency communication systems. This thesis focuses on bandwidth-efficient communication systems using finite-length LDPC codes. Such bandwidth-efficient systems are realized by mapping a group of LDPC-coded bits to a symbol of a high-order signal constellation. Depending on the system's infrastructure and knowledge of the channel state information (CSI), the signal constellations in different coded modulation systems can be two-dimensional multilevel/multiphase constellations or multi-dimensional space-time constellations.

    In the first part of the thesis, two basic bandwidth-efficient coded modulation systems, namely LDPC coded modulation and multilevel LDPC coded modulation, are investigated for both additive white Gaussian noise (AWGN) and frequency-flat Rayleigh fading channels. Bounds on the bit error rate (BER) performance are derived for these systems based on the maximum-likelihood (ML) criterion, relying on union-bounding and combinatoric techniques. In particular, for LDPC coded modulation, the ML bound is computed from the Hamming distance spectrum of the LDPC code and the Euclidean distance profile of the two-dimensional constellation. For multilevel LDPC coded modulation, the bound of each decoding stage is obtained for a generalized multilevel coded modulation in which more than one coded bit is considered per level. For both systems, the bounds are confirmed by simulation results of ML decoding and/or the performance of ordered-statistic decoding (OSD) and sum-product decoding. It is demonstrated that these bounds can be used efficiently to evaluate the error performance and to select appropriate parameters (such as the code rate, constellation, and mapping) for the two communication systems.

    The second part of the thesis studies bandwidth-efficient LDPC coded systems that employ multiple transmit and multiple receive antennas, i.e., multiple-input multiple-output (MIMO) systems. Two scenarios of CSI availability are considered: (i) the CSI is unknown at both the transmitter and the receiver; (ii) the CSI is known at both the transmitter and the receiver. For the first scenario, LDPC coded unitary space-time modulation systems are most suitable, and the ML performance bound is derived for these non-coherent systems. To derive the bound, the summation of chordal distances is obtained and used instead of the Euclidean distances. For the second scenario, adaptive LDPC coded MIMO modulation systems are studied, in which three adaptive schemes with antenna beamforming and/or antenna selection are investigated and compared in terms of bandwidth efficiency. For uncoded discrete-rate adaptive modulation, the computation of the bandwidth efficiency shows that the scheme with antenna selection at the transmitter and antenna combining at the receiver performs best when the number of antennas is small. For adaptive LDPC coded MIMO modulation systems, an achievable threshold of the bandwidth efficiency is also computed from the ML bound of LDPC coded modulation derived in the first part.
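
    To make the union-bounding idea concrete, here is a minimal Python sketch. It evaluates the textbook union bound on the BER of a binary linear block code with BPSK over AWGN from its Hamming weight spectrum; the thesis's actual bounds extend this idea to high-order constellations via the Euclidean distance profile, which is not reproduced here. The example weight spectrum is a placeholder.

```python
import numpy as np
from scipy.stats import norm

def union_bound_ber(weight_spectrum, n, rate, ebno_db):
    """Union upper bound on BER for a binary linear block code with
    BPSK over AWGN: P_b <= sum_d (d/n) * A_d * Q(sqrt(2*d*R*Eb/N0))."""
    ebno = 10.0 ** (ebno_db / 10.0)
    bound = 0.0
    for d, a_d in weight_spectrum.items():
        # norm.sf is the Gaussian Q-function; each term is the pairwise
        # error probability of a weight-d codeword, weighted by d/n.
        bound += (d / n) * a_d * norm.sf(np.sqrt(2.0 * d * rate * ebno))
    return bound

# Placeholder spectrum {weight d: multiplicity A_d}, for illustration only.
print(union_bound_ber({6: 8, 8: 25, 10: 80}, n=64, rate=0.5, ebno_db=4.0))
```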

    Coding with side information

    Source coding and channel coding are two important problems in communications. Although side information exists in many everyday scenarios, its effect is not taken into account in conventional setups. In this thesis, we focus on the practical design of two interesting coding problems with side information: Wyner-Ziv coding (WZC; source coding with side information at the decoder) and Gel'fand-Pinsker coding (GPC; channel coding with side information at the encoder).

    For WZC, we split the design problem into two cases: when the distortion of the reconstructed source is zero and when it is not. We first review how the zero-distortion case, commonly called Slepian-Wolf coding (SWC), can be implemented using conventional channel coding, and we detail an SWC design using low-density parity-check (LDPC) codes. To facilitate the design, we justify a necessary requirement that SWC performance be independent of the input source. We show that a sufficient condition for this requirement is that the hypothetical channel between the source and the side information satisfies a symmetry condition dubbed dual symmetry. Furthermore, under this dual symmetry condition, the SWC design problem can simply be treated as an LDPC code design over the hypothetical channel. When the distortion of the reconstructed source is non-zero, we propose a practical WZC paradigm called Slepian-Wolf coded quantization (SWCQ), which combines SWC and nested lattice quantization. We point out an interesting analogy between SWCQ and entropy-coded quantization in classic source coding, and we implement a practical SWCQ scheme using 1-D nested lattice quantization and LDPC codes.

    For GPC, since the actual design procedure relies on the precise setting of the problem, we investigate GPC in the form of a digital watermarking problem, digital watermarking being the precise dual of WZC. We then introduce an enhanced version of the well-known spread-spectrum watermarking technique, and present two applications related to digital watermarking.
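
    A common way to realize SWC with an LDPC code, consistent with the channel-coding view described above, is syndrome-based compression: the encoder transmits only the syndrome of the source word, and the decoder runs LDPC decoding over the hypothetical source/side-information channel constrained to that syndrome. The sketch below shows only the encoding step under this assumption; the parity-check matrix and source word are toy placeholders, and the thesis's exact construction may differ.

```python
import numpy as np

def swc_encode(H, x):
    """Syndrome-based Slepian-Wolf encoding: compress the binary source
    word x to its syndrome s = H x (mod 2) under parity-check matrix H.
    The (n - k) syndrome bits replace the n source bits on the channel."""
    return (H @ x) % 2

# Toy placeholder: a random parity-check matrix and a random source word.
rng = np.random.default_rng(0)
H = rng.integers(0, 2, size=(3, 7))   # 3 parity checks on a length-7 source
x = rng.integers(0, 2, size=7)
s = swc_encode(H, x)                  # decoder recovers x from (s, side info)
print(s)
```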

    Securing Cloud Storage by Transparent Biometric Cryptography

    With the capability of storing huge volumes of data over the Internet, cloud storage has become a popular and desirable service for individuals and enterprises. Its security, nevertheless, has been the subject of intense debate within the cloud community. Significant attacks can take place, the most common being the guessing of (poor) passwords. Given such weaknesses in verification credentials, malicious attacks have occurred across a variety of well-known storage services (e.g., Dropbox and Google Drive), resulting in the loss of privacy and confidentiality of files. Whilst today's third-party cryptographic applications can independently encrypt data, they arguably place a significant burden upon the user, who must manually cipher/decipher each file and administer numerous keys in addition to the login password.

    The field of biometric cryptography applies biometric modalities within cryptography to produce robust bio-crypto keys without the user having to remember them. There are, nonetheless, still flaws associated with the security of the established bio-crypto key and its usability: users currently have to present their biometric modalities intrusively each time a file needs to be encrypted/decrypted, which makes routine use cumbersome and inconvenient. Transparent biometrics seeks to eliminate this explicit interaction and thereby remove the user inconvenience. However, applying transparent biometrics within bio-cryptography increases the variability of the biometric sample, making it harder to reproduce the bio-crypto key. An innovative bio-cryptographic approach is therefore developed to non-intrusively encrypt/decrypt data with a bio-crypto key established from transparent biometrics on the fly, using a backpropagation neural network, without storing the key anywhere. This approach seeks to address the shortcomings of password login and concurrently remove the usability issues of third-party cryptographic applications, enabling a more secure and usable user-oriented level of encryption to reinforce the security controls within cloud-based storage.

    The challenge lies in the ability of this approach to generate a reproducible bio-crypto key from selected transparent biometric modalities, including fingerprint, face, and keystrokes, which are inherently noisier than their traditional counterparts. Accordingly, sets of experiments using functional and practical datasets, reflecting transparent and unconstrained sample collection, are conducted to determine the reliability of creating a non-intrusive and repeatable 256-bit bio-crypto key. With numerous samples acquired non-intrusively, the system is able to capture six samples within a one-minute window, so the false rejection rate can be traded off against the false acceptance rate to tackle the high error rate, as long as the correct key can be generated from at least one successful sample. The experiments demonstrate that a correct key can be generated for the genuine user once a minute, with average FARs of 0.9%, 0.06%, and 0.06% for fingerprint, face, and keystrokes respectively. To further reinforce the effectiveness of the key-generation approach, additional experiments determine the impact of a multibiometric approach upon performance at the feature level versus the matching level. Holistically, the multibiometric key-generation approach is superior to the single-biometric approach at generating the 256-bit bio-crypto key. In particular, feature-level fusion outperforms matching-level fusion at producing the correct key while limiting illegitimate attempts to compromise it, with an overall FAR of 0.02%. Accordingly, the thesis proposes an innovative bio-cryptosystem architecture by which cloud-independent encryption protects users' personal data in a more reliable and usable fashion using non-intrusive multimodal biometrics.

    Higher Committee of Education Development in Iraq (HCED)
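
    As a purely illustrative sketch of the key-generation goal (the thesis itself regenerates the key with a backpropagation neural network, which is not reproduced here), the following Python fragment binarizes a biometric feature vector against per-feature thresholds and hashes the result to a 256-bit key; all names, values, and thresholds are hypothetical.

```python
import hashlib
import numpy as np

def derive_bio_key(features, thresholds):
    """Hypothetical illustration: quantize biometric features to bits,
    then hash to a fixed 256-bit key. Identical quantized bits across
    capture sessions yield the same key; any flipped bit changes it."""
    bits = (np.asarray(features) > np.asarray(thresholds)).astype(np.uint8)
    return hashlib.sha256(bits.tobytes()).digest()  # 32 bytes = 256 bits

key = derive_bio_key([0.42, 0.91, 0.13], [0.5, 0.5, 0.5])
print(len(key) * 8)  # 256
```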

    On Experimental Quantum Communication and Cryptography

    One of the most fascinating recent developments in research has been how different disciplines have become more and more interconnected, so much so that fields as disparate as information theory and fundamental physics have combined to produce ideas for the next generation of computing and secure information technologies, both of which have far-reaching consequences. For more than fifty years Moore's law, which describes the trend of transistor density doubling roughly every two years, has proven uncannily accurate. However, the computing industry is now approaching a fundamental barrier as the size of a transistor approaches that of an individual atom and the laws of quantum mechanics take over. Rather than treat this as the end, quantum information science has emerged to ask what additional power and functionality might be realized by harnessing these quantum effects. This thesis presents work in the sub-field of quantum cryptography, which seeks to use quantum means to assure the security of one's communications. The beauty of quantum cryptographic methods is that they can be proven secure, now and indefinitely into the future, relying solely on the validity of the laws of physics, something which nearly no current classical cryptographic method can claim.

    The thesis begins by examining the first implementation of an entangled quantum key distribution (QKD) system over two free-space optical links. This system represents the first test-bed of its kind in the world, and while its practical importance in terrestrial applications is limited to a smaller university or corporate campus, it mimics the setup of an entangled satellite system, aiding the study of distributing entangled photons from an orbiting satellite to two earthbound receivers. Having completed the construction of a second free-space link and the automation of the alignment system, I securely distribute keys to Alice and Bob at two distant locations separated by 1,575 m with no direct line of sight between them. I examine all of the assumptions necessary for the claims of security, something particularly important for moving these systems out of the lab and into commercial industry. I then describe the free-space channel over which the photons are sent and the implementation of each of the major system components, and close with a discussion of the experiment, which saw raw detected entangled photon rates of 565 s^{-1} and a quantum bit error rate (QBER) of 4.92%, resulting in a final secure key rate of 85 bits/s. Over the six-hour night-time experiment I was able to generate 1,612,239 bits of secure key.

    With a successful QKD experiment completed, the thesis turns to the problem of making the technology more practical by increasing the key rate of the system, and thus the speed at which it can securely encrypt information. It does so in three ways, involving each of the major disciplines comprising the system: measurement hardware, source technology, and software post-processing. First, I experimentally investigate a theoretical proposal for biasing the measurement bases in the QKD system, showing a 79% improvement in the secret key generated from the same raw key rates. Next, I construct a second-generation entangled photon source, based on a Sagnac interferometer, with rates two orders of magnitude higher than the previous source. More importantly, the new source has a QBER as low as 0.93%, which is not only important for the security of the QKD system but is required for the implementation of a new cryptographic primitive later. Lastly, I study the free-space link transmission statistics and the use of a signal-to-noise ratio (SNR) filter to improve the key rate by 25.2% from the same amount of raw key. The link statistics have particular relevance for a current project with the Canadian Space Agency to exchange a quantum key with an orbiting satellite, a project for which I have participated in two feasibility studies.

    To study the usefulness of more recent ideas in quantum cryptography, the thesis then looks at the first experimental implementation of a new cryptographic primitive, oblivious transfer (OT) in the noisy storage model. This primitive has important applications, as it can be used to implement a secure identification scheme provably secure in a quantum scenario. Such a scheme could one day be used, for example, to authenticate a user over short distances, such as at ATMs, which have proven particularly vulnerable to hacking and fraud. Over a four-hour experiment, Alice and Bob measure 405,642,088 entangled photon pairs with an average QBER of 0.93%, allowing them to create a secure OT key of 8,939,150 bits. As a first implementer, I examine many of the pressing issues currently preventing the scheme from being more widely adopted, such as the need to relax the dependence of the OT rate on the loss of the system and the need to extend the security proof to cover a wider range of quantum communication channels and memories. It is important to note that OT is fundamentally different from QKD with respect to security: the information is never physically exchanged over the communication line; rather, the joint equality function f(x, y) = [x = y] is evaluated. Thus, security in QKD does not imply security for OT.

    Finally, the thesis concludes with the construction and initial alignment of a second-generation free-space quantum receiver, useful for increasing QKD key rates but designed for a fundamental test of quantum theory, namely a violation of Svetlichny's inequality. Svetlichny's inequality is a generalization of Bell's inequality to three particles, where any two of the three particles may be non-locally correlated. Even so, a violation of Svetlichny's inequality shows that certain quantum mechanical states are incompatible with this restricted class of non-local yet realistic theories. Svetlichny's inequality is particularly important because, while an overwhelming number of Bell experiments testing two-body correlations have been performed, experiments on many-body systems have been few and far between; such experiments are particularly valuable to explore since we live in a many-body world. The new receiver incorporates an active polarization analyzer capable of switching between measurement bases on a microsecond time-scale through the use of a Pockels cell, while maintaining high-fidelity measurements. Some of the initial alignment and analysis results are detailed, including final measured contrasts of 1:25.2 and 1:22.6 in the rectilinear and diagonal bases respectively.
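
    For orientation, the gap between the raw detected rate (565 s^{-1}) and the final secure key rate (85 bits/s) can be sanity-checked against the idealized asymptotic BB84 secret fraction 1 - 2h(Q). The sketch below implements this formula; note it ignores sifting, finite-key effects, and error-correction inefficiency, so it only gives a loose upper bound rather than the thesis's actual key-rate analysis.

```python
import numpy as np

def binary_entropy(q):
    """Binary Shannon entropy h(q) in bits."""
    return -q * np.log2(q) - (1.0 - q) * np.log2(1.0 - q)

def bb84_asymptotic_key_rate(raw_rate, qber):
    """Idealized asymptotic BB84 secret fraction 1 - 2 h(Q), scaled by the
    raw rate; ignores sifting, finite-size effects, and reconciliation loss."""
    return raw_rate * (1.0 - 2.0 * binary_entropy(qber))

print(bb84_asymptotic_key_rate(565, 0.0492))  # ~245 bit/s, a loose upper bound
```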

    Dependable Embedded Systems

    This Open Access book introduces readers to many new techniques, emerging particularly within the last five years, for enhancing and optimizing reliability in embedded systems. It introduces the most prominent reliability concerns from today's point of view and briefly recapitulates the progress in the community so far. Unlike other books that focus on a single abstraction level, such as the circuit level or the system level alone, this book deals with reliability challenges across the different levels, starting from the physical level all the way up to the system level (cross-layer approaches). The book aims to demonstrate how new hardware/software co-design solutions can effectively mitigate reliability degradation caused by transistor aging, process variation, temperature effects, soft errors, etc. It provides readers with the latest insights into novel cross-layer methods and models with respect to the dependability of embedded systems; describes cross-layer approaches that can improve reliability through techniques proactively designed with respect to techniques at other layers; and explains run-time adaptation and concepts of self-organization for achieving error resiliency in complex, future many-core systems.
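
    As one classic example of the kind of error-resiliency technique applied at the architecture/system level (the book surveys many more, across layers), a triple-modular-redundancy voter masks a transient soft error in any single replica. This minimal sketch is generic and not taken from the book.

```python
def tmr_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority vote over three redundant module outputs:
    any single faulty replica is outvoted by the other two."""
    return (a & b) | (a & c) | (b & c)

# A soft error flips one bit in one replica; the vote masks it.
golden = 0b1011
assert tmr_vote(golden, golden ^ 0b0100, golden) == golden
```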

    Long distance free-space quantum key distribution

    In the age of information and globalisation, secure communication and the protection of sensitive data against unauthorised access are of utmost importance. Quantum cryptography currently provides the only way to exchange a cryptographic key between two parties in an unconditionally secure fashion. Owing to losses and noise in today's optical fibre and detector technology, quantum cryptography is at present limited to distances below a few hundred kilometres. In principle, larger distances could be subdivided into shorter segments, but the required quantum repeaters are still beyond current technology. An alternative approach to bridging larger distances is a satellite-based system that would enable secret key exchange between two arbitrary points on the globe using free-space optical communication.

    The aim of the presented experiment was to investigate the feasibility of satellite-based global quantum key distribution. In this context, a free-space quantum key distribution experiment over a real distance of 144 km was performed. The transmitter and the receiver were situated at 2500 m altitude on the Canary Islands of La Palma and Tenerife, respectively. The small and compact transmitter unit generated attenuated laser pulses that were sent to the receiver via a 15-cm optical telescope. The receiver unit for polarisation analysis and detection of the sent pulses was integrated into an existing mirror telescope designed for classical optical satellite communications. To ensure the required stability and efficiency of the optical link in the presence of atmospheric turbulence, the two telescopes were equipped with a bi-directional automatic tracking system.

    Still, owing to stray light and high optical attenuation, secure key exchange would not be possible using attenuated pulses with the standard BB84 protocol. The photon-number statistics of attenuated pulses follow a Poissonian distribution; hence, by removing a photon from every pulse containing two or more photons, an eavesdropper could measure its polarisation without disturbing the polarisation state of the remaining pulse, gaining information about the key without introducing detectable errors. To protect against such attacks, the presented experiment employed the recently developed method of additional "decoy" states: the intensity of the pulses created by the transmitter was varied in a random manner. By analysing the detection probabilities of the different pulse intensities individually, a photon-number-splitting attack can be detected. Thanks to the decoy-state analysis, the secrecy of the resulting quantum key could be ensured despite the Poissonian nature of the emitted pulses. For a channel attenuation as high as 35 dB, a secret key rate of up to 250 bit/s was achieved. The experiment was carried out under real atmospheric conditions and with a channel attenuation comparable to that of an optical link from the ground to a satellite in low Earth orbit; hence, it demonstrates the feasibility of satellite-based quantum key distribution using a technologically comparatively simple system.
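
    The photon-number-splitting vulnerability described above follows directly from Poissonian statistics: for a mean photon number mu, the fraction of non-empty pulses carrying two or more photons is readily computed, as in this small sketch (the mu values are illustrative, not the experiment's settings).

```python
import numpy as np

def multiphoton_fraction(mu):
    """For attenuated laser pulses with Poissonian photon statistics,
    return P(n >= 2 | n >= 1): the share of non-empty pulses that carry
    more than one photon and are thus exposed to photon-number splitting."""
    p0 = np.exp(-mu)          # probability of an empty pulse
    p1 = mu * np.exp(-mu)     # probability of a single-photon pulse
    return (1.0 - p0 - p1) / (1.0 - p0)

for mu in (0.1, 0.5):
    print(mu, multiphoton_fraction(mu))  # ~0.049 and ~0.229
```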