
    Energy efficiency of error correction on wireless systems

    Since high error rates are inevitable in the wireless environment, energy-efficient error control is an important issue for mobile computing systems. We have studied the energy efficiency of two different error correction mechanisms and have measured the efficiency of a software implementation. We show that it is not sufficient to concentrate on the energy efficiency of the error control mechanisms alone; the extra energy consumed by the wireless interface must be incorporated as well. A model is presented that can be used to determine an energy-efficient error correction scheme for a minimal system consisting of a general-purpose processor and a wireless interface. As an example, we have determined these error correction parameters on two systems with a WaveLAN interface.
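    The abstract's central point, that coding-processor energy and radio energy must be accounted for together, can be illustrated with a toy model. All constants and the simple geometric-retransmission assumption below are hypothetical, not taken from the paper:

```python
# Toy energy model: expected energy to deliver one packet when residual
# errors trigger retransmission. A stronger code costs more CPU energy and
# adds parity bits on the radio, but cuts retransmissions.

def total_energy(payload_bits, parity_bits, per, e_tx_bit, e_cpu_bit):
    """Expected energy per delivered packet.

    payload_bits -- information bits per packet
    parity_bits  -- FEC redundancy added per packet
    per          -- residual packet error rate after decoding
    e_tx_bit     -- radio energy per transmitted bit (J), illustrative
    e_cpu_bit    -- processor energy per en/decoded bit (J), illustrative
    """
    bits = payload_bits + parity_bits
    tries = 1.0 / (1.0 - per)          # expected transmissions (geometric)
    return bits * (e_tx_bit + e_cpu_bit) * tries

# Weak code: little CPU cost, high residual packet error rate.
weak = total_energy(1000, 50, per=0.30, e_tx_bit=1e-6, e_cpu_bit=1e-8)
# Strong code: more parity and CPU work, far fewer retransmissions.
strong = total_energy(1000, 300, per=0.02, e_tx_bit=1e-6, e_cpu_bit=5e-8)
print(weak, strong)
```

    Under these (made-up) numbers the stronger code wins overall even though it costs more per transmitted bit, which is the kind of whole-system trade-off the model in the paper is meant to capture.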

    Energy-efficient wireless communication

    In this chapter we present an energy-efficient, highly adaptive network interface architecture and a novel data link layer protocol for wireless networks that provides Quality of Service (QoS) support for diverse traffic types. Due to the dynamic nature of wireless networks, adaptations in bandwidth scheduling and error control are necessary to achieve energy efficiency and an acceptable quality of service. In our approach we apply adaptability through all layers of the protocol stack, and provide feedback to the applications. In this way the applications can adapt the data streams, and the network protocols can adapt the communication parameters.

    E2MaC: an energy efficient MAC protocol for multimedia traffic

    Energy efficiency is an important issue for mobile computers since they must rely on their batteries. We present a novel MAC protocol that achieves good energy efficiency for the wireless interface of the mobile and provides support for diverse traffic types and QoS. The scheduler of the base station is responsible for providing the required QoS to connections on the wireless link and for minimising the amount of energy spent by the mobile. The main principles of the E2MaC protocol are to avoid unsuccessful actions, minimise the number of transitions, and synchronise the mobile and the base station. We will show that considerable amounts of energy can be saved using these principles. In the protocol the actions of the mobile are minimised; the base station, which has plenty of energy, performs actions on behalf of the mobile. We have paid much attention to reducing the cost to a mobile of simply being connected. The protocol is able to provide near-optimal energy efficiency (i.e. energy is spent only on the actual transfer) for a mobile within the constraints of the QoS of all connections in a cell, and requires only a small overhead.
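    The synchronisation principle above, waking the radio only for its scheduled slot instead of listening continuously, can be sketched with a back-of-the-envelope comparison. All power and timing figures are hypothetical, chosen only to show the shape of the saving:

```python
# Toy per-frame radio energy: always-listening mobile vs. one that dozes
# and wakes only for its scheduled TDMA slot. Transition energy is charged
# explicitly, since minimising transitions is one of the stated principles.

P_RX = 0.9           # W, receive/idle-listening power (illustrative)
P_SLEEP = 0.01       # W, doze power
P_TRANSITION = 0.5   # W drawn while switching sleep <-> active
T_FRAME = 0.020      # s, TDMA frame length
T_SLOT = 0.002       # s, the mobile's own slot
T_SWITCH = 0.0005    # s per transition (one wake-up, one return to sleep)

always_on = P_RX * T_FRAME

scheduled = (P_RX * T_SLOT
             + P_SLEEP * (T_FRAME - T_SLOT - 2 * T_SWITCH)
             + P_TRANSITION * 2 * T_SWITCH)

print(f"always on: {always_on*1e3:.2f} mJ, scheduled: {scheduled*1e3:.2f} mJ")
```

    Even with the transition cost charged, the scheduled mobile spends a small fraction of the always-on energy per frame, which is why the protocol concentrates on synchronisation and on keeping the number of transitions low.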

    Optimal Iris Fuzzy Sketches

    Fuzzy sketches, introduced as a link between biometry and cryptography, are a way of handling biometric data matching as an error correction issue. We focus here on iris biometrics and look for the best error-correcting code in that respect. We show that two-dimensional iterative min-sum decoding leads to results near the theoretical limits. In particular, we test our techniques on the Iris Challenge Evaluation (ICE) database and validate our findings. Comment: 9 pages. Submitted to the IEEE Conference on Biometrics: Theory, Applications and Systems, 2007, Washington DC
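    The fuzzy-sketch idea, binding a random codeword to a biometric string so that a later noisy reading recovers it if the errors stay within the code's correction radius, can be sketched minimally. The paper uses two-dimensional iterative min-sum decoding over iris codes; here a 5x repetition code stands in as the error-correcting code purely for illustration:

```python
import secrets

R = 5  # repetition factor; majority vote corrects up to 2 flips per symbol

def encode(key_bits):
    """Repetition-code encoding: repeat each key bit R times."""
    return [b for b in key_bits for _ in range(R)]

def decode(code_bits):
    """Majority-vote decoding of each R-bit group back to one key bit."""
    return [int(sum(code_bits[i*R:(i+1)*R]) > R // 2)
            for i in range(len(code_bits) // R)]

def sketch(key_bits, iris_bits):
    # Commitment: codeword XOR biometric; safe to store publicly
    # (in this toy setting -- real schemes need careful leakage analysis).
    return [c ^ b for c, b in zip(encode(key_bits), iris_bits)]

def recover(commitment, noisy_iris_bits):
    # XOR the fresh reading back in; bit errors in the reading become
    # bit errors on the codeword, which the decoder corrects.
    return decode([s ^ b for s, b in zip(commitment, noisy_iris_bits)])

key = [1, 0, 1, 1]
iris = [secrets.randbelow(2) for _ in range(len(key) * R)]
com = sketch(key, iris)

noisy = iris[:]
noisy[0] ^= 1          # one flipped bit in the fresh reading
assert recover(com, noisy) == key
```

    Choosing the best code is exactly the point of the paper: a repetition code wastes redundancy, while a code matched to the iris error statistics approaches the theoretical limits the abstract mentions.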

    Energy-efficient adaptive wireless network design

    Energy efficiency is an important issue for mobile computers since they must rely on their batteries. We present an energy-efficient, highly adaptive network interface architecture and a novel data link layer protocol for wireless networks that provides quality of service (QoS) support for diverse traffic types. Due to the dynamic nature of wireless networks, adaptations are necessary to achieve energy efficiency and an acceptable quality of service. The paper provides a review of ideas and techniques relevant to the design of an energy-efficient adaptive wireless network.

    Confinement-Higgs transition in a disordered gauge theory and the accuracy threshold for quantum memory

    We study the +/- J random-plaquette Z_2 gauge model (RPGM) in three spatial dimensions, a three-dimensional analog of the two-dimensional +/- J random-bond Ising model (RBIM). The model is a pure Z_2 gauge theory in which randomly chosen plaquettes (occurring with concentration p) have couplings with the "wrong sign", so that magnetic flux is energetically favored on these plaquettes. Excitations of the model are one-dimensional "flux tubes" that terminate at "magnetic monopoles." Electric confinement can be driven by thermal fluctuations of the flux tubes, by the quenched background of magnetic monopoles, or by a combination of the two. Like the RBIM, the RPGM has enhanced symmetry along a "Nishimori line" in the p-T plane (where T is the temperature). The critical concentration p_c of wrong-sign plaquettes at the confinement-Higgs phase transition along the Nishimori line can be identified with the accuracy threshold for robust storage of quantum information using topological error-correcting codes: if qubit phase errors, qubit bit-flip errors, and errors in the measurement of local check operators all occur at rates below p_c, then encoded quantum information can be protected perfectly from damage in the limit of a large code block. Numerically, we measure p_{c0}, the critical concentration along the T=0 axis (a lower bound on p_c), finding p_{c0} = .0293 +/- .0002. We also measure the critical concentration of antiferromagnetic bonds in the two-dimensional RBIM on the T=0 axis, finding p_{c0} = .1031 +/- .0001. Our value of p_{c0} is incompatible with the value of p_c = .1093 +/- .0002 found in earlier numerical studies of the RBIM, in disagreement with the conjecture that the phase boundary of the RBIM is vertical (parallel to the T axis) below the Nishimori line. Comment: 16 pages, 11 figures, REVTeX, improved numerics and an additional author

    Topological quantum memory

    We analyze surface codes, the topological quantum error-correcting codes introduced by Kitaev. In these codes, qubits are arranged in a two-dimensional array on a surface of nontrivial topology, and encoded quantum operations are associated with nontrivial homology cycles of the surface. We formulate protocols for error recovery, and study the efficacy of these protocols. An order-disorder phase transition occurs in this system at a nonzero critical value of the error rate; if the error rate is below the critical value (the accuracy threshold), encoded information can be protected arbitrarily well in the limit of a large code block. This phase transition can be accurately modeled by a three-dimensional Z_2 lattice gauge theory with quenched disorder. We estimate the accuracy threshold, assuming that all quantum gates are local, that qubits can be measured rapidly, and that polynomial-size classical computations can be executed instantaneously. We also devise a robust recovery procedure that does not require measurement or fast classical processing; however, for this procedure the quantum gates are local only if the qubits are arranged in four or more spatial dimensions. We discuss procedures for encoding, measurement, and performing fault-tolerant universal quantum computation with surface codes, and argue that these codes provide a promising framework for quantum computing architectures. Comment: 39 pages, 21 figures, REVTeX
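    The threshold behaviour described in the abstract, where below a critical physical error rate a larger code block protects better and above it protects worse, can be seen in the simplest code with a threshold: the classical repetition code under majority voting, whose threshold is 50%. This is only an analogy; surface codes show the same phenomenon at a much lower threshold under realistic noise models:

```python
from math import comb

def logical_error(n, p):
    """Probability that a majority vote over n independently noisy
    copies (each flipped with probability p) gives the wrong answer."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# Below threshold (p = 0.10): logical error rate falls as n grows.
below = [logical_error(n, 0.10) for n in (3, 7, 15)]
# Above threshold (p = 0.60): adding redundancy makes things worse.
above = [logical_error(n, 0.60) for n in (3, 7, 15)]

print(below)
print(above)
```

    The qualitative picture, exponential suppression of the logical error rate with block size on one side of the threshold and amplification on the other, is exactly what the order-disorder transition in the abstract captures for surface codes.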