
    A Mariotte-based verification system for heat-based sap flow sensors

    Determination of the accuracy of commonly used techniques for measuring sap flux density in trees often presents a challenge. We therefore designed and built a verification system for heat-based sap flow sensors typically used at stem level. In the laboratory, a Mariotte's bottle device was used to maintain a constant flow rate of water through freshly cut stem segments of American beech (Fagus grandifolia Ehrh.). This verification system was used to determine the accuracy of three heat-based sap flux density techniques: heat pulse velocity, thermal dissipation and heat field deformation. All three techniques substantially underestimated sap flux density when compared against gravimetric measurements. On average, the actual sap flux density was underestimated by 35% using heat pulse velocity, 46% using heat field deformation and 60% using thermal dissipation. These differences were consistent across sap flux densities ranging from 5 to 80 cm³ cm⁻² h⁻¹. Field measurements supported the relative sensor performance observed in the laboratory. Applying a sensor-specific correction factor based on the laboratory test to the field data produced similar estimates of sap flux density from all three techniques. We concluded that a species-specific calibration is therefore necessary when using any of these techniques to ensure that accurate estimates of sap flux density are obtained, at least until a physical basis for error correction can be proposed.
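    A minimal sketch of how such a sensor-specific correction might be applied in practice, assuming multiplicative factors back-calculated from the mean underestimations quoted above (35%, 46%, 60%); the factor values and function names are illustrative, not the study's calibration procedure.

```python
# Illustrative correction of raw sap flux density readings (cm^3 cm^-2 h^-1).
# Factors are derived only from the mean underestimations reported in the
# abstract; a real calibration would fit species- and sensor-specific values.

CORRECTION_FACTOR = {
    "heat_pulse_velocity": 1.0 / (1.0 - 0.35),    # ~1.54
    "heat_field_deformation": 1.0 / (1.0 - 0.46),  # ~1.85
    "thermal_dissipation": 1.0 / (1.0 - 0.60),     # ~2.50
}

def correct_sap_flux(raw_flux: float, technique: str) -> float:
    """Scale a raw sensor reading by its technique-specific factor."""
    return raw_flux * CORRECTION_FACTOR[technique]

if __name__ == "__main__":
    # A thermal dissipation reading of 20 cm^3 cm^-2 h^-1 becomes ~50 under
    # these assumed factors.
    print(correct_sap_flux(20.0, "thermal_dissipation"))
```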

    QuBEC: Boosting Equivalence Checking for Quantum Circuits with QEC Embedding

    Quantum computing has proven capable of accelerating many algorithms by performing tasks that classical computers cannot. Currently, Noisy Intermediate-Scale Quantum (NISQ) machines struggle with scalability and noise issues that stand in the way of a commercial quantum computer. However, physical and software improvements to a quantum computer can efficiently control quantum gate noise. As the complexity of quantum algorithms and their implementation increases, software control of quantum circuits may lead to a more intricate design. Consequently, the verification of quantum circuits becomes crucial in ensuring the correctness of compilation, along with other processes, including quantum error correction and assertions, that can increase the fidelity of quantum circuits. In this paper, we propose a Decision Diagram-based quantum equivalence checking approach, QuBEC, that requires less latency compared to existing techniques, while accounting for circuits with quantum error correction redundancy. Our proposed methodology reduces verification time on certain benchmark circuits by up to 271.49×, while the number of Decision Diagram nodes required is reduced by up to 798.31×, compared to state-of-the-art strategies. The proposed QuBEC framework can contribute to the advancement of quantum computing by enabling faster and more efficient verification of quantum circuits, paving the way for the development of larger and more complex quantum algorithms.
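    A toy sketch of what circuit equivalence checking means at the functional level, assuming small single-qubit circuits and dense NumPy matrices; QuBEC itself works on Decision Diagrams and handles error-correction redundancy, which this simplified stand-in does not attempt.

```python
# Toy equivalence check: do two small circuits implement the same unitary
# up to a global phase? (Dense-matrix stand-in for the DD-based approach.)
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

def circuit_unitary(gates):
    """Multiply single-qubit gate matrices in application order."""
    u = np.eye(2)
    for g in gates:
        u = g @ u
    return u

def equivalent_up_to_phase(u, v, tol=1e-9):
    """True if u = exp(i*phi) * v for some global phase phi."""
    idx = np.unravel_index(np.argmax(np.abs(v)), v.shape)
    phase = u[idx] / v[idx]
    return np.allclose(u, phase * v, atol=tol)

# H X H equals Z, a standard identity a compiler might exploit.
original = circuit_unitary([H, X, H])
compiled = circuit_unitary([Z])
print(equivalent_up_to_phase(original, compiled))  # True
```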

    Parity-based Data Outsourcing: Extension, Implementation, and Evaluation

    Our research has developed a Parity-based Data Outsourcing (PDO) model. This model outsources a set of raw data by associating it with a set of parity data and then distributing both sets among a number of cloud servers that are managed independently by different service providers. Users query the servers for the data of their interest and are able to perform both authentication and correction. The former refers to the capability of verifying that the query result they receive is correct (i.e., all data items that satisfy the query condition are received, and every data item received originates from the data owner), whereas the latter refers to the capability of correcting corrupted data, if any. Existing approaches all rely on complex cryptographic techniques and require the cloud server to build verification objects; moreover, they support only query authentication, not error correction. In contrast, our approach enables users to perform both query authentication and error correction, and does so without having to install any additional software on a cloud server, which makes it possible to take advantage of the many cloud data management services available on the market today. This thesis makes the following contributions. 1) We extend the PDO model, which was originally designed for one-dimensional data, to handle multi-dimensional data. 2) We implement the PDO model, including parity coding, data encoding, data retrieval, query authentication and correction. 3) We evaluate the performance of the PDO model, comparing it with the Merkle Hash Tree (MH-tree) and Signature Chain, two existing techniques that support query authentication, in terms of storage, communication, and computation overhead.
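    A minimal sketch of the parity idea behind pairing raw data with parity data: a single XOR parity block lets a client reconstruct one corrupted or missing block retrieved from independently managed servers. This is a simplified stand-in, not the thesis's actual parity coding or query authentication scheme.

```python
# XOR parity over equal-length raw data blocks outsourced to separate servers.
from functools import reduce

def xor_blocks(blocks):
    """Byte-wise XOR of equal-length byte strings."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

# Data owner: encode three raw blocks plus one parity block before outsourcing.
raw = [b"alpha---", b"bravo---", b"charlie-"]
parity = xor_blocks(raw)

# Client: suppose the block from server 1 comes back corrupted or missing;
# XOR-ing the surviving raw blocks with the parity block reconstructs it.
recovered = xor_blocks([raw[0], raw[2], parity])
assert recovered == raw[1]
print(recovered)
```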

    Pre-silicon FEC decoding verification on SoC FPGAs

    Forward error correction (FEC) decoding hardware modules are challenging to verify at the pre-silicon stage, when they are usually described at register-transfer (RT)/logic level with a hardware description language (HDL). They tend to hide faults due to their inherent tendency to correct errors, and the required simulations with a massive insertion of inputs are too slow. In this work, two verification techniques based on FPGA prototyping are applied to complement the mentioned simulations: golden model vs. implementation matching with thousands of random codewords, and codeword/bit error rate (CER/BER) curve computation. For this purpose, a system on chip (SoC) field-programmable gate array (FPGA) is used, implementing several replicas of the decoder in the programmable hardware part (exploiting the parallel capabilities of hardware) and managing the verification by parallel programming on the software part of the SoC (exploiting the presence of multiple processing cores). The presented approach allows seamless integration with high-level models, does not need expensive testing/emulation platforms and obtains results in a reasonable amount of time. This work has been supported by Project TEC2017-86722-C4-3-R, funded by the Spanish MICINN/AEI.
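    A host-side sketch of the "golden model vs. implementation matching" idea: feed many random noisy codewords to a reference decoder and to the device under test and flag any mismatch. The decoder functions below are hypothetical placeholders; in the paper the DUT replicas run in the FPGA fabric and the comparison is orchestrated from the SoC's processor cores.

```python
# Random-codeword matching loop between a golden model and a DUT (both mocked).
import random

def golden_decode(llrs):
    # Placeholder reference decoder: hard decision on the sign of each LLR.
    return [0 if llr >= 0 else 1 for llr in llrs]

def dut_decode(llrs):
    # Placeholder for the RTL implementation's output, read back from the FPGA.
    return golden_decode(llrs)

def match_random_codewords(n_words=10_000, n_bits=64, sigma=0.8):
    mismatches = 0
    for _ in range(n_words):
        bits = [random.randint(0, 1) for _ in range(n_bits)]
        # BPSK mapping plus Gaussian noise gives LLR-like channel values.
        llrs = [(1.0 - 2.0 * b) + random.gauss(0.0, sigma) for b in bits]
        if golden_decode(llrs) != dut_decode(llrs):
            mismatches += 1
    return mismatches

print("mismatching codewords:", match_random_codewords())
```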

    Cryptographic techniques used to provide integrity of digital content in long-term storage

    The main objective of the project was to obtain advanced mathematical methods to guarantee that a required level of data integrity is maintained in long-term storage. The secondary objective was to provide methods for the evaluation of data loss and recovery. Additionally, we have provided the following initial constraints for the problem: a limitation on additional storage space, a minimal threshold for the desired level of data integrity and a defined probability of a single-bit corruption. With regard to the main objective, the study group focused on exploring methods based on hash values. It has been indicated that under the tight constraints suggested by PWPW, it is not possible to provide any method based only on hash values. This observation stems from the fact that the high probability of bit corruption leads to an unacceptably large number of broken hashes, which in turn stands in contradiction with the limitation on additional storage space. However, having loosened the initial constraints to some extent, the study group has proposed two methods that use only hash values. The first method, based on a simple scheme of data subdivision into disjoint subsets, has been provided as a benchmark for the other methods discussed in this report. The second method (the "hypercube" method), introduced as an instance of the wider class of clever-subdivision methods, is built on the concept of rewriting the data stream into an n-dimensional hypercube and calculating hash values for particular (overlapping) sections of the cube. We have obtained interesting results by combining hash value methods with error-correction techniques. The proposed framework, based on BCH codes, appears to have promising properties, hence further research in this field is strongly recommended. As part of the report we have also presented features of secret sharing methods for the benefit of novel distributed data-storage scenarios. We have provided an overview of some interesting aspects of secret sharing techniques and several examples of possible applications.
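    A toy two-dimensional instance of the "hypercube" idea: lay the data blocks out in a grid and keep a hash per row and per column, so a single corrupted block is localized at the intersection of the failing row digest and failing column digest, at the cost of roughly 2·sqrt(N) hashes instead of N. The grid size and hash function are illustrative; the report treats the general n-dimensional case and its combination with BCH codes.

```python
# 2-D overlapping-section hashing: one digest per row and per column.
import hashlib

def digest(blocks):
    h = hashlib.sha256()
    for b in blocks:
        h.update(b)
    return h.hexdigest()

def grid_hashes(grid):
    rows = [digest(row) for row in grid]
    cols = [digest(col) for col in zip(*grid)]
    return rows, cols

# Data owner stores these reference digests alongside the archive.
grid = [[f"block-{r}-{c}".encode() for c in range(4)] for r in range(4)]
ref_rows, ref_cols = grid_hashes(grid)

# Later, one block is silently corrupted in storage.
grid[2][1] = b"corrupted"
rows, cols = grid_hashes(grid)
bad_row = next(i for i, (a, b) in enumerate(zip(rows, ref_rows)) if a != b)
bad_col = next(j for j, (a, b) in enumerate(zip(cols, ref_cols)) if a != b)
print("corrupted block at", (bad_row, bad_col))  # (2, 1)
```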

    Quantum money with nearly optimal error tolerance

    We present a family of quantum money schemes with classical verification which display a number of benefits over previous proposals. Our schemes are based on hidden matching quantum retrieval games and they tolerate noise up to 23%, which we conjecture approaches 25% asymptotically as the dimension of the underlying hidden matching states is increased. Furthermore, we prove that 25% is the maximum tolerable noise for a wide class of quantum money schemes with classical verification, meaning our schemes are almost optimally noise tolerant. We use methods in semi-definite programming to prove security in a substantially different manner to previous proposals, leading to two main advantages: first, coin verification involves only a constant number of states (with respect to coin size), thereby allowing for smaller coins; second, the re-usability of coins within our scheme grows linearly with the size of the coin, which is known to be optimal. Lastly, we suggest methods by which the coins in our protocol could be implemented using weak coherent states and verified using existing experimental techniques, even in the presence of detector inefficiencies. Comment: 17 pages, 5 figures