
    On connectivity-dependent resource requirements for digital quantum simulation of d-level particles

    A primary objective of quantum computation is to efficiently simulate quantum physics. Scientifically and technologically important quantum Hamiltonians include those with spin-s, vibrational, photonic, and other bosonic degrees of freedom, i.e. problems composed of or approximated by d-level particles (qudits). Recently, several methods for encoding these systems into a set of qubits have been introduced, where each encoding's efficiency was studied in terms of qubit and gate counts. Here, we build on previous results by including effects of hardware connectivity. To study the number of SWAP gates required to Trotterize commonly used quantum operators, we use both analytical arguments and automatic tools that optimize the schedule in multiple stages. We study the unary (or one-hot), Gray, standard binary, and block unary encodings, with three connectivities: linear array, ladder array, and square grid. Among other trends, we find that while the ladder array leads to substantial efficiencies over the linear array, the advantage of the square grid over the ladder array is less pronounced. These results are applicable in hardware co-design and in choosing efficient qudit encodings for a given set of near-term quantum hardware. Additionally, this work may be relevant to the scheduling of other quantum algorithms for which matrix exponentiation is a subroutine. Comment: Accepted to QCE20 (IEEE Quantum Week). Corrected erroneous circuits in Figure
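
    As a rough illustration of the qubit-count differences among the encodings compared above, the following Python sketch (not taken from the paper; the function names and the example dimension d = 5 are illustrative, and the block unary encoding is omitted) maps a qudit level index to its bitstring under the unary (one-hot), standard binary, and Gray encodings:

```python
import math

def unary_encoding(level: int, d: int) -> str:
    """One-hot encoding: d qubits, with a single 1 at position `level`."""
    return "".join("1" if i == level else "0" for i in range(d))

def binary_encoding(level: int, d: int) -> str:
    """Standard binary encoding: ceil(log2 d) qubits."""
    n = max(1, math.ceil(math.log2(d)))
    return format(level, f"0{n}b")

def gray_encoding(level: int, d: int) -> str:
    """Gray encoding: same qubit count as standard binary, but adjacent
    levels differ in exactly one bit (g = i XOR (i >> 1))."""
    n = max(1, math.ceil(math.log2(d)))
    return format(level ^ (level >> 1), f"0{n}b")

d = 5  # e.g. a truncated bosonic mode or a spin-2 particle
for level in range(d):
    print(level, unary_encoding(level, d), binary_encoding(level, d), gray_encoding(level, d))
```

    For d = 5 the unary encoding uses 5 qubits while the binary and Gray encodings use 3; how these bitstrings are then laid out on a linear, ladder, or square-grid connectivity is what drives the SWAP-gate counts studied in the paper.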

    In vitro digestibility of field pea as influenced by processing methods

    Field pea meals exposed to different treatments (flaking, extrusion, expansion, dry heating at 150°C for 15' or 30', dry heating at 150°C for 30' after addition of 1% xylose, addition of 4% NaOH, microwave irradiation at 800 W for 6' or 9') were evaluated for their 6- and 24-hour in vitro fermentability by the gas production (GP) technique. Flaking and extrusion accelerated initial fermentation but tended to reduce 24-h GP, whereas dry heating and microwave irradiation mainly improved final gas volume; NaOH addition had the opposite effect. Apparent dry matter digestion at 6 h was lowered by dry heating, NaOH addition, and the shorter microwave irradiation. Xylose addition did not substantially change the effects of dry heating, but lowered the initial disappearance. Ammonia concentration was generally lowered by the treatments, suggesting a reduction in protein degradability but also a possibly higher microbial uptake for protein synthesis. Microwave irradiation had limited effects on all the parameters. Dry heating, with or without xylose addition, appears promising for increasing the rumen-escape protein fraction without accelerating starch fermentation, which could otherwise increase the risk of rumen acidosis.

    A new method to determine arterial distensibility in small arteries

    Several methods are available to measure arterial distensibility. One of them estimates the direct distensibility (D) from the diameter and the distending blood pressure. Herein, we propose a new method to assess distensibility in small arteries, based on spectral analysis of time-motion (M-mode) ultrasound images of radial arteries. A Fourier transform was performed on the intensity of the upper and lower walls. The spectral amplitude at the heart frequency was estimated from both wall spectra and summed (SumAmp). SumAmp was then compared with the direct distensibility. A significant correlation was found between SumAmp and D (r = 0.7, p = 0.02).
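
    The SumAmp estimate described above is essentially an FFT amplitude measurement repeated for the two wall traces. A minimal Python sketch is given below, assuming the upper- and lower-wall intensity signals have already been extracted from the M-mode image; the function names and the inputs fs (sampling rate) and f_heart (heart frequency) are illustrative, not the authors' implementation:

```python
import numpy as np

def spectral_amplitude_at(signal, fs, f_heart):
    """Single-sided FFT amplitude of the component closest to f_heart (Hz),
    for a wall-intensity trace sampled at fs (Hz)."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                                 # remove the DC component
    amplitudes = np.abs(np.fft.rfft(x)) * 2.0 / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return amplitudes[np.argmin(np.abs(freqs - f_heart))]

def sum_amp(upper_wall, lower_wall, fs, f_heart):
    """SumAmp: spectral amplitude at the heart frequency, estimated from
    the upper- and lower-wall spectra and summed."""
    return (spectral_amplitude_at(upper_wall, fs, f_heart)
            + spectral_amplitude_at(lower_wall, fs, f_heart))
```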

    Relevance of laser Doppler and laser speckle techniques for assessing vascular function: state of the art and future trends.

    In clinical and research applications, the assessment of vascular function has become of major importance for evaluating and following the evolution of cardiovascular pathologies, diabetes, hypertension, and foot ulcers. Therefore, the development of engineering methodologies able to noninvasively monitor blood vessel activity, such as endothelial function, is a significant and emerging challenge. Laser-based techniques have been used to meet these requirements as far as possible. Among them, laser Doppler flowmetry (LDF) and laser Doppler imaging (LDI) were proposed a few decades ago. They provide valuable vascular information but have drawbacks that prevent easy use in some clinical situations. Recently, the laser speckle contrast imaging (LSCI) technique, a noninvasive camera-based tool, was commercialized; it overcomes some of the weaknesses of LDF and LDI. Our paper describes how, using engineering methodologies, LDF, LDI, and LSCI can meet clinicians' challenging needs in assessing vascular function, with a special focus on the state of the art and future trends.
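
    The LSCI measurement mentioned above rests on the spatial speckle contrast, conventionally defined as K = σ/⟨I⟩ over a small sliding window, with lower contrast indicating faster flow. The abstract does not detail the computation, so the following Python sketch is only a generic illustration of that standard quantity; the window size and function name are arbitrary choices:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(image, window=7):
    """Local speckle contrast K = sigma / mean over a sliding square window.
    Regions with faster flow blur the speckle pattern and show lower K."""
    img = np.asarray(image, dtype=float)
    mean = uniform_filter(img, size=window)
    mean_sq = uniform_filter(img ** 2, size=window)
    var = np.clip(mean_sq - mean ** 2, 0.0, None)   # guard against negative variance
    return np.sqrt(var) / (mean + 1e-12)            # avoid division by zero
```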