
    Distributed quantum computing over 7.0 km

    Distributed quantum computing provides a viable approach towards scalable quantum computation, relying on nonlocal quantum gates to connect distant quantum nodes and thereby overcome the limitations of a single device. However, such an approach has so far been realized only within single nodes or between nodes separated by a few tens of meters, falling short of the goal of harnessing computing resources in large-scale quantum networks. Here, we demonstrate distributed quantum computing between two nodes spatially separated by 7.0 km, using stationary qubits based on multiplexed quantum memories, flying qubits at telecom wavelengths, and active feedforward control based on field-deployed fiber. Specifically, we illustrate quantum parallelism by implementing the Deutsch-Jozsa algorithm and the quantum phase estimation algorithm between the two remote nodes. These results represent the first demonstration of distributed quantum computing over metropolitan-scale distances and lay the foundation for the construction of large-scale quantum computing networks relying on existing fiber channels. Comment: 6 pages, 3 figures.
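    As a rough illustration of the Deutsch-Jozsa decision mentioned in the abstract, a minimal single-node sketch in plain Python statevector arithmetic (not the authors' distributed, two-node implementation; the example oracles are hypothetical):

        import numpy as np

        def deutsch_jozsa(f, n):
            """Decide whether f: {0,1}^n -> {0,1} is constant or balanced,
            querying f once in superposition (plain statevector arithmetic)."""
            dim = 2 ** n
            # Hadamards on |0...0>: uniform superposition over all n-bit inputs.
            amps = np.full(dim, 1.0 / np.sqrt(dim))
            # Phase oracle: |x> -> (-1)^f(x) |x>.
            for x in range(dim):
                if f(x):
                    amps[x] = -amps[x]
            # Final Hadamards: the amplitude of |0...0> is the normalized sum.
            amp0 = amps.sum() / np.sqrt(dim)
            # |amp0|^2 is 1 for a constant f and 0 for a balanced f.
            return "constant" if abs(amp0) ** 2 > 0.5 else "balanced"

        # Hypothetical oracles for illustration.
        print(deutsch_jozsa(lambda x: 0, 3))                        # constant
        print(deutsch_jozsa(lambda x: bin(x).count("1") % 2, 3))    # balanced (parity)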

    Inverse Modeling for MEG/EEG data

    We provide an overview of the state of the art in mathematical methods used to reconstruct brain activity from neurophysiological data. After a brief introduction to the mathematics of the forward problem, we discuss standard and recently proposed regularization methods, as well as Monte Carlo techniques for Bayesian inference. We classify the inverse methods according to the underlying source model and discuss their advantages and disadvantages. Finally, we describe an application to the pre-surgical evaluation of epileptic patients. Comment: 15 pages, 1 figure.
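    A minimal sketch of one of the regularization methods such reviews typically cover, a Tikhonov (minimum-norm) estimate, assuming a known lead-field matrix L and sensor data b (toy values only, not from the paper):

        import numpy as np

        def minimum_norm_estimate(L, b, lam):
            """Tikhonov-regularized (minimum-norm) source estimate:
            argmin_x ||L x - b||^2 + lam * ||x||^2, via the regularized normal equations.
            L: (n_sensors, n_sources) lead field; b: (n_sensors,) measurements."""
            n_sources = L.shape[1]
            A = L.T @ L + lam * np.eye(n_sources)
            return np.linalg.solve(A, L.T @ b)

        # Toy example with a random lead field (illustrative values only).
        rng = np.random.default_rng(0)
        L = rng.standard_normal((64, 500))      # 64 sensors, 500 candidate sources
        x_true = np.zeros(500); x_true[42] = 1.0
        b = L @ x_true + 0.01 * rng.standard_normal(64)
        x_hat = minimum_norm_estimate(L, b, lam=1.0)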

    Life cycle assessment of energy consumption and environmental emissions for cornstalk-based ethyl levulinate

    This study analysed the sustainability of fuel-ethyl levulinate (EL) production, along with furfural as a by-product, from cornstalk in China. A life cycle assessment (LCA) was conducted using the SimaPro software to evaluate the energy consumption (EC), greenhouse gas (GHG) and criteria emissions, from cornstalk growth to EL utilisation. The total life cycle EC was found to be 4.54 MJ/MJ EL, of which 94.7% was biomass energy. EC in the EL production stage was the highest, accounting for 96.8% of total EC. Fossil EC in this stage was estimated to be 0.095 MJ/MJ, which also represents the highest fossil EC throughout the life cycle (39.5% of the total). The ratio of biomass to fossil EC over the life cycle was 17.9, indicating good utilisation of renewable energy in cornstalk-based EL production. The net life cycle GHG emissions were 96.6 g CO2-eq/MJ. The EL production stage showed the highest GHG emissions, representing 53.4% of the total positive amount. Criteria emissions of carbon monoxide (CO) and particulates ≤ 10 μm (PM10) showed negative values of -3.15 and -0.72 g/MJ, respectively. Nitrogen oxides (NOx) and sulphur dioxide (SO2) emissions showed positive values of 0.33 and 0.28 g/MJ, respectively, mainly arising from the EL production stage. According to the sensitivity analysis, increasing or removing the cornstalk revenue in the LCA leads to an increase or decrease in the EC and environmental emissions, while burning cornstalk directly in the field results in large increases in emissions of NMVOC, CO, NOx and PM10 but decreases in fossil EC and in SO2 and GHG emissions. This study was supported by the National Natural Science Foundation of China (51506049), the National High Technology Research and Development Program of China (863 Program) (2012AA051802) and the Henan Province Foundation and Advanced Technology Research Project (132300413218).
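    A quick arithmetic cross-check of the quoted energy figures (values taken directly from the abstract; not part of the original LCA model):

        # Cross-check of the reported energy-consumption (EC) figures.
        total_ec = 4.54          # MJ per MJ EL over the whole life cycle
        biomass_share = 0.947    # 94.7% of total EC is biomass energy

        fossil_ec = total_ec * (1 - biomass_share)                    # ~0.24 MJ/MJ fossil EC
        biomass_to_fossil = (total_ec * biomass_share) / fossil_ec    # ~17.9, as reported
        production_fossil_share = 0.095 / fossil_ec                   # ~0.395, i.e. ~39.5%
        print(f"fossil EC {fossil_ec:.2f} MJ/MJ, "
              f"biomass/fossil ratio {biomass_to_fossil:.1f}, "
              f"production-stage share {production_fossil_share:.0%}")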

    Measurement of the Bottom-Strange Meson Mixing Phase in the Full CDF Data Set

    We report a measurement of the bottom-strange meson mixing phase \beta_s using the time evolution of B0_s -> J/\psi (-> \mu+ \mu-) \phi (-> K+ K-) decays in which the quark-flavor content of the bottom-strange meson is identified at production. This measurement uses the full data set of proton-antiproton collisions at sqrt(s) = 1.96 TeV collected by the Collider Detector experiment at the Fermilab Tevatron, corresponding to 9.6 fb-1 of integrated luminosity. We report confidence regions in the two-dimensional space of \beta_s and the B0_s decay-width difference \Delta\Gamma_s, and measure \beta_s in [-\pi/2, -1.51] U [-0.06, 0.30] U [1.26, \pi/2] at the 68% confidence level, in agreement with the standard model expectation. Assuming the standard model value of \beta_s, we also determine \Delta\Gamma_s = 0.068 +- 0.026 (stat) +- 0.009 (syst) ps-1 and the mean B0_s lifetime, \tau_s = 1.528 +- 0.019 (stat) +- 0.009 (syst) ps, which are consistent and competitive with determinations by other experiments. Comment: 8 pages, 2 figures, Phys. Rev. Lett. 109, 171802 (2012).

    Robust estimation of bacterial cell count from optical density

    Optical density (OD) is widely used to estimate the density of cells in liquid culture, but it cannot be compared between instruments without a standardized calibration protocol and is challenging to relate to actual cell count. We address this with an interlaboratory study comparing three simple, low-cost, and highly accessible OD calibration protocols across 244 laboratories, applied to eight strains of constitutive GFP-expressing E. coli. Based on our results, we recommend calibrating OD to estimated cell count using serial dilution of silica microspheres. This approach produces highly precise calibration (95.5% of residuals <1.2-fold), is easily assessed for quality control, also assesses an instrument's effective linear range, and can be combined with fluorescence calibration to obtain units of Molecules of Equivalent Fluorescein (MEFL) per cell, allowing direct comparison and data fusion with flow cytometry measurements. In our study, fluorescence per cell measurements showed only a 1.07-fold mean difference between plate reader and flow cytometry data.
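    In essence, the recommended microsphere calibration reduces to fitting a conversion factor between background-corrected OD and known particle counts from a serial dilution. A minimal sketch with made-up numbers (hypothetical values and function names, not the study's protocol code):

        import numpy as np

        def od_to_particle_factor(particles, od, od_blank):
            """Least-squares slope (through the origin) relating background-corrected
            OD to known microsphere counts in a serial dilution (particles per OD unit)."""
            od_net = np.asarray(od) - od_blank
            particles = np.asarray(particles)
            return float(particles @ od_net / (od_net @ od_net))

        # Illustrative two-fold dilution series (made-up values).
        particles = np.array([3.0e8, 1.5e8, 7.5e7, 3.75e7])   # known beads per well
        od = np.array([0.820, 0.415, 0.212, 0.110])           # measured OD600
        factor = od_to_particle_factor(particles, od, od_blank=0.04)
        cells_estimate = factor * (0.350 - 0.04)               # sample OD -> estimated count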

    Use of anticoagulants and antiplatelet agents in stable outpatients with coronary artery disease and atrial fibrillation. International CLARIFY registry


    Guidelines for the use and interpretation of assays for monitoring autophagy (4th edition)


    Guidelines for the use and interpretation of assays for monitoring autophagy (3rd edition)

    In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy, and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to delete or knock down more than one autophagy-related gene. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways, so not all Atg proteins can be used as specific markers for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.

    Ptychography

    Ptychography is a computational imaging technique. A detector records an extensive data set consisting of many interference patterns obtained as an object is displaced to various positions relative to an illumination field. A computer algorithm of some type is then used to invert these data into an image. Ptychography has three key advantages: it does not depend upon a good-quality lens, or indeed on using any lens at all; it can obtain the image wave in phase as well as in intensity; and it can self-calibrate, in the sense that errors that arise in the experimental setup can be accounted for and their effects removed. Its transfer function is in theory perfect, with resolution being wavelength limited. Although the main concepts of ptychography were developed many years ago, it has only recently (over the last 10 years) become widely adopted. This chapter surveys visible-light, x-ray, electron, and EUV ptychography as applied to microscopic imaging. It describes the principal experimental arrangements used at these various wavelengths. It reviews the most common inversion algorithms that are nowadays employed, giving examples of meta code to implement these. It describes, for those new to the field, how to avoid the most common pitfalls in obtaining good-quality reconstructions. It also discusses more advanced techniques such as modal decomposition and strategies to cope with three-dimensional (3D) multiple scattering.
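    The inversion algorithms surveyed in such chapters are mostly iterative projection or gradient schemes. As an illustration only (a simplified, non-optimized sketch of the widely used ePIE-style update, not the chapter's own meta code), one scan-position update might look like:

        import numpy as np

        def epie_update(obj, probe, diffraction, top_left, alpha=1.0, beta=1.0):
            """One ePIE-style update for a single scan position.
            obj: complex 2-D object estimate; probe: complex 2-D probe estimate;
            diffraction: measured far-field intensity for this position;
            top_left: (row, col) of the probe window within the object array."""
            r, c = top_left
            h, w = probe.shape
            obj_patch = obj[r:r + h, c:c + w]

            # Exit wave and its modelled far field.
            exit_wave = obj_patch * probe
            ft = np.fft.fft2(exit_wave)
            # Replace the modulus with the measured amplitude, keep the phase.
            ft_corr = np.sqrt(diffraction) * np.exp(1j * np.angle(ft))
            exit_new = np.fft.ifft2(ft_corr)
            diff = exit_new - exit_wave

            # Object and probe updates.
            obj[r:r + h, c:c + w] += alpha * np.conj(probe) * diff / (np.abs(probe).max() ** 2)
            probe += beta * np.conj(obj_patch) * diff / (np.abs(obj_patch).max() ** 2)
            return obj, probe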