
    Implementation of three-qubit Toffoli gate in a single step

    Single-step implementations of multi-qubit gates are generally believed to provide a simpler design, faster operation, and lower decoherence. For three coupled qubits interacting with a photon field, we investigate a realizable scheme for a single-step Toffoli gate. We find that the three-qubit system can be described by four effective modified Jaynes-Cummings models, one for each state of the two control qubits. Within the rotating-wave approximation, the modified Jaynes-Cummings models reduce to conventional Jaynes-Cummings models with renormalized couplings between the qubits and the photon field. A single-step Toffoli gate is shown to be realizable by tuning the four characteristic oscillation periods to satisfy a commensurability condition, and possible values of the system parameters are estimated. Further, from numerical calculations we discuss operation errors of the single-step Toffoli gate due to imperfections in the system parameters, which shows that a Toffoli gate with high fidelity can be obtained by adjusting pairs of the photon-qubit and qubit-qubit coupling strengths. In addition, we discuss the effect of decoherence from a thermal reservoir on the Toffoli gate operation.
    Comment: 8 pages, 4 figures, to appear in PR
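    As a point of reference for the gate being implemented, the target three-qubit unitary can be written down directly. The sketch below shows only the generic Toffoli truth table, not the paper's physical scheme; the basis ordering |c1 c2 t> → index 4·c1 + 2·c2 + t is an assumption made for illustration.

    ```python
    import numpy as np

    # Toffoli (CCNOT): flip the target qubit only when both control
    # qubits are |1>. Any single-step implementation must reproduce
    # this 8x8 unitary (up to phases) after one evolution period.
    TOFFOLI = np.eye(8)
    TOFFOLI[6:8, 6:8] = [[0, 1], [1, 0]]  # swap |110> <-> |111>

    # Basis index for |c1 c2 t> is 4*c1 + 2*c2 + t (assumed ordering).
    state = np.zeros(8)
    state[6] = 1.0                # prepare |110>
    out = TOFFOLI @ state
    assert out[7] == 1.0          # |110> -> |111>
    assert np.allclose(TOFFOLI @ TOFFOLI, np.eye(8))  # self-inverse
    ```

    The commensurability condition in the abstract is what makes all four effective Jaynes-Cummings sectors return to (or complete) their oscillations simultaneously, so that the composite evolution matches this matrix in a single step.
    
    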

    Comparing the impact of environmental conditions and microphysics on the forecast uncertainty of deep convective clouds and hail

    Severe hailstorms have the potential to damage buildings and crops. However, important processes for the prediction of hailstorms are insufficiently represented in operational weather forecast models. Therefore, our goal is to identify model input parameters describing environmental conditions and cloud microphysics, such as the vertical wind shear and the strength of ice multiplication, which lead to large uncertainties in the prediction of deep convective clouds and precipitation. We conduct a comprehensive sensitivity analysis simulating deep convective clouds in an idealized setup of a cloud-resolving model. We use statistical emulation and variance-based sensitivity analysis to enable a Monte Carlo sampling of the model outputs across the multi-dimensional parameter space. The results show that the dynamical and microphysical properties of the model are sensitive to both the environmental and microphysical uncertainties in the model. The microphysical parameters lead to larger uncertainties in the output of integrated hydrometeor mass contents and precipitation variables. In particular, the uncertainty in the fall velocities of graupel and hail accounts for more than 65 % of the variance of all considered precipitation variables and for 30 %–90 % of the variance of the integrated hydrometeor mass contents. In contrast, variations in the environmental parameters – the range of which is limited to represent model uncertainty – mainly affect the vertical profiles of the diabatic heating rates.
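    The variance-based sensitivity analysis described above can be sketched in miniature. The stand-in "emulator" below is a toy additive function, not the cloud-resolving model or a fitted surrogate; the pick-freeze (Saltelli-style) Monte Carlo estimator of the first-order Sobol index is the standard technique, shown here under that assumption.

    ```python
    import numpy as np

    # First-order Sobol indices S_i = Var(E[Y|X_i]) / Var(Y), estimated
    # by the pick-freeze Monte Carlo scheme. The toy additive function
    # y = 2*x1 + x2 stands in for the statistical emulator; for it the
    # exact indices are S1 = 0.8 and S2 = 0.2.
    rng = np.random.default_rng(0)

    def emulator(x):                        # placeholder, NOT the real model
        return 2.0 * x[:, 0] + x[:, 1]

    n, d = 200_000, 2
    A, B = rng.uniform(size=(n, d)), rng.uniform(size=(n, d))
    yA, yB = emulator(A), emulator(B)
    var_y = np.var(np.concatenate([yA, yB]))

    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                 # vary only parameter i
        S[i] = np.mean(yB * (emulator(ABi) - yA)) / var_y

    assert abs(S[0] - 0.8) < 0.02 and abs(S[1] - 0.2) < 0.02
    ```

    In the study, the expensive cloud model is replaced by a cheap statistical emulator fitted to a limited set of simulations, which is what makes this kind of dense Monte Carlo sampling of the parameter space affordable.
    
    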

    Pathways to clinical CLARITY: volumetric analysis of irregular, soft, and heterogeneous tissues in development and disease

    Three-dimensional tissue-structural relationships are not well captured by typical thin-section histology, posing challenges for the study of tissue physiology and pathology. Moreover, while recent progress has been made with intact methods for clearing, labeling, and imaging whole organs such as the mature brain, these approaches are generally unsuitable for soft, irregular, and heterogeneous tissues that account for the vast majority of clinical samples and biopsies. Here we develop a biphasic hydrogel methodology, which along with automated analysis, provides for high-throughput quantitative volumetric interrogation of spatially irregular and friable tissue structures. We validate and apply this approach in the examination of a variety of developing and diseased tissues, with specific focus on the dynamics of normal and pathological pancreatic innervation and development, including in clinical samples. Quantitative advantages of the intact-tissue approach were demonstrated compared to conventional thin-section histology, pointing to broad applications in both research and clinical settings.

    Using Emulators to Understand the Sensitivity of Deep Convective Clouds and Hail to Environmental Conditions

    This study aims to identify model parameters describing atmospheric conditions such as wind shear and cloud condensation nuclei (CCN) concentration, which lead to large uncertainties in the prediction of deep convective clouds. In an idealized setup of a cloud-resolving model including a two-moment microphysics scheme, we use the approach of statistical emulation to allow for a Monte Carlo sampling of the parameter space, which enables a comprehensive sensitivity analysis. We analyze the impact of six uncertain input parameters on cloud properties (vertically integrated content of six hydrometeor classes), precipitation, and the size distribution of hail. Furthermore, we investigate whether the sensitivities are robust for different trigger mechanisms of convection. We find that the uncertainties of most cloud and precipitation outputs are dominated by the uncertainty in the temperature profile and the CCN concentration, while the contributions of other input parameters to the uncertainties may vary. The temperature profile is also an important factor in determining the size distribution of surface hail. We also notice that the sensitivities of cloud water and hail to the CCN concentration depend on environmental conditions. Our results show that depending on the choice of the trigger mechanism, the contribution of the input parameters to the uncertainty varies, which means that studies with different trigger mechanisms might not be comparable. Overall, the emulator approach appears to be a powerful tool for the analysis of complex weather prediction models in an idealized setup.

    Measuring measurement

    Measurement connects the world of quantum phenomena to the world of classical events. It plays both a passive role, observing quantum systems, and an active one, preparing quantum states and controlling them. Surprisingly - in the light of the central status of measurement in quantum mechanics - there is no general recipe for designing a detector that measures a given observable. Compounding this, the characterization of existing detectors is typically based on partial calibrations or elaborate models. Thus, experimental specification (i.e. tomography) of a detector is of fundamental and practical importance. Here, we present the realization of quantum detector tomography: we identify the optimal positive-operator-valued measure describing the detector, with no ancillary assumptions. This result completes the triad, state, process, and detector tomography, required to fully specify an experiment. We characterize an avalanche photodiode and a photon number resolving detector capable of detecting up to eight photons. This creates a new set of tools for accurately detecting and preparing non-classical light.
    Comment: 6 pages, 4 figures, see video abstract at http://www.quantiki.org/video_abstracts/0807244
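    The core idea of detector tomography — pinning down the POVM from measured statistics on a set of known probe states — can be illustrated on a single qubit. The two-outcome "detector" model and the Pauli-basis least-squares inversion below are illustrative assumptions, not the photon-number apparatus of the experiment.

    ```python
    import numpy as np

    # Detector tomography in miniature: record outcome probabilities
    # p_k = Tr(rho_k Pi) for known probe states rho_k, then invert
    # linearly for the POVM element Pi.
    I = np.eye(2)
    X = np.array([[0, 1], [1, 0]], complex)
    Y = np.array([[0, -1j], [1j, 0]])
    Z = np.diag([1.0, -1.0])
    paulis = [I, X, Y, Z]

    # An assumed noisy two-outcome detector: mostly a |0> projector.
    Pi0_true = 0.9 * np.diag([1.0, 0.0]) + 0.05 * I   # diag(0.95, 0.05)

    # Probe states: the six Pauli eigenstates (informationally complete).
    vecs = [np.array(v, complex) for v in
            [[1, 0], [0, 1], [1, 1], [1, -1], [1, 1j], [1, -1j]]]
    probes = [np.outer(v, v.conj()) / np.vdot(v, v) for v in vecs]

    # Noise-free simulated statistics, then least squares in the Pauli basis.
    p0 = np.array([np.trace(r @ Pi0_true).real for r in probes])
    A = np.array([[np.trace(r @ P).real for P in paulis] for r in probes])
    coeffs, *_ = np.linalg.lstsq(A, p0, rcond=None)
    Pi0_rec = sum(c * P for c, P in zip(coeffs, paulis))
    assert np.allclose(Pi0_rec, Pi0_true, atol=1e-8)
    ```

    In a real experiment the probabilities carry statistical noise, so the inversion is typically done as a constrained optimization that keeps the reconstructed POVM elements positive and summing to the identity.
    
    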

    Single electron emission in two-phase xenon with application to the detection of coherent neutrino-nucleus scattering

    We present an experimental study of single electron emission in ZEPLIN-III, a two-phase xenon experiment built to search for dark matter WIMPs, and discuss applications enabled by the excellent signal-to-noise ratio achieved in detecting this signature. Firstly, we demonstrate a practical method for precise measurement of the free electron lifetime in liquid xenon during normal operation of these detectors. Then, using a realistic detector response model and backgrounds, we assess the feasibility of deploying such an instrument for measuring coherent neutrino-nucleus elastic scattering using the ionisation channel in the few-electron regime. We conclude that it should be possible to measure this elusive neutrino signature above an ionisation threshold of ~3 electrons both at a stopped pion source and at a nuclear reactor. Detectable signal rates are larger in the reactor case, but the triggered measurement and harder recoil energy spectrum afforded by the accelerator source enable lower overall background and fiducialisation of the active volume.
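    The free-electron-lifetime measurement mentioned above rests on a simple attenuation law: the charge signal from an event at drift time t is reduced by electron capture on impurities as S2(t) = S2(0)·exp(-t/τ). The sketch below fits τ from synthetic (drift time, signal) pairs; all numbers are made up for illustration and are not ZEPLIN-III values.

    ```python
    import numpy as np

    # Electron-lifetime fit: attenuation of the ionisation (S2) signal
    # with drift time follows S2(t) = S2(0) * exp(-t / tau), so a
    # log-linear fit of S2 against t recovers tau.
    rng = np.random.default_rng(0)
    tau_true = 100.0                        # assumed lifetime, microseconds
    t = rng.uniform(5.0, 80.0, size=2000)   # drift times, microseconds
    s2 = 500.0 * np.exp(-t / tau_true) * rng.normal(1.0, 0.02, size=t.size)

    slope, intercept = np.polyfit(t, np.log(s2), 1)
    tau_fit = -1.0 / slope
    assert abs(tau_fit - tau_true) / tau_true < 0.05
    ```

    Because every event carries its own drift time, this calibration can run continuously during normal data taking, which is the practical advantage the abstract points to.
    
    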

    Adequate debridement and drainage of the mediastinum using open thoracotomy or video-assisted thoracoscopic surgery for Boerhaave’s syndrome

    Background: Boerhaave's syndrome has a high mortality rate (14-40%). Surgical treatment varies from a minimal approach consisting of adequate debridement with drainage of the mediastinum and pleural cavity to esophageal resection. This study compared the results between a previously preferred open minimal approach and a video-assisted thoracoscopic surgery (VATS) procedure currently considered the method of choice.
    Methods: In this study, 12 consecutive patients treated with a historical nonresectional drainage approach (1985-2001) were compared with 12 consecutive patients treated prospectively after the introduction of VATS during the period 2002-2009. Baseline characteristics were equally distributed between the two groups.
    Results: In the prospective group, 2 of the 12 patients had the VATS procedure converted to an open thoracotomy, and 2 additional patients were treated by open surgery. In the prospective group, 8 patients experienced postoperative complications compared with all 12 patients in the historical control group. Four patients (17%), two in each group, underwent reoperation. Six patients, three in each group, were readmitted to the hospital. The overall in-hospital mortality was 8% (1 patient in each group), which compares favorably with other reports (7-27%) based on drainage alone.
    Conclusions: Adequate surgical debridement with drainage of the mediastinum and pleural cavity resulted in a low mortality rate. The results for VATS in this relatively small series were comparable with those for an open thoracotomy.

    Efficient quantum state tomography

    Quantum state tomography, the ability to deduce the state of a quantum system from measured data, is the gold standard for verification and benchmarking of quantum devices. It has been realized in systems with few components, but for larger systems it becomes infeasible because the number of quantum measurements and the amount of computation required to process them grow exponentially in the system size. Here we show that we can do exponentially better than direct state tomography for a wide range of quantum states, in particular those that are well approximated by a matrix product state ansatz. We present two schemes for tomography in 1-D quantum systems and touch on generalizations. One scheme requires unitary operations on a constant number of subsystems, while the other requires only local measurements together with more elaborate post-processing. Both schemes rely only on a linear number of experimental operations and classical post-processing that is polynomial in the system size. A further strength of the methods is that the accuracy of the reconstructed states can be rigorously certified without any a priori assumptions.
    Comment: 9 pages, 4 figures. Combines many of the results in arXiv:1002.3780, arXiv:1002.3839, and arXiv:1002.4632 into one unified exposition
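    The reason matrix product states make tomography tractable is parameter counting: an n-site MPS with bond dimension D has only O(n·D²) parameters, versus 2ⁿ amplitudes for a generic state vector. As a self-contained illustration (not the paper's reconstruction schemes), the GHZ state has an exact bond-dimension-2 MPS, which the sketch below contracts back into amplitudes.

    ```python
    import numpy as np

    # GHZ state as a matrix product state with bond dimension 2: one pair
    # of 2x2 matrices (A0, A1) shared by every site. Contracting the chain
    # for a bitstring gives that bitstring's amplitude.
    n = 10
    A0 = np.array([[1.0, 0.0], [0.0, 0.0]])   # physical index 0
    A1 = np.array([[0.0, 0.0], [0.0, 1.0]])   # physical index 1

    def ghz_amplitude(bits):
        """Unnormalized amplitude <bits|GHZ> from the MPS contraction."""
        M = np.eye(2)
        for b in bits:
            M = M @ (A0 if b == 0 else A1)
        return np.trace(M)

    amps = np.array([ghz_amplitude([(i >> k) & 1 for k in range(n)])
                     for i in range(2 ** n)])
    amps /= np.linalg.norm(amps)

    # Only the all-zeros and all-ones bitstrings survive the contraction.
    assert np.count_nonzero(amps) == 2
    assert np.isclose(abs(amps[0]), 1 / np.sqrt(2))
    ```

    Here 10 qubits need 1024 amplitudes but only a handful of MPS entries; the schemes in the paper exploit the same compression to get by with linearly many measurements.
    
    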

    Hadronic Mass Moments in Inclusive Semileptonic B Meson Decays

    We have measured the first and second moments of the hadronic mass-squared distribution in B -> X_c l nu, for P(lepton) > 1.5 GeV/c. We find <M_X^2 - M_D[Bar]^2> = 0.251 +- 0.066 GeV^2 and <(M_X^2 - <M_X^2>)^2> = 0.576 +- 0.170 GeV^4, where M_D[Bar] is the spin-averaged D meson mass. From that first moment and the first moment of the photon energy spectrum in b -> s gamma, we find the HQET parameter lambda_1 (MS[Bar], to order 1/M^3 and beta_0 alpha_s^2) to be -0.24 +- 0.11 GeV^2. Using these first moments and the B semileptonic width, and assuming parton-hadron duality, we obtain |V_cb| = 0.0404 +- 0.0013.
    Comment: 11 pages postscript, also available through http://w4.lns.cornell.edu/public/CLNS, submitted to PR
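    The two quoted observables are the first moment of M_X^2 - M_Dbar^2 and the second central moment of M_X^2. The sketch below computes both from a synthetic Gaussian sample tuned only to reproduce the quoted central values; it is not the experimental data, and M_Dbar ≈ 1.973 GeV is an approximate value for the spin-averaged D meson mass assumed for illustration.

    ```python
    import numpy as np

    # Moments of the hadronic mass-squared distribution:
    #   first moment      <M_X^2 - M_Dbar^2>
    #   second central    <(M_X^2 - <M_X^2>)^2>
    rng = np.random.default_rng(1)
    m_dbar2 = 1.973 ** 2                      # GeV^2, assumed M_Dbar value
    mx2 = rng.normal(m_dbar2 + 0.251,         # toy sample centered on the
                     np.sqrt(0.576),          # quoted moments, in GeV^2
                     size=100_000)

    first_moment = np.mean(mx2 - m_dbar2)             # -> ~0.251 GeV^2
    second_central = np.mean((mx2 - mx2.mean()) ** 2) # -> ~0.576 GeV^4
    ```

    In the analysis these moments are then matched to their heavy-quark-expansion expressions to extract lambda_1 and, together with the semileptonic width, |V_cb|.
    
    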

    A Genome-Wide Approach to Discovery of Small RNAs Involved in Regulation of Virulence in Vibrio cholerae

    Small RNAs (sRNAs) are becoming increasingly recognized as important regulators in bacteria. To investigate the contribution of sRNA-mediated regulation to virulence in Vibrio cholerae, we performed high-throughput sequencing of cDNA generated from sRNA transcripts isolated from a strain ectopically expressing ToxT, the major transcriptional regulator within the virulence gene regulon. We compared this data set with ToxT binding sites determined by pulldown and deep sequencing to identify sRNA promoters directly controlled by ToxT. Analysis of the resulting transcripts with ToxT binding sites in cis revealed two sRNAs within the Vibrio Pathogenicity Island. When deletions of these sRNAs were made and the resulting strains were competed against the parental strain in the infant mouse model of V. cholerae colonization, one, TarB, displayed a variable colonization phenotype dependent on its physiological state at the time of inoculation. We identified a target of TarB as the mRNA for the secreted colonization factor, TcpF. We verified negative regulation of TcpF expression by TarB and, using point mutations that disrupted interaction between TarB and tcpF mRNA, showed that loss of this negative regulation was primarily responsible for the colonization phenotype observed in the TarB deletion mutant.