
    Failure Bounding And Sensitivity Analysis Applied To Monte Carlo Entry, Descent, And Landing Simulations

    In the study of entry, descent, and landing (EDL), Monte Carlo sampling methods are often employed to study the uncertainty in the designed trajectory. The large number of uncertain inputs and outputs, coupled with complicated non-linear models, can make interpretation of the results difficult. Three methods that provide statistical insight are applied to an entry, descent, and landing simulation, and the advantages and disadvantages of each are discussed in terms of the insight gained versus the computational cost. The first method investigated was failure domain bounding, which aims to reduce the computational cost of assessing the failure probability. Next, a variance-based sensitivity analysis was studied for its ability to identify which input's uncertainty has the greatest impact on the uncertainty of an output. Finally, probabilistic sensitivity analysis was used to calculate certain sensitivities at a reduced computational cost. These methods produce valuable information that identifies critical mission parameters and needs for new technology, but generally at significant computational cost.
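The variance-based sensitivity analysis described above can be sketched with a pick-freeze Monte Carlo estimator of first-order Sobol' indices. The three-input model below is the standard Ishigami benchmark function, a cheap stand-in for the paper's EDL simulation; the sample size and inputs are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Ishigami test function: a standard benchmark in sensitivity analysis,
    # standing in for a (far more expensive) EDL trajectory simulation.
    return np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1])**2 \
        + 0.1 * x[:, 2]**4 * np.sin(x[:, 0])

n, d = 100_000, 3
a = rng.uniform(-np.pi, np.pi, (n, d))   # base sample
b = rng.uniform(-np.pi, np.pi, (n, d))   # independent resample
f_a, f_b = model(a), model(b)
var_y = np.concatenate([f_a, f_b]).var()

s1 = []
for i in range(d):
    ab = a.copy()
    ab[:, i] = b[:, i]                   # "pick-freeze": swap only input i
    # Saltelli-style first-order estimator: fraction of output variance
    # attributable to input i alone.
    s1.append(np.mean(f_b * (model(ab) - f_a)) / var_y)

print([round(s, 2) for s in s1])
```

For this test function the first two inputs dominate the output variance while the third has a near-zero first-order effect; this kind of ranking is what identifies critical mission parameters at the cost of many model evaluations.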

    Guided Quasicontinuous Atom Laser

    We report the first realization of a guided quasicontinuous atom laser by rf outcoupling a Bose-Einstein condensate from a hybrid optomagnetic trap into a horizontal atomic waveguide. This configuration allows us to cancel the acceleration due to gravity and keep the de Broglie wavelength constant at 0.5 μm during 0.1 s of propagation. We also show that our configuration, equivalent to pigtailing an optical fiber to a (photon) semiconductor laser, ensures an intrinsically good transverse mode matching.
    Comment: version published in Phys. Rev. Lett. 97, 200402 (2006).
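As a quick consistency check on the quoted wavelength, the atom velocity implied by λ = h/(mv) can be computed directly. The choice of ⁸⁷Rb is an assumption (the abstract does not name the species); the numbers are illustrative.

```python
# Velocity corresponding to a 0.5 um de Broglie wavelength,
# assuming 87Rb atoms (species is an assumption, not from the abstract).
h = 6.62607015e-34            # Planck constant, J*s
m = 86.909 * 1.66053907e-27   # mass of one 87Rb atom, kg
lam = 0.5e-6                  # de Broglie wavelength, m

v = h / (m * lam)             # lambda = h / (m v)  =>  v = h / (m lambda)
print(round(v * 1e3, 1), "mm/s")
```

A constant wavelength during 0.1 s of propagation therefore corresponds to a constant millimetre-per-second-scale velocity, which is why cancelling gravity in the waveguide matters.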

    Space-Based Sentinels for Measurement of Infrared Cooling in the Thermosphere for Space Weather Nowcasting

    Infrared radiative cooling by nitric oxide (NO) and carbon dioxide (CO2) modulates the thermosphere's density and thermal response to geomagnetic storms. Satellite tracking and collision-avoidance planning require accurate density forecasts during these events. Over the past several years, failed density forecasts have been tied to the onset of rapid and significant cooling due to production of NO and its associated radiative cooling via emission of infrared radiation at 5.3 μm. These results have been diagnosed, after the fact, through analyses of infrared cooling measurements made by the Sounding of the Atmosphere using Broadband Emission Radiometry instrument, now in orbit for over 16 years on the National Aeronautics and Space Administration Thermosphere, Ionosphere, Mesosphere Energetics and Dynamics satellite. Radiative cooling rates for NO and CO2 have further been shown to be directly correlated with composition and exospheric temperature changes during geomagnetic storms. These results strongly suggest that a network of smallsats observing the infrared radiative cooling of the thermosphere could serve as space weather sentinels, observing and providing radiative cooling rate data in real time to generate nowcasts of density and aerodynamic drag on space vehicles. Currently, radiative cooling is not directly considered in operational space weather forecast models. In addition, recent research has shown that different geomagnetic storm types generate substantially different infrared radiative responses and, hence, substantially different thermospheric density responses. The ability to identify these storms, and to measure and predict the Earth's response to them, should enable substantial improvement in thermospheric density forecasts.

    Theory of mirror benchmarking and demonstration on a quantum computer

    A new class of protocols called mirror benchmarking was recently proposed to measure the system-level performance of quantum computers. These protocols involve circuits with random sequences of gates followed by mirroring, that is, inverting each gate in the sequence. We give a simple proof that mirror benchmarking leads to an exponential decay of the survival probability with sequence length, under the uniform noise assumption, provided the twirling group forms a 2-design. The decay rate is determined by a quantity that is a quadratic function of the error channel and, for certain types of errors, is equal to the unitarity. This result yields a new method for estimating the coherence of noise. We present data from mirror benchmarking experiments run on the Honeywell System Model H1. These data constitute a set of performance curves, indicating the success probability for random circuits as a function of qubit number and circuit depth.
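The exponential decay established above is what makes the protocol practical: the decay rate can be extracted from a simple fit. The sketch below generates synthetic survival probabilities and recovers the per-layer decay parameter; the values of the decay parameter, SPAM scale, and shot count are assumed for illustration, not Honeywell H1 results.

```python
import numpy as np

# Sequence lengths at which survival probability is measured.
lengths = np.array([2, 4, 8, 16, 32, 64])
u_true, a_true = 0.995, 0.98   # assumed per-layer decay and SPAM prefactor
shots = 2000                   # assumed repetitions per circuit length
rng = np.random.default_rng(1)

# Survival probability decays exponentially with sequence length:
#   p(L) = a * u**L, sampled here with binomial shot noise.
p_meas = rng.binomial(shots, a_true * u_true**lengths) / shots

# log p(L) = log a + L log u, so a log-linear least-squares fit
# recovers the decay parameter u from the slope.
slope, intercept = np.polyfit(lengths, np.log(p_meas), 1)
u_fit = np.exp(slope)
print(round(u_fit, 4))
```

Because the decay rate is a quadratic function of the error channel (equal to the unitarity for certain errors), a fitted `u` of this kind is what feeds the coherence-of-noise estimate described in the abstract.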

    Measuring the Loschmidt amplitude for finite-energy properties of the Fermi-Hubbard model on an ion-trap quantum computer

    Calculating the equilibrium properties of condensed matter systems is one of the promising applications of near-term quantum computing. Recently, hybrid quantum-classical time-series algorithms have been proposed to efficiently extract these properties from a measurement of the Loschmidt amplitude ⟨ψ|e^(−iĤt)|ψ⟩ for initial states |ψ⟩ and a time evolution under the Hamiltonian Ĥ up to short times t. In this work, we study the operation of this algorithm on a present-day quantum computer. Specifically, we measure the Loschmidt amplitude for the Fermi-Hubbard model on a 16-site ladder geometry (32 orbitals) on the Quantinuum H2-1 trapped-ion device. We assess the effect of noise on the Loschmidt amplitude and implement algorithm-specific error mitigation techniques. Using an error model motivated by these results, we numerically analyze the influence of noise on the full operation of the quantum-classical algorithm by measuring expectation values of local observables at finite energies. Finally, we estimate the resources needed for scaling up the algorithm.
    Comment: 18 pages, 12 figures.
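The measured quantity can be reproduced classically for a toy system via exact diagonalization: expanding |ψ⟩ in the eigenbasis of Ĥ gives ⟨ψ|e^(−iĤt)|ψ⟩ = Σ_k |c_k|² e^(−iE_k t). The three-qubit transverse-field Ising Hamiltonian below is a small stand-in for the 32-orbital Fermi-Hubbard ladder, and all couplings are assumed for illustration.

```python
import numpy as np

# Pauli matrices and a Kronecker-product helper.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def kron(*ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Toy 3-qubit transverse-field Ising Hamiltonian (couplings assumed),
# standing in for the Fermi-Hubbard ladder of the experiment.
H = -(kron(Z, Z, I2) + kron(I2, Z, Z)) \
    - 0.5 * (kron(X, I2, I2) + kron(I2, X, I2) + kron(I2, I2, X))

psi = np.zeros(8, dtype=complex)
psi[0] = 1.0                           # initial state |000>

# Loschmidt amplitude G(t) = <psi| exp(-i H t) |psi>:
# expand |psi> in eigenstates, then sum phases weighted by overlaps.
evals, evecs = np.linalg.eigh(H)
c = evecs.conj().T @ psi               # overlaps with eigenstates

def loschmidt(t):
    return np.sum(np.abs(c)**2 * np.exp(-1j * evals * t))

print(abs(loschmidt(0.0)))             # = 1 at t = 0, before any evolution
```

The time-series algorithms referenced in the abstract work from exactly such values G(t) at short times; on hardware, each G(t) is estimated from interferometric measurements rather than computed from eigenvalues.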

    Convergent Antibody Responses to SARS-CoV-2 Infection in Convalescent Individuals

    During the COVID-19 pandemic, SARS-CoV-2 infected millions of people and claimed hundreds of thousands of lives. Virus entry into cells depends on the receptor binding domain (RBD) of the SARS-CoV-2 spike protein (S). Although there is no vaccine, it is likely that antibodies will be essential for protection. However, little is known about the human antibody response to SARS-CoV-2. Here we report on 149 COVID-19 convalescent individuals. Plasmas collected an average of 39 days after the onset of symptoms had variable half-maximal pseudovirus neutralizing titres: less than 1:50 in 33% and below 1:1,000 in 79%, while only 1% showed titres above 1:5,000. Antibody sequencing revealed expanded clones of RBD-specific memory B cells expressing closely related antibodies in different individuals. Despite low plasma titres, antibodies to three distinct epitopes on RBD neutralized at half-maximal inhibitory concentrations (IC₅₀ values) as low as single-digit nanograms per millilitre. Thus, most convalescent plasmas obtained from individuals who recover from COVID-19 do not contain high levels of neutralizing activity. Nevertheless, rare but recurring RBD-specific antibodies with potent antiviral activity were found in all individuals tested, suggesting that a vaccine designed to elicit such antibodies could be broadly effective.