
    Control of quantum phenomena: Past, present, and future

    Quantum control is concerned with active manipulation of physical and chemical processes on the atomic and molecular scale. This work presents a perspective on progress in the field of control over quantum phenomena, tracing the evolution of theoretical concepts and experimental methods from early developments to the most recent advances. The current experimental successes would be impossible without the development of intense femtosecond laser sources and pulse shapers. The two most critical theoretical insights were (1) realizing that ultrafast atomic and molecular dynamics can be controlled via manipulation of quantum interferences and (2) understanding that optimally shaped ultrafast laser pulses are the most effective means for producing the desired quantum interference patterns in the controlled system. Finally, these theoretical and experimental advances were brought together by the crucial concept of adaptive feedback control, which is a laboratory procedure employing measurement-driven, closed-loop optimization to identify the best shapes of femtosecond laser control pulses for steering quantum dynamics towards the desired objective. Optimization in adaptive feedback control experiments is guided by a learning algorithm, with stochastic methods proving to be especially effective. Adaptive feedback control of quantum phenomena has found numerous applications in many areas of the physical and chemical sciences, and this paper reviews the extensive experiments. Other subjects discussed include quantum optimal control theory, quantum control landscapes, the role of theoretical control designs in experimental realizations, and real-time quantum feedback control. The paper concludes with a perspective on open research directions that are likely to attract significant attention in the future. Comment: Review article, final version (significantly updated), 76 pages, accepted for publication in New J. Phys. (Focus issue: Quantum control)
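    The closed-loop logic of adaptive feedback control (propose shaped pulses, measure the outcome, let a stochastic learning algorithm refine the shapes) can be sketched in miniature. The surrogate objective `measure_yield`, the parameter count, and the evolutionary settings below are illustrative assumptions, not the experimental setup:

```python
import random

def measure_yield(phases):
    # Hypothetical stand-in for a laboratory measurement: rewards phase
    # profiles close to an (unknown to the optimizer) target pattern.
    target = [0.3, -1.2, 0.8, 2.0]
    return -sum((p - t) ** 2 for p, t in zip(phases, target))

def adaptive_feedback_control(n_params=4, population=20, generations=100, seed=0):
    """Closed-loop optimization: each 'pulse shape' is a vector of phase
    parameters; a simple evolutionary strategy keeps the best-measured
    shapes and mutates them to propose the next generation."""
    rng = random.Random(seed)
    pool = [[rng.uniform(-3.14, 3.14) for _ in range(n_params)]
            for _ in range(population)]
    for _ in range(generations):
        pool.sort(key=measure_yield, reverse=True)
        parents = pool[: population // 2]              # keep the fittest pulses
        children = [[p + rng.gauss(0, 0.1) for p in rng.choice(parents)]
                    for _ in range(population - len(parents))]
        pool = parents + children                      # next generation
    return max(pool, key=measure_yield)

best = adaptive_feedback_control()
```

    In the laboratory the call to `measure_yield` would be replaced by an actual pulse-shaper setting followed by a product-yield measurement, which is what makes the loop measurement-driven.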

    Magnetic turbulence in the plasma sheet

    Small-scale magnetic turbulence observed by the Cluster spacecraft in the plasma sheet is investigated by means of a wavelet estimator suitable for detecting distinct scaling characteristics even in noisy measurements. The spectral estimators used for this purpose are affected by a frequency dependent bias. The variances of the wavelet coefficients, however, match the power-law shaped spectra, which makes the wavelet estimator essentially unbiased. These scaling characteristics of the magnetic field data appear to be essentially non-steady and intermittent. The scaling properties of bursty bulk flow (BBF) and non-BBF associated magnetic fluctuations are analysed with the aim of understanding processes of energy transfer between scales. Small-scale (\sim 0.08-0.3 s) magnetic fluctuations having the same scaling index \alpha \sim 2.6 as the large-scale (\sim 0.7-5 s) magnetic fluctuations occur during BBF-associated periods. During non-BBF associated periods the energy transfer to small scales is absent, and the large-scale scaling index \alpha \sim 1.7 is closer to Kraichnan or Iroshnikov-Kraichnan scalings. The anisotropy characteristics of magnetic fluctuations show both scale-dependent and scale-independent behavior. The former can be partly explained in terms of the Goldreich-Sridhar model of MHD turbulence, which leads to the picture of Alfv\'{e}nic turbulence parallel and of eddy turbulence perpendicular to the mean magnetic field direction. Nonetheless, other physical mechanisms, such as transverse magnetic structures, velocity shears, or boundary effects can contribute to the anisotropy characteristics of plasma sheet turbulence. The scale-independent features are related to anisotropy characteristics which occur during a period of magnetic reconnection and fast tailward flow. Comment: 32 pages, 12 figures
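    The core of a wavelet-variance scaling estimate of this kind (log-variance of detail coefficients growing linearly with scale index, slope equal to the spectral index \alpha) can be sketched as follows. The Haar filter, the synthetic test signal, and the scale range are simplifying assumptions; the paper's estimator and spacecraft data are more elaborate:

```python
import numpy as np

def haar_wavelet_variances(x, n_scales=6):
    """Variance of Haar detail coefficients at dyadic scales j = 1..n_scales.
    For a power-law spectrum P(f) ~ f^-alpha, log2(variance) grows
    approximately linearly in j with slope alpha."""
    variances = []
    for _ in range(n_scales):
        d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
        x = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation for next scale
        variances.append(np.var(d))
    return np.array(variances)

def scaling_index(x, n_scales=6):
    """Least-squares slope of log2(variance) versus scale index."""
    v = haar_wavelet_variances(x, n_scales)
    j = np.arange(1, n_scales + 1)
    slope, _ = np.polyfit(j, np.log2(v), 1)
    return slope

# Synthesize noise with a known spectral index alpha = 2.0 via its spectrum
rng = np.random.default_rng(1)
n = 2 ** 14
freqs = np.fft.rfftfreq(n)[1:]
amps = freqs ** (-2.0 / 2)                     # |X(f)| ~ f^(-alpha/2)
phases = rng.uniform(0, 2 * np.pi, freqs.size)
spectrum = np.concatenate(([0], amps * np.exp(1j * phases)))
x = np.fft.irfft(spectrum, n)
alpha_hat = scaling_index(x)                   # should recover alpha near 2.0
```

    Because the variances are fitted in log-log space, a frequency-dependent multiplicative bias shifts the intercept but not the slope, which is the sense in which the estimator is essentially unbiased for the scaling index.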

    Constraining the Solution to the Last Parsec Problem with Pulsar Timing

    The detection of a stochastic gravitational-wave signal from the superposition of many inspiraling supermassive black holes with pulsar timing arrays (PTAs) is likely to occur within the next decade. With this detection will come the opportunity to learn about the processes that drive black-hole-binary systems toward merger through their effects on the gravitational-wave spectrum. We use Bayesian methods to investigate the extent to which effects other than gravitational-wave emission can be distinguished using PTA observations. We show that, even in the absence of a detection, it is possible to place interesting constraints on these dynamical effects for conservative predictions of the population of tightly bound supermassive black-hole binaries. For instance, if we assume a relatively weak signal consistent with a low number of bound binaries and a low black-hole-mass to galaxy-mass correlation, we still find that a non-detection by a simulated array, with a sensitivity that should be reached in practice within a few years, disfavors gravitational-wave-dominated evolution with an odds ratio of \sim 30:1. Such a finding would suggest either that all existing astrophysical models for the population of tightly bound binaries are overly optimistic, or else that some dynamical effect other than gravitational-wave emission is actually dominating binary evolution even at the relatively high frequencies/small orbital separations probed by PTAs. Comment: 14 pages, 8 figures
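    The shape of such a Bayesian model comparison can be illustrated with a toy calculation: two power-law spectral models are scored against noisy mock data by marginalizing over an amplitude parameter, and the ratio of evidences gives the odds. The slopes, noise level, frequency band, and grid marginalization below are all made-up stand-ins for the paper's full analysis:

```python
import numpy as np

# Mock log-spectrum data generated from a steeper (environment-driven style)
# power law; the competing model uses a shallower (GW-driven style) slope.
rng = np.random.default_rng(2)
f = np.logspace(-9, -7.5, 20)          # frequency bins (Hz), illustrative
sigma = 0.1                            # assumed measurement scatter (dex-like)
data = -1.0 * np.log(f / f[0]) + rng.normal(0, sigma, f.size)

def log_evidence(slope):
    """Evidence for a fixed-slope power law, marginalizing the
    log-amplitude over a flat prior grid (brute-force integration)."""
    log_amps = np.linspace(-2, 2, 400)
    model = slope * np.log(f / f[0])
    loglike = np.array([
        -0.5 * np.sum((data - (a + model)) ** 2) / sigma ** 2
        for a in log_amps])
    # log of the prior-weighted average likelihood, computed stably
    return np.log(np.mean(np.exp(loglike - loglike.max()))) + loglike.max()

# Odds ratio: steep model versus shallow model
odds = np.exp(log_evidence(-1.0) - log_evidence(-2.0 / 3.0))
```

    A real PTA analysis marginalizes over many more parameters (per-pulsar noise, binary-population parameters), but the decision quantity is the same evidence ratio.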

    Regularized adaptive long autoregressive spectral analysis

    This paper is devoted to adaptive long autoregressive spectral analysis when (i) very few data are available and (ii) prior information exists concerning the spectral smoothness and time continuity of the analyzed signals. The contribution is founded on two papers by Kitagawa and Gersch. The first deals with spectral smoothness, in the regularization framework, while the second is devoted to time continuity, in the Kalman formalism. The present paper proposes an original synthesis of the two contributions: a new regularized criterion is introduced that takes both pieces of information into account. The criterion is efficiently optimized by a Kalman smoother. One of the major features of the method is that it is entirely unsupervised: the problem of automatically adjusting the hyperparameters that balance data-based versus prior-based information is solved by maximum likelihood. The improvement is quantified in the field of meteorological radar.
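    The flavor of the approach (a long AR model kept well-behaved on short data by a spectral-smoothness penalty, with a hyperparameter balancing data against prior) can be sketched without the Kalman-smoother time dimension. The quadratic penalty matrix, model order, and the fixed value of lam below are illustrative assumptions; in the paper the balance is tuned by maximum likelihood:

```python
import numpy as np

def regularized_ar(x, order=20, lam=1.0):
    """Long-AR least-squares fit with a Kitagawa-Gersch-style smoothness
    prior: penalize k^2 * a_k^2 so high-order coefficients, which produce
    spectral roughness, are shrunk. lam balances data vs prior."""
    n = len(x)
    X = np.column_stack([x[order - k - 1: n - k - 1] for k in range(order)])
    y = x[order:]
    D = np.diag(np.arange(1, order + 1) ** 2.0)   # frequency-weighting prior
    return np.linalg.solve(X.T @ X + lam * D, X.T @ y)

def ar_spectrum(a, n_freq=256):
    """Power spectrum of the fitted AR model (unit innovation variance),
    evaluated on normalized frequencies 0..0.5."""
    freqs = np.linspace(0, 0.5, n_freq)
    z = np.exp(-2j * np.pi * freqs)
    denom = 1 - sum(a[k] * z ** (k + 1) for k in range(len(a)))
    return 1.0 / np.abs(denom) ** 2

# Usage: short realization of a resonant AR(2) process, long regularized fit
rng = np.random.default_rng(3)
x = np.zeros(200)
for t in range(2, 200):
    x[t] = 1.5 * x[t - 1] - 0.9 * x[t - 2] + rng.normal()
a = regularized_ar(x, order=20, lam=0.5)
```

    Without the penalty, a 20th-order model fitted to 180 usable samples tends to produce spurious spectral peaks; the k^2 weighting suppresses them while leaving the dominant resonance intact.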

    Review of high-contrast imaging systems for current and future ground- and space-based telescopes I. Coronagraph design methods and optical performance metrics

    The Optimal Optical Coronagraph (OOC) Workshop at the Lorentz Center in September 2017 in Leiden, the Netherlands gathered a diverse group of 25 researchers working on exoplanet instrumentation to stimulate the emergence and sharing of new ideas. In this first installment of a series of three papers summarizing the outcomes of the OOC workshop, we present an overview of design methods and optical performance metrics developed for coronagraph instruments. The design and optimization of coronagraphs for future telescopes has progressed rapidly over the past several years in the context of space mission studies for Exo-C, WFIRST, HabEx, and LUVOIR as well as ground-based telescopes. Design tools have been developed at several institutions to optimize a variety of coronagraph mask types. We aim to give a broad overview of the approaches used, examples of their utility, and provide the optimization tools to the community. Though it is clear that the basic function of coronagraphs is to suppress starlight while maintaining light from off-axis sources, our community lacks a general set of standard performance metrics that apply to both detecting and characterizing exoplanets. The attendees of the OOC workshop agreed that it would benefit our community to clearly define quantities for comparing the performance of coronagraph designs and systems. Therefore, we also present a set of metrics that may be applied to theoretical designs, testbeds, and deployed instruments. We show how these quantities may be used to easily relate the basic properties of the optical instrument to the detection significance of the given point source in the presence of realistic noise. Comment: To appear in Proceedings of the SPIE, vol. 1069
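    One of the simplest quantities of the kind discussed relates planet counts to the noise counts from residual starlight and the detector. The function name, the idealized Poisson noise model, and the example rates below are assumptions for illustration, not the workshop's standard definitions:

```python
import math

def detection_snr(planet_rate, speckle_rate, dark_rate, t_exp):
    """Illustrative point-source detection significance: planet
    photo-electrons against shot noise from planet light, residual
    starlight (speckles), and detector dark current.
    All rates in photo-electrons per second, t_exp in seconds."""
    signal = planet_rate * t_exp
    noise = math.sqrt((planet_rate + speckle_rate + dark_rate) * t_exp)
    return signal / noise

# Usage: a faint planet against strong residual starlight, 1-hour exposure
snr = detection_snr(1.0, 100.0, 1.0, 3600.0)   # ~5.9
```

    The design connection is direct: a coronagraph that lowers `speckle_rate` (deeper starlight suppression) raises the significance for fixed exposure time, which is why raw contrast and off-axis throughput enter the metrics together.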

    Multidisciplinary design of a micro-USV for re-entry operations

    Unmanned Space Vehicles (USVs) are seen as a test-bed for enabling technologies and as a carrier to deliver and return experiments to and from low-Earth orbit. USVs are also a potentially interesting solution for the exploration of other planets or as long-range reconnaissance vehicles. As a test bed, USVs are seen as a stepping stone for the development of future-generation re-usable launchers, but also as a way to test key technologies for re-entry operations. Examples of recent developments are the PRORA-USV, designed by the Italian Aerospace Research Center (CIRA) in collaboration with Gavazzi Space, and the Boeing X-37B Orbital Test Vehicle (OTV), which is foreseen as an alternative to the Space Shuttle to deliver experiments into Earth orbit. Among the technologies to be demonstrated with the X-37 are improved thermal protection systems, avionics, the autonomous guidance system, and an advanced airframe.