    Two-Tone Optomechanical Instability and Its Fundamental Implications for Backaction-Evading Measurements

    While quantum mechanics imposes a fundamental limit on the precision of interferometric measurements of mechanical motion due to measurement backaction, the nonlinear nature of the coupling also leads to parametric instabilities that place practical limits on the sensitivity by limiting the power in the interferometer. Such instabilities have been extensively studied in the context of gravitational wave detectors, and their presence has recently been reported in Advanced LIGO. Here, we observe experimentally and describe theoretically a new type of optomechanical instability that arises in two-tone backaction-evading (BAE) measurements, which are designed to overcome the standard quantum limit. We demonstrate the effect in the optical domain with a photonic crystal nanobeam, and in the microwave domain with a micromechanical oscillator coupled to a microwave resonator. In contrast to the well-known oscillatory parametric instability that occurs under single-tone, blue-detuned pumping, which is characterized by a vanishing effective mechanical damping, the parametric instability in balanced two-tone optomechanics is exponential and results from small detuning errors in the two pump frequencies. Its origin can be understood in a rotating frame as the vanishing of the effective mechanical frequency due to an optical spring effect. Counterintuitively, the instability occurs even in the presence of perfectly balanced intracavity fields, and can occur for both signs of detuning. We find excellent quantitative agreement with our theoretical predictions. Since the constraints on tuning accuracy become stricter with increasing probe power, the instability imposes a fundamental limitation on BAE measurements, as well as on other two-tone schemes. In addition to introducing a new limitation in two-tone BAE measurements, these results also introduce a new type of nonlinear dynamics in cavity optomechanics.
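    The exponential character of the instability can be sketched with a minimal schematic relation (the symbols below are illustrative, not the paper's notation): in the rotating frame, a residual detuning error δ of the pump pair combines with an optical spring shift proportional to the intracavity photon number n̄, and the effective mechanical frequency can be driven to zero:

```latex
% Schematic only: K lumps together the optomechanical coupling and cavity
% response; \delta is the detuning error of the two pump tones.
\Omega_{\mathrm{eff}}^{2} \simeq \delta\left(\delta - K\,\bar{n}\right),
\qquad
x(t) \propto e^{\lambda t}, \quad
\lambda = \sqrt{-\,\Omega_{\mathrm{eff}}^{2}}
\;\; \text{for} \;\; \Omega_{\mathrm{eff}}^{2} < 0 .
```

    When Ω²_eff crosses zero the motion grows exponentially (real λ) rather than oscillating, and because the spring term scales with n̄, the tolerance on δ tightens as the probe power increases.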

    Results of the Photometric LSST Astronomical Time-series Classification Challenge (PLAsTiCC)

    Next-generation surveys like the Legacy Survey of Space and Time (LSST) on the Vera C. Rubin Observatory (Rubin) will generate orders of magnitude more discoveries of transients and variable stars than previous surveys. To prepare for this data deluge, we developed the Photometric LSST Astronomical Time-series Classification Challenge (PLAsTiCC), a competition that aimed to catalyze the development of robust classifiers under LSST-like conditions of a nonrepresentative training set for a large photometric test set of imbalanced classes. Over 1000 teams participated in PLAsTiCC, which was hosted on the Kaggle data science competition platform between 2018 September 28 and 2018 December 17, ultimately identifying three winners in 2019 February. Participants produced classifiers employing a diverse set of machine-learning techniques, including hybrid combinations and ensemble averages of a range of approaches, among them boosted decision trees, neural networks, and multilayer perceptrons. The strong performance of the top three classifiers on Type Ia supernovae and kilonovae represents a major improvement over the current state of the art within astronomy. This paper summarizes the most promising methods and evaluates their results in detail, highlighting future directions both for classifier development and simulation needs for a next-generation PLAsTiCC data set.
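    The ensemble averaging mentioned above can be illustrated with a minimal sketch (the classifier names and probability values are hypothetical, not taken from any PLAsTiCC entry): per-class probabilities from two models are combined with a weighted mean and renormalized before picking the final class.

```python
import numpy as np

# Hypothetical per-object class probabilities (3 objects x 4 classes) from
# two models, e.g. a boosted-decision-tree classifier and a neural network.
p_bdt = np.array([[0.70, 0.10, 0.10, 0.10],
                  [0.20, 0.50, 0.20, 0.10],
                  [0.25, 0.25, 0.25, 0.25]])
p_nn = np.array([[0.60, 0.20, 0.10, 0.10],
                 [0.10, 0.70, 0.10, 0.10],
                 [0.40, 0.20, 0.20, 0.20]])

# Weighted ensemble average, then renormalize so each row sums to 1.
w = np.array([0.5, 0.5])
p_ens = w[0] * p_bdt + w[1] * p_nn
p_ens /= p_ens.sum(axis=1, keepdims=True)

# Final predicted class per object is the most probable ensemble class.
pred = p_ens.argmax(axis=1)
```

    Soft (probability-level) averaging like this often outperforms hard-vote combinations when the member models are well calibrated.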

    Science-Driven Optimization of the LSST Observing Strategy

    The Large Synoptic Survey Telescope (LSST) is designed to provide an unprecedented optical imaging dataset that will support investigations of our Solar System, Galaxy, and Universe, across half the sky and over ten years of repeated observation. However, exactly how the LSST observations will be taken (the observing strategy or "cadence") is not yet finalized. In this dynamically evolving community white paper, we explore how the detailed performance of the anticipated science investigations is expected to depend on small changes to the LSST observing strategy. Using realistic simulations of the LSST schedule and observation properties, we design and compute diagnostic metrics and Figures of Merit that provide quantitative evaluations of different observing strategies, analyzing their impact on a wide range of proposed science projects. This is work in progress: we are using this white paper to communicate to each other the relative merits of the observing strategy choices that could be made, in an effort to maximize the scientific value of the survey. The investigation of some science cases leads to suggestions for new strategies that could be simulated and potentially adopted. Notably, we find motivation for exploring departures from a spatially uniform annual tiling of the sky: focusing instead on different parts of the survey area in different years in a "rolling cadence" is likely to have significant benefits for a number of time domain and moving object astronomy projects. The communal assembly of a suite of quantified and homogeneously coded metrics is the vital first step towards an automated, systematic, science-based assessment of any given cadence simulation that will enable the scheduling of the LSST to be as well-informed as possible.

    Financial Stability Monitoring

    The sensitivity of GPz estimates of photo-z posterior PDFs to realistically complex training set imperfections

    The accurate estimation of photometric redshifts is crucial to many upcoming galaxy surveys, for example, the Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST). Almost all Rubin extragalactic and cosmological science requires accurate and precise calculation of photometric redshifts; many diverse approaches to this problem are currently in the process of being developed, validated, and tested. In this work, we use the photometric redshift code GPz to examine two realistically complex training set imperfection scenarios for machine-learning-based photometric redshift calculation: (i) where the spectroscopic training set has a very different distribution in color–magnitude space to the test set, and (ii) where the effect of emission line confusion causes a fraction of the training spectroscopic sample not to have the true redshift. By evaluating the sensitivity of GPz to a range of increasingly severe imperfections, with a range of metrics (both of photo-z point estimates as well as posterior probability distribution functions, PDFs), we quantify the degree to which predictions get worse with higher degrees of degradation. In particular, we find that there is a substantial drop-off in photo-z quality when line confusion rises above ∼1%, and when the training sample becomes incomplete below a redshift of 1.5, for an experimental setup using data from the Buzzard Flock synthetic sky catalogs.
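    Point-estimate photo-z metrics of the kind evaluated here are conventionally built from residuals scaled by 1 + z; a minimal sketch with synthetic values (not the paper's data, and not necessarily its exact metric set):

```python
import numpy as np

# Synthetic spectroscopic ("true") and photometric redshifts.
z_spec = np.array([0.30, 0.55, 1.10, 1.80, 0.90])
z_phot = np.array([0.32, 0.50, 1.00, 2.40, 0.88])

# Scaled residuals: dividing by (1 + z) accounts for redshift stretching.
dz = (z_phot - z_spec) / (1.0 + z_spec)

# Normalized median absolute deviation: a robust scatter estimate.
sigma_nmad = 1.4826 * np.median(np.abs(dz - np.median(dz)))

# Outlier fraction with the conventional |dz| > 0.15 cut.
outlier_frac = np.mean(np.abs(dz) > 0.15)
```

    Robust statistics such as σ_NMAD are preferred over the plain standard deviation here because catastrophic outliers (e.g. from line confusion) would otherwise dominate the scatter.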

    Floquet dynamics in the quantum measurement of mechanical motion

    The radiation-pressure interaction between one or more laser fields and a mechanical oscillator gives rise to a wide range of phenomena: from sideband cooling and backaction-evading measurements to ponderomotive and mechanical squeezing, entanglement, and motional sideband asymmetry. In many protocols, such as dissipative mechanical squeezing, multiple lasers are utilized, giving rise to periodically driven optomechanical systems. Here we show that in this case, Floquet dynamics can arise due to the presence of Kerr-type nonlinearities, which are ubiquitous in optomechanical systems. Specifically, employing multiple probe tones, we perform sideband asymmetry measurements, a macroscopic quantum effect, on a silicon optomechanical crystal sideband-cooled to 40% ground-state occupation. We show that the Floquet dynamics resulting from the presence of multiple pump tones gives rise to an artificially modified motional sideband asymmetry by redistributing thermal and quantum fluctuations among the initially independently scattered thermomechanical sidebands. For pump tones with large frequency separation, the dynamics is suppressed and accurate quantum noise thermometry is demonstrated. We develop a theoretical model based on Floquet theory that accurately describes our observations. The resulting dynamics can be understood as arising from a synthetic gauge field among the Fourier modes, created by the phase lag of the Kerr-type response. This novel phenomenon has wide-ranging implications for schemes utilizing several pump tones, as commonly employed in backaction-evading measurements, dissipative optical squeezing, dissipative mechanical squeezing, and quantum noise thermometry. Our observation may equally well be used for optomechanical Floquet engineering, e.g., the generation of topological phases of sound by periodic time modulation.
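    Sideband asymmetry thermometry rests on the standard relation (a textbook result, not specific to this work) that the Stokes and anti-Stokes sideband weights scale as n̄ + 1 and n̄, so the mean phonon occupation follows from their measured ratio:

```latex
% \bar{n} is the mean phonon occupation; S denotes sideband spectral weight.
\frac{S_{\mathrm{Stokes}}}{S_{\mathrm{anti\text{-}Stokes}}}
  = \frac{\bar{n} + 1}{\bar{n}}
\quad \Longrightarrow \quad
\bar{n} = \left(\frac{S_{\mathrm{Stokes}}}{S_{\mathrm{anti\text{-}Stokes}}} - 1\right)^{-1}.
```

    The Floquet redistribution described in the abstract matters precisely because it alters this measured ratio, biasing the occupation inferred from it unless the pump tones are widely separated in frequency.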