
    Performance Measurement Systems Must Be Engineered

    The aim of this paper, which places special emphasis on IT-related aspects, is threefold. First, it defines the requirements a modern Performance Measurement System (PMS) should meet; the resulting list of requirements can be used both to assess a current PMS and to identify ways to improve it. Second, it reports the findings of an empirical study that identifies the shortcomings of existing PMSs. Third, it proposes a life cycle for PMSs.

    Measurement and control of a mechanical oscillator at its thermal decoherence rate

    In real-time quantum feedback protocols, the record of a continuous measurement is used to stabilize a desired quantum state. Recent years have seen highly successful applications in a variety of well-isolated micro-systems, including microwave photons and superconducting qubits. By contrast, the ability to stabilize the quantum state of a tangibly massive object, such as a nanomechanical oscillator, remains a difficult challenge: the main obstacle is environmental decoherence, which places stringent requirements on the timescale within which the state must be measured. Here we describe a position sensor that is capable of resolving the zero-point motion of a solid-state nanomechanical oscillator on the timescale of its thermal decoherence, a critical requirement for preparing its ground state using feedback. The sensor is based on cavity optomechanical coupling, and realizes a measurement of the oscillator's displacement with an imprecision 40 dB below that at the standard quantum limit, while maintaining an imprecision-back-action product within a factor of 5 of the Heisenberg uncertainty limit. Using the measurement as an error signal and radiation pressure as an actuator, we demonstrate active feedback cooling (cold damping) of the 4.3 MHz oscillator from a cryogenic bath temperature of 4.4 K to an effective value of 1.1 ± 0.1 mK, corresponding to a mean phonon number of 5.3 ± 0.6 (i.e., a ground-state probability of 16%). Our results set a new benchmark for the performance of a linear position sensor, and signal the emergence of engineered mechanical oscillators as practical subjects for measurement-based quantum control.
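    As a quick consistency check on the reported figures, the mean phonon occupancy of a thermal oscillator follows from the Bose-Einstein distribution, n̄ = 1/(exp(ħω/k_BT) − 1), and a thermal state occupies its ground state with probability 1/(n̄ + 1). A minimal Python sketch using only the numbers quoted in the abstract:

```python
import numpy as np

hbar = 1.054571817e-34  # reduced Planck constant, J s
kB = 1.380649e-23       # Boltzmann constant, J/K

def mean_phonon_number(f_hz, temp_k):
    """Bose-Einstein occupancy of a mode at frequency f and temperature T."""
    return 1.0 / np.expm1(hbar * 2 * np.pi * f_hz / (kB * temp_k))

f_m = 4.3e6      # oscillator frequency from the abstract, Hz
T_eff = 1.1e-3   # effective temperature after feedback cooling, K

n_bar = mean_phonon_number(f_m, T_eff)
p_ground = 1.0 / (n_bar + 1.0)  # ground-state population of a thermal state
print(f"n_bar ~ {n_bar:.1f}, P(ground) ~ {p_ground:.0%}")
# n_bar ~ 4.8 and P(ground) ~ 17%, consistent with the quoted
# 5.3 ± 0.6 phonons and 16% within the stated uncertainties.
```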

    An approach to quantify value provided by an engineered asset according to the ISO 5500x series of standards

    Asset Intelligence through Integration and Interoperability and Contemporary Vibration Engineering Technologies: Proceedings of the 12th World Congress on Engineering Asset Management and the 13th International Conference on Vibration Engineering and Technology of Machinery, 2-4 August 2017, Brisbane, Australia.
    The purpose of any asset is to provide value to the organisation and its stakeholders. In Asset Management, the concept of value encompasses quantitative and qualitative, as well as tangible and intangible, benefits that assets may provide to an organisation. The definitions of asset and value are not only closely linked but also complementary: an “asset” provides the means for the realisation of “value”, so the management of an asset is strategic and has to be linked to an organisation’s value norms. This paper extrapolates from the definitions in the ISO 5500x series of standards to describe a generic approach for quantifying the value provided by engineered assets deployed by a business organisation.
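    The abstract stops short of the quantification itself, so purely as an illustration: one conventional way to collapse tangible benefits and costs into a single value figure is a discounted net-value calculation over the asset's life. The sketch below is a generic stand-in under that assumption; the cash flows, discount rate, and function name are hypothetical and not taken from the paper.

```python
def discounted_net_value(benefits, costs, rate):
    """Present value of yearly (benefit - cost) cash flows, year 0 first.

    Illustrative stand-in for 'value provided by an asset'; the paper's
    ISO 5500x-based approach also covers intangible and qualitative value.
    """
    return sum((b - c) / (1.0 + rate) ** year
               for year, (b, c) in enumerate(zip(benefits, costs)))

# Hypothetical 5-year asset: acquisition cost in year 0, benefits afterwards.
benefits = [0, 120_000, 120_000, 110_000, 100_000]
costs = [300_000, 20_000, 20_000, 25_000, 30_000]
print(f"net value: {discounted_net_value(benefits, costs, 0.08):,.0f}")
```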

    Cross-Layer Peer-to-Peer Traffic Identification and Optimization Based on Active Networking

    P2P applications are emerging as the ultimate killer applications due to their ability to construct highly dynamic overlay topologies with rapidly varying and unpredictable traffic dynamics, which can constitute a serious challenge even for significantly over-provisioned IP networks. As a result, ISPs face new, severe network management problems that statically deployed network engineering mechanisms are not guaranteed to address. As a first step towards a more complete solution to these problems, this paper proposes a P2P measurement, identification and optimisation architecture designed to cope with the dynamicity and unpredictability of existing, well-known P2P systems as well as future, unknown ones. The purpose of this architecture is to provide ISPs with an effective and scalable approach to control and optimise the traffic produced by P2P applications in their networks. This can be achieved through a combination of different application-level and network-level programmable techniques, leading to a cross-layer identification and optimisation process. These techniques can be applied using Active Networking platforms, which are able to quickly and easily deploy architectural components on demand. This flexibility of the optimisation architecture is essential to address the rapid development of new P2P protocols and the variation of known protocols.
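    The abstract describes the identification step only at the architectural level; as a hedged illustration, port-agnostic P2P identification is commonly built on flow-level heuristics such as peer/port fan-out and upstream/downstream symmetry rather than well-known port numbers. A minimal sketch under that assumption (the flow-record fields and thresholds are illustrative, not the paper's classifier):

```python
from dataclasses import dataclass

@dataclass
class FlowRecord:
    """Per-host flow statistics as a network measurement probe might export."""
    host: str
    peer_count: int   # distinct remote hosts contacted
    port_count: int   # distinct remote ports used
    bytes_up: int
    bytes_down: int

def looks_like_p2p(flow: FlowRecord) -> bool:
    """Heuristic P2P test: many peers on roughly as many distinct ports,
    plus near-symmetric upstream/downstream volume (thresholds illustrative)."""
    fan_out = flow.peer_count > 20 and flow.port_count >= 0.8 * flow.peer_count
    total = flow.bytes_up + flow.bytes_down
    symmetric = total > 0 and min(flow.bytes_up, flow.bytes_down) / total > 0.3
    return fan_out and symmetric

flow = FlowRecord("10.0.0.7", peer_count=64, port_count=58,
                  bytes_up=5_200_000, bytes_down=6_100_000)
print(looks_like_p2p(flow))  # True for this BitTorrent-like traffic profile
```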

    Experimental quantum verification in the presence of temporally correlated noise

    Growth in the complexity and capabilities of quantum information hardware mandates access to practical techniques for performance verification that function under realistic laboratory conditions. Here we experimentally characterise the impact of common temporally correlated noise processes on both randomised benchmarking (RB) and gate-set tomography (GST). We study these using an analytic toolkit based on a formalism mapping noise to errors for arbitrary sequences of unitary operations. This analysis highlights the role of sequence structure in enhancing or suppressing the sensitivity of quantum verification protocols to either slowly or rapidly varying noise, which we treat in the limiting cases of quasi-DC miscalibration and white noise power spectra. We perform experiments with a single trapped ¹⁷¹Yb⁺ ion as a qubit and inject engineered noise (∝ σ_z) to probe protocol performance. Experiments on RB validate predictions that the distribution of measured fidelities over sequences is described by a gamma distribution, varying between approximately Gaussian for rapidly varying noise and a broad, highly skewed distribution in the slowly varying case. Similarly, we find a strong gate-set dependence of GST in the presence of correlated errors, leading to significant deviations between estimated and calculated diamond distances in the presence of correlated σ_z errors. Numerical simulations demonstrate that expanding the gate set to include negative rotations can suppress these discrepancies and increase reported diamond distances by orders of magnitude for the same error processes. Similar effects do not occur for correlated σ_x or σ_y errors or rapidly varying noise processes, highlighting the critical interplay of the selected gate set and the gauge optimisation process on the meaning of the reported diamond norm in correlated noise environments.
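    The contrast between quasi-DC and rapidly varying noise is easy to reproduce in a toy model: apply random sequences of π/2 rotations about x or y, interleaved with small z-rotation errors whose angle is drawn once per sequence (quasi-DC) or redrawn at every gate (white noise), and compare the spread of final-state fidelities over sequences. A minimal numpy sketch under those assumptions; it is a simplified stand-in (no Clifford recovery gate), not the authors' analysis code:

```python
import numpy as np

rng = np.random.default_rng(0)
X90 = np.array([[1, -1j], [-1j, 1]]) / np.sqrt(2)  # pi/2 rotation about x
Y90 = np.array([[1, -1], [1, 1]]) / np.sqrt(2)     # pi/2 rotation about y

def z_error(theta):
    """Unwanted small rotation about z by angle theta."""
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

def sequence_fidelity(n_gates, sigma, correlated):
    """State fidelity of a noisy random X/Y sequence vs its ideal version."""
    gates = rng.choice(2, size=n_gates)     # random gate pattern per sequence
    psi_ideal = np.array([1.0, 0.0], dtype=complex)
    psi_noisy = psi_ideal.copy()
    theta_dc = rng.normal(0.0, sigma)       # one draw per sequence (quasi-DC)
    for g in gates:
        U = X90 if g == 0 else Y90
        theta = theta_dc if correlated else rng.normal(0.0, sigma)
        psi_ideal = U @ psi_ideal
        psi_noisy = z_error(theta) @ (U @ psi_noisy)
    return abs(np.vdot(psi_ideal, psi_noisy)) ** 2

for correlated, label in [(True, "quasi-DC"), (False, "white")]:
    fids = [sequence_fidelity(100, 0.02, correlated) for _ in range(2000)]
    print(f"{label:8s} mean={np.mean(fids):.4f} std={np.std(fids):.4f} "
          f"mean-median={np.mean(fids) - np.median(fids):+.5f}")
# The quasi-DC case yields a far broader, skewed fidelity distribution for
# the same per-gate noise strength, mirroring the gamma-distribution result.
```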

    Sub-nanosecond signal propagation in anisotropy engineered nanomagnetic logic chains

    Energy-efficient nanomagnetic logic (NML) computing architectures propagate and process binary information by relying on dipolar field coupling to reorient closely spaced nanoscale magnets. Signal propagation in nanomagnet chains of various sizes, shapes, and magnetic orientations has previously been characterized by static magnetic imaging experiments with low-speed adiabatic operation; however, the mechanisms which determine the final state, and their reproducibility over millions of cycles in high-speed operation (sub-ns timescale), have yet to be experimentally investigated. Monitoring NML operation at its ultimate intrinsic speed reveals features undetectable by conventional static imaging, including individual nanomagnet switching events and systematic error nucleation during signal propagation. Here, we present a new study of NML operation in a high-speed regime at fast repetition rates. We perform direct imaging of digital signal propagation in permalloy nanomagnet chains with varying degrees of shape-engineered biaxial anisotropy using full-field magnetic soft x-ray transmission microscopy after applying single nanosecond magnetic field pulses. Further, we use time-resolved magnetic photo-emission electron microscopy to evaluate the sub-nanosecond dipolar-coupling signal propagation dynamics in optimized chains with 100 ps time resolution as they are cycled with nanosecond field pulses at a rate of 3 MHz. An intrinsic switching time of 100 ps per magnet is observed. These experiments, and accompanying macro-spin and micromagnetic simulations, reveal the underlying physics of NML architectures repetitively operated on nanosecond timescales and identify the engineering parameters relevant to optimizing performance and reliability.
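    The macro-spin picture the abstract invokes treats each nanomagnet as a single moment obeying the Landau-Lifshitz-Gilbert (LLG) equation, dm/dt = −γ/(1+α²)[m×H_eff + α m×(m×H_eff)], with H_eff collecting anisotropy and applied fields. A minimal single-macrospin sketch of field-pulse reversal (material parameters, the exaggerated damping, and the explicit-Euler integrator are illustrative assumptions, not the paper's simulations):

```python
import numpy as np

GAMMA = 1.76e11   # gyromagnetic ratio, rad s^-1 T^-1
ALPHA = 0.2       # Gilbert damping, exaggerated so the demo reverses quickly
H_K = 0.05        # effective easy-axis anisotropy field along x, tesla
EASY = np.array([1.0, 0.0, 0.0])

def llg_step(m, h_applied, dt):
    """One explicit-Euler step of the LLG equation, renormalising |m| = 1."""
    h_eff = H_K * np.dot(m, EASY) * EASY + h_applied  # anisotropy + Zeeman
    mxh = np.cross(m, h_eff)
    dmdt = -GAMMA / (1 + ALPHA**2) * (mxh + ALPHA * np.cross(m, mxh))
    m = m + dmdt * dt
    return m / np.linalg.norm(m)

# Start near +x with a small tilt (the torque vanishes exactly on-axis),
# then apply a reversing field pulse stronger than the anisotropy field.
m = np.array([1.0, 0.05, 0.0])
m /= np.linalg.norm(m)
pulse = np.array([-0.2, 0.0, 0.0])  # tesla
dt, steps = 1e-13, 30000            # 0.1 ps steps, 3 ns of evolution
for _ in range(steps):
    m = llg_step(m, pulse, dt)
print(f"m_x after pulse: {m[0]:+.2f}")  # ~ -1.00: the macrospin has reversed
```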