
    The Quantum Frontier

    The success of the abstract model of computation, in terms of bits, logical operations, programming language constructs, and the like, makes it easy to forget that computation is a physical process. Our cherished notions of computation and information are grounded in classical mechanics, but the physics underlying our world is quantum. In the early 1980s researchers began to ask how computation would change if we adopted a quantum mechanical, instead of a classical mechanical, view of computation. Slowly, a new picture of computation arose, one that gave rise to a variety of faster algorithms, novel cryptographic mechanisms, and alternative methods of communication. Small quantum information processing devices have been built, and efforts are underway to build larger ones. Even apart from the existence of these devices, the quantum view of information processing has provided significant insight into the nature of computation and information, and a deeper understanding of the physics of our universe and its connections with computation. We start by describing aspects of quantum mechanics that are at the heart of a quantum view of information processing. We give our own idiosyncratic view of a number of these topics in the hope of correcting common misconceptions and highlighting aspects that are often overlooked. A number of the phenomena described were initially viewed as oddities of quantum mechanics. It was quantum information processing, first quantum cryptography and then, more dramatically, quantum computing, that turned the tables and showed that these oddities could be put to practical effect. It is these applications we describe next. We conclude with a section describing some of the many questions left for future work, especially the mysteries surrounding where the power of quantum information ultimately comes from.

    Comment: Invited book chapter for Computation for Humanity - Information Technology to Advance Society, to be published by CRC Press. Concepts clarified and style made more uniform in version 2. Many thanks to the referees for their suggestions for improvement.

    Maximum Likelihood Estimation for Single Particle, Passive Microrheology Data with Drift

    Volume limitations and low yield thresholds of biological fluids have led to widespread use of passive microparticle rheology. The mean-squared-displacement (MSD) statistics of bead position time series (bead paths) are either applied directly to determine the creep compliance [Xu et al (1998)] or transformed to determine dynamic storage and loss moduli [Mason & Weitz (1995)]. A prevalent hurdle arises when there is a non-diffusive experimental drift in the data. Commensurate with the magnitude of drift relative to diffusive mobility, quantified by a Péclet number, the MSD statistics are distorted, and thus the path data must be "corrected" for drift. The standard approach is to estimate and subtract the drift from particle paths, and then calculate MSD statistics. We present an alternative, parametric approach using maximum likelihood estimation that simultaneously fits drift and diffusive model parameters from the path data; the MSD statistics (and consequently the compliance and dynamic moduli) then follow directly from the best-fit model. We illustrate and compare both methods on simulated path data over a range of Péclet numbers, where exact answers are known. We choose fractional Brownian motion as the numerical model because it affords tunable, sub-diffusive MSD statistics consistent with typical 30-second-long experimental observations of microbeads in several biological fluids. Finally, we apply and compare both methods on data from human bronchial epithelial cell culture mucus.

    Comment: 29 pages, 12 figures
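    The parametric idea can be illustrated on simulated data. The sketch below uses plain Brownian motion with constant drift rather than the paper's fractional Brownian motion, because the increments are then i.i.d. Gaussian and the maximum likelihood estimates of drift and diffusivity have closed forms; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a 1-D bead path: diffusivity D plus constant drift v.
# (Plain Brownian motion for simplicity; the paper uses fractional
# Brownian motion, which is sub-diffusive.)
D, v, dt, n = 0.1, 0.5, 0.01, 3000
steps = v * dt + np.sqrt(2 * D * dt) * rng.standard_normal(n)
path = np.concatenate(([0.0], np.cumsum(steps)))

# MLE for Brownian motion with drift: increments are i.i.d.
# Normal(v*dt, 2*D*dt), so the estimates are closed-form --
# drift and diffusivity are fit simultaneously from the path,
# with no separate drift-subtraction step.
dx = np.diff(path)
v_hat = dx.mean() / dt
D_hat = dx.var() / (2 * dt)

print(f"v_hat = {v_hat:.3f} (true {v}), D_hat = {D_hat:.3f} (true {D})")
```

    For fractional Brownian motion the increments are correlated, so the likelihood involves the full covariance matrix of the path and must be maximized numerically, but the principle is the same.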

    Joint ERCIM eMobility and MobiSense Workshop


    Reinforcement Learning

    Brains rule the world, and brain-like computation is increasingly used in computers and electronic devices. Brain-like computation is about processing and interpreting data, or directly proposing and performing actions, and learning is a central aspect of it. This book is on reinforcement learning, which involves performing actions to achieve a goal. The first 11 chapters of this book describe and extend the scope of reinforcement learning. The remaining 11 chapters show that there is already wide usage in numerous fields. Reinforcement learning can tackle control tasks that are too complex for traditional, hand-designed, non-learning controllers. As learning computers can deal with technical complexities, the task of human operators remains to specify goals on increasingly higher levels. This book shows that reinforcement learning is a very dynamic area in terms of theory and applications, and it should stimulate and encourage new research in this field.
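    As a concrete illustration of the action-to-goal loop described above, here is a minimal tabular Q-learning sketch on a toy five-state chain (an illustrative example, not taken from any chapter of the book): the agent earns a reward only at the rightmost state and learns, from trial and error, to always move right.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy chain MDP: states 0..4, actions 0 (left) / 1 (right),
# reward 1 only on reaching the terminal rightmost state.
n_states, n_actions = 5, 2
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.5, 0.9, 0.1   # learning rate, discount, exploration

for episode in range(500):
    s = 0
    while s != n_states - 1:
        # epsilon-greedy action selection
        a = int(rng.integers(n_actions)) if rng.random() < eps else int(Q[s].argmax())
        s_next = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
        r = 1.0 if s_next == n_states - 1 else 0.0
        # Q-learning update toward the bootstrapped target
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        s = s_next

# The greedy policy in every non-terminal state should be "right" (1).
print(Q.argmax(axis=1)[:-1])
```

    No controller was hand-designed here; the policy emerges purely from the reward signal, which is the point the book's application chapters develop at scale.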

    Optimizing Stimulation Strategies in Cochlear Implants for Music Listening

    Most cochlear implant (CI) strategies are optimized for speech characteristics, while music enjoyment remains significantly below normal-hearing performance. In this thesis, electrical stimulation strategies in CIs are analyzed for music input. A simulation chain consisting of two parallel paths, simulating normal hearing and electrical hearing respectively, is utilized. One thesis objective is to configure and develop the sound processor of the CI chain to analyze different compression and channel-selection strategies to optimally capture the characteristics of music signals. A new set of knee points (KPs) for the compression function is investigated together with clustering of frequency bands. The N-of-M electrode selection strategy models the effect of a psychoacoustic masking threshold. In order to evaluate the performance of the CI model, the normal hearing model is considered a true reference. Similarity among the resulting neurograms of the respective models is measured using the image analysis method Neurogram Similarity Index Measure (NSIM). The validation and resolution of NSIM is another objective of the thesis. Results indicate that NSIM is sensitive to no-activity regions in the neurograms and has difficulties capturing small CI changes, i.e. compression settings. Further verification of the model setup is suggested, together with investigating an alternative optimal electric hearing reference and/or objective similarity measure.
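    The N-of-M selection step can be sketched compactly: in each analysis frame, only the N of M filterbank channels with the largest envelopes are stimulated. The Python sketch below uses made-up values; the thesis additionally weights selection by a psychoacoustic masking threshold, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(2)

# One analysis frame of a hypothetical 22-electrode device:
# stimulate only the N = 8 channels with the largest envelopes.
M, N = 22, 8
envelopes = rng.random(M)          # per-channel envelope magnitudes (made up)

selected = np.argsort(envelopes)[-N:]          # indices of the N largest
mask = np.zeros(M, dtype=bool)
mask[selected] = True
stimulation = np.where(mask, envelopes, 0.0)   # unselected channels are silent

print(sorted(selected.tolist()))
```

    Every selected channel then carries at least as much envelope energy as any rejected one, which is the property a masking-threshold weighting would relax in favour of perceptual relevance.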

    Validated force-based modeling of pedestrian dynamics

    This dissertation investigates force-based modeling of pedestrian dynamics. With the quantitative validation of mathematical models in focus, fundamental questions are addressed throughout this work: Is it feasible to describe pedestrian dynamics solely with equations of motion derived from Newtonian dynamics? On the road to answering this question, we investigate the consequences and side-effects of completing a force-based model with additional rules and of imposing restrictions on the state variables. Another important issue is the representation of modeled pedestrians. Does the geometrical shape of a two-dimensional projection of the human body matter when modeling pedestrian movement? If so, which form is most suitable? This point is investigated in the second part, where a new force-based model is introduced. Moreover, we highlight a frequently underestimated aspect of force-based modeling: to what extent does the steering of pedestrians influence their dynamics? In the third part we introduce four possible strategies to define the desired direction of each pedestrian when moving in a facility. Finally, the effects of the aforementioned approaches are discussed by means of numerical tests in different geometries with one set of model parameters, and the developed model is validated by comparing simulation results with empirical data.
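    A force-based update of the kind discussed can be sketched in the style of a generic social-force model (an illustrative baseline with made-up parameters, not the specific model developed in the dissertation): each pedestrian is driven toward a desired velocity and repelled by neighbours, and motion follows from Newton's second law with unit mass.

```python
import numpy as np

# Generic social-force-style sketch: driving force relaxes the velocity
# toward the desired one over time tau; each neighbour contributes an
# exponentially decaying repulsion. Parameters tau, A, B are illustrative.
def forces(pos, vel, desired_vel, tau=0.5, A=2.0, B=0.3):
    f = (desired_vel - vel) / tau                      # driving term
    for i in range(len(pos)):
        for j in range(len(pos)):
            if i == j:
                continue
            d = pos[i] - pos[j]
            dist = np.linalg.norm(d)
            f[i] += A * np.exp(-dist / B) * d / dist   # pairwise repulsion
    return f

# Two pedestrians walking toward each other on the x-axis.
pos = np.array([[0.0, 0.0], [1.0, 0.0]])
vel = np.zeros((2, 2))
desired = np.array([[1.0, 0.0], [-1.0, 0.0]])

dt = 0.1
acc = forces(pos, vel, desired)   # unit mass: acceleration = force
vel += acc * dt                   # explicit Euler step
pos += vel * dt
print(vel)
```

    Note that the repulsion here acts between point particles; whether a two-dimensional body shape (circle, ellipse) changes the resulting dynamics is exactly the question the second part of the dissertation takes up.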

    Safe Maneuvering Near Offshore Installations: A New Algorithmic Tool

    Maneuvers of human-operated and autonomous marine vessels in the safety zone of drilling rigs, wind farms and other installations present a risk of collision. This article proposes an algorithmic toolkit that ensures maneuver safety, taking into account the restrictions imposed by ship dynamics. The algorithms can be used for anomaly detection, decision making by a human operator or an unmanned vehicle guidance system. We also consider a response to failures in the vessel's control systems and emergency escape maneuvers. Data used by the algorithms come from the vessel's dynamic positioning control system and positional survey charts of the marine installations

    Quantum Transpiler Optimization: On the Development, Implementation, and Use of a Quantum Research Testbed

    Quantum computing research is at the cusp of a paradigm shift. As the complexity of quantum systems increases, so does the complexity of research procedures for creating and testing layers of the quantum software stack. However, the tools used to perform these tasks have not experienced the increase in capability required to effectively handle the development burdens involved. This case is made particularly clear in the context of IBM QX Transpiler optimization algorithms and functions. IBM QX systems use the Qiskit library to create, transform, and execute quantum circuits. As coherence times and hardware qubit counts increase and qubit topologies become more complex, so does orchestration of qubit mapping and qubit state movement across these topologies. The transpiler framework used to create and test improved algorithms has not kept pace. A testbed is proposed to provide abstractions to create and test transpiler routines. The development process is analyzed and implemented, from design principles through requirements analysis and verification testing. Additionally, limitations of existing transpiler algorithms are identified and initial results are provided that suggest more effective algorithms for qubit mapping and state movement
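    The qubit-mapping and state-movement problem described above can be made concrete with a naive routing baseline: when a two-qubit gate acts on physical qubits that are not adjacent in the coupling map, insert SWAPs along a shortest path until they are. The sketch below is an illustrative toy on a hypothetical linear device, not an IBM QX or Qiskit algorithm.

```python
from collections import deque

# Hypothetical 4-qubit device with linear connectivity 0-1-2-3.
coupling = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}

def shortest_path(src, dst):
    """BFS shortest path between physical qubits on the coupling graph."""
    prev, queue = {src: None}, deque([src])
    while queue:
        u = queue.popleft()
        if u == dst:
            break
        for v in coupling[u]:
            if v not in prev:
                prev[v] = u
                queue.append(v)
    path = [dst]
    while prev[path[-1]] is not None:
        path.append(prev[path[-1]])
    return path[::-1]

def route(gates, layout):
    """Naive router: layout maps logical -> physical qubits; returns
    physical operations, inserting SWAPs to walk the control toward
    the target until the pair is adjacent."""
    ops = []
    for a, b in gates:                       # each gate is a logical CX
        path = shortest_path(layout[a], layout[b])
        for u, v in zip(path, path[1:-1]):   # swap along the path
            ops.append(("SWAP", u, v))
            inv = {p: l for l, p in layout.items()}
            layout[inv[u]], layout[inv[v]] = v, u
        ops.append(("CX", layout[a], layout[b]))
    return ops

ops = route([(0, 3)], {0: 0, 1: 1, 2: 2, 3: 3})
print(ops)   # two SWAPs, then the CX on an adjacent pair
```

    Even this toy shows why routing dominates transpiler cost on sparse topologies: a single long-range gate costs a chain of SWAPs, and better algorithms win by choosing initial layouts and swap paths that amortize that cost across the whole circuit.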

    Sonic Booms in Atmospheric Turbulence (SonicBAT): The Influence of Turbulence on Shaped Sonic Booms

    The objectives of the Sonic Booms in Atmospheric Turbulence (SonicBAT) Program were to develop and validate, via research flight experiments under a range of realistic atmospheric conditions, one numeric and one classic turbulence-model research code using traditional N-wave booms in the presence of atmospheric turbulence, and to apply these models to assess the effects of turbulence on the levels of shaped sonic booms predicted from low-boom aircraft designs. The SonicBAT program successfully investigated sonic boom turbulence effects through flight experiments at two NASA centers, Armstrong Flight Research Center (AFRC) and Kennedy Space Center (KSC), collecting a comprehensive set of acoustic and atmospheric turbulence data that were used to validate the numeric and classic turbulence models developed. The validated codes were incorporated into the PCBoom sonic boom prediction software and used to estimate the effect of turbulence on the levels of shaped sonic booms associated with several low-boom aircraft designs. The SonicBAT program was a four-year effort that consisted of turbulence model development and refinement throughout the entire period, as well as extensive flight test planning that culminated in the two research flight tests conducted in the second and third years of the program. The SonicBAT team, led by Wyle, includes partners from the Pennsylvania State University, Lockheed Martin, Gulfstream Aerospace, Boeing, Eagle Aeronautics, Technical & Business Systems, and the Laboratory of Fluid Mechanics and Acoustics (France). A number of collaborators, including the Japan Aerospace Exploration Agency, also participated by supporting the experiments with human and equipment resources at their own expense. Three NASA centers, AFRC, Langley Research Center (LaRC), and KSC, were essential to the planning and conduct of the experiments.
The experiments involved precision flight of either an F-18A or F-18B executing steady, level passes at supersonic airspeeds in a turbulent atmosphere to create sonic boom signatures that had been distorted by turbulence. The flights spanned a range of atmospheric turbulence conditions at NASA Armstrong and Kennedy in order to provide a variety of conditions for code validations. The SonicBAT experiments at both sites were designed to capture simultaneous F-18A or F-18B onboard flight instrumentation data, high fidelity ground based and airborne acoustic data, surface and upper air meteorological data, and additional meteorological data from ultrasonic anemometers and SODARs to determine the local atmospheric turbulence and boundary layer height