    Control of quantum phenomena: Past, present, and future

    Quantum control is concerned with the active manipulation of physical and chemical processes on the atomic and molecular scale. This work presents a perspective on progress in the field of control over quantum phenomena, tracing the evolution of theoretical concepts and experimental methods from early developments to the most recent advances. The current experimental successes would be impossible without the development of intense femtosecond laser sources and pulse shapers. The two most critical theoretical insights were (1) realizing that ultrafast atomic and molecular dynamics can be controlled via manipulation of quantum interferences and (2) understanding that optimally shaped ultrafast laser pulses are the most effective means for producing the desired quantum interference patterns in the controlled system. Finally, these theoretical and experimental advances were brought together by the crucial concept of adaptive feedback control, a laboratory procedure employing measurement-driven, closed-loop optimization to identify the best shapes of femtosecond laser control pulses for steering quantum dynamics towards the desired objective. Optimization in adaptive feedback control experiments is guided by a learning algorithm, with stochastic methods proving especially effective. Adaptive feedback control of quantum phenomena has found numerous applications in many areas of the physical and chemical sciences, and this paper reviews these extensive experiments. Other subjects discussed include quantum optimal control theory, quantum control landscapes, the role of theoretical control designs in experimental realizations, and real-time quantum feedback control. The paper concludes with a prospective view of open research directions that are likely to attract significant attention in the future. Comment: Review article, final version (significantly updated), 76 pages, accepted for publication in New J. Phys. (Focus issue: Quantum control).
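
    The closed-loop procedure described above lends itself to a compact illustration: a stochastic (evolutionary) search mutates pulse-shaper phase masks, each candidate is scored by a measurement, and the best-performing shapes seed the next generation. The sketch below is a toy model only, not an actual control experiment; the function measured_yield, the pixel count, and the population settings are assumptions standing in for the laboratory apparatus.

```python
# Hypothetical sketch of measurement-driven, closed-loop pulse optimization.
# The "experiment" here is a stand-in function; in the laboratory it would be
# the measured product yield for a pulse shaped by the given phase mask.
import numpy as np

rng = np.random.default_rng(0)
N_PIXELS = 32            # number of pulse-shaper phase pixels (assumed)
POP, GENERATIONS = 20, 50

def measured_yield(phase_mask):
    # Placeholder objective: in a real experiment this is the detector signal
    # for the shaped pulse; here we reward closeness to an arbitrary target mask.
    target = np.linspace(0, np.pi, N_PIXELS)
    return -np.sum((phase_mask - target) ** 2)

# Simple evolutionary loop: measure every candidate, keep the best quarter,
# and produce the next generation by mutating the survivors.
population = rng.uniform(0, 2 * np.pi, size=(POP, N_PIXELS))
for gen in range(GENERATIONS):
    scores = np.array([measured_yield(p) for p in population])
    elite = population[np.argsort(scores)[-POP // 4:]]
    children = (elite[rng.integers(len(elite), size=POP)]
                + rng.normal(0, 0.1, size=(POP, N_PIXELS)))
    population = np.clip(children, 0, 2 * np.pi)

best = population[np.argmax([measured_yield(p) for p in population])]
print("best simulated yield:", measured_yield(best))
```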

    Connecting Levels of Analysis in Educational Neuroscience: A Review of Multi-level Structure of Educational Neuroscience with Concrete Examples

    In its origins, educational neuroscience started as an endeavor to discuss the implications of neuroscience studies for education. However, it is now on its way to becoming a transdisciplinary field, incorporating findings, theoretical frameworks, and methodologies from education and the cognitive and brain sciences. Given the differences and diversity of the originating disciplines, it has been a challenge for educational neuroscience to integrate theoretical and methodological perspectives from education and neuroscience in a coherent way. We present a multi-level framework for educational neuroscience, which argues for the integration of multiple levels of analysis, some originating in the brain and cognitive sciences, others in education, as a roadmap for the future of educational neuroscience, with concrete examples in moral education.

    Constructing model robust mixture designs via weighted G-optimality criterion

    We propose and develop a new G-optimality criterion using the concept of weighted optimality criteria and certain additional generalizations. The goal of weighted G-optimality is to minimize a weighted average of the maximum scaled prediction variance in the design region over a set of reduced models. A genetic algorithm (GA) is used to generate weighted G-optimal exact designs in an experimental region for mixtures. The performance of the proposed GA designs is evaluated and compared to that of the designs produced by the PROC OPTEX exchange algorithm of SAS/QC. The evaluation demonstrates the advantages of the GA designs, showing that they have better model-robust properties and perform better than the designs generated by the exchange algorithm.
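
    As a rough illustration of the criterion being optimized, the sketch below scores candidate three-component mixture designs by a weighted average, over a few assumed reduced Scheffé models, of the maximum scaled prediction variance on a simplex grid. The model set, the weights, and the design size are invented for the example, and a simple random search stands in for the genetic algorithm used in the work.

```python
# Minimal sketch (not the paper's implementation) of a weighted G-criterion for
# a three-component mixture design.
import numpy as np

rng = np.random.default_rng(1)

def simplex_grid(step=0.1):
    # Grid of points (x1, x2, x3) with x1 + x2 + x3 = 1.
    return np.array([(a, b, 1 - a - b)
                     for a in np.arange(0, 1 + 1e-9, step)
                     for b in np.arange(0, 1 - a + 1e-9, step)])

def model_terms(x, terms):
    # Columns of f(x) for a (possibly reduced) Scheffe quadratic model.
    x1, x2, x3 = x
    full = {"x1": x1, "x2": x2, "x3": x3,
            "x1x2": x1 * x2, "x1x3": x1 * x3, "x2x3": x2 * x3}
    return np.array([full[t] for t in terms])

MODELS = [("x1", "x2", "x3"),
          ("x1", "x2", "x3", "x1x2"),
          ("x1", "x2", "x3", "x1x2", "x1x3", "x2x3")]
WEIGHTS = np.array([0.2, 0.3, 0.5])      # assumed model weights

def weighted_g(design, grid):
    # Weighted average over models of the maximum scaled prediction variance.
    n = len(design)
    crit = 0.0
    for w, terms in zip(WEIGHTS, MODELS):
        X = np.array([model_terms(x, terms) for x in design])
        info_inv = np.linalg.pinv(X.T @ X)
        spv = [n * model_terms(g, terms) @ info_inv @ model_terms(g, terms)
               for g in grid]
        crit += w * max(spv)
    return crit

# Random search stand-in for the genetic algorithm: keep the best of many
# randomly drawn 8-point designs on the simplex grid.
grid = simplex_grid()
best = min((grid[rng.choice(len(grid), 8, replace=False)] for _ in range(200)),
           key=lambda d: weighted_g(d, grid))
print("best weighted G-criterion found:", weighted_g(best, grid))
```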

    Second-order nearly orthogonal Latin hypercubes for exploring stochastic simulations

    The article of record as published may be found at http://dx.doi.org/10.1057/jos.2016.8
    This paper presents new Latin hypercube designs with minimal correlations between all main, quadratic, and two-way interaction effects for a full second-order model. These new designs facilitate exploratory analysis of stochastic simulation models in which there is considerable a priori uncertainty about the forms of the responses. We focus on understanding the underlying complexities of simulated systems by exploring the input variables’ effects on the behavior of simulation responses. These new designs allow us to determine the driving factors, detect interactions between input variables, identify points of diminishing or increasing rates of return, and find thresholds or change points in localized areas. Our proposed designs enable analysts to fit many diverse metamodels to multiple outputs with a single set of runs. Creating these designs is computationally intensive; therefore, several have been cataloged and made available online to experimenters. Office of Naval Research (N0001412WX20823).
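
    The quantity these designs keep small can be illustrated directly: build the second-order model matrix (main effects, squares, and two-way interactions) for a Latin hypercube and take the largest absolute pairwise correlation among its columns. The sketch below scores a random Latin hypercube under assumed dimensions; the catalogued designs are constructed so that this value is close to zero rather than drawn at random.

```python
# Score a random Latin hypercube by the maximum absolute pairwise correlation
# among main, quadratic, and two-way-interaction columns (assumed dimensions).
import itertools
import numpy as np

rng = np.random.default_rng(2)
runs, k = 33, 4                                # number of runs and factors

# Random Latin hypercube: each column is a random permutation of the levels.
lh = np.column_stack([rng.permutation(runs) for _ in range(k)])
X = (lh - (runs - 1) / 2) / ((runs - 1) / 2)   # scale levels to [-1, 1]

# Second-order model columns: main effects, squares, all two-way interactions.
cols = [X[:, i] for i in range(k)]
cols += [X[:, i] ** 2 for i in range(k)]
cols += [X[:, i] * X[:, j] for i, j in itertools.combinations(range(k), 2)]

M = np.column_stack(cols)
corr = np.corrcoef(M, rowvar=False)
max_offdiag = np.max(np.abs(corr - np.eye(corr.shape[0])))
print(f"max |pairwise correlation| among model terms: {max_offdiag:.3f}")
```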

    Computational Techniques to Predict Orthopaedic Implant Alignment and Fit in Bone

    Among the broad palette of surgical techniques employed in current orthopaedic practice, joint replacement represents one of the most difficult and costliest surgical procedures. While numerous recent advances suggest that computer assistance can dramatically improve the precision and long-term outcomes of joint arthroplasty even in the hands of experienced surgeons, many joint replacement protocols continue to rely almost exclusively on an empirical basis that often entails a succession of trial-and-error maneuvers that can only be performed intraoperatively. Although the surgeon is generally unable to accurately and reliably predict a priori what the final malalignment will be, or even what implant size should be used for a certain patient, the overarching goal of all arthroplastic procedures is to ensure that an appropriate match exists between the native and prosthetic axes of the articulation. To address this relative lack of knowledge, the main objective of this thesis was to develop a comprehensive library of numerical techniques capable of: 1) accurately reconstructing the outer and inner geometry of the bone to be implanted; 2) determining the location of the native articular axis to be replicated by the implant; 3) assessing the insertability of a certain implant within the endosteal canal of the bone to be implanted; and 4) proposing customized implant geometries capable of ensuring minimal malalignment between native and prosthetic axes. The accuracy of the developed algorithms was validated through comparisons against conventional methods involving either contact-acquired data or navigated implantation approaches, while the various customized implant designs proposed were tested with an original numerical implantation method. It is anticipated that the proposed computer-based approaches will eliminate, or at least diminish, the need for undesirable trial-and-error implantation procedures, in the sense that present error-prone intraoperative implant insertion decisions will be at least augmented, if not replaced, by optimal computer-based solutions offering reliable virtual “previews” of the future surgical procedure. While the entire thesis is focused on the elbow as the most challenging joint replacement surgery, many of the developed approaches are equally applicable to other upper- or lower-limb articulations.
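
    As a hedged illustration of one possible building block in such a pipeline (not the thesis's actual method), the sketch below estimates the long axis of a canal-like surface point cloud by principal component analysis; the point cloud, its noise level, and all dimensions are synthetic stand-ins for segmented bone geometry.

```python
# Estimate a canal axis from surface points via PCA (synthetic illustration).
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for segmented endosteal-canal surface points: a noisy
# cylinder of radius ~5 mm along an arbitrary direction.
t = rng.uniform(0, 100, 2000)                       # position along the axis (mm)
theta = rng.uniform(0, 2 * np.pi, 2000)
direction = np.array([0.2, 0.1, 0.97])
direction /= np.linalg.norm(direction)
u = np.cross(direction, [1.0, 0.0, 0.0])            # vectors spanning the plane
u /= np.linalg.norm(u)                              # perpendicular to the axis
v = np.cross(direction, u)
points = (np.outer(t, direction)
          + 5.0 * (np.outer(np.cos(theta), u) + np.outer(np.sin(theta), v))
          + rng.normal(0, 0.3, (2000, 3)))

# The dominant principal direction of the centered cloud approximates the axis.
centered = points - points.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
axis = vt[0]
angle = np.degrees(np.arccos(abs(axis @ direction)))
print(f"estimated axis deviates from the true axis by {angle:.2f} degrees")
```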

    From Social Simulation to Integrative System Design

    As the recent financial crisis showed, today there is a strong need to gain an "ecological perspective" of all relevant interactions in socio-economic-techno-environmental systems. For this, we have suggested setting up a network of Centers for Integrative Systems Design, which would be able to run all potentially relevant scenarios, identify causality chains, explore feedback and cascading effects for a number of model variants, and determine the reliability of their implications (given the validity of the underlying models). They would be able to detect possible negative side effects of policy decisions before they occur. The Centers belonging to this network would each focus on a particular field, but they would be part of an attempt to eventually cover all relevant areas of society and the economy and integrate them within a "Living Earth Simulator". The results of all research activities of such Centers would be turned into informative input for political Decision Arenas. For example, Crisis Observatories (for financial instabilities, shortages of resources, environmental change, conflict, spreading of diseases, etc.) would be connected with such Decision Arenas for the purpose of visualization, in order to make complex interdependencies understandable to scientists, decision-makers, and the general public. Comment: 34 pages, Visioneer White Paper, see http://www.visioneer.ethz.c

    Simulations-Guided Design of Process Analytical Sensor Using Molecular Factor Computing

    Many areas of science now generate huge volumes of data that present visualization, modeling, and interpretation challenges. Methods for effectively representing the original data in a reduced coordinate space are therefore receiving much attention. The purpose of this research is to test the hypothesis that molecular computing of vectors for transformation matrices enables spectra to be represented in any arbitrary coordinate system. New coordinate systems are selected to reduce the dimensionality of the spectral hyperspace and simplify the mechanical/electrical/computational construction of a spectrometer. A novel integrated sensing and processing system, termed a Molecular Factor Computing (MFC)-based near-infrared (NIR) spectrometer, is proposed in this dissertation. In an MFC-based NIR spectrometer, spectral features are encoded by the transmission spectra of MFC filters, which effectively compute the calibration function or the discriminant functions by weighting the signals received from a broad wavelength band. Compared with conventional spectrometers, the novel NIR analyzer proposed in this work is orders of magnitude faster and more rugged without sacrificing accuracy, which makes it an ideal analytical tool for process analysis. Two different MFC filter-generating algorithms are developed and tested for searching a near-infrared spectral library to select molecular filters for MFC-based spectroscopy. The first, which uses genetic algorithms coupled with predictive modeling methods to select MFC filters from a spectral library for quantitative prediction, is described first. The second filter-generating algorithm, designed to select MFC filters for qualitative classification, is then presented. The concept of MFC-based predictive spectroscopy is demonstrated with quantitative analysis of ethanol-in-water mixtures in an MFC-based prototype instrument.
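
    To illustrate the filter-selection idea in miniature, the sketch below treats each library spectrum as a candidate MFC filter, models the detector reading as the inner product of the filter transmission with the sample spectrum, and picks the filter pair whose readings best predict concentration in a least-squares fit. The data are synthetic and an exhaustive pair search stands in for the genetic algorithm described in the dissertation.

```python
# Illustrative sketch (synthetic data, not the dissertation's library or code)
# of selecting molecular filters whose detector readings predict concentration.
import itertools
import numpy as np

rng = np.random.default_rng(4)
n_wl, n_lib, n_samples = 200, 12, 60

library = rng.uniform(0, 1, (n_lib, n_wl))               # filter transmissions
analyte_band = np.exp(-((np.arange(n_wl) - 80) / 15.0) ** 2)
conc = rng.uniform(0, 1, n_samples)
spectra = (np.outer(conc, analyte_band)                   # analyte contribution
           + rng.uniform(0, 0.5, (n_samples, n_wl)))      # background + noise

def fit_error(filter_idx):
    # Detector readings for the chosen filters, then a least-squares fit to conc.
    readings = spectra @ library[list(filter_idx)].T
    A = np.column_stack([readings, np.ones(n_samples)])
    coef, *_ = np.linalg.lstsq(A, conc, rcond=None)
    return np.sqrt(np.mean((A @ coef - conc) ** 2))

# Exhaustive search over filter pairs stands in for the GA used in the work.
best_pair = min(itertools.combinations(range(n_lib), 2), key=fit_error)
print("selected filters:", best_pair, "RMSE:", round(fit_error(best_pair), 4))
```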