    Quantum initial value representations using approximate Bohmian trajectories

    Quantum trajectories, originating from the de Broglie-Bohm (dBB) hydrodynamic description of quantum mechanics, are used to construct time-correlation functions in an initial value representation (IVR). The formulation is fully quantum mechanical, and the resulting equations for the correlation functions are similar in form to their semi-classical analogs but do not require the computation of the stability or monodromy matrix or conjugate points. We then move to a {\em local} trajectory description by evolving the cumulants of the wave function along each individual path. The resulting equations of motion form an infinite hierarchy, which we truncate at a given order. We show that time-correlation functions computed using these approximate quantum trajectories can be used to accurately compute the eigenvalue spectrum for various potential systems. Comment: 7 pages, 6 figures
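
    As a minimal illustration of the spectral relation this abstract relies on (eigenvalues appearing as peaks in the Fourier transform of a time-correlation function), the sketch below uses a toy set of harmonic-oscillator levels in place of the paper's trajectory-based propagation; all names and values are illustrative, not taken from the paper.

        # Minimal sketch (not the paper's quantum-trajectory machinery): eigenvalues
        # recovered as peaks in the Fourier transform of the survival amplitude
        # C(t) = <psi(0)|psi(t)>.  Toy harmonic-oscillator levels stand in for the
        # trajectory-based propagation; weights are an arbitrary illustrative choice.
        import numpy as np

        E = np.array([0.5, 1.5, 2.5, 3.5])      # toy levels (hbar * omega = 1)
        c = np.array([0.6, 0.5, 0.4, 0.3])
        w_n = c**2 / np.sum(c**2)               # normalized weights |c_n|^2

        N = 2**14
        T = 128.0 * np.pi                       # total time chosen so levels sit on exact bins
        dt = T / N
        t = np.arange(N) * dt

        # survival amplitude C(t) = sum_n |c_n|^2 exp(-i E_n t)
        C = np.sum(w_n[:, None] * np.exp(-1j * E[:, None] * t[None, :]), axis=0)

        # I(w) ~ |int dt exp(+i w t) C(t)|: peaks sit at the eigenvalues E_n
        w = 2.0 * np.pi * np.fft.fftfreq(N, d=dt)
        I = np.abs(np.fft.ifft(C))
        print(np.sort(w[np.argsort(I)[-4:]]))   # -> [0.5 1.5 2.5 3.5]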

    Beable trajectories for revealing quantum control mechanisms

    The dynamics induced while controlling quantum systems by optimally shaped laser pulses have often been difficult to understand in detail. A method is presented for quantifying the importance of specific sequences of quantum transitions involved in the control process. The method is based on a "beable" formulation of quantum mechanics due to John Bell that rigorously maps the quantum evolution onto an ensemble of stochastic trajectories over a classical state space. Detailed mechanism identification is illustrated with a model 7-level system. A general procedure is presented to extract mechanism information directly from closed-loop control experiments. Application to simulated experimental data for the model system proves robust with up to 25% noise. Comment: LaTeX, 20 pages, 13 figures
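
    A minimal sketch of the kind of beable trajectory the abstract describes, assuming Bell's standard jump-rate prescription (the rate from level n to level m is the positive part of the probability current divided by the occupation of n) and a hypothetical 3-level Hamiltonian rather than the paper's 7-level model:

        # Sketch of Bell-type "beable" trajectories for a small N-level system.
        # Assumes the standard jump-rate prescription T(n->m) = max(J_mn, 0) / P_n with
        # J_mn = 2 Im[psi_m* H_mn psi_n] (hbar = 1).  The Hamiltonian and initial state
        # are illustrative placeholders, not the paper's 7-level model.
        import numpy as np

        rng = np.random.default_rng(0)
        H = np.array([[1.0, 0.2, 0.0],
                      [0.2, 1.5, 0.3],
                      [0.0, 0.3, 2.0]])                   # Hermitian model Hamiltonian
        psi = np.array([1.0, 0.0, 0.0], dtype=complex)    # start in level 0
        level = 0                                         # current beable (occupied level)
        dt, steps = 0.001, 5000

        for _ in range(steps):
            # probability currents J[m, n] = 2 Im[psi_m* H_mn psi_n]
            J = 2.0 * np.imag(np.conj(psi)[:, None] * H * psi[None, :])
            P_n = np.abs(psi[level])**2
            rates = np.maximum(J[:, level], 0.0) / max(P_n, 1e-12)
            probs = rates * dt
            if rng.random() < probs.sum():
                level = rng.choice(len(psi), p=probs / probs.sum())
            # advance the wave function (first-order Euler step; fine for a sketch)
            psi = psi - 1j * dt * (H @ psi)
            psi /= np.linalg.norm(psi)

        print("final occupied level:", level)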

    The Canadian Joint Replacement Registry—what have we learned?

    The Canadian Joint Replacement Registry (CJRR) was launched in 2000 through the collaborative efforts of the Canadian Orthopaedic Association and the Canadian Institute for Health Information. Participation is voluntary, and data collected by participating surgeons in the operating room are linked to hospital-stay information from administrative databases to compile yearly reports. In the fiscal year 2006–2007, there were 62,196 hospitalizations for hip and knee replacements in Canada, excluding Quebec. This represents a 10-year increase of 101% and a 1-year increase of 6%. Compared to men, Canadian women have higher age-adjusted rates per 100,000 population for both total knee arthroplasty (TKA) (148 vs. 110) and total hip arthroplasty (THA) (86 vs. 76). There are also substantial inter-provincial variations in both age-adjusted rates of arthroplasty and implant utilization that cannot be explained entirely on the basis of differing patient demographics. The reasons for these variations are unclear, but they probably reflect factors such as differences in provincial health expenditure, efforts to reduce waiting lists, and surgeon preference. The main challenge currently facing the CJRR is to increase procedure capture to > 90%. This is being pursued through a combination of efforts, including simplification of the consent process, streamlining of the data collection form, and the production of customized reports with information that has direct clinical relevance for surgeons and administrators. As the CJRR continues to mature, we are optimistic that it will provide clinically important information on the wide range of factors that affect arthroplasty outcomes.

    Enabling and scaling biomolecular simulations of 100 million atoms on petascale machines with a multicore-optimized message-driven runtime

    A 100-million-atom biomolecular simulation with NAMD is one of the three benchmarks for the NSF-funded sustainable petascale machine. Simulating this large molecular system on a petascale machine presents great challenges, including handling I/O, a large memory footprint, and getting good strong-scaling results. In this paper, we present parallel I/O techniques to enable the simulation. A new SMP model is designed to efficiently utilize ubiquitous wide multicore clusters by extending the CHARM++ asynchronous message-driven runtime. We exploit node-aware techniques to optimize both the application and the underlying SMP runtime. Hierarchical load balancing is further exploited to scale NAMD to the full Jaguar PF Cray XT5 (224,076 cores) at Oak Ridge National Laboratory, both with and without PME full electrostatics, achieving 93% parallel efficiency (vs. 6,720 cores) at 9 ms per step for a simple cutoff calculation. Excellent scaling is also obtained on 65,536 cores of the Intrepid Blue Gene/P at Argonne National Laboratory.
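
    For reference, the quoted 93% is strong-scaling parallel efficiency measured against the 6,720-core baseline; a small hedged helper showing how such a figure is computed (the step times below are made-up placeholders, not measurements from the paper):

        # Strong-scaling parallel efficiency relative to a baseline run:
        #   efficiency = (t_base * cores_base) / (t_large * cores_large)
        # The step times here are illustrative placeholders, not NAMD measurements.
        def parallel_efficiency(cores_base, t_base, cores_large, t_large):
            return (t_base * cores_base) / (t_large * cores_large)

        # e.g. a hypothetical 270 ms/step on 6,720 cores and 9 ms/step on 224,076 cores
        print(parallel_efficiency(6720, 0.270, 224076, 0.009))   # ~0.90, i.e. ~90%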

    Outcomes of unilateral and bilateral total knee arthroplasty in 238,373 patients

    © 2016 The Author(s). Published by Taylor & Francis on behalf of the Nordic Orthopedic Federation. Background and purpose — There is no consensus about the outcome of simultaneous vs. staged bilateral total knee arthroplasty (TKA). We examined this issue by analyzing 238,373 patients. Patients and methods — Demographic, clinical, and outcome data were evaluated for TKA patients (unilateral: 206,771; simultaneous bilateral: 6,349; staged bilateral: 25,253) from the Canadian Hospital Morbidity Database for fiscal years 2006–2007 to 2012–2013. Outcomes were adjusted for age, sex, comorbidities, and hospital TKA volume. Results — Simultaneous bilateral TKA patients were younger than staged bilateral TKA patients (median 64 years vs. 66 years), were more likely to be male (41% vs. 39%), and had a lower frequency of having ≥1 comorbid condition (2.9% vs. 4.2%). They also had a higher frequency of blood transfusions (41% vs. 19%), a shorter median length of stay (6 days vs. 8 days), a higher frequency of transfer to a rehabilitation facility (46% vs. 9%), and a lower frequency of knee infection (0.5% vs. 0.9%) than staged bilateral TKA patients, but they had a higher rate of cardiac complications within 90 days (2.0% vs. 1.7%). Simultaneous patients had higher in-hospital mortality than staged patients after the second TKA (0.16% vs. 0.06%), but similar in-hospital mortality to unilateral patients (0.16% vs. 0.14%). The cumulative 3-year revision rate was highest in the unilateral group (2.3%) and similar in the staged and simultaneous bilateral groups (1.4%). Interpretation — We found important differences between the outcomes of simultaneous and staged bilateral TKA. Further clarification of outcomes would best be determined in an adequately powered randomized trial, which would remove the selection bias inherent in this retrospective study design.

    A Preliminary Seismic Analysis of 51 Peg: Large and Small Spacings from Standard Models

    We present a preliminary theoretical seismic study of the astronomically famous star 51 Peg. This is done by first performing a detailed analysis within the Hertzsprung-Russell diagram (HRD). Using the Yale stellar evolution code (YREC), a grid of stellar evolutionary tracks has been constructed for the masses 1.00 M_sun, 1.05 M_sun, and 1.10 M_sun, in the metallicity range Z = 0.024-0.044, and for values of the Galactic helium enrichment ratio DY/DZ in the range 0-2.5. Along these evolutionary tracks, we select 75 stellar model candidates that fall within the 51 Peg observational error box in the HRD (all turn out to have masses of 1.05 M_sun or 1.10 M_sun). The corresponding allowable age range for these models, which depends sensitively on the parameters of the model, is relatively large, ~2.5-5.5 Gyr. For each of the 75 models, a non-radial pulsation analysis is carried out, and the large and small frequency spacings are calculated. The results show that just measuring the large and small frequency spacings will greatly reduce the present uncertainties in the derived physical parameters and in the age of 51 Peg. Finally, we briefly discuss refinements in the physics of the models and in the method of analysis which will have to be included in future models to make the best of the precise frequency determinations expected from space observations. Comment: 22 pages, 5 figures, 3 tables. Accepted for publication by Ap
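
    For context, the large and small frequency spacings referred to here are the standard asteroseismic combinations (conventional definitions, not formulas quoted from the paper):

        \Delta\nu_{n,\ell} = \nu_{n,\ell} - \nu_{n-1,\ell}, \qquad
        \delta\nu_{n,\ell} = \nu_{n,\ell} - \nu_{n-1,\ell+2},

    where \nu_{n,\ell} is the frequency of the p mode with radial order n and angular degree \ell. The large spacing \Delta\nu scales roughly with the square root of the mean stellar density, while the small spacing \delta\nu probes the core sound-speed profile and is therefore sensitive to age.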

    Bell nonlocality, signal locality and unpredictability (or What Bohr could have told Einstein at Solvay had he known about Bell experiments)

    The 1964 theorem of John Bell shows that no model that reproduces the predictions of quantum mechanics can simultaneously satisfy the assumptions of locality and determinism. On the other hand, the assumptions of \emph{signal locality} plus \emph{predictability} are also sufficient to derive Bell inequalities. This simple theorem, previously noted but published only relatively recently by Masanes, Acin and Gisin, has fundamental implications that have not been entirely appreciated. Firstly, nothing can be concluded about the ontological assumptions of locality or determinism independently of each other -- it is possible to reproduce quantum mechanics with deterministic models that violate locality as well as with indeterministic models that satisfy locality. On the other hand, the operational assumption of signal locality is an empirically testable (and well-tested) consequence of relativity. Thus Bell inequality violations imply that we can trust that some events are fundamentally \emph{unpredictable}, even if we cannot trust that they are indeterministic. This result grounds the quantum-mechanical prohibition of arbitrarily accurate predictions on the assumption of no superluminal signalling, regardless of any postulates of quantum mechanics. It also sheds new light on an early stage of the historical debate between Einstein and Bohr. Comment: Substantially modified version; added HMW as co-author
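
    A minimal sketch of the kind of Bell-inequality violation the argument rests on: the CHSH combination of correlations is bounded by 2 for any local deterministic model, while the quantum singlet correlation E(a, b) = -cos(a - b) reaches 2*sqrt(2) at suitable settings (textbook CHSH setup, not a construction from the paper):

        # CHSH sketch: |S| <= 2 for local deterministic models, but the quantum
        # singlet correlation E(a, b) = -cos(a - b) gives |S| = 2*sqrt(2) at the
        # settings below.  Standard textbook example, not taken from the paper.
        import numpy as np

        def E(a, b):
            return -np.cos(a - b)          # singlet-state spin correlation

        a, a2 = 0.0, np.pi / 2             # Alice's two measurement settings
        b, b2 = np.pi / 4, -np.pi / 4      # Bob's two measurement settings

        S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)
        print(abs(S))                      # ~2.828 > 2, violating the CHSH bound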

    Simulating Large Scale Parallel Applications Using Statistical Models for Sequential Execution Blocks

    Predicting the sequential execution blocks of a large-scale parallel application is an essential part of accurately predicting the overall performance of the application. When simulating a future machine that is not yet fabricated, or a prototype system available only at small scale, this becomes a significant challenge. Using hardware simulators may not be feasible due to excessive slowdowns and insufficient resources, and these difficulties grow with the scale of the simulation. In this paper, we propose an approach based on statistical models to accurately predict the performance of the sequential execution blocks that comprise a parallel application. We deployed these techniques in a trace-driven simulation framework to capture both the detailed behavior of the application and the overall predicted performance. The technique is validated using both synthetic benchmarks and the NAMD application. Index Terms: parallel simulator, performance prediction, trace-driven, machine learning, statistical model
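
    A hedged sketch of the general idea: fit a simple statistical model (an ordinary least-squares fit here, standing in for whatever models the paper actually uses) that predicts a sequential block's execution time from a few block-level features, then query it when replaying a trace on the simulated machine; the features and timings below are illustrative placeholders.

        # Sketch: predict per-block sequential execution time from block features
        # with an ordinary least-squares model.  The features (instruction count,
        # memory traffic) and the timings are made-up placeholders; the paper's
        # actual models and feature set may differ.
        import numpy as np

        # training data gathered from small-scale runs: [instructions, bytes_moved]
        X = np.array([[1.0e6, 2.0e5],
                      [2.0e6, 1.0e5],
                      [4.0e6, 8.0e5],
                      [8.0e6, 3.0e5]])
        t = np.array([1.2e-3, 1.9e-3, 4.6e-3, 7.4e-3])   # measured block times (s)

        A = np.hstack([X, np.ones((X.shape[0], 1))])      # add an intercept column
        coef, *_ = np.linalg.lstsq(A, t, rcond=None)

        def predict_block_time(instructions, bytes_moved):
            """Estimated sequential block time on the target machine (sketch only)."""
            return coef[0] * instructions + coef[1] * bytes_moved + coef[2]

        # used during trace-driven replay for each sequential execution block
        print(predict_block_time(3.0e6, 4.0e5))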