1,797 research outputs found

    Domain decomposition and multilevel integration for fermions

    The numerical computation of many hadronic correlation functions is exceedingly difficult due to the exponentially decreasing signal-to-noise ratio with the distance between source and sink. Multilevel integration methods, using independent updates of separate regions in space-time, are known to be able to solve such problems but have so far been available only for pure gauge theory. We present first steps in the direction of making such integration schemes amenable to theories with fermions, by factorizing a given observable via an approximate domain decomposition of the quark propagator. This allows for multilevel integration of the (large) factorized contribution to the observable, while its (small) correction can be computed in the standard way. Comment: 14 pages, 6 figures; v2: published version. Talk presented at the 34th annual International Symposium on Lattice Field Theory, 24-30 July 2016, University of Southampton, UK

    Local multiboson factorization of the quark determinant

    We discuss the recently proposed multiboson domain-decomposed factorization of the gauge-field dependence of the fermion determinant in lattice QCD. In particular, we focus on the case of a lattice divided into an arbitrary number of thick time slices. As a consequence, multiple space-time regions can be updated independently, which allows one to address the exponential degradation of the signal-to-noise ratio of correlation functions with multilevel Monte Carlo sampling. We show numerical evidence of the effectiveness of a two-level integration for pseudoscalar propagators with momentum and for vector propagators, in a setup with two active regions. These results are relevant to lattice computations of the hadronic contributions to the anomalous magnetic moment of the muon and to heavy meson decay form factors. Comment: 8 pages, 4 figures. Talk presented at the 35th International Symposium on Lattice Field Theory, 18-24 June 2017, Granada, Spain
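The statistical benefit of two-level integration can be illustrated with a toy model. This is a deliberately simplified sketch, not the paper's lattice setup: it assumes an observable that factorizes exactly into two statistically independent region contributions, each a tiny signal buried in large fluctuations, and compares averaging the product configuration by configuration against averaging each factor independently first.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: a correlator factorizes as C = <A><B>, with each factor a
# small signal buried in large fluctuations (mimicking the exponential
# signal-to-noise degradation of hadronic correlators).
signal_a, signal_b, noise = 0.01, 0.01, 1.0
n = 10_000

a = signal_a + noise * rng.standard_normal(n)   # region-1 factor
b = signal_b + noise * rng.standard_normal(n)   # region-2 factor

# One-level estimator: average the product configuration by configuration;
# its statistical error scales like noise**2 / sqrt(n).
one_level = np.mean(a * b)

# Two-level estimator: average each factor over its own region updates,
# then multiply; each factor's error is noise / sqrt(n), so the product's
# error is much smaller than the one-level error here.
two_level = np.mean(a) * np.mean(b)

true_value = signal_a * signal_b
print(one_level, two_level, true_value)
```

With these parameters the one-level estimate is noise-dominated, while the two-level estimate lands far closer to the true value of 1e-4, which is the mechanism the multilevel sampling exploits.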

    Immutability and Encapsulation for Sound OO Information Flow Control


    Finding Reaction Pathways with Optimal Atomic Index Mappings

    Finding complex reaction and transformation pathways involving many intermediate states is, in general, not possible at the density-functional theory level with existing simulation methods, due to the very large number of required energy and force evaluations. For complex reactions, it is not possible to determine which atom in the reactant is mapped onto which atom in the product, and trying out all possible atomic index mappings is not feasible because of the factorial growth in the number of possible mappings. We use a penalty function that is invariant under index permutations to bias the potential energy surface in such a way that it acquires the characteristics of a structure seeker, whose global minimum is the reaction product. By performing a minima-hopping-based global optimization on this biased potential energy surface, we rapidly find intermediate states that lead into the global minimum and allow us to then extract entire reaction pathways. We first demonstrate for a benchmark system, the Lennard-Jones cluster LJ38, that our method finds intermediate states relevant to the lowest-energy reaction pathway, and hence needs to consider far fewer intermediate states than previous methods. Finally, we apply the method to two real systems, C60 and C20H20, and show that the reaction pathways found contain valuable information on how these molecules can be synthesized.
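The key ingredient above is a penalty that does not depend on how atoms are numbered. As a toy illustration (a simplified stand-in, not the paper's actual biasing functional), one can fingerprint a structure by its sorted list of interatomic distances, which is unchanged under any relabeling of the atoms:

```python
import numpy as np

def sorted_distances(coords):
    """Permutation-invariant fingerprint: the sorted list of all pairwise
    interatomic distances (relabeling atoms only reorders the pairs)."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    iu = np.triu_indices(len(coords), k=1)
    return np.sort(d[iu])

def structure_seeker_penalty(coords, target_coords, alpha=1.0):
    """Toy bias term: vanishes when the structure matches the target up to
    an index permutation, and grows as the fingerprints diverge."""
    diff = sorted_distances(coords) - sorted_distances(target_coords)
    return alpha * float(np.sum(diff**2))

# A permuted copy of the target incurs zero penalty:
target = np.array([[0.0, 0.0, 0.0],
                   [1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0]])
permuted = target[[2, 0, 1]]
print(structure_seeker_penalty(permuted, target))  # 0.0
```

Adding such a term to the physical energy biases the surface toward the product regardless of which index mapping the optimizer happens to explore; the real method's penalty and global optimizer (minima hopping) are of course more elaborate.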

    Information Flow Control-by-Construction for an Object-Oriented Language Using Type Modifiers

    In security-critical software applications, confidential information must be prevented from leaking to unauthorized sinks. Static analysis techniques are widely used to enforce secure information flow by checking a program after construction. A drawback of these systems is that incomplete programs cannot be checked properly during construction, and most systems do not guide the user toward a secure program. We introduce IFbCOO, an approach that guides users incrementally to a secure implementation by using refinement rules. In each refinement step, confidentiality or integrity (or both) is guaranteed alongside the functional correctness of the program, such that insecure programs are rejected by construction. In this work, we formalize IFbCOO and prove the soundness of the refinement rules. We implement IFbCOO in the tool CorC and conduct a feasibility study by successfully implementing case studies.

    Precise Measures of Orbital Period, Before and After Nova Eruption for QZ Aurigae

    For the ordinary classical nova QZ Aurigae (which erupted in 1964), we report 1317 magnitudes from 1912--2016, including four eclipses detected on archival photographic plates from long before the eruption. We have accurate and robust measures of the orbital period both pre-eruption and post-eruption, and we find that the orbital period decreased, with a fractional change of -290.71 ± 0.28 parts per million across the eruption, with the orbit necessarily getting smaller. Further, we find that the light curve outside of eclipses and eruption is flat near B = 17.14 from 1912--1981, whereupon the average light curve starts fading down to B = 17.49 with large variability. QZ Aur is a robust counter-example against the Hibernation model for the evolution of cataclysmic variables, which requires that all novae have their period increase across eruptions. Large period decreases across eruptions can easily arise from mass imbalances in the ejecta, as are commonly seen in asymmetric nova shells. Comment: MNRAS in press, 24 pages, 5 tables, 6 figures

    Congestion-clearing payments to passengers

    This paper reports on a project that considers whether the goals of (de)congestion pricing could be achieved in whole or in part by incentivizing mode shift rather than using charging to force it: buying rather than selling decongestion. The project developed a method for estimating the net present value of the costs and benefits of a permanent ITS-enabled program of paying people to travel as passengers rather than as drivers, in order to reduce existing congestion in a target corridor to a target maximum level of delay, taking into account the mix of the traffic and the potential impact of latent demand and induced trips. This is relevant for making better use of existing infrastructure (a 'build nothing' alternative to expansion, but not a 'do nothing' one), for decarbonizing transport, and in the run-up to automated vehicles, where the possibility exists that new infrastructure investments in the 1-20-year timeframe will become stranded assets under some future scenarios. The project incorporated a thorough review of the literature, focus groups, and a survey in a case study corridor in California to test the theory, develop the method, and determine the likely costs and benefits. The main insights include 1) the significance of an 'intra-peak demand shift' that would occur if congestion were removed; 2) the need for four major components in a congestion-clearing payments program: a) incentives to switch from driving to being a passenger, b) incentives to travel at less preferred times, c) park-and-ride/pool facilities near the bottleneck to ease the passenger switch, and d) some limitation on single-occupant vehicle travel in the peak-of-the-peak in order to reserve space for vehicles carrying passengers; and 3) the possible need for different land-use regulations in a successful payments-to-passengers environment, where the amount of traffic might no longer be an obvious constraint on expanding the local economy. The case study benefit-cost analysis delivers a benefit-cost ratio of 4.5 to 1.

    Congestion-Clearing Payments to Passengers

    Peak-period motor vehicle traffic volume congests roads all over the world. This project hypothesizes implementing congestion-clearing payments to passengers as a permanent congestion-management solution. Ongoing congestion-free travel would be achieved by removing existing congestion and absorbing (re)generated demand, at costs that would be expected to increase as the total number of travelers increases over time. The project develops a comprehensive, step-by-step methodology to calculate the benefits and costs of paying drivers to become passengers at a congestion-clearing level and to maintain this level over time. The method is derived from the literature, analysis by the project team, and development of a case study. The case study, based on a long-standing bottleneck location in California, enabled the project team to think through the real challenges of developing and evaluating such a solution. The project finds that the conceptual underpinning of the solution is sound. Based on a survey, the case study finds that there is a level of payment that could clear congestion and maintain free flow for twenty years, with benefits that outweigh costs on a net present value basis by about four to one, though calibration is required. After the initial reward clears the queue at the bottleneck, a significant intra-peak demand shift would occur as existing and new travelers depart home at times that are more to their liking, potentially causing the queue to re-form. A second incentive therefore manages time of travel, rewarding people for traveling as passengers earlier (or later) than the preferred high-demand peak-of-the-peak. In the case study, the high proportion of people who say they will only drive alone would eventually result in some periods of single-occupant-vehicle-only traffic during the peak, an unintended and undesirable consequence. For the case study route, a limit on single-occupant-vehicle travel during the peak-of-the-peak would ensure that high-occupancy-vehicle travel is given preference and would reduce the overall cost of the solution. For the case study, the cost of the congestion-clearing payments-to-passengers solution on a net present value basis is within the estimated range of costs of the alternative of expanding the facility, and the benefits are expected to be greater than for facility expansion. Congestion-clearing payments to passengers can be implemented much sooner and will have greater positive long-term economic impacts, whereas facility expansion would provide lower and shorter-term benefits and would be expected to return to congested conditions within a year. The project team proposes a pilot project on the case study route to test and calibrate the solution, and recommends developing further case study routes to find out how different routes vary and determine the causes of any variations.
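The net-present-value comparison at the heart of the evaluation can be sketched in a few lines. The yearly figures and the discount rate below are illustrative assumptions chosen to mirror the reported "about four to one" ratio, not the study's actual inputs:

```python
def npv(cashflows, rate):
    """Net present value of a stream of yearly cashflows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical 20-year program with constant yearly flows (illustrative):
years = 20
rate = 0.04                  # assumed discount rate
benefits = [40.0] * years    # delay savings per year, arbitrary units
costs = [10.0] * years       # passenger payments per year, arbitrary units

# Benefit-cost ratio on a net present value basis.
bcr = npv(benefits, rate) / npv(costs, rate)
print(round(bcr, 2))  # prints 4.0: constant flows give the ratio of the flows
```

With constant flows the discount factors cancel, so the ratio equals the ratio of yearly amounts; in the real analysis, payment levels and delay savings vary over time as demand grows, which is why the study stresses calibration.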