
    Neutron matter from chiral two- and three-nucleon calculations up to N³LO

    Neutron matter is an ideal laboratory for nuclear interactions derived from chiral effective field theory since all contributions are predicted up to next-to-next-to-next-to-leading order (N³LO) in the chiral expansion. By making use of recent advances in the partial-wave decomposition of three-nucleon (3N) forces, we include for the first time N³LO 3N interactions in many-body perturbation theory (MBPT) up to third order and in self-consistent Green's function theory (SCGF). Using these two complementary many-body frameworks we provide improved predictions for the equation of state of neutron matter at zero temperature and also analyze systematically the many-body convergence for different chiral EFT interactions. Furthermore, we present an extension of the normal-ordering framework to finite temperatures. These developments open the way to improved calculations of neutron-rich matter including estimates of theoretical uncertainties for astrophysical applications. Comment: minor changes, published version
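
    For orientation, a schematic sketch (not taken from the paper) of the many-body perturbation theory expansion referred to above: the ground-state energy is expanded around a reference state, and the standard second-order correction involves the antisymmetrized two-body interaction, hole states i, j and particle states a, b with single-particle energies ε,

        E = E^{(0)} + E^{(1)} + E^{(2)} + E^{(3)} + \dots ,
        \qquad
        E^{(2)} = \frac{1}{4} \sum_{ijab}
            \frac{|\langle ij\,|\,\bar{V}\,|\,ab\rangle|^{2}}{\varepsilon_i + \varepsilon_j - \varepsilon_a - \varepsilon_b} .

    In the normal-ordering framework mentioned in the abstract, 3N forces enter expressions of this type in part as density-dependent effective two-body contributions.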

    Choreographies in Practice

    Choreographic Programming is a development methodology for concurrent software that guarantees correctness by construction. The key to this paradigm is to disallow mismatched I/O operations in programs, called choreographies, and then mechanically synthesise distributed implementations in terms of standard process models via a mechanism known as EndPoint Projection (EPP). Despite the promise of choreographic programming, there is still a lack of practical evaluations that illustrate the applicability of choreographies to concrete computational problems with standard concurrent solutions. In this work, we explore the potential of choreographies by using Procedural Choreographies (PC), a model that we recently proposed, to write distributed algorithms for sorting (Quicksort), solving linear equations (Gaussian elimination), and computing the Fast Fourier Transform. We discuss the lessons learned from this experiment, giving possible directions for the usage and future improvements of choreography languages.
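
    To make the EndPoint Projection idea concrete, here is a minimal sketch in plain Python rather than the PC language used in the paper; the roles, labels, and data layout are illustrative assumptions, not the authors' notation. A choreography is a global list of interactions, and projection derives each role's local send/receive actions:

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Interaction:
            sender: str
            receiver: str
            label: str              # message label, e.g. a sublist in a distributed Quicksort

        # A tiny global description: p delegates two halves, then collects the sorted results.
        choreography = [
            Interaction("p", "q", "left_part"),
            Interaction("p", "r", "right_part"),
            Interaction("q", "p", "left_sorted"),
            Interaction("r", "p", "right_sorted"),
        ]

        def project(chor, role):
            """EndPoint Projection, toy version: the local send/receive actions of `role`."""
            actions = []
            for step in chor:
                if step.sender == role:
                    actions.append(f"send {step.label} to {step.receiver}")
                elif step.receiver == role:
                    actions.append(f"recv {step.label} from {step.sender}")
            return actions

        for role in ("p", "q", "r"):
            print(role, project(choreography, role))

    Because every local send and receive is generated from the same global interaction, mismatched I/O cannot arise, which is the correctness-by-construction property the abstract refers to.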

    Uncertainties in constraining low-energy constants from ³H β decay

    We discuss the uncertainties in constraining low-energy constants of chiral effective field theory from ³H β decay. The half-life is very precisely known, so that the Gamow-Teller matrix element has been used to fit the coupling c_D of the axial-vector current to a short-range two-nucleon pair. Because the same coupling also describes the leading one-pion-exchange three-nucleon force, this in principle provides a very constraining fit, uncorrelated with the ³H binding energy fit used to constrain another low-energy coupling in three-nucleon forces. However, so far such ³H half-life fits have only been performed at a fixed cutoff value. We show that the cutoff dependence due to the regulators in the axial-vector two-body current can significantly affect the Gamow-Teller matrix elements and consequently also the extracted values for the c_D coupling constant. The degree of the cutoff dependence is correlated with the softness of the employed NN interaction. As a result, present three-nucleon forces based on a fit to ³H β decay underestimate the uncertainty in c_D. We explore a range of c_D values that is compatible within cutoff variation with the experimental ³H half-life and estimate the resulting uncertainties for many-body systems by performing calculations of symmetric nuclear matter. Comment: 9 pages, 11 figures, published version, includes Erratum, which corrects Figs. 2-6 due to the incorrect c_D relation between 3N forces and two-body currents used.
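
    The fit-plus-cutoff-variation procedure described above can be sketched as follows. This is not the authors' code: gamow_teller is a purely hypothetical toy stand-in (with made-up numbers) for an actual ³H calculation, included only so the loop runs end to end, and the cutoff grid and empirical value are placeholders.

        from scipy.optimize import brentq

        GT_EMPIRICAL = 0.955               # placeholder number, not taken from the paper

        def gamow_teller(c_D, cutoff):
            # Toy stand-in (NOT physics): a mildly cutoff-dependent, linear response in c_D,
            # so the fit below is executable; a real calculation would solve the 3H problem
            # with the corresponding 3N force and two-body axial current.
            slope = 0.02 + 1.0e-5 * (cutoff - 450.0)
            return 0.93 + slope * c_D

        def fit_cD(cutoff, bracket=(-5.0, 5.0)):
            """Find the c_D that reproduces the adopted Gamow-Teller matrix element at this cutoff."""
            return brentq(lambda cd: gamow_teller(cd, cutoff) - GT_EMPIRICAL, *bracket)

        cutoffs = [400.0, 450.0, 500.0, 550.0]     # regulator cutoffs in MeV, illustrative
        cD_values = [fit_cD(L) for L in cutoffs]
        print("c_D spread:", min(cD_values), "to", max(cD_values))

    Repeating the one-parameter fit across regulator cutoffs maps out the spread in c_D that a single fixed-cutoff fit would miss, which is the point the abstract makes.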

    The Many Facets of a Diamond: Space and Gender in Selma Lagerlöf’s Antikrists Mirakler and in her Italian Legends

    For the first time ever, cutting-edge research about the Swedish Nobel Laureate Selma Lagerlöf and her work is made available to a world-wide audience in one comprehensive volume. Written by an international group of scholars, Re-Mapping Lagerlöf highlights the interdisciplinarity of current Lagerlöf research, which frequently cuts across genres, media and disciplines. The structure of the book, with sections dedicated to performance, film and intermediality, transnational narratives and European transmissions, is reinforced by the extensive introductory portal. The authors explore themes such as Lagerlöf in political contexts, her involvement in the women’s movement, the construction of her celebrity persona, her role for early Swedish film, the transnationality of her work and its impact in international contexts. The volume includes a number of illustrations that are rarely reproduced, and the detailed bibliographical section will contribute to making Re-Mapping Lagerlöf an indispensable platform for Lagerlöf scholarship for years to come. It also offers a model for interdisciplinary research in the arts and humanities.

    The Paths to Choreography Extraction

    Choreographies are global descriptions of interactions among concurrent components, most notably used in the settings of verification (e.g., Multiparty Session Types) and synthesis of correct-by-construction software (Choreographic Programming). They require a top-down approach: programmers first write choreographies, and then use them to verify or synthesize their programs. However, most existing software does not come with choreographies yet, which prevents their application. To attack this problem, we propose a novel methodology (called choreography extraction) that, given a set of programs or protocol specifications, automatically constructs a choreography that describes their behavior. The key to our extraction is identifying a set of paths in a graph that represents the symbolic execution of the programs of interest. Our method improves on previous work in several directions: we can now deal with programs that are equipped with state and internal computation capabilities; time complexity is dramatically better; we capture programs that are correct but not necessarily synchronizable, i.e., they work because they exploit asynchronous communication.
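
    As a toy illustration of the extraction idea (not the algorithm from the paper, and without the state, internal computation, or loops that the paper handles), the sketch below symbolically executes a set of local programs against asynchronous buffers and records the matched communications as a choreography; the role names and action encodings are illustrative assumptions:

        from collections import defaultdict, deque

        programs = {                                   # illustrative local specifications
            "p": [("send", "q", "req"), ("recv", "q", "ack")],
            "q": [("recv", "p", "req"), ("send", "p", "ack")],
        }

        def extract(programs):
            pc = {r: 0 for r in programs}              # program counter per role
            buffers = defaultdict(deque)               # (sender, receiver) -> queued labels
            choreography = []
            progress = True
            while progress:
                progress = False
                for role, prog in programs.items():
                    if pc[role] >= len(prog):
                        continue
                    kind, peer, label = prog[pc[role]]
                    if kind == "send":                 # asynchronous: sending never blocks
                        buffers[(role, peer)].append(label)
                        pc[role] += 1
                        progress = True
                    elif buffers[(peer, role)] and buffers[(peer, role)][0] == label:
                        buffers[(peer, role)].popleft()
                        choreography.append(f"{peer} -> {role} : {label}")
                        pc[role] += 1
                        progress = True
            if any(pc[r] < len(programs[r]) for r in programs):
                raise ValueError("stuck: no choreography exists for these programs")
            return choreography

        print(extract(programs))   # ['p -> q : req', 'q -> p : ack']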

    Position and velocity space diffusion of test particles in stochastic electromagnetic fields

    The two-dimensional diffusive dynamics of test particles in a random electromagnetic field is studied. The synthetic electromagnetic fluctuations are generated through randomly placed magnetised "clouds" oscillating with a frequency ω. We investigate the mean square displacements of particles in both position and velocity spaces. As ω increases the particles undergo standard (Brownian-like) motion, anomalous diffusion and ballistic motion in position space. Although in general the diffusion properties in velocity space are not trivially related to those in position space, we find that energization is present only when particles display anomalous diffusion in position space. The anomalous character of the diffusion is only in the non-standard values of the scaling exponents while the process is Gaussian. Comment: 10 pages, 4 figures
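
    A minimal sketch (not the paper's code) of how such regimes are usually diagnosed from trajectories: estimate the exponent γ in ⟨|r(t) − r(0)|²⟩ ∝ t^γ, with γ ≈ 1 for standard diffusion, 1 < γ < 2 for anomalous superdiffusion, and γ ≈ 2 for ballistic motion. The array shapes and the random-walk test data are assumptions for illustration.

        import numpy as np

        def msd_exponent(traj, dt=1.0):
            """traj: array of shape (n_particles, n_steps, 2) with 2D particle positions."""
            disp = traj - traj[:, :1, :]                       # displacement from the start
            msd = np.mean(np.sum(disp**2, axis=-1), axis=0)    # ensemble-averaged MSD(t)
            t = dt * np.arange(1, traj.shape[1])
            gamma, _ = np.polyfit(np.log(t), np.log(msd[1:]), 1)
            return gamma, msd

        # Quick check on uncorrelated random-walk trajectories (should give gamma close to 1):
        rng = np.random.default_rng(0)
        steps = rng.normal(size=(500, 2000, 2))
        walk = np.cumsum(steps, axis=1)
        gamma, _ = msd_exponent(walk)
        print(f"estimated exponent: {gamma:.2f}")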

    Economic Ideology and the Rise of the Firm as a Criminal Enterprise

    Over the last 50 years, the institutions, ideology, nature, and power of firms in the United States have been radically transformed. Neoclassical economics has led that transformation, supplying an ideology that justified a dramatic increase in top executive compensation while dismantling the mechanisms that produced personal accountability tied to anything but relatively short-term shifts in share prices. Yet, alongside the rise of the corporation, from the time of Adam Smith forward, has been concern that the separation of ownership and control creates opportunities to use the corporation as a “weapon” of fraud, and with the return of global financial crises, there has been renewed concern that finance has once again become an agent of crime that threatens the economic order. This article compares economic and criminological approaches to the corporation. Both approaches focus on incentives and assume that rational actors are responsive to changes such as the dramatic growth in executive financial compensation and the evisceration of other forms of corporate accountability. Both approaches study the separation of ownership and control, and the temptations that come from the ability to speculate with other people’s money. Yet, neoclassical economists assume that markets will police fraud and that fraud therefore need not be a serious subject of study, while criminologists posit that the policies that neoclassical economists have championed have created “criminogenic environments” that encourage the use of firms as instruments of fraud. Criminologists call the use of seemingly legitimate firms to manipulate financial markets “control fraud,” that is, fraud by the persons in charge, most typically starting with the Chief Executive Officer (CEO). Modern executive and professional compensation has transformed the CEO and the independent professionals, such as accountants, who once served as sources of external discipline. Today’s CEO can disguise losses, eliminate underwriting, lay off needed workers and take other measures that boost share prices at the expense of a company’s long-term viability. Moreover, if enough executives increase their companies’ apparent profitability (and their own bonuses) in doing so, the result creates a “Gresham’s dynamic” in which bad ethics drives good ethics from the industry and the professions. In these criminogenic environments, control frauds become so pervasive that prosecution becomes extremely difficult and markets do respond, with extremely destructive boom-and-bust financial cycles.

    Quantifying signals with power-law correlations: A comparative study of detrended fluctuation analysis and detrended moving average techniques

    Detrended fluctuation analysis (DFA) and detrended moving average (DMA) are two scaling analysis methods designed to quantify correlations in noisy non-stationary signals. We systematically study the performance of different variants of the DMA method when applied to artificially generated long-range power-law correlated signals with an a priori known scaling exponent α_0 and compare them with the DFA method. We find that the scaling results obtained from different variants of the DMA method strongly depend on the type of the moving average filter. Further, we investigate the optimal scaling regime where the DFA and DMA methods accurately quantify the scaling exponent α_0, and how this regime depends on the correlations in the signal. Finally, we develop a three-dimensional representation to determine how the stability of the scaling curves obtained from the DFA and DMA methods depends on the scale of analysis, the order of detrending, and the order of the moving average we use, as well as on the type of correlations in the signal. Comment: 15 pages, 16 figures
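
    A minimal sketch of standard DFA (not the paper's implementation, and without the DMA variants it compares against, which would replace the polynomial detrending step below with a moving-average filter); the signal length, scales, and white-noise test are illustrative assumptions:

        import numpy as np

        def dfa(signal, scales, order=1):
            """Return the DFA fluctuation function F(n) for the given window sizes."""
            profile = np.cumsum(signal - np.mean(signal))      # integrated (profile) series
            F = []
            for n in scales:
                n_windows = len(profile) // n
                sq_res = []
                for w in range(n_windows):
                    seg = profile[w * n:(w + 1) * n]
                    x = np.arange(n)
                    trend = np.polyval(np.polyfit(x, seg, order), x)   # local polynomial trend
                    sq_res.append(np.mean((seg - trend) ** 2))
                F.append(np.sqrt(np.mean(sq_res)))
            return np.asarray(F)

        # The scaling exponent is the slope of log F(n) versus log n; for uncorrelated
        # white noise it should come out close to 0.5.
        rng = np.random.default_rng(1)
        signal = rng.normal(size=2**14)
        scales = np.unique(np.logspace(1, 3, 20).astype(int))
        F = dfa(signal, scales)
        alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
        print(f"estimated alpha: {alpha:.2f}")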