    Quantum Criticality and Yang-Mills Gauge Theory

    We present a family of nonrelativistic Yang-Mills gauge theories in D+1 dimensions whose free-field limit exhibits quantum critical behavior with gapless excitations and dynamical critical exponent z=2. The ground state wavefunction is intimately related to the partition function of relativistic Yang-Mills in D dimensions. The gauge couplings exhibit logarithmic scaling and asymptotic freedom in the upper critical spacetime dimension, equal to 4+1. The theories can be deformed in the infrared by a relevant operator that restores Poincaré invariance as an accidental symmetry. In the large-N limit, our nonrelativistic gauge theories can be expected to have weakly curved gravity duals.
    Comment: 10 pages
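In schematic terms, the two central statements of this abstract, anisotropic scaling with z = 2 and the wavefunction–partition-function relation, can be written as follows (the symbols here are illustrative notation, not taken from the paper):

```latex
% Anisotropic (Lifshitz-type) scaling with dynamical exponent z = 2:
\omega \sim k^{z}, \qquad z = 2.
% Ground-state wavefunction tied to the D-dimensional relativistic
% Yang-Mills action, so that the norm reproduces its partition function:
|\Psi[A]|^{2} \;\propto\; e^{-S_{\rm YM}^{(D)}[A]},
\qquad
\langle \Psi | \Psi \rangle
  = \int \mathcal{D}A \; e^{-S_{\rm YM}^{(D)}[A]}
  = Z_{\rm YM}^{(D)} .
```

Equal-time correlators in the ground state then coincide with correlators of the D-dimensional relativistic theory, which is the sense in which the wavefunction is "intimately related" to the partition function.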

    Long-range frustration in T=0 first-step replica-symmetry-broken solutions of finite-connectivity spin glasses

    In a finite-connectivity spin glass at the zero-temperature limit, long-range correlations exist among the unfrozen vertices (those whose spin values are not fixed). Such long-range frustrations are partially removed by the first-step replica-symmetry-broken (1RSB) cavity theory, but residual long-range frustrations may still persist in this mean-field solution. By means of population dynamics, we perform a perturbation-percolation analysis to calculate the magnitude of long-range frustrations in the 1RSB solution of a given spin-glass system. We apply this analysis to two well-studied model systems, the minimal vertex-cover problem and the maximal 2-satisfiability problem. This work points to a possible way of improving the zero-temperature 1RSB mean-field theory of spin glasses.
    Comment: 5 pages, two figures. To be published in JSTAT
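For context, the population-dynamics technique mentioned above maintains a large pool of cavity fields and repeatedly replaces random members with fields recomputed from other random members; the pool then approximates the fixed-point distribution of the cavity equations. A minimal sketch, where the update rule is a hypothetical toy contraction, not the actual 1RSB equations for vertex cover or 2-SAT:

```python
import random

def population_dynamics(update, pop_size=200, n_sweeps=20, degree=3,
                        init=1.0, seed=0):
    """Generic population-dynamics loop: after enough sweeps, the population
    approximates the distribution of cavity fields at the fixed point of
    the given update rule."""
    rng = random.Random(seed)
    pop = [init] * pop_size
    for _ in range(n_sweeps * pop_size):
        # draw (degree - 1) random cavity fields from the population
        neighbours = [rng.choice(pop) for _ in range(degree - 1)]
        # replace a random population member with the updated field
        pop[rng.randrange(pop_size)] = update(neighbours)
    return pop

# Toy contraction update: the fixed point of x = 0.5 * x is 0, so the
# population drifts from its initial value 1.0 toward 0.
pop = population_dynamics(lambda nb: 0.5 * sum(nb) / len(nb))
```

In a real 1RSB computation the population members are distributions of fields rather than numbers, and the replacement step includes a reweighting; the loop structure, however, is the same.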

    Long time limit of equilibrium glassy dynamics and replica calculation

    It is shown that the limit t-t' → ∞ of the equilibrium dynamic self-energy can be computed from the n → 1 limit of the static self-energy of an n-times replicated system with a one-step replica-symmetry-breaking structure. It is also shown that the Dyson equation of the replicated system leads, in the n → 1 limit, to the bifurcation equation for the glass ergodicity-breaking parameter computed from dynamics. The equivalence of the replica formalism to the long-time limit of the equilibrium relaxation dynamics is proved to all orders in perturbation theory for a scalar theory.
    Comment: 25 pages, 12 figures, RevTeX. Corrected misprints. Published version
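Schematically, the claimed correspondence reads (the symbols here are plausible notation chosen for illustration, not taken from the paper):

```latex
% Long-time limit of the equilibrium dynamic self-energy equals the
% n -> 1 limit of the static self-energy of the replicated system:
\lim_{t-t' \to \infty} \Sigma_{\rm dyn}(t,t')
  \;=\; \lim_{n \to 1} \Sigma_{\rm rep}(n),
% with the 1RSB structure imposed on the replicated statics.
```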

    Event by Event Analysis and Entropy of Multiparticle Systems

    The coincidence method of measuring the entropy of a system, proposed some time ago by Ma, is generalized to include systems out of equilibrium. It is suggested that the method can be adapted to analyze multiparticle states produced in high-energy collisions.
    Comment: 13 pages, 2 figures
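For background, Ma's coincidence method estimates entropy from the rate at which independently drawn states coincide: for an equiprobable ensemble of Ω states the coincidence probability is 1/Ω, so S = ln Ω = −ln(coincidence rate). A minimal sketch (for a general distribution this estimator yields the second Rényi entropy, a lower bound on the Shannon entropy):

```python
from collections import Counter
from math import log
import random

def coincidence_entropy(samples):
    """Ma's coincidence counting: S = -ln(fraction of identical pairs)."""
    n = len(samples)
    counts = Counter(samples)
    # number of pairs of samples that landed in the same state
    hits = sum(c * (c - 1) // 2 for c in counts.values())
    pairs = n * (n - 1) // 2
    if hits == 0:
        raise ValueError("no coincidences observed; draw more samples")
    return log(pairs / hits)

# Uniform ensemble over 4 states: expect an estimate close to ln 4 ~ 1.386
rng = random.Random(1)
samples = [rng.randrange(4) for _ in range(2000)]
estimate = coincidence_entropy(samples)
```

The appeal of the method, and the reason it generalizes away from equilibrium, is that it never requires knowing the state probabilities: only repeated sampling of the system.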

    Critical exponents predicted by grouping of Feynman diagrams in phi^4 model

    Different perturbation theory treatments of the Ginzburg-Landau phase transition model are discussed. This includes a criticism of the perturbative renormalization group (RG) approach and a proposal of a novel method providing critical exponents consistent with the known exact solutions in two dimensions. The usual perturbation theory is reorganized by an appropriate grouping of Feynman diagrams of the phi^4 model with O(n) symmetry. As a result, equations for the calculation of the two-point correlation function are obtained which allow one to predict possible exact values of critical exponents in two and three dimensions by proving relevant scaling properties of the asymptotic solution at (and near) criticality. The new values of critical exponents are discussed and compared to the results of numerical simulations and experiments.
    Comment: 34 pages, 6 figures

    Four-point renormalized coupling constant and Callan-Symanzik beta-function in O(N) models

    We investigate some issues concerning the zero-momentum four-point renormalized coupling constant g in the symmetric phase of O(N) models, and the corresponding Callan-Symanzik beta-function. In the framework of the 1/N expansion we show that the Callan-Symanzik beta-function is non-analytic at its zero, i.e. at the fixed-point value g^* of g. This fact calls for a check of the actual accuracy of the determination of g^* from the resummation of the d=3 perturbative g-expansion, which is usually performed assuming analyticity of the beta-function. Two alternative approaches are exploited. We extend the \epsilon-expansion of g^* to O(\epsilon^4). Quite accurate estimates of g^* are then obtained by an analysis exploiting the analytic behavior of g^* as a function of d and the known values of g^* for lower-dimensional O(N) models, i.e. for d=2,1,0. Accurate estimates of g^* are also obtained by a reanalysis of the strong-coupling expansion of lattice N-vector models allowing for the leading confluent singularity. The agreement among the g-, \epsilon-, and strong-coupling expansion results is good for all N. However, at N=0,1, the \epsilon- and strong-coupling expansions favor values of g^* which are slightly lower than those obtained by the resummation of the g-expansion assuming analyticity of the Callan-Symanzik beta-function.
    Comment: 35 pages (3 figs), added Ref. for GRT, some estimates are revised, other minor changes
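As background on series resummation, the kind of procedure whose analyticity assumption the abstract questions, a Padé approximant replaces a truncated series by a ratio of polynomials with the same Taylor expansion, often extending the series' useful range. A minimal sketch on a toy series, the expansion of exp(x), rather than any actual beta-function:

```python
import numpy as np
from math import factorial, exp

def pade(coeffs, m, n):
    """[m/n] Pade approximant from Taylor coefficients coeffs[0..m+n].
    Returns numerator p (degree m) and denominator q (degree n, q[0] = 1)."""
    c = np.asarray(coeffs, dtype=float)
    # denominator: solve c_{m+k} + sum_{j=1}^{n} q_j c_{m+k-j} = 0, k = 1..n
    A = np.array([[c[m + k - j] if m + k - j >= 0 else 0.0
                   for j in range(1, n + 1)] for k in range(1, n + 1)])
    q = np.concatenate(([1.0], np.linalg.solve(A, -c[m + 1:m + n + 1])))
    # numerator: Cauchy product of the series with the denominator
    p = np.array([sum(q[j] * c[i - j] for j in range(min(i, n) + 1))
                  for i in range(m + 1)])
    return p, q

# [2/2] resummation of the exp(x) series truncated at fourth order
c = [1.0 / factorial(k) for k in range(5)]
p, q = pade(c, 2, 2)
approx = np.polyval(p[::-1], 1.0) / np.polyval(q[::-1], 1.0)
# at x = 1 the approximant (19/7 ~ 2.7143) is closer to e ~ 2.7183
# than the bare truncated sum (~2.7083)
```

A Padé approximant implicitly assumes the resummed function is meromorphic near the expansion point; a non-analytic zero of the kind established in the abstract is exactly the situation where such schemes can be misleading.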

    Assessment of heritage timber structures: Review of standards, guidelines and procedures

    This paper reviews the official documentation (standards, guidelines and procedures) available for the assessment of heritage timber structures. The subsequent discussion does not catalogue all relevant technical literature. Instead, it intends to convey the state of background knowledge, recommendations and code rules using some illustrative examples. A specific focus is given to visual inspection as a fundamental first step for all different scopes and levels of assessment. The objectives of this review are to: (1) highlight the gaps and limitations in the currently available tools as well as the need for standardization; (2) contribute to the definition of an ontological approach, relating the scope of the assessment, the information required and the necessary procedures; (3) identify guidance for the different scopes of the assessment. The variety of timber species, architectural typologies and structural solutions, together with the varied response of these structures to climatic and other natural and man-made hazards, warrants a multifaceted and integrated assessment methodology that accounts for the hierarchical nature of timber structures' behaviour and the multitude of agents affecting such behaviour. A review of existing standards and guidelines illustrates the need for a tool to consistently record the assessment process and the final decision taken, which will serve to constitute the knowledge base for the development of the next generation of more integrated and heritage-specific guidelines.

    A rigorous bound on quark distributions in the nucleon

    I deduce an inequality between the helicity and the transversity distribution of a quark in a nucleon, at small energy scales. Then I establish, thanks to the positivity constraint, a rigorous bound on longitudinally polarized valence quark densities, which finds nontrivial applications to d-quarks. This, in turn, implies a bound for the distributions of the longitudinally polarized sea, which is probably not SU(3)-symmetric. Some model predictions and parametrizations of quark distributions are examined in the light of these results.
    Comment: Talk given at the QCD03 Conference, Montpellier, 2-9 July 2003

    Forecasting in the light of Big Data

    Predicting the future state of a system has always been a natural motivation for science and practical applications. Such a topic, beyond its obvious technical and societal relevance, is also interesting from a conceptual point of view. This owes to the fact that forecasting lends itself to two equally radical, yet opposite methodologies: a reductionist one, based on first principles, and a naive inductivist one, based only on data. The latter view has recently gained some attention in response to the availability of unprecedented amounts of data and increasingly sophisticated algorithmic analytic techniques. The purpose of this note is to assess critically the role of big data in reshaping the key aspects of forecasting, and in particular the claim that bigger data leads to better predictions. Drawing on the representative example of weather forecasts, we argue that this is not generally the case. We conclude by suggesting that a clever and context-dependent compromise between modelling and quantitative analysis stands out as the best forecasting strategy, as anticipated nearly a century ago by Richardson and von Neumann.