6,637 research outputs found

    Inflation and Unemployment in the U.S. and Canada: A Common Framework

    This paper summarizes the results of our efforts to broaden the theory of the Phillips curve and to explain the joint evolution of inflation and unemployment in the United States and Canada since 1930. Keywords: Phillips curve, unemployment, inflation.

    Weibull-type limiting distribution for replicative systems

    The Weibull function is widely used to describe skew distributions observed in nature. However, the origin of this ubiquity is not always easy to explain. In the present paper, we consider the well-known Galton-Watson branching process describing simple replicative systems. The shape of the resulting distribution, about which little has been known, is found to be essentially indistinguishable from the Weibull form over a wide range of the branching parameter; this can be seen from the exact series expansion for the cumulative distribution, which takes a universal form. We also find that the branching process can be mapped onto a process of aggregation of clusters. In the branching and aggregation process, the number of events considered for branching and aggregation grows cumulatively in time, whereas, for the binomial distribution, an independent event occurs at each time with a given success probability. Comment: 6 pages and 5 figures
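    The branching process described above can be illustrated with a minimal simulation. The binary offspring rule and all parameter values below are illustrative assumptions for the sketch, not the paper's exact setup.

    ```python
    import random
    from statistics import mean

    def galton_watson(p_branch, n_gen):
        """One realization of a simple Galton-Watson process: every
        individual independently produces two offspring with probability
        p_branch, otherwise one (so the line never goes extinct here)."""
        pop = 1
        for _ in range(n_gen):
            pop = sum(2 if random.random() < p_branch else 1
                      for _ in range(pop))
        return pop

    random.seed(0)
    sizes = [galton_watson(0.3, 10) for _ in range(2000)]
    # The empirical distribution of `sizes` is right-skewed, the kind of
    # shape the paper compares against the Weibull form.
    ```

    Varying `p_branch` changes how heavy the tail of the size distribution is, which is the role the branching parameter plays in the paper's analysis.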

    Limit Cycles in Four Dimensions

    We present an example of a limit cycle, i.e., a recurrent flow line of the beta-function vector field, in a unitary four-dimensional gauge theory. We thus prove that the beta functions of four-dimensional gauge theories do not produce gradient flows. The limit cycle is established in perturbation theory with a three-loop calculation, which we describe in detail. Comment: 12 pages, 1 figure. Significant revision of the interpretation of our result. Improved description of the three-loop calculation.

    Multimorbidity in younger deprived patients: An exploratory study of research and service implications in general practice

    Background: Multimorbidity has been defined as the co-existence of two or more chronic conditions. It has a profound impact both on the individuals affected and on their use of healthcare services. The limited research to date has focused on its epidemiology rather than on the development of interventions to improve outcomes for patients with multimorbidity, particularly those aged less than 65 years. Potential barriers to such research relate to methods of disease recording and coding and to examination of the process of care. We aimed to assess the feasibility of identifying younger individuals with multimorbidity at general practice level and to explore the effect of multimorbidity on the type and volume of health care delivered. We also describe the barriers encountered in attempting to carry out this exploratory research.

    Methods: Cross-sectional survey of GP records in two large urban general practices in Dublin, focusing on poorer individuals aged between 45 and 64 years with at least three chronic conditions.

    Results: 92 patients with multimorbidity were identified. The median number of conditions was 4 per patient. Individuals received a mean of 7.5 medications and attended a mean of 11.3 GP visits in the 12 months preceding the survey. Barriers to research into multimorbidity at practice level were identified, including difficulties relating to GP clinical software; variation in disease coding; assessment of specialist-sector activity through GP-specialist communications; and assessment of the full scale of primary care activity in relation to other disciplines and other types of GP contact, such as home visits and telephone contacts.

    Conclusion: This study highlights the importance of multimorbidity in general practice and indicates that it is feasible to identify younger patients with multimorbidity through their GP records. This is a first step towards planning a clinical intervention to improve outcomes for such patients in primary care.
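    The inclusion rule and summary statistics used in the study can be sketched as a simple query over coded records. The patient records below are entirely hypothetical and only illustrate the kind of computation involved, not the study data.

    ```python
    from statistics import median, mean

    # Hypothetical, illustrative patient records: each entry lists a
    # patient's coded chronic conditions, medication count and GP visits
    # in the preceding 12 months.
    patients = [
        {"id": 1, "conditions": ["diabetes", "asthma", "depression", "CHD"],
         "medications": 8, "gp_visits": 12},
        {"id": 2, "conditions": ["hypertension", "COPD"],
         "medications": 3, "gp_visits": 4},
        {"id": 3, "conditions": ["diabetes", "arthritis", "depression"],
         "medications": 6, "gp_visits": 10},
    ]

    # Apply the study's inclusion rule: three or more chronic conditions.
    multimorbid = [p for p in patients if len(p["conditions"]) >= 3]

    # Summary statistics of the kind reported in the Results section.
    n_conditions = median(len(p["conditions"]) for p in multimorbid)
    mean_meds = mean(p["medications"] for p in multimorbid)
    mean_visits = mean(p["gp_visits"] for p in multimorbid)
    ```

    In practice the hard part, as the paper notes, is not this computation but the variability of disease coding in GP clinical software.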

    New path description for the M(k+1,2k+3) models and the dual Z_k graded parafermions

    We present a new path description for the states of the non-unitary M(k+1,2k+3) models. This description differs from the one induced by the Forrester-Baxter solution, in terms of configuration sums, of their restricted-solid-on-solid model. The proposed path representation is actually very similar to the one underlying the unitary minimal models M(k+1,k+2), with an analogous Fermi-gas interpretation. This interpretation leads to fermionic expressions for the finitized M(k+1,2k+3) characters, whose infinite-length limit represents new fermionic characters for the irreducible modules. The M(k+1,2k+3) models are also shown to be related to the Z_k graded parafermions via a (q to 1/q) duality transformation. Comment: 43 pages (minor typo corrected and minor rewording in the introduction)

    Origin of the approximate universality of distributions in equilibrium correlated systems

    We propose an interpretation of previous experimental and numerical studies showing that, for a large class of systems, distributions of global quantities are similar to a distribution originally obtained for the magnetization in the 2D XY model. This approach, developed for the Ising model, is based on previous numerical observations. We obtain an effective action using a perturbative method, which successfully describes the order-parameter fluctuations near the phase transition. This leads to a direct link between the D-dimensional Ising model and the XY model in the same dimension, which appears to be a generic feature of many equilibrium critical systems and which is at the heart of the above observations. Comment: To appear in Europhysics Letters

    The H.E.S.S. extragalactic sky

    The H.E.S.S. Cherenkov telescope array, located in the southern hemisphere in Namibia, studies very high energy (VHE; E > 100 GeV) gamma-ray emission from astrophysical objects. Since the start of its successful operations in 2002, more than 80 galactic and extragalactic gamma-ray sources have been discovered. H.E.S.S. devotes over 400 hours of observation time per year to extragalactic sources, resulting in the discovery of several new sources, mostly AGNs, and in exciting physics results, e.g. the discovery of very rapid variability during extreme flux outbursts of PKS 2155-304, stringent limits on the density of the extragalactic background light (EBL) in the near-infrared derived from the energy spectra of distant sources, and the discovery of short-term variability in the VHE emission from the radio galaxy M 87. With the launch of the Fermi satellite in 2008, new insights into the physics of AGNs at GeV energies emerged, leading to the discovery of several new extragalactic VHE sources. Multi-wavelength observations have proven to be a powerful tool for investigating the production mechanism of VHE emission in AGNs. Here, new results from H.E.S.S. observations of extragalactic sources are presented and their implications for the physics of these sources are discussed. Comment: 8 pages, 6 figures, invited review talk, in the proceedings of the "International Workshop on Beamed and Unbeamed Gamma-Rays from Galaxies" 11-15 April 2011, Lapland Hotel Olos, Muonio, Finland, Journal of Physics: Conference Series Volume 355, 201

    Revue bibliographique des méthodes de prévision des débits [A review of streamflow forecasting methods]

    A wide variety of methods is available for streamflow forecasting: stochastic and conceptual models, but also more novel approaches such as artificial neural networks, fuzzy rule-based models, the k-nearest neighbor method, fuzzy regression and regression splines. After a detailed review of these methods and of their recent applications, we propose a classification that highlights the differences, but also the similarities, between these approaches. They are then compared for the distinct problems of short-, medium- and long-term forecasting. Our recommendations also vary with the level of prior information. For example, when long stationary time series are available, we recommend the nonparametric k-nearest neighbor method for short- and medium-term forecasts. Conversely, for longer-term forecasting from a limited number of observations, we suggest a conceptual model coupled with a meteorological model based on the historical record. Although the emphasis is on streamflow forecasting, much of this review, mainly the part dealing with empirical models, is also relevant to the forecasting of other variables.

    A large number of models are available for streamflow forecasting. In this paper we classify and compare nine types of models for short, medium and long-term flow forecasting, according to six criteria: 1. validity of underlying hypotheses, 2. difficulties encountered when building and calibrating the model, 3. difficulties in computing the forecasts, 4. uncertainty modeling, 5. information required by each type of model, and 6. parameter updating.
    We first distinguish between empirical and conceptual models, the difference being that conceptual models correspond to simplified representations of the watershed, while empirical models only try to capture the structural relationships between inputs to the watershed and outputs, such as streamflow. Amongst empirical models, we distinguish between stochastic models, i.e. models based on the theory of probability, and non-stochastic models. Three types of stochastic models are presented: statistical regression models, Box-Jenkins models, and the nonparametric k-nearest neighbor method. Statistical linear regression is only applicable for long-term forecasting (monthly flows, for example), since it requires independent and identically distributed observations. It is a simple method of forecasting, and its hypotheses can be validated a posteriori if sufficient data are available. Box-Jenkins models include linear autoregressive models (AR), linear moving average models (MA), linear autoregressive moving average models (ARMA), periodic ARMA models (PARMA) and ARMA models with auxiliary inputs (ARMAX). They are better adapted for weekly or daily flow forecasting, since they allow for the explicit modeling of time dependence. Efficient methods are available for designing the model and updating the parameters as more data become available. For both statistical linear regression and Box-Jenkins models, the inputs must be uncorrelated and linearly related to the output. Furthermore, the process must be stationary. When it is suspected that the inputs are correlated or have a nonlinear effect on the output, the k-nearest neighbor method may be considered. This data-based nonparametric approach simply consists of looking, among past observations of the process, for the k events which are most similar to the present situation. A forecast is then built from the flows which were observed for these k events.
    Obviously, this approach requires a large database and a stationary process. Furthermore, the time required to calibrate the model and compute the forecasts increases rapidly with the size of the database. A clear advantage of stochastic models is that forecast uncertainty may be quantified by constructing a confidence interval. Three types of non-stochastic empirical models are also discussed: artificial neural networks (ANN), fuzzy linear regression and multivariate adaptive regression splines (MARS). ANNs were originally designed as simple conceptual models of the brain. However, for forecasting purposes, these models can be thought of simply as a subset of nonlinear empirical models. In fact, the ANN model most commonly used in forecasting, a multi-layer feed-forward network, corresponds to a nonlinear autoregressive model (NAR). To capture the moving average components of a time series, it is necessary to use recurrent architectures. ANNs are difficult to design and calibrate, and the computation of forecasts is also complex. Fuzzy linear regression makes it possible to extract linear relationships from small data sets, with fewer hypotheses than statistical linear regression. It does not require the observations to be uncorrelated, nor does it require the error variance to be homogeneous. However, the model is very sensitive to outliers. Furthermore, a posteriori validation of the hypothesis of linearity is not possible for small data sets. MARS models are based on the hypothesis that time series are chaotic rather than stochastic. The main advantage of the method is its ability to model non-stationary processes. The approach is nonparametric, and therefore requires a large data set. Amongst conceptual models, we distinguish between physical models, hydraulic machines, and fuzzy rule-based systems. Most conceptual hydrologic models are hydraulic machines, in which the watershed is considered to behave like a network of reservoirs.
    Physical modeling of a watershed would imply using fundamental physical equations at a small scale, such as the law of conservation of mass. Given the complexity of a watershed, this can be done in practice only for water routing. Consequently, only short-term flow forecasts can be obtained from a physical model, since the effects of precipitation, infiltration and evaporation must be negligible. Fuzzy rule-based systems make it possible to model the water cycle using fuzzy IF-THEN rules, such as "IF it rains a lot in a short period of time, THEN there will be a large flow increase following the concentration time". Each fuzzy quantifier is modeled using a fuzzy number to take into account the uncertainty surrounding it. When sufficient data are available, the fuzzy quantifiers can be constructed from the data. In general, conceptual models require more effort to develop than empirical models. However, for exceptional events, conceptual models can often provide more realistic forecasts, since empirical models are not well suited to extrapolation. A fruitful approach is to combine conceptual and empirical models. One way of doing this, called extended streamflow prediction or ESP, is to combine a stochastic model for generating meteorological scenarios with a conceptual model of the watershed. Based on this review of flow forecasting models, we recommend, for short-term forecasting (hourly and daily flows), the use of the k-nearest neighbor method, Box-Jenkins models, water routing models or hydraulic machines. For medium-term forecasting (weekly flows, for example), we recommend the k-nearest neighbor method and Box-Jenkins models, as well as fuzzy rule-based and ESP models. For long-term forecasting (monthly flows), we recommend statistical and fuzzy regression, Box-Jenkins, MARS and ESP models. It is important to choose a type of model which is appropriate for the problem at hand and for which the information available is sufficient.
    Since each type of model has its advantages, it can be more efficient to combine different approaches when forecasting streamflow.
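    The k-nearest neighbor forecasting idea described in this review (find the k past events most similar to the present situation and build the forecast from the flows that followed them) can be sketched in a few lines. The feature choice, distance metric and data below are illustrative assumptions, not a prescription from the paper.

    ```python
    import math

    def knn_forecast(history, current, k=3):
        """Nonparametric k-nearest-neighbor forecast (illustrative sketch).
        `history` is a list of (feature_vector, next_flow) pairs drawn from
        past observations; `current` is the present feature vector. The
        forecast averages the flows that followed the k most similar events."""
        neighbours = sorted(history,
                            key=lambda ev: math.dist(ev[0], current))[:k]
        return sum(flow for _, flow in neighbours) / k

    # Hypothetical daily-flow example: each feature vector holds
    # (today's flow, yesterday's flow) in m^3/s.
    history = [((10.0, 9.0), 11.0), ((22.0, 20.0), 25.0),
               ((11.0, 10.5), 12.0), ((9.5, 9.2), 10.5),
               ((21.0, 19.0), 23.0)]
    forecast = knn_forecast(history, (10.5, 9.8), k=3)
    ```

    As the review notes, this only works with a large database of a stationary process, since the method can only reproduce situations it has already seen.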

    A Pyramid Scheme for Particle Physics

    We introduce a new model, the Pyramid Scheme, of direct mediation of SUSY breaking, which is compatible with the idea of Cosmological SUSY Breaking (CSB). It uses the trinification scheme of grand unification and avoids problems with Landau poles in standard-model gauge couplings. It also avoids problems, which have recently come to light, associated with rapid stellar cooling due to emission of the pseudo Nambu-Goldstone boson (PNGB) of spontaneously broken hidden-sector baryon number. With a certain pattern of R-symmetry-breaking masses, a pattern more or less required by CSB, the Pyramid Scheme leads to a dark matter candidate that decays predominantly into leptons, with cross sections compatible with a variety of recent observations. The dark matter particle is not a thermal WIMP but a particle with new strong interactions, produced in the late decay of some other scalar, perhaps the superpartner of the QCD axion, with a reheat temperature in the TeV range. This is compatible with a variety of scenarios for baryogenesis, including some novel ones which exploit specific features of the Pyramid Scheme. Comment: JHEP Latex, 32 pages, 1 figure