
    Universality, limits and predictability of gold-medal performances at the Olympic Games

    Inspired by the Games held in ancient Greece, the modern Olympics represent the world's largest pageant of athletic skill and competitive spirit. Since 1896, the performances of athletes at the Olympic Games have mirrored human potential in sports, and thus provide an optimal source of information for studying the evolution of sport achievements and predicting the limits that athletes can reach. Unfortunately, the models introduced so far to describe athlete performances at the Olympics are either overly sophisticated or unrealistic and, more importantly, do not provide a unified theory of sport performance. Here, we address this issue by showing that the relative performance improvements of medal winners at the Olympics are normally distributed, implying that the evolution of performance values can be described, to good approximation, as an exponential approach to an a priori unknown limiting performance value. This law holds for all specialties in athletics (running, jumping, and throwing) and in swimming. We present a self-consistent method, based on normality hypothesis testing, that can predict limiting performance values in all specialties. We further quantify the most likely years in which athletes will breach challenging performance walls in running, jumping, throwing, and swimming events, as well as the probability that new world records will be established at the next edition of the Olympic Games.

    Comment: 8 pages, 3 figures, 1 table. Supporting information files and data are available at filrad.homelinux.or
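
    The self-consistent prediction method lends itself to a compact sketch. The snippet below (Python) is a minimal illustration under stated assumptions: we take "relative improvements" to be differences of log-distances from a candidate limit, treat lower-is-better running times, and pick the candidate limiting value whose improvements best pass a Shapiro-Wilk normality test. The exact rescaling used in the paper may differ, and the sample times are invented.

    import numpy as np
    from scipy.stats import shapiro

    def normality_score(perfs, limit):
        """p-value of a Shapiro-Wilk normality test applied to the
        rescaled improvements implied by a candidate limit."""
        gaps = perfs - limit                   # distance from the limit
        if np.any(gaps <= 0):
            return 0.0                         # limit must undercut all data
        improvements = np.diff(np.log(gaps))   # assumed form of "relative improvement"
        return shapiro(improvements).pvalue

    def estimate_limit(perfs, candidates):
        """Self-consistent limit: the candidate maximizing normality."""
        scores = [normality_score(perfs, L) for L in candidates]
        return candidates[int(np.argmax(scores))]

    # Toy usage with invented 100 m winning times (illustration only).
    times = np.array([10.80, 10.60, 10.30, 10.25, 10.06,
                      9.99, 9.92, 9.84, 9.69, 9.63])
    grid = np.linspace(8.5, 9.5, 201)
    print("estimated limiting time:", estimate_limit(times, grid))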

    The role of guidelines and the patient's life-style in GPs' management of hypercholesterolaemia

    BACKGROUND: Recent Swedish and joint European guidelines on hyperlipidaemia stress the high coronary risk for patients with already established arteriosclerotic disease (secondary prevention) or diabetes. For the remaining group, calculation of the ten-year risk of coronary events using the Framingham equation is suggested. There is evidence that use of and adherence to guidelines is incomplete and that tools for risk estimation are seldom used. Intuitive risk estimates are difficult and systematically biased. The purpose of the study was to examine how GPs use knowledge of guidelines in their decisions to recommend or not recommend a cholesterol-lowering drug, and the reasons for their decisions.

    METHODS: Twenty GPs were exposed to six case vignettes presented on a computer. In the course of six screens, successively more information was added to the case. The doctors were instructed to think aloud while processing the cases (Think-Aloud Protocols) and finally to decide for or against drug treatment. After the six cases they were asked to describe how they usually reason when they meet patients with high cholesterol values (Free-Report Protocols). The two sets of protocols were coded for cause-effect relations that were assumed to reflect the doctors' knowledge of the guidelines. The Think-Aloud Protocols were also searched for the reasons behind the decisions to prescribe or not to prescribe.

    RESULTS: According to the protocols, the GPs were well aware of the importance of previous coronary heart disease and diabetes in their decisions. On the other hand, only a few doctors mentioned other arteriosclerotic diseases, such as stroke and peripheral artery disease, as variables affecting their decisions. There were several instances in which the doctors' decisions apparently deviated from their knowledge of the guidelines. The arguments for the decisions in these cases often concerned aspects of the patient's life-style, such as smoking or overweight, either as risk-increasing factors or as alternative strategies for intervention.

    CONCLUSIONS: Coding verbal protocols for knowledge and for decision arguments seems to be a valuable tool for increasing our understanding of how guidelines are used in the treatment of hypercholesterolaemia. By analysing the arguments for treatment decisions, it was often possible to understand why departures from the guidelines were made. While the need for decision support is obvious, the current guidelines may be too simple in some respects.
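
    Since the guidelines point GPs to the Framingham equation, a small sketch of the kind of calculation involved may help. The snippet below (Python) shows only the standard Cox-survival form used by such ten-year risk scores; every coefficient, mean, and the baseline survival are hypothetical placeholders, not the published Framingham values.

    import math

    BASELINE_SURVIVAL = 0.90   # placeholder ten-year baseline survival S0
    COEFFS = {                 # placeholder log hazard ratios (NOT Framingham's)
        "age": 0.04,           # per year
        "total_chol": 0.005,   # per mg/dL
        "hdl": -0.04,          # per mg/dL
        "systolic_bp": 0.01,   # per mmHg
        "smoker": 0.4,         # 0/1 indicator
        "diabetes": 0.7,       # 0/1 indicator
    }
    MEANS = {"age": 50, "total_chol": 210, "hdl": 45,
             "systolic_bp": 130, "smoker": 0, "diabetes": 0}

    def ten_year_risk(patient):
        """Cox-model form: risk = 1 - S0 ** exp(beta . (x - xbar))."""
        lp = sum(COEFFS[k] * (patient[k] - MEANS[k]) for k in COEFFS)
        return 1.0 - BASELINE_SURVIVAL ** math.exp(lp)

    # Hypothetical patient, for illustration only.
    patient = {"age": 58, "total_chol": 260, "hdl": 38,
               "systolic_bp": 150, "smoker": 1, "diabetes": 0}
    print(f"illustrative ten-year risk: {ten_year_risk(patient):.0%}")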

    Neutralino versus axion/axino cold dark matter in the 19 parameter SUGRA model

    We calculate the relic abundance of thermally produced neutralino cold dark matter in the general 19-parameter supergravity (SUGRA-19) model. A scan over GUT-scale parameters reveals that models with a bino-like neutralino typically give rise to a dark matter density \Omega_{\widetilde{Z}_1} h^2 \sim 1-1000, i.e. between 1 and 4 orders of magnitude higher than the measured value. Models with higgsino or wino cold dark matter can yield the correct relic density, but mainly for neutralino masses around 700-1300 GeV. Models with mixed bino-wino or bino-higgsino CDM, or models with dominant co-annihilation or A-resonance annihilation, can yield the correct abundance, but such cases are extremely hard to generate in a general scan over GUT-scale parameters; this indicates high fine-tuning of the relic abundance in these cases. Requiring that m_{\widetilde{Z}_1} \lesssim 500 GeV (as a rough naturalness requirement) gives rise to a dip in the probability distribution of scan points at the measured CDM abundance. For comparison, we also scan over mSUGRA space with four free parameters. Finally, we investigate the Peccei-Quinn augmented MSSM with mixed axion/axino cold dark matter. In this case, the relic abundance agrees more naturally with the measured value. In light of our cumulative results, we conclude that future axion searches should probe much more broadly in axion mass, and deeper into the axion coupling.

    Comment: 23 pages including 17 .eps figures
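
    The qualitative shape of such a scan is easy to convey in code. The harness below (Python) is a minimal sketch: the parameter ranges are invented, and relic_density is a stand-in for a real spectrum-plus-relic pipeline (for instance SoftSUSY feeding micrOMEGAs); the toy log-uniform draw inside it only keeps the harness runnable while mimicking the observation that generic points scatter over orders of magnitude in \Omega h^2.

    import numpy as np

    rng = np.random.default_rng(0)
    OMEGA_MEASURED = 0.11   # measured CDM abundance, Omega h^2
    TOLERANCE = 0.5         # accept points within 50% of the measured value

    def draw_point():
        """One random point in a 19-parameter SUGRA-like space (toy ranges):
        3 gaugino masses, 10 scalar masses, 3 trilinears, 2 Higgs soft
        masses, and tan(beta)."""
        return {
            "gaugino": 10 ** rng.uniform(2, 4, size=3),     # GeV
            "scalars": 10 ** rng.uniform(2, 4, size=10),    # GeV
            "trilinear": rng.uniform(-3e3, 3e3, size=3),    # GeV
            "higgs": rng.uniform(1e2, 1e4, size=2),         # GeV
            "tan_beta": rng.uniform(3, 60),
        }

    def relic_density(point):
        """Stand-in for an external spectrum + relic calculator.
        Replace before real use; the toy draw below is NOT physics."""
        return 10 ** rng.uniform(-2, 3)

    N = 20000
    hits = sum(
        abs(relic_density(draw_point()) - OMEGA_MEASURED)
        < TOLERANCE * OMEGA_MEASURED
        for _ in range(N)
    )
    print(f"fraction of points near the measured abundance: {hits / N:.4%}")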

    NK Cells Are Not Required for Spontaneous Autoimmune Diabetes in NOD Mice

    NK cells have been shown either to promote or to protect from autoimmune diseases. Several studies have examined the role of receptors preferentially expressed by NK cells in the spontaneous disease of NOD mice, or the direct role of NK cells in acutely induced models of diabetes. Yet the role of NK cells in spontaneous diabetes has not been directly addressed. Here, we used the NOD.NK1.1 congenic mouse model to examine the role of NK cells in spontaneous diabetes. Significant numbers of NK cells were seen only in the pancreas of mice with disease. Pancreatic NK cells displayed an activated surface phenotype and proliferated more than NK cells from other tissues in the diseased mice. Nonetheless, depletion of NK cells had no effect on dendritic cell maturation or T cell proliferation. In spontaneous disease, depletion of NK cells had no significant impact on disease onset. NK cells were also not required to promote disease induced by adoptively transferred pathogenic CD4+ T cells. Thus, NK cells are not required for spontaneous autoimmune diabetes in NOD mice.

    Distinguishing Asthma Phenotypes Using Machine Learning Approaches.

    Asthma is not a single disease but an umbrella term for a number of distinct diseases, each of which is caused by a distinct underlying pathophysiological mechanism. These discrete disease entities are often labelled asthma endotypes. The discovery of different asthma subtypes has moved from subjective approaches, in which putative phenotypes are assigned by experts, to data-driven ones that incorporate machine learning. This review focuses on the methodological development of one such machine learning technique, latent class analysis, and how it has contributed to distinguishing asthma and wheezing subtypes in childhood. It also gives a clinical perspective, presenting the findings of studies from the past 5 years that used this approach. The identification of true asthma endotypes may be a crucial step towards understanding their distinct pathophysiological mechanisms, which could ultimately lead to more precise prevention strategies, the identification of novel therapeutic targets, and the development of effective personalized therapies.
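
    Latent class analysis itself is simple enough to sketch. The snippet below (Python) is a minimal EM fit for binary indicators: rows are children, columns are binary symptom reports (say, wheeze recorded at a handful of ages), and the K latent classes play the role of candidate wheeze phenotypes. The variable names and the random toy data are illustrative, not taken from any cited study.

    import numpy as np

    def fit_lca(X, K, n_iter=200, seed=0):
        """EM for a latent class model with Bernoulli indicators."""
        rng = np.random.default_rng(seed)
        n, J = X.shape
        pi = np.full(K, 1.0 / K)                 # class prevalences
        theta = rng.uniform(0.25, 0.75, (K, J))  # P(item j = 1 | class k)
        for _ in range(n_iter):
            # E-step: class responsibilities from Bernoulli log-likelihoods
            log_lik = (X[:, None, :] * np.log(theta)
                       + (1 - X[:, None, :]) * np.log(1 - theta)).sum(axis=2)
            log_post = np.log(pi) + log_lik
            log_post -= log_post.max(axis=1, keepdims=True)  # stability
            resp = np.exp(log_post)
            resp /= resp.sum(axis=1, keepdims=True)
            # M-step: update prevalences and item probabilities
            nk = resp.sum(axis=0)
            pi = nk / n
            theta = np.clip(resp.T @ X / nk[:, None], 1e-6, 1 - 1e-6)
        return pi, theta, resp

    # Toy usage: 500 children, 4 binary wheeze indicators, 3 classes.
    rng = np.random.default_rng(1)
    X = (rng.random((500, 4)) < 0.3).astype(float)
    pi, theta, resp = fit_lca(X, K=3)
    print("class prevalences:", np.round(pi, 2))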

    Hidden SUSY at the LHC: the light higgsino-world scenario and the role of a lepton collider

    While the SUSY flavor, CP, and gravitino problems seem to favor a very heavy spectrum of matter scalars, fine-tuning in the electroweak sector prefers low values of the superpotential mass \mu. In the limit of low \mu, the two lightest neutralinos and the light chargino are higgsino-like. The light charginos and neutralinos may have large production cross sections at the LHC, but since they are nearly mass-degenerate, there is only a small energy release in three-body sparticle decays. The possible dilepton and trilepton signatures are difficult to observe after mild cuts, owing to the very soft p_T spectrum of the final-state isolated leptons. Thus, the higgsino-world scenario can easily elude standard SUSY searches at the LHC. This should motivate experimental searches to focus on dimuon and trimuon production at the very lowest p_T(\mu) values possible. If the neutralino relic abundance is enhanced via non-standard cosmological dark matter production, then there exist excellent prospects for direct or indirect detection of higgsino-like WIMPs. While the higgsino-world scenario may easily hide from LHC SUSY searches, a linear e^+e^- collider or a muon collider operating in the \sqrt{s} \sim 0.5-1 TeV range would easily access the chargino and neutralino pair production reactions.

    Comment: 20 pages including 12 .eps figures
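
    The softness of the leptons follows from simple three-body kinematics; the back-of-the-envelope below is our own illustration, not a derivation quoted from the paper. In the chargino rest frame, for \widetilde{W}_1 \to \widetilde{Z}_1 \ell \nu with a massless lepton, the lepton energy is largest when the \widetilde{Z}_1 \nu system sits at its mass threshold, so with \Delta m \equiv m_{\widetilde{W}_1} - m_{\widetilde{Z}_1},

    E_\ell^{\max} = \frac{m_{\widetilde{W}_1}^2 - m_{\widetilde{Z}_1}^2}{2 m_{\widetilde{W}_1}}
                  = \Delta m \cdot \frac{m_{\widetilde{W}_1} + m_{\widetilde{Z}_1}}{2 m_{\widetilde{W}_1}}
                  \approx \Delta m \qquad (\Delta m \ll m_{\widetilde{W}_1}).

    For a higgsino-like spectrum with \Delta m of order a few GeV, the isolated leptons therefore carry at most a few GeV of p_T, below typical LHC trigger and identification thresholds, which is why dimuon and trimuon searches at the lowest accessible p_T(\mu) are the natural place to look.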