
    A novel approach to light-front perturbation theory

    We suggest a possible algorithm to calculate one-loop n-point functions within a variant of light-front perturbation theory. The key ingredients are the covariant Passarino-Veltman scheme and a surprising integration formula that localises Feynman integrals at vanishing longitudinal momentum. The resulting expressions are generalisations of Weinberg's infinite-momentum results and are manifestly Lorentz invariant. For n = 2 and 3 we explicitly show how to relate those to light-front integrals with standard energy denominators. All expressions are rendered finite by means of transverse dimensional regularisation.
    Comment: 10 pages, 5 figures

    Fast methods for training Gaussian processes on large data sets

    Gaussian process regression (GPR) is a non-parametric Bayesian technique for interpolating or fitting data. The main barrier to further uptake of this powerful tool rests in the computational costs associated with the matrices which arise when dealing with large data sets. Here, we derive some simple results which we have found useful for speeding up the learning stage in the GPR algorithm, and especially for performing Bayesian model comparison between different covariance functions. We apply our techniques to both synthetic and real data and quantify the speed-up relative to using nested sampling to numerically evaluate model evidences.
    Comment: Fixed missing reference
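    The matrix cost this abstract alludes to is easy to make concrete. As a minimal illustrative sketch (not the authors' speed-up techniques), evaluating the GP log marginal likelihood with a squared-exponential kernel requires factorising the n × n covariance matrix, an O(n³) operation that must be repeated at every hyperparameter setting during learning and model comparison:

```python
import numpy as np

def gp_log_marginal_likelihood(X, y, lengthscale=1.0, signal_var=1.0, noise_var=0.1):
    """Log marginal likelihood of a GP with a squared-exponential kernel.

    The O(n^3) Cholesky factorisation of the n x n covariance matrix is
    the step that dominates the cost on large data sets.
    """
    n = len(y)
    # Squared-exponential covariance matrix plus observation noise
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = signal_var * np.exp(-0.5 * d2 / lengthscale**2) + noise_var * np.eye(n)
    L = np.linalg.cholesky(K)                       # O(n^3) step
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * n * np.log(2 * np.pi))

rng = np.random.default_rng(0)
X = rng.uniform(0, 5, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
print(gp_log_marginal_likelihood(X, y))
```

    Comparing this quantity across covariance functions is exactly the Bayesian model comparison the abstract refers to, which is why cheapening its evaluation matters.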

    Uses of strength-based interventions for people with serious mental illness: a critical review

    Background: For the past 3 decades, mental health practitioners have increasingly adopted aspects and tools of strength-based approaches. Providing strength-based intervention and amplifying strengths relies heavily on effective interpersonal processes. Aim: This article is a critical review of research regarding the use of strength-based approaches in mental health service settings. The aim is to discuss strength-based interventions within broader research on recovery, focussing on effectiveness and advances in practice where applicable. Method: A systematic search for peer-reviewed intervention studies published between 2001 and December 2014 yielded 55 articles of potential relevance to the review. Results: Seven studies met the inclusion criteria and were included in the analysis. The Quality Assessment Tool for Quantitative Studies was used to appraise the quality of the studies. Our review found emerging evidence that the utilisation of a strength-based approach improves outcomes, including hospitalisation rates, employment/educational attainment, and intrapersonal outcomes such as self-efficacy and sense of hope. Conclusion: Recent studies confirm the feasibility of implementing a high-fidelity strength-based approach in clinical settings and its relevance for practitioners in health care. More high-quality studies are needed to further examine the effectiveness of strength-based approaches.

    Towards an understanding of game software development processes: a case study

    This paper aims to fill a gap in the research literature by investigating and reporting on the software development processes used in game development. To better understand these processes and practices, and the role the software development process plays within game development, a single industrial case study was undertaken and reported, examining the software development processes and practices used in game development in a real-world context. This research contributes to our knowledge of the field of game development and potentially forms the foundation for further research in the area.

    High-Precision Measurement of the 19Ne Half-Life and Implications for Right-Handed Weak Currents

    We report a precise determination of the 19Ne half-life to be T_{1/2} = 17.262 ± 0.007 s. This result disagrees with the most recent precision measurements and is important for placing bounds on predicted right-handed interactions that are absent in the current Standard Model. We are able to identify and disentangle two competing systematic effects that influence the accuracy of such measurements. Our findings prompt a reassessment of results from previous high-precision lifetime measurements that used similar equipment and methods.
    Comment: 5 pages and 5 figures. Paper accepted for publication in Phys. Rev. Lett.
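    For readers converting between half-life and decay constant, the standard relation λ = ln 2 / T₁/₂ applied to the value reported above gives (a back-of-the-envelope sketch, not part of the measurement itself):

```python
import math

T_HALF = 17.262  # s, the 19Ne half-life reported above

decay_const = math.log(2) / T_HALF   # lambda = ln 2 / T_1/2, in 1/s
# Sanity check: after exactly one half-life, half the nuclei survive
surviving_fraction = math.exp(-decay_const * T_HALF)
print(decay_const, surviving_fraction)
```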

    Stalking influenza by vaccination with pre-fusion headless HA mini-stem.

    Inaccuracies in prediction of circulating viral strain genotypes and the possibility of novel reassortants causing a pandemic outbreak necessitate the development of an anti-influenza vaccine with increased breadth of protection and potential for rapid production and deployment. The hemagglutinin (HA) stem is a promising target for a universal influenza vaccine, as stem-specific antibodies have the potential to be broadly cross-reactive towards different HA subtypes. Here, we report the design of a bacterially expressed polypeptide that mimics a H5 HA stem by protein minimization to focus the antibody response towards the HA stem. The HA mini-stem folds as a trimer mimicking the HA prefusion conformation. It is resistant to thermal/chemical stress, and it binds to conformation-specific, HA stem-directed broadly neutralizing antibodies with high affinity. Mice vaccinated with the group 1 HA mini-stems are protected from morbidity and mortality against lethal challenge by both group 1 (H5 and H1) and group 2 (H3) influenza viruses, the first report of cross-group protection. Passive transfer of immune serum demonstrates the protection is mediated by stem-specific antibodies. Furthermore, antibodies induced by these HA stems have broad HA reactivity, yet they do not have antibody-dependent enhancement activity.

    Statistical methods in cosmology

    The advent of large data sets in cosmology has meant that in the past 10 or 20 years our knowledge and understanding of the Universe has changed not only quantitatively but also, and most importantly, qualitatively. Cosmologists rely on data in which a host of useful information is enclosed, but encoded in a non-trivial way. The challenges in extracting this information must be overcome to make the most of a large experimental effort. Even after having converged to a standard cosmological model (the LCDM model) we should keep in mind that this model is described by 10 or more physical parameters, and if we want to study deviations from it, the number of parameters is even larger. Dealing with such a high-dimensional parameter space and finding parameter constraints is a challenge in itself. Cosmologists want to be able to compare and combine different data sets, both to test for possible disagreements (which could indicate new physics) and to improve parameter determinations. Finally, cosmologists in many cases want to find out, before actually doing the experiment, how much one would be able to learn from it. For all these reasons, sophisticated statistical techniques are being employed in cosmology, and it has become crucial to know some statistical background to understand recent literature in the field. I will introduce some statistical tools that any cosmologist should know about in order to be able to understand recently published results from the analysis of cosmological data sets. I will not present a complete and rigorous introduction to statistics, as there are several good books which are reported in the references. The reader should refer to those.
    Comment: 31 pages, 6 figures, notes from 2nd Trans-Regio Winter school in Passo del Tonale. To appear in Lecture Notes in Physics, "Lectures on cosmology: Accelerated expansion of the universe", Feb 201
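    As a toy illustration of the kind of tool such notes cover (the two-parameter Gaussian "likelihood" and all numbers below are stand-ins, not drawn from any cosmological data set), a Metropolis-Hastings chain is one standard way to map out parameter constraints in a posterior:

```python
import numpy as np

# Toy posterior: two parameters with fiducial values loosely reminiscent
# of (Omega_m, Omega_Lambda); the Gaussian likelihood is purely illustrative.
rng = np.random.default_rng(1)
true = np.array([0.3, 0.7])
inv_cov = np.linalg.inv(np.diag([0.01**2, 0.02**2]))

def log_like(theta):
    r = theta - true
    return -0.5 * r @ inv_cov @ r

# Random-walk Metropolis-Hastings: propose a small step, accept with
# probability min(1, likelihood ratio), otherwise repeat the current point.
chain = [np.array([0.5, 0.5])]
for _ in range(20000):
    prop = chain[-1] + 0.01 * rng.standard_normal(2)
    if np.log(rng.random()) < log_like(prop) - log_like(chain[-1]):
        chain.append(prop)
    else:
        chain.append(chain[-1])

samples = np.array(chain[5000:])   # discard burn-in
print(samples.mean(axis=0), samples.std(axis=0))
```

    The sample means recover the fiducial parameters and the sample scatter traces the posterior widths, which is how credible intervals are read off a chain in practice.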

    Iterative graph cuts for image segmentation with a nonlinear statistical shape prior

    Shape-based regularization has proven to be a useful method for delineating objects within noisy images where one has prior knowledge of the shape of the targeted object. When a collection of possible shapes is available, the specification of a shape prior using kernel density estimation is a natural technique. Unfortunately, energy functionals arising from kernel density estimation are of a form that makes them impossible to directly minimize using efficient optimization algorithms such as graph cuts. Our main contribution is to show how one may recast the energy functional into a form that is minimizable iteratively and efficiently using graph cuts.
    Comment: Revision submitted to JMIV (02/24/13)
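    The problematic energy can be made concrete. In this illustrative sketch (random binary "shapes" and a fixed bandwidth are assumptions, not the paper's data or formulation), the kernel-density shape prior assigns energy −log p(s), a log of a sum of exponentials of shape distances; because this does not decompose into a sum of pairwise pixel terms, graph cuts cannot minimise it directly:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy "training shapes": N binary masks flattened to P-dimensional vectors
N, P = 20, 64
templates = (rng.random((N, P)) > 0.5).astype(float)
sigma = 2.0

def shape_prior_energy(s):
    """E(s) = -log KDE(s): low when s is close to some training shape."""
    d2 = np.sum((templates - s) ** 2, axis=1)
    # Log of a sum of Gaussians -- not a sum of pairwise pixel terms
    return -np.log(np.mean(np.exp(-d2 / (2 * sigma**2))) + 1e-300)

e_near = shape_prior_energy(templates[0])       # a shape in the training set
e_far = shape_prior_energy(1 - templates[0])    # its complement, far from it
print(e_near, e_far)
```

    A graph-cut-friendly energy must split into unary and submodular pairwise terms over pixels; the iterative recasting the abstract describes replaces the log-sum at each step with a surrogate that does.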

    Informing investment to reduce inequalities: a modelling approach

    Background: Reducing health inequalities is an important policy objective but there is limited quantitative information about the impact of specific interventions. Objectives: To provide estimates of the impact of a range of interventions on health and health inequalities. Materials and methods: Literature reviews were conducted to identify the best evidence linking interventions to mortality and hospital admissions. We examined interventions across the determinants of health: a ‘living wage’; changes to benefits, taxation and employment; active travel; tobacco taxation; smoking cessation, alcohol brief interventions, and weight management services. A model was developed to estimate mortality and years of life lost (YLL) in intervention and comparison populations over a 20-year time period following interventions delivered only in the first year. We estimated changes in inequalities using the relative index of inequality (RII). Results: Introduction of a ‘living wage’ generated the largest beneficial health impact, with modest reductions in health inequalities. Benefits increases had modest positive impacts on health and health inequalities. Income tax increases had negative impacts on population health but reduced inequalities, while council tax increases worsened both health and health inequalities. Active travel increases had minimally positive effects on population health but widened health inequalities. Increases in employment reduced inequalities only when targeted to the most deprived groups. Tobacco taxation had modestly positive impacts on health but little impact on health inequalities. Alcohol brief interventions had modestly positive impacts on health and health inequalities only when strongly socially targeted, while smoking cessation and weight-reduction programmes had minimal impacts on health and health inequalities even when socially targeted. 
Conclusions: Interventions have markedly different effects on mortality, hospitalisations and inequalities. The most effective (and likely cost-effective) interventions for reducing inequalities were regulatory and tax options. Interventions focused on individual agency were much less likely to impact on inequalities, even when targeted at the most deprived communities.
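    The relative index of inequality (RII) used to quantify changes in inequalities can be sketched in a few lines. The quintile rates and the simple linear fit below are illustrative assumptions, not the paper's model or data:

```python
import numpy as np

# Made-up example: mortality rates per 1,000 across deprivation quintiles,
# ordered from least to most deprived, each covering 20% of the population.
rates = np.array([4.0, 5.0, 6.5, 8.0, 10.0])
shares = np.array([0.2, 0.2, 0.2, 0.2, 0.2])

# Midpoint of each group on the cumulative population-rank scale [0, 1]
cum = np.cumsum(shares)
mid = cum - shares / 2

# Regress rate on rank; the RII is the ratio of the fitted rate at the
# most deprived end (rank 1) to that at the least deprived end (rank 0).
slope, intercept = np.polyfit(mid, rates, 1)
rii = (intercept + slope) / intercept
print(round(rii, 2))  # 3.54 for these example numbers
```

    An RII above 1 indicates worse outcomes at the deprived end of the distribution; an intervention that lowers the RII narrows inequality on this relative scale.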

    Reports of interactive meetings with policy makers

    The Streamlining of Ocean Wave Farms Impact Assessment (SOWFIA) Project (IEE/09/809/ SI2.558291) is an EU Intelligent Energy Europe (IEE) funded project that draws together ten partners, across eight European countries, who are actively involved with planned wave farm test centres. The SOWFIA project aims to achieve the sharing and consolidation of pan-European experience of consenting processes and environmental and socio-economic impact assessment (IA) best practices for offshore wave energy conversion developments. Studies of wave farm demonstration projects in each of the collaborating EU nations are contributing to the findings. The study sites comprise a wide range of device technologies, environmental settings and stakeholder interests. Through project workshops, meetings, on-going communication and networking amongst project partners, ideas and experiences relating to IA and policy are being shared, and co-ordinated studies addressing key questions for wave energy development are being carried out. The overall goal of the SOWFIA project is to provide recommendations for approval process streamlining and European-wide streamlining of IA processes, thereby helping to remove legal, environmental and socio-economic barriers to the development of offshore power generation from waves. By utilising the findings from technology-specific monitoring at multiple sites, SOWFIA will accelerate knowledge transfer and promote European-wide expertise on environmental and socio-economic impact assessments of wave energy projects. In this way, the development of the future, commercial phase of offshore wave energy installations will benefit from the lessons learned from existing smaller-scale developments.