    The Global Trachoma Mapping Project: Methodology of a 34-Country Population-Based Study.

    PURPOSE: To complete the baseline trachoma map worldwide by conducting population-based surveys in an estimated 1238 suspected endemic districts of 34 countries. METHODS: A series of national and sub-national projects, owned, managed and staffed by ministries of health, conducts house-to-house cluster random sample surveys in evaluation units, which generally correspond to "health district" size: populations of 100,000-250,000 people. In each evaluation unit, we invite all residents aged 1 year and older from h households in each of c clusters to be examined for clinical signs of trachoma, where h is the number of households that can be seen by one team in one day, and the product h × c is calculated to facilitate recruitment of 1019 children aged 1-9 years. In addition to individual-level demographic and clinical data, household-level water, sanitation and hygiene data are entered into the purpose-built LINKS application on Android smartphones, transmitted to the cloud, and cleaned, analyzed and approved by ministries of health via a secure web-based portal. The main outcome measures are the evaluation unit-level prevalence of follicular trachoma in children aged 1-9 years, the prevalence of trachomatous trichiasis in adults aged 15+ years, the percentage of households using safe methods for disposal of human feces, and the percentage of households with proximate access to water for personal hygiene purposes. RESULTS: In the first year of fieldwork, 347 field teams commenced work in 21 projects in 7 countries. CONCLUSION: With an approach that is innovative in design and scale, we aim to complete baseline mapping of trachoma throughout the world in 2015.
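
    The survey-design arithmetic described above reduces to choosing c so that h × c households are expected to yield the target of 1019 children. A minimal sketch of that calculation, assuming a hypothetical average number of children aged 1-9 per household (a planning input, not a figure from the paper):

        import math

        def clusters_needed(h, children_per_household, target_children=1019):
            """Number of clusters c such that h households per cluster are
            expected to yield at least target_children children aged 1-9.

            h: households one field team can visit in one day (one cluster).
            children_per_household: assumed average number of resident
            children aged 1-9 per household (illustrative planning input).
            """
            expected_per_cluster = h * children_per_household
            return math.ceil(target_children / expected_per_cluster)

        # Example: a team covering 30 households/day, 1.4 children/household:
        print(clusters_needed(h=30, children_per_household=1.4))  # -> 25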

    Implementation and evaluation of a multi-level mental health promotion intervention for the workplace (MENTUPP): study protocol for a cluster randomised controlled trial

    Background Well-organised and managed workplaces can be a source of wellbeing. The construction, healthcare, and information and communication technology sectors are characterised by work-related stressors (e.g. high workloads, tight deadlines) which are associated with poorer mental health and wellbeing. The MENTUPP intervention is a flexibly delivered, multi-level approach to supporting small- and medium-sized enterprises (SMEs) in creating mentally healthy workplaces. The online intervention is tailored to each sector and designed to support employees and leaders in dealing with mental health difficulties (e.g. stress) and clinical-level anxiety and depression, and in combatting mental health-related stigma. This paper presents the protocol for the cluster randomised controlled trial (cRCT) of the MENTUPP intervention in eight European countries and Australia. Methods Each intervention country will aim to recruit at least two SMEs in each of the three sectors. The design of the cRCT is based on the experiences of a pilot study and guided by a Theory of Change process that describes how the intervention is assumed to work. SMEs will be randomly assigned to the intervention or control conditions. The aim of the cRCT is to assess whether the MENTUPP intervention is effective in improving mental health and wellbeing (primary outcomes) and reducing stigma, depression and suicidal behaviour (secondary outcomes) in employees. The study will also involve a process and economic evaluation. Conclusions At present, there is no known multi-level, tailored, flexible and accessible workplace-based intervention for the prevention of non-clinical and clinical symptoms of depression, anxiety and burnout, and the promotion of mental wellbeing. The results of this study will provide a comprehensive overview of the implementation and effectiveness of such an intervention in a variety of contexts, languages and cultures, leading to the overall goal of delivering an evidence-based intervention for mental health in the workplace.
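
    The protocol summarised here does not spell out the allocation algorithm; purely as an illustration of cluster randomisation at the SME level, stratified by country and sector (all labels hypothetical), one might write:

        import random

        def randomise_clusters(smes, seed=2024):
            """Stratified cluster randomisation sketch. `smes` is a list of
            (sme_id, country, sector) tuples; within each (country, sector)
            stratum, half of the SMEs go to the intervention arm and half
            to control. Illustrative only, not the trial's procedure."""
            rng = random.Random(seed)
            strata = {}
            for sme_id, country, sector in smes:
                strata.setdefault((country, sector), []).append(sme_id)
            allocation = {}
            for members in strata.values():
                rng.shuffle(members)
                half = len(members) // 2
                for sme_id in members[:half]:
                    allocation[sme_id] = "intervention"
                for sme_id in members[half:]:
                    allocation[sme_id] = "control"
            return allocation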

    Cerebral small vessel disease genomics and its implications across the lifespan

    White matter hyperintensities (WMH) are the most common brain-imaging feature of cerebral small vessel disease (SVD), hypertension being the main known risk factor. Here, we identify 27 genome-wide loci for WMH volume in a cohort of 50,970 older individuals, accounting for modification/confounding by hypertension. Aggregated WMH risk variants were associated with altered white matter integrity (p = 2.5 × 10⁻⁷) in brain images from 1,738 young healthy adults, providing insight into the lifetime impact of SVD genetic risk. Mendelian randomization suggested causal associations of increasing WMH volume with stroke and Alzheimer-type dementia, and of increasing blood pressure (BP) with larger WMH volume, notably also in persons without clinical hypertension. Transcriptome-wide colocalization analyses showed association of WMH volume with the expression of 39 genes, of which four encode known drug targets. Finally, we provide insight into BP-independent biological pathways underlying SVD and suggest potential for genetic stratification of high-risk individuals and for genetically informed prioritization of drug targets for prevention trials.
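
    For readers unfamiliar with the method, Mendelian randomization estimates of the kind cited above are commonly obtained with the inverse-variance-weighted (IVW) estimator over per-variant Wald ratios. A minimal sketch (not the paper's pipeline), taking variant-level summary statistics as inputs:

        import numpy as np

        def ivw_estimate(beta_exposure, beta_outcome, se_outcome):
            """Fixed-effects IVW Mendelian randomization estimate.

            For each variant j, beta_exposure[j] is its effect on the
            exposure (e.g. WMH volume) and beta_outcome[j] its effect on
            the outcome (e.g. stroke risk), with standard error
            se_outcome[j]. Returns the pooled causal estimate and its SE.
            """
            bx = np.asarray(beta_exposure)
            by = np.asarray(beta_outcome)
            ratios = by / bx                      # per-variant Wald ratios
            weights = bx**2 / np.asarray(se_outcome) ** 2
            estimate = np.sum(weights * ratios) / np.sum(weights)
            se = np.sqrt(1.0 / np.sum(weights))
            return estimate, se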

    Risk Portfolio Optimization Using the Markowitz MVO Model, in Relation to Human Limitations in Predicting the Future, from the Perspective of the Al-Qur'an

    Risk portfolio management in modern finance has become increasingly technical, requiring sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk, owing to human inability to predict the future precisely, as written in the Al-Qur'an, surah Luqman, verse 34, they have to manage it to obtain an optimal portfolio. The objective is to minimize the variance among all portfolios that attain at least a certain expected return, or alternatively, to maximize the expected return among all portfolios whose risk does not exceed a given level. This study focuses on risk portfolio optimization using the Markowitz MVO (Mean-Variance Optimization) model. The theoretical framework for the analysis comprises the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Finding a minimum-variance portfolio leads to a convex quadratic program: minimize the objective function xᵀQx subject to the constraints μᵀx ≥ r and Ax = b. The outcome of this research is the solution of the optimal risk portfolio for several investments, computed with MATLAB R2007b together with graphical analysis.
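
    The abstract solves this quadratic program in MATLAB; as an illustrative sketch only (using made-up data for three assets and scipy in place of MATLAB's solver), the same minimum-variance problem can be set up as:

        import numpy as np
        from scipy.optimize import minimize

        # Made-up expected returns and covariance matrix for 3 assets.
        mu = np.array([0.08, 0.12, 0.10])
        Q = np.array([[0.10, 0.02, 0.01],
                      [0.02, 0.12, 0.03],
                      [0.01, 0.03, 0.09]])
        r = 0.10  # required minimum expected return

        # Markowitz MVO: minimize x'Qx s.t. mu'x >= r and sum(x) = 1 (Ax = b).
        result = minimize(
            fun=lambda x: x @ Q @ x,                  # portfolio variance
            x0=np.full(3, 1 / 3),                     # equal-weight start
            constraints=[
                {"type": "ineq", "fun": lambda x: mu @ x - r},
                {"type": "eq", "fun": lambda x: x.sum() - 1.0},
            ],
            bounds=[(0.0, 1.0)] * 3,                  # long-only weights
        )
        print(result.x)  # optimal portfolio weights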

    Search for Physics beyond the Standard Model in Events with Overlapping Photons and Jets

    Results are reported from a search for new particles that decay into a photon and two gluons, in events with jets. Novel jet substructure techniques are developed that allow photons to be identified in an environment densely populated with hadrons. The analyzed proton-proton collision data were collected by the CMS experiment at the LHC, in 2016 at √s = 13 TeV, and correspond to an integrated luminosity of 35.9 fb⁻¹. The spectra of total transverse hadronic energy of candidate events are examined for deviations from the standard model predictions. No statistically significant excess is observed over the expected background. The first cross section limits on new physics processes resulting in such events are set. The results are interpreted as upper limits on the rate of gluino pair production, utilizing a simplified stealth supersymmetry model. The excluded gluino masses extend up to 1.7 TeV, for a neutralino mass of 200 GeV, and exceed previous mass constraints set by analyses targeting events with isolated photons.

    Photography-based taxonomy is inadequate, unnecessary, and potentially harmful for biological sciences

    The question of whether taxonomic descriptions naming new animal species without type specimen(s) deposited in collections should be accepted for publication by scientific journals and allowed by the Code has already been discussed in Zootaxa (Dubois & Nemésio 2007; Donegan 2008, 2009; Nemésio 2009a–b; Dubois 2009; Gentile & Snell 2009; Minelli 2009; Cianferoni & Bartolozzi 2016; Amorim et al. 2016). This question was raised again in a letter supported by 35 signatories published in the journal Nature (Pape et al. 2016) on 15 September 2016. On 25 September 2016, the following rebuttal (strictly limited to 300 words, as per Nature's editorial rules) was submitted to Nature, which on 18 October 2016 declined to publish it. As we consider this problem a very important one for zoological taxonomy, the text is published here exactly as submitted to Nature, followed by the list of the 493 taxonomists and collection-based researchers who signed it in the short time span from 20 September to 6 October 2016.

    Bose-Einstein correlations of charged hadrons in proton-proton collisions at √s = 13 TeV

    Bose-Einstein correlations of charged hadrons are measured over a broad multiplicity range, from a few particles up to about 250 reconstructed charged hadrons, in proton-proton collisions at √s = 13 TeV. The results are based on data collected using the CMS detector at the LHC during runs with a special low-pileup configuration. Three analysis techniques with different degrees of dependence on simulations are used to remove the non-Bose-Einstein background from the correlation functions. All three methods give consistent results. The measured lengths of homogeneity are studied as functions of particle multiplicity as well as average pair transverse momentum and mass. The results are compared with data from both CMS and ATLAS at √s = 7 TeV, as well as with theoretical predictions.
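
    For orientation (this is not the paper's code), lengths of homogeneity are typically extracted by fitting the measured two-particle correlation function in relative momentum q with a Bose-Einstein parametrization such as C₂(q) = N(1 + λe^(−qR))(1 + εq). A minimal sketch with synthetic data:

        import numpy as np
        from scipy.optimize import curve_fit

        def c2(q, norm, lam, R, eps):
            """Exponential Bose-Einstein form of the correlation function;
            R is the length of homogeneity (in GeV^-1 for q in GeV)."""
            return norm * (1.0 + lam * np.exp(-q * R)) * (1.0 + eps * q)

        # Synthetic correlation-function data, illustrative only.
        q = np.linspace(0.02, 2.0, 100)  # GeV
        rng = np.random.default_rng(1)
        data = c2(q, 1.0, 0.6, 5.0, 0.01) + rng.normal(0.0, 0.01, q.size)

        popt, _ = curve_fit(c2, q, data, p0=[1.0, 0.5, 4.0, 0.0])
        R_fm = popt[2] * 0.1973  # hbar*c ~ 0.1973 GeV*fm converts to fm
        print(f"fitted length of homogeneity ~ {R_fm:.2f} fm")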

    An embedding technique to determine ττ backgrounds in proton-proton collision data

    An embedding technique is presented to estimate standard model ττ backgrounds from data with minimal simulation input. In the data, the muons are removed from reconstructed μμ events and replaced with simulated τ leptons with the same kinematic properties. In this way, a set of hybrid events is obtained that does not rely on simulation except for the decay of the τ leptons. The challenges in describing the underlying event or the production of associated jets in the simulation are avoided. The technique described in this paper was developed for CMS. Its validation and the inherent uncertainties are also discussed. The demonstration of the performance of the technique is based on a sample of proton-proton collisions collected by CMS in 2017 at √s = 13 TeV, corresponding to an integrated luminosity of 41.5 fb⁻¹.
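
    A highly simplified sketch of the replace-and-embed idea (not the CMS implementation; simulate_tau_decay is a hypothetical stand-in for a full τ decay simulation):

        def embed_taus(event, simulate_tau_decay):
            """Build a hybrid event: recorded data minus the two muons,
            plus simulated tau leptons carrying the muons' kinematics."""
            muons = [p for p in event["particles"] if abs(p["pdg_id"]) == 13]
            others = [p for p in event["particles"] if abs(p["pdg_id"]) != 13]

            tau_products = []
            for mu in muons:
                # Give the tau the muon's measured four-momentum, then
                # let the (simulated) decay produce its daughters.
                tau = {"pdg_id": 15 if mu["pdg_id"] > 0 else -15,
                       "px": mu["px"], "py": mu["py"],
                       "pz": mu["pz"], "E": mu["E"]}
                tau_products.extend(simulate_tau_decay(tau))

            # Underlying event and associated jets are taken from data,
            # so only the tau decays rely on simulation.
            return {"particles": others + tau_products}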

    Search for dark matter in events with a leptoquark and missing transverse momentum in proton-proton collisions at 13 TeV

    A search is presented for dark matter in proton-proton collisions at a center-of-mass energy of √s = 13 TeV, using events with at least one high transverse momentum (pT) muon, at least one high-pT jet, and large missing transverse momentum. The data were collected with the CMS detector at the CERN LHC in 2016 and 2017, and correspond to an integrated luminosity of 77.4 fb⁻¹. In the examined scenario, a pair of scalar leptoquarks is assumed to be produced. One leptoquark decays to a muon and a jet, while the other decays to dark matter and low-pT standard model particles. The signature for signal events would be significant missing transverse momentum from the dark matter, in conjunction with a peak at the leptoquark mass in the invariant mass distribution of the highest-pT muon and jet. The data are observed to be consistent with the background predicted by the standard model. For the first benchmark scenario considered, dark matter masses up to 500 GeV are excluded for leptoquark masses mLQ ≈ 1400 GeV, and up to 300 GeV for mLQ ≈ 1500 GeV. For the second benchmark scenario, dark matter masses up to 600 GeV are excluded for mLQ ≈ 1400 GeV.
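
    For context (illustration only), the muon-jet peak referred to above is the standard two-body invariant mass, m² = (E₁ + E₂)² − |p⃗₁ + p⃗₂|²; a minimal sketch:

        import math

        def invariant_mass(p1, p2):
            """Invariant mass of two objects given as (E, px, py, pz) in GeV.
            For a leptoquark decaying to a muon and a jet, this quantity
            peaks near the leptoquark mass."""
            E = p1[0] + p2[0]
            px, py, pz = (p1[i] + p2[i] for i in (1, 2, 3))
            return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

        # Illustrative back-to-back muon and jet, each with pT of 700 GeV:
        mu = (700.0, 700.0, 0.0, 0.0)
        jet = (700.0, -700.0, 0.0, 0.0)
        print(invariant_mass(mu, jet))  # -> 1400.0, i.e. near m_LQ in GeV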

    Measurement of b jet shapes in proton-proton collisions at √s = 5.02 TeV

    We present the first study of charged-hadron production associated with jets originating from b quarks in proton-proton collisions at a center-of-mass energy of 5.02 TeV. The data sample used in this study was collected with the CMS detector at the CERN LHC and corresponds to an integrated luminosity of 27.4 pb⁻¹. To characterize the jet substructure, the differential jet shapes, defined as the normalized transverse momentum distribution of charged hadrons as a function of angular distance from the jet axis, are measured for b jets. In addition to the jet shapes, the per-jet yields of charged particles associated with b jets are also quantified, again as a function of the angular distance with respect to the jet axis. The extracted jet shape and particle yield distributions for b jets are compared with results for inclusive jets, as well as with predictions from the pythia and herwig++ event generators.
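
    A schematic sketch of the differential jet shape just defined (not the CMS analysis code): the pT-weighted distribution of charged hadrons in annuli of angular distance r = √(Δη² + Δφ²) from the jet axis, normalized to the total associated pT and to the annulus width:

        import numpy as np

        def jet_shape(tracks, r_edges):
            """Differential jet shape rho(r) for a single jet.

            `tracks` is a list of (delta_r, pt) pairs: each charged hadron's
            angular distance from the jet axis and its transverse momentum.
            Returns rho(r) = (pT in annulus) / (total pT * annulus width)
            for the annuli defined by `r_edges`. Per-jet yields would
            instead average the annulus pT sums over all jets.
            """
            dr = np.array([t[0] for t in tracks])
            pt = np.array([t[1] for t in tracks])
            total_pt = pt.sum()
            rho = []
            for lo, hi in zip(r_edges[:-1], r_edges[1:]):
                in_annulus = (dr >= lo) & (dr < hi)
                rho.append(pt[in_annulus].sum() / (total_pt * (hi - lo)))
            return np.array(rho)

        # Illustrative tracks: (distance from jet axis, pT in GeV).
        tracks = [(0.02, 20.0), (0.08, 10.0), (0.15, 5.0), (0.35, 2.0)]
        print(jet_shape(tracks, r_edges=np.array([0.0, 0.1, 0.2, 0.3, 0.4])))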