
    Carboxyhaemoglobin levels and their determinants in older British men

    Background: Although there has been concern about levels of carbon monoxide exposure, particularly among older people, little is known about carboxyhaemoglobin (COHb) levels and their determinants in the general population. We examined these issues in a study of older British men.

    Methods: Cross-sectional study of 4252 men aged 60-79 years, selected from one socially representative general practice in each of 24 British towns, who attended for examination between 1998 and 2000. Blood samples were measured for COHb, and information on social, household and individual factors was assessed by questionnaire. Analyses were based on 3603 men measured in or close to (< 10 miles) their place of residence.

    Results: The COHb distribution was positively skewed. The geometric mean COHb level was 0.46% and the median 0.50%; 9.2% of men had a COHb level of 2.5% or more and 0.1% had a level of 7.5% or more. Factors independently related to mean COHb level included season (highest in autumn and winter), region (highest in Northern England), gas cooking (slight increase), central heating (slight decrease) and active smoking, the strongest determinant. Mean COHb levels were more than ten times greater in men smoking more than 20 cigarettes a day (3.29%) than in non-smokers (0.32%); almost all subjects with COHb levels of 2.5% and above (93%) were smokers. Pipe and cigar smoking was associated with more modest increases in COHb level. Passive cigarette smoke exposure had no independent association with COHb after adjustment for other factors. Active smoking accounted for 41% of the variance in COHb level, and all factors together for 47%.

    Conclusion: An appreciable proportion of men have COHb levels of 2.5% or more, at which symptomatic effects may occur, though very high levels are uncommon. The results confirm that smoking (particularly cigarette smoking) is the dominant influence on COHb levels.
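    The abstract reports a geometric mean because the COHb distribution is positively skewed. As a minimal sketch of that summary statistic, the snippet below computes a geometric mean (the exponentiated mean of the logs) and the share of readings at or above the 2.5% threshold; the COHb values are hypothetical, not the study's data.

```python
import math

# Hypothetical COHb readings (%), illustrating a positively skewed sample.
cohb = [0.3, 0.4, 0.5, 0.6, 3.2]

# Geometric mean: exponentiate the mean of the natural logs.
geometric_mean = math.exp(sum(math.log(x) for x in cohb) / len(cohb))

# Proportion at or above the 2.5% threshold discussed in the abstract.
share_high = sum(x >= 2.5 for x in cohb) / len(cohb)

print(round(geometric_mean, 2), share_high)
```

    Because the log transform damps the influence of the one large reading, the geometric mean here sits well below the arithmetic mean, which is why it is the preferred summary for skewed exposure data.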

    Observation of interstellar lithium in the low-metallicity Small Magellanic Cloud

    The primordial abundances of light elements produced in the standard theory of Big Bang nucleosynthesis (BBN) depend only on the cosmic ratio of baryons to photons, a quantity inferred from observations of the microwave background. The predicted primordial 7Li abundance is four times that measured in the atmospheres of Galactic halo stars. This discrepancy could be caused by modification of surface lithium abundances during the stars' lifetimes or by physics beyond the Standard Model that affects early nucleosynthesis. The lithium abundance of low-metallicity gas provides an alternative constraint on the primordial abundance and cosmic evolution of lithium that is not susceptible to the in situ modifications that may affect stellar atmospheres. Here we report observations of interstellar 7Li in the low-metallicity gas of the Small Magellanic Cloud, a nearby galaxy with a quarter of the Sun's metallicity. The present-day 7Li abundance of the Small Magellanic Cloud is nearly equal to the BBN predictions, severely constraining the amount of possible subsequent enrichment of the gas by stellar and cosmic-ray nucleosynthesis. Our measurements can be reconciled with standard BBN with an extremely fine-tuned depletion of stellar Li with metallicity. They are also consistent with non-standard BBN.

    Comment: Published in Nature. Includes main text and Supplementary Information. Replaced with final title and abstract.

    Quality of medication use in primary care - mapping the problem, working to a solution: a systematic review of the literature

    Background: The UK, USA and the World Health Organization have identified improved patient safety in healthcare as a priority. Medication error has been identified as one of the most frequent forms of medical error and is associated with significant harm. Errors are the result of the systems that produce them. In industrial settings, a range of systematic techniques has been designed to reduce error and waste, the first stage of which is to map out the whole system and its reliability at each stage. To date, however, studies of medication error and its solutions have concentrated on individual parts of the whole system. In this paper we conducted a systematic review of the literature to map out the medication system with its associated errors and failures in quality, to assess the strength of the evidence, and to use approaches from quality management to identify ways in which the system could be made safer.

    Methods: We mapped out the medicines management system in primary care in the UK and conducted a systematic literature review to refine the map and to establish the quality of the research and the reliability of the system.

    Results: The map demonstrated that the proportion of errors in the medicines management system in primary care is very high. Several stages of the process had error rates of 50% or more: repeat prescribing reviews, interface prescribing and communication, and patient adherence. When the efficacy of the medicine was included, the available evidence suggested that only between 4% and 21% of patients achieved the optimum benefit from their medication. Although the evidence base had some limitations, including the error rate measurement and the sampling strategies employed, there was sufficient information to indicate how the system could be improved using management approaches. The first step to improving overall quality would be routine monitoring of adherence, clinical effectiveness and hospital admissions.

    Conclusion: By adopting a whole-system approach from a management perspective, we have identified where failures in quality occur in medication use in primary care in the UK, and where weaknesses occur in the associated evidence base. Quality management approaches have allowed us to develop a coherent change and research agenda to tackle these, so far, fairly intractable problems.

    CENGO: a web-based serious game to increase the programming knowledge levels of computer engineering students

    In recent years, games have been used to increase the knowledge and experience of individuals working in different domains. In the education field especially, several serious games teach lecture subjects or other educational material to students in an enjoyable way. This study therefore takes a quantitative research approach to increasing the programming knowledge of first-year undergraduate students in computer engineering departments. To this end, a responsive web platform was developed to teach the syntax and logic of the C programming language using game elements; because the platform is always accessible, students can continuously revisit topics related to C. To evaluate the effectiveness of the designed environment, 10 first-year computer engineering students were selected. The results of the user tests indicate that the game can be used as an educational tool, supporting traditional training methods, to increase students' knowledge of the syntax and logic of the C programming language.

    Minimum Sensitivity Based Robust Beamforming with Eigenspace Decomposition

    An enhanced eigenspace-based beamformer (ESB), derived using a minimum sensitivity criterion, is proposed with significantly improved robustness against steering vector errors. The sensitivity function is defined as the squared norm of the appropriately scaled weight vector; because the sensitivity of an array to perturbations becomes very large in the presence of steering vector errors, it can be used to find the best projection for the ESB, irrespective of the distribution of the additive noise. As demonstrated by simulation results, the proposed method outperforms classic ESBs and a previously proposed uncertainty-set-based approach.
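    The sensitivity measure described above can be sketched numerically: scale the weight vector for a unit (distortionless) response toward the presumed steering vector, then take its squared norm. The sketch below uses standard MVDR weights on a hypothetical uniform linear array with a made-up sample covariance; it illustrates the quantity, not the paper's eigenspace projection method.

```python
import numpy as np

def steering(n, theta):
    # Uniform linear array, half-wavelength spacing (hypothetical geometry).
    k = np.arange(n)
    return np.exp(1j * np.pi * k * np.sin(theta))

n = 8
a = steering(n, 0.0)  # presumed look direction

# Made-up sample covariance from random snapshots, plus diagonal loading.
rng = np.random.default_rng(0)
snap = rng.standard_normal((n, 200)) + 1j * rng.standard_normal((n, 200))
R = snap @ snap.conj().T / 200 + np.eye(n)

# MVDR-style weights, scaled so the response toward a equals 1.
w = np.linalg.solve(R, a)
w = w / (w.conj() @ a)

# Sensitivity: squared norm of the scaled weight vector.
sensitivity = np.real(w.conj() @ w)
print(sensitivity)
```

    With the distortionless constraint |w^H a| = 1, the Cauchy-Schwarz inequality bounds this sensitivity from below by 1/||a||^2 = 1/n, so large values flag a beamformer vulnerable to perturbations.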

    Influenza Surveillance among Outpatients and Inpatients in Morocco, 1996–2009

    There is limited information about the epidemiology of influenza in Africa. We describe the epidemiology and seasonality of influenza in Morocco from 1996 to 2009, with particular emphasis on the 2007-2008 and 2008-2009 influenza seasons, and discuss the successes and challenges of the enhanced surveillance system introduced in 2007.

    Virologic sentinel surveillance for influenza virus was initiated in Morocco in 1996 using a network of private practitioners who collected oro-pharyngeal and naso-pharyngeal swabs from outpatients presenting with influenza-like illness (ILI). The surveillance network expanded over the years to include inpatients presenting with severe acute respiratory illness (SARI) at hospitals and syndromic surveillance for ILI and acute respiratory infection (ARI). Respiratory samples and structured questionnaires were collected from eligible patients, and samples were tested for influenza viruses by immunofluorescence assays and viral isolation.

    We obtained a total of 6465 respiratory specimens during 1996 to 2009, of which 3102 were collected during 2007-2009. Of those, 2249 (72%) were from patients with ILI and 853 (27%) were from patients with SARI. Among the 3102 patients, 98 (3%) had laboratory-confirmed influenza, of whom 85 (87%) had ILI and 13 (13%) had SARI. Among ILI patients, the highest proportions of laboratory-confirmed influenza occurred in children less than 5 years of age (3/169, 2% during 2007-2008 and 23/271, 9% during 2008-2009) and in patients 25-59 years of age (8/440, 2% during 2007-2008 and 21/483, 4% during 2008-2009). All SARI patients with influenza were less than 14 years of age.

    During all surveillance years, influenza virus circulation was seasonal, with peak circulation during the winter months of October through April. Influenza results in both mild and severe respiratory infections in Morocco, and accounted for a large proportion of all hospitalizations for severe respiratory illness among children 5 years of age and younger.

    Profiles of physical, emotional and psychosocial wellbeing in the Lothian birth cohort 1936

    Background: Physical, emotional, and psychosocial wellbeing are important domains of function. The aims of this study were to explore the existence of separable groups among 70-year-olds with scores representing physical function, perceived quality of life, and emotional wellbeing, and to characterise any resulting groups using demographic, personality, cognition, health and lifestyle variables.

    Methods: We used latent class analysis (LCA) to identify possible groups.

    Results: The results suggested there were 5 groups. These included High (n = 515, 47.2% of the sample), Average (n = 417, 38.3%), and Poor Wellbeing (n = 37, 3.4%) groups. The two other groups had contrasting patterns of wellbeing: one scored relatively well on physical function but low on emotional wellbeing (Good Fitness/Low Spirits, n = 60, 5.5%), whereas the other showed low physical function but relatively good emotional wellbeing (Low Fitness/Good Spirits, n = 62, 5.7%). Salient characteristics that distinguished the groups included smoking and drinking behaviours, personality, and illness.

    Conclusions: Despite some evidence for these groups, the results largely support a one-dimensional construct of wellbeing in old age, for the domains assessed here, though some individuals have uneven profiles.
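    The five profiles above can be pictured as centroids in a (physical, quality-of-life, emotional) score space. The toy sketch below assigns standardised scores to the nearest named profile; real LCA assigns class membership by posterior probability, and the centroid values here are illustrative guesses, not the study's estimates.

```python
# Hypothetical centroids on standardised (physical, QoL, emotional) scores.
profiles = {
    "High":                     ( 1.0,  1.0,  1.0),
    "Average":                  ( 0.0,  0.0,  0.0),
    "Poor Wellbeing":           (-1.0, -1.0, -1.0),
    "Good Fitness/Low Spirits": ( 1.0,  0.0, -1.0),
    "Low Fitness/Good Spirits": (-1.0,  0.0,  1.0),
}

def nearest_profile(scores):
    # Squared Euclidean distance to each centroid; smallest wins.
    return min(profiles, key=lambda name: sum(
        (s - c) ** 2 for s, c in zip(scores, profiles[name])))

print(nearest_profile((0.9, 0.8, 1.1)))
```

    The two "contrasting" groups are exactly the centroids whose physical and emotional coordinates have opposite signs, which is what makes them separable from the one-dimensional High-Average-Poor axis.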

    Nonthermal Emission from Star-Forming Galaxies

    The detections of high-energy gamma-ray emission from the nearby starburst galaxies M82 and NGC 253, and from other Local Group galaxies, broaden our knowledge of star-driven nonthermal processes and phenomena in non-AGN star-forming galaxies. We review basic aspects of the related processes and their modeling in starburst galaxies. Since these processes involve both energetic electrons and protons accelerated by supernova (SN) shocks, their respective radiative yields can be used to explore the SN-particle-radiation connection. Specifically, the relation between SN activity, energetic particles, and their radiative yields is assessed through measures of the particle energy density in several star-forming galaxies. The deduced energy densities range from O(0.1) eV/cm^3 in very quiet environments to O(100) eV/cm^3 in regions with very high star formation rates.

    Comment: 17 pages, 5 figures, to be published in Astrophysics and Space Science Proceedings.

    Practical Issues in Imputation-Based Association Mapping

    Imputation-based association methods provide a powerful framework for testing untyped variants for association with phenotypes and for combining results from multiple studies that use different genotyping platforms. Here, we consider several issues that arise when applying these methods in practice, including: (i) factors affecting imputation accuracy, including the choice of reference panel; (ii) the effects of imputation accuracy on power to detect associations; (iii) the relative merits of Bayesian and frequentist approaches to testing imputed genotypes for association with phenotype; and (iv) how to quickly and accurately compute Bayes factors for testing imputed SNPs. We find that imputation-based methods can be robust to imputation accuracy and can improve power to detect associations, even when average imputation accuracy is poor. We explain how ranking SNPs for association by a standard likelihood ratio test gives the same results as a Bayesian procedure that uses an unnatural prior assumption, namely that difficult-to-impute SNPs tend to have larger effects, and assess the power gained from using a Bayesian approach that does not make this assumption. Within the Bayesian framework, we find that good approximations to a full analysis can be achieved by simply replacing unknown genotypes with a point estimate: their posterior mean. This approximation considerably reduces computational expense compared with published sampling-based approaches, and the methods we present are practical on a genome-wide scale with very modest computational resources (e.g., a single desktop computer). The approximation also facilitates combining information across studies using only summary data for each SNP. The methods discussed here are implemented in the software package BIMBAM, which is available from http://stephenslab.uchicago.edu/software.html.
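    The posterior-mean point estimate mentioned above (often called the genotype "dosage") is simple to compute: it is the expected count of the alternate allele under each individual's posterior genotype probabilities. A minimal sketch with made-up posteriors for one untyped SNP; the numbers are illustrative and not taken from BIMBAM.

```python
# Hypothetical posterior P(genotype = 0, 1, 2 alternate-allele copies)
# for three individuals at one imputed SNP.
posteriors = [
    (0.90, 0.09, 0.01),
    (0.10, 0.80, 0.10),
    (0.02, 0.18, 0.80),
]

# Posterior-mean genotype (dosage): expected allele count.
dosages = [0 * p0 + 1 * p1 + 2 * p2 for p0, p1, p2 in posteriors]
print(dosages)
```

    These continuous dosages, rather than hard genotype calls, would then enter the association regression, which is what lets the approximation track imputation uncertainty at essentially no extra computational cost.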

    Corruption and bicameral reforms

    During the last decade, unicameral proposals have been put forward in fourteen US states. In this paper we analyze the effects of the proposed constitutional reforms in a setting where decision making is subject to ‘hard time constraints’ and lawmakers face the opposing interests of a lobby and the electorate. We show that bicameralism might lead to a decline in the lawmakers’ bargaining power vis-a-vis the lobby, thus compromising their accountability to voters. Hence, bicameralism is not a panacea against the abuse of power by elected legislators, and the proposed unicameral reforms could be effective in reducing corruption among elected representatives.