
    Quality of medication use in primary care - mapping the problem, working to a solution: a systematic review of the literature

    Background: The UK, the USA and the World Health Organization have identified improved patient safety in healthcare as a priority. Medication error has been identified as one of the most frequent forms of medical error and is associated with significant medical harm. Errors are the result of the systems that produce them. In industrial settings, a range of systematic techniques has been designed to reduce error and waste. The first stage of these processes is to map out the whole system and its reliability at each stage. To date, however, studies of medication error and its solutions have concentrated on individual parts of the whole system. In this paper we conduct a systematic review of the literature in order to map out the medication system with its associated errors and failures in quality, to assess the strength of the evidence, and to use approaches from quality management to identify ways in which the system could be made safer.

    Methods: We mapped out the medicines management system in primary care in the UK. We conducted a systematic literature review in order to refine our map of the system and to establish the quality of the research and the reliability of the system.

    Results: The map demonstrated that the proportion of errors in the management system for medicines in primary care is very high. Several stages of the process had error rates of 50% or more: repeat prescribing reviews, interface prescribing and communication, and patient adherence. When the efficacy of the medicine is included in the system, the available evidence suggested that only between 4% and 21% of patients achieved the optimum benefit from their medication. Whilst there were some limitations in the evidence base, including the error rate measurement and the sampling strategies employed, there was sufficient information to indicate the ways in which the system could be improved using management approaches. The first step to improving the overall quality would be routine monitoring of adherence, clinical effectiveness and hospital admissions.

    Conclusion: By adopting a whole-system approach from a management perspective we have found where failures in quality occur in medication use in primary care in the UK, and where weaknesses occur in the associated evidence base. Quality management approaches have allowed us to develop a coherent change and research agenda in order to tackle these so far fairly intractable problems.
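    As a rough illustration of the whole-system arithmetic behind these figures, the Python sketch below multiplies per-stage reliabilities into an end-to-end success rate. The stage names and rates are hypothetical placeholders informed only by the percentages quoted above, not data from the review.

        # Illustrative sketch: how per-stage error rates in a medicines
        # management system compound into low end-to-end reliability.
        # Stage names and rates are hypothetical placeholders, loosely
        # based on the figures quoted in the abstract (several stages
        # were reported with error rates of 50% or more).

        stages = {
            "prescribing decision": 0.95,       # assumed per-stage reliability
            "repeat prescribing review": 0.50,  # abstract: error rate >= 50%
            "interface communication": 0.50,    # abstract: error rate >= 50%
            "dispensing": 0.98,                 # assumed
            "patient adherence": 0.50,          # abstract: error rate >= 50%
        }

        system_reliability = 1.0
        for stage, reliability in stages.items():
            system_reliability *= reliability
            print(f"{stage:28s} reliability {reliability:.2f} "
                  f"-> cumulative {system_reliability:.3f}")

        # With three stages at ~50% reliability, only ~12% of patients pass
        # through every stage correctly -- the same order of magnitude as
        # the 4-21% "optimum benefit" range reported in the abstract.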

    A mitochondria-targeted mass spectrometry probe to detect glyoxals: implications for diabetes

    The glycation of proteins and nucleic acids that occurs as a consequence of hyperglycaemia disrupts cell function and contributes to many pathologies, including those associated with diabetes and aging. Intracellular glycation occurs following the generation of the reactive 1,2-dicarbonyls methylglyoxal and glyoxal, and disruption to mitochondrial function is associated with hyperglycaemia. However, the contribution of these reactive dicarbonyls to mitochondrial damage in pathology is unclear, due to uncertainties about their levels within mitochondria in cells and in vivo. To address this we have developed a mitochondria-targeted reagent (MitoG) designed to assess the levels of mitochondrial dicarbonyls within cells. MitoG comprises a lipophilic triphenylphosphonium cationic function, which directs the molecule to mitochondria within cells, and an o-phenylenediamine moiety that reacts with dicarbonyls to give distinctive and stable products. The extent of accumulation of these diagnostic heterocyclic products can be readily and sensitively quantified by liquid chromatography-tandem mass spectrometry (LC-MS/MS), enabling changes to be determined. Using the MitoG-based analysis we assessed the formation of methylglyoxal and glyoxal in response to hyperglycaemia in cells in culture and in the Akita mouse model of diabetes in vivo. These findings indicated that the levels of methylglyoxal and glyoxal within mitochondria increase during hyperglycaemia in both cells and in vivo, suggesting that they can contribute to the pathological mitochondrial dysfunction that occurs in diabetes and aging.
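    For illustration, the short Python sketch below shows the generic internal-standard calibration arithmetic used in quantitative LC-MS/MS, applied to hypothetical MitoG-product peak areas. The numbers, and the assumption of a spiked internal standard, are placeholders rather than data from the paper.

        # Illustrative sketch only: generic internal-standard quantification
        # for LC-MS/MS, with made-up values standing in for MitoG-product
        # peak areas.  Nothing here is taken from the paper.

        import numpy as np

        # Calibration standards: known analyte amounts (pmol) spiked with a
        # fixed amount of internal standard; response = analyte/IS area ratio.
        amount = np.array([0.1, 0.5, 1.0, 5.0, 10.0])
        ratio  = np.array([0.021, 0.098, 0.205, 1.02, 2.01])

        slope, intercept = np.polyfit(amount, ratio, 1)  # linear calibration

        def quantify(sample_ratio):
            """Convert a measured analyte/IS area ratio to pmol of analyte."""
            return (sample_ratio - intercept) / slope

        print(f"sample at ratio 0.50 -> {quantify(0.50):.2f} pmol")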

    To wet or not to wet: that is the question

    Wetting transitions have been predicted and observed to occur for various combinations of fluids and surfaces. This paper describes the origin of such transitions, for liquid films on solid surfaces, in terms of the gas-surface interaction potentials V(r), which depend on the specific adsorption system. The transitions of light inert gases and H2 molecules on alkali metal surfaces have been explored extensively and are relatively well understood in terms of the least attractive adsorption interactions in nature. Much less thoroughly investigated are the wetting transitions of Hg, water, heavy inert gases and other molecular films. The basic idea is that nonwetting occurs, for energetic reasons, if the adsorption potential's well depth D is smaller than, or comparable to, the well depth of the adsorbate-adsorbate mutual interaction. At the wetting temperature Tw, the transition to wetting occurs, for entropic reasons, when the liquid's surface tension becomes small enough that the free energy cost of forming a thick film is compensated by the fluid-surface interaction energy. Guidelines useful for exploring wetting transitions of other systems are analyzed in terms of generic criteria involving the "simple model", which yields results in terms of gas-surface interaction parameters and thermodynamic properties of the bulk adsorbate. (Accepted for publication in J. Low Temp. Phys.)
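    A minimal numerical sketch of that energy balance, assuming a generic 3-9 gas-surface potential and reduced units: wetting is favoured roughly when the integrated fluid-surface attraction per unit area, rho_l * |integral of V(z) dz|, exceeds the cost 2*sigma of creating the two new interfaces. The functional form and all parameter values below are illustrative assumptions, not taken from the paper.

        # Hedged numerical sketch of the "simple model" balance described in
        # the abstract: a thick film forms when the fluid-surface attraction
        # outweighs the free-energy cost of two new interfaces, roughly
        #   rho_l * |integral of V(z) dz|  >  2 * sigma(T).
        # The 3-9 potential and all parameter values are assumptions.

        import numpy as np
        from scipy.integrate import quad

        def V(z, D=1.0, z0=1.0):
            """3-9 gas-surface potential, well depth D, minimum at 3**(1/6)*z0."""
            return 1.5 * np.sqrt(3) * D * ((z0 / z) ** 9 - (z0 / z) ** 3)

        z_min = 3 ** (1 / 6)              # position of the potential minimum
        I, _ = quad(V, z_min, np.inf)     # integrated attraction (negative)

        def wets(rho_l, sigma):
            """Crude wetting test: does the adsorption energy beat 2*sigma?"""
            return rho_l * abs(I) > 2.0 * sigma

        # Surface tension falls with temperature, so a film that does not
        # wet at low T (large sigma) can cross into wetting at higher T.
        for sigma in (1.0, 0.5, 0.2):     # hypothetical sigma(T) values
            print(f"sigma = {sigma:.1f}: wets = {wets(rho_l=1.0, sigma=sigma)}")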

    Complexity of case mix in a regional allergy service

    Background: Currently in the United Kingdom (UK) there is a mismatch between limited financial resources and the large proportion of patients with suspected allergies being referred to specialist allergy clinics. To better understand the case mix of patients being referred, we audited referrals to a regional allergy service over an 8-year period. The main source of data was consultant letters to General Practitioners (GPs) summarising the diagnosis of patients, archived from January 2002 to September 2009. Letters were reviewed, extracting the clinic date, doctor seen, gender, date of birth, postcode, GP, and diagnoses. Diagnoses were classified into seven groups and illustrative cases for each group were noted.

    Findings: Data from 2,028 new referrals with suspected allergy were analysed. The largest group of patients (43%) were diagnosed with a type I hypersensitivity. The other diagnostic groups were chronic idiopathic (spontaneous) urticaria (35%), suspected type I hypersensitivity but no allergen identified (8%), idiopathic (spontaneous) angioedema (8%), physical urticaria (2.5%), non-allergic symptoms (1.6%), type IV hypersensitivity (0.8%) and ACE inhibitor sensitivity (0.5%). Two thirds of the patients seen were female, with a higher percentage of female patients in the non-type I hypersensitivity group (71%) than in the type I hypersensitivity group (66%) (χ² = 5.1, 1 df, p = 0.024). The type I hypersensitivity patients were younger than the other patients (38 vs 46 years, t = -10.8, p < 0.001).

    Conclusions: This study highlights the complexity of specialist allergy practice and the large proportion of patients referred with non-type I hypersensitivities, chronic idiopathic (spontaneous) urticaria being by far the largest group. Such information is critical to inform commissioning decisions, define referral pathways and guide primary care education.
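    A small Python sketch of the reported group comparison, with counts reconstructed from the rounded percentages above, so the statistic only approximately matches the quoted χ² = 5.1:

        # Hedged sketch: re-running the reported comparison of female
        # proportion between type I and non-type I referrals.  The counts
        # are reconstructed from the rounded percentages in the abstract
        # (2,028 referrals, 43% type I, 66% vs 71% female), so the result
        # only approximately matches the reported chi2 = 5.1.

        from scipy.stats import chi2_contingency

        n_total = 2028
        n_type1 = round(0.43 * n_total)        # ~872 type I referrals
        n_other = n_total - n_type1            # ~1156 other diagnoses

        female_t1    = round(0.66 * n_type1)   # ~66% female in type I group
        female_other = round(0.71 * n_other)   # ~71% female elsewhere

        table = [
            [female_t1,    n_type1 - female_t1],
            [female_other, n_other - female_other],
        ]
        chi2, p, dof, _ = chi2_contingency(table)
        print(f"chi2 = {chi2:.1f}, {dof} df, p = {p:.3f}")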

    Direct-coupling analysis of residue co-evolution captures native contacts across many protein families

    The similarity in the three-dimensional structures of homologous proteins imposes strong constraints on their sequence variability. It has long been suggested that the resulting correlations among amino acid compositions at different sequence positions can be exploited to infer spatial contacts within the tertiary protein structure. Crucial to this inference is the ability to disentangle direct and indirect correlations, as accomplished by the recently introduced Direct Coupling Analysis (DCA) (Weigt et al. (2009) Proc Natl Acad Sci 106:67). Here we develop a computationally efficient implementation of DCA, which allows us to evaluate the accuracy of contact prediction by DCA for a large number of protein domains, based purely on sequence information. DCA is shown to yield a large number of correctly predicted contacts, recapitulating the global structure of the contact map for the majority of the protein domains examined. Furthermore, our analysis captures clear signals beyond intra-domain residue contacts, arising, e.g., from alternative protein conformations, ligand-mediated residue couplings, and inter-domain interactions in protein oligomers. Our findings suggest that contacts predicted by DCA can be used as a reliable guide to facilitate computational predictions of alternative protein conformations, protein complex formation, and even the de novo prediction of protein domain structures, provided that a large number of homologous sequences is available, as is increasingly the case thanks to advances in genome sequencing. (To appear in PNAS.)
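    A compact, illustrative reimplementation of the mean-field DCA recipe is sketched below (Python, numpy only). It is not the authors' code: sequence reweighting and the paper's direct-information score are omitted, and position pairs are ranked instead by an APC-corrected Frobenius norm of the inferred couplings.

        # Minimal mean-field DCA sketch: estimate single- and pair-site
        # frequencies from an alignment, regularise with a pseudocount, and
        # invert the connected-correlation matrix to separate direct from
        # indirect couplings.  Illustrative only; simplified relative to the
        # method described in the abstract.

        import numpy as np

        def mf_dca(msa, q=21, lam=0.5):
            """msa: (N, L) integer array with entries in [0, q).
            Returns an (L, L) matrix of contact scores."""
            N, L = msa.shape
            # One-hot encode, dropping the last state so C can be inverted.
            X = np.zeros((N, L, q - 1))
            for a in range(q - 1):
                X[:, :, a] = (msa == a)
            X = X.reshape(N, L * (q - 1))

            # Pseudocount-regularised frequencies and connected correlations.
            fi = (1 - lam) * X.mean(axis=0) + lam / q
            fij = (1 - lam) * (X.T @ X) / N + lam / q ** 2
            C = fij - np.outer(fi, fi)
            C += (lam / q) * np.eye(L * (q - 1))   # ridge for stability

            J = -np.linalg.inv(C)                  # mean-field couplings

            # Frobenius norm per position pair + average-product correction.
            J = J.reshape(L, q - 1, L, q - 1)
            F = np.sqrt((J ** 2).sum(axis=(1, 3)))
            np.fill_diagonal(F, 0.0)
            apc = np.outer(F.mean(axis=1), F.mean(axis=0)) / F.mean()
            return F - apc                         # high score ~ predicted contact

        # Toy usage: 3 sequences, 5 positions.
        msa = np.array([[0, 1, 2, 3, 4],
                        [0, 1, 2, 3, 3],
                        [1, 1, 2, 0, 4]])
        print(mf_dca(msa).round(2))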

    Combining Experiments and Simulations Using the Maximum Entropy Principle

    A key component of computational biology is to compare the results of computer modelling with experimental measurements. Despite substantial progress in the models and algorithms used in many areas of computational biology, such comparisons sometimes reveal that the computations are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy applications in our field has grown steadily in recent years, in areas as diverse as sequence analysis, structural modelling, and neurobiology. In this Perspectives article, we give a broad introduction to the method, in an attempt to encourage its further adoption. The general procedure is explained in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results that are not in complete quantitative accordance with experiments. A common solution to this problem is to explicitly ensure agreement between the two by perturbing the potential energy function towards the experimental data. So far, a general consensus on how such perturbations should be implemented has been lacking. Three very recent papers have explored this problem using the maximum entropy approach, providing both new theoretical and practical insights. We highlight each of these contributions in turn and conclude with a discussion of the remaining challenges.
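    The core of the maximum entropy reweighting idea fits in a few lines: given simulation snapshots whose average observable disagrees with an experimental value, find the minimally perturbed weights w_i proportional to exp(-lambda * s_i) whose reweighted average matches experiment. The Python sketch below uses synthetic data and follows the generic recipe, not any one of the three highlighted papers.

        # Minimal maximum-entropy reweighting sketch with synthetic data:
        # solve for the single Lagrange multiplier lambda such that the
        # reweighted ensemble average matches the "experimental" target.

        import numpy as np
        from scipy.optimize import brentq

        rng = np.random.default_rng(0)
        s = rng.normal(loc=5.0, scale=1.0, size=10_000)  # observable per frame
        s_exp = 4.6                                      # target average (made up)

        def reweighted_mean(lam):
            w = np.exp(-lam * (s - s.mean()))            # shift for stability
            w /= w.sum()
            return np.sum(w * s)

        # Root-find the constraint <s>_lambda = s_exp.
        lam = brentq(lambda l: reweighted_mean(l) - s_exp, -10, 10)
        w = np.exp(-lam * (s - s.mean()))
        w /= w.sum()

        print(f"lambda = {lam:.3f}")
        print(f"original mean = {s.mean():.3f}, reweighted = {np.sum(w * s):.3f}")
        # The relative entropy to uniform weights measures how strongly the
        # original ensemble had to be perturbed to satisfy the constraint.
        print(f"KL divergence = {np.sum(w * np.log(w * len(s))):.4f}")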