
    Clinical deterioration during antituberculosis treatment in Africa: Incidence, causes and risk factors

    BACKGROUND: HIV-1 and Mycobacterium tuberculosis cause substantial morbidity and mortality. Despite the availability of antiretroviral and antituberculosis treatment in Africa, clinical deterioration during antituberculosis treatment remains a frequent reason for hospital admission. We therefore determined the incidence, causes and risk factors for clinical deterioration. METHODS: Prospective cohort study of 292 adults who initiated antituberculosis treatment during a 3-month period. We evaluated those with clinical deterioration over the following 24 weeks of treatment. RESULTS: Seventy-one percent (209/292) of patients were HIV-1 infected (median CD4+ count: 129 cells/μL [IQR: 62-277]). At tuberculosis diagnosis, 23% (34/145) of HIV-1 infected patients qualifying for antiretroviral treatment (ART) were receiving ART; 6 months later, 75% (109/145) had received ART. Within 24 weeks of initiating antituberculosis treatment, 40% (117/292) of patients experienced clinical deterioration due to co-morbid illness (n = 70), tuberculosis-related illness (n = 47), non-AIDS-defining HIV-1 related infection (n = 25) and AIDS-defining illness (n = 21). Using HIV-1 uninfected patients as the referent group, HIV-1 infected patients had an increasing risk of clinical deterioration as CD4+ counts decreased [CD4+ >350 cells/μL: RR = 1.4, 95% CI = 0.7-2.9; CD4+ 200-350 cells/μL: RR = 2.0, 95% CI = 1.1-3.6; CD4+ <200 cells/μL: RR = 3.0, 95% CI = 1.9-4.7]. During follow-up, 26% (30/117) of patients with clinical deterioration required hospital admission and 15% (17/117) died. Fifteen deaths were in HIV-1 infected patients with a CD4+ count <200 cells/μL. CONCLUSIONS: In multivariate analysis, HIV-1 infection and a low CD4+ count at tuberculosis diagnosis were significant risk factors for clinical deterioration and death. The initiation of ART at a CD4+ count of <350 cells/μL will likely reduce the high burden of clinical deterioration.
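
    The stratified estimates above are relative risks with Wald-type confidence intervals. The sketch below shows how such an unadjusted estimate is computed from 2×2 counts; the counts used are hypothetical, chosen only for illustration, and are not taken from the study.

```python
import math

def relative_risk(events_exposed, n_exposed, events_ref, n_ref):
    """Relative risk of an outcome in an exposed stratum (e.g. a CD4+ band)
    versus the referent group, with a Wald 95% confidence interval."""
    risk_exp = events_exposed / n_exposed
    risk_ref = events_ref / n_ref
    rr = risk_exp / risk_ref
    # Standard error of log(RR), Wald (Katz) approximation
    se_log_rr = math.sqrt(
        1 / events_exposed - 1 / n_exposed + 1 / events_ref - 1 / n_ref
    )
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return rr, (lo, hi)

# Hypothetical counts, purely illustrative (not the study's data):
print(relative_risk(events_exposed=45, n_exposed=75, events_ref=17, n_ref=83))
```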

    Quality of medication use in primary care - mapping the problem, working to a solution: a systematic review of the literature

    Background: The UK, USA and the World Health Organization have identified improved patient safety in healthcare as a priority. Medication error has been identified as one of the most frequent forms of medical error and is associated with significant medical harm. Errors are the result of the systems that produce them. In industrial settings, a range of systematic techniques have been designed to reduce error and waste; the first stage of these processes is to map out the whole system and its reliability at each stage. To date, however, studies of medication error and solutions have concentrated on individual parts of the whole system. In this paper we set out to conduct a systematic review of the literature in order to map out the medication system with its associated errors and failures in quality, to assess the strength of the evidence, and to use approaches from quality management to identify ways in which the system could be made safer. Methods: We mapped out the medicines management system in primary care in the UK. We conducted a systematic literature review in order to refine our map of the system and to establish the quality of the research and the reliability of the system. Results: The map demonstrated that the proportion of errors in the management system for medicines in primary care is very high. Several stages of the process had error rates of 50% or more: repeat prescribing reviews; interface prescribing and communication; and patient adherence. When the efficacy of the medicine was included in the system, the available evidence suggested that only between 4% and 21% of patients achieved the optimum benefit from their medication. Whilst there were some limitations in the evidence base, including the error rate measurement and the sampling strategies employed, there was sufficient information to indicate the ways in which the system could be improved using management approaches. The first step to improving the overall quality would be routine monitoring of adherence, clinical effectiveness and hospital admissions. Conclusion: By adopting the whole-system approach from a management perspective we have found where failures in quality occur in medication use in primary care in the UK, and where weaknesses occur in the associated evidence base. Quality management approaches have allowed us to develop a coherent change and research agenda in order to tackle these, so far, fairly intractable problems.
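
    The whole-system framing implies that the chance of a patient obtaining the optimum benefit is roughly the product of the reliabilities of each stage of the medicines-management chain. A minimal sketch of that reasoning follows; the per-stage reliabilities are made up for illustration (the review's actual stage-level figures differ) and stages are assumed independent.

```python
from math import prod

# Hypothetical per-stage reliabilities (fraction of patients passing each
# stage without failure); illustrative only, not the review's measured rates.
stage_reliability = {
    "prescribing decision": 0.95,
    "repeat prescribing review": 0.50,
    "interface prescribing and communication": 0.50,
    "dispensing": 0.97,
    "patient adherence": 0.50,
    "clinical effectiveness of the medicine": 0.60,
}

# Treating stage failures as independent, the overall probability of optimum
# benefit is the product of the stage reliabilities.
overall = prod(stage_reliability.values())
print(f"Fraction of patients achieving optimum benefit: {overall:.1%}")  # ~7%
```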

    Travel-related schistosomiasis, strongyloidiasis, filariasis, and toxocariasis: the risk of infection and the diagnostic relevance of blood eosinophilia

    Background: This study prospectively assessed the occurrence of clinical and subclinical schistosomiasis, strongyloidiasis, filariasis, and toxocariasis, and the screening value of eosinophilia in adult short-term travelers to helminth-endemic countries. Methods: Visitors of a pre-travel health advice centre donated blood samples for serology and blood cell count before and after travel. Samples were tested for eosinophilia, and for antibodies against schistosomiasis, strongyloidiasis, filariasis, and toxocariasis. Previous infection was defined as seropositivity in pre- and post-travel samples. Recent infection was defined as a seroconversion. Symptoms of parasitic disease were recorded in a structured diary. Results: Previous infection was found in 112 of 1207 subjects: schistosomiasis in 2.7%, strongyloidiasis in 2.4%, filariasis in 3.4%, and toxocariasis in 1.8%. Recent schistosomiasis was found in 0.51% of susceptible subjects at risk, strongyloidiasis in 0.25%, filariasis in 0.09%, and toxocariasis in 0.08%. The incidence rates per 1000 person-months were 6.4, 3.2, 1.1, and 1.1, respectively. Recent infections were largely contracted in Asia. The positive predictive value of eosinophilia for diagnosis was 15% for previous infection and 0% for recent infection. None of the symptoms studied had any positive predictive value. Conclusion: The chance of infection with schistosomiasis, strongyloidiasis, filariasis, and toxocariasis during one short-term journey to an endemic area is low. However, previous stay leads to a cumulative risk of infection. Testing for eosinophilia appeared to be of no value in routine screening of asymptomatic travelers for the four helminthic infections. Findings need to be replicated in larger prospective studies.
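
    Two of the quantities reported above, the positive predictive value of eosinophilia and the incidence rate per 1000 person-months, follow directly from simple counts. The sketch below shows the arithmetic with hypothetical numbers chosen only to illustrate the calculation; they are not the study's raw data.

```python
def positive_predictive_value(true_positives, false_positives):
    """PPV of a screening test (here, eosinophilia): of all subjects testing
    positive, the fraction who truly have the infection."""
    return true_positives / (true_positives + false_positives)

def incidence_rate_per_1000_pm(new_cases, person_months):
    """Incidence rate expressed per 1000 person-months of travel exposure."""
    return 1000 * new_cases / person_months

# Hypothetical counts, illustrative only:
print(positive_predictive_value(true_positives=3, false_positives=17))   # 0.15
print(incidence_rate_per_1000_pm(new_cases=6, person_months=940))        # ~6.4
```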

    Fluorosis risk from early exposure to fluoride toothpaste

    Swallowed fluoride toothpaste in the early years of life has been postulated to be a risk factor for fluorosis, but the epidemiological evidence is weakened by the fact that most of the relevant studies were done in developed countries where an individual is exposed to multiple sources of fluoride. Objectives: To quantify the risk of fluorosis from fluoride toothpaste in a population whose only potential source of fluoride was fluoride toothpaste. Methods: Case-control analyses were conducted to test the hypothesis that fluoride toothpaste use before the age of 6 years increased an individual's risk of fluorosis. Data came from a cross-sectional clinical dental examination of schoolchildren and a self-administered questionnaire to their parents. The study was conducted in Goa, India. The study group consisted of 1189 seventh grade children with a mean age of 12.2 years. Results: The prevalence of fluorosis was 12.9% using the TF index. Results of the crude, stratified, and logistic regression analyses showed that use of fluoride toothpaste before the age of 6 years was a risk indicator for fluorosis (OR 1.83, 95% CI 1.05–3.15). Among children with fluorosis, beginning brushing before the age of 2 years increased the severity of fluorosis significantly (P < 0.001). Other factors associated with the use of fluoride toothpaste, such as eating or swallowing fluoride toothpaste and higher frequency of use, did not show a statistically significant increased risk for prevalence or severity of fluorosis. Conclusions: Fluoride toothpaste use before the age of 6 years is a risk indicator for fluorosis in this study population.
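
    The reported OR of 1.83 came from logistic regression (i.e. adjusted). For orientation, the sketch below shows how an unadjusted odds ratio and its Wald 95% CI are obtained from a 2×2 table; the cell counts are hypothetical and not the study's data.

```python
import math

def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio for a 2x2 table with a Wald 95% CI.
    a: exposed cases, b: exposed non-cases, c: unexposed cases, d: unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se_log_or)
    hi = math.exp(math.log(or_) + 1.96 * se_log_or)
    return or_, (lo, hi)

# Hypothetical counts: fluorosis vs. no fluorosis by early toothpaste use
print(odds_ratio(a=120, b=680, c=33, d=356))
```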

    Levels of diphtheria and tetanus specific IgG of Portuguese adult women, before and after vaccination with adult type Td. Duration of immunity following vaccination

    Background: The need for tetanus toxoid decennial booster doses has been questioned by some experts. Several counter-arguments have been presented, supporting the maintenance of decennial adult booster doses with tetanus and diphtheria toxoids (adult formulation of the vaccine: Td). This study aimed to evaluate the use of Td in Portuguese adult women under routine conditions. For that purpose we selected a group of women 30+ years of age to whom vaccination was recommended. We investigated whether pre-vaccination antibody concentrations were associated with factors such as age at first and last vaccination, number of doses and time since last revaccination. We also intended to assess the serological efficacy of the Td booster. Methods: Following the Portuguese guidelines, 100 women were vaccinated with Td. Antitetanus toxin IgG (ATT IgG) and antidiphtheria toxin IgG (ADT IgG) levels were measured (mIU/ml) in 100 pre-vaccination and 91 post-vaccination sera. Detailed vaccination records were available from 88 participants. Results: Twenty-two women (Group A) began vaccination with DPT/DT in their early childhood, and their pre-vaccination ATT IgG levels increased with the number of doses received (p = 0.022) and decreased with time since last vaccination (p = 0.016). Among the 66 women who began vaccination in adolescence and adulthood (Group B), with monovalent TT, ATT IgG levels decreased with age at first dose (p < 0.001) and with time since last vaccination (p = 0.041). In Group A, antidiphtheria toxin IgG kinetics was very similar to that observed for ATT IgG. Among women not vaccinated with diphtheria toxoid, ADT IgG levels decreased with age. Serological response to both components of Td was good but more pronounced for ATT IgG. Conclusion: Our study suggests that, to protect against tetanus, there is no need to administer decennial boosters to Portuguese adults who have complied with the childhood/adolescent schedule (6 doses of tetanus toxoid). The adult booster intervals could be wider, probably of the order of 20 years. This also seems to apply to protection against diphtheria, but issues around herd immunity and the circulation of toxigenic strains need to be better understood.

    Optimal Compensation for Temporal Uncertainty in Movement Planning

    Motor control requires the generation of a precise temporal sequence of control signals sent to the skeletal musculature. We describe an experiment that, for good performance, requires human subjects to plan movements taking into account uncertainty in their movement duration and the increase in that uncertainty with increasing movement duration. We do this by rewarding movements performed within a specified time window, and penalizing slower movements in some conditions and faster movements in others. Our results indicate that subjects compensated for their natural duration-dependent temporal uncertainty as well as an overall increase in temporal uncertainty that was imposed experimentally. Their compensation for temporal uncertainty, both the natural duration-dependent and imposed overall components, was nearly optimal in the sense of maximizing expected gain in the task. The motor system is able to model its temporal uncertainty and compensate for that uncertainty so as to optimize the consequences of movement.
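
    The optimality benchmark here is maximization of expected gain over the planned movement duration, given timing noise that grows with duration. Below is a minimal sketch of that computation under assumed parameters: a Gaussian timing-noise model whose SD grows linearly with duration, and a made-up reward window and lateness penalty. None of these values are the experiment's actual parameters.

```python
import numpy as np
from scipy.stats import norm

def expected_gain(planned_t, reward_window, late_penalty, k_noise=0.1, extra_sd=0.0):
    """Expected gain for a planned duration, assuming the realized duration is
    Gaussian around the plan with SD growing linearly with duration
    (plus any experimentally imposed extra uncertainty)."""
    sd = k_noise * planned_t + extra_sd
    lo, hi = reward_window
    p_reward = norm.cdf(hi, planned_t, sd) - norm.cdf(lo, planned_t, sd)
    p_late = 1.0 - norm.cdf(hi, planned_t, sd)        # slower than the window
    return p_reward * 1.0 + p_late * late_penalty     # reward = +1, penalty <= 0

# Hypothetical task: rewarded window 600-700 ms, slow movements penalized.
candidates = np.linspace(0.4, 0.8, 401)
gains = [expected_gain(t, (0.6, 0.7), late_penalty=-1.0) for t in candidates]
best = candidates[int(np.argmax(gains))]
print(f"Optimal planned duration: {best * 1000:.0f} ms")
```

    With a penalty on late movements, the maximizing plan shifts earlier within the rewarded window, and it shifts further as the imposed extra uncertainty (extra_sd) increases, which is the compensation pattern the study tests for.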

    Vertical Binocular Disparity is Encoded Implicitly within a Model Neuronal Population Tuned to Horizontal Disparity and Orientation

    Primary visual cortex is often viewed as a “cyclopean retina”, performing the initial encoding of binocular disparities between left and right images. Because the eyes are set apart horizontally in the head, binocular disparities are predominantly horizontal. Yet, especially in the visual periphery, a range of non-zero vertical disparities do occur and can influence perception. It has therefore been assumed that primary visual cortex must contain neurons tuned to a range of vertical disparities. Here, I show that this is not necessarily the case. Many disparity-selective neurons are most sensitive to changes in disparity orthogonal to their preferred orientation. That is, the disparity tuning surfaces, mapping their response to different two-dimensional (2D) disparities, are elongated along the cell's preferred orientation. Because of this, even if a neuron's optimal 2D disparity has zero vertical component, the neuron will still respond best to a non-zero vertical disparity when probed with a sub-optimal horizontal disparity. This property can be used to decode 2D disparity, even allowing for realistic levels of neuronal noise. Even if all V1 neurons at a particular retinotopic location are tuned to the expected vertical disparity there (for example, zero at the fovea), the brain could still decode the magnitude and sign of departures from that expected value. This provides an intriguing counter-example to the common wisdom that, in order for a neuronal population to encode a quantity, its members must be tuned to a range of values of that quantity. It demonstrates that populations of disparity-selective neurons encode much richer information than previously appreciated. It suggests a possible strategy for the brain to extract rarely occurring stimulus values, while concentrating neuronal resources on the most commonly occurring situations.
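
    The decoding claim can be made concrete with a toy population model: each unit has a 2D Gaussian disparity tuning surface elongated along its preferred orientation, every tuning peak sits at zero vertical disparity, and the stimulus's 2D disparity is read out by template matching over the noisy population response. This is a hedged sketch with assumed parameters, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def tuning(dx, dy, pref_dx, ori, sigma_long=0.4, sigma_short=0.1):
    """Mean response of a model unit whose tuning peak is at (pref_dx, 0) and
    whose Gaussian tuning surface is elongated along its preferred orientation
    ori (radians)."""
    u = np.cos(ori) * (dx - pref_dx) + np.sin(ori) * dy    # long axis
    v = -np.sin(ori) * (dx - pref_dx) + np.cos(ori) * dy   # short axis
    return np.exp(-0.5 * ((u / sigma_long) ** 2 + (v / sigma_short) ** 2))

# Population: preferred horizontal disparities crossed with preferred orientations;
# every unit's preferred vertical disparity is zero.
prefs = [(pdx, ori)
         for pdx in np.linspace(-0.5, 0.5, 11)
         for ori in np.linspace(0, np.pi, 8, endpoint=False)]

def population_response(dx, dy, noise_sd=0.05):
    clean = np.array([tuning(dx, dy, pdx, ori) for pdx, ori in prefs])
    return clean + rng.normal(0, noise_sd, clean.shape)    # additive noise

def decode(resp, grid=np.linspace(-0.5, 0.5, 41)):
    """Template matching over a 2D disparity grid: return the (dx, dy) whose
    noiseless population response best matches the observed response."""
    best, best_err = None, np.inf
    for dx in grid:
        for dy in grid:
            template = np.array([tuning(dx, dy, pdx, ori) for pdx, ori in prefs])
            err = np.sum((resp - template) ** 2)
            if err < best_err:
                best, best_err = (dx, dy), err
    return best

# A stimulus with non-zero vertical disparity is still recoverable, even though
# no unit in the population prefers a non-zero vertical disparity.
print(decode(population_response(dx=0.1, dy=0.15)))
```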

    Physics of Neutron Star Crusts

    The physics of neutron star crusts is vast, involving many different research fields, from nuclear and condensed matter physics to general relativity. This review summarizes the progress achieved over the last few years in modeling neutron star crusts, both at the microscopic and macroscopic levels. The confrontation of these theoretical models with observations is also briefly discussed. (182 pages; published version available at http://www.livingreviews.org/lrr-2008-10)

    Adverse Drug Reactions in Hospital In-Patients: A Prospective Analysis of 3695 Patient-Episodes

    Adverse drug reactions (ADRs) are a major cause of hospital admissions, but recent data on the incidence and clinical characteristics of ADRs that occur following hospital admission are lacking. Patients admitted to twelve wards over a six-month period in 2005 were assessed for ADRs throughout their admission. Suspected ADRs were recorded and analysed for causality, severity and avoidability, and for whether they increased the length of stay. Multivariable analysis was undertaken to identify the risk factors for ADRs; the 5% significance level was used when assessing factors for inclusion in multivariable models. Of the 3695 patient episodes assessed for ADRs, 545 (14.7%, 95% CI 13.6–15.9%) experienced one or more ADRs. Half of the ADRs were definitely or possibly avoidable. The patients experiencing ADRs were more likely to be older, female and taking a larger number of medicines, and had a longer length of stay than those without ADRs. However, the only significant predictor of ADRs from the multivariable analysis of a representative sample of patients was the number of medicines taken by the patient, with each additional medication multiplying the hazard of an ADR episode by 1.14 (95% CI 1.09, 1.20). ADRs directly increased length of stay in 147 (26.8%) patients. The drugs most frequently associated with ADRs were diuretics, opioid analgesics, and anticoagulants. In conclusion, approximately one in seven hospital in-patients experience an ADR, which is a significant cause of morbidity, increasing the length of stay of patients by an average of 0.25 days per patient admission episode. The overall burden of ADRs on hospitals is high, and effective intervention strategies are urgently needed to reduce this burden.
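
    The multivariable result implies a multiplicative effect of polypharmacy: each extra medicine scales the ADR hazard by 1.14, so the relative hazard for a patient on n medicines versus a reference count is 1.14 raised to the difference. A quick illustration, using only the reported hazard ratio and hypothetical medicine counts:

```python
HR_PER_MEDICINE = 1.14   # reported hazard ratio per additional medicine

def relative_hazard(n_medicines, n_reference=0):
    """Hazard of an ADR episode relative to a patient taking n_reference
    medicines, assuming the per-medicine effect is multiplicative as in the
    fitted model."""
    return HR_PER_MEDICINE ** (n_medicines - n_reference)

# e.g. a patient on 10 medicines vs. one on 5 (hypothetical counts):
print(f"{relative_hazard(10, 5):.2f}x the hazard")   # 1.14**5 ≈ 1.93
```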