
    Fluoroquinolones and isoniazid-resistant tuberculosis: implications for the 2018 WHO guidance.

    INTRODUCTION: The 2018 World Health Organization (WHO) guidelines for the treatment of isoniazid (H)-resistant (Hr) tuberculosis recommend a four-drug regimen: rifampicin (R), ethambutol (E), pyrazinamide (Z) and levofloxacin (Lfx), with or without H ([H]RZE-Lfx). This regimen is used once Hr is known, such that patients complete 6 months of Lfx (≥6[H]RZE-6Lfx). This cohort study assessed the impact of fluoroquinolones (Fq) on treatment effectiveness, accounting for Hr mutations and the degree of phenotypic resistance. METHODS: This was a retrospective cohort study of 626 Hr tuberculosis patients notified in London in 2009-2013. Regimens were described, and logistic regression was undertaken to assess the association between regimen and negative regimen-specific outcomes (broadly, death due to tuberculosis, treatment failure or disease recurrence). RESULTS: Of 594 individuals with regimen information, 330 (55.6%) were treated with (H)RfZE (Rf=rifamycins) and 211 (35.5%) with (H)RfZE-Fq. The median overall treatment period was 11.9 months and the median Z duration 2.1 months. In a univariable logistic regression model comparing (H)RfZE with and without Fqs, there was no difference in the odds of a negative regimen-specific outcome (baseline (H)RfZE; cluster-specific odds ratio 1.05, 95% CI 0.60-1.82, p=0.87; clustered by NHS trust). Results varied minimally in a multivariable model. This odds ratio dropped (0.57, 95% CI 0.14-2.28) when Hr genotype was included, but this analysis lacked power (p=0.42). CONCLUSIONS: In a high-income setting, we found a 12-month (H)RfZE regimen with a short Z duration to be similarly effective for Hr tuberculosis with or without a Fq. This regimen may result in fewer adverse events than the WHO recommendations.
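For a univariable comparison with a binary exposure, as in the (H)RfZE with/without Fq analysis above, the logistic regression odds ratio reduces to the cross-product ratio of a 2x2 table. A minimal sketch of that calculation, with a Woolf (log-scale) confidence interval; the counts used here are hypothetical, not the study's data, and the sketch ignores the clustering by NHS trust that the study accounted for:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)                 # cross-product ratio
    se = sqrt(1/a + 1/b + 1/c + 1/d)        # SE of log(OR), Woolf method
    lo = exp(log(or_) - z * se)
    hi = exp(log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: negative outcomes with vs. without a fluoroquinolone
or_, lo, hi = odds_ratio_ci(12, 199, 20, 310)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A confidence interval straddling 1, as here, corresponds to the "no difference in the odds" finding reported above.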

    Running away experience and psychoactive substance use among adolescents in Taiwan: multi-city street outreach survey

    BACKGROUND: This study aimed to examine: 1) the relationship between being a runaway, the time since the first absconding event, and adolescent substance use; 2) whether different kinds of psychoactive substances have a different temporal relationship to the first absconding event; and 3) whether the various reasons for the first absconding event are associated with different risks of substance use. METHODS: Participants were drawn from the 2004-2006 nationwide outreach programs across 26 cities/towns in Taiwan. A total of 17,133 participants, aged 12-18 years, who completed an anonymous questionnaire on their experience of running away and substance use and who were now living with their families, were included in the analysis. RESULTS: The lifetime risk of tobacco, alcohol, betel nut, and illegal drug/inhalant use increased steadily from adolescents who had experienced a trial runaway episode (one time lasting ≤ 1 day) to those with extended runaway experience (≥ 2 times or lasting > 1 day), when compared to those who had never run away. Adolescents whose first running away experience was > 6 months previously had a greater risk of betel nut or illegal drug/inhalant use over the past 6 months than those with a similar experience within the last 6 months. Both alcohol and tobacco use were most frequently initiated before the first running away, whereas both betel nut and illegal drug/inhalant use were most frequently initiated after this event. When adolescents who were fleeing an unsatisfactory home life were compared to those who ran away for excitement, the risk of alcohol use was similar, but the former tended to have a higher risk of tobacco, betel nut, and illegal drug/inhalant use. CONCLUSIONS: More significant running away and a longer time since the first absconding experience were associated with more advanced substance involvement among adolescents now living in a family setting. Once adolescents had left home, they developed additional psychoactive substance problems, regardless of their reasons for running away. These findings have implications for caregivers, teachers, and healthcare workers when trying to prevent and/or intervene in adolescent substance use.

    A systematic review of the effectiveness and cost-effectiveness of peer education and peer support in prisons.

    BACKGROUND: Prisoners experience significantly worse health than the general population. This review examines the effectiveness and cost-effectiveness of peer interventions in prison settings. METHODS: A mixed-methods systematic review of effectiveness and cost-effectiveness studies, including qualitative and quantitative synthesis, was conducted. Nineteen electronic databases were searched from 1985 to 2012, in addition to website searches and identified grey literature. Study selection criteria were: POPULATION: prisoners resident in adult prisons and children resident in Young Offender Institutions (YOIs); INTERVENTION: peer-based interventions; COMPARATORS: review questions 3 and 4 compared peer-led and professionally led approaches; OUTCOMES: prisoner health or determinants of health, organisational/process outcomes, and views of prison populations; STUDY DESIGNS: quantitative, qualitative and mixed-method evaluations. RESULTS: Fifty-seven studies were included in the effectiveness review and one study in the cost-effectiveness review; most were of poor methodological quality. Evidence suggested that peer education interventions are effective at reducing risky behaviours, and that peer support services are acceptable within the prison environment and have a positive effect on recipients, practically or emotionally. Consistent evidence from many, predominantly qualitative, studies suggested that being a peer deliverer was associated with positive effects. There was little evidence on the cost-effectiveness of peer-based interventions. CONCLUSIONS: There is consistent evidence from a large number of studies that being a peer worker is associated with positive health; peer support services are also an acceptable source of help within the prison environment and can have a positive effect on recipients. Research into cost-effectiveness is sparse. SYSTEMATIC REVIEW REGISTRATION: PROSPERO ref: CRD42012002349.

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background: A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise the description of findings and reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets. Methods: Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall's tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis. Results: A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001). Conclusion: We have shown that an operative difficulty scale can standardise the description of operative findings by surgeons of multiple grades, facilitating audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
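An AUROC for an ordinal predictor such as a difficulty grade can be computed without fitting a curve at all, via the Mann-Whitney U statistic: the probability that a randomly chosen positive case receives a higher grade than a randomly chosen negative case, counting ties as one half. A minimal sketch; the grades and outcomes below are hypothetical, not the CholeS data:

```python
def auroc(scores, labels):
    """Rank-based AUROC: P(score_pos > score_neg), ties count 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical data: difficulty grades (1-5) and conversion to open (0/1)
grades    = [1, 1, 2, 2, 3, 3, 4, 4, 5, 5]
converted = [0, 0, 0, 0, 0, 1, 0, 1, 1, 1]
print(auroc(grades, converted))
```

An AUROC near 0.9, as reported above for conversion to open surgery, means a patient who was converted almost always carried a higher difficulty grade than one who was not.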

    Perceptions of eye health in schools in Pakistan

    BACKGROUND: Research exploring children's and their teachers' perceptions of eye health is lacking. This paper reports for the first time on the perceptions of primary schoolchildren and their teachers of healthy and diseased eyes, things that keep eyes healthy or damage them, and what actions should be taken in case of an eye injury. METHODS: Using the draw-and-write technique, 160 boys and girls (9–12 years old) attending four primary schools in Abbottabad district, northern Pakistan, were invited to draw pictures in response to a set of semi-structured questions and then label them. Sixteen teachers who were currently teaching the selected students were interviewed one-on-one. RESULTS: Analysis of the text accompanying 800 drawings and of the interview scripts revealed that most children and teachers perceived healthy eyes to be those which could see well, and diseased eyes to be those which have redness, watering, dirty discharge, pain, and itching, or those which have "weak eyesight" and blindness. Things that students and teachers thought damage the eyes included the sun, television, and sharp pointed objects, particularly pencils. Teachers noted that children with eye problems "have difficulty seeing the blackboard well", "screw up their eyes", and "hold their books too close". CONCLUSION: We conclude that schoolchildren and their teachers had a good knowledge of eye health, but many of them held serious misconceptions, e.g., that the use of kohl, medicines and eye drops keeps eyes healthy. Kohl is an important source of lead and can reduce children's intelligence even at low blood lead levels. Health education in schools must take into account children's existing knowledge of and misconceptions about various aspects of eye health. Such steps, if taken, could improve the relevance of eye health education to schoolchildren.

    Re-Evaluation of the Action Potential Upstroke Velocity as a Measure of the Na+ Current in Cardiac Myocytes at Physiological Conditions

    Background: The SCN5A-encoded sodium current (INa) generates the action potential (AP) upstroke and is a major determinant of AP characteristics and AP propagation in cardiac myocytes. Unfortunately, in cardiac myocytes, investigation of the kinetic properties of INa with near-physiological ion concentrations and temperature is technically challenging due to the large amplitude and rapidly activating nature of INa, which may seriously hamper the quality of voltage control over the membrane. We hypothesized that the alternating voltage clamp-current clamp (VC/CC) technique might provide an alternative to the traditional voltage clamp (VC) technique for the determination of INa properties under physiological conditions. Principal Findings: We studied INa under close-to-physiological conditions by the VC technique in SCN5A cDNA-transfected HEK cells, or by the alternating VC/CC technique in both SCN5A cDNA-transfected HEK cells and rabbit left ventricular myocytes. In these experiments, peak INa during a depolarizing VC step, or maximal upstroke velocity (dV/dtmax) during VC/CC, served as an indicator of available INa. In HEK cells, the biophysical properties of INa, including current density, voltage-dependent (in)activation, development of inactivation, and recovery from inactivation, were highly similar in VC and VC/CC experiments. As an application of the VC/CC technique, we studied INa in left ventricular myocytes isolated from control or failing rabbit hearts.
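The dV/dtmax readout used above is, in practice, the peak of the numerical time derivative of the recorded membrane potential during the upstroke. A minimal numpy sketch on a synthetic (hypothetical) sigmoidal upstroke, not real recorded data; the time constant and sampling rate are illustrative assumptions:

```python
import numpy as np

# Hypothetical AP upstroke: sigmoid from -85 mV to +30 mV, sampled at 50 kHz
dt = 2e-5                                            # s per sample (50 kHz)
t = np.arange(0.0, 4e-3, dt)                         # 4 ms window
V = -85.0 + 115.0 / (1.0 + np.exp(-(t - 2e-3) / 1e-4))   # membrane potential, mV

# Central-difference derivative, converted from mV/s to the customary V/s
dVdt = np.gradient(V, dt) / 1000.0
dVdt_max = dVdt.max()
print(f"dV/dt_max = {dVdt_max:.0f} V/s")
```

For this synthetic trace the peak slope lands in the range of a few hundred V/s, the order of magnitude typically reported for ventricular myocytes; at slower sampling rates the central difference increasingly underestimates the true peak slope, which is one reason high sampling rates matter for this measurement.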

    TILLING - a shortcut in functional genomics

    Recent advances in large-scale genome sequencing projects have opened up new possibilities for the application of conventional mutation techniques in not only forward but also reverse genetics strategies. TILLING (Targeting Induced Local Lesions IN Genomes) was developed a decade ago as an alternative to insertional mutagenesis. It takes advantage of classical mutagenesis, sequence availability and high-throughput screening for nucleotide polymorphisms in a targeted sequence. The main advantage of TILLING as a reverse genetics strategy is that it can be applied to any species, regardless of its genome size and ploidy level. The TILLING protocol provides a high frequency of point mutations distributed randomly in the genome. The great mutagenic potential of chemical agents to generate a high rate of nucleotide substitutions has been proven by the high density of mutations reported for TILLING populations in various plant species. For most of them, the analysis of several genes revealed 1 mutation per 200–500 kb screened, and much higher densities were observed for polyploid species, such as wheat. High-throughput TILLING permits the rapid and low-cost discovery of new alleles that are induced in plants. Several research centres have established a TILLING public service for various plant species. Recent trends in TILLING procedures rely on the diversification of bioinformatic tools, new methods of mutation detection (including mismatch-specific and sensitive endonucleases), various alternatives to LI-COR screening, and single nucleotide polymorphism (SNP) discovery using next-generation sequencing technologies. The TILLING strategy has found numerous applications in functional genomics. Additionally, wide applications of this high-throughput method in basic and applied research have already been implemented through modifications of the original TILLING strategy, such as Ecotilling or Deletion TILLING.

    Interventions for families affected by HIV

    Family-based interventions are efficacious for human immunodeficiency virus (HIV) detection, prevention, and care, but they are not broadly diffused. Understanding intervention adaptation and translation processes can support evidence-based intervention (EBI) diffusion. This paper provides a narrative review of a series of EBIs for families affected by HIV (FAH) that were adapted across five randomized controlled trials in the US, Thailand, and South Africa over 15 years. The FAH interventions targeted parents living with HIV and their children or caregiver supports. Parents with HIV were primarily mothers infected through sexual transmission. The EBIs for FAH are reviewed with attention to commonalities and variations in risk environments and intervention features. Frameworks for common and robust intervention functions, principles, practice elements, and delivery processes are utilized to highlight commonalities and adaptations for each location, time period, and intervention delivery setting. Health care, housing, food, and financial security vary dramatically in each risk environment. Yet all FAH face common health, mental health, transmission, and relationship challenges. The EBIs efficaciously addressed these common challenges and were adapted across contexts with fidelity to robust intervention principles, processes, factors, and practices. Intervention adaptation teams face a series of structural decision points: whether to mainstream HIV with other local health priorities; selecting an optimal delivery site (clinics, homes, community centers); and how to translate intervention protocols to local contexts and cultures. Replication of interventions with fidelity must occur at the level of standardized functions and robust principles, processes, and practices, not manualized protocols. Adopting a continuous quality improvement paradigm will enhance rapid and global diffusion of EBIs for FAH.