
    Tetraspanin (TSP-17) Protects Dopaminergic Neurons against 6-OHDA-Induced Neurodegeneration in C. elegans

    Parkinson's disease (PD), the second most prevalent neurodegenerative disease after Alzheimer's disease, is linked to the gradual loss of dopaminergic neurons in the substantia nigra. Disease loci causing hereditary forms of PD are known, but most cases are attributable to a combination of genetic and environmental risk factors. Increased incidence of PD is associated with rural living and pesticide exposure, and dopaminergic neurodegeneration can be triggered by neurotoxins such as 6-hydroxydopamine (6-OHDA). In C. elegans, this toxin is taken up by the presynaptic dopamine reuptake transporter (DAT-1) and causes selective death of the eight dopaminergic neurons of the adult hermaphrodite. Using a forward genetic approach to find genes that protect against 6-OHDA-mediated neurodegeneration, we identified tsp-17, which encodes a member of the tetraspanin family of membrane proteins. We show that TSP-17 is expressed in dopaminergic neurons and provide genetic, pharmacological and biochemical evidence that it inhibits DAT-1, thus leading to increased 6-OHDA uptake in tsp-17 loss-of-function mutants. TSP-17 also protects against toxicity conferred by excessive intracellular dopamine. We provide genetic and biochemical evidence that TSP-17 acts partly via the DOP-2 dopamine receptor to negatively regulate DAT-1. tsp-17 mutants also have subtle behavioral phenotypes, some of which are conferred by aberrant dopamine signaling. Incubating mutant worms in liquid medium leads to swimming-induced paralysis. In the L1 larval stage, this phenotype is linked to lethality and cannot be rescued by a dop-3 null mutation. In contrast, the mild paralysis occurring at the L4 larval stage is suppressed by dop-3, suggesting defects in dopaminergic signaling. In summary, we show that TSP-17 protects against neurodegeneration and has a role in modulating behaviors linked to dopamine signaling.

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background: A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise the description of findings and the reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets. Methods: Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and the single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale using Kendall's tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis. Results: A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001). Conclusion: We have shown that an operative difficulty scale can standardise the description of operative findings by multiple grades of surgeons to facilitate audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
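    The statistical workflow outlined above (Kendall's tau between the ordinal difficulty grade and dichotomous outcomes, plus ROC analysis to quantify predictive accuracy) can be illustrated with a minimal sketch. This is not the study's code: the dataset, variable names and the toy association below are hypothetical, and the sketch assumes Python with scipy and scikit-learn.

        # Illustrative sketch only: correlating an ordinal difficulty grade with a
        # binary outcome and estimating AUROC, on hypothetical data.
        import numpy as np
        from scipy.stats import kendalltau
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)

        # Hypothetical data: difficulty grade 1-5 and a binary outcome
        # (e.g. conversion to open surgery), loosely associated with the grade.
        grade = rng.integers(1, 6, size=500)
        conversion = rng.random(500) < (0.02 * grade**2)  # toy association only

        # Kendall's tau between the ordinal grade and the dichotomous outcome
        tau, p_tau = kendalltau(grade, conversion)
        print(f"Kendall's tau = {tau:.3f}, p = {p_tau:.3g}")

        # AUROC treating the grade itself as the predictor of the outcome
        auroc = roc_auc_score(conversion, grade)
        print(f"AUROC = {auroc:.3f}")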

    The Lactobacillus flora in vagina and rectum of fertile and postmenopausal healthy Swedish women

    Background: Lactobacillus species are the most common inhabitants of the vaginal ecosystem of fertile women. In postmenopausal women with low oestrogen levels, the Lactobacillus flora is diminished or absent. However, no studies have investigated the correlation between oestrogen levels and lactobacilli in the gut. The aim of the present study was to investigate, in healthy women, the relation between the vaginal and rectal microbial flora as well as possible variation with hormone levels. Methods: Vaginal and rectal smears were taken from 20 healthy fertile women, average age 40 years (range 28-49 years), in two different phases of the menstrual cycle, and from 20 postmenopausal women, average age 60 years (range 52-85 years). Serum sex hormone levels were analyzed. Bacteria from the smears isolated on Rogosa agar were grouped by Randomly Amplified Polymorphic DNA and identified by multiplex PCR and partial 16S rRNA gene sequencing. Results: Lactobacillus crispatus was found more often in the vaginal flora of fertile women than in that of postmenopausal women (p = 0.036). Fifteen of 20 fertile women had lactobacilli in their rectal smears, compared to 10 of 20 postmenopausal women (p = 0.071). There was no correlation between the number of bacteria in the vagina and rectum, or between the number of bacteria and hormone levels. Nor could any association be found between the presence of rectal lactobacilli and hormone levels. Conclusion: Lactobacillus crispatus was more prevalent in the vaginal flora of fertile women, whereas the Lactobacillus flora of the rectum correlated neither with the vaginal flora nor with hormone levels.
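    The prevalence comparison reported above can be illustrated with a minimal sketch. The abstract does not name the statistical test, so a two-sided Fisher's exact test on a 2x2 table is assumed here, using the reported counts of rectal lactobacilli (15/20 fertile vs 10/20 postmenopausal women); the resulting p-value will not necessarily match the figure reported in the study.

        # Hedged sketch, assuming a Fisher's exact test (the abstract does not
        # state which test was used) on the reported rectal-lactobacilli counts.
        from scipy.stats import fisher_exact

        table = [[15, 5],    # fertile women: lactobacilli present / absent
                 [10, 10]]   # postmenopausal women: present / absent

        odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
        print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")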

    Cholesterol treatment with statins: Who is left out and who makes it to goal?

    Background: Whether patient socio-demographic characteristics (age, sex, race/ethnicity, income, and education) are independently associated with failure to receive indicated statin therapy and/or to achieve low-density lipoprotein cholesterol (LDL-C) therapy goals is not known. We examined socio-demographic factors associated with a) eligibility for statin therapy among those not on statins, and b) achievement of statin therapy goals. Methods: Adults (21-79 years) participating in the United States (US) National Health and Nutrition Examination Surveys, 1999-2006, were studied. Statin eligibility and achievement of target LDL-C were assessed using the US Third Adult Treatment Panel (ATP III) guidelines on the treatment of high cholesterol. Results: Among 6,043 participants not taking statins, 10.4% were eligible. Adjusted predictors of statin eligibility among statin non-users were being older, male, poorer, and less educated. Hispanics were less likely to be eligible but not using statins, an effect that became non-significant with adjustment for language usually spoken at home. Among 537 persons taking statins, 81% were at LDL-C goal. Adjusted predictors of goal failure among statin users were being male and poorer. These risks were not attenuated by adjustment for healthcare access or utilization. Conclusion: Among persons not taking statins, the socio-economically disadvantaged are more likely to be eligible, and among those on statins, the socio-economically disadvantaged are less likely to achieve statin treatment goals. Further study is needed to identify specific amenable patient and/or physician factors that contribute to these disparities.
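    The "adjusted predictors" analyses described above can be sketched as a logistic regression of an outcome (e.g. being at LDL-C goal) on socio-demographic covariates. The variable names and data below are hypothetical, and the NHANES complex survey design and sampling weights are deliberately ignored for brevity, so this is a sketch of the idea rather than the study's analysis.

        # Minimal sketch: logistic regression of LDL-C goal attainment on
        # hypothetical socio-demographic covariates (no survey weighting).
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        n = 1000
        df = pd.DataFrame({
            "at_goal": rng.integers(0, 2, n),       # 1 = LDL-C at goal
            "age": rng.integers(21, 80, n),
            "male": rng.integers(0, 2, n),
            "low_income": rng.integers(0, 2, n),
            "less_than_hs": rng.integers(0, 2, n),  # less than high-school education
        })

        model = smf.logit("at_goal ~ age + male + low_income + less_than_hs", data=df)
        result = model.fit(disp=False)
        print(result.summary())
        print(np.exp(result.params))  # adjusted odds ratios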

    The Ontario printed educational message (OPEM) trial to narrow the evidence-practice gap with respect to prescribing practices of general and family physicians: a cluster randomized controlled trial, targeting the care of individuals with diabetes and hypertension in Ontario, Canada

    Background: There are gaps between what family practitioners do in clinical practice and the evidence-based ideal. The most commonly used strategy to narrow these gaps is the printed educational message (PEM); however, the attributes of successful printed educational messages and their overall effectiveness in changing physician practice are not clear. The current endeavor aims to determine whether such messages change prescribing quality in primary care practice, and whether these effects differ with the format of the message. Methods/design: The design is a large, simple, factorial, unblinded cluster-randomized controlled trial. PEMs will be distributed with informed, a quarterly evidence-based synopsis of current clinical information produced by the Institute for Clinical Evaluative Sciences, Toronto, Canada, and will be sent to all eligible general and family practitioners in Ontario. There will be three replicates of the trial, with three different educational messages, each aimed at narrowing a specific evidence-practice gap as follows: 1) angiotensin-converting enzyme inhibitors, hypertension treatment, and cholesterol-lowering agents for diabetes; 2) retinal screening for diabetes; and 3) diuretics for hypertension. For each of the three replicates there will be three intervention groups. The first group will receive informed with an attached postcard-sized, short, directive "outsert". The second intervention group will receive informed with a two-page explanatory "insert" on the same topic. The third intervention group will receive informed with both the above-mentioned outsert and insert. The control group will receive informed only, without either an outsert or an insert. Routinely collected physician billing, prescription, and hospital data found in Ontario's administrative databases will be used to monitor pre-defined prescribing changes, relevant and specific to each replicate, following delivery of the educational messages. Multi-level modeling will be used to study patterns in physician prescribing quality over four quarters before and after each of the three interventions. Subgroup analyses will be performed to assess the association between the characteristics of the physician's place of practice and the target behaviours. A further analysis of the immediate and delayed impacts of the PEMs will be performed using time-series analysis and interventional autoregressive integrated moving average (ARIMA) modeling. Trial registration: Current Controlled Trials ISRCTN72772651.
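    The planned interrupted time-series / ARIMA analysis can be illustrated with a rough sketch: a quarterly prescribing-quality series with a step regressor marking delivery of the printed educational message. The data, series length and model order below are hypothetical and are not taken from the trial protocol.

        # Hedged sketch of an interventional ARIMA (interrupted time-series) model
        # on a toy quarterly prescribing-quality indicator.
        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(2)

        # 16 quarters; the PEM is assumed to be delivered after quarter 8.
        intervention = np.array([0] * 8 + [1] * 8)   # step regressor
        trend = 0.3 * np.arange(16)                  # gentle secular trend
        y = 50 + trend + 2.0 * intervention + rng.normal(0, 1, 16)

        model = ARIMA(y, exog=intervention, order=(1, 0, 0))
        fit = model.fit()
        print(fit.params)  # includes the estimated intervention effect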

    Vision and visual history in elite-/near-elite level cricketers and rugby-league players

    Background: The importance of optimal and/or superior vision for participation in high-level sport remains the subject of considerable clinical research interest. Here we examine the vision and visual history of elite/near-elite cricketers and rugby-league players. Methods: Stereoacuity (TNO), colour vision, and distance (with/without pinhole) and near visual acuity (VA) were measured in two cricket squads (elite/international-level, female, n=16; near-elite, male, n=23) and one professional rugby-league squad (male, n=20). Refractive error was determined, and details of any correction worn and visual history were recorded. Results: Overall, 63% had had their last eye examination within 2 years. However, some had not had an eye examination for 5 years, or had never had one (near-elite cricketers: 30%; rugby-league players: 15%; elite cricketers: 6%). Comparing our results for all participants to published data for young, optimally corrected, non-sporting adults, distance VA was ~1 line of letters worse than expected. Adopting α=0.01, the distance-VA deficit was significant only for elite cricketers (p0.02 for all comparisons). On average, stereoacuity was better than in young adults, but only in elite cricketers (p<0.001; p=0.03, near-elite cricketers; p=0.47, rugby-league players). On-field visual issues were present in 27% of participants and mostly (in 75% of cases) comprised uncorrected ametropia. Some cricketers (near-elite: 17.4%; elite: 38%) wore refractive correction during play, but no rugby-league player did. Some individuals with prescribed correction chose not to wear it when playing. Conclusion: Aside from near stereoacuity in elite cricketers, these basic visual abilities were not better than equivalent published data for optimally corrected adults. Around 20-25% exhibited sub-optimal vision, suggesting that the clearest possible vision might not be critical for participation at the highest levels in cricket or rugby league. Although vision could be improved in a sizeable proportion of our sample, the impact of correcting these mostly subtle refractive anomalies on playing performance is unknown.
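    The comparison of measured visual acuity with published norms can be illustrated with a small, hedged sketch. The abstract does not state which statistical test was used, so a one-sample t-test of distance VA (in logMAR) against an assumed published norm for optimally corrected young adults is shown on hypothetical data; one line of letters corresponds to 0.1 logMAR.

        # Hedged illustration only: one-sample comparison of measured distance VA
        # (logMAR) against an assumed published norm; the data are hypothetical.
        import numpy as np
        from scipy.stats import ttest_1samp

        rng = np.random.default_rng(3)
        published_norm = -0.10                    # assumed logMAR norm for young adults
        measured_va = rng.normal(0.0, 0.08, 16)   # toy squad data, ~1 line worse

        t_stat, p_value = ttest_1samp(measured_va, popmean=published_norm)
        print(f"t = {t_stat:.2f}, p = {p_value:.4f} (compare against alpha = 0.01)")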