966 research outputs found
Role of hypoxia inducible factor-1α (HIF-1α) in innate defense against uropathogenic Escherichia coli infection
Uropathogenic E. coli (UPEC) is the primary cause of urinary tract infections (UTI), affecting approximately 150 million people worldwide. Here, we reveal the importance of the transcriptional regulator hypoxia-inducible factor-1 α subunit (HIF-1α) in innate defense against UPEC-mediated UTI. The effects of AKB-4924, a HIF-1α stabilizing agent, were studied using human uroepithelial cells (5637) and a murine UTI model. UPEC adherence and invasion were significantly reduced in 5637 cells when HIF-1α protein was allowed to accumulate. Uroepithelial cells treated with AKB-4924 also experienced reduced cell death and exfoliation upon UPEC challenge. In vivo, fewer UPEC were recovered from the urine, bladders, and kidneys of mice treated transurethrally with AKB-4924, whereas more bacteria were recovered from the bladders of mice with a HIF-1α deletion. Bladders and kidneys of AKB-4924-treated mice developed less inflammation, as evidenced by decreased pro-inflammatory cytokine release and neutrophil activity. The ability of AKB-4924 to impair infection in uroepithelial cells and bladders correlated with enhanced production of nitric oxide and the antimicrobial peptides cathelicidin and β-defensin-2. We conclude that HIF-1α transcriptional regulation plays a key role in defense of the urinary tract against UPEC infection, and that pharmacological HIF-1α boosting could be explored further as an adjunctive therapy for serious or recurrent UTI.
Evaluation of noise regression techniques in resting-state fMRI studies using data of 434 older adults
Subject motion is a well-known confound in resting-state functional MRI (rs-fMRI) and the analysis of functional connectivity. Consequently, several clean-up strategies have been established to minimize the impact of subject motion. Physiological signals arising from cardiac activity and respiration are also known to alter apparent rs-fMRI connectivity. Comprehensive comparisons of common noise regression techniques showed that the Independent Component Analysis-based strategy for Automatic Removal of Motion Artifacts (ICA-AROMA) was a preferred pre-processing technique for teenagers and adults. However, motion and physiological noise characteristics may differ substantially for older adults. Here, we present a comprehensive comparison of noise-regression techniques for older adults from a large multi-site clinical trial of exercise and intensive pharmacological vascular risk factor reduction. The Risk Reduction for Alzheimer's Disease (rrAD) trial included hypertensive older adults (60-84 years old) at elevated risk of developing Alzheimer's Disease (AD). We compared the performance of censoring, censoring combined with global signal regression, non-aggressive and aggressive ICA-AROMA, as well as the Spatially Organized Component Klassifikator (SOCK) on the rs-fMRI baseline scans from 434 rrAD subjects. All techniques were rated based on network reproducibility, network identifiability, edge activity, spatial smoothness, and loss of temporal degrees of freedom (tDOF). We found that non-aggressive ICA-AROMA did not perform as well as the other four techniques, which performed comparably with only marginal differences, demonstrating the validity of these techniques. Considering reproducibility the most important factor for longitudinal studies, given low false-positive rates and a better preserved, more cohesive temporal structure, aggressive ICA-AROMA is currently likely the most suitable noise regression technique for rs-fMRI studies of older adults.
Rationale and Methods for a Multicenter Clinical Trial Assessing Exercise and Intensive Vascular Risk Reduction in Preventing Dementia (rrAD Study)
Alzheimer's Disease (AD) is an age-related disease whose onset and progression are influenced by modifiable risk factors such as hypertension, hypercholesterolemia, obesity, and physical inactivity. There is, however, no direct evidence that reducing these risk factors prevents or slows AD. The Risk Reduction for Alzheimer's Disease (rrAD) trial is designed to study the independent and combined effects of intensive pharmacological control of blood pressure and cholesterol and of exercise training on neurocognitive function. Six hundred and forty cognitively normal older adults aged 60 to 85 years with hypertension and increased risk for dementia will be enrolled. Participants are randomized into one of four intervention groups for two years: usual care, Intensive Reduction of Vascular Risk factors (IRVR) with blood pressure and cholesterol reduction, exercise training (EX), and IRVR+EX. Neurocognitive function is measured at baseline, 6, 12, 18, and 24 months; brain MRIs are obtained at baseline and 24 months. We hypothesize that both IRVR and EX will improve global cognitive function, and that IRVR+EX will provide a greater benefit than either IRVR or EX alone. We also hypothesize that IRVR and EX will slow brain atrophy, improve brain structural and functional connectivity, and improve brain perfusion. Finally, we will explore the mechanisms by which the study interventions affect neurocognition and the brain. If the rrAD interventions are shown to be safe, practical, and successful, our study will have a significant impact on reducing the risk of AD in older adults.
NCT Registration: NCT02913664
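The four arms described above form a 2×2 factorial design, crossing IRVR (yes/no) with exercise training (yes/no). As a minimal sketch of that structure only: the uniform random draw below is an illustrative simplification, not the trial's actual allocation procedure, and the function name is hypothetical.

```python
# Sketch of the rrAD 2x2 factorial arm structure: usual care, IRVR,
# EX, and IRVR+EX. A seeded uniform draw stands in for the trial's
# real randomization scheme, purely for illustration.
import random

ARMS = ["usual care", "IRVR", "EX", "IRVR+EX"]

def randomize(n_participants: int, seed: int = 0):
    """Assign each participant to one of the four arms uniformly at random."""
    rng = random.Random(seed)
    return [rng.choice(ARMS) for _ in range(n_participants)]

assignments = randomize(640)  # planned enrollment of 640 older adults
```

Seeding makes the toy allocation reproducible; a real trial would use a pre-generated, concealed allocation sequence, typically stratified by site.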
Sensory Measurements: Coordination and Standardization
Do sensory measurements deserve the label of “measurement”? We argue that they do. They fit with an epistemological view of measurement held in current philosophy of science, and they face the same kinds of epistemological challenges as physical measurements do: the problem of coordination and the problem of standardization. For all measurements, these problems are addressed through the process of “epistemic iteration.” We also argue for distinguishing the problem of standardization from the problem of coordination. To exemplify our claims, we draw on olfactory performance tests, especially studies linking olfactory decline to neurodegenerative disorders.
Evaluation of a blocking ELISA for the detection of antibodies against Lawsonia intracellularis in pig sera
Background: Lawsonia intracellularis is a common cause of chronic diarrhoea and poor performance in young growing pigs. Diagnosis of this obligate intracellular bacterium is based on the demonstration of the microbe or microbial DNA in tissue specimens or faecal samples, or the demonstration of L. intracellularis-specific antibodies in sera. The aim of the present study was to evaluate a blocking ELISA for the detection of serum antibodies to L. intracellularis, by comparison with the previously widely used immunofluorescent antibody test (IFAT). Methods: Sera were collected from 176 pigs aged 8-12 weeks originating from 24 herds with or without problems with diarrhoea and poor performance in young growing pigs. Sera were analyzed by the blocking ELISA and by IFAT. Bayesian modelling techniques were used to account for the absence of a gold standard test, and the results of the blocking ELISA were modelled against the IFAT with a "2 dependent tests, 2 populations, no gold standard" model. Results: At the finally selected cut-off value of percent inhibition (PI) 35, the diagnostic sensitivity of the blocking ELISA was 72% and the diagnostic specificity was 93%. The positive predictive value was 0.82 and the negative predictive value was 0.89 at the observed prevalence of 33.5%. Conclusion: The sensitivity and specificity as evaluated by Bayesian statistical techniques differed from those previously reported. Properties of diagnostic tests may well vary between countries, laboratories, and populations of animals. In the absence of a true gold standard, the importance of validating new methods by appropriate statistical methods and with respect to the target population must be emphasized.
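As a hedged illustration of how predictive values relate to the sensitivity, specificity, and prevalence reported in this abstract: the standard (non-Bayesian) formulas below give values close to, but not identical with, the published PPV of 0.82 and NPV of 0.89, since the study estimated them with a "no gold standard" Bayesian model. The function name is ours, not the paper's.

```python
# Sketch: textbook predictive-value formulas applied to the reported
# blocking-ELISA estimates (Se = 72%, Sp = 93%, prevalence = 33.5%).
# The paper's Bayesian model yields slightly different values
# (PPV 0.82, NPV 0.89), so treat this as illustrative only.

def predictive_values(sensitivity: float, specificity: float, prevalence: float):
    """Return (PPV, NPV) for a diagnostic test at a given disease prevalence."""
    tp = sensitivity * prevalence                 # expected true-positive fraction
    fp = (1 - specificity) * (1 - prevalence)     # expected false-positive fraction
    tn = specificity * (1 - prevalence)           # expected true-negative fraction
    fn = (1 - sensitivity) * prevalence           # expected false-negative fraction
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv

ppv, npv = predictive_values(0.72, 0.93, 0.335)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")  # roughly 0.84 and 0.87
```

The gap between these closed-form values and the published ones is exactly the point of the abstract's conclusion: without a gold standard, naive formulas and properly modelled estimates can diverge.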
Politics at the Cutting Edge: Intergovernmental Policy Innovation in the Affordable Care Act
In the eight years since the passage of the Patient Protection and Affordable Care Act (ACA), state governments have remained critical sites of contention over the law. Intense partisan conflict over ACA implementation has raised questions about traditional theories of intergovernmental relations, which posit that federal–state cooperation depends largely on policy design. Yet, few studies have examined how partisanship, as well as other important factors, shape state policy innovations under the ACA. This article examines the ACA’s State Innovation Models (SIM) initiative. SIM is specifically geared towards incentivizing states to experiment with new models of payment and delivery that can improve health outcomes and/or reduce health-care costs. Drawing on a combination of quantitative and qualitative evidence, we find that states’ participation in SIM is shaped by partisanship, administrative capacity, and state policy legacies. Our findings have implications for future efforts at intergovernmental health reforms
Serum MicroRNA Signatures Identified by Solexa Sequencing Predict Sepsis Patients’ Mortality: A Prospective Observational Study
Sepsis is the leading cause of death in Intensive Care Units. Novel sepsis biomarkers and targets for treatment are needed to improve mortality from sepsis. MicroRNAs (miRNAs) have recently been used as fingerprints for sepsis, and our goal in this prospective study was to investigate whether serum miRNAs identified in genome-wide scans could predict sepsis mortality. We enrolled 214 sepsis patients (117 survivors and 97 non-survivors based on 28-day mortality). Solexa sequencing followed by quantitative reverse transcriptase polymerase chain reaction assays was used to test for differences in the levels of miRNAs between survivors and non-survivors. miR-223, miR-15a, miR-16, miR-122, miR-193b*, and miR-483-5p were significantly differentially expressed. Receiver operating characteristic curves were generated, and the areas under the curve (AUC) for these six miRNAs for predicting sepsis mortality ranged from 0.610 (95% CI: 0.523-0.697) to 0.790 (95% CI: 0.719-0.861). Logistic regression analysis showed that sepsis stage, Sequential Organ Failure Assessment scores, Acute Physiology and Chronic Health Evaluation II scores, miR-15a, miR-16, miR-193b*, and miR-483-5p were associated with death from sepsis. An analysis was done using these seven variables combined. The AUC for the combined variables' predictive probability was 0.953 (95% CI: 0.923-0.983), much higher than the AUCs for Acute Physiology and Chronic Health Evaluation II scores (0.782; 95% CI: 0.712-0.851), Sequential Organ Failure Assessment scores (0.752; 95% CI: 0.672-0.832), and procalcitonin levels (0.689; 95% CI: 0.611-0.784). With a cut-off point of 0.550, the predictive value of the seven variables had a sensitivity of 88.5% and a specificity of 90.4%. Additionally, miR-193b* had the highest odds ratio for sepsis mortality, 9.23 (95% CI: 1.20-71.16). Six serum miRNAs were identified as prognostic predictors for sepsis patients. ClinicalTrials.gov registration: NCT01207531
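The AUC values quoted above have a simple probabilistic reading: an AUC is the probability that a randomly chosen non-survivor's marker level (or predicted probability) ranks above a randomly chosen survivor's. A minimal sketch of that Mann-Whitney interpretation, using invented scores rather than study data:

```python
# Sketch: AUC computed as the Mann-Whitney probability that a random
# positive case (non-survivor) scores above a random negative case
# (survivor), with ties counting one half. Scores are hypothetical.

def auc(pos_scores, neg_scores):
    """Probability a positive outranks a negative; equals the ROC AUC."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical normalized marker levels, for illustration only:
non_survivors = [0.9, 0.8, 0.7, 0.4]
survivors = [0.5, 0.3, 0.2, 0.1]
print(auc(non_survivors, survivors))  # 0.9375
```

On this reading, the combined model's AUC of 0.953 means a non-survivor's predicted probability exceeds a survivor's in about 95% of such pairs, versus roughly 61-79% for the individual miRNAs.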
Expert Panel Curation of 113 Primary Mitochondrial Disease Genes for the Leigh Syndrome Spectrum
OBJECTIVE: Primary mitochondrial diseases (PMDs) are heterogeneous disorders caused by inherited mitochondrial dysfunction. Classically defined neuropathologically as subacute necrotizing encephalomyelopathy, Leigh syndrome spectrum (LSS) is the most frequent manifestation of PMD in children, but may also present in adults. A major challenge for accurate diagnosis of LSS in the genomic medicine era is establishing gene-disease relationships (GDRs) for this syndrome, which has >100 monogenic causes across both the nuclear and mitochondrial genomes. METHODS: The Clinical Genome Resource (ClinGen) Mitochondrial Disease Gene Curation Expert Panel (GCEP), comprising 40 international PMD experts, met monthly for 4 years to review GDRs for LSS. The GCEP standardized gene curation for LSS by refining the phenotypic definition, modifying the ClinGen Gene-Disease Clinical Validity Curation Framework to improve interpretation for LSS, and establishing a scoring rubric for LSS. RESULTS: Of the 114 gene-disease relationships curated across the nuclear and mitochondrial genomes, 31 (27%) were classified as definitive, 38 (33%) as moderate, 43 (38%) as limited, and 2 (2%) as disputed. Ninety genes were associated with autosomal recessive inheritance, 16 were maternally inherited, 5 were autosomal dominant, and 3 were X-linked. INTERPRETATION: GDRs for LSS were established for genes across both the nuclear and mitochondrial genomes. Establishing these GDRs will allow accurate variant interpretation, expedite genetic diagnosis of LSS, and facilitate precision medicine, multi-system organ surveillance, recurrence risk counselling, reproductive choice, natural history studies, and eligibility for interventional clinical trials.