
    Mig6 haploinsufficiency protects mice against streptozotocin-induced diabetes

    AIMS/HYPOTHESIS: EGF and gastrin co-administration reverses type 1 diabetes in rodent models. However, the failure of this to translate into a clinical treatment suggests that EGF-mediated tissue repair is a complicated process and warrants further investigation. Thus, we aimed to determine whether EGF receptor (EGFR) feedback inhibition by mitogen-inducible gene 6 protein (MIG6) limits the effectiveness of EGF therapy and promotes type 1 diabetes development. METHODS: We treated Mig6 (also known as Errfi1) haploinsufficient mice (Mig6 (+/-)) and their wild-type littermates (Mig6 (+/+)) with multiple low doses of streptozotocin (STZ), and monitored diabetes development via glucose homeostasis tests and histological analyses. We also investigated MIG6-mediated cytokine-induced desensitisation of EGFR signalling and the DNA damage repair response in 832/13 INS-1 beta cells. RESULTS: Whereas STZ-treated Mig6 (+/+) mice became diabetic, STZ-treated Mig6 (+/-) mice remained glucose tolerant. In addition, STZ-treated Mig6 (+/-) mice exhibited preserved circulating insulin levels following a glucose challenge. As insulin sensitivity was similar between Mig6 (+/-) and Mig6 (+/+) mice, the preserved glucose tolerance in STZ-treated Mig6 (+/-) mice probably results from preserved beta cell function. This is supported by elevated Pdx1 and Irs2 mRNA levels in islets isolated from STZ-treated Mig6 (+/-) mice. Conversely, MIG6 overexpression in isolated islets compromises glucose-stimulated insulin secretion. Studies in 832/13 cells suggested that cytokine-induced MIG6 hinders EGFR activation and inhibits DNA damage repair. STZ-treated Mig6 (+/-) mice also have increased beta cell mass recovery. CONCLUSIONS/INTERPRETATION: Reducing Mig6 expression promotes beta cell repair and abates the development of experimental diabetes, suggesting that MIG6 may be a novel therapeutic target for preserving beta cell

    Climatological Study of the Short-Term Variation of the 0°C, -10°C, and -20°C Altitude Levels over the Florida Spaceport

    For evaluation of the potential of cloud electrification, it is necessary to know the altitudes of the 0°C, -10°C, and -20°C levels. Cape Canaveral Air Force Station has recorded balloon launch data back to 1989. In support of rocket launches, multiple balloons are often launched within minutes of each other in the 4-6 hours leading up to launch. In the past, temperature data from sondes were typically available every hour or so through the launch countdown, allowing for frequent updates of these critical temperature thresholds. Recently, launch customers have relied on Jimsphere and wind-profiler data, which lack a thermodynamic component, in the final 4-6 hours of a countdown. This study compares the altitude differences of the 0°C, -10°C, and -20°C levels from consecutive balloon pairs launched no more than 6 hours apart. The analysis uses 9,685 soundings from 1989 to 2013. Approximately 5% of the time, the altitude of the temperature level in question varies by more than 500 feet (an operationally significant threshold) within 6 hours. This study analyzes the altitude variability as a function of several meteorological parameters, such as change in sonde type, dew point depression, and solar zenith angle, and concludes with impacts to launch operations.
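    The pairwise comparison described above can be sketched in a few lines of Python. The sounding profiles below are invented for illustration and `level_altitude` is a hypothetical helper; only the three temperature levels and the 500 ft operationally significant threshold come from the abstract:

    ```python
    import numpy as np

    def level_altitude(alt_ft, temp_c, target_c):
        """Interpolate the altitude (ft) at which a sounding crosses target_c.

        Assumes temperature decreases roughly monotonically with altitude
        through the crossing. np.interp requires increasing x, so we sort
        the profile by temperature before interpolating.
        """
        order = np.argsort(temp_c)
        return float(np.interp(target_c,
                               np.asarray(temp_c)[order],
                               np.asarray(alt_ft)[order]))

    # Two hypothetical soundings a few hours apart: (altitude ft, temperature C)
    sounding_1 = (np.array([10000, 12000, 14000, 16000, 18000, 20000]),
                  np.array([5.0, 0.8, -4.5, -10.2, -16.0, -21.5]))
    sounding_2 = (np.array([10000, 12000, 14000, 16000, 18000, 20000]),
                  np.array([6.2, 1.9, -3.1, -9.0, -14.8, -20.6]))

    OPERATIONAL_THRESHOLD_FT = 500  # operationally significant change

    for target in (0.0, -10.0, -20.0):
        a1 = level_altitude(*sounding_1, target)
        a2 = level_altitude(*sounding_2, target)
        diff = abs(a2 - a1)
        flag = "EXCEEDS" if diff > OPERATIONAL_THRESHOLD_FT else "within"
        print(f"{target:+.0f} C level: {a1:.0f} ft -> {a2:.0f} ft "
              f"(change {diff:.0f} ft, {flag} {OPERATIONAL_THRESHOLD_FT} ft)")
    ```

    Applied over all qualifying sounding pairs, the fraction of changes exceeding the threshold gives the ~5% figure reported in the study.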

    Massive Galaxies in COSMOS: Evolution of Black hole versus bulge mass but not versus total stellar mass over the last 9 Gyrs?

    We constrain the ratio of black hole (BH) mass to total stellar mass of type-1 AGN in the COSMOS survey at 1<z<2. For 10 AGN at mean redshift z~1.4 with both HST/ACS and HST/NICMOS imaging data we are able to compute total stellar mass M_(*,total), based on restframe UV-to-optical host galaxy colors which constrain mass-to-light ratios. All objects have virial BH mass-estimates available from the COSMOS Magellan/IMACS and zCOSMOS surveys. We find zero difference between the M_BH--M_(*,total)-relation at z~1.4 and the M_BH--M_(*,bulge)-relation in the local Universe. Our interpretation is: (a) If our objects were purely bulge-dominated, the M_BH--M_(*,bulge)-relation has not evolved since z~1.4. However, (b) since we have evidence for substantial disk components, the bulges of massive galaxies (logM_(*,total)=11.1+-0.25 or logM_BH~8.3+-0.2) must have grown over the last 9 Gyrs predominantly by redistribution of disk- into bulge-mass. Since all necessary stellar mass exists in the galaxy at z=1.4, no star-formation or addition of external stellar material is required, only a redistribution e.g. induced by minor and major merging or through disk instabilities. Merging, in addition to redistributing mass in the galaxy, will add both BH and stellar/bulge mass, but does not change the overall final M_BH/M_(*,bulge) ratio. Since the overall cosmic stellar and BH mass buildup trace each other tightly over time, our scenario of bulge-formation in massive galaxies is independent of any strong BH-feedback and means that the mechanism coupling BH and bulge mass until the present is very indirect. Comment: Published in ApJL; 7 pages, 2 figures; updated to accepted version (methods changed, results unchanged)
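    The mass ratio implied by the quoted sample means can be made explicit with a line of arithmetic. The log masses are the means stated in the abstract; the computation itself is purely illustrative:

    ```python
    # Sample means quoted in the abstract (log10 of mass in solar masses)
    log_m_star_total = 11.1  # logM_(*,total)
    log_m_bh = 8.3           # logM_BH

    # Ratio of BH mass to total stellar mass: 10^(8.3 - 11.1) = 10^-2.8
    ratio = 10 ** (log_m_bh - log_m_star_total)

    print(f"M_BH / M_(*,total) ~ {ratio:.4f} (about 1/{1/ratio:.0f})")
    ```

    The resulting ~0.2% ratio is what the abstract compares against the local M_BH--M_(*,bulge) relation.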

    Detailed monitoring reveals the nature of submarine turbidity currents

    Seafloor sediment flows, called turbidity currents, form the largest sediment accumulations, deepest canyons, and longest channels on Earth. It was once thought that turbidity currents were impractical to measure in action, especially given their ability to damage sensors in their path, but direct monitoring since the mid-2010s has measured them in detail. In this Review, we summarise knowledge of turbidity currents gleaned from this direct monitoring. Monitoring identifies triggering mechanisms from dilute river plumes, and shows how rapid sediment accumulation can precondition slope failure, although the final triggers can be delayed and subtle. Turbidity currents are consistently more frequent than predicted by past sequence stratigraphic models, including at sites >300 km from any coast. Faster (>~1.5 m s–1) flows are driven by a dense near-bed layer at their front, whereas slower flows are entirely dilute. This frontal layer sometimes erodes large (>2.5 km3) volumes of sediment, yet maintains a near-uniform speed, leading to a travelling-wave model. Monitoring shows that flows sculpt canyons and channels through fast-moving knickpoints, and reveals how deposits originate. Emerging technologies with reduced cost and risk can enable widespread monitoring of turbidity currents, so that their sediment and carbon fluxes can be compared with those of other major global transport processes.

    A site assessment tool for inpatient controlled human infection models for enteric disease pathogens

    The use of the controlled human infection model to facilitate product development and to advance understanding of host-pathogen interactions is of increasing interest. While administering a virulent (or infective) organism to a susceptible host necessitates an ongoing evaluation of safety and ethical considerations, a central theme in conducting these studies safely and ethically, and in a way that yields actionable data, is their conduct in facilities well suited to their unique attributes. To that end, we have developed a framework for evaluating potential sites in which to conduct inpatient enteric controlled human infection models, to ensure consistency and increase the likelihood of success.

    Alleviating Environmental Health Disparities Through Community Science and Data Integration

    Environmental contamination is a fundamental determinant of health and well-being, and when the environment is compromised, vulnerabilities are generated. The complex challenges associated with environmental health and food security are influenced by current and emerging political, social, economic, and environmental contexts. To solve these “wicked” dilemmas, disparate public health surveillance efforts are conducted by local, state, and federal agencies. More recently, citizen/community science (CS) monitoring efforts are providing site-specific data. One of the biggest challenges in using these government datasets, let alone incorporating CS data, for a holistic assessment of environmental exposure is data management and interoperability. To facilitate a more holistic perspective and approach to solution generation, we have developed a method to provide a common data model that will allow environmental health researchers working at different scales and research domains to exchange data and ask new questions. We anticipate that this method will help to address environmental health disparities, which are unjust and avoidable, while ensuring CS datasets are ethically integrated to achieve environmental justice. Specifically, we used a transdisciplinary research framework to develop a methodology to integrate CS data with existing governmental environmental monitoring and social attribute data (vulnerability and resilience variables) that span across 10 different federal and state agencies. A key challenge in integrating such different datasets is the lack of widely adopted ontologies for vulnerability and resiliency factors. In addition to following the best practice of submitting new term requests to existing ontologies to fill gaps, we have also created an application ontology, the Superfund Research Project Data Interface Ontology (SRPDIO).

    A primary care, multi-disciplinary disease management program for opioid-treated patients with chronic non-cancer pain and a high burden of psychiatric comorbidity

    BACKGROUND: Chronic non-cancer pain is a common problem that is often accompanied by psychiatric comorbidity and disability. The effectiveness of a multi-disciplinary pain management program was tested in a 3-month before-and-after trial. METHODS: Providers in an academic general medicine clinic referred patients with chronic non-cancer pain for participation in a program that combined the skills of internists, clinical pharmacists, and a psychiatrist. Patients were either receiving opioids or being considered for opioid therapy. The intervention consisted of structured clinical assessments, monthly follow-up, pain contracts, medication titration, and psychiatric consultation. Pain, mood, and function were assessed at baseline and 3 months using the Brief Pain Inventory (BPI), the Center for Epidemiological Studies Depression Scale (CESD), and the Pain Disability Index (PDI). Patients were monitored for substance misuse. RESULTS: Eighty-five patients were enrolled. Mean age was 51 years, 60% were male, 78% were Caucasian, and 93% were receiving opioids. Baseline average pain was 6.5 on an 11-point scale. The average CESD score was 24.0, and the mean PDI score was 47.0. Sixty-three patients (73%) completed 3-month follow-up. Fifteen withdrew from the program after identification of substance misuse. Among those completing 3-month follow-up, the average pain score improved to 5.5 (p = 0.003). The mean PDI score improved to 39.3 (p < 0.001). Mean CESD score was reduced to 18.0 (p < 0.001), and the proportion of depressed patients fell from 79% to 54% (p = 0.003). Substance misuse was identified in 27 patients (32%). CONCLUSIONS: A primary care disease management program improved pain, depression, and disability scores over three months in a cohort of opioid-treated patients with chronic non-cancer pain. Substance misuse and depression were common, and many patients who had substance misuse identified left the program when they were no longer prescribed opioids. Effective care of patients with chronic pain should include rigorous assessment and treatment of these comorbid disorders and intensive efforts to ensure follow-up.