
    The transformation in biomarker detection and management of drug-induced liver injury

    Drug-induced liver injury (DILI) is a major concern for patients, caregivers and the pharmaceutical industry. Interpretation of the serum biomarkers routinely used to detect and monitor DILI, which have not changed in almost 50 years, can be improved with recently proposed models employing quantitative systems pharmacology. In addition, several newer serum biomarkers are showing great promise. Studies in rodents indicate that the ratio of the caspase-cleaved fragment of cytokeratin 18 (K18) to total K18 in serum (termed the “apoptotic index”) estimates the relative proportions of apoptosis vs necrosis during drug-induced liver injury. Glutamate dehydrogenase can reliably differentiate liver from muscle injury and, when serum is properly prepared, may also detect mitochondrial toxicity as a mechanism of liver injury. MicroRNA-122 is liver-specific, but recent data suggest it can be actively released from hepatocytes in the absence of overt toxicity, limiting enthusiasm for it as a DILI biomarker. Finally, damage-associated molecular patterns, particularly high-mobility group box 1 and its various modified forms, are promising biomarkers of innate immune activation, which may be useful in distinguishing benign elevations in aminotransferases from those that portend clinically important liver injury. These new biomarkers are already being measured in early clinical trials, but broad acceptance will require widespread archiving of serum from diverse clinical trials and probably pre-competitive analysis efforts. We believe that utilization of a panel of traditional and newer biomarkers in conjunction with quantitative systems pharmacology modeling approaches will transform DILI detection and risk management.
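    The “apoptotic index” described above is a simple serum ratio. A minimal sketch of the calculation in Python; the function name and example concentrations are hypothetical, not taken from the paper:

```python
def apoptotic_index(caspase_cleaved_k18: float, total_k18: float) -> float:
    """Ratio of the caspase-cleaved K18 fragment to total K18 in serum.

    Higher values suggest a larger apoptotic (vs necrotic) component of
    the injury. Both concentrations are assumed to be in the same units
    (e.g. U/L from the same immunoassay platform).
    """
    if total_k18 <= 0:
        raise ValueError("total K18 must be positive")
    return caspase_cleaved_k18 / total_k18

# Hypothetical serum measurements, for illustration only:
print(apoptotic_index(caspase_cleaved_k18=220.0, total_k18=800.0))  # 0.275
```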

    The Role of FRMD7 in Idiopathic Infantile Nystagmus

    Idiopathic infantile nystagmus (IIN) is an inherited disorder in which the nystagmus arises independently of any other symptoms, leading to the speculation that the disorder represents a primary defect in the area of the brain responsible for ocular motor control. The inheritance patterns are heterogeneous; however, the most common form is X-linked. FRMD7 resides at Xq26-27, and approximately 50% of X-linked IIN families map to this region. Currently, 45 mutations within FRMD7 have been associated with IIN, confirming the importance of FRMD7 in the pathogenesis of the disease. Although mutations in FRMD7 are known to cause IIN, very little is known about the function of the protein. FRMD7 contains a conserved N-terminal FERM domain, suggesting that it may provide a link between the plasma membrane and actin cytoskeleton. Limited studies, together with knowledge of the function of other FERM domain-containing proteins, suggest that FRMD7 may play a role in membrane extension during neuronal development through remodeling of the actin cytoskeleton.

    The Development and Internal Evaluation of a Predictive Model to Identify for Whom Mindfulness-Based Cognitive Therapy Offers Superior Relapse Prevention for Recurrent Depression Versus Maintenance Antidepressant Medication

    Depression is highly recurrent, even following successful pharmacological and/or psychological intervention. We aimed to develop clinical prediction models to inform adults with recurrent depression choosing between antidepressant medication (ADM) maintenance or switching to mindfulness-based cognitive therapy (MBCT). Using previously published data (N = 424), we constructed prognostic models using elastic-net regression that combined demographic, clinical, and psychological factors to predict relapse at 24 months under ADM or MBCT. Only the ADM model (discrimination performance: area under the curve [AUC] = .68) predicted relapse better than baseline depression severity (AUC = .54; one-tailed DeLong’s test: z = 2.8, p = .003). Individuals with the poorest ADM prognoses who switched to MBCT had better outcomes compared with individuals who maintained ADM (48% vs. 70% relapse, respectively; superior survival times, z = −2.7, p = .008). For individuals with moderate to good ADM prognoses, both treatments resulted in a similar likelihood of relapse. If replicated, the results suggest that predictive modeling can inform clinical decision-making around relapse prevention in recurrent depression.
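    As an illustration of the modeling approach named above, a minimal elastic-net prognostic sketch in Python using scikit-learn; the synthetic data, number of predictors, and hyperparameters are placeholders, not the published model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Placeholder data: 20 demographic/clinical/psychological predictors
# for 424 participants; y = relapse at 24 months (0/1).
X = rng.normal(size=(424, 20))
y = rng.integers(0, 2, size=424)

# Elastic-net-penalised logistic regression (the saga solver supports
# the mixed L1/L2 penalty); l1_ratio and C would be tuned in practice.
model = make_pipeline(
    StandardScaler(),
    LogisticRegression(penalty="elasticnet", solver="saga",
                       l1_ratio=0.5, C=1.0, max_iter=5000),
)

# Internal validation via cross-validated predicted probabilities,
# scored by discrimination (AUC), as in the abstract.
pred = cross_val_predict(model, X, y, cv=5, method="predict_proba")[:, 1]
print(f"cross-validated AUC: {roc_auc_score(y, pred):.2f}")
```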

    Direct and Absolute Quantification of over 1800 Yeast Proteins via Selected Reaction Monitoring

    Defining intracellular protein concentration is critical in molecular systems biology. Although strategies for determining relative protein changes are available, defining robust absolute values in copies per cell has proven significantly more challenging. Here we present a reference data set quantifying over 1800 Saccharomyces cerevisiae proteins by direct means using protein-specific stable-isotope labeled internal standards and selected reaction monitoring (SRM) mass spectrometry, far exceeding any previous study. This was achieved by careful design of over 100 QconCAT recombinant proteins as standards, defining 1167 proteins in terms of copies per cell and upper limits on a further 668, with robust CVs routinely less than 20%. The selected reaction monitoring-derived proteome is compared with existing quantitative data sets, highlighting the disparities between methodologies. Coupled with a quantification of the transcriptome by RNA-seq taken from the same cells, these data support revised estimates of several fundamental molecular parameters: a total protein count of ∼100 million molecules per cell, a median of ∼1000 proteins per transcript, and a linear model of protein translation explaining 70% of the variance in translation rate. This work contributes a “gold-standard” reference yeast proteome (including 532 values based on high-quality, dual-peptide quantification) that can be widely used in systems models and for other comparative studies. Reliable and accurate quantification of the proteins present in a cell or tissue remains a major challenge for post-genome scientists. Proteins are the primary functional molecules in biological systems, and knowledge of their abundance and dynamics is an important prerequisite to a complete understanding of natural physiological processes, or dysfunction in disease. Accordingly, much effort has been spent in the development of reliable, accurate and sensitive techniques to quantify the cellular proteome, the complement of proteins expressed at a given time under defined conditions (1). Moreover, the ability to model a biological system, and thus characterize it in kinetic terms, requires that protein concentrations be defined in absolute numbers (2, 3). Given the high demand for accurate quantitative proteome data sets, there has been a continual drive to develop methodology to accomplish this, typically using mass spectrometry (MS) as the analytical platform. Many recent studies have highlighted the capabilities of MS to provide good coverage of the proteome at high sensitivity, often using yeast as a demonstrator system (4–10), suggesting that quantitative proteomics has now “come of age” (1). However, given that MS is not inherently quantitative, most of the approaches produce relative quantitation and do not typically measure the absolute concentrations of individual molecular species by direct means. For the yeast proteome, epitope tagging studies using green fluorescent protein or tandem affinity purification tags provide an alternative to MS. Here, collections of modified strains are generated that incorporate a detectable, and therefore quantifiable, tag that supports immunoblotting or fluorescence techniques (11, 12). However, such strategies for copies-per-cell (cpc) quantification rely on genetic manipulation of the host organism and hence do not quantify endogenous, unmodified protein. Similarly, the tagging can alter protein levels, in some instances hindering protein expression completely (11).
Even so, epitope tagging methods have been of value to the community, yielding high-coverage quantitative data sets for the majority of the yeast proteome (11, 12). MS-based methods do not rely on such nonendogenous labels, and can reach genome-wide levels of coverage. Accurate estimation of absolute concentrations, i.e. protein copy number per cell, also usually necessitates the use of (one or more) external or internal standards from which to derive absolute abundance (4). Examples include a comprehensive quantification of the Leptospira interrogans proteome that used a 19-protein subset quantified using selected reaction monitoring (SRM) to calibrate their label-free data (8, 13). It is worth noting that epitope tagging methods, although also absolute, rely on a very limited set of standards for the quantitative western blots and necessitate incorporation of a suitable immunogenic tag (11). Other recent, innovative approaches exploiting total ion signal and internal scaling to estimate protein cellular abundance (10, 14) avoid the use of internal standards, though they do rely on targeted proteomic data to validate their approach. The use of targeted SRM strategies to derive proteomic calibration standards highlights its advantages over label-free approaches in terms of accuracy, precision, dynamic range and limit of detection, and SRM has gained currency for its reliability and sensitivity (3, 15–17). Indeed, SRM is often referred to as the “gold standard proteomic quantification method,” being particularly well suited when the proteins to be quantified are known, and when appropriate surrogate peptides for protein quantification can be selected a priori and matched with stable isotope-labeled (SIL) standards (18–20). In combination with SIL peptide standards that can be generated through a variety of means (3, 15), SRM can be used to quantify low-copy-number proteins, reaching down to ∼50 cpc in yeast (5). However, although SRM methodology has been used extensively for S. cerevisiae protein quantification by us and others (19, 21, 22), it has not been used for large protein cohorts because of the requirement to generate the large numbers of attendant SIL peptide standards; the largest published data set covers only a few tens of proteins. It remains a challenge, therefore, to robustly quantify an entire eukaryotic proteome in absolute terms by direct means using targeted MS, and this is the focus of our present study, the Census Of the Proteome of Yeast (CoPY). We present here direct and absolute quantification of nearly 2000 endogenous proteins from S. cerevisiae grown at steady state in a chemostat culture, using the SRM-based QconCAT approach. Although arguably not quantification of the entire proteome, this represents an accurate and rigorous collection of direct yeast protein quantifications, providing a gold-standard data set of endogenous protein levels for future reference and comparative studies. The highly reproducible SIL-SRM MS data, with robust CVs typically less than 20%, are compared with other extant data sets that were obtained via alternative analytical strategies. We also report a matched high-quality transcriptome from the same cells using RNA-seq, which supports additional calculations, including a refined estimate of the total protein content in yeast cells and a simple linear model of translation explaining 70% of the variance between RNA and protein levels in yeast chemostat cultures.
These analyses confirm the validity of our data and approach, which we believe represents a state-of-the-art absolute quantification compendium of a significant proportion of a model eukaryotic proteome.
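    The linear translation model mentioned above amounts to regressing protein abundance on transcript abundance on a log scale and asking how much variance transcript level alone explains. A minimal sketch with synthetic placeholder data (not the CoPY measurements):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
# Synthetic placeholders: log10 mRNA and protein copies per cell for
# ~1000 genes; the offset of ~3 mirrors the reported median of
# ~1000 proteins per transcript.
log_mrna = rng.normal(loc=1.0, scale=0.8, size=1000).reshape(-1, 1)
log_protein = log_mrna.ravel() + 3.0 + rng.normal(scale=0.6, size=1000)

# Fit log(protein) = a * log(mRNA) + b; R^2 is the fraction of protein
# variance explained by transcript level (~0.7 in the study's data).
model = LinearRegression().fit(log_mrna, log_protein)
r2 = model.score(log_mrna, log_protein)
print(f"slope={model.coef_[0]:.2f} intercept={model.intercept_:.2f} R^2={r2:.2f}")
```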

    Protocol for a multicentre randomised controlled trial of STeroid Administration Routes For Idiopathic Sudden sensorineural Hearing loss: The STARFISH trial

    Idiopathic sudden sensorineural hearing loss (ISSNHL) is the rapid onset of reduced hearing due to loss of function of the inner ear or hearing nerve of unknown aetiology. Evidence supports improved hearing recovery with early steroid treatment, via oral, intravenous, intratympanic or a combination of routes. The STARFISH trial aims to identify the most clinically and cost-effective route of administration of steroids as first-line treatment for ISSNHL. STARFISH is a pragmatic, multicentre, assessor-blinded, three-arm intervention, superiority randomised controlled trial (1:1:1) with an internal pilot (ISRCTN10535105, IRAS 1004878). 525 participants with ISSNHL will be recruited from approximately 75 UK Ear, Nose and Throat units. STARFISH will recruit adults with sensorineural hearing loss averaging 30 dB HL or greater across three contiguous frequencies (confirmed via pure tone audiogram), with onset over a ≤3-day period, within four weeks of randomisation. Participants will be randomised to: 1) oral prednisolone 1 mg/kg/day up to 60 mg/day for 7 days; 2) intratympanic dexamethasone: three intratympanic injections of 3.3 mg/mL or 3.8 mg/mL spaced 7±2 days apart; or 3) combined oral and intratympanic steroids. The primary outcome will be absolute improvement in pure tone audiogram average at 12 weeks following randomisation (0.5, 1.0, 2.0 and 4.0 kHz). Secondary outcomes at 6 and 12 weeks will include: the Speech, Spatial and Qualities of hearing scale, high-frequency pure tone average thresholds (4.0, 6.0 and 8.0 kHz), the Arthur Boothroyd speech test, the Vestibular Rehabilitation Benefit Questionnaire, the Tinnitus Functional Index, adverse events, and optional weekly online speech and pure tone hearing tests. A health economic assessment will be performed and presented in terms of incremental cost-effectiveness ratios and cost per quality-adjusted life-year. Primary analyses will be by intention-to-treat. Oral prednisolone will be the reference arm. For the primary outcome, the difference between group means and 97.5% confidence intervals at each time-point will be estimated via a repeated-measures mixed-effects linear regression model.
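    As a sketch of how the pre-specified primary analysis could be set up, a repeated-measures mixed-effects linear regression in Python with statsmodels; the column names and synthetic data are assumptions, not the trial's analysis code:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
# Synthetic long-format placeholder data: one row per participant per
# follow-up visit; column names are illustrative only.
n = 150
df = pd.DataFrame({
    "participant": np.repeat(np.arange(n), 2),
    "arm": np.repeat(rng.choice(["oral", "intratympanic", "combined"], n), 2),
    "week": np.tile([6, 12], n),
})
df["pta_change"] = rng.normal(10, 5, size=len(df))

# Fixed effects for arm, visit, and their interaction; random intercept
# per participant; oral prednisolone as the reference arm.
model = smf.mixedlm("pta_change ~ C(arm, Treatment('oral')) * C(week)",
                    data=df, groups=df["participant"])
result = model.fit()
print(result.summary())
print(result.conf_int(alpha=0.025))  # 97.5% CIs, as pre-specified
```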

    Application of a Mechanistic Model to Evaluate Putative Mechanisms of Tolvaptan Drug-Induced Liver Injury and Identify Patient Susceptibility Factors

    Tolvaptan is a selective vasopressin V2 receptor antagonist, approved in several countries for the treatment of hyponatremia and autosomal dominant polycystic kidney disease (ADPKD). No liver injury has been observed with tolvaptan treatment in healthy subjects and in non-ADPKD indications, but ADPKD clinical trials showed evidence of drug-induced liver injury (DILI). Although all DILI events resolved, additional monitoring in tolvaptan-treated ADPKD patients is required. In vitro assays identified alterations in bile acid disposition and inhibition of mitochondrial respiration as potential mechanisms underlying tolvaptan hepatotoxicity. This report details the application of DILIsym software to determine whether these mechanisms could account for the liver safety profile of tolvaptan observed in ADPKD clinical trials. DILIsym simulations included physiologically based pharmacokinetic estimates of hepatic exposure for tolvaptan and 2 metabolites, and their effects on hepatocyte bile acid transporters and mitochondrial respiration. The frequency of predicted alanine aminotransferase (ALT) elevations, following simulated 90/30 mg split daily dosing, was 7.9% compared with clinical observations of 4.4% in ADPKD trials. Toxicity was multifactorial, as inhibition of bile acid transporters and mitochondrial respiration both contributed to the simulated DILI. Furthermore, simulation analysis identified both pre-treatment risk factors and on-treatment biomarkers predictive of simulated DILI. The simulations demonstrated that in vivo hepatic exposure to tolvaptan and the DM-4103 metabolite, combined with these 2 mechanisms of toxicity, were sufficient to account for the initiation of tolvaptan-mediated DILI. Identification of putative risk factors and potential novel biomarkers provided insight for the development of mechanism-based tolvaptan risk-mitigation strategies.

    The PTTG1-binding factor (PBF/PTTG1IP) regulates p53 activity in thyroid cells

    The PTTG1-Binding Factor (PBF/PTTG1IP) has an emerging repertoire of roles, especially in thyroid biology, and functions as a proto-oncogene. High PBF expression is independently associated with poor prognosis and lower disease-specific survival in human thyroid cancer. However, the precise role of PBF in thyroid tumorigenesis is unclear. Here, we present extensive evidence demonstrating that PBF is a novel regulator of p53, a tumor suppressor protein with a key role in maintaining genetic stability, which is infrequently mutated in differentiated thyroid cancer. By coimmunoprecipitation and proximity ligation assays, we show that PBF binds specifically to p53 in thyroid cells and significantly represses transactivation of responsive promoters. Further, we identify that PBF decreases p53 stability by enhancing ubiquitination, which appears dependent on the E3 ligase activity of Mdm2. Impaired p53 function was evident in a transgenic mouse model with thyroid-specific PBF over-expression (PBF-Tg), which had significantly increased genetic instability, as indicated by FISSR-PCR analysis. Consistent with this, ~40% of all DNA repair genes examined were repressed in PBF-Tg primary cultures, including genes with critical roles in maintaining genomic integrity such as Mgmt, Rad51 and Xrcc3. Our data also revealed that PBF induction resulted in upregulation of the E2 enzyme Rad6 in murine thyrocytes and was associated with Rad6 expression in human thyroid tumors. Overall, this work provides novel insights into the role of the proto-oncogene PBF as a negative regulator of p53 function in thyroid tumorigenesis, where PBF is generally over-expressed and p53 mutations are rare compared with other tumor types.

    Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors

    Background: Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with shorter inter-donation intervals used in other countries. Methods: In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants. Findings: 45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, especially among men [for all listed symptoms]), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each) than those observed in the standard frequency groups. Interpretation: Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency. Funding: NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.
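    The unit-to-volume conversions quoted above are consistent with a standard UK whole blood donation of roughly 470 mL. A quick arithmetic check (the 470 mL per-unit figure is an assumption; the abstract does not state it):

```python
ML_PER_UNIT = 470  # approximate volume of one UK whole blood donation (assumption)

extra_units = {"men, 8-week": 1.69, "men, 10-week": 0.79,
               "women, 12-week": 0.84, "women, 14-week": 0.46}
for group, units in extra_units.items():
    print(f"{group}: {units} units ≈ {units * ML_PER_UNIT:.0f} mL")
# men, 8-week: 1.69 units ≈ 794 mL    (abstract: ~795 mL)
# men, 10-week: 0.79 units ≈ 371 mL   (abstract: ~370 mL)
# women, 12-week: 0.84 units ≈ 395 mL (abstract: ~395 mL)
# women, 14-week: 0.46 units ≈ 216 mL (abstract: ~215 mL)
```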

    Centaurs and Scattered Disk Objects in the Thermal Infrared: Analysis of WISE/NEOWISE Observations

    The Wide-field Infrared Survey Explorer (WISE) observed 52 Centaurs and scattered disk objects (SDOs) in the thermal infrared, including 15 new discoveries. We present analyses of these observations to estimate sizes and mean optical albedos. We find a mean albedo of 0.08 ± 0.04 for the entire data set. Thermal fits yield average beaming parameters of 0.9 ± 0.2, similar for both the SDO and Centaur sub-classes. Biased cumulative size distributions yield size-frequency distribution power-law indices of ~–1.7 ± 0.3. The data also reveal a relation between albedo and color at the 3σ level. No significant relation between diameter and albedo is found.
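    Size estimates of this kind typically combine the fitted albedo with the object's visible absolute magnitude H through the standard relation D = 1329 km × p_V^(-1/2) × 10^(-H/5). A minimal sketch; the example H value is hypothetical:

```python
import math

def diameter_km(h_mag: float, albedo: float) -> float:
    """Standard relation between diameter, visible absolute magnitude H,
    and geometric albedo p_V: D = 1329 km * p_V**-0.5 * 10**(-H/5)."""
    return 1329.0 / math.sqrt(albedo) * 10.0 ** (-h_mag / 5.0)

# Hypothetical Centaur with H = 9.0 at the sample's mean albedo of 0.08:
print(f"{diameter_km(9.0, 0.08):.0f} km")  # ~74 km
```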

    Building the health-economic case for scaling up the WHO-HEARTS hypertension control package in low- and middle-income countries

    Generally, hypertension control programs are cost-effective, including in low- and middle-income countries, but country governments and civil society are not likely to support hypertension control programs unless value is demonstrated in terms of public health benefits, budget impact, and value-for-investment for the individual country context. The World Health Organization (WHO) and the Pan American Health Organization (PAHO) established a standard, simplified Global HEARTS approach to hypertension control, including preferred antihypertensive medicines and blood pressure measurement devices. The objective of this study is to report on health economic studies of HEARTS hypertension control package cost (especially medication costs), cost-effectiveness, and budget impact, and to describe mathematical models designed to translate hypertension control program data into the optimal approach to hypertension care service delivery and financing, especially in low- and middle-income countries. Early results suggest that HEARTS hypertension control interventions are either cost-saving or cost-effective, that the HEARTS package is affordable at between US$18 and US$44 per person treated per year, and that antihypertensive medicines could be priced low enough to reach a global standard of an average of <US$5 per patient per year in the public sector. This health economic evidence will make a compelling case for government ownership and financial support for national-scale hypertension control programs.
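    As a back-of-envelope illustration of the budget impact implied by the per-person cost range quoted above (the program size is a hypothetical placeholder):

```python
# HEARTS package cost per person treated per year (US$), from the
# range quoted above.
COST_LOW, COST_HIGH = 18, 44

people_treated = 2_000_000  # hypothetical national program size

low, high = people_treated * COST_LOW, people_treated * COST_HIGH
print(f"annual program cost: US${low:,} to US${high:,}")
# annual program cost: US$36,000,000 to US$88,000,000
```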