144 research outputs found

    Bariatric surgery in morbidly obese insulin resistant humans normalises insulin signalling but not insulin-stimulated glucose disposal.

    AIMS: Weight loss after bariatric surgery improves insulin sensitivity, but the underlying molecular mechanism is unclear. To ascertain the effect of bariatric surgery on insulin signalling, we examined glucose disposal and Akt activation in morbidly obese volunteers before and after Roux-en-Y gastric bypass surgery (RYGB), and compared these to lean volunteers. MATERIALS AND METHODS: The hyperinsulinaemic euglycaemic clamp, at five infusion rates, was used to determine glucose disposal rates (GDR) in eight morbidly obese patients (body mass index, BMI = 47.3 ± 2.2 kg/m²), before and after RYGB, and in eight lean volunteers (BMI = 20.7 ± 0.7 kg/m²). Biopsies of brachioradialis muscle, taken at fasting and at the insulin concentrations that induced half-maximal (GDR50) and maximal (GDR100) GDR in each subject, were used to examine the phosphorylation of Akt-Thr308, Akt-Ser473, and PRAS40, in vivo biomarkers of Akt activity. RESULTS: Pre-operatively, insulin-stimulated GDR was lower in the obese than in the lean individuals (P<0.001). Weight loss of 29.9 ± 4 kg after surgery significantly improved GDR50 (P=0.004) but not GDR100 (P=0.3), and these subjects remained significantly more insulin resistant than the lean individuals (P<0.001). Weight loss increased insulin-stimulated skeletal muscle Akt-Thr308 and Akt-Ser473 phosphorylation (P=0.02 and P=0.03 respectively, MANCOVA) and Akt activity towards the substrate PRAS40 (P=0.003, MANCOVA); in contrast to GDR, these were fully normalised after surgery (obese vs lean, P=0.6, P=0.35 and P=0.46, respectively). CONCLUSIONS: Our data show that although Akt activity substantially improved after surgery, this did not lead to a full restoration of insulin-stimulated glucose disposal, suggesting that a major defect downstream of, or parallel to, Akt signalling remains after significant weight loss.
    Funding: Diabetes Research and Wellness Foundation, Medical Research Council (MRC), AstraZeneca.

    [89Zr]Oxinate4 for long-term in vivo cell tracking by positron emission tomography

    Purpose: 111In (typically as [111In]oxinate3) is the gold standard radiolabel for cell tracking in humans by scintigraphy. A long half-life positron-emitting radiolabel to serve the same purpose using positron emission tomography (PET) has long been sought. We aimed to develop an 89Zr PET tracer for cell labelling and compare it with [111In]oxinate3 single photon emission computed tomography (SPECT). Methods: [89Zr]Oxinate4 was synthesised and its uptake and efflux were measured in vitro in three cell lines and in human leukocytes. The in vivo biodistribution of eGFP-5T33 murine myeloma cells labelled with [89Zr]oxinate4 or [111In]oxinate3 was monitored for up to 14 days. 89Zr retention by living radiolabelled eGFP-positive cells in vivo was monitored by FACS sorting of liver, spleen and bone marrow cells followed by gamma counting. Results: 89Zr labelling was effective in all cell types, with yields comparable to 111In labelling. Retention of 89Zr in cells in vitro after 24 h was significantly better (range 71 to >90%) than that of 111In (43–52%). eGFP-5T33 cells in vivo showed the same early biodistribution whether labelled with 111In or 89Zr (initial pulmonary accumulation followed by migration to liver, spleen and bone marrow), but later translocation of radioactivity to the kidneys was much greater for 111In. In liver, spleen and bone marrow, at least 92% of 89Zr remained associated with eGFP-positive cells after 7 days in vivo. Conclusion: [89Zr]Oxinate4 offers a potential solution to the emerging need for a long half-life PET tracer for cell tracking in vivo and deserves further evaluation of its effects on the survival and behaviour of different cell types.

    A Simulated Annealing Approach to Approximate Bayes Computations

    Approximate Bayes Computations (ABC) are used for parameter inference when the likelihood function of the model is expensive to evaluate but relatively cheap to sample from. In particle ABC, an ensemble of particles in the product space of model outputs and parameters is propagated in such a way that its output marginal approaches a delta function at the data and its parameter marginal approaches the posterior distribution. Inspired by Simulated Annealing, we present a new class of particle algorithms for ABC, based on a sequence of Metropolis kernels associated with a decreasing sequence of tolerances with respect to the data. Unlike other algorithms, our class of algorithms is not based on importance sampling, and hence does not suffer from a loss of effective sample size due to re-sampling. We prove convergence under a condition on the speed at which the tolerance is decreased. Furthermore, we present a scheme that adapts the tolerance and the jump distribution in parameter space according to some mean-fields of the ensemble, which preserves the statistical independence of the particles in the limit of infinite sample size. This adaptive scheme aims at converging as close as possible to the correct result with as few system updates as possible, by minimizing the entropy production in the system. The performance of this new class of algorithms is compared against two other recent algorithms on two toy examples.

    Identification and correction of previously unreported spatial phenomena using raw Illumina BeadArray data

    BACKGROUND: A key stage for all microarray analyses is the extraction of feature-intensities from an image. If this step goes wrong, then subsequent preprocessing and processing stages will stand little chance of rectifying the matter. Illumina employ random construction of their BeadArrays, making feature-intensity extraction even more important for the Illumina platform than for other technologies. In this paper we show that using raw Illumina data it is possible to identify, control, and perhaps correct for a range of spatial-related phenomena that affect feature-intensity extraction. RESULTS: We note that feature intensities can be unnaturally high in the proximity of a number of phenomena relating either to the images themselves or to the layout of the beads on an array. Additionally, we note that beads neighbour beads of the same type more often than one might expect, which may cause concern in some models of hybridization. We highlight issues in the identification of a bead's location, and in particular how this both affects and is affected by its intensity. Finally, we show that beads can be wrongly identified in the image on either a local or an array-wide scale, with obvious implications for data quality. CONCLUSIONS: The image-processing issues identified will often pass unnoticed by an analysis of the standard data returned from an experiment. We detail some simple diagnostics that can be implemented to identify problems of this nature, and outline approaches to correcting for such problems. These approaches require access to the raw data from the arrays, not just the summarized data usually returned, making the acquisition of such raw data highly desirable.

    Bayesian model comparison with un-normalised likelihoods

    Models for which the likelihood function can be evaluated only up to a parameter-dependent unknown normalizing constant, such as Markov random field models, are used widely in computer science, statistical physics, spatial statistics, and network analysis. However, Bayesian analysis of these models using standard Monte Carlo methods is not possible due to the intractability of their likelihood functions. Several methods that permit exact, or close to exact, simulation from the posterior distribution have recently been developed. However, estimating the evidence and Bayes factors (BFs) for these models remains challenging in general. This paper describes new random-weight importance sampling and sequential Monte Carlo methods for estimating BFs that use simulation to circumvent the evaluation of the intractable likelihood, and compares them to existing methods. In some cases we observe an advantage in the use of biased weight estimates. An initial investigation into the theoretical and empirical properties of this class of methods is presented; it lends some support to the use of biased estimates, but we advocate caution in their use.
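The core trick behind random-weight importance sampling, that an unbiased estimate of an intractable likelihood can replace the exact value inside an importance-sampling estimator of the evidence without introducing bias, can be checked on a conjugate toy model (everything below, including the latent-variable likelihood estimator, is a made-up illustration, not the paper's method):

```python
import numpy as np
from math import sqrt, pi, exp

rng = np.random.default_rng(3)

def normpdf(y, mu, sd):
    return exp(-0.5 * ((y - mu) / sd) ** 2) / (sd * sqrt(2 * pi))

y = 1.2  # observed datum; model: y ~ N(theta, 1) with prior theta ~ N(0, 1)

def likelihood_estimate(theta, m=10):
    # Pretend N(y | theta, 1) is intractable and estimate it unbiasedly by
    # latent-variable augmentation: N(y|theta,1) = E_x[N(y|x, 0.6)] with
    # x ~ N(theta, 0.8), since 0.8**2 + 0.6**2 = 1.
    xs = rng.normal(theta, 0.8, size=m)
    return np.mean([normpdf(y, x, 0.6) for x in xs])

# Importance sampling from the prior with random (estimated) weights:
# the evidence estimate stays unbiased because each weight is unbiased.
thetas = rng.normal(0.0, 1.0, 20000)
Z_hat = np.mean([likelihood_estimate(t) for t in thetas])
Z_true = normpdf(y, 0.0, sqrt(2.0))  # analytic evidence for this conjugate toy
print(Z_hat, Z_true)
```

Running two such estimators for competing models and taking their ratio gives a Bayes factor estimate; the variance (not the bias) is what the choice of weight estimator affects.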

    ABCtoolbox: a versatile toolkit for approximate Bayesian computations

    BACKGROUND: The estimation of demographic parameters from genetic data often requires the computation of likelihoods. However, the likelihood function is computationally intractable for many realistic evolutionary models, and the use of Bayesian inference has therefore been limited to very simple models. The situation changed recently with the advent of Approximate Bayesian Computation (ABC) algorithms allowing one to obtain parameter posterior distributions based on simulations not requiring likelihood computations. RESULTS: Here we present ABCtoolbox, a series of open-source programs to perform Approximate Bayesian Computations (ABC). It implements various ABC algorithms including rejection sampling, MCMC without likelihood, a particle-based sampler, and ABC-GLM. ABCtoolbox is bundled with, but not limited to, a program that allows parameter inference in a population genetics context and the simultaneous use of different types of markers with different ploidy levels. In addition, ABCtoolbox can also interact with most simulation and summary-statistics computation programs. The usability of ABCtoolbox is demonstrated by inferring the evolutionary history of two evolutionary lineages of Microtus arvalis. Using nuclear microsatellites and mitochondrial sequence data in the same estimation procedure enabled us to infer sex-specific population sizes and migration rates and to find that males show smaller population sizes but much higher levels of migration than females. CONCLUSION: ABCtoolbox allows a user to perform all the necessary steps of a full ABC analysis, from sampling parameters from prior distributions, through data simulation, computation of summary statistics, estimation of posterior distributions, model choice, and validation of the estimation procedure, to visualization of the results.
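The simplest algorithm in that family, plain rejection sampling, can be sketched in a few lines (the model and summary statistic here are invented placeholders, not ABCtoolbox's interface): draw parameters from the prior, simulate data, and keep only those draws whose simulated summary falls close to the observed one.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_summary(theta, n=50):
    # Hypothetical stand-in for a population-genetic simulator: the summary
    # statistic is the sample variance of n draws from N(0, theta^2).
    return rng.normal(0.0, theta, size=n).var()

obs_summary = 4.0                             # observed summary (true sigma = 2)
prior_draws = rng.uniform(0.1, 10.0, 20000)   # sigma ~ U(0.1, 10) prior
summaries = np.array([simulate_summary(t) for t in prior_draws])

# Rejection step: the retained prior draws approximate the posterior.
eps = 0.5
posterior = prior_draws[np.abs(summaries - obs_summary) <= eps]
print(len(posterior), posterior.mean())
```

The other listed algorithms (ABC-MCMC, particle samplers, ABC-GLM) exist mainly because this rejection step becomes hopelessly wasteful when the prior is diffuse or the summary is high-dimensional.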

    Analytical Framework for Identifying and Differentiating Recent Hitchhiking and Severe Bottleneck Effects from Multi-Locus DNA Sequence Data

    Hitchhiking and severe bottleneck effects shape the dynamics of genetic diversity of a population by inducing homogenization at a single locus and at the genome-wide scale, respectively. As a result, identifying and differentiating the signatures of such events from DNA sequence data at a single locus is challenging. This paper develops an analytical framework for identifying and differentiating recent homogenization events at multiple neutral loci in low-recombination regions. The dynamics of genetic diversity at a locus after a recent homogenization event is modeled according to the infinite-sites mutation model and the Wright-Fisher model of reproduction with constant population size. In this setting, I derive analytical expressions for the distribution, mean, and variance of the number of polymorphic sites in a random sample of DNA sequences from a locus affected by a recent homogenization event. Based on this framework, three likelihood-ratio tests are presented for identifying and differentiating recent homogenization events at multiple loci. Lastly, I apply the framework to two data sets. First, I consider human DNA sequences from four non-coding loci on different chromosomes to infer the evolutionary history of modern human populations. The results suggest, in particular, that recent homogenization events at the loci are identifiable when the effective human population size is 50,000 or greater, in contrast to 10,000, and that the estimated recent homogenization events agree with the "Out of Africa" hypothesis. Second, I use HIV DNA sequences from HIV-1-infected patients to infer the times of HIV seroconversions. The estimates are contrasted with estimates derived as the midpoint between the last HIV-negative and first HIV-positive screening tests. The results show that significant discrepancies can exist between the estimates.
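In the neutral Wright-Fisher/infinite-sites setting on which this framework builds, the expected number of polymorphic (segregating) sites has the classical Watterson form E[S] = theta * sum_{i=1}^{n-1} 1/i, which a short coalescent simulation can verify (this is a standard textbook sketch, not the paper's post-homogenization derivation):

```python
import numpy as np

rng = np.random.default_rng(2)

def segregating_sites(n, theta):
    # Coalescent under the infinite-sites model: while k lineages remain,
    # the time to the next coalescence is exponential with rate k(k-1)/2
    # (in units of 2N generations), and mutations fall on the k lineages
    # as a Poisson process of total rate k * theta / 2.
    s, k = 0, n
    while k > 1:
        t = rng.exponential(2.0 / (k * (k - 1)))   # scale = 1 / rate
        s += rng.poisson(k * theta / 2.0 * t)
        k -= 1
    return s

n, theta = 10, 5.0
sims = [segregating_sites(n, theta) for _ in range(5000)]
expected = theta * sum(1.0 / i for i in range(1, n))   # Watterson's E[S]
print(np.mean(sims), expected)
```

A recent homogenization event effectively restarts this process, which is why the distribution of S (not just its mean) carries information about the event's timing.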

    Modeling Evolutionary Dynamics of Epigenetic Mutations in Hierarchically Organized Tumors

    The cancer stem cell (CSC) concept is a highly debated topic in cancer research. While experimental evidence in favor of the cancer stem cell theory is apparently abundant, the results are often criticized as being difficult to interpret. An important reason for this is that most experimental data that support this model rely on transplantation studies. In this study we use a novel cellular Potts model to elucidate the dynamics of established malignancies that are driven by a small subset of CSCs. Our results demonstrate that epigenetic mutations that occur during mitosis display highly altered dynamics in CSC-driven malignancies compared to a classical, non-hierarchical model of growth. In particular, the heterogeneity observed in CSC-driven tumors is considerably higher. We speculate that this feature could be used in combination with epigenetic (methylation) sequencing studies of human malignancies to prove or refute the CSC hypothesis in established tumors without the need for transplantation. Moreover, our tumor-growth simulations indicate that CSC-driven tumors display evolutionary features that can be considered beneficial during tumor progression. Besides an increased heterogeneity, they also exhibit properties that allow clones to escape from local fitness peaks. This leads to more aggressive phenotypes in the long run and makes the neoplasm more adaptable to stringent selective forces such as cancer treatment. Indeed, when therapy is applied, the clone landscape of the regrown tumor is more aggressive than that of the primary tumor, whereas the classical model demonstrated similar patterns before and after therapy. Understanding these often counter-intuitive fundamental properties of (non-)hierarchically organized malignancies is a crucial step in validating the CSC concept as well as providing insight into the therapeutic consequences of this model.

    Cell Cycle Genes Are the Evolutionarily Conserved Targets of the E2F4 Transcription Factor

    Maintaining quiescent cells in G0 phase is achieved in part through the multiprotein subunit complex known as DREAM, and in human cell lines the transcription factor E2F4 directs this complex to its cell cycle targets. We found that E2F4 binds a highly overlapping set of human genes among three diverse primary tissues and an asynchronous cell line, which suggests that tissue-specific binding partners and chromatin structure have minimal influence on E2F4 targeting. To investigate the conservation of these transcription factor binding events, we identified the mouse genes bound by E2f4 in seven primary mouse tissues and a cell line. E2f4 bound a set of mouse genes that was common among mouse tissues, but largely distinct from the genes bound in human. The evolutionarily conserved set of E2F4-bound genes is highly enriched for functionally relevant regulatory interactions important for maintaining cellular quiescence. In contrast, we found minimal mRNA expression perturbations in this core set of E2f4-bound genes in the liver, kidney, and testes of E2f4-null mice. Thus, the regulatory mechanisms maintaining quiescence are robust even to complete loss of conserved transcription factor binding events.