
    A novel approach to simulate gene-environment interactions in complex diseases

    Background: Complex diseases are multifactorial traits caused by both genetic and environmental factors. They represent the major part of human diseases and include those with the largest prevalence and mortality (cancer, heart disease, obesity, etc.). Despite the large amount of information that has been collected about both genetic and environmental risk factors, there are few examples of studies on their interactions in the epidemiological literature. One reason may be incomplete knowledge of the power of statistical methods designed to search for risk factors and their interactions in these data sets. An improvement in this direction would lead to a better understanding and description of gene-environment interactions. To this aim, a possible strategy is to challenge the different statistical methods against data sets where the underlying phenomenon is completely known and fully controllable, for example simulated ones.
    Results: We present a mathematical approach that models gene-environment interactions. With this method it is possible to generate simulated populations having gene-environment interactions of any form, involving any number of genetic and environmental factors and also allowing non-linear interactions such as epistasis. In particular, we implemented a simple version of this model in the Gene-Environment iNteraction Simulator (GENS), a tool designed to simulate case-control data sets where a one gene-one environment interaction influences the disease risk. The main aim has been to allow the input of population characteristics using standard epidemiological measures and to implement constraints that make the simulator behaviour biologically meaningful.
    Conclusions: With the multi-logistic model implemented in GENS it is possible to simulate case-control samples of complex diseases where gene-environment interactions influence the disease risk. The user has full control of the main characteristics of the simulated population, and a Monte Carlo process allows for random variability. A knowledge-based approach reduces the complexity of the mathematical model by using reasonable biological constraints and makes the simulation more understandable in biological terms. Simulated data sets can be used for the assessment of novel statistical methods or for the evaluation of statistical power when designing a study.
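
    To make the kind of output such a simulator produces concrete, the sketch below draws a case-control sample from a logistic disease model with one genetic factor, one environmental exposure and a gene-environment interaction term. It is a minimal illustration, not the GENS implementation: the allele frequency, exposure prevalence, odds ratios and sample size are arbitrary values chosen for the example.

        import numpy as np

        rng = np.random.default_rng(0)

        # Assumed population parameters (illustrative only, not taken from GENS)
        n = 10_000             # individuals to simulate
        maf = 0.3              # frequency of the risk allele
        p_exposed = 0.4        # prevalence of the environmental exposure
        beta0 = -2.0           # baseline log-odds of disease
        beta_g = np.log(1.3)   # per-allele odds ratio
        beta_e = np.log(1.5)   # odds ratio for the exposure
        beta_ge = np.log(1.8)  # gene-environment interaction odds ratio

        # Genotype coded additively (0, 1 or 2 risk alleles) under Hardy-Weinberg equilibrium
        g = rng.binomial(2, maf, size=n)
        # Binary environmental exposure
        e = rng.binomial(1, p_exposed, size=n)

        # Logistic risk model with an interaction term
        logit = beta0 + beta_g * g + beta_e * e + beta_ge * g * e
        p_disease = 1.0 / (1.0 + np.exp(-logit))

        # Monte Carlo assignment of disease status, then a matched case-control sample
        status = rng.binomial(1, p_disease)
        cases = np.flatnonzero(status == 1)
        controls = rng.choice(np.flatnonzero(status == 0), size=cases.size, replace=False)
        print(f"{cases.size} cases and {controls.size} controls simulated")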

    SUP: an extension to SLINK to allow a larger number of marker loci to be simulated in pedigrees conditional on trait values

    BACKGROUND: With the recent advances in high-throughput genotyping technologies that allow for large-scale association mapping of human complex traits, promising statistical designs and methods have been emerging. Efficient simulation software is a key element in the evaluation of the properties of new statistical tests. SLINK is a flexible simulation tool that has been widely used to generate the segregation and recombination processes of markers linked to, and possibly associated with, a trait locus, conditional on trait values in arbitrary pedigrees. In practice, its most serious limitation is the small number of loci that can be simulated, since the complexity of the algorithm scales exponentially with this number.
    RESULTS: I describe the implementation of a two-step algorithm to be used in conjunction with SLINK to enable the simulation of a large number of marker loci linked to a trait locus and conditional on trait values in families, with the possibility for the loci to be in linkage disequilibrium. SLINK is used in the first step to simulate genotypes at the trait locus conditional on the observed trait values, and also to generate an indicator of the descent path of the simulated alleles. In the second step, marker alleles or haplotypes are generated in the founders, conditional on the trait locus genotypes simulated in the first step. The recombination process between the marker loci then takes place conditionally on the descent path and on the trait locus genotypes. This two-step implementation is often computationally faster than other software designed to generate marker data linked to, and possibly associated with, a trait locus.
    CONCLUSION: Because the proposed method uses SLINK to simulate the segregation process, it benefits from its flexibility: the trait may be qualitative, with the possibility of defining different liability classes (which allows for the simulation of gene-environment interactions or even of multi-locus effects between unlinked susceptibility regions), or it may be quantitative and normally distributed. In particular, this implementation is the only one available that can generate a large number of marker loci conditional on the set of observed quantitative trait values in pedigrees.
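
    The second step described above is essentially gene dropping along a fixed descent path: founder marker haplotypes are generated first, and each transmitted gamete then follows one parental haplotype, switching to the other between adjacent markers with the corresponding recombination fraction. The sketch below illustrates that transmission step only; it is a simplification in which founder haplotypes are drawn unconditionally and the descent indicators are assumed values rather than output from SLINK, and the function names are hypothetical.

        import numpy as np

        rng = np.random.default_rng(1)

        def drop_marker_gamete(parent_hap0, parent_hap1, first_choice, theta):
            """Transmit one gamete: start on the parental haplotype selected by the
            descent indicator `first_choice` (0 or 1) and switch haplotypes between
            adjacent markers with the recombination fractions in `theta`."""
            haps = (parent_hap0, parent_hap1)
            current = first_choice
            gamete = np.empty(len(parent_hap0), dtype=int)
            for m in range(len(parent_hap0)):
                if m > 0 and rng.random() < theta[m - 1]:
                    current = 1 - current      # recombination between markers m-1 and m
                gamete[m] = haps[current][m]
            return gamete

        # Illustrative setup: 5 biallelic markers, founder haplotypes drawn from an allele frequency
        n_markers, freq = 5, 0.4
        theta = np.full(n_markers - 1, 0.05)   # recombination fractions between adjacent markers
        father = [rng.binomial(1, freq, n_markers) for _ in range(2)]
        mother = [rng.binomial(1, freq, n_markers) for _ in range(2)]

        # The descent indicators (fixed here to 0 and 1) would come from the SLINK step
        child = np.vstack([drop_marker_gamete(*father, first_choice=0, theta=theta),
                           drop_marker_gamete(*mother, first_choice=1, theta=theta)])
        print(child)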

    Ion Trap in a Semiconductor Chip

    The electromagnetic manipulation of isolated atoms has led to many advances in physics, from laser cooling and Bose-Einstein condensation of cold gases to the precise quantum control of individual atomic ions. Work on miniaturizing electromagnetic traps to the micrometer scale promises even higher levels of control and reliability. Compared with 'chip traps' for confining neutral atoms, ion traps with similar dimensions and power dissipation offer much higher confinement forces and allow unparalleled control at the single-atom level. Moreover, ion microtraps are of great interest in the development of miniature mass spectrometer arrays, compact atomic clocks and, most notably, large-scale quantum information processors. Here we report the operation of a micrometer-scale ion trap, fabricated on a monolithic chip using semiconductor micro-electromechanical systems (MEMS) technology. We confine, laser cool, and measure heating of a single 111Cd+ ion in an integrated radiofrequency trap etched from a doped gallium arsenide (GaAs) heterostructure.
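
    The scaling behind the higher confinement forces of small traps can be seen in the standard pseudopotential description of a radiofrequency (Paul) trap; the expressions below are the textbook result, quoted up to geometry-dependent factors of order one, and are not specific to the device reported here.

        % Pseudopotential (secular) approximation for an RF quadrupole trap
        q \simeq \frac{2 Q V}{m \Omega^2 r_0^2}, \qquad
        \omega_{\mathrm{sec}} \simeq \frac{q\,\Omega}{2\sqrt{2}}
                              = \frac{Q V}{\sqrt{2}\, m\, \Omega\, r_0^2}

    Here Q and m are the ion's charge and mass, V and Omega the RF amplitude and drive frequency, and r_0 the characteristic ion-electrode distance. At fixed voltage and drive frequency the secular (confinement) frequency grows as 1/r_0^2, which is why shrinking a trap to micrometer dimensions stiffens the confinement so strongly.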

    Raman spectroscopy of GaSe and InSe post-transition metal chalcogenides layers

    III-VI post-transition metal chalcogenides (InSe and GaSe) are a new class of layered semiconductors, which feature a strong variation of the size and type of their band gaps as a function of the number of layers (N). Here, we investigate exfoliated layers of InSe and GaSe ranging from bulk crystals down to monolayer, encapsulated in hexagonal boron nitride, using Raman spectroscopy. We present the N-dependence of both the intralayer vibrations within each atomic layer and the interlayer shear and layer-breathing modes. A linear chain model can be used to describe the evolution of the peak positions as a function of N, consistent with first-principles calculations.
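
    The linear chain model mentioned above treats each layer as a rigid mass per unit area coupled to its neighbours by a single interlayer force constant. In its simplest textbook form (free boundary conditions, nearest-neighbour coupling only; conventions vary between papers), the interlayer shear or layer-breathing modes of an N-layer stack fall at

        \omega_i(N) = \omega_{\mathrm{bulk}} \sin\!\left(\frac{i\pi}{2N}\right),
        \qquad i = 1,\dots,N-1,
        \qquad \omega_{\mathrm{bulk}} = \frac{1}{\pi c}\sqrt{\frac{\alpha}{\mu}}

    where alpha is the relevant interlayer force constant per unit area, mu the mass per unit area of one layer, and c the speed of light when frequencies are expressed in wavenumbers. The highest branch, i = N-1, stiffens towards the bulk value as N grows, which is the N-dependence of the peak positions referred to in the abstract.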

    Quantum Criticality in Heavy Fermion Metals

    Quantum criticality describes the collective fluctuations of matter undergoing a second-order phase transition at zero temperature. Heavy fermion metals have in recent years emerged as prototypical systems to study quantum critical points. There have been considerable efforts, both experimental and theoretical, which use these magnetic systems to address problems that are central to the broad understanding of strongly correlated quantum matter. Here, we summarize some of the basic issues, including i) the extent to which the quantum criticality in heavy fermion metals goes beyond the standard theory of order-parameter fluctuations, ii) the nature of the Kondo effect in the quantum critical regime, iii) the non-Fermi liquid phenomena that accompany quantum criticality, and iv) the interplay between quantum criticality and unconventional superconductivity.

    Design Considerations for Massively Parallel Sequencing Studies of Complex Human Disease

    Massively Parallel Sequencing (MPS) allows sequencing of entire exomes and genomes to now be done at reasonable cost, and its utility for identifying genes responsible for rare Mendelian disorders has been demonstrated. However, for a complex disease, study designs need to accommodate substantial degrees of locus, allelic, and phenotypic heterogeneity, as well as complex relationships between genotype and phenotype. Such considerations include careful selection of samples for sequencing and a well-developed strategy for identifying the few “true” disease susceptibility genes from among the many irrelevant genes that will be found to harbor rare variants. To examine these issues we have performed simulation-based analyses in order to compare several strategies for MPS sequencing in complex disease. Factors examined include genetic architecture, sample size, the number and relationship of individuals selected for sequencing, and a variety of filters based on variant type, multiple observations of genes, and concordance of genetic variants within pedigrees. A two-stage design was assumed in which genes from the MPS analysis of high-risk families are evaluated in a secondary screening phase in a larger set of probands with more modest family histories. Designs were evaluated using a cost function that assumes the cost of sequencing the whole exome is 400 times that of sequencing a single candidate gene. Results indicate that while requiring variants to be identified in multiple pedigrees and/or in multiple individuals in the same pedigree is an effective strategy for reducing false positives, there is a danger of over-filtering, so that most true susceptibility genes are missed. In most cases, sequencing more than two individuals per pedigree results in reduced power without any benefit in terms of reduced overall cost. Further, our results suggest that although no single strategy is optimal, simulations can provide important guidelines for study design.
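
    The cost function described above can be written down directly: in units of the cost of sequencing one candidate gene in one individual, a whole exome costs 400 units, so a two-stage design costs the number of exomes sequenced in the families times 400, plus one unit per surviving candidate gene per stage-two proband. The sketch below encodes that comparison; the family counts, candidate-gene counts and stage-two sample size are made-up placeholders, not figures from the study.

        def two_stage_cost(n_families, seq_per_family, n_candidate_genes, n_stage2_probands,
                           exome_to_gene_ratio=400):
            """Total cost of a two-stage design, in units of the cost of sequencing
            one candidate gene in one individual.

            Stage 1: whole-exome sequencing of `seq_per_family` members in each of
            `n_families` high-risk families (one exome = `exome_to_gene_ratio` units).
            Stage 2: each candidate gene surviving the filters is sequenced in
            `n_stage2_probands` additional probands."""
            stage1 = n_families * seq_per_family * exome_to_gene_ratio
            stage2 = n_candidate_genes * n_stage2_probands
            return stage1 + stage2

        # Illustrative comparison of sequencing 2 vs 3 relatives per family; here the stricter
        # within-family concordance filter is assumed to leave fewer candidate genes.
        print(two_stage_cost(n_families=20, seq_per_family=2, n_candidate_genes=150, n_stage2_probands=1000))
        print(two_stage_cost(n_families=20, seq_per_family=3, n_candidate_genes=90, n_stage2_probands=1000))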

    Chronic non-specific low back pain - sub-groups or a single mechanism?

    Background: Low back pain is a substantial health problem and has subsequently attracted a considerable amount of research. Clinical trials evaluating the efficacy of a variety of interventions for chronic non-specific low back pain indicate limited effectiveness for most commonly applied interventions and approaches.
    Discussion: Many clinicians challenge the results of clinical trials, as they feel that this lack of effectiveness is at odds with their clinical experience of managing patients with back pain. A common explanation for this discrepancy is the perceived heterogeneity of patients with chronic non-specific low back pain. It is felt that the effects of treatment may be diluted by the application of a single intervention to a complex, heterogeneous group with diverse treatment needs. This argument presupposes that current treatment is effective when applied to the correct patient. An alternative perspective is that the clinical trials are correct and current treatments have limited efficacy. Preoccupation with sub-grouping may stifle engagement with this view, and it is important that the sub-grouping paradigm is closely examined. This paper argues that there are numerous problems with the sub-grouping approach and that it may not be an important reason for the disappointing results of clinical trials. We propose instead that current treatment may be ineffective because it has been misdirected. Recent evidence demonstrating changes within the brain in chronic low back pain sufferers raises the possibility that persistent back pain may be a problem of cortical reorganisation and degeneration. This perspective offers interesting insights into the chronic low back pain experience and suggests alternative models of intervention.
    Summary: The disappointing results of clinical research are commonly explained by the failure of researchers to adequately attend to sub-grouping of the chronic non-specific low back pain population. Alternatively, current approaches may be ineffective, and clinicians and researchers may need to radically rethink the nature of the problem and how it should best be managed.

    Lumbar spine and total-body dual-energy X-ray absorptiometry in children with severe neurological impairment and intellectual disability: a pilot study of artefacts and disrupting factors

    Background: Children with severe neurological impairment and intellectual disability (ID) are susceptible to developing low bone mineral density (BMD) and fractures. BMD is generally measured with dual-energy X-ray absorptiometry (DXA).
    Objective: To describe the occurrence of factors that may influence the feasibility of DXA and the accuracy of DXA outcome in children with severe neurological impairment and ID.
    Materials and methods: Based on literature and expert opinion, a list of disrupting factors was developed. The occurrence of these factors was assessed in 27 children who underwent DXA measurement.
    Results: The disrupting factors that occurred most frequently were movement during measurement (82%), aberrant body composition (67%), small length for age (56%) and scoliosis (37%). The mean number of disrupting factors per child was 5.3 (range 1-8). No correlation was found between DXA outcomes and the number of disrupting factors.
    Conclusion: Factors that may negatively influence the accuracy of DXA outcome are frequently present in children with severe neurological impairment and ID. No systematic deviation of DXA outcome in relation to the number of disrupting factors was found, but physicians should be aware of the possible influence of disrupting factors on the accuracy of DXA.

    Nonspecific synaptic plasticity improves the recognition of sparse patterns degraded by local noise

    Many forms of synaptic plasticity require the local production of volatile or rapidly diffusing substances such as nitric oxide. The nonspecific plasticity these neuromodulators may induce at neighboring non-active synapses is thought to be detrimental to the specificity of memory storage. We show here that memory retrieval may benefit from this non-specific plasticity when the applied sparse binary input patterns are degraded by local noise. Simulations of a biophysically realistic model of a cerebellar Purkinje cell in a pattern recognition task show that, in the absence of noise, leakage of plasticity to adjacent synapses degrades the recognition of sparse static patterns. However, above a local noise level of 20%, the model with nonspecific plasticity outperforms the standard, specific model. The gain in performance is greatest when the spatial distribution of noise in the input matches the range of diffusion-induced plasticity. Hence non-specific plasticity may offer a benefit in noisy environments or when the pressure to generalize is strong.
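
    A much-reduced abstraction of this setup is sketched below: a linear readout of sparse binary patterns, an LTD-like learning rule whose depression can leak to neighbouring synapses, and test patterns corrupted by spatially local noise. It is not the biophysical Purkinje cell model used in the paper, and all parameter values are arbitrary choices for illustration; it only shows how "nonspecific" plasticity and local noise enter such a simulation.

        import numpy as np

        rng = np.random.default_rng(2)

        n_syn, n_active = 1000, 50      # synapses, and active inputs per sparse pattern
        n_stored, n_novel = 50, 50      # patterns to store and novel patterns to compare

        def sparse_patterns(n_patterns):
            pats = np.zeros((n_patterns, n_syn))
            for p in pats:
                p[rng.choice(n_syn, n_active, replace=False)] = 1.0
            return pats

        stored, novel = sparse_patterns(n_stored), sparse_patterns(n_novel)

        def learn_weights(spread):
            """LTD-like rule: depress synapses active in the stored patterns; `spread` > 0
            leaks part of the depression to immediate neighbours (nonspecific plasticity)."""
            depression = stored.sum(axis=0)
            if spread > 0:
                depression = depression + spread * (np.roll(depression, 1) + np.roll(depression, -1))
            return np.clip(1.0 - 0.02 * depression, 0.0, 1.0)

        def add_local_noise(pattern, n_sites=5, width=10):
            """Flip all inputs inside a few randomly placed windows, mimicking spatially local noise."""
            noisy = pattern.copy()
            for start in rng.integers(0, n_syn - width, size=n_sites):
                noisy[start:start + width] = 1.0 - noisy[start:start + width]
            return noisy

        def separation(weights):
            """Stored patterns should drive the readout less than novel ones; a larger gap
            between the mean responses means easier recognition."""
            r_stored = [weights @ add_local_noise(p) for p in stored]
            r_novel = [weights @ add_local_noise(p) for p in novel]
            return np.mean(r_novel) - np.mean(r_stored)

        for spread in (0.0, 0.5):
            print(f"plasticity spread {spread}: response gap = {separation(learn_weights(spread)):.1f}")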