3,743 research outputs found

    Disentangling the effects of terroir, season, and vintage on the grapevine fungal pathobiome

    Get PDF
    The composition, diversity and dynamics of microbial communities associated with grapevines may be influenced by various environmental factors, including terroir, vintage, and season. Among these factors, terroir stands out as a possible unique determinant of the pathobiome, the community of plant-associated pathogens. This study employed high-throughput molecular techniques, including metabarcoding and network analysis, to investigate the compositional dynamics of the grapevine fungal pathobiome across three microhabitats (soil, woody tissue, and bark) using the Furmint cultivar. Samples were collected during late winter and late summer in 2020 and 2021, across three distinct terroirs in Hungary’s Tokaj wine region. Of the 123 plant pathogenic genera found, Diplodia, Phaeomoniella, and Fusarium displayed the highest richness in bark, wood, and soil, respectively. Both richness and abundance exhibited significant disparities across microhabitats: plant pathogenic fungi known to cause grapevine trunk diseases (GTDs) demonstrated the highest richness and abundance in wood and bark samples, whereas non-GTD pathogens prevailed in soil. Abundance and richness, however, followed distinct patterns. Terroir accounted for a substantial portion of the variance in fungal community composition, ranging from 14.46 to 24.67%. Season and vintage also contributed to the variation, explaining 1.84 to 2.98% and 3.67 to 6.39% of the variance, respectively. Notably, significant compositional differences in fungi between healthy and diseased grapevines were only identified in wood and bark samples. Co-occurrence network analysis, using both unweighted and weighted metrics, revealed intricate relationships among pathogenic fungal genera, involving mostly positive associations, potentially suggesting synergism, and a few negative relationships, potentially suggesting antagonistic interactions.
In essence, the observed differences among terroirs may stem from environmental filtering due to varied edaphic and mesoclimatic conditions, while temporal weather and vine management practices could explain seasonal and vintage fungal dynamics. This study provides insights into the compositional dynamics of the grapevine fungal pathobiome across different microhabitats, terroirs, seasons, and health statuses. The findings emphasize the importance of considering network-based approaches in studying microbial communities and have implications for developing improved viticultural plant health strategies.

    Climate Change and Critical Agrarian Studies

    Full text link
    Climate change is perhaps the greatest threat to humanity today and plays out as a cruel engine of myriad forms of injustice, violence and destruction. The effects of climate change from human-made emissions of greenhouse gases are devastating and accelerating, yet they are uncertain and uneven in terms of both geography and socio-economic impacts. Emerging from the dynamics of capitalism since the industrial revolution — as well as industrialisation under state-led socialism — the consequences of climate change are especially profound for the countryside and its inhabitants. The book interrogates the narratives and strategies that frame climate change and examines the institutionalised responses in agrarian settings, highlighting what exclusions and inclusions result. It explores how different people — in relation to class and other co-constituted axes of social difference such as gender, race, ethnicity, age and occupation — are affected by climate change, as well as the climate adaptation and mitigation responses being implemented in rural areas. The book in turn explores how climate change, and the responses to it, affect processes of social differentiation, trajectories of accumulation and, in turn, agrarian politics. Finally, the book examines what strategies are required to confront climate change, and the underlying political-economic dynamics that cause it, reflecting on what this means for agrarian struggles across the world. The 26 chapters in this volume explore how the relationship between capitalism and climate change plays out in the rural world and, in particular, the way agrarian struggles connect with the huge challenge of climate change. Through a wide variety of case studies alongside more conceptual chapters, the book makes the often-missing connection between climate change and critical agrarian studies, arguing that making the connection between climate and agrarian justice is crucial.

    Inter-individual variation of the human epigenome & applications

    Get PDF
    Genome-wide association studies (GWAS) have led to the discovery of genetic variants influencing human phenotypes in health and disease. However, almost two decades later, most human traits still cannot be accurately predicted from common genetic variants. Moreover, genetic variants discovered via GWAS mostly map to the non-coding genome and have historically resisted interpretation via mechanistic models. The epigenome, by contrast, lies at the crossroads between genetics and the environment. Thus, there is great excitement towards the mapping of epigenetic inter-individual variation, since its study may link environmental factors to human traits that remain unexplained by genetic variants. For instance, the environmental component of the epigenome may serve as a source of biomarkers for accurate, robust and interpretable phenotypic prediction on low-heritability traits that cannot be attained by classical genetics-based models. Additionally, its research may provide mechanisms of action for genetic associations at non-coding regions that mediate their effect via the epigenome. The aim of this thesis was to explore epigenetic inter-individual variation and to mitigate some of the methodological limitations faced towards its future valorisation.
Chapter 1 is dedicated to the scope and aims of the thesis. It begins by describing historical milestones and basic concepts in human genetics, statistical genetics, the heritability problem and polygenic risk scores. It then moves towards epigenetics, covering the several dimensions it encompasses. It subsequently focuses on DNA methylation, with topics such as mitotic stability, epigenetic reprogramming, X-inactivation and imprinting. This is followed by concepts from epigenetic epidemiology such as epigenome-wide association studies (EWAS), epigenetic clocks, Mendelian randomization, methylation risk scores and methylation quantitative trait loci (mQTL).
The chapter ends by introducing the aims of the thesis.
Chapter 2 focuses on stochastic epigenetic inter-individual variation resulting from processes occurring post-twinning, during embryonic development and early life. Specifically, it describes the discovery and characterisation of hundreds of variably methylated CpGs in the blood of healthy adolescent monozygotic (MZ) twins showing equivalent variation among co-twins and unrelated individuals (evCpGs), which could not be explained by measurement error on the DNA methylation microarray alone. DNA methylation levels at evCpGs were shown to be stable in the short term but susceptible to ageing and epigenetic drift in the long term. The identified sites were significantly enriched at the clustered protocadherin loci, known for stochastic methylation in neurons in the context of embryonic neurodevelopment. Critically, evCpGs were capable of clustering technical and longitudinal replicates while differentiating young MZ twins. Thus, the discovered evCpGs can be considered a first prototype of a universal epigenetic fingerprint, relevant to the discrimination of MZ twins for forensic purposes, which is currently impossible with standard DNA profiling.
In addition, DNA methylation microarrays are the preferred technology for EWAS and mQTL mapping studies. However, their probe design inherently assumes that the assayed genomic DNA is identical to the reference genome, leading to genetic artifacts whenever this assumption is not fulfilled. Building upon the previous experience analysing microarray data, Chapter 3 covers the development and benchmarking of UMtools, an R package for the quantification and qualification of genetic artifacts on DNA methylation microarrays based on the unprocessed fluorescence intensity signals. These tools were used to assemble an atlas of genetic artifacts encountered on DNA methylation microarrays, including interactions between artifacts or with X-inactivation, imprinting and tissue-specific regulation.
Additionally, to distinguish artifacts from genuine epigenetic variation, a co-methylation-based approach was proposed. Overall, this study revealed that genetic artifacts continue to filter through into the reported literature, since current methodologies to address them have overlooked this challenge.
Furthermore, EWAS, mQTL and allele-specific methylation (ASM) mapping studies have all been employed to map epigenetic variation, but they require matching phenotypic/genotypic data and can only map specific components of epigenetic inter-individual variation. Inspired by the previously proposed co-methylation strategy, Chapter 4 describes a novel method to simultaneously map inter-haplotype, inter-cell and inter-individual variation without these requirements. Specifically, the binomial likelihood function-based bootstrap hypothesis test for co-methylation within reads (Binokulars) is a randomization test that can identify jointly regulated CpGs (JRCs) from pooled whole-genome bisulfite sequencing (WGBS) data by relying solely on the joint DNA methylation information available in reads spanning multiple CpGs. Binokulars was tested on pooled WGBS data in whole blood, sperm and both combined, and benchmarked against EWAS and ASM. Our comparisons revealed that Binokulars can integrate a wide range of epigenetic phenomena under the same umbrella, since it simultaneously discovered regions associated with imprinting, cell type- and tissue-specific regulation, mQTL, ageing or even unknown epigenetic processes. Finally, we verified examples of mQTL and polymorphic imprinting by employing another novel tool, JRC_sorter, to classify regions based on epigenotype models and non-pooled WGBS data in cord blood.
In the future, we envision how this cost-effective approach can be applied to larger pools to simultaneously highlight regions of interest in the methylome, a highly relevant task in light of the post-GWAS era.
Moving towards future applications of epigenetic inter-individual variation, Chapters 5 and 6 are dedicated to solving some of the methodological issues faced in translational epigenomics. Firstly, due to its simplicity and well-known properties, linear regression is the starting-point methodology for predicting a continuous outcome from a set of predictors. However, linear regression is incompatible with missing data, a common phenomenon and a huge threat to the integrity of data analysis in the empirical sciences, including (epi)genomics. Chapter 5 describes the development of combinatorial linear models (cmb-lm), an imputation-free, CPU/RAM-efficient and privacy-preserving statistical method for linear regression prediction on datasets with missing values. Cmb-lm provide prediction errors that take into account the pattern of missing values in the incomplete data, even at extreme missingness. As a proof of concept, we tested cmb-lm in the context of epigenetic ageing clocks, one of the most popular applications of epigenetic inter-individual variation. Overall, cmb-lm offer a simple and flexible methodology with a wide range of applications that can provide a smooth transition towards the valorisation of linear models in the real world, where missing data are almost inevitable.
Beyond microarrays, due to its high accuracy, reliability and sample-multiplexing capabilities, massively parallel sequencing (MPS) is currently the preferred methodology for translating prediction models for traits of interest into practice. At the same time, tobacco smoking is a frequent habit, sustained by more than 1.3 billion people in 2020, and a leading (and preventable) health risk factor in the modern world.
Predicting smoking habits from a persistent biomarker, such as DNA methylation, is not only relevant to account for self-reporting bias in public health and personalized medicine studies, but may also allow the broadening of forensic DNA phenotyping. Previously, a model to predict whether someone is a current, former or never smoker had been published based on only 13 CpGs from the hundreds of thousands included in the DNA methylation microarray. However, a matching lab tool with lower marker throughput and higher accuracy and sensitivity was missing for translating the model into practice. Chapter 6 describes the development of an MPS assay and data analysis pipeline to quantify DNA methylation at these 13 smoking-associated biomarkers for the prediction of smoking status. Though our systematic evaluation on DNA standards of known methylation levels revealed marker-specific amplification bias, our novel tool was still able to provide highly accurate and reproducible DNA methylation quantification and smoking habit prediction. Overall, our MPS assay allows the technological transfer of DNA methylation microarray findings and models to practical settings, one step closer towards future applications.
Finally, Chapter 7 provides a general discussion of the results and topics covered across Chapters 2-6. It begins by summarizing the main findings of the thesis, including proposals for follow-up studies. It then covers technical limitations pertaining to bisulfite conversion and DNA methylation microarrays, as well as more general considerations such as restricted data access. The chapter ends by covering the outlook of this PhD thesis, including topics such as bisulfite-free methods, third-generation sequencing, single-cell methylomics, multi-omics and systems biology.
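The abstract describes cmb-lm only at a high level, but the core idea of imputation-free, pattern-aware linear prediction can be sketched. The toy below is an illustrative assumption, not the published cmb-lm method: for each incoming sample, an ordinary least-squares model is refit using only the predictors that sample actually has observed, and the training error of that reduced model serves as a pattern-specific uncertainty estimate. The function name `cmb_predict` and the toy data are invented.

```python
import numpy as np

def cmb_predict(X_train, y_train, x_new):
    """Predict y for x_new, which may contain NaNs, without imputation.

    Sketch only: refit ordinary least squares on the predictors observed
    in x_new, so the reported RMSE reflects that missingness pattern.
    """
    obs = ~np.isnan(x_new)                      # observed-predictor pattern
    A = np.column_stack([np.ones(len(X_train)), X_train[:, obs]])
    coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)
    resid = y_train - A @ coef
    rmse = np.sqrt(np.mean(resid ** 2))         # pattern-specific error
    pred = coef[0] + x_new[obs] @ coef[1:]
    return pred, rmse

# Toy usage: y = 2*x0 + 3*x1 with a little noise
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = 2 * X[:, 0] + 3 * X[:, 1] + 0.01 * rng.normal(size=200)
print(cmb_predict(X, y, np.array([1.0, 1.0])))      # both predictors seen
print(cmb_predict(X, y, np.array([1.0, np.nan])))   # x1 missing: larger error
```

With both predictors observed the error estimate is tiny; with x1 missing the same call still returns a prediction, but the reported error grows to reflect the information lost to that missingness pattern.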

    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    Get PDF
    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    Mapping the Focal Points of WordPress: A Software and Critical Code Analysis

    Get PDF
    Programming languages or code can be examined through numerous analytical lenses. This project is a critical analysis of WordPress, a prevalent web content management system, applying four modes of inquiry. The project draws on theoretical perspectives and areas of study in media, software, platforms, code, language, and power structures. The applied research is based on Critical Code Studies, an interdisciplinary field of study that holds the potential as a theoretical lens and methodological toolkit to understand computational code beyond its function. The project begins with a critical code analysis of WordPress, examining its origins and source code and mapping selected vulnerabilities. An examination of the influence of digital and computational thinking follows this. The work also explores the intersection of code patching and vulnerability management and how code shapes our sense of control, trust, and empathy, ultimately arguing that a rhetorical-cultural lens can be used to better understand code's controlling influence. Recurring themes throughout these analyses and observations are the connections to power and vulnerability in WordPress' code and how cultural, processual, rhetorical, and ethical implications can be expressed through its code, creating a particular worldview. Code's emergent properties help illustrate how human values and practices (e.g., empathy, aesthetics, language, and trust) become encoded in software design and how people perceive the software through its worldview. These connected analyses reveal cultural, processual, and vulnerability focal points and the influence these entanglements have concerning WordPress as code, software, and platform. WordPress is a complex sociotechnical platform worthy of further study, as is the interdisciplinary merging of theoretical perspectives and disciplines to critically examine code. 
Ultimately, this project helps further enrich the field by introducing focal points in code, examining sociocultural phenomena within the code, and offering techniques to apply critical code methods.

    Trade-off between Noise and Banding in a Quantum Adder with Qudits

    Full text link
    Quantum addition based on the quantum Fourier transform can be an integral part of a quantum circuit and has proved more efficient than the existing classical ripple-carry adder. Our study identifies the quantum resources required by a quantum adder in an arbitrary dimension and their relationship with performance in the presence of local noise acting on the circuit and when a limited number of controlled rotation operations is permitted, a procedure known as banding. We analytically prove an upper bound on the number of controlled rotation gates required to accomplish quantum addition up to an arbitrary defect in the fidelity between the desired and imperfect outputs. When the environment interacts with individual qudits, we establish a connection between quantum coherence and the fidelity of the output. Interestingly, we demonstrate that when banding is employed in the presence of noise, approximate circuits of constant depth outperform circuits with a higher number of controlled rotations, establishing a complementary relationship between the approximate quantum adder and the strength of the noise. We show that using magnetic fields to prepare an initial state that evolves according to a one-dimensional spin chain for a specific amount of time can be a potential technique for implementing quantum addition circuits in many-body systems.
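The fidelity cost of banding can be illustrated with a small classical simulation. The sketch below (an illustration, not the paper's construction) simulates Draper-style QFT addition of a classical constant b to a qubit register and models banding by discarding every controlled rotation whose angle is 2*pi/2^k with k above a cutoff `band`; the cutoff convention and the function name are assumptions.

```python
import numpy as np

def qft_add(a, b, n, band):
    """Simulate Draper-style QFT addition of classical b to |a> on n qubits.

    Banding: keep only rotations with angle 2*pi/2^k for k <= band.
    band = n keeps every rotation and reproduces exact modular addition.
    Returns the fidelity with the ideal output |a+b mod 2^n>.
    """
    N = 2 ** n
    psi = np.zeros(N, dtype=complex)
    psi[a] = 1.0
    phi = np.fft.ifft(psi) * np.sqrt(N)          # QFT of |a>
    # Accumulate, per Fourier-basis state j, the phases of surviving rotations.
    theta = np.zeros(N)
    for j in range(N):
        for m in range(n):                        # target qubit (weight 2^m)
            if not (j >> m) & 1:
                continue
            for l in range(n):                    # bit l of the addend b
                if (b >> l) & 1 and n - m - l <= band:
                    theta[j] += 2 * np.pi * 2 ** (m + l) / N
    out = np.fft.fft(phi * np.exp(1j * theta)) / np.sqrt(N)  # inverse QFT
    return abs(out[(a + b) % N]) ** 2

print(qft_add(5, 6, 4, band=4))   # all rotations kept: exact addition
print(qft_add(5, 6, 4, band=2))   # banded: fidelity drops below 1
```

Because the adder is diagonal in the Fourier basis, dropping the fine-angle rotations leaves a residual phase error on part of the basis states, which is exactly what lowers the output fidelity; under noise, the abstract notes, this loss can be outweighed by the shallower circuit depth.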

    An empirical investigation of the relationship between integration, dynamic capabilities and performance in supply chains

    Get PDF
    This research aimed to develop an empirical understanding of the relationships between integration, dynamic capabilities and performance in the supply chain domain, based on which two conceptual frameworks were constructed to advance the field. The core motivation for the research was that, at the time of writing, the combined relationship between the three concepts had not yet been examined, although their interrelationships had been studied individually. To achieve this aim, deductive and inductive reasoning were used to guide the qualitative study, which was undertaken via multiple case studies to investigate lines of enquiry that would address the research questions. This is consistent with the author’s philosophical adoption of the ontology of relativism and the epistemology of constructionism, which was considered appropriate for addressing the research questions. Empirical data and evidence were collected, and various triangulation techniques were employed to ensure their credibility. Key features of grounded theory coding techniques were drawn upon for data coding and analysis, generating two levels of findings. These revealed that whilst integration and dynamic capabilities were crucial in improving performance, performance also informed the former, reflecting a cyclical and iterative relationship rather than a purely linear one. Adopting a holistic approach towards the relationship was key to producing complementary strategies that can deliver sustainable supply chain performance. The research makes theoretical, methodological and practical contributions to the field of supply chain management. The theoretical contribution includes the development of two emerging conceptual frameworks at the micro and macro levels. The former provides greater specificity, as it allows meta-analytic evaluation of the three concepts and their dimensions, providing detailed insight into their correlations. 
The latter gives a holistic view of their relationships and how they are connected, reflecting a middle-range theory that bridges theory and practice. The methodological contribution lies in presenting models that address gaps associated with the inconsistent use of terminology in philosophical assumptions and the lack of rigour in deploying case study research methods. In terms of its practical contribution, this research offers insights that practitioners could adopt to enhance their performance. They can do so by using targeted integrative strategies and drawing on their dynamic capabilities, without necessarily having to forgo certain desired outcomes.

    Making Connections: A Handbook for Effective Formal Mentoring Programs in Academia

    Get PDF
    This book, Making Connections: A Handbook for Effective Formal Mentoring Programs in Academia, makes a unique and needed contribution to the mentoring field, as it focuses solely on mentoring in academia. This handbook is a collaborative institutional effort between Utah State University’s (USU) Empowering Teaching Open Access Book Series and the Mentoring Institute at the University of New Mexico (UNM). The book is available (a) as an e-book through Pressbooks, (b) as a downloadable PDF on USU’s Open Access Book Series website, and (c) as a print version available for purchase on the USU Empowering Teaching Open Access page and on Amazon.

    Towards an integrated vulnerability-based approach for evaluating, managing and mitigating earthquake risk in urban areas

    Get PDF
    Doctoral thesis in Civil Engineering. Strong seismic events like those in Türkiye-Syria (2023) or Mexico (2017) should draw our attention to the design and implementation of proactive actions aimed at identifying vulnerable assets. This work proposes a suitable and easy-to-implement workflow for performing large-scale seismic vulnerability assessments in historic environments by means of digital tools. A parameter-based vulnerability model is adopted, given its affinity with the Mexican Catalogue of Historical Monuments. A first large-scale implementation of this method in the historic city of Atlixco (Puebla, Mexico) demonstrated its suitability and some limitations, which led to the development of a strategy for quantifying and incorporating the epistemic uncertainties encountered during data acquisition. Given the volume of data these analyses involve, it was necessary to develop robust data acquisition, storage and management strategies. The use of Geographical Information System environments, together with customised Python-based programs and cloud-based file distribution, made it possible to assemble urban-scale databases that facilitate field data acquisition, vulnerability and damage calculations, and the representation of outcomes. This development was the basis for a second large-scale assessment in selected municipalities of the state of Morelos (Mexico). The characterisation of the seismic vulnerability of more than 160 buildings made it possible to assess the representativeness of the parametric vulnerability approach by comparing theoretical damage estimations against the damage observed after the Puebla-Morelos 2017 earthquakes. This comparison was the basis for a machine-learning-assisted process of calibration and adjustment, representing a feasible strategy for calibrating such vulnerability models using machine-learning algorithms and the empirical evidence of damage in post-seismic scenarios.
This work was partly financed by FCT/MCTES through national funds (PIDDAC) under the R&D Unit Institute for Sustainability and Innovation in Structural Engineering (ISISE), reference UIDB/04029/2020. This research had financial support provided by the Portuguese Foundation for Science and Technology (FCT) through the Analysis and Mitigation of Risks in Infrastructures (InfraRisk) program under the PhD grant PD/BD/150385/2019.
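The calibration-against-observed-damage idea can be illustrated with a deliberately simplified stand-in. Assume (as parametric vulnerability methods typically do) that the vulnerability index is a weighted sum of per-building parameter scores; the weights can then be re-fit against observed post-earthquake damage grades. The sketch below uses plain least squares as a minimal stand-in for the machine-learning step the thesis describes, and every score, weight and damage grade is invented for illustration.

```python
import numpy as np

# Invented survey: 160 buildings scored 0-3 on 5 vulnerability parameters.
rng = np.random.default_rng(1)
n_buildings, n_params = 160, 5
scores = rng.integers(0, 4, size=(n_buildings, n_params)).astype(float)

# "True" parameter weights (unknown in practice) generate observed damage
# grades with noise, standing in for post-earthquake field observations.
true_w = np.array([1.5, 0.5, 1.0, 2.0, 0.25])
damage = scores @ true_w + rng.normal(0, 0.5, n_buildings)

# Re-calibrate the weights from the observed damage by least squares.
w_fit, *_ = np.linalg.lstsq(scores, damage, rcond=None)
print(np.round(w_fit, 2))
```

In practice the thesis pairs this kind of empirical evidence with machine-learning algorithms rather than a linear fit, but the workflow is the same: predicted damage from the parametric model is confronted with observed damage, and the model's weights are adjusted to close the gap.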