
    Unraveling the effect of sex on human genetic architecture

    Sex is arguably the most important differentiating characteristic in most mammalian species, separating populations into different groups with varying behaviors, morphologies, and physiologies based on their complement of sex chromosomes, amongst other factors. In humans, despite males and females sharing nearly identical genomes, there are differences between the sexes in complex traits and in the risk of a wide array of diseases. Sex provides the genome with a distinct hormonal milieu, differential gene expression, and environmental pressures arising from gendered societal roles. This raises the possibility of gene-by-sex (GxS) interactions that may contribute to some of the phenotypic differences observed between the sexes. In recent years there has been growing evidence of GxS, with common genetic variation showing different effects in males and females. These studies have, however, been limited with regard to the number of traits studied and/or statistical power. Understanding sex differences in genetic architecture is of great importance: it could improve our understanding of potential differences in underlying biological pathways and disease etiology between the sexes, and in turn help inform personalised treatments and precision medicine. In this thesis we provide insights into both the scope and the mechanism of GxS across the genomes of circa 450,000 individuals of European ancestry and 530 complex traits in the UK Biobank. We found small yet widespread differences in genetic architecture across traits through the calculation of sex-specific heritability, genetic correlations, and sex-stratified genome-wide association studies (GWAS). We further investigated whether sex-agnostic (non-stratified) efforts could be missing information of interest, including sex-specific trait-relevant loci and increased phenotype prediction accuracies. Finally, we studied the potential functional role of sex differences in genetic architecture through sex-biased expression quantitative trait loci (eQTL) and gene-level analyses. Overall, this study marks a broad examination of the genetics of sex differences. Our findings parallel previous reports, suggesting the presence of sexual genetic heterogeneity across complex traits of generally modest magnitude. Furthermore, our results suggest the need to consider sex-stratified analyses in future studies in order to shed light on possible sex-specific molecular mechanisms.
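
    As a hedged illustration of the kind of sex-stratified comparison described above (not a method taken from the thesis), the sketch below contrasts per-variant effect sizes from male and female GWAS summary statistics using a simple two-sample z-test; the column names and toy numbers are assumptions.

        # Hypothetical sketch: test each variant for a difference in effect size
        # between male and female GWAS strata (a common screen for GxS effects).
        import numpy as np
        import pandas as pd
        from scipy import stats

        def gxs_z_test(male: pd.DataFrame, female: pd.DataFrame) -> pd.DataFrame:
            """Compare per-SNP effects (beta, se) between independent sex strata."""
            merged = male.merge(female, on="snp", suffixes=("_m", "_f"))
            z = (merged["beta_m"] - merged["beta_f"]) / np.sqrt(
                merged["se_m"] ** 2 + merged["se_f"] ** 2
            )
            merged["z_diff"] = z
            merged["p_diff"] = 2 * stats.norm.sf(np.abs(z))  # two-sided p-value
            return merged

        # Toy summary statistics (not real UK Biobank results):
        males = pd.DataFrame({"snp": ["rs1", "rs2"], "beta": [0.10, 0.02], "se": [0.02, 0.02]})
        females = pd.DataFrame({"snp": ["rs1", "rs2"], "beta": [0.03, 0.02], "se": [0.02, 0.02]})
        print(gxs_z_test(males, females)[["snp", "z_diff", "p_diff"]])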

    How to Be a God

    When it comes to questions concerning the nature of Reality, Philosophers and Theologians have the answers. Philosophers have the answers that can’t be proven right. Theologians have the answers that can’t be proven wrong. Today’s designers of Massively-Multiplayer Online Role-Playing Games create realities for a living. They can’t spend centuries mulling over the issues: they have to face them head-on. Their practical experiences can indicate which theoretical proposals actually work in practice. That’s today’s designers. Tomorrow’s will have a whole new set of questions to answer. The designers of virtual worlds are the literal gods of those realities. Suppose Artificial Intelligence comes through and allows us to create non-player characters as smart as us. What are our responsibilities as gods? How should we, as gods, conduct ourselves? How should we be gods?

    BECOMEBECOME - A TRANSDISCIPLINARY METHODOLOGY BASED ON INFORMATION ABOUT THE OBSERVER

    The present research dissertation has been developed with the intention to provide practical strategies and discover new intellectual operations which can be used to generate Transdisciplinary insight. For this reason, this thesis creates access to new knowledge at different scales. Firstly, as it pertains to the scale of new knowledge generated by those who attend Becomebecome events. The open-source nature of the Becomebecome methodology makes it possible for participants in Becomebecome workshops, training programmes and residencies to generate new insight about the specific project they are working on, which then reinforces and expands the foundational principles of the theoretical background. Secondly, as it pertains to the scale of the Becomebecome framework, which remains independent of location and moment in time. The method proposed to access Transdisciplinary knowledge constitutes new knowledge in itself because the sequence of activities, described as physical and mental procedures and listed as essential criteria, has never been found organised in such a specific order before. It is indeed the order in time, i.e. the sequence of the ideas and activities proposed, which allows one to transform Disciplinary knowledge via a new Transdisciplinary frame of reference. Lastly, new knowledge about Transdisciplinarity as a field of study is created as a consequence of the two processes listed above. The first part of the thesis is designated ‘Becomebecome Theory’ and focuses on the theoretical background and the intellectual operations necessary to support the creation of new Transdisciplinary knowledge. The second part of the thesis is designated ‘Becomebecome Practice’ and provides practical examples of the application of such operations. Crucially, the theoretical model described as the foundation for the Becomebecome methodology (Becomebecome Theory) is process-based and constantly checked against the insight generated through Becomebecome Practice. To this effect, ‘information about the observer’ is proposed as a key notion which binds together Transdisciplinary resources from several studies in the hard sciences and humanities. It is a concept that enables understanding about why and how information that is generated through Becomebecome Practice is considered of paramount importance for establishing the reference parameters necessary to access Transdisciplinary insight which is meaningful to a specific project, a specific person, or a specific moment in time.

    The labour supply and retirement of older workers: an empirical analysis

    This thesis examines the labour supply of older workers, their movement into retirement, and any movement out of retirement and back into work. In particular, labour force participation, labour supply, and the wage and other-income elasticities of work hours are estimated for older workers and compared to those of younger workers. The thesis goes on to look at the movement into retirement for older workers as a whole by examining cohorts by gender, wave, and age. The thesis also presents a descriptive and quantitative examination of the changes in income and happiness that occur as an individual retires. Finally, the thesis examines the reasons why an individual may return to work from retirement. The findings suggest that younger workers are significantly more responsive to wage and household income changes than older workers.
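
    As a hedged sketch of how such an elasticity might be estimated (not the thesis's specification or data), the toy log-log regression below reads the coefficient on log wages as the wage elasticity of hours; the variable names and synthetic data are assumptions.

        # Hypothetical sketch: wage elasticity of work hours from a log-log OLS fit.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 500
        df = pd.DataFrame({
            "log_wage": rng.normal(2.5, 0.4, n),          # log hourly wage
            "other_income": rng.normal(200.0, 50.0, n),   # weekly non-labour income
            "age": rng.integers(50, 70, n),
        })
        # Synthetic outcome with a built-in wage elasticity of roughly 0.15.
        df["log_hours"] = (3.0 + 0.15 * df["log_wage"]
                           - 0.001 * df["other_income"] + rng.normal(0, 0.2, n))

        model = smf.ols("log_hours ~ log_wage + other_income + age", data=df).fit()
        # The log_wage coefficient is interpreted as the wage elasticity of hours.
        print(model.params[["log_wage", "other_income"]])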

    The empty space in abstract photography: a psychoanalytical perspective

    The aim of the research that this thesis is based on is to explore the theoretical problems raised by the concept of photographic abstraction. These consist in the tension between the two aspects of the photographic sign, the indexical and the iconic, and are examined in the context of the particular exploration of the empty space in abstract photography which I have pursued through my practice. The investigation draws mainly upon the psychoanalytic theory of transitional phenomena as proposed by Winnicott, as well as other art theories of abstraction (Deleuze & Guattari, Ehrenzweig, Fer, Fuller, Greenberg, Joselit, Kuspit, Leider, Worringer). It explores the relationship of the abstract photographic image to notions of exteriority and interiority as these relate to the transition from the unconscious to conscious reality. The development of this research suggests the psychoanalytical concept of potential space as a contribution to an aesthetic model of abstraction. This concept is employed as a methodological tool in the development of the practical work and creates a framework for its interpretation. The concept of potential space is based on Winnicott's ideas around "playing with the real" in an intermediate area of experience between internal and external reality, where creativity originates as a zone of fictive play that facilitates the subject's journey from "what is subjectively conceived of" to "what is objectively perceived". The outcome of this investigation is the production of a series of photographs describing an empty abstract space, one that is invested with a psychic dimension that produces the effect of ambiguity between its representational and abstract readings. It provides a redefinition of abstraction in a space of tension between the iconic and indexical aspects of the sign and opens up the space of abstraction in photography as one in which the relationship between inner and outer reality can be performed and can become a space of action and intervention.

    Anytime algorithms for ROBDD symmetry detection and approximation

    Reduced Ordered Binary Decision Diagrams (ROBDDs) provide a dense and memory-efficient representation of Boolean functions. When ROBDDs are applied in logic synthesis, the problem arises of detecting both classical and generalised symmetries. The state of the art in symmetry detection is Mishchenko's algorithm, which shows how to detect symmetries in ROBDDs without the need to check equivalence of all co-factor pairs. This work resulted in a practical algorithm for detecting all classical symmetries in an ROBDD in O(|G|³) set operations, where |G| is the number of nodes in the ROBDD. Mishchenko and his colleagues subsequently extended the algorithm to find generalised symmetries, and the extended algorithm retains the same asymptotic complexity for each type of generalised symmetry. Both the classical and generalised symmetry detection algorithms are monolithic in the sense that they only return a meaningful answer when they are left to run to completion. In this thesis we present efficient anytime algorithms for detecting both classical and generalised symmetries, which output pairs of symmetric variables until a prescribed time bound is exceeded. These anytime algorithms are complete in that, given sufficient time, they are guaranteed to find all symmetric pairs. Theoretically these algorithms run in O(n³+n|G|+|G|³) and O(n³+n²|G|+|G|³) respectively, where n is the number of variables, so that in practice the advantage of anytime generality is not gained at the expense of efficiency. In fact, the anytime approach requires only very modest data structure support and offers unique opportunities for optimisation, so the resulting algorithms are very efficient. The thesis continues by considering another class of anytime algorithms for ROBDDs that is motivated by the dearth of work on approximating ROBDDs. The need for approximation arises because many ROBDD operations result in an ROBDD whose size is quadratic in the size of the inputs. Furthermore, if ROBDDs are used in abstract interpretation, the running time of the analysis is related not only to the complexity of the individual ROBDD operations but also to the number of operations applied. The number of operations is, in turn, constrained by the number of times a Boolean function can be weakened before stability is achieved. This thesis proposes a widening that can be used both to constrain the size of an ROBDD and to ensure that the number of times it is weakened is bounded by some given constant. The widening can be used to systematically approximate an ROBDD either from above (i.e. derive a weaker function) or from below (i.e. infer a stronger function). The thesis also considers how randomised techniques may be deployed to improve the speed of computing an approximation by avoiding potentially expensive ROBDD manipulation.
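
    As a hedged sketch of the anytime pattern described above (not Mishchenko's algorithm or the thesis's), the code below emits symmetric variable pairs until a time budget is exhausted; the naive classical-symmetry test and the cofactor helper it relies on are assumptions standing in for real ROBDD library operations.

        # Hypothetical sketch: anytime classical-symmetry detection. The function
        # yields each symmetric pair as soon as it is found and simply stops when
        # the deadline passes, so partial output is still sound.
        import itertools
        import time

        def anytime_symmetries(f, variables, cofactor, time_bound_s):
            """Yield (x, y) pairs found symmetric in f before the deadline.

            `cofactor(f, var, value)` is a stand-in for an ROBDD cofactor operation.
            """
            deadline = time.monotonic() + time_bound_s
            for x, y in itertools.combinations(variables, 2):
                if time.monotonic() > deadline:
                    return  # time budget exhausted; pairs already yielded remain valid
                # Classical symmetry: f is unchanged when x and y are swapped,
                # i.e. f|x=1,y=0 equals f|x=0,y=1.
                if cofactor(cofactor(f, x, 1), y, 0) == cofactor(cofactor(f, x, 0), y, 1):
                    yield (x, y)

    A production version would replace the quadratic pairwise loop with cheaper filtering, but the anytime contract sketched here (useful partial output, completeness given enough time) is the same.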

    Understanding the Relationship among Durable Goods, Academic Achievement, and School Attendance in Colombia

    A joint report from the United Nations Development Program and the Oxford Poverty and Human Development Initiative indicates that while the number of people living with less than $1.90 a day declined globally, dropping from 2 billion in 1990 to 736 million in 2015, the number of people who experienced non-income poverty reached 1.3 billion in 2020. Non-income poverty, referred to as multidimensional poverty, assesses the extent to which people are deprived of access to basic services such as health and education, or of decent living standards, despite having income levels well above $1.90. Research on development and welfare economics points to assets as the missing piece in the poverty puzzle because they can build capacity. In general, assets can be used to generate income or to enhance quality of life. Income-generating assets such as bonds, credit, or home ownership help people gain economic stability, acquire other assets, and prepare for economic shocks. Quality-of-life-enhancing assets help people improve their living standards, develop agency, and participate in political as well as in social life. Examples of quality-of-life-enhancing assets include education, social capital, and durable goods such as TVs or computers. Most research on assets examines the relationship either between financial assets and poverty or between financial assets and education. An exploration of durable goods and education was the focus of this dissertation. Although not a nascent field, most studies in this area have focused on analyzing how durable goods relate to academic achievement and school attendance, mainly in African and Asian countries. From a methodological standpoint, these studies have modeled durable goods using a binary approach, where ownership of durable goods is measured as possession of any durable good, or as an index built with principal component analysis (PCA), which research suggests is not the most robust method for index creation. Such methodological decisions have provided only a partial understanding of the relationship between durable goods and education. For example, findings indicate that possession of durable goods improves achievement in reading, but not in math. However, further research is needed to assess whether different types of durable goods have differential effects on educational outcomes. Hence, this study explored the relationship among durable goods, academic achievement, and school attendance in Colombia through three methodological approaches to operationalizing durable goods: inventory, attributional, and index approaches. Data come from the 2017 SABER test, a nation-wide examination that assesses reading and math skills for fifth- and ninth-grade students (N = 621,218). Students with complete durable goods information (N = 364,436) were included. This research added to the existing literature by using different methodological approaches to model durable goods, including the construction of a durable goods index employing exploratory factor analysis (EFA), and by expanding the geographic scope to Latin America.
    Using hierarchical linear and nonlinear modeling, this study found that, overall, durable goods were positively associated with reading and math outcomes, particularly for fifth graders. Similarly, results indicated that students whose families owned washing machines or computers, or who had Internet access, were more likely to attend school.
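
    As a hedged illustration of the index approach (not the dissertation's exact procedure), the sketch below builds a one-factor durable goods index with exploratory factor analysis on toy 0/1 ownership indicators; the column names, the synthetic data, and the use of scikit-learn's FactorAnalysis are assumptions.

        # Hypothetical sketch: a durable goods index from exploratory factor analysis.
        import numpy as np
        import pandas as pd
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(0)
        goods = ["tv", "computer", "washing_machine", "internet", "fridge"]
        wealth = rng.normal(size=1000)  # latent household wealth driving ownership
        df = pd.DataFrame(
            {g: (wealth + rng.normal(size=1000) > 0).astype(int) for g in goods}
        )

        # One-factor EFA; the factor scores serve as the durable goods index.
        # (A fuller treatment of binary items might work from tetrachoric correlations.)
        efa = FactorAnalysis(n_components=1, random_state=0)
        df["durable_goods_index"] = efa.fit_transform(df[goods]).ravel()
        print(df["durable_goods_index"].describe())

    The resulting index can then enter a hierarchical model of reading and math scores (for example, students nested within schools) alongside the inventory and attributional codings of the same goods.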

    Examining the Potential for Isotope Analyses of Carbon, Nitrogen, and Sulphur in Burned Bone from Experimental and Archaeological Contexts.

    The aim of this project was to determine whether isotope analyses of carbon, nitrogen, and sulphur can be conducted on collagen extracted from burned bone. The project was conducted in two phases: a controlled heating experiment and an archaeological application. The controlled heating experiment used cow (Bos taurus) bone to test the temperature thresholds for the conservation of δ13C, δ15N, and δ34S values. These samples were also used to test the efficacy of Fourier Transform Infrared spectroscopy (FTIR) and colour analysis for determining the burning intensities experienced by bone burned in unknown conditions. The experiment showed that δ13C values were relatively unchanged up to 400°C (<2‰ variation), while δ15N values were relatively stable up to 200°C (0.5‰ variation). Values of δ34S were also relatively stable up to 200°C (1.4‰ variation). Colour change and FTIR data were well correlated with the change in isotope ratios, and models estimating burning intensities were created from the FTIR data. For the archaeological application, samples were selected from both inhumed and cremated individuals at two early Anglo-Saxon cemetery sites: Elsham and Cleatham. Among the inhumed individuals, δ13C values suggested a C3 terrestrial diet and δ15N values suggested protein derived largely from terrestrial herbivores, as expected for the early Anglo-Saxon period. However, δ34S values suggested the consumption of freshwater resources, and that this consumption was related to both the age and sex of the individual. The experimental data show that there is potential for isotope analyses of cremated remains, since heat exposure during cremation is not uniform across the body. The samples selected for the archaeological application, however, were not successful. Bone samples heated in controlled conditions produced viable collagen for isotope analysis; however, there are several differences between experiments conducted in a muffle furnace and open-air pyre cremation that need to be investigated further. Additionally, the influence of taphonomy on collagen survival in burned bone needs to be quantified. Finally, methods of sample selection need to be improved to find bone samples from archaeologically cremated remains that are most likely to retain viable collagen. While significant research must be conducted before these analyses can be widely applied, a multitude of cultures throughout history and around the world practised cremation and could be investigated through the approach proposed in this project.
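
    As a hedged illustration only, the sketch below turns the thermal-stability thresholds reported above into a simple screening check; the function and its use are assumptions, not a procedure from the thesis.

        # Hypothetical sketch: flag which isotope systems are expected to survive a
        # given heating, using the thresholds reported in the abstract.
        def isotopes_likely_reliable(estimated_temp_c: float) -> dict:
            return {
                "d13C": estimated_temp_c <= 400,  # <2 permil variation up to 400 degC
                "d15N": estimated_temp_c <= 200,  # ~0.5 permil variation up to 200 degC
                "d34S": estimated_temp_c <= 200,  # ~1.4 permil variation up to 200 degC
            }

        # A bone fragment whose FTIR-based model estimates ~300 degC of heating:
        print(isotopes_likely_reliable(300))  # {'d13C': True, 'd15N': False, 'd34S': False}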

    Breaking Ub with Leishmania mexicana: a ubiquitin activating enzyme as a novel therapeutic target for leishmaniasis

    Leishmaniasis is a neglected tropical disease which inflicts a variety of gruesome pathologies on humans. Between 0.7 and 1.2 million individuals are thought to be afflicted with leishmaniasis annually, of whom an estimated 20 to 40 thousand die. This problem exemplifies inequality in healthcare: current leishmaniasis treatments are inadequate due to toxicity, cost, and ineffectiveness, so there is an urgent need for improved chemotherapies. Ubiquitination is a biochemical pathway that has received attention in cancer research. It is the process of adding the ubiquitin protein as a post-translational modification to substrate proteins, using an enzymatic cascade composed of enzymes termed E1s, E2s, and E3s. Ubiquitination can lead to degradation of substrate proteins, or otherwise modulate their function. As the name suggests, this modification is found across eukaryotic cell biology. Interfering with ubiquitination may therefore interfere with essential biological processes in the parasite, which means ubiquitination may present a new therapeutic target for leishmaniasis. Before ubiquitination inhibitors can be designed, components of the ubiquitination system must be identified. To this end, a bioinformatic screening campaign employed BLAST searches and hidden Markov models, using characterised orthologs from model organisms as bait, to screen publicly available Leishmania mexicana genome sequence databases for genes encoding putative E1s, E2s, and E3s. To confirm some of these identifications at the protein level, activity-based probes, protein pulldowns, and mass spectrometry were used. Using an activity-based probe that emulates the structure of adenylated ubiquitin, E1s were identified and their relative abundance quantified. A chemical crosslinker extended the reach of this probe, allowing the identification of an E2 (LmxM.33.0900). Notably, L. mexicana has two E1s, which is unusual for a single-celled organism. Of these E1s, LmxM.34.3060 was considerably more abundant than LmxM.23.0550 in both major life cycle stages of the in vitro Leishmania cultures. It is important to describe the wider context of these enzymes: what is their interactome, and what are their substrates? To study this, CRISPR was used to fuse a proximity-based labelling system, BioID, to genes of interest, LmxM.34.3060 and LmxM.33.0900. The E2 (LmxM.33.0900) was shown to interact with the E1 (LmxM.34.3060), validating the results from the activity-based probe and crosslinker experiments. Due to sequence homology with characterised orthologs, the E2 was hypothesised to function in the endoplasmic reticulum degradation pathway. Immunoprecipitations of a ubiquitin remnant motif, diglycine, were conducted with a view to gathering information on the substrates of ubiquitin; the peptides recovered included some of those identified by BioID. Experiments examining ubiquitin’s role in the DNA damage response were also initiated, as were improvements to the proximity-based labelling system; however, these were not followed to completion due to a lack of time and resources. To examine the possibility of finding novel drug targets in the ubiquitination cascade, recombinant proteins were expressed. LmxM.34.3060 was expressed in a functional form, while a putative SUMO E2 (LmxM.02.0390) was functional after refolding. Expressed LmxM.33.0900 was not functional and could not be refolded into a functional form.
Drug assays conducted on LmxM.34.3060 found TAK-243, an inhibitor of the human ortholog, to be 20-fold less effective against the Leishmania enzyme. Additional assays found an inhibitor, 5'O-sulfamoyl adenosine, that was 50-fold more effective against the Leishmania enzyme than against its human equivalent. Furthermore, a new mechanism of action, inhibition of the E1, was identified for drugs previously characterised as inhibitors of protein synthesis. LmxM.34.3060 underwent biophysical characterisation, with structural information obtained using SAXS and protein crystallography. A crystal structure was solved to 3.1 Å, with the in-solution SAXS structure complementing it. TAK-243 was modelled into the LmxM.34.3060 structure and clashes were predicted, concurring with TAK-243’s reduced efficacy against the Leishmania enzyme in the drug assays. This project aimed to characterise the potential of an understudied biochemical system to provide novel therapeutic targets for a neglected tropical pathogen. To achieve this aim it presents the identification of two E1s, an interactome, a structure, and a potent, selective inhibitor of a Leishmania ubiquitin activating enzyme.
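
    As a hedged sketch of the kind of homology screen described above (not the thesis's actual pipeline), the code below runs command-line blastp with characterised orthologs as queries against a local L. mexicana protein database and parses the tabular hits; the file names, database name, and cut-offs are assumptions, and NCBI BLAST+ must be installed.

        # Hypothetical sketch: BLAST-based screen for ubiquitination-cascade candidates.
        import csv
        import subprocess

        result = subprocess.run(
            [
                "blastp",
                "-query", "characterised_E1_E2_E3_orthologs.faa",  # bait sequences
                "-db", "lmexicana_proteome",                       # hypothetical local BLAST db
                "-outfmt", "6 qseqid sseqid pident evalue",        # tabular output
                "-evalue", "1e-10",
            ],
            capture_output=True, text=True, check=True,
        )

        hits = list(csv.reader(result.stdout.splitlines(), delimiter="\t"))
        for qseqid, sseqid, pident, evalue in hits:
            print(f"{qseqid} -> {sseqid}  identity={pident}%  e-value={evalue}")

    Candidates surviving such a screen (and an equivalent hidden Markov model search, for example with hmmsearch) would then be followed up at the protein level, as the abstract describes.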