
    Default reasoning and neural networks

    In this dissertation a formalisation of nonmonotonic reasoning, namely Default logic, is discussed. A proof theory for Default logic and a variant, Prioritised Default logic, is presented. We also investigate the relationship between default reasoning and making inferences in a neural network. The inference problem shifts from the logical problem in Default logic to an optimisation problem in neural networks, in which maximum consistency is the aim. The inference is realised as an adaptation process that identifies and resolves conflicts between existing knowledge about the relevant world and external information. Knowledge and data are transformed into constraint equations, and the nodes in the network represent propositions and constraint equations. The violation of constraints is formulated in terms of an energy function. The Hopfield network is shown to be suitable for modelling optimisation problems and default reasoning. Computer Science, M.Sc. (Computer Science).
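The scheme the abstract describes (constraint violations scored by an energy function that a Hopfield network minimises) can be illustrated with a minimal sketch. This is a generic bipolar Hopfield update, not the dissertation's actual formulation; the weights, biases, and two-node example are illustrative assumptions.

```python
def energy(W, theta, s):
    """Hopfield energy E(s) = -1/2 * sum_ij W[i][j] s_i s_j - sum_i theta[i] s_i.
    Lower energy means fewer violated constraints."""
    n = len(s)
    e = -sum(theta[i] * s[i] for i in range(n))
    e -= 0.5 * sum(W[i][j] * s[i] * s[j] for i in range(n) for j in range(n))
    return e

def settle(W, theta, s, max_sweeps=100):
    """Asynchronously update bipolar states s_i in {-1, +1} until no unit
    changes; each flip can only keep the energy the same or lower it."""
    s = list(s)
    for _ in range(max_sweeps):
        changed = False
        for i in range(len(s)):
            act = sum(W[i][j] * s[j] for j in range(len(s))) + theta[i]
            new = 1 if act >= 0 else -1
            if new != s[i]:
                s[i], changed = new, True
        if not changed:
            break
    return s

# Two propositional nodes coupled by a soft agreement constraint
# (symmetric weight +1) plus a small bias on the first node.
W = [[0, 1], [1, 0]]
theta = [0.5, 0.0]
final = settle(W, theta, [1, -1])  # -> [-1, -1], a consistent assignment
```

Starting from the conflicting state [1, -1], the network settles into [-1, -1], whose energy is strictly lower, mirroring the abstract's view of inference as conflict resolution by energy minimisation.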

    A Performance-Explainability-Fairness Framework For Benchmarking ML Models

    Machine learning (ML) models have achieved remarkable success in various applications; however, ensuring their robustness and fairness remains a critical challenge. In this research, we present a comprehensive framework designed to evaluate and benchmark ML models through the lenses of performance, explainability, and fairness. This framework addresses the increasing need for a holistic assessment of ML models, considering not only their predictive power but also their interpretability and equitable deployment. The proposed framework leverages a multi-faceted evaluation approach, integrating performance metrics with explainability and fairness assessments. Performance evaluation incorporates standard measures such as accuracy, precision, and recall, but extends to the overall balanced error rate and the overall area under the receiver operating characteristic (ROC) curve (AUC) to capture model behavior across different performance aspects. Explainability assessment employs state-of-the-art techniques to quantify the interpretability of model decisions, ensuring that model behavior can be understood and trusted by stakeholders. The fairness evaluation examines model predictions in terms of demographic parity and equalized odds, thereby addressing concerns of bias and discrimination in the deployment of ML systems. To demonstrate the practical utility of the framework, we apply it to a diverse set of ML algorithms across various functional domains, including finance, criminology, education, and healthcare prediction. The results showcase the importance of a balanced evaluation approach, revealing trade-offs between performance, explainability, and fairness that can inform model selection and deployment decisions. Furthermore, we provide insights into the analysis of trade-offs in selecting the appropriate model for use cases where performance, interpretability, and fairness are important.
In summary, the Performance-Explainability-Fairness Framework offers a unified methodology for evaluating and benchmarking ML models, enabling practitioners and researchers to make informed decisions about model suitability and ensuring responsible and equitable AI deployment. We believe that this framework represents a crucial step towards building trustworthy and accountable ML systems in an era where AI plays an increasingly prominent role in decision-making processes.
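The two fairness criteria named in the abstract have standard quantitative forms, sketched below. This is a generic illustration under assumed binary labels and two groups, not the framework's actual implementation: demographic parity compares selection rates across groups, and equalized odds compares true-positive-rate and false-positive-rate gaps.

```python
def rate(preds, mask):
    """Fraction of positive predictions among the masked examples."""
    sel = [p for p, m in zip(preds, mask) if m]
    return sum(sel) / len(sel) if sel else 0.0

def demographic_parity_diff(y_pred, group):
    """|P(yhat=1 | group=0) - P(yhat=1 | group=1)|; 0 means parity."""
    return abs(rate(y_pred, [g == 0 for g in group]) -
               rate(y_pred, [g == 1 for g in group]))

def equalized_odds_diff(y_true, y_pred, group):
    """Largest gap across groups in TPR (y=1) and FPR (y=0); 0 means
    the error profile is identical for both groups."""
    gaps = []
    for y in (1, 0):  # TPR when conditioning on y=1, FPR on y=0
        r0 = rate(y_pred, [g == 0 and t == y for g, t in zip(group, y_true)])
        r1 = rate(y_pred, [g == 1 and t == y for g, t in zip(group, y_true)])
        gaps.append(abs(r0 - r1))
    return max(gaps)
```

A model that selects only members of one group scores a demographic parity difference of 1.0, the worst case; reporting these gaps alongside accuracy and AUC is what makes the trade-offs the abstract mentions visible.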

    GIT (gender-informed trauma) in black n blue boys / broken men: how concepts of gender restrict the black male actor’s creative process and the methods he can use for creative freedom.

    This study examines how the black male actor’s creative process can be affected by historical and cultural constructions of masculinity connected to race, sexuality, and physical movement. My research on black men’s experience with gender identity finds that social and cultural forces lead black men to reproduce behaviors that mirror a prescribed masculine ideal through physical movement. This prescribed masculine behavior is typically coded in terms of stiffness or lack of expression. This study explores how self-imposed restrictions reiterated by social standards of masculine behavior limit the creative freedom in the black male actor’s creative process. Specifically, black male actors’ use of their bodies during the creative process while adhering to socially prescribed gender norms can cause physical blockages in their acting work. These blockages result from the traumatic experiences of how the black community reinscribes social conceptions of masculinity. This study incorporates my personal experiences and other black men’s testimonies as evidence of such trauma and focuses on the creative limitations faced by black male actors due to limited movement styles available under traditional or heteronormative prescriptions of masculinity. This study offers methods I developed from class readings to map how I moved through these limitations in my rehearsal process for the University of Louisville’s Department of Theatre Arts’ Fall 2020 production of Dael Orlandersmith’s Black n Blue Boys / Broken Men. My acting journal entries and the testimonies of black men support how and why “gender-informed trauma” hinders the black male actor’s ability to fully explore his physical range for character development. From this, my thesis develops methods that helped me to overcome the effects of gender-informed trauma, to expand my physical range, and to develop unique, fully embodied characters.

    Next generation sequencing approaches in rare diseases: the study of four different families

    The main purpose of this PhD project was to study the molecular bases of rare Mendelian diseases through Next Generation Sequencing (NGS), finding the most appropriate NGS technology and data analysis approach. To this aim, we enrolled at Umberto I General Hospital and Sapienza University of Rome four different families with a phenotype due to a supposed genetic cause, in order to find the causative gene/genes. The selection of the experimental strategy, the number of subjects to sequence (the most distant family members, trio, or singleton) and the data analysis approach were dictated by considerations on the diagnostic potential of each sequencing strategy and its feasibility and cost: the diagnostic rate, the possibility to re-evaluate the NGS data periodically, the management of NGS data, the functional interpretation of coding and non-coding variants, and the number of secondary findings were some of the criteria driving the choice of the NGS test. The choice was also influenced by specific features of each case, e.g. the supposed mode of inheritance, the available samples and the information about the phenotype. Whole exome sequencing (WES) and clinical exome sequencing (CES) experiments were performed in our laboratory or by external companies. We analysed sequencing data through a dedicated bioinformatic pipeline and we filtered and prioritized the variants according to several parameters, specific for each case. Then, we validated the selected variant/variants through Sanger sequencing on the proband and the other family members, to study their segregation in the family, and we investigated the functional link between the candidate variant/variants and the phenotype. To study the molecular bases of the complex phenotypes regarding canine agenesis and eruption anomalies in family A, we performed a WES approach on three first-degree cousins.
Different data analyses, based on different shared genetic causes, allowed us to identify several candidate variants: two missense variants in EDARADD and COL5A1, previously associated with tooth agenesis and a syndromic phenotype including dental anomalies, respectively; three missense variants in RSPO4, T and NELL1, genes functionally related to tooth morphogenesis. The segregation analyses pointed to two different signaling pathways as responsible for the phenotypes, one of them (i.e. EDA) for the canine agenesis, and the other (i.e. WNT) for the less severe canine eruption anomalies. To find the cause of the isolated brachydactyly observed in family B, we used a WES approach on the proband and his grandfather. We identified a shared frameshift variant in the GDF5 gene, encoding a secreted ligand of TGF-β predominantly expressed in long bones during embryonic development. This was important for genetic counselling, as the variant is causative of a mild phenotype in the heterozygous state but of a very severe phenotype in the homozygous state. To find the cause of the corpus callosum anomaly observed in the proband of family C, we chose a trio approach. We performed a CES, using an enrichment kit that included 171 of the 180 genes reported in the literature as causative of corpus callosum malformations. We identified in the proband a supposedly de novo nonsense variant in the ARX gene, critical for early development and formation of a normal brain. Segregation analysis disclosed the presence of the same variant also in the fetus of a previous pregnancy, suggesting a gonadal or gonosomal mosaicism in one of the parents. The identification of this variant was important for genetic counselling, as there is an increased recurrence risk for the couple to have a child with the same disorder.
It was also important for the proband’s clinical prognosis and to properly calculate the risk of transmitting the mutation, which is associated with different clinical outcomes depending on the sex. To investigate the molecular bases of the recurrent Dandy-Walker malformation observed in family D, we performed WES only on the proband. We identified a homozygous missense variant in the FKTN gene, encoding a glycosyltransferase with a role in brain development. In order to test the pathogenicity of the variant, we also performed structural modeling of FKTN. This result allowed us to properly redefine the clinical diagnosis as Muscular Dystrophy-Dystroglycanopathy Type A, with implications on the recurrence risk for the couple and on reproductive choices. The adopted experimental and data analysis strategies allowed us to identify the molecular causes of phenotypes involving different systems and belonging to different clinical pictures, with significant impact on diagnosis, prognosis and genetic counselling. These results show how NGS is revolutionizing medical genetics, accelerating research on rare genetic diseases, facilitating clinical diagnosis, and leading us toward personalized medicine.

    As Below So Above: Reconstructing the Neo-Babylonian Worldview

    To add to our knowledge about a Near Eastern culture, this project examines through textual evidence how the early first millennium BCE Neo-Babylonians thought, reasoned, and wrote in order to partially reconstruct the shared, generally held worldview of the Neo-Babylonian people using the transdisciplinary approach of worldview analysis. Worldviews are what we use to think with, not what we think about. Underlying surface-level cultural behaviors are deeper levels of cognition regarding how to reason, perceive the world, prioritize values, prescribe behavior, and explain all of life. Specifically, this work examines the language and logic reflected in the textual archive, believing that this is the foundational level of any worldview. I argue that one finds two related components: (1) that they were linguistically programmed to be attuned to the full context over particularities, verbal actions over agential subjects, the continuity of substances over discrete objects, the standard use of maleness over femaleness, and the affective power of spoken or written words, and (2) that they were logically programmed to prefer gradations over distinctions, functional properties over inherent attributes, radials and/or rhizomes over linearity, and relationships and/or comparisons over abstractions and algorithms. By addressing this underlying, implicit cognitive software the Neo-Babylonians used, one is better able to understand the society’s more observable and obvious religious, ethical, legal, political, and social features. This has the potential to present a more contextualized view of Neo-Babylonian civilization. Reconstructing the ancient Neo-Babylonian worldview allows scholars to compare and contrast its linguistic and logical features with those of other nearby ancient cultures in order to understand continuities and differences and what accounts for them. One can open a dialogue between these ancient societies at a deeper level.
It demonstrates the uniqueness of Neo-Babylonia. And it provides a basis for understanding how Neo-Babylonians contributed to the roots of Western civilization and thought. Most current worldview analysis examines modern or postmodern worldviews. By examining an ancient worldview, one can begin to more clearly understand any common aspects which exist for all worldviews, as well as any elements present in ancient worldviews that are missing from catalogs of more modern ones. Thus, the cataloging of an ancient worldview helps to open new vistas within worldview studies. This study invites similar ones within ancient Near Eastern studies and within ancient studies in general.

    OPTIMIZATION OF NONSTANDARD REASONING SERVICES

    The increasing adoption of semantic technologies and the corresponding increasing complexity of application requirements are motivating extensions to the standard reasoning paradigms and services supported by such technologies. This thesis focuses on two such extensions: nonmonotonic reasoning and inference-proof access control. Expressing knowledge via general rules that admit exceptions is an approach that has been commonly adopted for centuries in areas such as law and science, and more recently in object-oriented programming and computer security. The experiences in developing complex biomedical knowledge bases reported in the literature show that direct support for defeasible properties and exceptions would be of great help. On the other hand, there is ample evidence of the need for knowledge confidentiality measures. Ontology languages and Linked Open Data are increasingly being used to encode the private knowledge of companies and public organizations. Semantic Web techniques facilitate merging different sources of knowledge and extracting implicit information, thereby putting security and the privacy of individuals at risk. But the same reasoning capabilities can be exploited to protect the confidentiality of knowledge. Both nonmonotonic inference and secure knowledge base access rely on nonstandard reasoning procedures. The design and realization of these algorithms in a scalable way (appropriate to the ever-increasing size of ontologies and knowledge bases) is carried out by means of a diversified range of optimization techniques, such as appropriate module extraction and incremental reasoning. Extensive experimental evaluation shows the efficiency of the developed optimization techniques: (i) for the first time, performance compatible with real-time reasoning is obtained for large nonmonotonic ontologies, while (ii) the secure ontology access control proves to be already compatible with practical use in the e-health application scenario.

    The Global Risks Report 2016, 11th Edition

    Now in its 11th edition, The Global Risks Report 2016 draws attention to ways that global risks could evolve and interact in the next decade. The year 2016 marks a forceful departure from past findings, as the risks about which the Report has been warning over the past decade are starting to manifest themselves in new, sometimes unexpected ways and harm people, institutions and economies. A warming climate is likely to raise this year's temperature to 1° Celsius above the pre-industrial era; 60 million people, equivalent to the world's 24th largest country and the largest number in recent history, are forcibly displaced; and crimes in cyberspace cost the global economy an estimated US$445 billion, higher than many economies' national incomes. In this context, the Report calls for action to build resilience – the "resilience imperative" – and identifies practical examples of how it could be done. The Report also steps back and explores how emerging global risks and major trends, such as climate change, the rise of cyber dependence, and income and wealth disparity, are impacting already-strained societies by highlighting three clusters of risks as Risks in Focus. As resilience building is helped by the ability to analyse global risks from the perspective of specific stakeholders, the Report also analyses the significance of global risks to the business community at a regional and country level.