
    The Monarch Initiative in 2024: an analytic platform integrating phenotypes, genes and diseases across species.

    Bridging the gap between genetic variations, environmental determinants, and phenotypic outcomes is critical for supporting clinical diagnosis and understanding mechanisms of diseases. It requires integrating open data at a global scale. The Monarch Initiative advances these goals by developing open ontologies, semantic data models, and knowledge graphs for translational research. The Monarch App is an integrated platform combining data about genes, phenotypes, and diseases across species. Monarch's APIs enable access to carefully curated datasets and advanced analysis tools that support the understanding and diagnosis of disease for diverse applications such as variant prioritization, deep phenotyping, and patient profile-matching. We have migrated our system into a scalable, cloud-based infrastructure; simplified Monarch's data ingestion and knowledge graph integration systems; enhanced data mapping and integration standards; and developed a new user interface with novel search and graph navigation features. Furthermore, we advanced Monarch's analytic tools by developing a customized plugin for OpenAI's ChatGPT to increase the reliability of its responses about phenotypic data, allowing us to interrogate the knowledge in the Monarch graph using state-of-the-art Large Language Models. The resources of the Monarch Initiative can be found at monarchinitiative.org and its corresponding code repository at github.com/monarch-initiative/monarch-app.

    Differences in Out-Of-School Suspensions Between Black and White High School Students When Controlling for Student Factors, School Factors, and Delinquency

    This quantitative, correlational study aims to determine how accurately out-of-school suspensions can be predicted from a linear combination of student delinquency, in-school delinquency, and prior suspensions for Black and White high school students. Further, a causal-comparative design is used to determine if there is a statistically significant difference in out-of-school suspensions between Black and White high school students when controlling for student factors, school factors, and student delinquency factors. The study consists of five guiding theories that inform two general hypotheses. The first hypothesis, referred to as the differential selection hypothesis, is guided by critical race theory (CRT) and implicit bias theory. The second hypothesis, referred to as the differential involvement hypothesis, is guided by self-control, social learning, and attachment theories. These general hypotheses are used to guide the selection of control variables to determine if there is a statistically significant difference in out-of-school suspensions between Black and White high school students. This study will fill a gap in the literature concerning the understudied differential involvement hypothesis and the fidelity of the differential selection hypothesis. Using a series of instruments and a student survey to collect demographic, school, and delinquency information, data were collected from 120 White and 120 Black high school students in central California. Data were analyzed using multiple regression and an ANCOVA. A discussion of the study's limitations and future recommendations is offered following the findings.
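    The two analyses the abstract names, multiple regression and ANCOVA, can be sketched as follows with statsmodels. The data here are simulated, and all column names (suspensions, delinquency, prior_susp, race) are illustrative assumptions, not the study's actual instruments or variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 240  # 120 Black + 120 White students, mirroring the study's sample sizes
df = pd.DataFrame({
    "race": ["Black"] * 120 + ["White"] * 120,
    "delinquency": rng.poisson(3, n).astype(float),
    "prior_susp": rng.poisson(1, n).astype(float),
})
# Simulated outcome: suspensions driven by the covariates plus noise.
df["suspensions"] = (0.4 * df["delinquency"] + 0.8 * df["prior_susp"]
                     + rng.normal(0, 1, n)).clip(lower=0)

# Multiple regression: predict suspensions from a linear combination
# of the delinquency measures and prior suspensions.
ols = smf.ols("suspensions ~ delinquency + prior_susp", data=df).fit()

# ANCOVA: test for a group difference by race while holding the
# covariates constant (race enters as a categorical factor).
ancova = smf.ols("suspensions ~ C(race) + delinquency + prior_susp",
                 data=df).fit()
print(ancova.params["C(race)[T.White]"])  # adjusted race difference
```

With treatment coding, the `C(race)[T.White]` coefficient is the covariate-adjusted difference between White and Black students, which is the quantity the causal-comparative design tests.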

    Quantifying Equity Risk Premia: Financial Economic Theory and High-Dimensional Statistical Methods

    The overarching question of this dissertation is how to quantify the unobservable risk premium of a stock when its return distribution varies over time. The first chapter, titled “Theory-based versus machine learning-implied stock risk premia”, starts with a comparison of two competing strands of the literature. The approach advocated by Martin and Wagner (2019) relies on financial economic theory to derive a closed-form approximation of conditional risk premia using information embedded in the prices of European options. The other approach, exemplified by the study of Gu et al. (2020), draws on the flexibility of machine learning methods and vast amounts of historical data to determine the unknown functional form. The goal of this study is to determine which of the two approaches produces more accurate measurements of stock risk premia. In addition, we present a novel hybrid approach that employs machine learning to overcome the approximation errors induced by the theory-based approach. We find that our hybrid approach is especially competitive at longer investment horizons. The second chapter, titled “The uncertainty principle in asset pricing”, introduces a representation of the conditional capital asset pricing model (CAPM) in which the betas and the equity premium are jointly characterized by the information embedded in option prices. A unique feature of our model is that its implied components represent valid measurements of their physical counterparts without the need for any further risk adjustment. Moreover, because the model’s time-varying parameters are directly observable, the model can be tested without any of the complications that typically arise from statistical estimation. One of the main empirical findings is that the well-known flat relationship between average predicted and realized excess returns of beta-sorted portfolios can be explained by the uncertainty governing market excess returns.
In the third chapter, titled “Multi-task learning in cross-sectional regressions”, we challenge the way in which cross-sectional regressions are used to test factor models with time-varying loadings. More specifically, we extend the procedure by Fama and MacBeth (1973) by systematically selecting stock characteristics using a combination of l1- and l2-regularization, known as the multi-task Lasso, and addressing the bias that is induced by selection via repeated sample splitting. In the empirical part of this chapter, we apply our testing procedure to the option-implied CAPM from chapter two, and find that, while variants of the momentum effect lead to a rejection of the model, the implied beta is by far the most important predictor of cross-sectional return variation.
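    The joint selection idea in the third chapter can be illustrated with scikit-learn's multi-task Lasso: the period-by-period cross-sectional regressions are estimated as parallel tasks, and the combined l1/l2 penalty zeroes out a characteristic in all periods at once. The data are simulated, and the chapter's actual estimator and repeated sample-splitting step are not shown.

```python
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.default_rng(1)
n_stocks, n_chars, n_periods = 500, 20, 12

X = rng.normal(size=(n_stocks, n_chars))        # stock characteristics
beta = np.zeros((n_chars, n_periods))
beta[:3] = rng.normal(size=(3, n_periods))      # only 3 characteristics are priced
Y = X @ beta + 0.5 * rng.normal(size=(n_stocks, n_periods))  # returns per period

# One cross-sectional regression per period, estimated jointly: the
# group penalty sum_j ||w_j||_2 keeps or drops each characteristic
# across *all* periods simultaneously.
model = MultiTaskLasso(alpha=0.1).fit(X, Y)
selected = np.flatnonzero(np.any(model.coef_ != 0, axis=0))
print("selected characteristics:", selected)
```

Averaging the per-period coefficients of the selected characteristics would then recover the familiar Fama-MacBeth second-stage estimate on the reduced characteristic set.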

    2023-2 Dynamic and Stochastic Rational Behavior

    We analyze consumer demand behavior using the Dynamic Random Utility Model (DRUM). Under DRUM, a consumer draws a utility function from a stochastic utility process in each period and maximizes this utility subject to her budget constraint. DRUM allows unrestricted time correlation and cross-sectional heterogeneity in preferences. We fully characterize DRUM for panel data of consumer choices and budgets. DRUM is linked to a finite mixture of deterministic behaviors represented as the Kronecker product of static rationalizable behavior. We provide a generalization of the Weyl-Minkowski theorem that uses this link and enables conversion of the characterizations of the static Random Utility Model (RUM) of McFadden-Richter (1990) to its dynamic form. DRUM is more flexible than Afriat’s (1967) framework for time series and more informative than RUM. We show the feasibility of a statistical test of DRUM in a Monte Carlo study.
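    The testing idea behind RUM, which the abstract extends to DRUM via Kronecker products of static patterns, reduces to a convex-hull feasibility check: observed choice frequencies must be a mixture of deterministic rational choice patterns. A minimal sketch with a linear program, using toy 0/1 patterns rather than patterns derived from actual budgets:

```python
import numpy as np
from scipy.optimize import linprog

# Columns: deterministic choice patterns (0/1 indicators over
# budget-alternative pairs); rows: budget-alternative pairs. In DRUM
# these columns would be Kronecker products of static patterns.
A = np.array([[1, 0, 1],
              [0, 1, 0],
              [1, 1, 0],
              [0, 0, 1]], dtype=float)

p = np.array([0.5, 0.5, 0.5, 0.5])  # observed choice frequencies

# Feasibility LP: find mixing weights w >= 0 summing to 1 with A @ w = p.
res = linprog(c=np.zeros(A.shape[1]),
              A_eq=np.vstack([A, np.ones(A.shape[1])]),
              b_eq=np.append(p, 1.0),
              bounds=[(0, None)] * A.shape[1])
print("rationalizable:", res.success)
```

Infeasibility of this program is exactly a rejection of the (D)RUM restriction for the observed frequencies; the statistical test adds sampling error on `p` on top of this geometry.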

    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    Customer Satisfaction on Quality of ISO Standard 9126 Services in Electronic Banking in Libya

    Despite the availability of some electronic banking (e-banking) services in Libya, these services still face difficulties in many respects; they have advantages and disadvantages, and many customers do not yet understand them well enough to be satisfied with them. This research aims to assess the extent of customer satisfaction (CS) with e-banking services in Libya using the ISO 9126 quality standard. The research population consisted of all customers of the Al-Wahda and Commerce & Development (C&D) banks in Benghazi city, from which random samples of 180 and 207 clients were selected, respectively. The research hypotheses were tested, and the results show a strong, significant correlation between the quality of e-banking services, measured against the ISO 9126 software quality standard, and CS in the banks under study. The value of this research comes from new scientific results on the impact of e-banking service quality on CS in Libyan banks, and its originality lies in the scarcity of such research applied to Libyan banks in general and to Benghazi city in particular.

    A General Theorem and Proof for the Identification of Composed CFA Models

    In this article, we present a general theorem and proof for the global identification of composed CFA models. They consist of identified submodels that are related only through covariances between their respective latent factors. Composed CFA models are frequently used in the analysis of multimethod data, longitudinal data, or multidimensional psychometric data. Firstly, our theorem enables researchers to reduce the problem of identifying the composed model to the problem of identifying the submodels and verifying the conditions given by our theorem. Secondly, we show that composed CFA models are globally identified if the primary models are reduced models such as the CT-C model or similar types of models. In contrast, composed CFA models that include non-reduced primary models can be globally underidentified for certain types of cross-model covariance assumptions. We discuss necessary and sufficient conditions for the global identification of arbitrary composed CFA models and provide Python code to check the identification status for an illustrative example. The code we provide can be easily adapted to more complex models.
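    A standard numerical check related to the identification question, though weaker than the article's global result, is local identification: a CFA model is locally identified at a point if the Jacobian of the unique elements of the implied covariance matrix with respect to the free parameters has full column rank. This one-factor, three-indicator example is illustrative and is not the article's composed model or its code.

```python
import numpy as np

def implied_cov(theta):
    lam = theta[:3]        # factor loadings (factor variance fixed at 1)
    eps = theta[3:]        # unique (error) variances
    return np.outer(lam, lam) + np.diag(eps)

def vech(S):
    i, j = np.tril_indices(S.shape[0])
    return S[i, j]         # unique elements of a symmetric matrix

def jacobian_rank(theta, h=1e-6):
    # Forward-difference Jacobian of vech(Sigma(theta)) w.r.t. theta.
    base = vech(implied_cov(theta))
    J = np.column_stack([
        (vech(implied_cov(theta + h * e)) - base) / h
        for e in np.eye(len(theta))
    ])
    return np.linalg.matrix_rank(J)

theta = np.array([0.8, 0.7, 0.6, 0.5, 0.5, 0.5])  # a generic point
print(jacobian_rank(theta))  # 6 = number of free parameters: locally identified
```

The just-identified one-factor model with three indicators yields a square, full-rank Jacobian; composed models would stack the submodel parameters plus the cross-model factor covariances into the same kind of check.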

    The Role of a Microservice Architecture on cybersecurity and operational resilience in critical systems

    Critical systems are characterized by a high degree of intolerance to threats, in other words, by the high level of resilience they require: depending on the context in which the system operates, the slightest failure can cause significant damage, whether economic or in the form of loss of reputation, information, infrastructure, the environment, or human life. The security of such systems is traditionally associated with legacy, monolithic infrastructures and data centers, which translates into ever greater challenges for evolution and protection. In the current context of rapid transformation, where the variety of threats to systems has been increasing consistently, this dissertation carries out a compatibility study of the microservice architecture, which is distinguished by characteristics such as resilience, scalability, modifiability, and technological heterogeneity, and by its flexibility for structural adaptation in rapidly evolving and highly complex settings, making it well suited to agile environments. It also explores the response that artificial intelligence, more specifically machine learning, can provide in a security and monitoring context when combined with a simple banking system that adopts the microservice architecture.

    The Forward Physics Facility at the High-Luminosity LHC

    High energy collisions at the High-Luminosity Large Hadron Collider (LHC) produce a large number of particles along the beam collision axis, outside of the acceptance of existing LHC experiments. The proposed Forward Physics Facility (FPF), to be located several hundred meters from the ATLAS interaction point and shielded by concrete and rock, will host a suite of experiments to probe standard model (SM) processes and search for physics beyond the standard model (BSM). In this report, we review the status of the civil engineering plans and the experiments to explore the diverse physics signals that can be uniquely probed in the forward region. FPF experiments will be sensitive to a broad range of BSM physics through searches for new particle scattering or decay signatures and deviations from SM expectations in high statistics analyses with TeV neutrinos in this low-background environment. High statistics neutrino detection will also provide valuable data for fundamental topics in perturbative and non-perturbative QCD and in weak interactions. Experiments at the FPF will enable synergies between forward particle production at the LHC and astroparticle physics to be exploited. We report here on these physics topics, on infrastructure, detector, and simulation studies, and on future directions to realize the FPF's physics potential.

    Digital Innovation Management and Path Dependence: An Integrated Perspective of Manufacturing Incumbents

    Is digital innovation a major opportunity or a major threat for physical product-centric incumbents? Building on the unique characteristics of digital innovation, new market players can break the dominance of incumbents by providing digitally enabled products with distinct characteristics. Therefore, this paper empirically explores the dynamics within incumbents related to digital innovation management. Qualitative methods are used to systematically and inductively gain insights into how digital innovation is considered in the context of incumbents with physical product-driven business models. We use path dependence theory to explain the findings and support theoretical generalization of our results. The study contributes to the literature on digital innovation, how incumbents manage digital innovation under certain circumstances, and the related impacts on their business model. Further, we suggest three stages of digital innovation management in the context of path dependence.