
    ImmPort, toward repurposing of open access immunological assay data for translational and clinical research

    Immunology researchers are beginning to explore the possibilities of reproducibility, reuse, and secondary analysis of immunology data. Open-access datasets are being used to validate the methods of the original studies, to leverage studies for meta-analysis, and to generate new hypotheses. To promote these goals, the ImmPort data repository was created so that the broader research community can explore the wide spectrum of clinical and basic research data and associated findings. The ImmPort ecosystem consists of four components (Private Data, Shared Data, Data Analysis, and Resources) for data archiving, dissemination, analysis, and reuse. To date, more than 300 studies have been made freely available through the ImmPort Shared Data portal, which allows research data to be repurposed to accelerate the translation of new insights into discoveries.
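    As a concrete illustration of such reuse, the minimal sketch below fetches study metadata from a shared-data listing endpoint. The URL path and JSON field names are hypothetical placeholders, not ImmPort's documented API; consult the Shared Data portal for the actual interface.

    # A minimal sketch of programmatic reuse of shared studies.
    # NOTE: the endpoint path and field names are hypothetical placeholders,
    # not ImmPort's documented API.
    import requests

    BASE_URL = "https://www.immport.org/shared"  # hypothetical endpoint

    def list_shared_studies(base_url: str = BASE_URL) -> list[dict]:
        """Fetch study metadata from a (hypothetical) listing endpoint."""
        response = requests.get(f"{base_url}/studies", timeout=30)
        response.raise_for_status()
        return response.json()

    if __name__ == "__main__":
        for study in list_shared_studies()[:5]:
            print(study.get("accession"), study.get("title"))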

    Polyglot Semantic Parsing in APIs

    Traditional approaches to semantic parsing (SP) work by training individual models for each available parallel dataset of text-meaning pairs. In this paper, we explore the idea of polyglot semantic translation, or learning semantic parsing models that are trained on multiple datasets and natural languages. In particular, we focus on translating text to code signature representations using the software component datasets of Richardson and Kuhn (2017a,b). The advantage of such models is that they can parse a wide variety of input natural languages and output programming languages, or mixed input languages, with a single unified model. To facilitate this type of modeling, we develop a novel graph-based decoding framework that achieves state-of-the-art performance on the above datasets, and we apply this method to two other benchmark SP tasks. Comment: accepted for NAACL-2018 (camera-ready version)
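    The core idea, a single model shared across datasets with each input tagged by its natural and programming language (in the spirit of multilingual machine translation), can be sketched as follows. The dataset examples and tagging scheme are illustrative, not the authors' implementation.

    # A minimal sketch of the "polyglot" data setup: examples from multiple
    # text-to-signature datasets are merged into one training set, with
    # language tags so a single model can route inputs and outputs.
    from dataclasses import dataclass

    @dataclass
    class Example:
        nl_lang: str    # natural language of the query/docstring
        pl_lang: str    # target programming language of the signature
        text: str       # input description
        signature: str  # output code-signature representation

    def tag(example: Example) -> str:
        """Prefix the source text with language tags, as in multilingual NMT."""
        return f"<{example.nl_lang}> <{example.pl_lang}> {example.text}"

    corpus = [
        Example("en", "python", "return the absolute value of x",
                "abs(x: float) -> float"),
        Example("de", "java", "liefert den kleineren von zwei Werten",
                "int min(int a, int b)"),
    ]

    for ex in corpus:
        print(tag(ex), "=>", ex.signature)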

    Multilevel Contracts for Trusted Components

    This article contributes to the design and verification of trusted components and services. The contracts are expressed at several levels so as to cover different facets, such as component consistency, compatibility, and correctness. The article introduces multilevel contracts and a design-and-verification process for handling and analysing these contracts in component models. The approach is implemented with the COSTO platform, which supports the Kmelia component model. A case study illustrates the overall approach. Comment: In Proceedings WCSI 2010, arXiv:1010.233
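    To make the idea of contracts at several levels concrete, here is a minimal sketch in plain Python, with assertions standing in for the three facets named above. The paper's contracts are written in the Kmelia component model, not Python, so this is an analogy only.

    # A toy component carrying contracts at three levels: interface
    # consistency, usage protocol (compatibility), and functional correctness.
    class BufferComponent:
        """A bounded buffer with contracts checked at three levels."""

        def __init__(self, capacity: int):
            assert capacity > 0, "consistency: capacity must be positive"
            self.capacity = capacity
            self.items: list[int] = []

        def put(self, value: int) -> None:
            # compatibility/protocol: put is only legal while not full
            assert len(self.items) < self.capacity, "protocol: buffer full"
            self.items.append(value)
            # correctness: the element is now observable at the tail
            assert self.items[-1] == value

        def get(self) -> int:
            assert self.items, "protocol: buffer empty"
            return self.items.pop(0)

    buf = BufferComponent(2)
    buf.put(1)
    buf.put(2)
    assert buf.get() == 1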

    On degenerate models of cosmic inflation

    In this article we discuss the role of current and future CMB measurements in pinning down the model of inflation responsible for the generation of primordial curvature perturbations. By considering a parameterization of the effective field theory of inflation with a modified dispersion relation arising from heavy fields, we derive the dependence of cosmological observables on the scale of heavy physics $\Lambda_{\rm UV}$. Specifically, we show how the $f_{\rm NL}$ non-linearity parameters are related to the phase velocity of curvature perturbations at horizon exit, which is parameterized by $\Lambda_{\rm UV}$. BICEP2 and Planck findings are shown to be consistent with a value $\Lambda_{\rm UV} \sim \Lambda_{\rm GUT}$. However, we find a degeneracy in the parameter space of inflationary models that can only be resolved with a detailed knowledge of the shape of the non-Gaussian bispectrum. Comment: 22pp., 1 fig; v2: added some clarifications and references, corrected typos, matches published version
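    The kind of relation the abstract alludes to can be written schematically in standard EFT-of-inflation conventions; the expressions below are illustrative placeholders from the general literature, not the paper's exact results.

    \[
      \omega^2(k) \simeq c_s^2\, k^2 \left( 1 + \frac{k^2}{a^2 \Lambda_{\rm UV}^2} \right),
      \qquad
      f_{\rm NL} \sim \frac{1}{c_s^2(k_\star)} - 1 ,
    \]

    where $c_s(k_\star)$ is the phase velocity of curvature perturbations at horizon exit $k_\star = aH$. Raising $\Lambda_{\rm UV}$ drives $c_s \to 1$ and $f_{\rm NL} \to 0$, which is why distinct microscopic models can become observationally degenerate.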

    Analysis of Feature Models Using Alloy: A Survey

    Feature Models (FMs) are a mechanism to model variability among a family of closely related software products, i.e., a software product line (SPL). Analysis of FMs using formal methods can reveal defects in the specification, such as inconsistencies that cause the product line to have no valid products. A popular framework used in research for FM analysis is Alloy, a lightweight formal modeling notation equipped with an efficient model finder. Several works in the literature have proposed different strategies to encode and analyze FMs using Alloy. However, there is little discussion of the relative merits of each proposal, making it difficult to select the most suitable encoding for a specific analysis need. In this paper, we describe and compare those strategies according to various criteria, such as the expressivity of the FM notation and the efficiency of the analysis. This survey is the first comparative study of research targeted towards using Alloy for FM analysis. The review aims to identify best practices for the use of Alloy as part of a framework for the automated extraction and analysis of rich FMs from natural language requirement specifications. Comment: In Proceedings FMSPLE 2016, arXiv:1603.0857
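    The "no valid products" defect mentioned above can be illustrated with a brute-force propositional encoding. The toy feature model and Python check below stand in for Alloy's relational logic and model finder, and are illustrative only.

    # Encode a tiny feature model as boolean constraints and enumerate
    # configurations; an empty result would reveal a void product line.
    from itertools import product

    FEATURES = ["car", "engine", "electric", "gas"]

    def valid(cfg: dict[str, bool]) -> bool:
        """Root 'car' is mandatory, 'engine' is a mandatory child, and
        {electric, gas} form an XOR group under 'engine'."""
        return (
            cfg["car"]
            and cfg["engine"] == cfg["car"]
            and (cfg["electric"] != cfg["gas"])         # exactly one alternative
            and (not cfg["electric"] or cfg["engine"])  # child implies parent
            and (not cfg["gas"] or cfg["engine"])
        )

    products = [
        dict(zip(FEATURES, bits))
        for bits in product([True, False], repeat=len(FEATURES))
        if valid(dict(zip(FEATURES, bits)))
    ]
    print(f"{len(products)} valid product(s)")  # 0 would mean a void SPL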

    Ontology-based patterns for the integration of business processes and enterprise application architectures

    Increasingly, enterprises are using Service-Oriented Architecture (SOA) as an approach to Enterprise Application Integration (EAI). SOA has the potential to bridge the gap between business and technology and to improve the reuse of existing applications and the interoperability with new ones. In addition to service architecture descriptions, architecture abstractions like patterns and styles capture design knowledge and allow the reuse of successfully applied designs, thus improving the quality of software. Knowledge gained from integration projects can be captured to build a repository of semantically enriched, experience-based solutions. Business patterns identify the interaction and structure between users, business processes, and data. Specific integration and composition patterns at a more technical level address enterprise application integration and capture reliable architecture solutions. We use an ontology-based approach to capture architecture and process patterns. Ontology techniques for pattern definition, extension, and composition are developed, and their applicability in business process-driven application integration is demonstrated.
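    The pattern definition, extension, and composition operations described above can be sketched with plain Python classes standing in for an ontology language such as OWL; all pattern names below are illustrative, not taken from the paper.

    # A toy pattern repository: definition, extension (specialisation of a
    # base pattern), and composition (assembling patterns from parts).
    from dataclasses import dataclass, field

    @dataclass
    class Pattern:
        name: str
        intent: str
        extends: "Pattern | None" = None                       # extension
        parts: list["Pattern"] = field(default_factory=list)   # composition

    broker = Pattern("Broker", "decouple service consumers from providers")
    adapter = Pattern("Adapter", "reconcile mismatched service interfaces")
    # extension: a domain-specific refinement of the generic broker pattern
    order_broker = Pattern("OrderBroker", "route order-processing requests",
                           extends=broker)
    # composition: an integration pattern assembled from finer-grained ones
    eai_pattern = Pattern("MediatedIntegration",
                          "integrate applications via a mediator",
                          parts=[order_broker, adapter])

    for part in eai_pattern.parts:
        base = f" (extends {part.extends.name})" if part.extends else ""
        print(part.name + base, "-", part.intent)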