ImmPort, toward repurposing of open access immunological assay data for translational and clinical research
Immunology researchers are beginning to explore the possibilities of reproducibility, reuse, and secondary analysis of immunology data. Open-access datasets are being used to validate the methods of the original studies, to leverage multiple studies for meta-analysis, and to generate new hypotheses. To promote these goals, the ImmPort data repository was created for the broader research community to explore the wide spectrum of clinical and basic research data and associated findings. The ImmPort ecosystem consists of four components (Private Data, Shared Data, Data Analysis, and Resources) for data archiving, dissemination, analysis, and reuse. To date, more than 300 studies have been made freely available through the ImmPort Shared Data portal, which allows research data to be repurposed to accelerate the translation of new insights into discoveries.
Polyglot Semantic Parsing in APIs
Traditional approaches to semantic parsing (SP) work by training individual
models for each available parallel dataset of text-meaning pairs. In this
paper, we explore the idea of polyglot semantic translation, or learning
semantic parsing models that are trained on multiple datasets and natural
languages. In particular, we focus on translating text to code signature
representations using the software component datasets of Richardson and Kuhn
(2017a,b). The advantage of such models is that they can be used for parsing a
wide variety of input natural languages and output programming languages, or
mixed input languages, using a single unified model. To facilitate modeling of
this type, we develop a novel graph-based decoding framework that achieves
state-of-the-art performance on the above datasets, and apply this method to
two other benchmark SP tasks. Comment: accepted for NAACL-2018 (camera-ready version)
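To make the polyglot idea concrete, here is a minimal Python sketch of the data-level setup such models rely on: per-language parallel datasets merged into one pool, with language-identifier tokens marking the input and output languages. A trivial token-overlap scorer stands in for the trained model; this is not the paper's graph-based decoder, and all names and data are hypothetical.

```python
# Minimal sketch of a "polyglot" text-to-signature setup: per-language
# parallel datasets are merged into one training pool, with language-ID
# tokens marking the input and output languages. A real system would train
# a single unified model on this pool; here a trivial token-overlap scorer
# stands in for the model. All names and data below are hypothetical.

def tag(example, nl, pl):
    """Prefix language-identifier tokens, as in polyglot training setups."""
    text, signature = example
    return (f"<{nl}> <{pl}> {text}", signature)

# Toy parallel datasets: (description, code signature) pairs.
datasets = {
    ("en", "java"): [("parse an integer from a string",
                      "int Integer.parseInt(String s)")],
    ("de", "java"): [("liest eine ganze Zahl aus einer Zeichenkette",
                      "int Integer.parseInt(String s)")],
    ("en", "python"): [("split a string on a separator",
                        "str.split(sep) -> list")],
}

# Merge everything into a single pool -- the core polyglot idea.
pool = [tag(ex, nl, pl) for (nl, pl), exs in datasets.items() for ex in exs]

def parse(query, nl, pl):
    """Score pool entries by token overlap with the tagged query."""
    tagged = f"<{nl}> <{pl}> {query}"
    q = set(tagged.split())
    return max(pool, key=lambda e: len(q & set(e[0].split())))[1]

print(parse("parse an integer", "en", "java"))
# -> int Integer.parseInt(String s)
```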
A monitoring approach for runtime service discovery
Effective runtime service discovery requires identification of services based on different service characteristics such as structural, behavioural, quality, and contextual characteristics. However, current service registries describe services in terms of structural and, sometimes, quality characteristics only, and therefore it is not always possible to assume that services in them will have all the characteristics required for effective service discovery. In this paper, we describe a monitor-based runtime service discovery framework called MoRSeD. The framework supports service discovery in both push and pull modes of query execution. The push mode of query execution is performed in parallel with the execution of a service-based system, in a proactive way. Both types of queries are specified in a query language called SerDiQueL that allows the representation of structural, behavioural, quality, and contextual conditions on the services to be identified. The framework uses a monitor component to verify whether behavioural and contextual conditions in the queries can be satisfied by services, based on translations of these conditions into properties represented in event calculus and verification of the satisfiability of these properties against the services. The monitor is also used to detect when services participating in a service-based system become unavailable, and to identify changes in the behavioural and contextual characteristics of the services. A prototype implementation of the framework has been developed, and the framework has been evaluated by comparing its performance with and without the monitor component.
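As a rough illustration of the monitoring step, the following Python sketch checks an event-calculus-style condition (a fluent that events initiate and terminate) against a log of service events. It is a generic illustration of the approach, not MoRSeD's SerDiQueL translation; all names are hypothetical.

```python
# Minimal event-calculus sketch of the kind of check described: behavioural
# and contextual conditions are expressed as fluents that events initiate or
# terminate, and a monitor decides whether a fluent holds at query time.
# Generic illustration, not MoRSeD's implementation; names are hypothetical.

INITIATES = {"service_up": "available"}
TERMINATES = {"service_down": "available"}

def holds_at(fluent, t, events):
    """HoldsAt(fluent, t): initiated before t and not terminated since."""
    state = False
    for time, event in sorted(events):
        if time >= t:
            break
        if INITIATES.get(event) == fluent:
            state = True
        elif TERMINATES.get(event) == fluent:
            state = False
    return state

# Monitored event log for a candidate service: (time, event) pairs.
log = [(1, "service_up"), (5, "service_down"), (9, "service_up")]

# A discovery query's contextual condition: "available at t = 10".
print(holds_at("available", 10, log))   # True
print(holds_at("available", 6, log))    # False
```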
Multilevel Contracts for Trusted Components
This article contributes to the design and verification of trusted components and services. Contracts are defined at several levels to cover different facets, such as component consistency, compatibility, and correctness. The article introduces multilevel contracts and a design-and-verification process for handling and analysing these contracts in component models. The approach is implemented with the COSTO platform, which supports the Kmelia component model. A case study illustrates the overall approach. Comment: In Proceedings WCSI 2010, arXiv:1010.233
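As a loose illustration of contracts at more than one level, the sketch below pairs a signature-level consistency check with a behavioural pre/postcondition check around a call. This is a generic Python illustration, not the COSTO/Kmelia implementation; all names are hypothetical.

```python
# Sketch of contracts at two levels, in the spirit described: a
# signature-level check (consistency) and a behavioural pre/postcondition
# check (correctness). Generic illustration, not the COSTO/Kmelia
# implementation; all names below are hypothetical.

from dataclasses import dataclass
from typing import Callable

@dataclass
class ServiceContract:
    name: str
    signature: tuple           # (argument types, result type)
    pre: Callable[..., bool]   # behavioural precondition
    post: Callable[..., bool]  # behavioural postcondition

def consistent(provided: ServiceContract, required: ServiceContract) -> bool:
    """Level 1: the signatures the two components expect must match."""
    return provided.signature == required.signature

def correct_call(c: ServiceContract, fn, *args):
    """Level 2: check pre/postconditions around an actual call."""
    assert c.pre(*args), f"precondition of {c.name} violated"
    result = fn(*args)
    assert c.post(*args, result), f"postcondition of {c.name} violated"
    return result

sqrt_contract = ServiceContract(
    name="sqrt",
    signature=((float,), float),
    pre=lambda x: x >= 0,
    post=lambda x, r: abs(r * r - x) < 1e-9,
)

print(correct_call(sqrt_contract, lambda x: x ** 0.5, 2.0))
```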
On degenerate models of cosmic inflation
In this article we discuss the role of current and future CMB measurements in pinning down the model of inflation responsible for the generation of primordial curvature perturbations. By considering a parameterization of the effective field theory of inflation with a modified dispersion relation arising from heavy fields, we derive the dependence of cosmological observables on the scale of the heavy physics. Specifically, we show how the non-linearity parameters are related to the phase velocity of curvature perturbations at horizon exit, which is parameterized by this scale. Bicep2 and Planck findings are shown to be consistent with this parameterization. However, we find a degeneracy in the parameter space of inflationary models that can only be resolved with a detailed knowledge of the shape of the non-Gaussian bispectrum. Comment: 22pp., 1 fig; v2: added some clarifications and references, corrected typos, matches published version
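For orientation, the following is a schematic of the kind of modified dispersion relation that integrating out a heavy field induces, in a commonly quoted form; the paper's exact expression and conventions may differ.

```latex
% Schematic modified dispersion relation induced by a heavy field of mass M
% (a commonly quoted form; the paper's exact expression may differ):
\[
  \omega^2(k) \;=\; c_s^2\,k^2 \;+\; \frac{k^4}{M^2},
  \qquad
  v_\varphi(k) \;=\; \frac{\omega(k)}{k}
              \;=\; \sqrt{\,c_s^2 + \frac{k^2}{M^2}\,}.
\]
% The phase velocity is evaluated at horizon exit, where \omega \simeq H,
% which ties observables such as the non-linearity parameters to M.
```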
Analysis of Feature Models Using Alloy: A Survey
Feature Models (FMs) are a mechanism to model variability among a family of
closely related software products, i.e. a software product line (SPL). Analysis
of FMs using formal methods can reveal defects in the specification such as
inconsistencies that cause the product line to have no valid products. A
popular framework used in research for FM analysis is Alloy, a lightweight
formal modeling notation equipped with an efficient model finder. Several works
in the literature have proposed different strategies to encode and analyze FMs
using Alloy. However, there is little discussion on the relative merits of each
proposal, making it difficult to select the most suitable encoding for a
specific analysis need. In this paper, we describe and compare those strategies
according to various criteria such as the expressivity of the FM notation or
the efficiency of the analysis. This survey is the first comparative study of
research targeted towards using Alloy for FM analysis. This review aims to identify best practices in the use of Alloy, as part of a framework for the automated extraction and analysis of rich FMs from natural language requirement specifications. Comment: In Proceedings FMSPLE 2016, arXiv:1603.0857
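As a concrete illustration of the simplest analysis discussed (detecting a void model, i.e. one with no valid products), the Python sketch below enumerates the configurations of a toy feature model; Alloy-based approaches delegate this search to a bounded model finder. The feature model and constraints are hypothetical.

```python
# Sketch of the kind of FM defect analysis described: decide whether a
# feature model admits any valid product (a "void" model has none). Alloy
# does this with a bounded model finder; brute-force enumeration stands in
# for it here. The toy feature model below is hypothetical.

from itertools import product

FEATURES = ["car", "engine", "gas", "electric", "sunroof"]

def valid(cfg):
    """Constraints of a toy FM: root, mandatory child, xor group, optional."""
    has = cfg.__contains__
    return (
        has("car")                             # root feature is always present
        and has("engine")                      # mandatory child of the root
        and (has("gas") != has("electric"))    # xor group under engine
        # "sunroof" is optional, so it adds no constraint of its own
    )

configurations = [
    {f for f, on in zip(FEATURES, bits) if on}
    for bits in product([0, 1], repeat=len(FEATURES))
]
valid_products = [c for c in configurations if valid(c)]

print(f"{len(valid_products)} valid products")  # non-zero => FM is not void
```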
Ontology-based patterns for the integration of business processes and enterprise application architectures
Increasingly, enterprises are using Service-Oriented Architecture (SOA) as an approach to Enterprise Application Integration (EAI). SOA has the potential to bridge
the gap between business and technology and to improve the reuse of existing applications and the interoperability with new ones. In addition to service architecture
descriptions, architecture abstractions like patterns and styles capture design knowledge and allow the reuse of successfully applied designs, thus improving the quality of
software. Knowledge gained from integration projects can be captured to build a repository of semantically enriched, experience-based solutions. Business patterns identify the interaction and structure between users, business processes, and data.
Specific integration and composition patterns at a more technical level address enterprise application integration and capture reliable architecture solutions. We use an
ontology-based approach to capture architecture and process patterns. Ontology techniques for pattern definition, extension and composition are developed and their
applicability in business process-driven application integration is demonstrated.
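As a rough sketch of ontology-based pattern capture in this spirit, the Python fragment below records a business pattern, a technical integration pattern, and a composition relation between them as RDF triples with rdflib; the vocabulary is hypothetical, not the paper's ontology.

```python
# Sketch of ontology-based pattern capture: integration patterns and their
# relationships recorded as RDF triples so they can be queried and composed.
# Uses rdflib; the vocabulary below is hypothetical, not the paper's ontology.

from rdflib import Graph, Namespace, Literal, RDF, RDFS

EX = Namespace("http://example.org/eai-patterns#")
g = Graph()
g.bind("ex", EX)

# Capture a business-level pattern and a technical integration pattern.
g.add((EX.OrderProcessing, RDF.type, EX.BusinessPattern))
g.add((EX.MessageBroker, RDF.type, EX.IntegrationPattern))
g.add((EX.MessageBroker, RDFS.comment,
       Literal("Decouples producers and consumers via a mediating broker")))

# Composition: the business pattern is realised by the technical one.
g.add((EX.OrderProcessing, EX.realisedBy, EX.MessageBroker))

# Query the repository for the patterns realising a business pattern.
for pattern in g.objects(EX.OrderProcessing, EX.realisedBy):
    print(pattern)   # http://example.org/eai-patterns#MessageBroker
```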