Component Substitution through Dynamic Reconfigurations
Component substitution has numerous practical applications and constitutes an
active research topic. This paper proposes to enrich an existing
component-based framework--a model with dynamic reconfigurations making the
system evolve--with a new reconfiguration operation which "substitutes"
components with other components, and to study its impact on sequences of
dynamic reconfigurations.
Firstly, we define substitutability constraints which ensure component
encapsulation while performing reconfigurations by component substitution.
Then, we integrate them into a substitutability-based simulation to take these
substituting reconfigurations into account in sequences of dynamic
reconfigurations.
reconfigurations. Thirdly, as this new relation is, in general, undecidable
for infinite-state systems, we propose a semi-algorithm to check it on the fly.
Finally, we report on experiments using the B tools to show the feasibility of
the developed approach, and to illustrate the paper's proposals on an example
of an HTTP server.
Comment: In Proceedings FESCA 2014, arXiv:1404.043
Fundamental properties of neighbourhood substitution in constraint satisfaction problems
In combinatorial problems it is often worthwhile simplifying the problem, using operations such as consistency, before embarking on an exhaustive search for solutions. Neighbourhood substitution is such a simplification operation. Whenever a value x for a variable is such that it can be replaced in all constraints by another value y, then x is eliminated.
This paper shows that neighbourhood substitutions are important whether the aim is to find one or all solutions. It is proved that the result of a convergent sequence of neighbourhood substitutions is invariant modulo isomorphism. An efficient algorithm is given to find such a sequence. It is also shown that to combine consistency (of any order) and neighbourhood substitution, we only need to establish consistency once.
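The elimination rule described in this abstract can be sketched concretely. The following is an illustrative toy implementation for binary CSPs only; the representation (domains as sets, constraints as sets of allowed value pairs) and all function names are invented for illustration and are not taken from the paper:

```python
# Toy sketch of neighbourhood substitution for a binary CSP.
# A value a in dom(x) is neighbourhood-substitutable by b if, in every
# constraint on x, any support for a is also a support for b; a can then
# be eliminated from dom(x) without losing satisfiability.

def substitutable(x, a, b, domains, constraints):
    """True if value a of variable x can be replaced by b in all constraints."""
    for (u, v), allowed in constraints.items():
        if u == x:
            for w in domains[v]:
                if (a, w) in allowed and (b, w) not in allowed:
                    return False
        elif v == x:
            for w in domains[u]:
                if (w, a) in allowed and (w, b) not in allowed:
                    return False
    return True

def neighbourhood_substitution(domains, constraints):
    """Repeatedly eliminate substitutable values until a fixed point."""
    changed = True
    while changed:
        changed = False
        for x, dom in domains.items():
            for a in sorted(dom):
                for b in sorted(dom):
                    if a != b and substitutable(x, a, b, domains, constraints):
                        dom.discard(a)
                        changed = True
                        break
    return domains
```

Note that, as the abstract states, the final domains depend on the order of eliminations only up to isomorphism; this sketch simply uses a fixed iteration order.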
Web services synchronization health care application
With the advance of Web Services technologies and the emergence of Web
Services into the information space, tremendous opportunities for empowering
users and organizations appear in various application domains including
electronic commerce, travel, intelligence information gathering and analysis,
health care, digital government, etc. In fact, Web services appear to be a
solution for integrating distributed, autonomous and heterogeneous information
sources. However, as Web services evolve in a dynamic environment, namely the
Internet, many changes can occur and affect them. A Web service is affected when
one or more of its associated information sources is affected by schema
changes. Changes can alter the information sources contents but also their
schemas which may render Web services partially or totally undefined. In this
paper, we propose a solution for integrating information sources into Web
services. Then we tackle the Web service synchronization problem by
substituting the affected information sources. Our work is illustrated with a
healthcare case study.
Comment: 18 pages, 12 figures
Inferring Algebraic Effects
We present a complete polymorphic effect inference algorithm for an ML-style
language with handlers of not only exceptions, but of any other algebraic
effect such as input & output, mutable references and many others. Our main aim
is to offer the programmer a useful insight into the effectful behaviour of
programs. Handlers help here by cutting down possible effects and the resulting
lengthy output that often plagues precise effect systems. Additionally, we
present a set of methods that further simplify the displayed types, some even
by deliberately hiding inferred information from the programmer.
Improving Integrity Constraints Checking In Distributed Databases by Exploiting Local Checking
Integrity constraints are important tools for specifying consistent states of a database. Checking integrity constraints has proven to be extremely difficult to implement, particularly in distributed databases. The main issue concerning checking integrity constraints in a distributed database system is how to derive a set of integrity tests (simplified forms) that will reduce the amount of data transferred, the amount of data accessed, and the number of sites involved during the constraint checking process. Most of the previous approaches derive integrity tests (simplified forms) from the initial integrity constraints with the sufficiency property, since the sufficient test is known to be cheaper to execute than the complete test, as it involves less data transfer across the network and can always be evaluated at the target site, i.e. only one site is involved during the checking process, thus achieving local checking. The previous approaches assume that an update operation will be executed at the site where the relation specified in the update operation is located (the target site), which is not always true. If the update operation is submitted at a different site, the sufficient test is no longer local, as it will definitely access data from remote sites. Therefore, an approach is needed so that local checking can be performed regardless of the location of the submitted update operation.
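The distinction between complete and sufficient tests described above can be illustrated with a toy referential-integrity example. The relations, data, and function names below are invented for illustration and do not reproduce the thesis's actual test-generation method:

```python
# Toy illustration: constraint "every Emp.dept must exist in Dept.id",
# checked when a new employee row is inserted at some site.

def complete_test(new_dept, dept_ids):
    """Complete test: consult the Dept relation, which may sit at a remote
    site, so this can require data transfer across the network."""
    return new_dept in dept_ids

def sufficient_test(new_dept, local_emp_depts):
    """Sufficient test over local data only: if some existing employee
    already references new_dept, the constraint necessarily holds.
    True guarantees consistency; False is merely inconclusive."""
    return new_dept in local_emp_depts

def check_insert(new_dept, local_emp_depts, dept_ids):
    # Prefer the cheap local test; fall back to the complete (possibly
    # remote) test only when the local test is inconclusive.
    if sufficient_test(new_dept, local_emp_depts):
        return True
    return complete_test(new_dept, dept_ids)
```

The point of the sketch is the asymmetry: a passing sufficient test settles the check using only data at the checking site, while a failing one forces the expensive complete test, which is why test selection matters.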
In this thesis, we propose an approach for checking integrity constraints in a distributed database system by utilizing, as much as possible, the information stored at the target site. The proposed constraint simplification approach produces support tests, and these are integrated with the complete and sufficient tests proposed by previous researchers. It uses the initial integrity constraint, the update template, and the other integrity constraints to generate the support tests.
The proposed constraint simplification approach adopts the substitution technique and the absorption rules to derive the tests. Since the constraint simplification approach derives several different types of integrity tests for a given update operation and integrity constraint, a strategy to select the most suitable test is needed. We propose a model to rank and select the suitable test to be checked based on the properties of the tests, the amount of data transferred across the network, the number of sites participating, and the amount of data accessed.
Three analyses have been performed to evaluate the proposed integrity constraint checking approach. The first analysis shows that applying different types of integrity tests has a different impact on the performance of constraint checking with respect to the amount of data transferred across the network, which is considered the most critical factor influencing the performance of the checking mechanism. Integrating these various types of integrity tests during constraint checking has enhanced the performance of the checking mechanisms. The second analysis shows that the cost of checking integrity constraints is reduced when various combinations of integrity tests are selected. The third analysis shows that, in most cases, localized integrity checking can be achieved regardless of the location where the update operation is executed, when various types of integrity tests are considered.
Factoring Predicate Argument and Scope Semantics: Underspecified Semantics with LTAG
In this paper we propose a compositional semantics for lexicalized tree-adjoining grammar (LTAG). Tree-local multicomponent derivations allow separation of the semantic contribution of a lexical item into one component contributing to the predicate argument structure and a second component contributing to scope semantics. Based on this idea, a syntax-semantics interface is presented where the compositional semantics depends only on the derivation structure. It is shown that the derivation structure (and indirectly the locality of derivations) allows an appropriate amount of underspecification. This is illustrated by investigating underspecified representations for quantifier scope ambiguities and related phenomena such as adjunct scope and island constraints.
D-Tree Grammars
DTG are designed to share some of the advantages of TAG while overcoming some
of its limitations. DTG involve two composition operations called subsertion
and sister-adjunction. The most distinctive feature of DTG is that, unlike TAG,
there is complete uniformity in the way that the two DTG operations relate
lexical items: subsertion always corresponds to complementation and
sister-adjunction to modification. Furthermore, DTG, unlike TAG, can provide a
uniform analysis for wh-movement in English and Kashmiri, despite the fact
that the wh element in Kashmiri appears in sentence-second position, and not
in sentence-initial position as in English.
Comment: Latex source, needs aclap.sty, 8 pages, to appear in ACL-9