392 research outputs found
Towards data-aware resource analysis for service orchestrations
Compile-time program analysis techniques can be applied to Web service orchestrations to prove or check various properties. In particular, service orchestrations can be subjected to resource analysis, in which safe approximations of upper and lower resource usage bounds are deduced. A uniform analysis can be performed simultaneously for different generalized resources that can be directly correlated with cost- and performance-related quality attributes, such as invocations of partners, network traffic, number of activities, iterations, and data accesses. The resulting safe upper and lower bounds do not depend on probabilistic assumptions, and are expressed as functions of the size or length of data components from an initiating message, using a fine-grained structured data model that corresponds to the XML style of information structuring. The analysis is performed by transforming a BPEL-like representation of an orchestration into an equivalent program in another programming language for which the appropriate analysis tools already exist.
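As a rough illustration of the kind of result such an analysis yields (the paper derives these bounds automatically from the BPEL-like description; the toy orchestration, names, and bound shapes below are invented for this sketch), consider an orchestration that invokes a partner once per item of the initiating message plus one unconditional audit call:

```python
# Hypothetical sketch, not the paper's tool output: hand-written upper/lower
# bounds on one generalized resource (partner invocations) as functions of the
# number of <item> elements in the initiating message.

def partner_invocations_lower(n_items: int) -> int:
    """Lower bound: the audit call always happens, even for an empty message."""
    return 1

def partner_invocations_upper(n_items: int) -> int:
    """Upper bound: at most one call per item, plus the audit call."""
    return n_items + 1

if __name__ == "__main__":
    for n in (0, 10, 1000):
        print(f"items={n}: invocations in [{partner_invocations_lower(n)}, "
              f"{partner_invocations_upper(n)}]")
```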
Lattice relaxation around arsenic and selenium in CdTe
We have investigated the lattice relaxation around impurity atoms on the anion sublattice in CdTe, such as As, which acts as an acceptor, and Se, which is isovalent to Te, with fluorescence-detected EXAFS. We experimentally verify the lattice relaxation, with the bond length around the As atom being reduced by 8%, as had been inferred indirectly from ab initio calculations of the electric field gradient in comparison with the value measured in a PAC experiment (S. Lany et al., Phys. Rev. B 62, R2259 (2000)). In the case of the isovalent impurity atom Se, the bond length is similarly reduced, as determined experimentally by EXAFS and by model calculations with density functional theory as implemented in the WIEN97 program. In contrast to this inward relaxation, preliminary calculations for Br in CdTe, the next element in the series As, Se, and Br, which acts as a donor on the Te sublattice, indicate an increase in bond length, an interesting prediction awaiting experimental verification.
Towards structured state threading in prolog
It is very often the case that programs require passing, maintaining, and updating some notion of state. Prolog programs often implement such stateful computations by carrying this state in predicate arguments (or, alternatively, in the internal database). This often causes code obfuscation, complicates code reuse, introduces dependencies on the data model, and is prone to incorrect propagation of the state information among predicate calls. To partly solve these problems, we introduce contexts as a consistent mechanism for specifying implicit arguments and their threading through clause goals. We propose a notation and an interpretation for contexts, ranging from single goals to complete programs, give an intuitive semantics, and describe a translation into standard Prolog. We also discuss a particular lightweight implementation in Ciao Prolog, and we show the usefulness of our proposals on a series of examples and applications, including code directly using contexts, DCGs, extended DCGs, logical loops, and other custom control structures.
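The contexts mechanism itself is Prolog-specific; as a loose cross-language analogy only (not the paper's notation or translation), the sketch below contrasts explicitly threading state through every call with keeping it in an implicit context object, which is roughly the convenience the proposal brings to Prolog predicates. All names are invented.

```python
# Loose analogy, not the paper's mechanism: explicit state threading versus an
# implicit "context" carrying the state for a group of operations.

# Explicit threading: every step takes the state and returns the new one, so
# every caller must pass it along correctly (cf. state in predicate arguments).
def log_event_explicit(state: dict, event: str) -> dict:
    return {**state, "events": state["events"] + [event]}

def event_count_explicit(state: dict) -> int:
    return len(state["events"])

# Implicit threading: the state lives in a context shared by the operations,
# so intermediate code no longer mentions it at all.
class EventContext:
    def __init__(self) -> None:
        self.events: list[str] = []

    def log_event(self, event: str) -> None:
        self.events.append(event)

    def event_count(self) -> int:
        return len(self.events)

if __name__ == "__main__":
    s = {"events": []}
    s = log_event_explicit(s, "start")
    print(event_count_explicit(s))  # 1

    ctx = EventContext()
    ctx.log_event("start")
    print(ctx.event_count())        # 1
```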
Towards data-aware cost-driven adaptation for service orchestrations.
Several activities in service-oriented computing, such as automatic composition, monitoring, and adaptation, can benefit from knowing properties of a given service composition before executing it. Among these properties we focus on those related to execution cost and resource usage, in a wide sense, as they can be linked to QoS characteristics. In order to attain more accuracy, we formulate execution cost and resource usage as functions of input data (or appropriate abstractions thereof) and show how these functions can be used to make better, more informed decisions when performing composition, adaptation, and proactive monitoring. We present an approach to, on the one hand, synthesizing these functions automatically from the definitions of the different orchestrations taking part in a system and, on the other hand, effectively using them to reduce the overall costs of non-trivial service-based systems featuring sensitivity to data and the possibility of failure. We validate our approach by means of simulations of scenarios needing runtime selection of services and adaptation due to service failure. A number of rebinding strategies, including the use of cost functions, are compared.
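A minimal sketch of how such data-dependent cost functions could drive runtime selection among candidate services (the services and cost shapes are invented; the paper synthesizes these functions automatically rather than hard-coding them):

```python
# Invented example: a rebinding decision driven by cost functions of input size.
from typing import Callable, Dict

CostFn = Callable[[int], float]

candidates: Dict[str, CostFn] = {
    "bulk_service":   lambda n: 50.0 + 0.1 * n,  # high fixed cost, cheap per item
    "stream_service": lambda n: 2.0 * n,         # no fixed cost, costly per item
}

def select_binding(input_size: int) -> str:
    """Pick the candidate with the lowest estimated cost for this input."""
    return min(candidates, key=lambda name: candidates[name](input_size))

if __name__ == "__main__":
    for n in (5, 500):
        print(f"input size {n}: bind to {select_binding(n)}")
```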
A sharing-based approach to supporting adaptation in service compositions
Data-related properties of the activities involved in a service composition can be used to facilitate several design-time and run-time adaptation tasks, such as service evolution, distributed enactment, and instance-level adaptation. A number of these properties can be expressed using a notion of sharing. We present an approach for automated inference of data properties based on sharing analysis, which is able to handle service compositions with complex control structures, involving loops and sub-workflows. The properties inferred can include data dependencies, information content, domain-defined attributes, and privacy or confidentiality levels, among others. The analysis produces characterizations of the data and the activities in the composition in terms of minimal and maximal sharing, which can then be used to verify compliance of potential adaptation actions, or as supporting information in their generation. This sharing analysis approach can be used both at design time and at run time. In the latter case, the results of the analysis can be refined using the composition traces (execution logs) at the point of execution, in order to support run-time adaptation.
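A small invented example of how such sharing results might be consulted when vetting an adaptation action (the paper infers the sharing sets automatically; the activities, data items, and policy below are hypothetical):

```python
# Hypothetical check of an adaptation action against maximal-sharing results:
# may_share over-approximates the data items each activity may share with.

may_share = {
    "collect_order": {"order", "customer"},
    "bill_customer": {"order", "customer", "card"},
    "ship_goods":    {"order"},
}

def can_outsource(activity: str, confidential: set) -> bool:
    """Allow moving an activity to a third party only if it cannot possibly
    share any confidential data item."""
    return may_share[activity].isdisjoint(confidential)

if __name__ == "__main__":
    confidential = {"card"}
    for activity in may_share:
        status = "ok" if can_outsource(activity, confidential) else "blocked"
        print(f"{activity}: {status}")
```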
Strategies and Networks for State-Dependent Quantum Cloning
State-dependent cloning machines that have so far been considered either deterministically copy a set of states approximately, or probabilistically copy them exactly. In considering the case of two equiprobable pure states, we derive the maximum global fidelity of N approximate clones given M initial exact copies, where N > M. We also consider strategies which interpolate between approximate and exact cloning. A tight inequality is obtained which expresses a trade-off between the global fidelity and success probability. This inequality is found to tend, in the limit as N → ∞, to a known inequality which expresses the trade-off between error and inconclusive result probabilities for state-discrimination measurements. Quantum-computational networks are also constructed for the kinds of cloning machine we describe. For this purpose, we introduce two gates: the distinguishability transfer and state separation gates. Their key properties are described.
Comment: 12 pages, 6 eps figures, submitted to Phys. Rev.
Nonlocality without inequalities has not been proved for maximally entangled states
Two approaches to extend Hardy's proof of nonlocality without inequalities to maximally entangled states of bipartite two-level systems are shown to fail. On one hand, it is shown that Wu and co-workers' proof [Phys. Rev. A 53, R1927 (1996)] uses an effective state which is not maximally entangled. On the other hand, it is demonstrated that Hardy's proof cannot be generalized by the replacement of one of the four von Neumann measurements involved in the original proof by a generalized measurement to unambiguously discriminate between non-orthogonal states.
Comment: 7 pages, 2 figures. To appear in Phys. Rev.
A factor analytic study of the Italian National Institute of Health Quality of Life – Core Evaluation Form (ISSQoL-CEF)
Objectives: The Italian National Institute of Health Quality of Life - Core Evaluation Form (ISSQoL-CEF) is a specific questionnaire measuring health-related quality of life for human immunodeficiency virus-infected people in the era of highly active antiretroviral therapy. The main goal of this study was to examine the construct validity of this questionnaire by confirming its hypothesized dimensional structure. Methods: Baseline quality of life data from four clinical studies were collected and a confirmatory factor analysis of the ISSQoL-CEF items was carried out. Both first-order and second-order factor models were tested: Model 1 with nine correlated first-order factors; Model 2 with three correlated second-order factors (Physical, Mental, and Social Health); Model 3 with two correlated second-order factors (Physical and Mental/Social Health); Model 4 with only one second-order factor (General Health). Results: A total of 261 patients were surveyed. Model 1 had a good fit to the data. Model 2 had an acceptable fit to the data and was the best of the hierarchical models; however, it fitted the data worse than Model 1. Conclusions: The findings of this study, consistent with the results of a previous study, support the construct validity of the ISSQoL-CEF. © 2010 Lauriola et al, publisher and licensee Dove Medical Press Ltd.
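For reference, a second-order structure of the kind compared here can be written in generic structural equation modelling notation (standard notation, not reproduced from the paper): observed ISSQoL-CEF items load on first-order factors, which in turn load on second-order factors (e.g., Physical, Mental, and Social Health in Model 2).

```latex
% Generic second-order CFA equations (standard SEM notation, not from the paper).
% y: observed item scores; \eta: first-order factors; \xi: second-order factors.
\begin{align}
  y    &= \Lambda_y\, \eta + \varepsilon, \\
  \eta &= \Gamma\, \xi + \zeta,
\end{align}
% where \varepsilon and \zeta are the item-level and factor-level residuals.
```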