
    A Framework for Evaluating Model-Driven Self-adaptive Software Systems

    In the last few years, Model-Driven Development (MDD), Component-Based Software Development (CBSD), and context-oriented software have become interesting alternatives for the design and construction of self-adaptive software systems. In general, the ultimate goal of these technologies is to reduce development costs and effort while improving the modularity, flexibility, adaptability, and reliability of software systems. An analysis of these technologies shows that they all apply the principle of separation of concerns, and their further integration is a key factor in obtaining high-quality, self-adaptable software systems. Each technology identifies different concerns and deals with them separately in order to specify the design of self-adaptive applications while, at the same time, supporting software with adaptability and context-awareness. This research studies development methodologies that employ the principles of model-driven development in building self-adaptive software systems. To this aim, this article proposes an evaluation framework for analysing and evaluating the features of model-driven approaches and their ability to support software with self-adaptability and dependability in highly dynamic contextual environments. Such an evaluation framework can help software developers select a development methodology that suits their software requirements and reduces the effort of building self-adaptive software systems. This study highlights the major drawbacks of the model-driven approaches proposed in related work and emphasises the need to consider the volatile aspects of self-adaptive software in the analysis, design, and implementation phases of the development methodologies.
In addition, we argue that development methodologies should leave the selection of modelling languages and modelling tools to the software developers. Keywords: model-driven architecture, COP, AOP, component composition, self-adaptive application, context-oriented software development
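As a concrete illustration of the context-oriented adaptation these methodologies target, the following minimal Python sketch (all names are hypothetical, not taken from the article) shows a component switching behaviour when a monitored context changes:

```python
# Hypothetical sketch of context-oriented behaviour adaptation: a component
# swaps behaviour when its execution context changes, instead of hard-coding
# one strategy. Class and context names are illustrative only.

class ConnectionManager:
    def __init__(self):
        self.context = "normal"

    def sense(self, battery_low):
        # A monitor updates the active context (the adaptation feedback loop).
        self.context = "low_power" if battery_low else "normal"

    def sync(self):
        # Behaviour is selected per context rather than fixed at design time.
        behaviours = {
            "normal": "sync every minute",
            "low_power": "sync hourly, batch uploads",
        }
        return behaviours[self.context]

mgr = ConnectionManager()
print(mgr.sync())            # sync every minute
mgr.sense(battery_low=True)
print(mgr.sync())            # sync hourly, batch uploads
```

The point of the sketch is the separation of concerns the abstract describes: the sensing logic and the behavioural variants are specified independently and composed at runtime.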

    Use of Banking Services in Emerging Markets -Household-Level Evidence (Replaces CentER DP 2010-092)

    This paper uses survey data for 60,000 households from 29 transition economies in 2006 and 2010 to explore how the use of banking services is related to household characteristics, as well as to bank ownership, deposit insurance and creditor protection. At the household level we find that the holding of a bank account, a bank card, or a mortgage increases with income and education in most countries, and we find evidence for an urban-rural gap. The use of banking services is also related to the religion and social integration of a household, as well as the gender of the household head. Using the within-country variation between 2006 and 2010, we find that the privatization of state-owned banks and an increase in the market share of foreign banks are associated with a stronger use of banking services. Foreign bank ownership is also associated with a higher use of bank services among high-income households and households with formal employment. State ownership, by contrast, is hardly associated with more outreach to poorer households. More generous deposit insurance and stronger creditor rights also foster the use of banking services among the urban, rich, better educated and formally employed. Keywords: access to finance; household finance; bank ownership; deposit insurance; creditor protection

    Semantic model-driven development of service-centric software architectures

    Service-oriented architecture (SOA) is a recent architectural paradigm that has received much attention. The prevalent focus on platforms such as Web services, however, needs to be complemented by appropriate software engineering methods. We propose the model-driven development of service-centric software systems. In particular, we present an investigation into the role of enriched semantic modelling in a model-driven development framework for service-centric software systems. Ontologies as the foundation of semantic modelling, and its enhancement through architectural pattern modelling, are at the core of the proposed approach. We introduce the foundations and discuss the benefits, as well as the challenges, in this context.

    Engineering failure analysis and design optimisation with HiP-HOPS

    The scale and complexity of computer-based safety-critical systems, like those used in the transport and manufacturing industries, pose significant challenges for failure analysis. Over the last decade, research has focused on automating this task. In one approach, predictive models of system failure are constructed from the topology of the system and local component failure models using a process of composition. An alternative approach employs model-checking of state automata to study the effects of failure and verify system safety properties. In this paper, we discuss these two approaches to failure analysis. We then focus on Hierarchically Performed Hazard Origin & Propagation Studies (HiP-HOPS) - one of the more advanced compositional approaches - and discuss its capabilities for automatic synthesis of fault trees, combinatorial Failure Modes and Effects Analyses, and reliability versus cost optimisation of systems via application of automatic model transformations. We summarise these contributions and demonstrate the application of HiP-HOPS on a simplified fuel oil system for a ship engine. In light of this example, we discuss strengths and limitations of the method in relation to other state-of-the-art techniques. In particular, because HiP-HOPS is deductive in nature, relating system failures back to their causes, it is less prone to combinatorial explosion and can more readily be iterated. For this reason, it enables exhaustive assessment of combinations of failures and design optimisation using computationally expensive meta-heuristics. (C) 2010 Elsevier Ltd. All rights reserved.
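The compositional idea behind fault-tree synthesis can be illustrated with a toy evaluator. This is a hypothetical sketch of the general technique, not HiP-HOPS itself or its API; the fuel-oil scenario below is a made-up miniature loosely inspired by the paper's example:

```python
from itertools import product

# Toy compositional fault-tree evaluator: each gate combines the cut sets
# (sets of basic failures that cause the gate's output) of its children.

def basic(event):
    # A basic failure event is its own single-element cut set.
    return [frozenset([event])]

def or_gate(*children):
    # An OR gate fails if any child fails: union of the children's cut sets.
    return [cs for child in children for cs in child]

def and_gate(*children):
    # An AND gate fails only if all children fail: cross-product of cut sets.
    return [frozenset().union(*combo) for combo in product(*children)]

# Hypothetical simplified fuel-oil system: the engine starves if both
# redundant pumps fail, or if the supply valve sticks closed.
top = or_gate(
    and_gate(basic("pump_A_fails"), basic("pump_B_fails")),
    basic("valve_stuck"),
)

# Keep only the minimal cut sets (no proper subset is also a cut set).
minimal = {cs for cs in top if not any(other < cs for other in top)}
print(sorted(sorted(cs) for cs in minimal))
# [['pump_A_fails', 'pump_B_fails'], ['valve_stuck']]
```

Because the analysis is deductive (working back from the top event to its causes), the cut-set representation stays compact, which is the property the paper credits for HiP-HOPS's resistance to combinatorial explosion.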

    Semi-automatic distribution pattern modeling of web service compositions using semantics

    Enterprise systems are frequently built by combining a number of discrete Web services together, a process termed composition. There are a number of architectural configurations, or distribution patterns, which express how a composed system is to be deployed. Previously, we presented a Model Driven Architecture using UML 2.0, which took existing service interfaces as its input and generated an executable Web service composition, guided by a distribution pattern model. In this paper, we propose using Web service semantic descriptions in addition to Web service interfaces to assist in the semi-automatic generation of the distribution pattern model. Web services described using semantic languages, such as OWL-S, can be automatically assessed for compatibility, and their input and output messages can be mapped to each other.
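The input/output compatibility assessment described above can be sketched as a subsumption check over service message concepts. The mini-ontology and service concepts below are illustrative assumptions, not actual OWL-S:

```python
# Hypothetical sketch of semantic service matching: two services are
# composable if every input concept the consumer expects is matched by an
# output concept of the producer, allowing subclass (subsumption) matches.
# The flat dict below stands in for a real ontology.

subclass_of = {
    "CreditCard": "PaymentMethod",
    "Invoice": "Document",
}

def subsumes(general, specific):
    # Walk up the (single-inheritance) concept hierarchy from `specific`.
    while specific is not None:
        if specific == general:
            return True
        specific = subclass_of.get(specific)
    return False

def compatible(producer_outputs, consumer_inputs):
    return all(
        any(subsumes(inp, out) for out in producer_outputs)
        for inp in consumer_inputs
    )

# A service emitting a CreditCard token can feed one expecting any
# PaymentMethod, but not one expecting an Invoice.
print(compatible({"CreditCard"}, {"PaymentMethod"}))  # True
print(compatible({"CreditCard"}, {"Invoice"}))        # False
```

Real semantic matchmakers grade matches (exact, plug-in, subsumes, fail) rather than returning a boolean, but the core mapping of outputs to inputs is the same.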

    An R library for compositional data analysis in archaeometry

    Compositional data naturally arises from the scientific analysis of the chemical composition of archaeological material such as ceramic and glass artefacts. Data of this type can be explored using a variety of techniques, from standard multivariate methods such as principal components analysis and cluster analysis, to methods based upon the use of log-ratios. The general aim is to identify groups of chemically similar artefacts that could potentially be used to answer questions of provenance. This paper will demonstrate work in progress on the development of a documented library of methods, implemented using the statistical package R, for the analysis of compositional data. R is an open source package that makes available very powerful statistical facilities at no cost. We aim to show how, with the aid of statistical software such as R, traditional exploratory multivariate analysis can easily be used alongside, or in combination with, specialist techniques of compositional data analysis. The library has been developed from a core of basic R functionality, together with purpose-written routines arising from our own research (for example that reported at CoDaWork'03). In addition, we have included other appropriate publicly available techniques and libraries that have been implemented in R by other authors. Available functions range from standard multivariate techniques through to various approaches to log-ratio analysis and zero replacement. We also discuss and demonstrate a small selection of relatively new techniques that have hitherto been little-used in archaeometric applications involving compositional data. 
The application of the library to the analysis of data arising in archaeometry will be demonstrated; results from different analyses will be compared; and the utility of the various methods discussed. Funding and support: Geologische Vereinigung; Institut d’Estadística de Catalunya; International Association for Mathematical Geology; Patronat de l’Escola Politècnica Superior de la Universitat de Girona; Fundació privada: Girona, Universitat i Futur; Càtedra Lluís Santaló d’Aplicacions de la Matemàtica; Consell Social de la Universitat de Girona; Ministerio de Ciencia i Tecnología
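One of the log-ratio methods such a library provides, the centred log-ratio (clr) transform, can be sketched in a few lines. The library described above is in R; this is a plain-Python illustration with made-up composition values:

```python
import math

# Centred log-ratio (clr) transform: map a composition (vector of positive
# parts, e.g. oxide percentages of a ceramic sherd) to log-ratio coordinates
# on which standard multivariate methods (PCA, clustering) can operate.

def clr(parts):
    logs = [math.log(p) for p in parts]
    geometric_mean_log = sum(logs) / len(logs)
    return [x - geometric_mean_log for x in logs]

# Hypothetical three-part chemical composition, in percent.
composition = [55.0, 30.0, 15.0]
transformed = clr(composition)
print(transformed)  # clr coordinates sum to (numerically) zero
```

The zero-sum constraint of clr coordinates is what lets compositional data, which is otherwise trapped in the simplex, be analysed with ordinary Euclidean techniques; note that zeros must be replaced before taking logs, which is why zero-replacement routines appear in the library.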

    The influence of CpG and UpA dinucleotide frequencies on RNA virus replication and characterization of the innate cellular pathways underlying virus attenuation and enhanced replication

    Most RNA viruses infecting mammals and other vertebrates show profound suppression of CpG and UpA dinucleotide frequencies. To investigate this functionally, mutants of the picornavirus echovirus 7 (E7) were constructed with altered CpG and UpA compositions in two 1.1–1.3 kilobase regions. Those with increased frequencies of CpG and UpA showed impaired replication kinetics and higher RNA/infectivity ratios compared with wild-type virus. Remarkably, mutants with CpGs and UpAs removed showed enhanced replication and larger plaques, and rapidly outcompeted wild-type virus in co-infections. Luciferase-expressing E7 sub-genomic replicons with CpGs and UpAs removed from the reporter gene showed 100-fold greater luminescence. E7 and mutants were equivalently sensitive to exogenously added interferon-β and showed no evidence of differential recognition by ADAR1 or the pattern recognition receptors RIG-I, MDA5 or PKR. However, the kinase inhibitors roscovitine and C16 partially or entirely reversed the attenuated phenotype of high-CpG and -UpA mutants, potentially through inhibition of currently uncharacterized pattern recognition receptors that respond to RNA composition. Generating viruses with enhanced replication kinetics has applications in vaccine production and reporter gene construction. More fundamentally, the findings introduce a new evolutionary paradigm in which the dinucleotide composition of viral genomes is subjected to selection pressures independently of coding capacity and profoundly influences host–pathogen interactions.
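Dinucleotide suppression of the kind described here is conventionally quantified with an observed/expected odds ratio; a minimal sketch, using a made-up toy sequence rather than E7 data:

```python
from collections import Counter

# Standard dinucleotide odds-ratio statistic: the observed frequency of a
# dinucleotide divided by the frequency expected from mononucleotide
# composition alone. Values well below 1 indicate suppression.

def dinucleotide_odds(seq, dinuc):
    seq = seq.upper().replace("T", "U")  # work in the RNA alphabet
    n = len(seq)
    mono = Counter(seq)
    di = Counter(seq[i:i + 2] for i in range(n - 1))
    expected = (mono[dinuc[0]] / n) * (mono[dinuc[1]] / n)
    observed = di[dinuc] / (n - 1)
    return observed / expected

# Toy sequence (not E7); real vertebrate-infecting RNA virus genomes
# typically show CpG ratios well below 1.
ratio = dinucleotide_odds("AUGGCAUACGGAAUUAGC", "CG")
print(ratio)
```

This is the statistic one would use to design mutants like those in the study: synonymous recoding raises or lowers the CpG/UpA odds ratio while leaving the encoded protein unchanged.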

    Electron Cryomicroscopy: From Molecules to Cells

    Today's biomolecular electron microscopy uses essentially three different imaging modalities: (i) electron crystallography, (ii) single-particle analysis and (iii) electron tomography. Ideally, these imaging modalities are applied to frozen-hydrated samples to ensure optimum preservation of the structures under scrutiny. Electron crystallography requires the existence of two-dimensional crystals. In principle, electron crystallography is a high-resolution technique, and it has indeed been demonstrated in a number of cases that near-atomic resolution can be attained. Single-particle analysis is particularly suited for structural studies of large macromolecular complexes. The amount of material needed is minute, and some degree of heterogeneity is tolerable since image classification can be used for further 'purification in silico'. In principle, single-particle analysis can attain high resolution but, in practice, this often remains an elusive goal. However, since medium-resolution structures can be obtained relatively easily, it often provides an excellent basis for hybrid approaches in which high-resolution structures of components are integrated into the medium-resolution structures of the holocomplexes. Electron tomography can be applied to non-repetitive structures. Most supramolecular structures inside cells fall into this category. In order to obtain three-dimensional structures of objects with unique topologies it is necessary to obtain different views by physical tilting. The challenge is to obtain large numbers of projection images covering as wide a tilt range as possible and, at the same time, to minimize the cumulative electron dose. Cryoelectron tomography provides medium-resolution three-dimensional images of a wide range of biological structures, from isolated supramolecular assemblies to organelles and cells. It allows the visualization of molecular machines in their functional environment (in situ) and the mapping of entire molecular landscapes.

    About the whereabouts of indefinites

    The paper characterizes three different domains in the German middle field which are relevant for the interpretation of an indefinite. It is argued that the so-called 'strong' reading of an indefinite is the basic one and that the 'weak' reading needs special licensing, which is mirrored by certain syntactic requirements. Some popular claims about the relation between the position and the interpretation of indefinites, as well as some claims about scrambling, are discussed and rejected. From the findings it also follows that the strong reading of an indefinite is independent of its information status.