
    A web system trace model and its application to web design

    Traceability analysis is crucial to the development of web-centric systems, particularly those with frequent system changes, fine-grained evolution and maintenance, and a high level of requirements uncertainty. A trace model at the level of the web system architecture is presented in this paper to address the specific challenges of developing web-centric systems. The trace model separates the concerns of different stakeholders in the web development life cycle into viewpoints, and classifies each viewpoint into structure and behaviour. Tracing relationships are presented along two dimensions: within viewpoints and among viewpoints. Examples of tracing relationships are given in UML. The trace model is demonstrated through its application to the design of a commercial web project using a web-design process. The design artifacts in each activity are transformed based on the artifact tracing relationships in the trace model. The model provides mechanisms for verifying consistency, completeness and coverage within each viewpoint, and connectedness across viewpoints.
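    As a rough illustration of the idea, the sketch below models viewpoints holding structure and behaviour artifacts connected by trace links, and checks one of the completeness properties mentioned above: whether every artifact in one viewpoint is traced into another. All class and function names are hypothetical; the paper expresses its model in UML, not code.

        # Minimal sketch of a viewpoint-based trace model (hypothetical names).
        from dataclasses import dataclass, field

        @dataclass(frozen=True)
        class Artifact:
            name: str
            kind: str  # "structure" or "behaviour"

        @dataclass
        class Viewpoint:
            name: str
            artifacts: set = field(default_factory=set)

        @dataclass(frozen=True)
        class TraceLink:
            source: Artifact
            target: Artifact

        def coverage_gaps(src, dst, links):
            """Artifacts in src with no trace link into dst (a completeness check)."""
            traced = {l.source for l in links
                      if l.source in src.artifacts and l.target in dst.artifacts}
            return src.artifacts - traced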

    Quality-aware model-driven service engineering

    Service engineering and service-oriented architecture, as integration and platform technologies, are a recent approach to software systems integration. Quality aspects ranging from interoperability to maintainability to performance are of central importance for the integration of heterogeneous, distributed service-based systems. Architecture models can substantially influence the quality attributes of the implemented software systems. Besides the benefits of explicit architectures for maintainability and reuse, architectural constraints such as styles, reference architectures and architectural patterns can influence observable software properties such as performance. Empirical performance evaluation is the process of measuring and evaluating the performance of implemented software. We present an approach for addressing the quality of services and service-based systems at the model level in the context of model-driven service engineering. The focus on architecture-level models is a consequence of the black-box character of services.
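    One way to picture "addressing quality at the model level" is to attach quality attributes to an architecture model and check them against constraints before any implementation exists. The sketch below is an invented simplification, not the paper's notation; the attribute names and thresholds are assumptions.

        # Hypothetical sketch: a service model annotated with quality attributes,
        # checked against constraints before implementation.
        from dataclasses import dataclass

        @dataclass
        class ServiceModel:
            name: str
            predicted_latency_ms: float
            availability: float  # e.g. 0.999

        def quality_violations(model, max_latency_ms, min_availability):
            issues = []
            if model.predicted_latency_ms > max_latency_ms:
                issues.append(f"{model.name}: latency exceeds {max_latency_ms} ms")
            if model.availability < min_availability:
                issues.append(f"{model.name}: availability below {min_availability}")
            return issues

        print(quality_violations(ServiceModel("orders", 120.0, 0.995), 100.0, 0.999))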

    Components Interoperability through Mediating Connector Patterns

    A key objective for ubiquitous environments is to enable interoperability between a system's highly heterogeneous components. In particular, the challenge is to embed in the system architecture the support necessary to cope with behavioral diversity, so that components can coordinate and communicate. The continuously evolving environment further calls for an automated, on-the-fly approach. In this paper we present the design building blocks for dynamic, on-the-fly interoperability between heterogeneous components. Specifically, we describe an architectural pattern called the Mediating Connector, which is the key enabler for communication. In addition, we present a set of Basic Mediator Patterns, which describe the basic mismatches that can occur when components try to interact, and their corresponding solutions.
    Comment: In Proceedings WCSI 2010, arXiv:1010.233
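    The core idea of a mediating connector can be sketched in a few lines: intercept messages from one component and translate them into the vocabulary the other component expects. The component and message names below are invented for illustration; the paper's Basic Mediator Patterns cover a much broader range of mismatches than this simple renaming.

        # Toy mediating connector: component A says "login"/"query",
        # component B understands "authenticate"/"search".
        TRANSLATION = {"login": "authenticate", "query": "search"}

        class Mediator:
            def __init__(self, target):
                self.target = target

            def send(self, message, payload):
                translated = TRANSLATION.get(message)
                if translated is None:
                    raise ValueError(f"unmediated message: {message}")
                return self.target.receive(translated, payload)

        class ComponentB:
            def receive(self, message, payload):
                return f"B handled {message}"

        print(Mediator(ComponentB()).send("login", {}))  # B handled authenticate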

    Model-driven performance evaluation for service engineering

    Service engineering and service-oriented architecture, as integration and platform technologies, are a recent approach to software systems integration. Software quality aspects such as performance are of central importance for the integration of heterogeneous, distributed service-based systems. Empirical performance evaluation is the process of measuring and calculating performance metrics of the implemented software. We present an approach for the empirical, model-based performance evaluation of services and service compositions in the context of model-driven service engineering. Temporal database theory is utilised for the empirical performance evaluation of service systems developed in a model-driven way.
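    A minimal reading of "empirical performance evaluation" with a temporal flavour is a timestamped log of service invocations reduced to metrics. The sketch below assumes an invented record layout and service name, and stands in for the temporal-database machinery the abstract refers to.

        # Sketch: timestamped invocation records reduced to a simple metric.
        from datetime import datetime
        from statistics import mean

        records = [  # (service, start, end) triples from instrumentation
            ("order", datetime(2024, 1, 1, 12, 0, 0), datetime(2024, 1, 1, 12, 0, 1)),
            ("order", datetime(2024, 1, 1, 12, 0, 5), datetime(2024, 1, 1, 12, 0, 8)),
        ]

        def avg_response_time(service):
            durations = [(end - start).total_seconds()
                         for s, start, end in records if s == service]
            return mean(durations)

        print(avg_response_time("order"))  # 2.0 seconds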

    On the Intrinsic Locality Properties of Web Reference Streams

    There has been considerable work done in the study of Web reference streams: sequences of requests for Web objects. In particular, many studies have looked at the locality properties of such streams, because of the impact of locality on the design and performance of caching and prefetching systems. However, a general framework for understanding why reference streams exhibit given locality properties has not yet emerged. In this work we take a first step in this direction, based on viewing the Web as a set of reference streams that are transformed by Web components (clients, servers, and intermediaries). We propose a graph-based framework for describing this collection of streams and components. We identify three basic stream transformations that occur at nodes of the graph: aggregation, disaggregation and filtering, and we show how these transformations can be used to abstract the effects of different Web components on their associated reference streams. This view allows a structured approach to the analysis of why reference streams show given properties at different points in the Web. Applying this approach to the study of locality requires good metrics for locality. These metrics must meet three criteria: 1) they must accurately capture temporal locality; 2) they must be independent of trace artifacts such as trace length; and 3) they must not involve manual procedures or model-based assumptions. We describe two metrics meeting these criteria that each capture a different kind of temporal locality in reference streams. The popularity component of temporal locality is captured by entropy, while the correlation component is captured by the interreference coefficient of variation. We argue that these metrics are more natural and more useful than previously proposed metrics for temporal locality. We use this framework to analyze a diverse set of Web reference traces. We find that this framework can shed light on how and why locality properties vary across different locations in the Web topology. For example, we find that filtering and aggregation have opposing effects on the popularity component of temporal locality, which helps to explain why multilevel caching can be effective in the Web. Furthermore, we find that all transformations tend to diminish the correlation component of temporal locality, which has implications for the utility of different cache replacement policies at different points in the Web.
    Funding: National Science Foundation (ANI-9986397, ANI-0095988); CNPq-Brazil.
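    The two metrics are straightforward to compute. The toy example below calculates the entropy of the popularity distribution over a reference stream and the coefficient of variation of one object's interreference distances; the stream itself is invented, and real traces would of course be far longer.

        # Entropy of object popularity, and interreference coefficient of
        # variation for one object, over a toy reference stream.
        import math
        from collections import Counter
        from statistics import mean, pstdev

        stream = ["a", "b", "a", "c", "a", "b", "a"]

        def popularity_entropy(refs):
            n = len(refs)
            return -sum((c / n) * math.log2(c / n)
                        for c in Counter(refs).values())

        def interreference_cv(refs, obj):
            positions = [i for i, r in enumerate(refs) if r == obj]
            gaps = [b - a for a, b in zip(positions, positions[1:])]
            return pstdev(gaps) / mean(gaps)

        print(popularity_entropy(stream))      # lower entropy = more skewed popularity
        print(interreference_cv(stream, "a"))  # lower CV = more regular re-references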

    Evaluating advanced search interfaces using established information-seeking models

    When users have poorly defined or complex goals, search interfaces offering only keyword searching facilities provide inadequate support to help them reach their information-seeking objectives. The emergence of interfaces with more advanced capabilities, such as faceted browsing and result clustering, goes some way toward addressing such problems. The evaluation of these interfaces, however, is challenging, since they generally offer diverse and versatile search environments that introduce overwhelming numbers of independent variables into user studies; choosing the interface as the only independent variable in a study would reveal very little about why one design outperforms another. Nonetheless, if we could effectively compare these interfaces, we would have a way to determine which is best for a given scenario and begin to learn why. In this article we present a formative framework for the evaluation of advanced search interfaces through the quantification of their strengths and weaknesses in supporting user tactics and varying user conditions. The framework combines established models of users, user needs, and user behaviours to achieve this. It is applied to evaluate three search interfaces, demonstrating the potential value of this approach to interactive IR evaluation.
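    As a hypothetical illustration of the quantification idea, one could score how well each interface supports a set of established search tactics and compare the totals; the interfaces, tactic names, and scores below are placeholders, not results from the article.

        # Placeholder scoring of tactic support per interface.
        support = {
            "keyword-only": {"specify": 2, "broaden": 0, "narrow": 0},
            "faceted":      {"specify": 2, "broaden": 1, "narrow": 2},
            "clustering":   {"specify": 1, "broaden": 2, "narrow": 1},
        }
        for interface, tactics in support.items():
            print(interface, sum(tactics.values()))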

    Debugging of Web Applications with Web-TLR

    Web-TLR is a Web verification engine that is based on the well-established Rewriting Logic--Maude/LTLR tandem for Web system specification and model checking. In Web-TLR, Web applications are expressed as rewrite theories that can be formally verified using the Maude built-in LTLR model checker. Whenever a property is refuted, a counterexample trace is delivered that reveals an undesired, erroneous navigation sequence. Unfortunately, the analysis (or even the simple inspection) of such counterexamples may be unfeasible because of the size and complexity of the traces under examination. In this paper, we endow Web-TLR with a new Web debugging facility that supports the efficient manipulation of counterexample traces. This facility is based on a backward trace-slicing technique for rewriting logic theories that allows the pieces of information we are interested in to be traced back through inverse rewrite sequences. The slicing process drastically simplifies the computation trace by dropping useless data that do not influence the final result. By using this facility, the Web engineer can focus on the relevant fragments of the failing application, which greatly reduces the manual debugging effort and also decreases the number of iterative verifications.
    Comment: In Proceedings WWV 2011, arXiv:1108.208
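    The flavour of backward trace slicing can be shown with a toy trace in which each rewrite step records which input positions each output position depends on; following those dependencies backwards from a position of interest yields the slice. This representation is an invented simplification of what Web-TLR does over rewriting logic theories.

        # Toy backward slice: each step maps output position -> input positions.
        trace = [
            {0: {0}, 1: {1, 2}},  # step 1
            {0: {0, 1}},          # step 2 produces the final term
        ]

        def backward_slice(trace, targets):
            for step in reversed(trace):
                targets = set().union(*(step[t] for t in targets))
            return targets

        print(backward_slice(trace, {0}))  # {0, 1, 2}: the relevant initial positions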

    A Nine Month Report on Progress Towards a Framework for Evaluating Advanced Search Interfaces Considering Information Retrieval and Human Computer Interaction

    This is a nine-month progress report detailing my research into supporting users in their search for information, where the questions, results or even thei