Non-functional properties in the model-driven development of service-oriented systems
Systems based on the service-oriented architecture (SOA) principles have become an important cornerstone of the development of enterprise-scale software applications. They are characterized by separating functions into distinct software units, called services, which can be published, requested and dynamically combined in the production of business applications. Service-oriented systems (SOSs) promise high flexibility, improved maintainability, and simple reuse of functionality. Achieving these properties requires an understanding not only of the individual artifacts of the system but also of their integration. In this context, non-functional aspects play an important role and should be analyzed and modeled as early as possible in the development cycle. In this paper, we discuss modeling of non-functional aspects of service-oriented systems, and the use of these models for analysis and deployment. Our contribution in this paper is threefold. First, we show how services and service compositions may be modeled in UML by using a profile for SOA (UML4SOA) and how non-functional properties of service-oriented systems can be represented using the non-functional extension of UML4SOA (UML4SOA-NFP) and the MARTE profile. This enables modeling of performance, security and reliable messaging. Second, we discuss formal analysis of models which respect this design; in particular, we consider performance estimates and reliability analysis using the stochastically timed process algebra PEPA as the underlying analytical engine. Last but not least, our models are the source for the application of deployment mechanisms which comprise model-to-model and model-to-text transformations implemented in the framework VIATRA. All techniques presented in this work are illustrated by a running example from an eUniversity case study.
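The abstract does not give the analysis details, but PEPA-style performance analysis ultimately reduces a model to a continuous-time Markov chain. As a minimal sketch of that underlying computation (the three-state service model and all rates are invented for illustration, not taken from the paper), one can solve the chain's generator matrix for its steady-state distribution:

```python
import numpy as np

# Hypothetical 3-state service model: Idle -> Busy -> Logging -> Idle,
# with exponential rates (made-up values, not from the paper).
# PEPA tools ultimately reduce a model to a CTMC generator matrix like Q.
Q = np.array([
    [-2.0,  2.0,  0.0],   # Idle:    request arrival rate 2.0
    [ 0.0, -4.0,  4.0],   # Busy:    service rate 4.0
    [ 8.0,  0.0, -8.0],   # Logging: completion rate 8.0
])

# The steady-state distribution pi solves pi Q = 0 with sum(pi) = 1;
# we stack the normalisation constraint onto the balance equations.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

utilisation = pi[1]  # long-run fraction of time the service is Busy
print(pi, utilisation)
```

From such a steady-state distribution, performance estimates like utilisation or throughput follow directly; this is the kind of quantity the paper derives from its PEPA models.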
A Systematic Mapping Study on Requirements Engineering in Software Ecosystems
Software ecosystems (SECOs) and open innovation processes have been claimed as a way forward for the software industry. A proper understanding of requirements is as important for these IT systems as for more traditional ones. This paper presents a mapping study on the issues of requirements engineering and quality aspects in SECOs and analyzes emerging ideas. Our findings indicate that among the various phases or subtasks of requirements engineering, most of the SECO-specific research has been conducted on elicitation, analysis, and modeling, whereas requirements selection, prioritization, verification, and traceability have attracted few published studies. Among the various quality attributes, most of the SECO research has been performed on security, performance, and testability, whereas reliability, safety, maintainability, transparency, and usability have attracted few published studies. The paper provides a review of the academic literature on SECO-related requirements engineering activities, modeling approaches, and quality attributes, positions the source publications in a taxonomy of issues, and identifies gaps where there has been little research.
Comment: Journal of Information Technology Research (JITR) 11(1
Evaluating health information system projects using analytical and modeling methods
In recent years the value of the use of evaluation methods in all aspects of economic sectors, including the field of health care, has been increasingly recognized. The thrust of this paper is to explore the potential use of a) Cost-Benefit Analysis (CBA) and b) Simulation and Modeling methods in estimating the value of Health Information System (HIS) projects. These methods were chosen as the most appropriate primarily because CBA can measure the delivered value of HIS interventions by weighing costs and benefits and identifying what is more socially desirable, while Simulation and Modeling tools can be used to estimate the future value of health care changes concerning HIS. The paper discusses these two families of methods, analyzes their evaluation potential, and presents arguments for and against the use of each method. Finally, methodological considerations and limitations are discussed and areas for future research are suggested.
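As a minimal sketch of the cost-benefit side of such an evaluation (all figures are invented for illustration; a real HIS appraisal would also have to monetise harder-to-quantify benefits such as safety and access), projected costs and benefits can be discounted to a net present value:

```python
# Toy cost-benefit analysis for a hypothetical HIS project.
# All cashflow figures and the discount rate are invented.

def npv(cashflows, rate):
    """Net present value of per-year net cashflows at a discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

costs    = [500_000, 100_000, 100_000, 100_000, 100_000]  # year 0..4
benefits = [0,       250_000, 300_000, 300_000, 300_000]
net = [b - c for b, c in zip(benefits, costs)]

value = npv(net, rate=0.05)
print(round(value, 2))  # a positive NPV favours funding the project
```

CBA in the paper's sense goes well beyond this arithmetic, but a discounted net value of this kind is the quantity the comparison ultimately rests on.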
Modeling Quantum Optical Components, Pulses and Fiber Channels Using OMNeT++
Quantum Key Distribution (QKD) is an innovative technology which exploits the laws of quantum mechanics to generate and distribute unconditionally secure cryptographic keys. While QKD offers the promise of unconditionally secure key distribution, real-world systems are built from non-ideal components, which makes it necessary to model and understand the impact these non-idealities have on system performance and security. OMNeT++ has been used as a basis to develop a simulation framework to support this endeavor. This framework, referred to as "qkdX", extends OMNeT++'s module and message abstractions to efficiently model optical components, optical pulses, operating protocols and processes. This paper presents the design of this framework, including how OMNeT++'s abstractions have been utilized to model quantum optical components, optical pulses, and fiber and free-space channels. Furthermore, from our toolbox of created components, we present various notional and real QKD systems which have been studied and analyzed.
Comment: Published in: A. Förster, C. Minkenberg, G. R. Herrera, M. Kirsche (Eds.), Proc. of the 2nd OMNeT++ Community Summit, IBM Research - Zurich, Switzerland, September 3-4, 201
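The abstract does not spell out its channel models, but a standard non-ideality any fiber-channel model must capture is attenuation, which reduces a pulse's mean photon number exponentially with distance (a typical figure is 0.2 dB/km at 1550 nm). A hedged sketch of that calculation, with function names of our own choosing rather than qkdX's:

```python
def surviving_mean_photons(mu_in, length_km, alpha_db_per_km=0.2):
    """Mean photon number after a fiber span with attenuation alpha (dB/km).

    Hypothetical helper for illustration; not part of the qkdX API.
    """
    transmittance = 10 ** (-alpha_db_per_km * length_km / 10)
    return mu_in * transmittance

# A weak coherent pulse with mean photon number 0.5 over 50 km of fiber:
mu_out = surviving_mean_photons(0.5, 50)
print(mu_out)  # 50 km at 0.2 dB/km is 10 dB, i.e. a factor-of-10 loss
```

Losses like this directly lower the secret key rate, which is why modeling non-ideal channels matters for assessing real QKD systems.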
Quantitative Verification: Formal Guarantees for Timeliness, Reliability and Performance
Computerised systems appear in almost all aspects of our daily lives, often in safety-critical scenarios such as embedded control systems in cars and aircraft, or medical devices such as pacemakers and sensors. We are thus increasingly reliant on these systems working correctly, despite often operating in unpredictable or unreliable environments. Designers of such devices need ways to guarantee that they will operate in a reliable and efficient manner.
Quantitative verification is a technique for analysing quantitative aspects of a system's design, such as timeliness, reliability or performance. It applies formal methods, based on a rigorous analysis of a mathematical model of the system, to automatically prove certain precisely specified properties, e.g. ``the airbag will always deploy within 20 milliseconds after a crash'' or ``the probability of both sensors failing simultaneously is less than 0.001''.
The ability to formally guarantee quantitative properties of this kind is beneficial across a wide range of application domains. For example, in safety-critical systems, it may be essential to establish credible bounds on the probability with which certain failures or combinations of failures can occur. In embedded control systems, it is often important to comply with strict constraints on timing or resources. More generally, being able to derive guarantees on precisely specified levels of performance or efficiency is a valuable tool in the design of, for example, wireless networking protocols, robotic systems or power management algorithms, to name but a few.
This report gives a short introduction to quantitative verification, focusing in particular on a widely used technique called model checking, and its generalisation to the analysis of quantitative aspects of a system such as timing, probabilistic behaviour or resource usage.
The intended audience is industrial designers and developers of systems such as those highlighted above who could benefit from the application of quantitative verification, but lack expertise in formal verification or modelling.
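As a small, hypothetical illustration of the probabilistic model checking described above (the chain and all probabilities are invented, and real tools such as PRISM automate this), a property like "the probability of both sensors eventually failing" reduces to solving a linear system over a discrete-time Markov chain:

```python
import numpy as np

# Hypothetical 4-state DTMC of a sensor pair:
# 0 = both ok, 1 = one failed, 2 = both failed (target), 3 = repaired/safe.
# P[i][j] = probability of moving from state i to state j in one step.
P = np.array([
    [0.98, 0.02, 0.00, 0.00],
    [0.00, 0.90, 0.01, 0.09],
    [0.00, 0.00, 1.00, 0.00],   # absorbing failure state
    [0.00, 0.00, 0.00, 1.00],   # absorbing safe state
])

# x[i] = Pr(eventually reach state 2 from i); for the transient states
# x = P_tt @ x + b, where b collects one-step jumps into state 2.
transient = [0, 1]
A = np.eye(2) - P[np.ix_(transient, transient)]
b = P[np.ix_(transient, [2])].flatten()
x = np.linalg.solve(A, b)

print(x[0])  # Pr("both sensors eventually fail" | start fully working)
```

A model checker verifies a specification such as "this probability is less than 0.001" by computing exactly this kind of quantity over the full state space.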
Using webcrawling of publicly available websites to assess E-commerce relationships
We investigate e-commerce success factors concerning their impact on the success of commerce transactions between business companies. In the scientific literature, many e-commerce success factors have been introduced. Most of them focus on the quality of companies' websites. They are evaluated with respect to companies' success in the business-to-consumer (B2C) environment, where consumers choose their preferred e-commerce websites based on success factors such as website content quality, website interaction, and website customization. In contrast to previous work, this research focuses on using existing e-commerce success factors to predict the successfulness of business-to-business (B2B) e-commerce. The introduced methodology is based on the identification of semantic textual patterns representing success factors on the websites of B2B companies. The predictive power of the identified success factors in B2B e-commerce is evaluated by regression modeling. As a result, it is shown that some B2C e-commerce success factors also enable the prediction of B2B e-commerce success while others do not. This contributes to the existing literature on e-commerce success factors. Further, these findings are valuable for the creation of B2B e-commerce websites.
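A minimal sketch of the regression step (entirely synthetic data; the paper's real features come from semantic text patterns mined from company websites, and its success measure is not reproduced here): fit a linear model relating binary success-factor indicators to a success score, then inspect which coefficients carry weight.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: 200 hypothetical B2B websites, three binary
# success-factor indicators (e.g. content quality, interaction,
# customization) as they might be extracted from the sites' pages.
X = rng.integers(0, 2, size=(200, 3)).astype(float)

# Invented ground truth: only the first two factors drive success.
success = 1.0 + 2.0 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(0, 0.1, 200)

# Ordinary least squares with an intercept column.
design = np.hstack([np.ones((200, 1)), X])
coef, *_ = np.linalg.lstsq(design, success, rcond=None)

print(np.round(coef, 2))  # intercept, then one weight per factor
```

In this toy setup the third factor's coefficient comes out near zero, mirroring the paper's finding that some B2C success factors transfer to B2B prediction while others do not.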
Early aspects: aspect-oriented requirements engineering and architecture design
This paper reports on the third Early Aspects: Aspect-Oriented Requirements Engineering and Architecture Design Workshop, which was held in Lancaster, UK, on March 21, 2004. The workshop included a presentation session and working sessions in which particular topics on early aspects were discussed. The primary goal of the workshop was to focus on the challenges of defining methodical software development processes for aspects from early on in the software life cycle, and to explore the potential of proposed methods and techniques to scale up to industrial applications.