Challenges to describe QoS requirements for web services quality prediction to support web services interoperability in electronic commerce
Quality of service (QoS) is significant and necessary for quality assurance of web service applications. Furthermore, web services quality has contributed to the successful implementation of Electronic Commerce (EC) applications. However, QoS remains a major open issue for web services research and one of the main research questions that need to be explored. We believe that QoS should not only be measured but also predicted during the development and implementation stages. However, there are challenges and constraints in determining and choosing QoS requirements for high-quality web services. This paper therefore highlights the challenges for QoS requirements prediction, as these requirements are not easy to identify; moreover, web services serve many different perspectives and purposes, and various prediction techniques exist to describe QoS requirements. Additionally, the paper introduces a metamodel as a concept of what makes a good web service.
A Constrained Object Model for Configuration Based Workflow Composition
Automatic or assisted workflow composition is a field of intense research for applications to the world wide web and to business process modeling. Workflow composition is traditionally addressed in various ways, generally via theorem-proving techniques. Recent research observed that building a composite workflow bears strong relationships with finite model search, and that some workflow languages can be defined as constrained object metamodels. This led to considering the viability of applying configuration techniques to this problem, which was proven feasible. Constraint-based configuration expects a constrained object model as input. The purpose of this document is to formally specify the constrained object model involved in ongoing experiments and research, using the Z specification language.
Comment: This is an extended version of the article published at BPM'05, Third International Conference on Business Process Management, Nancy, France.
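The idea of workflow composition as constraint-based configuration can be illustrated as a finite model search: enumerate candidate compositions and keep those satisfying the model's constraints. The following Python sketch uses a made-up activity catalogue and a simple port-compatibility constraint; it is an illustration of the general configuration idea, not the paper's Z-specified model.

```python
# Illustrative sketch: workflow composition as finite model search under
# constraints. The activity catalogue and the port-compatibility constraint
# are invented for this example.
from itertools import permutations

CATALOGUE = {
    "fetch":  {"in": set(),       "out": {"raw"}},
    "clean":  {"in": {"raw"},     "out": {"clean"}},
    "report": {"in": {"clean"},   "out": {"pdf"}},
}

def valid(sequence, goal):
    """Constraint: each activity's inputs are produced earlier; goal reached."""
    available = set()
    for name in sequence:
        act = CATALOGUE[name]
        if not act["in"] <= available:
            return False
        available |= act["out"]
    return goal <= available

def compose(goal):
    """Search increasingly long activity sequences for a valid composition."""
    for n in range(1, len(CATALOGUE) + 1):
        for seq in permutations(CATALOGUE, n):
            if valid(seq, goal):
                return list(seq)
    return None

print(compose({"pdf"}))   # ['fetch', 'clean', 'report']
```

A real configurator would use constraint propagation rather than brute-force enumeration, but the constraint model (activities, ports, compatibility) plays the same input role.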
Towards Consistency Management for a Business-Driven Development of SOA
The usage of the Service Oriented Architecture (SOA) along with Business Process Management has emerged as a valuable solution for complex (business-process-driven) system engineering. With Model Driven Engineering, where the business process models drive the supporting service component architectures, less effort goes into Business/IT alignment during the initial development activities, and IT developers can rapidly proceed with the SOA implementation. However, the difference between the design principles of the emerging domain-specific languages imposes serious challenges in the subsequent re-design phases. Moreover, enabling evolution of the business process models while keeping them synchronized with the underlying software architecture models is of high relevance to the key elements of any Business Driven Development (BDD). Given a business process update, this paper introduces an incremental model transformation approach that propagates this update to the related service component configurations. It therefore supports change propagation among heterogeneous domain-specific languages, e.g., BPMN and SCA. As a major contribution, our approach makes model transformation more tractable, reconfiguring the system architecture without disrupting its structural consistency. We propose a synchronizer that provides BPMN-to-SCA model synchronization with the help of conditional graph rewriting.
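The change-propagation idea can be sketched in miniature: treat both models as graphs, and apply a condition-guarded rewrite rule that maps a BPMN update to an SCA update while recording a trace link. All names below (the node kinds, the `comp_` prefix, the trace map) are illustrative assumptions, not the paper's actual metamodels or rules.

```python
# Minimal sketch of condition-guarded graph rewriting for incremental
# BPMN-to-SCA change propagation. Node kinds and naming are assumptions.

class Graph:
    def __init__(self):
        self.nodes = {}          # node id -> attribute dict

    def add(self, node_id, **attrs):
        self.nodes[node_id] = attrs

def propagate_add_task(bpmn, sca, trace, task_id, name):
    """Apply a BPMN update (new task) and, if the application condition
    holds, rewrite the SCA graph by adding a matching component."""
    bpmn.add(task_id, kind="task", name=name)
    comp_id = f"comp_{task_id}"
    # Application condition of the rewrite rule: the target component
    # must not already exist (keeps the rewrite incremental and idempotent).
    if comp_id not in sca.nodes:
        sca.add(comp_id, kind="component", implements=name)
        trace[task_id] = comp_id   # trace link keeps the models synchronized
    return comp_id

bpmn, sca, trace = Graph(), Graph(), {}
cid = propagate_add_task(bpmn, sca, trace, "t1", "CheckCredit")
print(sca.nodes[cid])   # {'kind': 'component', 'implements': 'CheckCredit'}
```

In a full synchronizer, one such rule per BPMN change type (add, rename, delete) would be applied incrementally, so only the affected SCA fragment is rewritten and the rest of the architecture stays untouched.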
On the Notion of Abstract Platform in MDA Development
Although platform-independence is a central property of MDA models, the study of platform-independence has been largely overlooked in MDA. As a consequence, there is a lack of guidelines to select abstraction criteria and modelling concepts for platform-independent design. In addition, there is little methodological support to distinguish between platform-independent and platform-specific concerns, which could be detrimental to the beneficial exploitation of the PIM-PSM separation of concerns adopted by MDA. This work is an attempt to clarify the notion of platform-independent modelling in MDA development. We argue that each level of platform-independence must be accompanied by the identification of an abstract platform. An abstract platform is determined by the platform characteristics that are relevant for applications at a certain level of platform-independence, and must be established by balancing various design goals. We present some methodological principles for abstract platform design, which form a basis for defining requirements for design languages intended to support platform-independent design. Since our methodological framework is based on the notion of abstract platform, we pay particular attention to the definition of abstract platforms and the language requirements to specify them. We discuss how the concept of abstract platform relates to UML.
Experimental Design for Sensitivity Analysis, Optimization and Validation of Simulation Models
This chapter gives a survey on the use of statistical designs for what-if analysis in simulation, including sensitivity analysis, optimization, and validation/verification. Sensitivity analysis is divided into two phases. The first phase is a pilot stage, which consists of screening or searching for the important factors among (say) hundreds of potentially important factors. A novel screening technique is presented, namely sequential bifurcation. The second phase uses regression analysis to approximate the input/output transformation that is implied by the simulation model; the resulting regression model is also known as a metamodel or a response surface. Regression analysis gives better results when the simulation experiment is well designed, using either classical statistical designs (such as fractional factorials) or optimal designs (such as pioneered by Fedorov, Kiefer, and Wolfowitz). To optimize the simulated system, the analysts may apply Response Surface Methodology (RSM); RSM combines regression analysis, statistical designs, and steepest-ascent hill-climbing. To validate a simulation model, again regression analysis and statistical designs may be applied. Several numerical examples and case studies illustrate how statistical techniques can reduce the ad hoc character of simulation; that is, these statistical techniques can make simulation studies give more general results, in less time. Appendix 1 summarizes confidence intervals for expected values, proportions, and quantiles, in terminating and steady-state simulations. Appendix 2 gives details on four variance reduction techniques, namely common pseudorandom numbers, antithetic numbers, control variates or regression sampling, and importance sampling. Appendix 3 describes jackknifing, which may give robust confidence intervals.
Keywords: least squares; distribution-free; non-parametric; stopping rule; run-length; Von Neumann; median; seed; likelihood ratio
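The screening phase described above can be made concrete with a small sketch of sequential bifurcation: test the aggregated effect of a whole group of factors at once, and split only groups whose effect is non-negligible, so that k factors are screened in far fewer runs than k. The stand-in "simulation" and its two important factors are invented for illustration; the sketch assumes the standard SB conditions (known effect signs, here non-negative, and a first-order model).

```python
# Hedged sketch of sequential bifurcation (SB) screening. Assumes all main
# effects are non-negative and a deterministic first-order simulation model;
# the simulate() function is a stand-in, not a real simulation.

def simulate(levels):
    # Stand-in model: only factors 3 and 17 matter (effects 5 and 2).
    betas = {3: 5.0, 17: 2.0}
    return sum(betas.get(j, 0.0) * levels[j] for j in range(len(levels)))

def output_with_high(k, upto):
    # Response with factors 0..upto-1 at their high level, the rest low.
    return simulate([1.0 if j < upto else 0.0 for j in range(k)])

def sequential_bifurcation(k, threshold=1e-9):
    important = []
    stack = [(0, k)]               # half-open factor-index ranges [lo, hi)
    while stack:
        lo, hi = stack.pop()
        # Aggregated effect of the whole group lo..hi-1:
        effect = output_with_high(k, hi) - output_with_high(k, lo)
        if effect <= threshold:
            continue               # non-negativity: whole group unimportant
        if hi - lo == 1:
            important.append(lo)   # isolated one important factor
        else:
            mid = (lo + hi) // 2   # bifurcate and test both halves
            stack += [(lo, mid), (mid, hi)]
    return sorted(important)

print(sequential_bifurcation(100))  # [3, 17]
```

Because groups with zero aggregated effect are discarded wholesale, the number of simulation runs grows with the number of important factors (times log k), not with the hundreds of candidate factors.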
Regression Models and Experimental Designs: A Tutorial for Simulation Analysts
This tutorial explains the basics of linear regression models, especially low-order polynomials, and the corresponding statistical designs, namely designs of resolution III, IV, V, and Central Composite Designs (CCDs). This tutorial assumes 'white noise', which means that the residuals of the fitted linear regression model are normally, independently, and identically distributed with zero mean. The tutorial gathers statistical results that are scattered throughout the literature on mathematical statistics, and presents these results in a form that is understandable to simulation analysts.
Keywords: metamodels; fractional factorial designs; Plackett-Burman designs; factor interactions; validation; cross-validation
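The combination of a resolution-III design with a first-order polynomial metamodel can be sketched in a few lines: a 2^(3-1) fractional factorial with generator C = AB, whose orthogonal columns make each regression coefficient a simple signed average of the responses. The response function below is an invented example, not from the tutorial.

```python
# Sketch: 2^(3-1) resolution-III fractional factorial (generator C = AB)
# plus first-order metamodel estimation. Because the design matrix is
# orthogonal, ordinary least squares reduces to signed averages.

def design_2_3_1():
    # Four runs: full factorial in A and B; C aliased with AB (resolution III),
    # so the estimate of C's effect is confounded with the AB interaction.
    return [(a, b, a * b) for a in (-1, 1) for b in (-1, 1)]

def fit_first_order(X, y):
    n = len(y)
    beta0 = sum(y) / n
    betas = [sum(x[j] * yi for x, yi in zip(X, y)) / n for j in range(3)]
    return beta0, betas

X = design_2_3_1()
# Invented noise-free "simulation" responses: y = 10 + 3A - 2B + 0.5C
y = [10 + 3 * a - 2 * b + 0.5 * c for a, b, c in X]
b0, (bA, bB, bC) = fit_first_order(X, y)
print(b0, bA, bB, bC)   # 10.0 3.0 -2.0 0.5
```

With white noise added, the same estimators remain unbiased for the first-order coefficients, which is exactly the setting the tutorial's resolution-III designs are built for.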
Validation of Simulation, With and Without Real Data
This paper gives a survey on how to validate simulation models through the application of mathematical statistics. The type of statistical test actually applied depends on the availability of data on the real system: (i) no data, (ii) only output data, and (iii) both input and output data. In case (i), the system analysts can still experiment with the simulation model to obtain simulated data; those experiments should be guided by the statistical theory on design of experiments (DOE); an inferior - but popular - approach is to change only one factor at a time. In case (ii), real and simulated output data may be compared through the well-known Student t statistic. In case (iii), trace-driven simulation becomes possible. Then, however, validation should not proceed as follows: make a scatter plot with real and simulated outputs, fit a line, and test whether that line has unit slope and passes through the origin. Instead, better tests are presented. Several case studies are summarized, to illustrate the three types of situations.
Keywords: verification; credibility; assessment; sensitivity; robustness; regression
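Case (ii) - comparing real and simulated output data via the Student t statistic - can be sketched directly. The data below are illustrative placeholders, and the sketch uses the pooled-variance form, which assumes (approximately) normal outputs with equal variances.

```python
# Sketch of case (ii): two-sample Student t statistic comparing real and
# simulated output data (pooled variance; assumes normality and equal
# variances). The data are illustrative placeholders.
import math
import statistics

def pooled_t(real, sim):
    n1, n2 = len(real), len(sim)
    m1, m2 = statistics.mean(real), statistics.mean(sim)
    v1, v2 = statistics.variance(real), statistics.variance(sim)
    sp2 = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)   # pooled variance
    return (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))   # df = n1 + n2 - 2

real = [10.2, 9.8, 10.5, 10.1, 9.9]   # observed system outputs (placeholder)
sim  = [10.0, 10.3, 9.7, 10.2, 10.0]  # simulated outputs (placeholder)
t = pooled_t(real, sim)
print(round(t, 3))   # compare |t| with the t-table value for 8 d.o.f.
```

If |t| stays below the critical value, the difference between real and simulated means is not statistically significant, which supports (but never proves) the model's validity.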