Incremental Calibration of Architectural Performance Models with Parametric Dependencies
Architecture-based Performance Prediction (AbPP) allows evaluating the
performance of systems and answering what-if questions without measurements for
all alternatives. A difficulty when creating models is that Performance Model
Parameters (PMPs, such as resource demands, loop iteration numbers and branch
probabilities) depend on various influencing factors like input data, the
hardware used and the applied workload. To enable a broad range of what-if
questions, Performance Models (PMs) need to have predictive power beyond what
has been measured to calibrate the models. Thus, PMPs need to be parametrized
over the influencing factors that may vary.
Existing approaches estimate parametrized PMPs by measuring the complete
system. Thus, they are too costly to be applied frequently, e.g., after each
code change. Moreover, they do not preserve manual changes to the model when
recalibrating.
In this work, we present the Continuous Integration of Performance Models
(CIPM), which incrementally extracts and calibrates the performance model,
including parametric dependencies. CIPM responds to source code changes by
updating the PM and adaptively instrumenting the changed parts. To allow AbPP,
CIPM estimates the parametrized PMPs using measurements (generated by
performance tests or by executing the system in production) and statistical
analysis, e.g., regression analysis and decision trees.
Additionally, our approach responds to production changes (e.g., load or
deployment changes) and calibrates the usage and deployment parts of PMs
accordingly.
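To picture the statistical estimation step described above, the following is a minimal sketch of parametrizing one PMP (a resource demand) over an influencing factor (input size) with both regression analysis and a decision tree, the two techniques the abstract names. The data, service, and model choices are assumptions for illustration only, not CIPM's actual implementation.

```python
"""Hedged sketch: turning measured CPU demands into a PMP that is
parametrized over an influencing factor (input size)."""
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

# Hypothetical monitoring data: input size (e.g., records processed per call)
# and the measured CPU demand of one service call in milliseconds.
rng = np.random.default_rng(0)
input_size = rng.integers(1, 1_000, size=200).reshape(-1, 1)
cpu_demand_ms = 0.05 * input_size.ravel() + 2.0 + rng.normal(0.0, 1.0, 200)

# Candidate 1: linear regression -> PMP becomes roughly "2.0 + 0.05 * size".
linear = LinearRegression().fit(input_size, cpu_demand_ms)

# Candidate 2: a shallow decision tree can capture stepwise or non-linear
# dependencies (e.g., a demand that jumps above some input-size threshold).
tree = DecisionTreeRegressor(max_depth=3).fit(input_size, cpu_demand_ms)

# The better-fitting model would be stored in the performance model as a
# parametric expression over the influencing factor instead of a constant.
for name, model in [("linear", linear), ("tree", tree)]:
    print(name, "R^2 =", round(model.score(input_size, cpu_demand_ms), 3))
print("predicted demand for input_size=500:",
      linear.predict([[500]])[0], "ms")
```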
For the evaluation, we used two case studies. Evaluation results show that we
were able to calibrate the PM incrementally and accurately.
Comment: Manar Mazkatli is supported by the German Academic Exchange Service (DAAD).
Continuous Integration of Architectural Performance Models with Parametric Dependencies – The CIPM Approach
Explicitly considering the software architecture supports efficient assessments of quality attributes. In particular, Architecture-based Performance Prediction (AbPP) supports performance assessment for future scenarios (e.g., alternative workload, design, deployment, etc.) without expensive measurements for all such alternatives.
However, accurate AbPP requires an up-to-date architectural Performance Model (aPM) that is parameterized over factors impacting performance like input data characteristics. Especially in agile development, keeping such a parametric aPM consistent with software artifacts is challenging due to frequent evolutionary, adaptive and usage-related changes.
The shortcoming of existing approaches is the limited scope of consistency maintenance: they do not address the impact of all of the aforementioned changes. Moreover, extracting the aPM by static and/or dynamic analysis after each impacting change would cause unnecessary monitoring overhead and may overwrite previous manual adjustments.
In this article, we present our Continuous Integration of architectural Performance Model (CIPM) approach, which automatically updates the parametric aPM after each evolutionary, adaptive or usage change. To reduce the monitoring overhead, CIPM calibrates just the affected performance parameters (e.g., resource demand), using adaptive monitoring. Moreover, CIPM proposes a self-validation process that validates the accuracy, manages the monitoring and recalibrates the inaccurate parts. As a result, CIPM automatically keeps the aPM up-to-date throughout development and operation, which enables AbPP for proactively identifying upcoming performance problems and evaluating alternatives at low cost.
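To make the self-validation idea concrete, the sketch below shows one way such a check could work: compare measured and predicted response times per service and flag the inaccurate parts for recalibration while monitoring elsewhere could be reduced. The function, service names, and the 20% error threshold are hypothetical assumptions, not CIPM's actual interface.

```python
# Hedged sketch of a self-validation step: flag services whose aPM
# predictions deviate too much from fresh measurements.
from statistics import mean

ERROR_THRESHOLD = 0.20  # assumed acceptable relative prediction error


def validate(measured_ms: dict[str, list[float]],
             predicted_ms: dict[str, float]) -> list[str]:
    """Return the services whose performance parameters need recalibration."""
    inaccurate = []
    for service, samples in measured_ms.items():
        observed = mean(samples)
        error = abs(predicted_ms[service] - observed) / observed
        if error > ERROR_THRESHOLD:
            inaccurate.append(service)  # keep monitoring and recalibrate
        # otherwise: monitoring for this service could be reduced or disabled
    return inaccurate


# Hypothetical numbers: "checkout" drifted after a code change.
measured = {"checkout": [120.0, 131.0, 128.0], "search": [40.0, 42.0]}
predicted = {"checkout": 80.0, "search": 41.0}
print(validate(measured, predicted))  # -> ['checkout']
```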
CIPM is evaluated using three case studies, considering (1) the accuracy of the updated aPMs and the associated AbPP and (2) the applicability of CIPM in terms of scalability and the required monitoring overhead.