Iterative criteria-based approach to engineering the requirements of software development methodologies
Software engineering endeavours are typically based on and governed by the requirements of the target software; requirements identification is therefore an integral part of software development methodologies. Similarly, engineering a software development methodology (SDM) involves identifying the requirements of the target methodology. Methodology engineering approaches pay special attention to this issue; however, they make little use of existing methodologies as sources of insight into methodology requirements. The authors propose an iterative method for eliciting and specifying the requirements of an SDM using existing methodologies as supplementary resources. The method is performed as the analysis phase of a methodology engineering process aimed at the ultimate design and implementation of a target methodology. An initial set of requirements is first identified by analysing the characteristics of the development situation at hand and/or by delineating the general features desirable in the target methodology. These initial requirements are used as evaluation criteria and refined through iterative application to a select set of relevant methodologies. The finalised criteria highlight the qualities that the target methodology is expected to possess, and are therefore used as a basis for defining the final set of requirements. In an example, the authors demonstrate how the proposed elicitation process can be used to identify the requirements of a general object-oriented SDM. Owing to its basis in knowledge gained from existing methodologies and practices, the proposed method can help methodology engineers produce a set of requirements that is not only more complete in span, but also more concrete and rigorous.
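The elicitation loop described above (derive initial criteria from the situation at hand, apply them to a set of existing methodologies, refine until they stabilise, then read off the requirements) can be sketched roughly as follows. The methodology names, the listed qualities, and the support-based refinement rule are invented here purely for illustration and are not taken from the paper.

```python
# Toy sketch of the iterative criteria-refinement loop; every name, quality,
# and the "supported by >= 2 methodologies" rule is an illustrative assumption.
from collections import Counter

# Initial requirements from analysing the development situation at hand.
criteria = {"traceability", "seamless OO modelling"}

# Hypothetical knowledge base: qualities observed in existing methodologies.
methodologies = {
    "RUP":   {"iterative lifecycle", "traceability", "architecture-centric"},
    "Scrum": {"iterative lifecycle", "active user involvement"},
    "OPEN":  {"seamless OO modelling", "tailorability", "traceability"},
}

while True:
    # Evaluate: how many of the reviewed methodologies exhibit each quality?
    support = Counter(q for qs in methodologies.values() for q in qs)

    # Refine: keep criteria that at least one methodology satisfies and adopt
    # any quality supported by two or more methodologies.
    refined = {c for c in criteria if support[c] > 0}
    refined |= {q for q, n in support.items() if n >= 2}

    if refined == criteria:   # fixed point: the criteria have stabilised
        break
    criteria = refined

# The finalised criteria double as the requirements of the target methodology.
print(sorted(criteria))
```

In the paper the refinement is an analytical activity performed by the methodology engineer rather than a fixed-point computation; the sketch only mirrors its overall shape.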
A Survey on Compiler Autotuning using Machine Learning
Since the mid-1990s, researchers have been trying to use machine-learning-based approaches to solve a number of different compiler optimization problems. These techniques primarily enhance the quality of the obtained results and, more importantly, make it feasible to tackle two main compiler optimization problems: optimization selection (choosing which optimizations to apply) and phase-ordering (choosing the order in which to apply them). The compiler optimization space continues to grow due to the advancement of applications, the increasing number of compiler optimizations, and new target architectures. Generic optimization passes in compilers cannot fully leverage newly introduced optimizations and, therefore, cannot keep up with the pace of increasing options. This survey summarizes and classifies the recent advances in using machine learning for compiler optimization, particularly on the two major problems of (1) selecting the best optimizations and (2) the phase-ordering of optimizations. The survey highlights the approaches taken so far, the obtained results, the fine-grained classification among different approaches and, finally, the influential papers of the field.
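For the first of the two problems, a brute-force baseline helps fix ideas: iterative compilation simply enumerates flag combinations, compiles, and times the result, and the machine-learning approaches surveyed here aim to replace that enumeration with models that predict good configurations from program features. A minimal sketch of the baseline, assuming gcc is on the PATH and a self-contained benchmark source kernel.c exists (both placeholders), might look like this.

```python
# Minimal iterative-compilation baseline for optimization selection.
# Assumes gcc on PATH and a placeholder benchmark source "kernel.c" with a
# main(); the surveyed ML techniques learn to predict good flag sets instead
# of exhaustively enumerating and measuring them as done here.
import itertools
import subprocess
import time

BASE = ["-O2"]
EXTRAS = ["-funroll-loops", "-ftree-vectorize", "-ffast-math"]

def run_once(binary):
    """Wall-clock time of a single run of the compiled benchmark."""
    start = time.perf_counter()
    subprocess.run([binary], check=True)
    return time.perf_counter() - start

best = None
for r in range(len(EXTRAS) + 1):
    for combo in itertools.combinations(EXTRAS, r):
        flags = BASE + list(combo)
        subprocess.run(["gcc", *flags, "kernel.c", "-o", "kernel"], check=True)
        elapsed = run_once("./kernel")
        if best is None or elapsed < best[0]:
            best = (elapsed, flags)

print("best flags:", best[1], "runtime:", round(best[0], 4), "s")
```

Phase-ordering replaces the subset search above with a search over permutations of passes, which grows far faster and is one reason learned models are attractive there.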
Adaptive development and maintenance of user-centric software systems
A software system cannot be developed without considering the various facets of its environment. Stakeholders, including the users who play a central role, have their own needs, expectations, and perceptions of the system. Organisational and technical aspects of the environment are constantly changing. The ability to adapt a software system and its requirements to its environment throughout its full lifecycle is of paramount importance in a constantly changing environment. The continuous involvement of users is as important as the constant evaluation of the system and the observation of evolving environments. We present a methodology for adaptive software systems development and maintenance. We draw upon a diverse range of accepted methods, including participatory design, software architecture, and evolutionary design. Our focus is on user-centred software systems.
Polynomial-Chaos-based Kriging
Computer simulation has become the standard tool in many engineering fields for designing and optimizing systems, as well as for assessing their reliability. To cope with demanding analyses such as optimization and reliability assessment, surrogate models (a.k.a. meta-models) have been increasingly investigated in the last decade. Polynomial Chaos Expansions (PCE) and Kriging are two popular non-intrusive meta-modelling techniques. PCE surrogates the computational model with a series of orthonormal polynomials in the input variables, where the polynomials are chosen consistently with the probability distributions of those input variables. Kriging, on the other hand, assumes that the computer model behaves as a realization of a Gaussian random process whose parameters are estimated from the available computer runs, i.e. input vectors and response values. These two techniques have so far been developed more or less in parallel, with little interaction between the researchers in the two fields. In this paper, PC-Kriging is derived as a new non-intrusive meta-modelling approach combining PCE and Kriging. A sparse set of orthonormal polynomials (PCE) approximates the global behavior of the computational model, whereas Kriging manages the local variability of the model output. An adaptive algorithm similar to the least angle regression algorithm determines the optimal sparse set of polynomials. PC-Kriging is validated on various benchmark analytical functions which are easy to sample for reference results. From the numerical investigations it is concluded that PC-Kriging performs better than, or at least as well as, the two distinct meta-modelling techniques. A larger gain in accuracy is obtained when the experimental design has a limited size, which is an asset when dealing with demanding computational models.
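A rough, simplified rendering of the PC-Kriging idea, assuming scikit-learn and plain monomial features in place of a properly orthonormal polynomial basis, is a polynomial trend selected by a LARS-type regressor plus a Gaussian process fitted to the residuals; treat it as an illustrative approximation, not the authors' implementation.

```python
# Simplified PC-Kriging-style surrogate: a sparse polynomial trend (standing
# in for the PCE part, selected LARS-style by LassoLars) plus a Gaussian
# process on the residuals (standing in for the Kriging part).  The monomial
# features and the toy model x*sin(x) are illustrative assumptions.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LassoLars
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def computational_model(x):
    return x * np.sin(x)            # placeholder for an expensive simulator

# Small experimental design, mimicking the limited-budget setting.
X = rng.uniform(0.0, 10.0, size=(12, 1))
y = computational_model(X).ravel()

# 1) Sparse polynomial trend: LassoLars selects a small set of monomial terms.
poly = PolynomialFeatures(degree=6, include_bias=True)
Phi = poly.fit_transform(X)
trend = LassoLars(alpha=1e-3).fit(Phi, y)

# 2) Gaussian process on the residuals captures the local variability.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X, y - trend.predict(Phi))

# Prediction = polynomial trend + GP correction.
X_new = np.linspace(0.0, 10.0, 5).reshape(-1, 1)
y_hat = trend.predict(poly.transform(X_new)) + gp.predict(X_new)
print(y_hat)
```

In the paper the sparse PCE serves as the trend of a universal Kriging model and the basis is orthonormal with respect to the input distributions; the two-step residual fit above is only a crude stand-in for that joint formulation.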