11 research outputs found

    Complementing Measurements and Real Options Concepts to Support Inter-iteration Decision-Making in Agile Projects

    Agile software projects are characterized by iterative and incremental development, accommodation of changes and active customer participation. The process is driven by creating business value for the client, assuming that the client (i) is aware of that value, and (ii) is capable of estimating the business value associated with the separate features of the system to be implemented. This paper focuses on the complementary use of measurement techniques and concepts from real-option analysis to assist clients in assessing and comparing alternative sets of requirements. Our overall objective is to provide systematic support to clients in deciding what to implement in each iteration. The design of our approach is justified using empirical data published earlier by other authors.
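    As an illustration of the kind of inter-iteration comparison such an approach supports, the sketch below contrasts committing to an uncertain feature now with deferring it one iteration (the option to wait). All numbers, probabilities and the feature itself are invented for the example and are not taken from the paper.

```python
# Hypothetical real-options-style comparison: implement an uncertain feature
# now versus defer it one iteration and decide once demand is known.

def expected_value_now(payoff_high, payoff_low, p_high, cost):
    """Commit now: the cost is paid regardless of how the uncertainty resolves."""
    return p_high * payoff_high + (1 - p_high) * payoff_low - cost

def option_value_deferred(payoff_high, payoff_low, p_high, cost):
    """Defer one iteration: implement only in the states where it pays off."""
    return (p_high * max(payoff_high - cost, 0.0)
            + (1 - p_high) * max(payoff_low - cost, 0.0))

# Invented feature: high demand pays 100, low demand pays 20,
# implementation cost is 50, probability of high demand is 0.4.
now = expected_value_now(100, 20, 0.4, 50)       # 0.4*100 + 0.6*20 - 50 = 2
wait = option_value_deferred(100, 20, 0.4, 50)   # 0.4*(100-50) + 0.6*0  = 20
print(f"commit now: {now:.1f}  defer (option value): {wait:.1f}")
```

    In this toy case the value of waiting exceeds the value of committing now, which is the kind of trade-off the paper proposes to make explicit to clients.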

    Multi-sensor data fusion techniques for RPAS detect, track and avoid

    Accurate and robust tracking of objects is of growing interest in the computer vision scientific community. The ability of a multi-sensor system to detect and track objects, and to accurately predict their future trajectory, is critical in the context of mission- and safety-critical applications. Remotely Piloted Aircraft Systems (RPAS) are currently not equipped to routinely access all classes of airspace, since certified Detect-and-Avoid (DAA) systems are yet to be developed. Such capabilities can be achieved by incorporating both cooperative and non-cooperative DAA functions, as well as by providing enhanced communications, navigation and surveillance (CNS) services. DAA is highly dependent on the performance of CNS systems for Detection, Tracking and Avoidance (DTA) tasks and maneuvers. In order to perform effective detection of objects, a number of high-performance, reliable and accurate avionics sensors and systems are adopted, including non-cooperative sensors (visual and thermal cameras, laser radar (LIDAR) and acoustic sensors) and cooperative systems (Automatic Dependent Surveillance-Broadcast (ADS-B) and Traffic Collision Avoidance System (TCAS)). In this paper the candidate sensors and system information are fully exploited in a Multi-Sensor Data Fusion (MSDF) architecture. An Unscented Kalman Filter (UKF) and a more advanced Particle Filter (PF) are adopted to estimate the state vector of the objects for maneuvering and non-maneuvering DTA tasks. Furthermore, an artificial neural network is conceptualised/adopted to exploit statistical learning methods, combining the information obtained from the UKF and PF. After describing the MSDF architecture, the key mathematical models for data fusion are presented. Conceptual studies are carried out on visual and thermal image fusion architectures.
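    A minimal sketch of one ingredient of such an architecture, a bootstrap particle filter estimating a track state from noisy position measurements, is shown below. It is a single-sensor, one-dimensional illustration only; the paper's MSDF architecture, its UKF, and the neural-network combination stage are not reproduced here, and all noise values and measurements are invented.

```python
# Minimal bootstrap particle filter for a 1D constant-velocity track,
# illustrating the predict / weight / resample cycle used in PF tracking.
import numpy as np

rng = np.random.default_rng(0)
N, dt = 500, 1.0            # number of particles, time step [s]
q, r = 0.5, 2.0             # process and measurement noise std dev

# State per particle: [position, velocity], initialised around a rough guess.
particles = rng.normal([0.0, 1.0], [5.0, 1.0], size=(N, 2))
weights = np.full(N, 1.0 / N)

def pf_step(particles, weights, z):
    # Predict: propagate each particle through the motion model.
    particles[:, 0] += particles[:, 1] * dt + rng.normal(0, q, N)
    particles[:, 1] += rng.normal(0, q, N)
    # Update: weight each particle by the likelihood of the measurement z.
    weights = weights * np.exp(-0.5 * ((z - particles[:, 0]) / r) ** 2) + 1e-300
    weights /= weights.sum()
    # Systematic resampling to avoid weight degeneracy.
    cumulative = np.cumsum(weights)
    positions = (rng.random() + np.arange(N)) / N
    idx = np.minimum(np.searchsorted(cumulative, positions), N - 1)
    return particles[idx], np.full(N, 1.0 / N)

# Invented position measurements of a target moving at roughly 1 m/s.
for k, z in enumerate([1.2, 2.1, 2.9, 4.2, 5.0]):
    particles, weights = pf_step(particles, weights, z)
    est = particles.mean(axis=0)
    print(f"k={k}: position ~ {est[0]:.2f}, velocity ~ {est[1]:.2f}")
```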

    Design of a fuzzy logic software estimation process

    This thesis describes the design of a fuzzy logic software estimation process. Studies show that most projects finish over budget or later than the planned end date (Standish Group, 2009), even though software organizations have attempted to increase the success rate of software projects by making the process more manageable and, consequently, more predictable. Project estimation is an important issue because it is the basis for the allocation and management of the resources associated with a project. When the estimation process is not performed properly, this leads to higher risks in software projects, and organizations may end up with losses instead of the expected profits from their funded projects. The most important estimates need to be made in the very early phases of a project, when the information is only available at a very high level of abstraction and is often based on a number of assumptions. The approach typically used for estimating software projects in the software industry is one based on the experience of the employees in the organization. There are a number of problems with using experience for estimation purposes: for instance, the way the estimate is obtained is only implicit, i.e. there is no consistent way to derive the estimated value, and the experience is strongly tied to the experts, not to the organization. The research goal of this thesis is to design a software estimation process able to manage the lack of detailed and quantitative information inherent in the early phases of the software development life cycle. The research approach aims to leverage the advantages of the experience-based approach that can be used in the early phases of software estimation, while addressing some of the major problems generated by this estimation approach. The specific research objectives to be met by this improved software estimation process are:
    A. The proposed estimation process must use relevant techniques to handle uncertainty and ambiguity in order to reflect the way practitioners make their estimates: the proposed estimation process must use the variables that the practitioners use.
    B. The proposed estimation process must be useful in the early stages of the software development process.
    C. The proposed estimation process needs to preserve the experience or knowledge base for the organization: this implies an easy way to define and capture the experience of the experts.
    D. The proposed model must be usable by people with skills distinct from those of the people who configure the original context of the proposed model.
    In this thesis, an estimation process based on fuzzy logic is proposed; it is referred to as the ‘Estimation of Projects in a Context of Uncertainty’ (EPCU). The fuzzy logic approach was adopted for the proposed estimation process because it is a formal way to manage the uncertainty and the linguistic variables observed in the early phases of a project, when the estimates need to be obtained: using a fuzzy system makes it possible to capture the experience of the organization’s experts via inference rules and to keep this experience within the organization.
The experimentation phase typically presents a major challenge, in software engineering in particular, all the more so since software project estimates must be made “a priori”: indeed, for verification purposes, there is typically a long elapsed time between the initial estimate and the completion of the projects, at which point the ‘true’ values of effort, duration and cost can be known with certainty in order to verify whether or not the estimates were right. This thesis includes a number of experiments with data from the software industry in Mexico. These experiments are organized into several scenarios, including one with re-estimation of real projects completed in industry, but using, for estimation purposes, only the information that was available at the beginning of these projects. From the experimental results reported in this thesis it can be observed that, with the proposed fuzzy-logic-based estimation process, the estimates for these projects are better than those based on the expert-opinion approach. Finally, to handle the large amount of calculation required by the EPCU estimation model, as well as the recording and management of the information generated by the EPCU model, a research prototype tool was designed and developed to perform the necessary calculations.
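    To make the fuzzy-inference idea concrete, the sketch below shows a toy Mamdani-style estimator: two linguistic inputs are mapped to an effort value through expert-style rules and centroid defuzzification. The input variables, membership functions and rules are invented for illustration; they are not the EPCU model's actual variables or rule base.

```python
# Toy fuzzy-logic estimator: linguistic inputs "problem size" and "team
# experience" (both rated 0..5) mapped to effort via two illustrative rules.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def estimate(size, experience):
    # Degrees of membership of the crisp inputs.
    size_small, size_large = tri(size, 0, 1, 3), tri(size, 2, 4, 5)
    exp_low, exp_high = tri(experience, 0, 1, 3), tri(experience, 2, 4, 5)

    # Output fuzzy sets on an effort axis (person-days).
    effort = np.linspace(0, 100, 501)
    low, high = tri(effort, 0, 10, 40), tri(effort, 30, 70, 100)

    # Rule base (min for AND): clip each output set by its rule strength.
    r1 = min(size_small, exp_high)           # small & experienced   -> low effort
    r2 = min(size_large, exp_low)            # large & inexperienced -> high effort
    aggregated = np.maximum(np.minimum(r1, low), np.minimum(r2, high))

    # Defuzzify by centroid of the aggregated output set.
    if not aggregated.any():
        return None
    return float((effort * aggregated).sum() / aggregated.sum())

print(f"estimated effort ~ {estimate(size=4.0, experience=1.5):.1f} person-days")
```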

    Enhancing Security Requirements Engineering by Organisational Learning

    More and more software projects today are security-related in one way or another. Requirements engineers often fail to recognise indicators of security problems, which is a major source of security problems in practice. Identifying security-relevant requirements is labour-intensive and error-prone. In order to facilitate the security requirements elicitation process, we present an approach that supports organisational learning on security requirements by establishing company-wide experience resources, and a socio-technical network to benefit from them. The approach is based on modelling the flow of requirements and related experiences. Based on those models, we enable people to exchange experiences about security requirements while they write and discuss project requirements. At the same time, the approach enables participating stakeholders to learn while they write requirements. This can increase security awareness and facilitate learning at both the individual and organisational levels. As a basis for our approach, we introduce heuristic assistant tools which support the reuse of existing security-related experiences. In particular, they include Bayesian classifiers which automatically issue a warning when new requirements seem to be security-relevant. Our results indicate that this is feasible, in particular if the classifier is trained with domain-specific data and documents from previous projects. We show how the ability to identify security-relevant requirements can be improved using this approach. We illustrate our approach by providing a step-by-step example of how we improved the security requirements engineering process at the European Telecommunications Standards Institute (ETSI), and we report on experience gained in this application.
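    The sketch below illustrates the heuristic-warning idea with a tiny Bayes classifier over requirement texts. The training requirements, labels and threshold are invented; the approach described above trains on domain-specific documents from previous projects.

```python
# Minimal sketch of a Bayes classifier flagging security-relevant requirements.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Invented training requirements: 1 = security-relevant, 0 = not.
train_reqs = [
    "The system shall encrypt all stored user credentials.",
    "Only authenticated administrators may change tariff data.",
    "Failed login attempts shall be logged and reported.",
    "The user interface shall be available in English and French.",
    "Reports shall be exportable as PDF documents.",
    "The home screen shall load within two seconds.",
]
labels = [1, 1, 1, 0, 0, 0]

clf = make_pipeline(CountVectorizer(lowercase=True), MultinomialNB())
clf.fit(train_reqs, labels)

# A new requirement arriving during elicitation: warn if it looks security-relevant.
new_req = "All passwords must be stored using a salted hash."
p_security = clf.predict_proba([new_req])[0][1]
if p_security > 0.5:
    print(f"warning: requirement looks security-relevant (p = {p_security:.2f})")
```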

    Framework for a service-oriented measurement infrastructure

    Magdeburg, Univ., Fak. für Informatik, Diss., 2009. Martin Kun

    Formal and quantitative approach to non-functional requirements modeling and assessment in software engineering

    In the software marketplace, in which functionally equivalent products compete for the same customer, Non-Functional Requirements (NFRs) become more important in distinguishing between the competing products. However, in practice, NFRs receive little attention relative to Functional Requirements (FRs). This is mainly because of the nature of these requirements, which poses a challenge when choosing to address them early in software development. NFRs are subjective and relative, and they become scattered among multiple modules when they are mapped from the requirements domain to the solution space. Furthermore, NFRs can often interact, in the sense that attempts to achieve one NFR can help or hinder the achievement of other NFRs for particular software functionality. Such interaction creates an extensive network of interdependencies and trade-offs among NFRs which is not easy to trace or estimate. This thesis contributes towards achieving the goal of managing the attainable scope and the changes of NFRs. The thesis proposes and empirically evaluates a formal and quantitative approach to modeling and assessing NFRs. Central to this approach is the implementation of the proposed NFRs Ontology for capturing and structuring the knowledge about the software requirements (FRs and NFRs), their refinements, and their interdependencies. In this thesis, we also propose a change management mechanism for tracing the impact of NFRs on the other constructs in the ontology and vice versa. We provide a traceability mechanism using Datalog expressions to implement queries on the relational-model-based representation of the ontology; an alternative implementation view using XML and XQuery is provided as well. In addition, we propose a novel approach for early requirements-based effort estimation, based on the NFRs Ontology. The effort estimation approach complementarily uses a standard functional size measurement model, namely COSMIC, and a linear regression technique.
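    As a sketch of the last point, the following example fits a simple linear model relating COSMIC functional size to effort and uses it for an early estimate. The project data, the single-variable model form and the units are invented for illustration; the thesis builds its estimation model on the NFRs Ontology and its own datasets.

```python
# Simple linear-regression effort model over COSMIC functional size (CFP).
import numpy as np

# Invented history: (functional size in CFP, effort in person-hours).
history = np.array([
    [55,  420],
    [80,  610],
    [120, 950],
    [140, 1020],
    [210, 1580],
], dtype=float)
size, effort = history[:, 0], history[:, 1]

# Least-squares fit of effort = a * size + b.
a, b = np.polyfit(size, effort, deg=1)
print(f"effort ~ {a:.2f} * CFP + {b:.2f}")

# Early estimate for a new project measured at 100 CFP.
print(f"predicted effort for 100 CFP: {a * 100 + b:.0f} person-hours")
```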

    Eliciting Security Requirements and Tracing them to Design: An Integration of Common Criteria, Heuristics, and UMLsec

    Building secure systems is difficult for many reasons. This paper deals with two of the main challenges: (i) the lack of security expertise in development teams, and (ii) the inadequacy of existing methodologies to support developers who are not security experts. The security standard ISO/IEC 15408 (Common Criteria), together with secure design techniques such as UMLsec, can provide the security expertise, knowledge, and guidelines that are needed. However, security expertise and guidelines are not stated explicitly in the Common Criteria. They are rather phrased in security domain terminology and are difficult for developers to understand. This means that some general security and secure design expertise is required to take full advantage of the Common Criteria and UMLsec. In addition, there is the problem of tracing security requirements and objectives into the solution design, which is needed as proof of requirements fulfilment. This paper describes a security requirements engineering methodology called SecReq. SecReq combines three techniques: the Common Criteria, the heuristic requirements editor HeRA, and UMLsec. SecReq makes systematic use of the security engineering knowledge contained in the Common Criteria and UMLsec, as well as of security-related heuristics in the HeRA tool. The integrated SecReq method supports early detection of security-related issues (HeRA), their systematic refinement guided by the Common Criteria, and the ability to trace security requirements into UML design models. A feedback loop helps reuse experience within SecReq and turns the approach into an iterative process for the secure system life cycle, also in the presence of system evolution.
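    A small, purely hypothetical illustration of the requirement-to-design tracing the method aims to support is sketched below: each security requirement is linked to a Common Criteria objective and to UMLsec-annotated design elements. The identifiers, requirements and link structure are invented and not taken from SecReq or HeRA.

```python
# Hypothetical traceability records from security requirements to design.
from dataclasses import dataclass, field

@dataclass
class SecurityRequirement:
    req_id: str
    text: str
    cc_objective: str                                    # Common Criteria family, e.g. FIA_UAU
    design_elements: list = field(default_factory=list)  # UMLsec-annotated model elements

reqs = [
    SecurityRequirement("SR-1", "User sessions shall be mutually authenticated.",
                        cc_objective="FIA_UAU"),
    SecurityRequirement("SR-2", "Payment messages shall be protected in transit.",
                        cc_objective="FCS_COP"),
]

# Link requirements to the UMLsec stereotypes applied in the design model.
reqs[0].design_elements.append(("LoginController", "<<secure links>>"))
reqs[1].design_elements.append(("PaymentChannel", "<<data security>>"))

for r in reqs:
    targets = ", ".join(f"{e} {s}" for e, s in r.design_elements) or "UNTRACED"
    print(f"{r.req_id} [{r.cc_objective}] -> {targets}")
```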

    Representation of business processes at multiple levels of abstraction (strategic, tactical and operational) during the requirements elicitation stage of a software project, and the measurement of their functional size with ISO 19761

    This thesis aims at helping software engineers and business analysts to better model business processes when those models are meant to be used for software requirements specification and for functional size measurement purposes. The research goal of this thesis is to contribute to the representation of business processes for their use during the requirements elicitation stage of a software project. To achieve this goal, two research objectives are defined:
    1. To propose a novel modeling approach that generates business process models intended to be used in a software requirements elicitation activity. The modeling approach should not significantly increase the complexity of the modeling notations used to represent the business processes, and it must allow the active participation of the various stakeholders involved in a typical software project in order to represent, in a consistent and structured way, their needs and constraints.
    2. To develop a procedure to measure the functional size of a software application from the business process models representing it. This measurement procedure should be compatible with the COSMIC ISO 19761 standard, and it should be usable independently of the modeling notation used to represent the business process.
    To achieve the first objective, this thesis proposes a novel modeling approach (coined BPM+) that models business processes at three levels of abstraction: strategic, tactical and operational. An a priori version of BPM+ was designed based on the findings of the literature review. This a priori version was iteratively refined through a pilot case study in industry, a series of ontological analyses, and a survey of experts. As a result, a revised version of BPM+ was proposed and evaluated through a second case study in industry. The design of BPM+ has therefore been based on a triangulation of evidence obtained from various sources. To achieve the second objective, the measurement procedure was developed from an analytical comparison between the specifications of COSMIC and those of the modeling notations selected for this research (i.e. BPMN and Qualigram). This analytical comparison helped to define a set of modeling guidelines for the business application software domain, and it also made it possible to define a set of mapping rules between the modeling notations’ constructs and the COSMIC concepts. In addition, the modeling guidelines were adapted for application to the real-time software domain. The measurement procedure was evaluated by comparing its measurement results to those obtained in COSMIC reference case studies. The research results demonstrate that:
    1. BPM+ makes it possible to generate business process models that represent, in a consistent and structured way, the needs of various stakeholders.
    2. The Qualigram notation is better suited to BPM+’s design; in addition, Qualigram is preferred for non-IT stakeholders, while BPMN is preferred for IT stakeholders.
    3. The measurement procedure was successfully applied using two different notations (Qualigram and BPMN) and in two different software domains (the business application domain and the real-time domain).
    4. The accuracy of the measurement procedure conforms to all the rules of the ISO 19761 standard.
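    To make the COSMIC measurement step concrete: once the mapping rules have identified the data movements of a functional process, the ISO 19761 size is simply the count of Entry, Exit, Read and Write movements, at 1 CFP each. In the sketch below the functional process and its data movements are invented; the actual mapping rules from Qualigram and BPMN constructs are defined in the thesis.

```python
# COSMIC (ISO 19761) sizing of one functional process from its data movements.
from collections import Counter

# Data movements identified for an invented functional process "Register order":
# each tuple is (data group, movement type).
movements = [
    ("order request",   "Entry"),   # triggering entry from the customer
    ("customer record", "Read"),    # read of persistent customer data
    ("order",           "Write"),   # order stored
    ("confirmation",    "Exit"),    # confirmation sent back to the customer
]

by_type = Counter(m_type for _, m_type in movements)
size_cfp = sum(by_type.values())          # 1 CFP per data movement

print(by_type)
print(f"functional size: {size_cfp} CFP")
```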

    Impact analysis of a multiple imputation technique for handling missing value in the ISBSG repository of software projects

    Up until the early 2000s, most empirical studies on the performance of estimation models for software projects were carried out with fairly small samples (fewer than 20 projects), while only a few were based on larger samples (between 60 and 90 projects). With the set-up of the repository of software projects by the International Software Benchmarking Standards Group (ISBSG), a much larger data repository is now available for productivity analysis and for building estimation models: the 2013 release 12 of the ISBSG repository contains over 6,000 projects, thereby providing a sounder basis for statistical studies. However, the ISBSG repository contains a large number of missing values for a significant number of variables, which makes it rather challenging to use for research purposes. This research aims to build a basis for improving the investigation of the ISBSG repository of software projects, in order to develop estimation models using different combinations of parameters for which there are distinct sub-samples without missing values. The goal of this research is to tackle the problems that arise in larger software engineering datasets, including missing values and outliers, using the multiple imputation technique.
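    The sketch below shows multiple imputation in the spirit described above: several plausible completions of an ISBSG-like table are drawn, an estimation model is fitted on each, and the spread of the resulting estimates reflects the uncertainty added by the missing data. The column names, values and model are invented, and scikit-learn's IterativeImputer stands in for whichever imputation implementation the research actually uses.

```python
# Multiple imputation over a small ISBSG-like table with missing values.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Columns: functional size (FP), team size, effort (person-hours); NaN = missing.
data = np.array([
    [120.0, 5.0,    950.0],
    [ 80.0, np.nan, 610.0],
    [210.0, 9.0,    np.nan],
    [ 55.0, 3.0,    420.0],
    [140.0, np.nan, 1020.0],
])

n_imputations = 5
estimates = []
for seed in range(n_imputations):
    # Draw one plausible completion of the table, then fit a simple effort model.
    imputer = IterativeImputer(sample_posterior=True, random_state=seed)
    completed = imputer.fit_transform(data)
    size, effort = completed[:, 0], completed[:, 2]
    slope, intercept = np.polyfit(size, effort, deg=1)
    estimates.append(slope * 100 + intercept)            # predicted effort at 100 FP

print(f"effort at 100 FP: mean={np.mean(estimates):.0f}, "
      f"std across imputations={np.std(estimates):.0f}")
```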