
    Change decision support: extraction and analysis of late architecture changes using change characterization and software metrics

    Software maintenance is one of the most crucial aspects of software development, and software engineering researchers must develop practical solutions to the challenges of maintaining mature software systems. Research that addresses practical means of mitigating the risks involved in changing software, reducing the complexity of mature software systems, and preventing the introduction of avoidable bugs is paramount to today’s software engineering discipline. Giving software developers the information they need to make sound decisions about changes that may negatively affect their software systems is key to mitigating those risks. This dissertation presents work that helps developers collect and process the data that drives change decision-making during the maintenance phase: developers need a way to understand the effects of a change before making it. The research addresses the problems associated with increasing architectural complexity caused by software change using a twofold approach. The first is to characterize software changes so that their architectural impact can be assessed prior to implementation. The second is to identify a set of architecture metrics that correlate with system quality and maintainability, and to use these metrics to determine how difficult a change will be to make. The two approaches are combined, and the results provide developers with an analysis framework that offers insight into the change process.
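
    The pre-change impact assessment described above can be illustrated with a small sketch: given a module dependency graph, compute which modules a proposed change could disturb. The dependency graph, module names, and the simple "ripple ratio" metric are illustrative assumptions, not the dissertation's actual change-characterization scheme or metric suite.

```python
# Hypothetical pre-change impact analysis: which modules transitively
# depend on the ones a proposed change would touch?
from collections import defaultdict

def impact_set(deps, changed):
    """Return all modules that transitively depend on the changed ones."""
    rdeps = defaultdict(set)          # invert edges: who depends on m?
    for mod, uses in deps.items():
        for target in uses:
            rdeps[target].add(mod)
    seen, stack = set(), list(changed)
    while stack:
        for dependant in rdeps[stack.pop()]:
            if dependant not in seen:
                seen.add(dependant)
                stack.append(dependant)
    return seen

# deps[m] = modules that m uses (a made-up example system).
deps = {
    "ui": {"service"},
    "service": {"model", "util"},
    "report": {"model"},
    "model": set(),
    "util": set(),
}

affected = impact_set(deps, {"model"})
ripple = len(affected) / len(deps)    # fraction of the system at risk
print(affected, ripple)               # e.g. {'ui', 'service', 'report'} 0.6
```

    Inverting the edges once keeps the reachability walk linear in the size of the graph, cheap enough to run before every proposed change.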

    1991 NASA Life Support Systems Analysis workshop

    The 1991 Life Support Systems Analysis Workshop was sponsored by NASA Headquarters' Office of Aeronautics and Space Technology (OAST) to foster communication among NASA, industrial, and academic specialists, and to integrate their inputs and disseminate information to them. The overall objective of systems analysis within the Life Support Technology Program of OAST is to identify, guide the development of, and verify designs that will increase the performance of life support systems at the component, subsystem, and system levels for future human space missions. The specific goals of this workshop were to report on the status of systems analysis capabilities, to integrate technologies from the chemical processing industry, and to integrate recommendations for future technology developments related to systems analysis for life support systems. The workshop included technical presentations, discussions, and interactive planning, with time allocated for discussion of both technology status and time-phased technology development recommendations. Key personnel from NASA, industry, and academia delivered inputs and presentations on the status and priorities of current and future systems analysis methods and requirements.

    Software Evolution for Industrial Automation Systems. Literature Overview

    A holistic method for improving software product and process quality

    The concept of quality in general is elusive and multi-faceted, and it is perceived differently by different stakeholders. Quality is difficult to define and extremely difficult to measure. Deficient software systems regularly result in failures which often lead to significant financial losses and, in the worst cases, to loss of human life. Such systems need to be either scrapped and replaced by new ones or corrected and improved through maintenance. One of the most serious challenges is how to deal with legacy systems which, even when not failing, inevitably require upgrades, maintenance and improvement because of malfunctioning or changing requirements, or because of changing technologies, languages, or platforms. In such cases, the dilemma is whether to develop solutions from scratch or to re-engineer a legacy system. This research addresses this dilemma and seeks to establish a rigorous method for the derivation of indicators which, together with management criteria, can help decide whether restructuring of legacy systems is advisable. As the software engineering community has moved from corrective to preventive methods, concentrating on both product quality improvement and process quality improvement has become imperative. This investigation combines Product Quality Improvement, primarily through the re-engineering of legacy systems, with Process Improvement methods, models and practices, and uses a holistic approach to study the interplay of Product and Process Improvement. The re-engineering factor rho, a composite metric, was proposed and validated. The design and execution of formal experiments tested hypotheses on the relationship between internal (code-based) and external (behavioural) metrics. Beyond confirming the hypotheses, the insights gained into logistical challenges resulted in a framework for the design and execution of controlled experiments in Software Engineering. The next part of the research produced the novel, generic and hence customisable Quality Model GEQUAMO, which observes the principle of orthogonality and combines a top-down analysis for the identification, classification and visualisation of software quality characteristics with a bottom-up method for measurement and evaluation. GEQUAMO II addressed weaknesses identified during various GEQUAMO implementations and through expert validation by academics and practitioners. Further work on Process Improvement investigated Process Maturity and its relationship to Knowledge Sharing, and resulted in the I5P Visualisation Framework for Performance Estimation through the Alignment of Process Maturity and Knowledge Sharing. I5P was used in industry and was validated by experts from academia and industry. Using the principles that guided the creation of the GEQUAMO model, the CoFeD visualisation framework was developed for the comparative quality evaluation and selection of methods, tools, models and other software artifacts. CoFeD is particularly valuable because selecting the wrong methods, tools or even personnel is detrimental to the survival and success of projects, organisations, and even individuals. Finally, throughout many years of research and of teaching Software Engineering, Information Systems and Methodologies, I observed the ambiguities of terminology: one term used to mean different concepts, and one concept expressed in different terms. These practices result in a lack of clarity.
    Thus my final contribution comes in my reflections on terminology disambiguation for the achievement of clarity, and in the development of a framework for disambiguating terms as a necessary step towards gaining maturity and justifying the use of the term “Engineering” 50 years after the term Software Engineering was coined. This research resulted in the creation of new knowledge in the form of novel indicators, models and frameworks which can aid quantification and decision making, primarily on the re-engineering of legacy code and on the management of process and its improvement. The thesis also contributes to the broader debate and understanding of problems relating to Software Quality, and establishes the need for a holistic approach to software quality improvement from both the product and the process perspectives.
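
    As a rough illustration of how a composite indicator such as rho might fold internal (code-based) and external (behavioural) measurements into a single re-engineering signal, here is a minimal sketch. The metric names, weights, normalisation and decision threshold are all assumptions for illustration; the thesis defines and validates its own formulation.

```python
# Hedged sketch of a rho-like composite re-engineering indicator.
# All metric names, weights and the 0.6 threshold are hypothetical.

def rho(internal, external, w_int=0.5, w_ext=0.5):
    """Combine internal (code) and external (behaviour) metric dicts.

    Values are assumed pre-scaled to [0, 1], higher meaning more
    re-engineering pressure.
    """
    avg = lambda d: sum(d.values()) / len(d)
    return w_int * avg(internal) + w_ext * avg(external)

internal = {"cyclomatic_complexity": 0.8, "duplication": 0.6}  # code-based
external = {"defect_rate": 0.7, "change_failure_rate": 0.5}    # behavioural

score = rho(internal, external)
print(f"rho = {score:.2f}")                        # rho = 0.65
# A management rule might then read: restructure if rho exceeds 0.6.
print("restructure" if score > 0.6 else "maintain as-is")
```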

    Re-engineering strategies for legacy software systems

    Re-engineering can be described as a process for updating an existing system in order to meet new requirements; restructuring and refactoring are activities that can be performed as part of that process. Supporting new requirements, such as migrating to new frameworks, environments and architectural styles, is essential for preserving quality attributes like maintainability and evolvability. Many large legacy systems slowly deteriorate in quality over time, and adding new functionality becomes increasingly difficult and costly as technical debt accumulates. To modernize a legacy system and improve the cost effectiveness of implementing new features, a re-engineering process is often needed. The alternative is to develop a completely new system, but this often means losing years of accumulated functionality and can be prohibitively expensive. Re-engineering strategies can be specialized, solving specific needs such as cloud migration, or more generic, supporting several kinds of needs. Different approaches suit different kinds of source and target systems, and the choice of a re-engineering strategy is also influenced by organisational and business factors: re-engineering a highly tailored legacy system in a small organisation is different from re-engineering a scalable system in a large one. Generic and flexible solutions are especially well suited for smaller organisations with complex systems. The re-engineering strategy Renaissance was applied in a case study at Roima Intelligence Oy in order to find out whether such a strategy is realistically usable, useful and valuable for a smaller organisation. The results show that such a strategy can be used with low overhead to prioritize different parts of the system and to determine a suitable modernization plan. Renaissance was also shown to add value, especially in the form of a deeper understanding of the system and a structured way to evaluate different modernization options. This is achieved by assessing the system from different views, taking into account business and technical aspects in particular. A lesson learned about Renaissance is that determining an optimal scope for the system assessment is challenging. The results are applicable to other organisations dealing with complex legacy systems under constrained resources. Limitations of the study are that the number of different re-engineering strategies discussed is small, and strategies more suitable than Renaissance might be discovered through a systematic mapping study. The number of experts participating in the process and in the evaluation was also low, introducing some uncertainty into the validity of the results. Further research is needed to determine how specialized and generic re-engineering strategies compare in terms of required resources and added value.
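
    The multi-view assessment the case study describes, weighing business and technical aspects per part of the system, can be sketched as a simple scoring rule in the spirit of classic portfolio analysis. The scales, thresholds and option names below are illustrative assumptions, not the Renaissance strategy's actual decision procedure.

```python
# Hypothetical portfolio-style assessment: score each part of a legacy
# system on business value and technical quality, then suggest an option.

def suggest(business_value, technical_quality):
    """Both scores on a 0-10 scale; higher is better."""
    if business_value >= 5 and technical_quality >= 5:
        return "maintain"             # valuable and in good shape
    if business_value >= 5:
        return "re-engineer"          # valuable but deteriorated
    if technical_quality >= 5:
        return "keep as-is"           # healthy but low value
    return "retire or replace"        # low value, poor condition

parts = {                             # (business value, technical quality)
    "order_entry": (9, 3),
    "reporting": (6, 7),
    "legacy_export": (2, 2),
}
for name, scores in parts.items():
    print(f"{name}: {suggest(*scores)}")
# order_entry: re-engineer / reporting: maintain / legacy_export: retire or replace
```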

    Information systems flexibility
