Requirements volatility in multicultural situational contexts
Requirements volatility refers to additions, deletions, and modifications of requirements during the system development life cycle. Different approaches in software development, including Agile and DevOps, have addressed requirements volatility by increasing user participation throughout the whole development process.
In this paper, we analyse requirements volatility from a situational-context perspective, with the aim of increasing understanding of the role of culture and cultural diversity in a multicultural requirements elicitation process. Research on situational context in Requirements Engineering (RE) is rather limited, despite the recognised importance of RE and requirements elicitation for improving the quality of the final system and software product.
This paper builds on an extensive literature review that demonstrates the importance of raising awareness and understanding of the role of culture and cultural diversity in requirements volatility, one of the most significant situational factors in the requirements elicitation process, with the aim of improving the whole systems development process as well as the resulting products and services.
The paper concludes with the presentation of the Requirements Cultural Volatility Framework, which aims to reveal potential conflicts that may occur in requirements elicitation across a multiplicity of cultural dimensions. The framework proposes actions to address these conflicts and points out the expected benefits on each dimension.
A holistic method for improving software product and process quality
The concept of quality in general is elusive and multi-faceted, and is perceived differently by different stakeholders. Quality is difficult to define and extremely difficult to measure. Deficient software systems regularly result in failures which often lead to significant financial losses and, more importantly, to the loss of human lives. Such systems need to be either scrapped and replaced by new ones or corrected and improved through maintenance. One of the most serious challenges is how to deal with legacy systems which, even when not failing, inevitably require upgrades, maintenance and improvement because of malfunctioning or changing requirements, or because of changing technologies, languages or platforms. In such cases, the dilemma is whether to develop solutions from scratch or to re-engineer the legacy system. This research addresses this dilemma and seeks to establish a rigorous method for the derivation of indicators which, together with management criteria, can help decide whether restructuring of legacy systems is advisable.
As the software engineering community has moved from corrective to preventive methods, concentrating on both product quality improvement and process quality improvement has become imperative. This research combines Product Quality Improvement, primarily through the re-engineering of legacy systems, with Process Improvement methods, models and practices, and uses a holistic approach to study the interplay of Product and Process Improvement. The re-engineering factor rho, a composite metric, was proposed and validated.
The design and execution of formal experiments tested hypotheses on the relationship between internal (code-based) and external (behavioural) metrics. In addition to confirming the hypotheses, the insights gained into logistical challenges resulted in the development of a framework for the design and execution of controlled experiments in Software Engineering.
The next part of the research resulted in the development of the novel, generic and hence customisable quality model GEQUAMO, which observes the principle of orthogonality and combines a top-down approach for the identification, classification and visualisation of software quality characteristics with a bottom-up method for measurement and evaluation. GEQUAMO II addressed weaknesses identified during various GEQUAMO implementations and through expert validation by academics and practitioners.
Further work on Process Improvement investigated Process Maturity and its relationship to Knowledge Sharing, and resulted in the development of the I5P Visualisation Framework for Performance Estimation through the Alignment of Process Maturity and Knowledge Sharing. I5P was used in industry and was validated by experts from academia and industry. Using the principles that guided the creation of the GEQUAMO model, the CoFeD visualisation framework was developed for the comparative quality evaluation and selection of methods, tools, models and other software artifacts. CoFeD is particularly useful because the selection of the wrong methods, tools or even personnel is detrimental to the survival and success of projects and organisations, and even of individuals.
Finally, throughout many years of research and teaching in Software Engineering, Information Systems and Methodologies, I observed the ambiguities of terminology: one term used to mean different concepts, and one concept expressed in different terms. These practices result in a lack of clarity. Thus my final contribution comes in my reflections on terminology disambiguation for the achievement of clarity, and in the development of a framework for achieving disambiguation of terms as a necessary step towards gaining maturity and justifying the use of the term “Engineering”, fifty years after the term Software Engineering was coined.
This research resulted in the creation of new knowledge in the form of novel indicators, models and frameworks which can aid quantification and decision making, primarily on the re-engineering of legacy code and on the management of process and its improvement. The thesis also contributes to the broader debate and understanding of problems relating to Software Quality, and establishes the need for a holistic approach to software quality improvement from both the product and the process perspectives.
Developing effective teams in global multidiscipline engineering and manufacturing organisations
In today’s competitive business environment, most activities in global relationships (subsidiaries, outsourcing, joint ventures) are carried out by multicultural and multidisciplinary teams, which may be collocated or distributed. The members of these teams comprise a variety of experts from diverse cultural, organisational and professional backgrounds. Within the project lifetime, they are brought together under time and budget constraints to accomplish certain distinct objectives. The aims of this paper are to report findings from an extensive literature review of multicultural and multidisciplinary teamwork and to provide a basis for discussion and analysis of the challenges such teams experience. A case study is carried out in a global multidiscipline engineering organisation to identify empirical evidence of potential challenges in projects carried out by multicultural and multidisciplinary collaborative teams.
Towards developing a software process improvement strategy through the application of ethical concepts
Aligning Software Process Improvement (SPI) with the business and strategic goals of an enterprise is a core factor for process improvement, yet achieving success in SPI has proven to be a persistent challenge for countless organisations. SPI, as a discipline, can be described as a set of use cases, each describing the logically related activities that must be undertaken. In addition, each use case is a description of the interactions between itself and the participants, i.e. the Actors. The nature of these interactions may, more often than not, demand from the participant the recognition and fulfilment of ethical duties.
In this paper we customise a theoretical framework, developed by the US Content Subcommittee of the Impact CS Steering Committee, that specifies traditional moral and ethical concepts which can be used to identify the moral issues concerning the Software Process Improvement field. These conventional and generic ethical concepts are applied to use cases such as Determining Business Needs, Conducting Process Improvement Assessment, the Tailoring and Creation of Processes, and Deployment; in doing so, a number of ethical issues are highlighted. In the application and utilisation of SPI, business process engineers, software engineering teams, process improvement managers and others must be aware of the ethical duties identified through the application of these moral and ethical concepts, as presented in this paper, in order to become more responsible professionals. We propose a set of heuristics for ethical engagement with the SPI discipline, arguing that an effective SPI strategy must be underpinned by ethical consideration.
The I5P visualisation framework for performance estimation through the alignment of process maturity and knowledge sharing
This paper argues that Knowledge Management (KM) and Knowledge Sharing (KS) are strongly linked to organisational maturity. The mechanisms that enable this upward movement, and that depict measurable effects on performance as the organisation climbs from ad hoc levels to institutionalised high levels of process maturity, are investigated. The I5P visualisation framework, which aligns a Knowledge Sharing level to the appropriate maturity level and characterises the process from incidental to innovative, is examined. This framework provides the basis, in terms of preparedness and disposition towards knowledge sharing, for estimating and measuring organisational performance. In today’s competitive global business environment, organisations are increasingly dependent on Information and Communication Technologies (ICTs) and particularly vulnerable to knowledge dilution. By linking knowledge sharing to process maturity, the framework aims to encapsulate the tacit knowledge accumulated in the organisation and preserve it for future needs. The framework will be useful to Information Technology (IT) organisations that are familiar with maturity models such as CMMI.
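The core idea of such an alignment can be illustrated with a minimal sketch. Note that this is not the published I5P mapping: only the endpoint labels “incidental” and “innovative” come from the abstract; the intermediate level names here are hypothetical placeholders for illustration only.

```python
# Illustrative sketch of aligning a CMMI-style process maturity level (1-5)
# to a knowledge-sharing (KS) level. Only "incidental" and "innovative" are
# taken from the paper; the intermediate labels are hypothetical.

MATURITY_TO_KS = {
    1: "incidental",        # ad hoc processes, KS happens by chance
    2: "repeatable",        # hypothetical intermediate label
    3: "institutionalised", # hypothetical intermediate label
    4: "quantified",        # hypothetical intermediate label
    5: "innovative",        # institutionalised high maturity
}

def ks_level(maturity: int) -> str:
    """Return the knowledge-sharing level aligned to a maturity level (1-5)."""
    if maturity not in MATURITY_TO_KS:
        raise ValueError("maturity level must be between 1 and 5")
    return MATURITY_TO_KS[maturity]

print(ks_level(1))  # incidental
print(ks_level(5))  # innovative
```

In practice the published framework attaches far richer semantics (preparedness, disposition towards sharing, performance estimates) to each level; the lookup above only conveys the one-to-one alignment idea.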