Understanding and supporting large-scale requirements management
Large market-driven software companies face new requirements engineering and management challenges that have emerged from their recent extensive growth. At the same time, pressure from competitors' and users' expectations demands that they be more competitive, creative, and flexible in order to respond quickly to a rapidly changing market. To stay competitive in this context, new ideas for improving current software engineering practice are needed to maintain engineering efficiency while coping with the growing size and complexity of requirements engineering processes and their products. This thesis focuses on understanding and supporting large-scale requirements management for developing software products for open markets. In particular, it addresses the following requirements management activities in this context: scope management, variability management, and requirements consolidation. The goal of the research in this thesis is to provide effective methods for supporting these activities in situations where their size and complexity demand substantial time and skill. Based on empirical research using both quantitative and qualitative approaches, the thesis reports on possible improvements for managing variability and presents visualization techniques to assist scope management in large-scale software product development contexts. Both ideas are empirically evaluated in case studies in a large-scale context. Additionally, the benefits of using linguistic methods for requirements consolidation are investigated in a replicated experimental study based on a relevant industry scenario.
The implications of digitalization on business model change
Context: Digitalization brings new opportunities, but also challenges, to
software companies.
Objective: Software companies have mostly focused on the technical aspects of
handling changes, largely ignoring business model changes and their
implications for the software organization and its architecture. In this paper,
we synthesize the implications of digitalization based on an extensive
literature survey and a longitudinal case study at Ericsson AB.
Method: Using thematic analysis, we present six propositions that facilitate
cross-disciplinary analysis of business model dynamics and of the effectiveness
and efficiency of business modeling outcomes, by linking value, transaction,
and organizational learning to business model change.
Conclusions: Business model alignment is highlighted as a new business model
research area for understanding the relationships between the dynamic nature of
business models, organization design, and value creation in business model
activities.
Monitoring and Maintenance of Telecommunication Systems: Challenges and Research Perspectives
In this paper, we present challenges associated with monitoring and
maintaining a large telecom system at Ericsson that was developed with a high
degree of component reuse. The system consists of multiple services, composed
of both legacy and modern systems, that are constantly changing and need to be
adapted to evolving business needs. The paper is based on firsthand experience
from architecting, developing, and maintaining such a system, pointing out
current challenges and potential avenues for future research that might
contribute to addressing them.
Comment: Proceedings KKIO Software Engineering Conference 2018: 166-17
MultiDimEr: a multi-dimensional bug analyzEr
Background: Bugs and bug management consume a significant amount of time and
effort in software development organizations. A reduction in bugs can
significantly improve the capacity for new feature development. Aims: We
categorize and visualize the dimensions of bug reports to identify accruing
technical debt. This evidence can serve practitioners and decision makers not
only as an argumentative basis for steering improvement efforts, but also as a
starting point for root cause analysis, reducing overall bug inflow. Method: We
implemented a tool, MultiDimEr, that analyzes and visualizes bug reports. The
tool was implemented and evaluated at Ericsson. Results: We present our
preliminary findings from using MultiDimEr for bug analysis, where we
successfully identified the components generating most of the bugs and bug
trends within certain components. Conclusions: By analyzing the dimensions
provided by MultiDimEr, we show that classifying and visualizing bug reports in
different dimensions can stimulate discussions around bug hot spots as well as
validate the accuracy of manually entered bug report attributes used in
technical debt measurements such as fault slip-through.
Comment: TechDebt@ICSE 2022: 66-7
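The core idea behind such an analysis, grouping bug reports along several dimensions and counting per-category inflow to surface hot spots, can be sketched as follows. This is a minimal illustration, not the actual MultiDimEr implementation; the report fields and values are hypothetical:

```python
from collections import Counter

# Hypothetical bug reports; a real tool would pull these from a bug tracker.
bug_reports = [
    {"component": "radio", "phase_found": "system test", "severity": "major"},
    {"component": "radio", "phase_found": "customer", "severity": "critical"},
    {"component": "ui", "phase_found": "unit test", "severity": "minor"},
    {"component": "radio", "phase_found": "customer", "severity": "major"},
]

def tally(reports, dimension):
    """Count bug reports per category along one chosen dimension."""
    return Counter(r[dimension] for r in reports)

# Which component generates the most bugs? (the "hot spot" question)
by_component = tally(bug_reports, "component")
print(by_component.most_common(1))

# Fault slip-through, approximated here as bugs that escaped all internal
# test phases and were found by the customer.
slipped = sum(1 for r in bug_reports if r["phase_found"] == "customer")
print(slipped)
```

The same `tally` can be applied per dimension (severity, phase found, and so on) and the counts fed to any plotting library, which mirrors the categorize-then-visualize workflow the abstract describes.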
Hazard Analysis of Collision Avoidance System using STPA
As our society becomes more and more dependent on IT systems, failures of these systems can harm ever more people and organizations, both public and private. Diligently performing risk and hazard analysis helps to minimize the societal harm of IT system failures. In this paper we present experiences gained by applying the System Theoretic Process Analysis (STPA) method for hazard analysis on a forward collision avoidance system. Our main objectives are to investigate the effectiveness of the studied method, in terms of the number and quality of identified hazards, and its time efficiency, in terms of required effort. Based on the findings of this study, STPA has proved to be an effective and efficient hazard analysis method for assessing the safety of a safety-critical system, and it requires a moderate level of effort.
Only Time Will Tell: Modelling Information Diffusion in Code Review with Time-Varying Hypergraphs
Background: Modern code review is expected to facilitate knowledge sharing:
All relevant information, the collective expertise, and meta-information around
the code change and its context become evident, transparent, and explicit in
the corresponding code review discussion. The discussion participants can
leverage this information in the following code reviews; the information
diffuses through the communication network that emerges from code review.
Traditional time-aggregated graphs fall short in rendering information
diffusion as those models ignore the temporal order of the information
exchange: Information can only be passed on if it is available in the first
place.
Aim: This manuscript presents a novel model based on time-varying hypergraphs
for rendering information diffusion that overcomes the inherent limitations of
traditional, time-aggregated graph-based models.
Method: In an in-silico experiment, we simulate an information diffusion
within the internal code review at Microsoft and show the empirical impact of
time on a key characteristic of information diffusion: the number of reachable
participants.
Results: Time-aggregation significantly overestimates the paths of
information diffusion available in communication networks and, thus, is neither
precise nor accurate for modelling and measuring the spread of information
within communication networks that emerge from code review.
Conclusion: Our model overcomes the inherent limitations of traditional,
static or time-aggregated, graph-based communication models and sheds the first
light on information diffusion through code review. We believe that our model
can serve as a foundation for understanding, measuring, managing, and improving
knowledge sharing in code review in particular and information diffusion in
software engineering in general.
Comment: 10 pages, 6 figures
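The central claim, that ignoring the temporal order of communication events overestimates information diffusion, can be illustrated in miniature. The following sketch models each code review discussion as a timed hyperedge (a set of participants) and compares time-respecting reachability against reachability on the time-aggregated graph; the events and names are invented for illustration and do not come from the Microsoft data set:

```python
# Each event is (time, participants): one review discussion in which
# information held by any participant spreads to all co-participants.
events = [
    (1, {"alice", "bob"}),
    (2, {"bob", "carol"}),
    (3, {"carol", "dave"}),
]

def reachable_time_respecting(source, events):
    """Who can information starting at `source` reach, honoring event order?
    Information can only be passed on if it is available in the first place."""
    informed = {source}
    for _, participants in sorted(events):
        if informed & participants:
            informed |= participants
    return informed

def reachable_aggregated(source, events):
    """Reachability on the time-aggregated graph: event order is ignored,
    so edges can be traversed 'backwards in time'."""
    informed = {source}
    changed = True
    while changed:
        changed = False
        for _, participants in events:
            if informed & participants and not participants <= informed:
                informed |= participants
                changed = True
    return informed

# Information originating with dave (active only at time 3) cannot travel
# backwards to alice, but the aggregated model claims it can:
print(reachable_time_respecting("dave", events))
print(reachable_aggregated("dave", events))
```

Here the time-respecting model reaches only `{dave, carol}` from `dave`, while the aggregated model reaches all four participants, which is exactly the kind of overestimation of diffusion paths the study measures at scale.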
On Understanding the Relation of Knowledge and Confidence to Requirements Quality
Context and Motivation: Software requirements are affected by the knowledge
and confidence of software engineers. Analyzing the interrelated impact of
these factors is difficult because of the challenges of assessing knowledge and
confidence.
Question/Problem: This research aims to draw attention to the need for
considering the interrelated effects of confidence and knowledge on
requirements quality, which has not been addressed by previous publications.
Principal ideas/results: For this purpose, the following steps were taken:
1) requirements quality was defined based on the instructions provided by the
ISO 29148:2011 standard, 2) we selected symptoms of low-quality requirements
based on ISO 29148:2011, 3) we analyzed five Software Requirements
Specification (SRS) documents to find these symptoms, 4) the people who had
prepared the documents were categorized into four classes to indicate how much
knowledge and confidence they had regarding the symptoms, and 5) finally, the
relation of insufficient knowledge and confidence to symptoms of low quality
was investigated. The results revealed that a simultaneous deficiency of
confidence and knowledge has more negative effects than a deficiency of
knowledge or confidence alone.
Contribution: In brief, this study achieved the following results: 1) the
realization that a combined lack of knowledge and confidence has a larger
effect on requirements quality than either factor alone, 2) the relation
between low-quality requirements and requirements engineers' needs for
knowledge and confidence, and 3) the variety of requirements engineers' needs
for knowledge, based on their ability to make discriminative and consistent
decisions.
Comment: Preprint accepted for publication at the 27th International Working
Conference on Requirements Engineering: Foundation for Software Quality
Open innovation using open source tools: a case study at Sony Mobile
Despite growing interest in Open Innovation (OI) in Software Engineering
(SE), little is known about what triggers software organizations to adopt it
and how this affects SE practices. OI can be realized in numerous ways,
including Open Source Software (OSS) involvement. Outcomes from OI are not
restricted to product innovation but also include process innovation, e.g.
improved SE practices and methods. This study explores the involvement of a
software organization (Sony Mobile) in OSS communities from an OI perspective
and what SE practices (requirements engineering and testing) have been adapted
in relation to OI. It also highlights the innovative outcomes resulting from
OI. An exploratory embedded case study investigates how Sony Mobile uses and
contributes to Jenkins and Gerrit, the two central OSS tools in its continuous
integration tool chain. Quantitative analysis was performed on change log data
from source code repositories in order to identify the top contributors, and
was triangulated with the results from five semi-structured interviews to
explore the nature of the commits. The findings of the case study include five
major themes: i) The process of opening up towards the tool communities
correlates in time with a general adoption of OSS in the organization.
ii) Assets seen as neither a competitive advantage nor a source of revenue are
made open to OSS communities, and the organization gradually becomes more open.
iii) The requirements engineering process towards the community is informal and
based on engagement. iv) Systematic and automated testing is still in its
infancy, but the needs are identified. v) The innovation outcomes included free
features and maintenance, and were believed to increase speed and quality in
development. Adopting OI was a result of a paradigm shift of moving from
Windows to Linux
- …
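The quantitative step described above, ranking contributors from repository change logs, can be sketched roughly as follows. This is an illustrative sketch, not the study's actual tooling; it assumes the repositories are available as local git clones and uses one author email per commit:

```python
import subprocess
from collections import Counter

def rank_authors(author_lines, n=5):
    """Count commits per author email and return the top n as (email, count)."""
    return Counter(author_lines).most_common(n)

def top_contributors(repo_path, n=5):
    """Extract one author email per commit via `git log`, then rank them.
    Requires `repo_path` to be a local git repository."""
    out = subprocess.run(
        ["git", "-C", repo_path, "log", "--format=%ae"],
        capture_output=True, text=True, check=True,
    ).stdout
    return rank_authors(out.splitlines(), n)
```

With the counts in hand, contributors affiliated with the case organization can be matched by email domain, and their top-ranked commits sampled for the interview-based triangulation the abstract mentions.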