10 research outputs found

    SmartyCo: Managing Cyber-Physical Systems for Smart Environments

    Cyber-Physical Systems (CPS) are composed of heterogeneous devices that communicate with each other and interact with the physical world. Fostered by the growing use of smart devices permanently connected to the Internet, such CPS can be found in smart environments such as smart buildings, pavilions, or homes. CPS must cope with the complexity and heterogeneity of their connected devices while supporting end-users with limited technical background in configuring and managing their system. To address these issues, this paper introduces SmartyCo, our approach based on Dynamic Software Product Line (DSPL) principles for configuring and managing CPS in smart environments. We describe its underlying architecture and illustrate, in the context of smart homes, how end-users can use it to define their own CPS in an automated way. We then explain how such an approach supports the reconfiguration of smart devices based on end-user rules, thus adapting the CPS to environment changes. Finally, we show that our approach is well suited to handling the addition and removal of CPS devices while the system is running, and we report on our experience in enabling home inhabitants to dynamically reconfigure their CPS.

    Test Smell: A Parasitic Energy Consumer in Software Testing

    Traditionally, energy efficiency research has focused on reducing energy consumption at the hardware level and, more recently, in the design and coding phases of the software development life cycle. However, the impact of software testing on energy consumption has not received attention from the research community. Specifically, how test code design quality and test smells (e.g., sub-optimal design and bad practices in test code) impact energy consumption has not yet been investigated. This study examined 12 Apache projects to analyze the association between test smells and their effects on energy consumption in software testing. We conducted a mixed-method empirical analysis along two dimensions: software (data mining in Apache projects) and developers' views (a survey of 62 software practitioners). Our findings show that: 1) test smells are associated with energy consumption in software testing; specifically, the smelly part of a test case consumes 10.92% more energy than the non-smelly part; 2) certain test smells are more energy-hungry than others; 3) refactored test cases tend to consume less energy than their smelly counterparts; and 4) most developers lack knowledge about the impact of test smells on energy consumption. We conclude the paper with several observations that can direct future research and development.

    Programme and The Book of Abstracts / Eighteenth Annual Conference YUCOMAT 2016, Herceg Novi, September 5-10, 2016

    The First Conference on materials science and engineering, including physics, physical chemistry, condensed matter chemistry, and technology in general, was held in September 1995 in Herceg Novi. An initiative to establish a Yugoslav Materials Research Society was born at the conference and, as in other MR societies around the world, a programme was drawn up and objectives were determined. The Yugoslav Materials Research Society (Yu-MRS), a non-government and non-profit scientific association, was founded in 1997 to promote multidisciplinary, goal-oriented research in materials science and engineering. The main task and objective of the Society has been to encourage creativity in materials research and engineering, harmonizing achievements in this field in our country with analogous activities in the world, with the aim of including our country in global international projects. Until 2003, conferences were held every second year; they then grew into annual conferences traditionally held in Herceg Novi in September of each year. In 2007, Yu-MRS formed two new MRS: MRS-Serbia (the official successor of Yu-MRS) and MRS-Montenegro (in founding). In 2008, MRS-Serbia became a member of FEMS (Federation of European Materials Societies).

    Supporting Software Development by an Integrated Documentation Model for Decisions

    Decision-making is a vital activity during software development. Decisions made during requirements engineering, software design, and implementation guide the development process. To make decisions, developers may apply different strategies. For instance, they can search for alternatives and evaluate them according to given criteria, or they may rely on their personal experience and heuristics to make single-solution claims. Knowledge thereby emerges during the process of decision making, as the content, outcome, and context of decisions are explored by developers. For instance, different solution options may be considered to address a given decision problem. Such knowledge grows especially rapidly when multiple developers are involved. Therefore, it should be documented to make decisions comprehensible in the future. However, in practice developers often do not create this documentation. First, developers need to find and use a documentation approach that supports the decision making strategies applied for the decision to be documented; documentation approaches are thus required to support multiple strategies. Second, due to the collaborative nature of decision making during one or more development activities, decision knowledge needs to be captured and structured according to one integrated model that can be applied during all these activities. This thesis uncovers two important reasons why the aforementioned requirements are currently not fulfilled sufficiently. First, it investigates which decision making strategies can be identified in the documentation of decisions within issue tickets from the Firefox project. Interestingly, most documented decision knowledge originates from naturalistic decision making, whereas most current documentation approaches structure the captured knowledge according to rational decision making strategies.
    Second, most decision documentation approaches focus on one development activity, so that, for instance, decision documentation during requirements engineering and implementation is not supported within the same documentation model. The main contribution of this thesis is a documentation model for decision knowledge that addresses these two findings. In detail, the documentation model supports the documentation of decision knowledge resulting from both naturalistic and rational decision making strategies and integrates this knowledge within flexible documentation structures. It is also suitable for capturing decision knowledge during the three development activities of requirements engineering, design, and implementation. Furthermore, tool support is presented for the model, which allows developers to integrate decision capturing and documentation into their activities using the Eclipse IDE.
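An integrated model of this kind can be pictured as a small set of linked decision knowledge elements. The following sketch is purely illustrative: the class and field names are our own, not the thesis's actual metamodel, but it shows how a single structure can hold both a rational record (explored alternatives with pro/con arguments) and a naturalistic one (only the chosen solution), tagged with the development activity it belongs to.

```python
from dataclasses import dataclass, field
from enum import Enum

class Activity(Enum):
    REQUIREMENTS = "requirements engineering"
    DESIGN = "design"
    IMPLEMENTATION = "implementation"

@dataclass
class Argument:
    text: str
    supports: bool  # True = pro, False = con

@dataclass
class Alternative:
    description: str
    arguments: list = field(default_factory=list)

@dataclass
class Decision:
    issue: str                 # the decision problem being addressed
    activity: Activity         # development activity where it arose
    # Rational strategy: explicitly explored solution options.
    alternatives: list = field(default_factory=list)
    # Naturalistic strategy may record only the chosen solution.
    chosen: str = ""

# A decision documented during design, with one evaluated alternative.
decision = Decision(
    issue="How to persist user sessions?",
    activity=Activity.DESIGN,
    chosen="Store sessions in Redis",
)
decision.alternatives.append(
    Alternative("Store sessions in Redis", [Argument("fast lookups", True)])
)
```

The same `Decision` type works whether or not `alternatives` is populated, which is what lets one model span both decision making strategies.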

    Continuous Rationale Management

    Continuous Software Engineering (CSE) is a software life cycle model open to frequent changes in requirements or technology. During CSE, software developers continuously make decisions on the requirements and design of the software or on the development process. They establish essential decision knowledge, which they need to document and share so that it supports the evolution and changes of the software. The management of decision knowledge is called rationale management. Rationale management provides an opportunity to support the change process during CSE; however, it is not well integrated into CSE. The overall goal of this dissertation is to provide workflows and tool support for continuous rationale management. The dissertation contributes an interview study with practitioners from industry, which investigates rationale management problems, current practices, and features that would make continuous rationale management beneficial for practitioners. The problems of rationale management in practice are threefold: First, documenting decision knowledge is intrusive in the development process and an additional effort. Second, the large amount of distributed decision knowledge documentation is difficult to access and use. Third, the documented knowledge can be of low quality, e.g., outdated, which impedes its use. The dissertation contributes a systematic mapping study on recommendation and classification approaches to treat these rationale management problems. The major contribution of this dissertation is a validated approach for continuous rationale management consisting of the ConRat life cycle model extension and the comprehensive ConDec tool support. To reduce intrusiveness and additional effort, ConRat integrates rationale management activities into existing workflows, such as requirements elicitation, development, and meetings. ConDec integrates into standard development tools instead of providing a separate tool.
    ConDec enables lightweight capturing and use of decision knowledge from various artifacts and reduces the developers' effort through automatic text classification, recommendation, and nudging mechanisms for rationale management. To enable access to and use of distributed decision knowledge documentation, ConRat defines a knowledge model of decision knowledge and other artifacts. ConDec instantiates the model as a knowledge graph and offers interactive knowledge views with useful tailoring, e.g., transitive linking. To operationalize high quality, ConRat introduces the rationale backlog, the definition of done for knowledge documentation, and metrics for intra-rationale completeness and for the decision coverage of requirements and code. ConDec implements these agile concepts for rationale management and a knowledge dashboard. ConDec also supports consistent changes through change impact analysis. The dissertation shows the feasibility, effectiveness, and user acceptance of ConRat and ConDec in six case study projects in an industrial setting. In addition, it comprehensively analyses the rationale documentation created in the projects. The validation indicates that ConRat and ConDec benefit CSE projects. Based on the dissertation, continuous rationale management should become a standard part of CSE, like automated testing or continuous integration.
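A coverage metric of the kind described can be sketched in a few lines. This is an illustrative simplification, not ConDec's actual definition (which operates on a knowledge graph and considers link distance): here, an artifact counts as covered if it is directly linked to at least one documented decision.

```python
def decision_coverage(artifacts, links):
    """Fraction of artifacts directly linked to at least one decision.

    artifacts: list of artifact identifiers (requirements, code files, ...)
    links: list of (source, target) pairs; decision nodes are marked with
           a 'decision:' prefix (a naming convention assumed for this sketch).
    """
    covered = {a for a, b in links if b.startswith("decision:")}
    covered |= {b for a, b in links if a.startswith("decision:")}
    return sum(1 for a in artifacts if a in covered) / len(artifacts)

requirements = ["REQ-1", "REQ-2", "REQ-3", "REQ-4"]
trace_links = [("REQ-1", "decision:use-redis"), ("REQ-3", "decision:rest-api")]
print(decision_coverage(requirements, trace_links))  # 0.5
```

A dashboard can then flag requirements with zero coverage as candidates for the rationale backlog.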

    Challenges facing the delivery of mega projects in Transnet capital projects.

    Doctoral degree. University of KwaZulu-Natal, Durban. This study was undertaken to critically identify and analyse the challenges facing mega-project delivery at Transnet Capital Projects (TCP), the project execution wing of the Transnet State-Owned Enterprise (SOE). Transnet is the custodian and operator of eight commercial ports, 20 500 kilometres of railway, and 3 800 kilometres of pipeline network in South Africa. The Department of Public Enterprises regards Transnet as an integral SOE upon which the country heavily depends for economic growth, job creation, and socio-economic transformation. In fulfilling its mandate, Transnet continually reviews its strategies to increase operational efficiencies, produce infrastructure ahead of demand, and reduce the costs of doing business. As a specialist project wing, TCP has delivered several mega-projects for the port, rail, and pipelines divisions of Transnet; however, evidence indicates that almost 55% of railway projects fail and that the cost of the New Multi-Product Pipeline (NMPP) project has almost doubled from its original estimate of R11.1 billion to R23.4 billion. Currently, TCP is executing several mega-infrastructure projects arising from Transnet's R300 billion Market Demand Strategy (MDS). Future mega-projects are anticipated to follow from Transnet's 30-year long-term framework plan (LTFP). This study uses PMI's (2013) '10 Knowledge Areas of Project Management' as a model to critically identify and analyse the challenges facing the delivery of mega-projects in TCP. A convenience sample and quantitative data collected from 191 respondents in TCP project management teams provided an extensive data set for analysis. The Statistical Package for the Social Sciences (SPSS) was applied to analyse 122 constructs and, through principal component analysis, to determine the main contributing challenges. Inferential statistical calculations were processed to test 21 hypotheses for correlation.
    The findings indicate that while TCP has world-class project life cycle processes and selected tools, there are challenges in the upfront planning of mega-projects, inefficient scope and cost control, contract management challenges, inadequate personnel experience and skill, an overwhelmingly large number of stakeholders to manage, and weak risk management. This study contributes new data on mega-project challenges and recommends new mega-project management practices, processes, and strategies to address these challenges. A new model to refocus mega-project execution for successful outcomes is presented. Transnet, project practitioners, researchers, stakeholders in the public and private sectors, and the African continent will benefit from the learnings of mega-project delivery at TCP.

    Adaptation-Aware Architecture Modeling and Analysis of Energy Efficiency for Software Systems

    This thesis presents an approach for the design time analysis of energy efficiency for static and self-adaptive software systems. The quality characteristics of a software system, such as performance and operating costs, strongly depend upon its architecture. Software architecture is a high-level view on software artifacts that reflects essential quality characteristics of a system under design. Design decisions made on an architectural level have a decisive impact on the quality of a system, and revising architectural design decisions late in development requires significant effort. Architectural analyses allow software architects to reason about the impact of design decisions on quality, based on an architectural description of the system. An essential quality goal is the reduction of cost while maintaining other quality goals. Power consumption accounts for a significant part of the Total Cost of Ownership (TCO) of data centers; in 2010, data centers contributed 1.3% of worldwide power consumption. However, reasoning on the energy efficiency of software systems is excluded from the systematic analysis of software architectures at design time: energy efficiency can only be evaluated once the system is deployed and operational. One approach to reducing power consumption or cost is the introduction of self-adaptivity into a software system. Self-adaptive software systems execute adaptations to provision costly resources dependent on user load. Executing reconfigurations can increase energy efficiency and reduce cost; if performed improperly, however, the additional resources required to execute a reconfiguration may exceed their positive effect. Existing architecture-level energy analysis approaches offer limited accuracy or consider only a limited set of system features, e.g., the communication style used. Predictive approaches from the embedded systems and Cloud Computing domains operate on an abstraction that is not suited for architectural analysis.
    The execution of adaptations can consume additional resources, and this additional consumption can reduce performance and energy efficiency. Design time quality analyses for self-adaptive software systems ignore this transient effect of adaptations. This thesis makes the following contributions to enable the systematic consideration of energy efficiency in the architectural design of self-adaptive software systems: First, it presents a modeling language that captures power consumption characteristics on an architectural abstraction level. Second, it introduces an energy efficiency analysis approach that uses instances of our power consumption modeling language in combination with existing performance analyses for architecture models. The developed analysis supports reasoning on energy efficiency for static and self-adaptive software systems. Third, to ease the specification of power consumption characteristics, we provide a method for extracting power models for server environments. The method encompasses an automated profiling of servers based on a set of restrictions defined by the user. A model training framework extracts, from the resulting profile, a set of power models specified in our modeling language and ranks the trained power models by their prediction accuracy. Lastly, this thesis introduces a systematic modeling and analysis approach for considering transient effects in design time quality analyses. The approach explicitly models interdependencies between reconfigurations, performance, and power consumption. We provide a formalization of the execution semantics of the model and discuss how our approach can be integrated with existing quality analyses of self-adaptive software systems. We validated the accuracy, applicability, and appropriateness of our approach in a variety of case studies. The first two case studies investigated the accuracy and appropriateness of our modeling and analysis approach.
    The first study evaluated the impact of design decisions on the energy efficiency of a media hosting application. The energy consumption predictions achieved an absolute error lower than 5.5% across different user loads, and our approach predicted the relative impact of the design decisions on energy efficiency with an error of less than 18.94%. The second case study used two variants of the Spring-based community case study system PetClinic and complements the accuracy and appropriateness evaluation of our modeling and analysis approach. We were able to predict the energy consumption of both variants with an absolute error of no more than 2.38%. In contrast to the first case study, we derived all models automatically, using our power model extraction framework as well as an extraction framework for performance models. The third case study applied our model-based prediction to evaluate the effect of different self-adaptation algorithms on energy efficiency; it involved scientific workloads executed in a virtualized environment. Our approach predicted the energy consumption with an error below 7.1%, even though we used coarse-grained measurement data of low accuracy to train the input models. The fourth case study evaluated the appropriateness and accuracy of the automated model extraction method using a set of Big Data and enterprise workloads. Our method produced power models with prediction errors below 5.9%. A secondary study evaluated the accuracy of the extracted power models for different Virtual Machine (VM) migration scenarios. The results of the fifth case study showed that our approach for modeling transient effects improved the prediction accuracy for a horizontally scaling application. Leveraging the improved accuracy, we were able to identify design deficiencies of the application that otherwise would have remained unnoticed.
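To give a concrete flavour of what an extracted power model can look like: a widely used simplification in server energy estimation (not the thesis's richer modeling language) is a linear interpolation between idle and peak power as a function of utilization, integrated over a utilization trace to obtain energy. The parameter values below are invented for illustration.

```python
def linear_power_model(p_idle, p_max, utilization):
    """Predicted power draw (watts) as a linear function of utilization in [0, 1]."""
    return p_idle + (p_max - p_idle) * utilization

def energy_joules(p_idle, p_max, util_trace, dt):
    """Integrate predicted power over a utilization trace sampled every dt seconds."""
    return sum(linear_power_model(p_idle, p_max, u) * dt for u in util_trace)

# Hypothetical server: 70 W idle, 250 W at full load, sampled once per second.
trace = [0.2, 0.6, 0.9, 0.4]
print(energy_joules(70.0, 250.0, trace, 1.0))  # 658.0 J
```

An automated extraction method like the one described would fit parameters such as `p_idle` and `p_max` (or richer, non-linear models) from measured profiles and rank the candidates by prediction accuracy.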

    Domain-Driven Design of Resource-Oriented Microservices (Domänengetriebener Entwurf von ressourcenorientierten Microservices)


    Adaptation-Aware Architecture Modeling and Analysis of Energy Efficiency for Software Systems

    This work presents an approach for the architecture-level analysis of energy efficiency for static and self-adaptive software systems. It introduces a modeling language that captures power consumption characteristics on an architectural level. The outlined analysis predicts the energy efficiency of systems described with this language. Lastly, this work introduces an approach for considering transient effects in design time architecture analyses.