
    Software Design Change Artifacts Generation through Software Architectural Change Detection and Categorisation

    Software is designed, implemented, tested, and inspected solely by experts, unlike other engineering projects, which are mostly built by non-expert workers after being designed by engineers. Researchers and practitioners have linked software bugs, security holes, problematic integration of changes, hard-to-understand codebases, unwarranted mental pressure, and similar problems in software development and maintenance to inconsistent and complex design and to the lack of easy ways to understand what is going on in a software system and what to plan next. The unavailability of the information and insights that development teams need to make good decisions makes these challenges worse. Therefore, extracting software design documents and other insightful information is essential to reduce the above-mentioned anomalies. Moreover, extracting architectural design artifacts is required to build developer profiles that can be made available to the market for many crucial scenarios. To that end, architectural change detection, categorization, and change description generation are crucial because they are the primary artifacts for tracing other software artifacts. However, it is not feasible for humans to analyze all the changes in a single release to detect changes and their impact, because doing so is time-consuming, laborious, costly, and inconsistent. In this thesis, we conduct six studies addressing these challenges to automate architectural change information extraction and document generation, which could assist development and maintenance teams. In particular, we (1) detect architectural changes using lightweight techniques that leverage textual and codebase properties, (2) categorize them from intelligent perspectives, and (3) generate design change documents by exploiting precise contexts of component relations and change purposes, which were previously unexplored. Our experiments with 4000+ architectural change samples and 200+ design change documents suggest that the proposed approaches are promising in accuracy and scalable enough to be deployed frequently. Our change detection approach can detect up to 100% of architectural change instances and is very scalable. Our change classifier achieves an F1 score of 70%, which is promising given the challenges. Finally, our system can produce descriptive design change artifacts with 75% significance. Since most of our studies are foundational, our approaches and prepared datasets can serve as baselines for advancing research in design change information extraction and documentation.
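    To make the idea of lightweight, codebase-property-based change detection concrete, the sketch below shows a minimal, hypothetical detector that compares module-level dependency edges between two releases and reports added or removed dependencies. It illustrates the general technique only, not the detection pipeline proposed in the thesis; the module names and input format are invented for the example.

```python
# Minimal sketch of lightweight architectural change detection, assuming the
# input can be reduced to module-level dependency edges per release.
# The edge sets, module names, and output format are illustrative only,
# not the authors' actual pipeline.

def extract_edges(dependencies):
    """Normalize a {module: [imported modules]} map into a set of edges."""
    return {(src, dst) for src, targets in dependencies.items() for dst in targets}

def detect_architectural_change(old_deps, new_deps):
    """Report added and removed module dependencies between two releases."""
    old_edges, new_edges = extract_edges(old_deps), extract_edges(new_deps)
    return {
        "added": sorted(new_edges - old_edges),
        "removed": sorted(old_edges - new_edges),
    }

if __name__ == "__main__":
    release_1 = {"ui": ["core"], "core": ["storage"]}
    release_2 = {"ui": ["core", "auth"], "core": ["storage"], "auth": ["storage"]}
    print(detect_architectural_change(release_1, release_2))
    # {'added': [('auth', 'storage'), ('ui', 'auth')], 'removed': []}
```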

    Provision and Collection of Safety Evidence: A Systematic Literature Review

    Safety-Critical Systems (SCS) are becoming more and more present in modern societies' daily lives, increasing people's dependence on them. Current SCS are firmly based on computational technology; possible failures in the operation of these systems can lead to accidents and endanger human life, as well as damage the environment and property. SCS are present in many areas such as avionics, automotive systems, industrial plants (chemical, oil & gas, and nuclear), medical devices, railroad control, defense, and aerospace systems. Companies that develop SCS must present evidence of their safety to obtain certification and authorization. This paper presents a Systematic Literature Review (SLR) that investigates processes, tools, and techniques for collecting and managing safety evidence in SCS. The authors conducted the SLR according to the guidelines proposed by Kitchenham and Charters. The SLR comprises seven (7) research questions that investigate essential aspects of collecting and managing safety evidence. The primary studies analyzed in this SLR were selected with a search string applied to four data sources: ACM, IEEE Xplore, SpringerLink, and ScienceDirect. Data extraction considered fifty-one (51) primary studies. The authors identified eleven (11) different approaches covering processes, tools, and techniques for collecting and managing safety evidence. Although other SLRs have been conducted on safety evidence, none of them focused on the details of safety evidence collection. We found that very few approaches focus specifically on the process of collecting safety evidence.
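    As a purely illustrative aside, the study-selection step of an SLR such as this one can be scripted once candidate records from the data sources are exported; the sketch below applies a toy search string and a simple inclusion criterion. The terms, record fields, and cut-off year are assumptions made for the example, not the authors' actual protocol.

```python
# Illustrative sketch only: scripted study selection for an SLR, assuming
# candidate records exported from the data sources are reduced to simple
# dictionaries. Search terms and criteria below are placeholders.
SEARCH_TERMS = ("safety evidence", "safety case", "certification")

def matches_search_string(record):
    """Check whether a record's title/abstract mentions any search term."""
    text = (record["title"] + " " + record.get("abstract", "")).lower()
    return any(term in text for term in SEARCH_TERMS)

def select_primary_studies(records, min_year=2000):
    """Apply the search string and a simple inclusion criterion to candidates."""
    return [r for r in records
            if matches_search_string(r) and r.get("year", 0) >= min_year]

if __name__ == "__main__":
    candidates = [
        {"title": "Collecting safety evidence in avionics", "year": 2018},
        {"title": "A survey on agile practices", "year": 2019},
    ]
    print([r["title"] for r in select_primary_studies(candidates)])
```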

    Design of the Electronics Subsystem for a High-Resolution Electro-Optical Payload Using Systems Engineering Approach

    Satellite imagers, in contrast to commercial imagers, demand exceptional performance and operate under harsh conditions. The camera is an essential part of an Earth Observation Electro-Optical (EO) payload, which is designed in response to needs such as military demands, changes in world politics, the inception of new technologies, operational requirements, and experiments. As one of the key subsystems, the Imager Electronics Subsystem of a high-resolution EO payload plays a very important role in accomplishing the mission objectives and payload goals. Hence, these electronics subsystems require a special design approach optimised for their needs and the meticulous characterisation demanded by high-resolution space applications. This dissertation argues that the system under study is a subsystem of a larger system and that systems engineering principles can also be applied to the subsystem design process. The aim of this dissertation is to design the Imager Electronics Subsystem of a high-resolution Electro-Optical payload using a systems engineering approach and to represent a logical integration and test flow following space industry guidelines. The Imager Electronics Subsystem consists of a group of elements forming the functional chain from the image sensors on the focal plane down to the electrical interface to the Data Handling Unit and the power interface of the satellite. This subsystem is responsible for collecting light in different spectral bands, converting that light into data of the different spectral bands for high-resolution imaging, and performing alignment, tagging, and multiplexing operations, while incorporating the internal and external interfaces.
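    The functional chain described above can be pictured, very schematically, as a data pipeline: per-band read-out, alignment, time-tagging, and multiplexing towards the Data Handling Unit interface. The sketch below models only that data flow; the band names, frame fields, and offsets are placeholders and do not represent the actual payload electronics design.

```python
# Schematic sketch of the functional chain, under the assumption that it can
# be modelled in software as read-out -> align -> tag -> multiplex. All names
# and values are illustrative placeholders.
from dataclasses import dataclass
from time import time

@dataclass
class BandFrame:
    band: str               # spectral band identifier, e.g. "PAN" or "NIR"
    pixels: list            # raw digitised samples from the image sensor
    timestamp: float = 0.0  # time tag added by the electronics
    line_offset: int = 0    # alignment correction between bands

def align(frame: BandFrame, offsets: dict) -> BandFrame:
    """Apply a per-band line offset so the bands can be co-registered."""
    frame.line_offset = offsets.get(frame.band, 0)
    return frame

def tag(frame: BandFrame) -> BandFrame:
    """Attach a time tag to the frame."""
    frame.timestamp = time()
    return frame

def multiplex(frames: list) -> list:
    """Interleave tagged band frames into one stream for the Data Handling Unit."""
    return sorted(frames, key=lambda f: (f.timestamp, f.band))

if __name__ == "__main__":
    raw = [BandFrame("PAN", [0, 1, 2]), BandFrame("NIR", [3, 4, 5])]
    stream = multiplex([tag(align(f, {"NIR": 2})) for f in raw])
    print([(f.band, f.line_offset) for f in stream])
```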

    Digital Twins of production systems - Automated validation and update of material flow simulation models with real data

    To achieve good economic efficiency and sustainability, production systems must be operated at high productivity over long periods of time. This poses major challenges for manufacturing companies, especially in times of increased volatility triggered, for example, by technological upheavals in mobility as well as by political and societal change, because the requirements placed on the production system are constantly changing. The frequency of necessary adaptation decisions and subsequent optimization measures is rising, so the need for ways to evaluate scenarios and possible system configurations increases. A powerful tool for this is material flow simulation, whose use is currently limited by its labor-intensive manual creation and its time-limited, project-based application. Longer-term use across the life cycle is currently hindered by the labor-intensive maintenance of the simulation model, i.e., the manual adaptation of the model whenever the real system changes. The goal of this thesis is the development and implementation of a concept, including the required methods, to automate the maintenance of the simulation model and its adaptation to reality. For this purpose, the available real data are used, which are increasingly present due to trends such as Industry 4.0 and general digitalization. The vision pursued in this work is a Digital Twin of the production system that, through this data input, represents a realistic image of the system at every point in time and can be used for the realistic evaluation of scenarios. To this end, the required overall concept was designed and the mechanisms for the automatic validation and updating of the model were developed. The focus was, among other things, on developing algorithms to detect changes in the structure and processes of the production system, as well as on investigating the influence of the available data. The developed components were successfully applied to a real use case at Robert Bosch GmbH and increased the fidelity of the Digital Twin, which was successfully used for production planning and optimization. The potential of localization data for creating Digital Twins of production systems was demonstrated in the test environment of the learning factory of the wbk Institut für Produktionstechnik.
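    One way to picture the automatic structural validation described above, purely as an assumed simplification, is to derive the station-to-station routing observed in real production events and compare it with the routing encoded in the simulation model. The sketch below does exactly that; the event format, station names, and threshold are illustrative and are not the data model used in the thesis.

```python
# Minimal sketch of structural validation of a material flow model, assuming
# real production events are reduced to (part_id, station) tuples and the
# simulation model exposes its routing as a set of station pairs.
from collections import Counter

def observed_routing(events):
    """events: list of (part_id, station) tuples in chronological order."""
    last_station, transitions = {}, Counter()
    for part, station in events:
        if part in last_station:
            transitions[(last_station[part], station)] += 1
        last_station[part] = station
    return transitions

def structural_deviations(model_routes, events, min_count=1):
    """Return observed transitions missing from the model and unused model routes."""
    seen = {t for t, n in observed_routing(events).items() if n >= min_count}
    return {"missing_in_model": seen - set(model_routes),
            "unobserved_in_reality": set(model_routes) - seen}

if __name__ == "__main__":
    model = {("saw", "mill"), ("mill", "inspection")}
    log = [("p1", "saw"), ("p1", "mill"), ("p1", "rework"), ("p1", "inspection")]
    print(structural_deviations(model, log))
    # flags the unmodelled detour via "rework" and the unused direct route
```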

    Engineering for a changing world: 60th Ilmenau Scientific Colloquium, Technische Universität Ilmenau, September 04-08, 2023 : programme

    In 2023, the Ilmenau Scientific Colloquium is once more organised by the Department of Mechanical Engineering. The title of this year's conference, "Engineering for a Changing World", refers to the limited natural resources of our planet and to massive changes in cooperation between continents, countries, institutions and people, enabled by the increased implementation of information technology as probably the most dominant driver in many fields. The Colloquium, supplemented by workshops, is characterised by, but not limited to, the following topics:
    – Precision engineering and measurement technology, nanofabrication
    – Industry 4.0 and digitalisation in mechanical engineering
    – Mechatronics, biomechatronics and mechanism technology
    – Systems engineering
    – Productive teaming: human-machine collaboration in the production environment
    The topics are oriented on the key strategic aspects of research and teaching in Mechanical Engineering at our university.

    Validation and Verification of Safety-Critical Systems in Avionics

    This research addresses the verification and validation of safety-critical systems. Safety-critical systems such as avionics systems are complex embedded systems. They are composed of several hardware and software components whose integration requires verification and testing in compliance with the Radio Technical Commission for Aeronautics standard and its supplements (RTCA DO-178C). Avionics software requires certification before its deployment into an aircraft system, and testing is mandatory for certification. Until now, the avionics industry has relied on expensive manual testing, and it is searching for better (quicker and less costly) solutions. This research investigates formal verification and automatic test case generation approaches to enhance the quality of avionics software systems, ensure their conformity to the standard, and provide artifacts that support their certification. The contributions of this thesis are model-based automatic test case generation approaches that satisfy the MC/DC criterion, and bidirectional requirement traceability between low-level requirements (LLRs) and test cases. In the first contribution, we integrate model-based verification of properties and automatic test case generation in a single framework. The system is modeled as an extended finite state machine (EFSM) that supports both the verification of properties and automatic test case generation; the EFSM captures the control and data-flow aspects of the system. For verification, we model the system and some of its properties and ensure that the properties are correctly propagated to the implementation via mandatory testing. For testing, we extend an existing test case generation approach with the MC/DC criterion to satisfy RTCA DO-178C requirements. Both local test cases for each component and global test cases for their integration are generated. The second contribution is a model checking-based approach for automatic test case generation. In the third contribution, we develop an EFSM-based approach that uses constraint solving to handle test case feasibility and addresses bidirectional requirements traceability between LLRs and test cases. Traceability elements are determined at a low level of granularity and then identified, linked to their source artifact, created, stored, and retrieved for several purposes. Requirements traceability has been studied extensively, but not at the proposed low level of granularity.
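    To illustrate the kind of model this line of work builds on, the sketch below encodes a tiny extended finite state machine with guarded transitions and derives one input sequence per reachable transition (plain transition coverage). Full MC/DC coverage as required by DO-178C needs condition-level analysis and constraint solving, which this toy example deliberately omits; the states, inputs, and guards are invented.

```python
# Illustrative sketch only: a tiny EFSM with guarded transitions over one
# variable x, plus a breadth-first search that yields input sequences
# covering each reachable transition. Not the thesis's actual approach.
from collections import deque

# Each transition: (source, target, input, guard, update).
TRANSITIONS = [
    ("idle",  "armed",  "arm",    lambda x: True,  lambda x: x),
    ("armed", "firing", "fire",   lambda x: x > 0, lambda x: x - 1),
    ("armed", "idle",   "disarm", lambda x: True,  lambda x: x),
]

def tests_for_transition_coverage(start_state="idle", x0=1, max_depth=6):
    """Return one input sequence per reachable transition (transition coverage)."""
    covered, tests = set(), {}
    queue = deque([(start_state, x0, [])])
    while queue:
        state, x, trace = queue.popleft()
        if len(trace) >= max_depth:       # bound the search depth
            continue
        for i, (src, dst, inp, guard, update) in enumerate(TRANSITIONS):
            if src == state and guard(x):
                new_trace = trace + [inp]
                if i not in covered:
                    covered.add(i)
                    tests[(src, inp, dst)] = new_trace
                queue.append((dst, update(x), new_trace))
    return tests

if __name__ == "__main__":
    for transition, inputs in tests_for_transition_coverage().items():
        print(transition, "->", inputs)
```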

    Towards Developing a Digital Twin Implementation Framework for Manufacturing Systems

    This research studies the implementation of digital twins in manufacturing systems. Digital transformation is relevant due to changing manufacturing techniques and user demands; it brings new business opportunities, changes organizations, and allows factories to compete in the digital era. Nevertheless, digital transformation presents many uncertainties that could bring problems to a manufacturing system, such as loss of data, cybersecurity threats, and unpredictable behavior. For instance, there are doubts about how to integrate the physical and virtual spaces. The digital twin (DT) is a modern technology that can enable the digital transformation of manufacturing companies. A DT works by collecting real-time data from machines, products, and processes. It monitors and controls operations in real time, helping to identify problems, and performs simulations to improve manufacturing processes and end products. DTs offer several benefits for manufacturing systems: they give feedback to the physical system, increase the system's reliability and availability, reduce operational risks, help achieve organizational goals, reduce operations and maintenance costs, predict machine failures, and so on, all without affecting the system's operation. This dissertation analyzes the implementation of digital twins in manufacturing systems. It uses systems thinking methods and tools to study the problem space and define the solution space. Some of these methods are the conceptagon, the systemigram, and the theory of inventive problem solving (TRIZ, from its Russian acronym). It also uses systems thinking tools such as CATWOE, the 9-windows tool, and the ideal final result (IFR). This analysis gives insights into digital twin implementation issues and potential solutions; one of these solutions is to build a digital twin implementation framework. Next, this study proposes the development of a small-scale digital twin implementation framework that could help users create digital twins in manufacturing systems. The method used to build this framework follows a Model-Based Systems Engineering approach and the systems engineering "Vee" model. The framework encompasses many concepts from the digital twin literature and divides them along three spaces: physical, virtual, and information. It also includes other concepts such as the digital thread, data, ontology, and enabling technologies. Finally, this dissertation verifies the correctness of the proposed framework. The verification process shows that the proposed framework can be used to develop digital twins for manufacturing systems. For that purpose, this study creates a process digital twin simulation using the proposed framework and presents a mapping and a workflow diagram to help users apply it. It then compares the digital twin simulation against the digital twin user and system requirements; the comparison finds that the proposed framework was built right.
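    The three-space split (physical, virtual, information) and the digital thread can be sketched, under the assumption that each space is reduced to a simple container of named values, as in the toy model below. The class and field names are placeholders for illustration and are not the dissertation's formal framework.

```python
# Toy model of a digital twin split into physical, virtual, and information
# spaces linked by a digital thread of logged updates. Names and fields are
# illustrative assumptions, not the dissertation's framework.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class PhysicalSpace:
    sensors: Dict[str, float] = field(default_factory=dict)      # latest readings

@dataclass
class VirtualSpace:
    model_state: Dict[str, float] = field(default_factory=dict)  # simulated values

@dataclass
class InformationSpace:
    digital_thread: List[dict] = field(default_factory=list)     # traceable records

@dataclass
class DigitalTwin:
    physical: PhysicalSpace
    virtual: VirtualSpace
    information: InformationSpace

    def synchronize(self):
        """Copy sensor readings into the model and log the update on the thread."""
        self.virtual.model_state.update(self.physical.sensors)
        self.information.digital_thread.append(dict(self.physical.sensors))

if __name__ == "__main__":
    twin = DigitalTwin(PhysicalSpace({"spindle_temp_C": 41.5}),
                       VirtualSpace(), InformationSpace())
    twin.synchronize()
    print(twin.virtual.model_state, len(twin.information.digital_thread))
```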

    Innovation Modeling Grid

    This technical document presents the committee-driven innovation modeling methodology "Innovation Modeling Grid" (IMoG) in detail. It is the successor of three publications on IMoG and focuses on presenting all details of the methodology. (Comments: ~170 pages, many figures, technical document.)