93 research outputs found

    Applying model-based systems engineering in search of quality by design

    Spring 2022; includes bibliographical references. Model-Based Systems Engineering (MBSE) and Model-Based Engineering (MBE) techniques have been successfully introduced into the design process of many different types of systems. The application of these techniques is reflected in the modeling of requirements, functions, behavior, and many other aspects. The modeled design provides a digital representation of a system, together with the supporting development data architecture and the functional requirements associated with that architecture. System and data-architecture fidelity can be represented at various levels within MBSE environment tools; typically, the level of fidelity is driven by crucial systems engineering constraints such as cost, schedule, performance, and quality. Systems engineering uses many methods to develop the system and data architecture so that the resulting system meets cost and schedule targets with sufficient quality while satisfying the customer's performance needs. The most complex and elusive of these constraints is quality: given a certain set of system-level requirements, the likelihood that those requirements will be correctly and accurately realized in the final system design. This research investigates the Department of Defense Architecture Framework (DoDAF) in use today to establish and then assess the relationship between the system, the data architecture, and requirements in terms of Quality by Design (QbD). The term QbD was coined in 1992 in Quality by Design: The New Steps for Planning Quality into Goods and Services [1]. This research investigates and proposes a means to contextualize high-level quality terms within the MBSE functional area, outline a conceptual but functional quality framework as it pertains to the MBSE DoDAF, provide tailored quality metrics with improved definitions, and then test this improved quality framework by assessing two corresponding case studies within the MBSE functional area to interrogate model architectures and assess the quality of system designs. Developed in the early 2000s, the DoDAF is still in use today, and its system description methodologies continue to influence subsequent system description approaches [2]. Two case studies were analyzed to demonstrate the proposed QbD evaluation of DoDAF CONOP architecture quality. The first addresses the DoDAF CONOP of the National Aeronautics and Space Administration (NASA) Joint Polar Satellite System (JPSS) ground system for the National Oceanic and Atmospheric Administration (NOAA) satellite system, with particular focus on the Stored Mission Data (SMD) mission thread. The second addresses the DoDAF CONOP of a Search and Rescue (SAR) naval rescue operation network System of Systems (SoS), with particular focus on the Command and Control signaling mission thread. The case studies demonstrate a new DoDAF Quality Conceptual Framework (DQCF) as a means to investigate the quality of a DoDAF architecture in depth, including the application of the DoDAF standard and the UML/SysML standards, requirement architecture instantiation, and modularity as a lens on architecture reusability and complexity. By providing a renewed focus on a quality-based systems engineering process when applying the DoDAF, improved trust in the system and data architecture of the completed models can be achieved. The results of the case study analyses show how a quality-focused systems engineering process can be used during development to provide a product design that better meets the customer's intent and ultimately offers the potential for the best quality product.
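
    As a rough illustration of the kind of modularity-oriented measure the DQCF discussion above refers to, the sketch below scores a toy component architecture by the share of each module's interfaces that remain internal to it. The component names, module assignments, and the cohesion ratio itself are illustrative assumptions for this listing, not the thesis's actual DQCF metrics.

```python
# Minimal sketch (hypothetical component names and metric definition):
# score a model-based architecture for modularity by comparing interfaces
# that stay inside a module against those that cross module boundaries.

from collections import defaultdict

# Directed "needs data from" interfaces between architecture elements.
interfaces = [
    ("MissionPlanner", "GroundStation"),
    ("GroundStation", "AntennaCtrl"),
    ("AntennaCtrl", "GroundStation"),
    ("MissionPlanner", "DataArchive"),
    ("DataArchive", "GroundStation"),
]

# Module assignment for each element.
modules = {
    "MissionPlanner": "C2",
    "DataArchive": "C2",
    "GroundStation": "Ground",
    "AntennaCtrl": "Ground",
}

def modularity_report(interfaces, modules):
    internal = defaultdict(int)   # interfaces that stay inside a module
    external = defaultdict(int)   # interfaces that cross module boundaries
    for src, dst in interfaces:
        if modules[src] == modules[dst]:
            internal[modules[src]] += 1
        else:
            external[modules[src]] += 1
            external[modules[dst]] += 1
    report = {}
    for mod in set(modules.values()):
        total = internal[mod] + external[mod]
        # Cohesion ratio: share of a module's interfaces that are internal.
        report[mod] = internal[mod] / total if total else 1.0
    return report

if __name__ == "__main__":
    for mod, cohesion in sorted(modularity_report(interfaces, modules).items()):
        print(f"{mod}: cohesion={cohesion:.2f}")
```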

    Interactive situation modelling in knowledge intensive domains

    The Interactive Situation Modelling (ISM) method, a semi-methodological approach, is proposed to tackle issues associated with modelling complex knowledge-intensive domains, which cannot easily be modelled using traditional approaches. This paper presents the background and implementation of ISM within a complex domain where synthesizing knowledge from various sources is critical, based on the principles of ethnography within a constructivist framework. Although the motivation for the reported work comes from the application presented in the paper, the actual scope of the paper covers a wide range of issues related to modelling complex systems. The author first reviews approaches used for modelling knowledge-intensive domains, preceded by a brief discussion of two main issues, the symmetry of ignorance and system behaviour, which are often confronted when applying modelling approaches to business domains. The ISM process is then characterized and critiqued, with lessons from an exemplar presented to illustrate its effectiveness.

    From constraint analysis to the design of a performance evaluation system for a collective irrigated scheme: Fatnassa Nord, Kébili, Tunisia

    This paper presents the method followed to set up a performance evaluation system for a collective irrigated scheme, applied to an oasis in southern Tunisia. After proposing a restructuring of the determinants of dysfunction and of the variables identified in relation to the current operation of the irrigated scheme, the data acquisition procedure is described. The class diagram distinguishes four sets of classes relating to farms, cropping systems, physical-environment constraints, and irrigations. The variables were populated and validated on the basis of measurements, field observations, surveys, and the available data (plot maps, the water-turn register, and daily water-turn monitoring sheets). Preliminary analysis of the data shows, first, a systematic dysfunction of the water turn during the 2006-2007 agricultural year, with its frequency in the summer season remaining between 30 and 70 days across the three irrigation branches. The main determinants of these dysfunctions are, first, the irrigated area, which is 1.12 times larger than the official area on which the theoretical duration of the water turn is calculated, and, second, the irrigation duration at plot level, where the theoretical duration of 10 h/ha was respected for only 12.84% of the elementary plots irrigated in the oasis. The study then shows that the performance of the palm groves, expressed as date palm yield, is highly variable: total date yield remains below 3 t/ha for 40.88% of the irrigated plots, and only 34.28% of the plots achieve a yield above 10 t/ha. The database will be used to analyze and evaluate the determinants of irrigation performance and of water-turn dysfunctions on the one hand, and the determinants of the performance of oasis cropping systems on the other, while developing performance indicators for the operation of the oasis scheme.
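
    As a sketch of the four groups of classes described above (farms, cropping systems, physical-environment constraints, irrigations), the following Python dataclasses illustrate how such records might be organized; the attribute names and example figures are assumptions for illustration, not the study's actual class diagram or data.

```python
# Minimal sketch of the four class groups; attributes are hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass
class Farm:
    farm_id: str
    official_area_ha: float     # area used to compute the theoretical water turn
    irrigated_area_ha: float    # measured area, larger than the official area

@dataclass
class CroppingSystem:
    farm_id: str
    crop: str                   # e.g. "date palm"
    yield_t_per_ha: float

@dataclass
class PhysicalConstraint:
    plot_id: str
    soil_salinity_ds_m: float
    drainage_ok: bool

@dataclass
class Irrigation:
    plot_id: str
    start: date
    duration_h_per_ha: float    # theoretical target: 10 h/ha

# Example check: share of irrigations meeting the 10 h/ha target.
irrigations = [Irrigation("p1", date(2006, 7, 1), 10.0),
               Irrigation("p2", date(2006, 7, 12), 16.5)]
met = sum(1 for i in irrigations if i.duration_h_per_ha <= 10.0) / len(irrigations)
print(f"{met:.0%} of plots within the 10 h/ha target")
```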

    Developing Methods of Obtaining Quality Failure Information from Complex Systems

    The complexity in most engineering systems is constantly growing due to ever-increasing technological advancements. This results in a corresponding need for methods that adequately account for the reliability of such systems based on failure information from the components that make them up. This dissertation presents an approach to validating qualitative function failure results from model abstraction details. The impact of the level of detail available to a system designer during the conceptual stages of design is considered for failure space exploration in a complex system. Specifically, the study develops an efficient approach to the detailed function and behavior modeling required for complex system analyses. In addition, a comprehensive review of existing function failure analysis methodologies is synthesized into identified structural groupings. Using simulations, known governing equations are evaluated for component and system models to study responses to faults, accounting for detailed failure scenarios, component behaviors, fault propagation paths, and overall system performance. The components were simulated at nominal states and at varying degrees of fault representing actual modes of operation. Information on product design and provisions on the expected working conditions of components were used in the simulations to address areas normally overlooked during installation. The results of the system model simulations were investigated using clustering analysis to develop an efficient grouping method and a measure of confidence for the obtained results. The intellectual merit of this work is the use of a simulation-based approach to studying how generated failure scenarios reveal component fault interactions, leading to a better understanding of fault propagation within design models. The information from using varying-fidelity models for system analysis helps identify models that are sufficient at the conceptual design stage to highlight potential faults. This will reduce resources such as cost, manpower, and time spent during system design. A broader impact of the project is to help design engineers identify critical components, quantify the risks associated with using particular components in their prototypes early in the design process, and improve fault-tolerant system designs. This research ultimately looks to establish a baseline for validating and comparing theories of complex systems analysis.
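
    A minimal sketch of the simulate-then-cluster idea described above: simple first-order component responses are generated at nominal and degraded parameter values and grouped with k-means. The component model, fault magnitudes, and purity measure are illustrative assumptions, not the dissertation's actual models or confidence measure.

```python
# Minimal sketch: simulate component responses at nominal and faulty
# parameter values, then cluster the responses to group failure behaviors.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
t = np.linspace(0, 5, 50)

def step_response(tau):
    """First-order component step response with time constant tau."""
    return 1.0 - np.exp(-t / tau)

# Nominal components plus components with a degraded (larger) time constant.
responses, labels = [], []
for _ in range(20):
    responses.append(step_response(tau=1.0 + 0.05 * rng.standard_normal()))
    labels.append("nominal")
for _ in range(20):
    responses.append(step_response(tau=3.0 + 0.2 * rng.standard_normal()))
    labels.append("degraded")

X = np.array(responses)                      # each row: one simulated response
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# A simple confidence measure: how purely each cluster separates the two modes.
for c in (0, 1):
    members = [labels[i] for i in range(len(labels)) if clusters[i] == c]
    purity = members.count(max(set(members), key=members.count)) / len(members)
    print(f"cluster {c}: {len(members)} responses, purity {purity:.2f}")
```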

    Process modelling to support software development under the capability maturity model


    Knowledge-based inspection

    The increasing level of complexity in almost every discipline and operation today raises the demand for knowledge in order to successfully run an organization, whether to generate profit or to attain a non-profit mission. The traditional way of transferring knowledge to information systems, rich in data structures and complex algorithms, continues to hinder the ability to swiftly turn concepts into operations. Diagrammatic modelling, commonly applied in engineering to represent concepts or reality, remains an excellent way of converging knowledge from domain experts. The nuclear verification domain is a matter of ever greater importance to world safety and security, and demand for knowledge about the nuclear processes and verification activities used to offset potential misuse of nuclear technology will intensify as that technology grows. This doctoral thesis contributes a model-based approach for representing complex processes such as nuclear inspections; the work also applies to other domains characterized by knowledge-intensive and complex processes. Based on the characteristics of a complex process, a conceptual framework was established as the theoretical basis for creating a number of modelling languages to represent the domain. The integrated Safeguards Modelling Method (iSMM) is formalized through an integrated meta-model, and the diagrammatic modelling languages represent the verification domain and relevant nuclear verification aspects. The meta-model conceptualizes the relation between practices of process management, knowledge management, and domain-specific verification principles; this fusion is considered necessary in order to create quality processes. The study also extends the formalization achieved through the meta-model by contributing a formalization language based on Pattern Theory. Through the graphical and mathematical constructs of the theory, process structures are formalized, enhancing the ability to analyse, compare, and transform models. In the example domain, all possible connections between critical nuclear processes were formalized, also providing for probability-based analysis of weapons acquisition paths that will help design objective-based inspection processes.
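
    A minimal sketch of the probability-based path analysis mentioned above: candidate acquisition paths through a small directed process graph are enumerated and ranked by the product of per-step probabilities. The node names and probabilities are illustrative assumptions, not safeguards data or the thesis's Pattern Theory formalization.

```python
# Minimal sketch: rank paths from a start process to a goal process by the
# product of per-step probabilities over a small directed process graph.

# Directed edges: process -> [(next_process, probability of that step)]
edges = {
    "acquire_material": [("convert", 0.6), ("divert", 0.2)],
    "convert":          [("enrich", 0.5)],
    "divert":           [("enrich", 0.4)],
    "enrich":           [("weaponize", 0.3)],
    "weaponize":        [],
}

def ranked_paths(start, goal):
    paths = []
    stack = [(start, [start], 1.0)]
    while stack:
        node, path, prob = stack.pop()
        if node == goal:
            paths.append((prob, path))
            continue
        for nxt, p in edges.get(node, []):
            if nxt not in path:              # do not revisit a process
                stack.append((nxt, path + [nxt], prob * p))
    return sorted(paths, reverse=True)

for prob, path in ranked_paths("acquire_material", "weaponize"):
    print(f"{prob:.3f}  " + " -> ".join(path))
```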

    Text mining and natural language processing for the early stages of space mission design

    Final thesis submitted December 2021; degree awarded in 2022. A considerable amount of data related to space mission design has been accumulated since artificial satellites started to venture into space in the 1950s. This data has today become an overwhelming volume of information, triggering a significant knowledge-reuse bottleneck at the early stages of space mission design. Meanwhile, virtual assistants, text mining, and Natural Language Processing techniques have become pervasive in our daily lives. The work presented in this thesis is one of the first attempts to bridge the gap between the worlds of space systems engineering and text mining. Several novel models are developed and implemented here, targeting the structuring of accumulated data through an ontology, as well as tasks commonly performed by systems engineers such as requirement management and heritage analysis. A first collection of documents related to space systems is gathered for the training of these methods. Ultimately, this work aims to pave the way towards the development of a Design Engineering Assistant (DEA) for the early stages of space mission design. It is also hoped that this work will actively contribute to the integration of text mining and Natural Language Processing methods in the field of space mission design, enhancing current design processes.
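
    A minimal sketch of one heritage-analysis step of the kind described above: a new requirement is matched against past requirements by TF-IDF cosine similarity. The corpus and requirement texts are invented for illustration, and the method is a generic baseline rather than necessarily the models developed in the thesis.

```python
# Minimal sketch: retrieve the most similar past requirements for a new
# requirement using TF-IDF vectors and cosine similarity.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

past_requirements = [
    "The spacecraft shall downlink stored mission data within 30 minutes of ground contact.",
    "The propulsion subsystem shall provide a total delta-v of at least 150 m/s.",
    "The thermal control subsystem shall keep battery temperature between 0 and 40 degrees C.",
]

new_requirement = "Stored payload data shall be downlinked during each ground station pass."

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(past_requirements + [new_requirement])

query_vec = matrix[len(past_requirements):]      # the new requirement
corpus_vecs = matrix[:len(past_requirements)]    # the heritage corpus
scores = cosine_similarity(query_vec, corpus_vecs).ravel()

# Print heritage candidates, most similar first.
for score, req in sorted(zip(scores, past_requirements), reverse=True):
    print(f"{score:.2f}  {req}")
```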

    Systematic Model-based Design Assurance and Property-based Fault Injection for Safety Critical Digital Systems

    With advances in sensing, wireless communications, computing, control, and automation technologies, we are witnessing the rapid uptake of Cyber-Physical Systems across many applications including connected vehicles, healthcare, energy, manufacturing, and smart homes. Many of these applications are safety-critical in nature, and they depend on the correct and safe execution of software and hardware that are intrinsically subject to faults. These faults can be design faults (software faults, specification faults, etc.) or physically occurring faults (hardware failures, single-event upsets, etc.). Both types of faults must be addressed during the design and development of these critical systems. Several safety-critical industries have widely adopted Model-Based Engineering paradigms to manage the design assurance processes of these complex CPSs. This thesis studies the application of an IEC 61508-compliant model-based design assurance methodology to a representative safety-critical digital architecture targeted at nuclear power generation facilities. The study presents detailed experiences and results to demonstrate the benefits of model testing in finding design flaws and its relevance to subsequent verification steps in the workflow. Additionally, to study the impact of physical faults on the digital architecture, we develop a novel property-based fault injection method that overcomes several deficiencies of traditional fault injection methods. The model-based fault injection approach presented here guarantees high efficiency and near-exhaustive input/state/fault space coverage by utilizing formal model checking principles to identify fault activation conditions and prove fault tolerance features. The fault injection framework facilitates automated integration of fault saboteurs throughout the model, enabling exhaustive fault-location coverage.
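
    A minimal sketch of the property-based idea on a toy example: every input and every single-fault location of a triple modular redundant voter is enumerated, a bit-flip saboteur is applied, and a simple fault-tolerance property is checked. The TMR voter and the exhaustive enumeration stand in for the thesis's formal model checking and are assumptions, not its actual architecture or tooling.

```python
# Minimal sketch: exhaustively enumerate inputs and single-fault locations,
# apply a bit-flip saboteur, and check a fault-tolerance property.

def voter(a, b, c):
    """2-out-of-3 majority voter."""
    return (a and b) or (a and c) or (b and c)

def saboteur(value, active):
    """Bit-flip saboteur: inverts the signal while the fault is active."""
    return (not value) if active else value

violations = []
for inp in (False, True):                    # exhaustive input space
    for fault_loc in range(3):               # exhaustive single-fault locations
        replicas = [saboteur(inp, i == fault_loc) for i in range(3)]
        # Property: a single bit flip must not change the voted output.
        if voter(*replicas) != inp:
            violations.append((inp, fault_loc))

print("property holds" if not violations else f"violations: {violations}")
```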

    A decision support system for integrated semi-centralised urban wastewater treatment systems

    The importance of adequate water supply and sanitation infrastructure as cornerstones for the development of civilizations is undeniable. Although a strategy based on centralised infrastructure has proven successful in the past, in some circumstances such conventional systems are inappropriate for future needs. A Semi-centralised Urban Wastewater Treatment System (SUWWTS) may be considered a viable sustainable urban water management solution to promote water security. A SUWWTS merges the regulations of traditional centralised systems with the closed-loop and resource-recovery concepts of decentralised systems. However, research on the design and feasibility of implementing semi-centralised systems is in its infancy. This thesis is a first attempt to articulate the complexity of, to systematize, and to automate the design of a SUWWTS. We present a novel method, referred to as the framework, for the development of a SUWWTS with allowance for the socio-economic and geographic context of any urban area. To demonstrate the proposed framework, a Decision Support System (DSS) was developed; its output is a recommended design comprising several wastewater treatment plants, their respective technologies, and their associated sewerage and reclaimed-water distribution networks. The results demonstrate the capabilities and usefulness of the DSS; it applies the design engineers' subjective preferences, such as regional technological inclinations and implementation strategies. The results from a feasibility study on the city of Rio de Janeiro validated and demonstrated how the DSS can be used to assist decision-makers. This thesis discusses the framework, the DSS, and the demonstration case. Overall, it will hopefully help both researchers and practitioners by contributing to the discussion on how to promote urban water security and decrease urban areas' dependency on ecosystem services whilst delivering better social welfare.
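
    A minimal sketch of one step a DSS of this kind might perform: candidate treatment technologies are ranked by a weighted score that encodes the designer's subjective preferences. The technology names, criteria, scores, and weights are illustrative assumptions, not the thesis's knowledge base or decision model.

```python
# Minimal sketch: rank candidate treatment technologies by a weighted score
# reflecting the design engineer's preferences.

# Criterion scores on a 0-1 scale (higher is better).
technologies = {
    "membrane_bioreactor": {"effluent_quality": 0.9, "footprint": 0.8, "cost": 0.4},
    "activated_sludge":    {"effluent_quality": 0.7, "footprint": 0.4, "cost": 0.8},
    "constructed_wetland": {"effluent_quality": 0.6, "footprint": 0.2, "cost": 0.9},
}

# Designer preferences: weights sum to 1.
weights = {"effluent_quality": 0.5, "footprint": 0.3, "cost": 0.2}

def weighted_score(scores, weights):
    return sum(weights[c] * scores[c] for c in weights)

ranking = sorted(technologies.items(),
                 key=lambda kv: weighted_score(kv[1], weights),
                 reverse=True)

for name, scores in ranking:
    print(f"{name}: {weighted_score(scores, weights):.2f}")
```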