Independent verification of specification models for large software systems at the early phases of development lifecycle
One of the major challenges facing the software industry in general, and IV&V (Independent Verification and Validation) analysts in particular, is to find ways to analyze the dynamic behavior of requirement specifications of large software systems early in the development lifecycle. Such analysis can significantly improve the performance and reliability of the developed systems. This dissertation addresses the problem of developing an IV&V framework for extracting the semantics of dynamic behavior from requirement specifications based on: (1) SART (Structured Analysis with Real Time) models, and (2) UML (Unified Modeling Language) models. For SART, the framework presented here shows a direct mapping from SART specification models to CPN (Colored Petri Net) models. The semantics of the SART hierarchy at the individual levels are preserved in the mapping, which makes it easy for the analyst to perform the analysis and trace back to the corresponding SART model. CPN was selected because it supports rigorous dynamic analysis. A large-scale case study based on a component of the NASA EOS system was performed as a proof of concept. For UML specifications, an approach based on metamodels is presented. A special type of metamodel, called a dynamic metamodel (DMM), is introduced. This approach holds several advantages over a direct mapping of UML to CPN. The mapping rules for generating a DMM are not CPN specific, so they would not change if a language other than CPN were used. It also makes DMM development more flexible, because other types of models can be added to the existing UML models. A simple example of a pacemaker is used to illustrate the concepts of DMM.
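As a rough illustration of why CPN supports this kind of dynamic analysis, the following Python sketch plays a token game on a tiny colored net; the net, the guard, and all names are hypothetical and are not the dissertation's SART-to-CPN mapping.

```python
# Minimal colored-Petri-net style token game, illustrating the kind of
# dynamic analysis CPN enables; the tiny net below is hypothetical.
from dataclasses import dataclass, field

@dataclass
class Place:
    name: str
    tokens: list = field(default_factory=list)   # each token carries a "color" (data value)

@dataclass
class Transition:
    name: str
    inputs: list            # input Places
    outputs: list           # output Places
    guard: callable         # predicate over one token per input place
    action: callable        # produces output tokens from consumed tokens

    def enabled(self):
        return all(p.tokens for p in self.inputs) and self.guard(*[p.tokens[0] for p in self.inputs])

    def fire(self):
        consumed = [p.tokens.pop(0) for p in self.inputs]
        for place, tok in zip(self.outputs, self.action(*consumed)):
            place.tokens.append(tok)

# Toy net: a "requests" place feeding a "served" place when priority is high enough.
requests = Place("requests", tokens=[{"id": 1, "prio": 5}, {"id": 2, "prio": 1}])
served = Place("served")
serve = Transition("serve", [requests], [served],
                   guard=lambda tok: tok["prio"] >= 3,
                   action=lambda tok: [tok])

while serve.enabled():          # explore one firing sequence of the token game
    serve.fire()
print([t["id"] for t in served.tokens])   # -> [1]; request 2 stays blocked by the guard
```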
Modular Software Process Simulation Models Through Metamodeling
In this paper we present the main concepts and principles of a multilevel architecture to help in the development of modularized and reusable software process models under the System Dynamics approach. The conceptual ideas of the multilevel architecture have been formalized using UML as a notation. Metamodeling is used to support the process of developing abstract modules. The proposed architecture is also based on ISO's Information Resource Dictionary System. The principles of the architecture and an overall guide for developing software process simulation models are described in this work.
Ministerio de Ciencia y Tecnología TIN2004-06689-C03-0
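To make the System Dynamics setting concrete, here is a minimal sketch of a reusable stock-and-flow module in Python with Euler integration; the module interface and the backlog example are invented for illustration and do not reproduce the paper's multilevel architecture or its IRDS-based metamodel.

```python
# A minimal, reusable System Dynamics "module": one stock with inflow/outflow
# rates integrated by Euler's method. Names are illustrative only.
def simulate_stock(initial, inflow, outflow, dt=0.25, steps=40):
    """inflow/outflow are functions of (time, stock_level) returning rates."""
    level, t, history = initial, 0.0, []
    for _ in range(steps):
        level += dt * (inflow(t, level) - outflow(t, level))
        t += dt
        history.append((t, level))
    return history

# Example reuse: task backlog in a software process model, with a fixed
# arrival rate and a completion rate proportional to the current backlog.
backlog = simulate_stock(
    initial=100.0,
    inflow=lambda t, s: 12.0,            # tasks arriving per week
    outflow=lambda t, s: 0.1 * s,        # 10% of backlog completed per week
)
print(f"backlog after 10 weeks: {backlog[-1][1]:.1f} tasks")
```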
Metamodeling Techniques Applied to the Design of Reconfigurable Control Applications
In order to realize autonomous manufacturing systems in environments characterized by high dynamics and high task complexity, it is necessary to improve control system modelling and performance. This requires better and reusable abstractions. In this paper, we explore metamodeling techniques as a foundation for solving this problem. The increasing popularity of model-driven approaches and a new generation of tools supporting metamodeling techniques are changing the software engineering landscape, boosting the adoption of new methodologies for control application development.
Tackling Traceability Challenges through Modeling Principles in Methodologies Underpinned by Metamodels.
Traceability is recognized to be essential for supporting software development. However, a number of traceability issues are still open, such as the formalization of link semantics or traceability process models. Traceability methodologies underpinned by metamodels are a promising approach, but current metamodels still have serious limitations. Concerning methodologies in general, three hierarchical layers have been identified: metamodel, methodology and project. Metamodels often do not properly support this architecture, which results in semantic problems when the methodology is specified. Another limitation is that they provide extensive predefined sets of types for describing project attributes, while these project attributes are domain specific and sometimes even project specific. This paper introduces two complementary modeling principles to overcome these limitations: the three-layer metamodeling hierarchy and power-type patterns. Mechanisms to extend and refine traceability models are inherent to them. The paper shows that, when methodologies are developed from metamodels based on these two principles, the result is a methodology well fitted to project features. Link semantics is also improved.
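A minimal sketch of the power-type pattern applied to traceability links, assuming hypothetical class names: methodology-level link kinds are ordinary objects that simultaneously act as types for project-level links, which is what lets a project define its own domain-specific link types and attributes.

```python
# Sketch of the power-type pattern for traceability links: methodology-level
# "kinds" (instances of TraceLinkKind) act as types for project-level TraceLink
# objects. Class and attribute names are illustrative, not the paper's metamodel.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class TraceLinkKind:              # power type: its instances are link types
    name: str
    source_kind: str              # e.g. "TestCase"
    target_kind: str              # e.g. "Requirement"

@dataclass
class TraceLink:                  # project-level link, classified by a kind
    kind: TraceLinkKind
    source: str
    target: str
    attributes: dict = field(default_factory=dict)   # project-specific attributes

# Methodology level: the project defines exactly the link kinds it needs ...
verifies = TraceLinkKind("verifies", "TestCase", "Requirement")

# ... Project level: links are then instances classified by those kinds.
link = TraceLink(verifies, source="TC-12", target="REQ-7",
                 attributes={"coverage": "partial"})
print(link.kind.name, link.source, "->", link.target)
```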
From multiscale modeling to metamodeling of geomechanics problems
In numerical simulations of geomechanics problems, a grand challenge is to make accurate and robust predictions by capturing the true mechanisms of particle interactions, fluid flow inside pore spaces, and the hydromechanical coupling between the solid and fluid constituents, from the microscale to the mesoscale and on to the macroscale. While simulation tools incorporating subscale physics can provide detailed insights and accurate material properties to macroscale simulations via computational homogenization, these simulations are often too computationally demanding to be used directly across multiple scales. Recent breakthroughs in Artificial Intelligence (AI) via machine learning have great potential to overcome these barriers, as evidenced by their success in applications such as image recognition, natural language processing, and strategy exploration in games. AI can achieve super-human performance in a large number of applications and accomplish tasks that were previously thought infeasible given the limitations of humans and earlier computer algorithms. Yet machine learning approaches can also suffer from overfitting, lack of interpretability, and lack of reliability. Thus, applying machine learning to the generation of accurate and reliable surrogate constitutive models for multiscale, multiphysics geomaterials is not trivial. For this purpose, we propose to establish an integrated modeling process for automatically designing, training, validating, and falsifying constitutive models, or "metamodeling". This dissertation presents our efforts in laying down, step by step, the necessary theoretical and technical foundations for the multiscale metamodeling framework.
The first step is to develop multiscale hydromechanical homogenization frameworks for both bulk granular materials and granular interfaces, with their behaviors homogenized from subscale microstructural simulations. For efficient simulations of field-scale geomechanics problems across more than two scales, we develop a hybrid data-driven method designed to capture the multiscale hydro-mechanical coupling of porous media with pores of various sizes. By using sub-scale simulations to generate a database for training material models, an offline homogenization procedure replaces the up-scaling procedure and generates path-dependent cohesive laws for localized physical discontinuities at both the grain and specimen scales.
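The offline-homogenization idea can be sketched as follows: a toy stand-in for the sub-scale simulations generates a database of strain/history inputs and homogenized stresses, and a small neural network surrogate is trained on it for the macroscale solver to query. Everything here (the stand-in physics, the features, and the library choices) is illustrative, not the dissertation's actual scheme.

```python
# Offline-homogenization sketch: build a database from a toy stand-in for
# sub-scale simulations, then train a surrogate the macroscale solver can
# query instead of re-running the microscale model.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def subscale_stress(strain, accumulated_plastic):
    """Toy stand-in for a sub-scale simulation: hardening elasto-plastic response."""
    yield_stress = 1.0 + 0.5 * accumulated_plastic
    return np.minimum(10.0 * strain, yield_stress)

# Build the training database "offline".
strain = rng.uniform(0.0, 0.5, size=2000)
history = rng.uniform(0.0, 1.0, size=2000)
X = np.column_stack([strain, history])
y = subscale_stress(strain, history)

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0).fit(X, y)

# The macroscale code calls the cheap surrogate instead of the sub-scale model.
print(surrogate.predict([[0.3, 0.2]]))   # approximate homogenized stress
```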
To enable AI to take over the trial-and-error tasks of the constitutive modeling process, we introduce a novel "metamodeling" framework that employs both graph theory and deep reinforcement learning (DRL) to generate accurate, physics-compatible and interpretable surrogate machine learning models. The process of writing constitutive models is simplified into a sequence of graph-edge formations with the goal of maximizing the model score (a function of accuracy, robustness and forward prediction quality). By using neural networks to estimate policies and state values, the computer agent is able to efficiently self-improve the constitutive models it generates through self-play.
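The "model as a sequence of graph edges" idea can be sketched with a toy component graph and a naive random search standing in for DRL self-play; the components, edges and scoring function below are hypothetical.

```python
# Candidate constitutive models are edge sequences through a small component
# graph, each scored against a stand-in objective; random search replaces DRL.
import random

# Directed component graph: node -> possible next modeling steps.
GRAPH = {
    "strain":        ["elasticity"],
    "elasticity":    ["yield_surface", "stress"],
    "yield_surface": ["hardening", "stress"],
    "hardening":     ["stress"],
}

def sample_model():
    """Form edges from 'strain' until 'stress' is reached."""
    node, edges = "strain", []
    while node != "stress":
        nxt = random.choice(GRAPH[node])
        edges.append((node, nxt))
        node = nxt
    return edges

def score(edges):
    """Stand-in for the (accuracy, robustness, forward-prediction) model score."""
    wanted = {("elasticity", "yield_surface"), ("yield_surface", "hardening")}
    return sum(e in wanted for e in edges) - 0.1 * len(edges)   # reward key physics, penalize complexity

best = max((sample_model() for _ in range(200)), key=score)
print("best edge sequence:", best)
```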
To overcome the obstacle of limited information in geomechanics, we improve the efficiency of experimental data utilization through a multi-agent cooperative metamodeling framework that provides guidance on database generation and constitutive modeling at the same time. The modeler agent in the framework focuses on evaluating all modeling options (from domain experts' knowledge or machine learning) in a directed multigraph of elasto-plasticity theory and finding the optimal path that links the source of the directed graph (e.g., strain history) to the target (e.g., stress). Meanwhile, the data agent focuses on collecting data from real or virtual experiments, interacting with the modeler agent sequentially and generating the database for model calibration to optimize prediction accuracy. Finally, we design a non-cooperative metamodeling framework that automatically develops strategies to simultaneously generate experimental data for calibrating model parameters and to probe the weaknesses of a known constitutive model, until the strengths and weaknesses of the constitutive law over its application range can be identified through competition. These tasks are enabled by a zero-sum reward system for the metamodeling game and robust adversarial reinforcement learning techniques.
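The modeler agent's path-finding task can be sketched with networkx: competing modeling options appear as parallel edges in a directed multigraph, and a weighted shortest-path query selects the optimal route from strain to stress. The nodes, options and costs shown are invented.

```python
# Find the lowest-cost path in a directed multigraph of modeling options
# linking strain (source) to stress (target); edge options and costs are made up.
import networkx as nx

G = nx.MultiDiGraph()
# Parallel edges = competing modeling options for the same step (expert vs. learned).
G.add_edge("strain", "elastic_strain", option="linear_elasticity", cost=1.0)
G.add_edge("strain", "elastic_strain", option="neural_elasticity", cost=0.6)
G.add_edge("elastic_strain", "stress", option="hyperelastic_law", cost=1.2)
G.add_edge("elastic_strain", "stress", option="plasticity_return_mapping", cost=0.9)

path = nx.shortest_path(G, source="strain", target="stress", weight="cost")
print("optimal path:", path)     # node sequence; the cheapest parallel edge is used per hop
```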
Metamodeling Techniques to Aid in the Aggregation Process of Large Hierarchical Simulation Models
This research investigates how aggregation is currently conducted in the simulation of large systems and how suitable aggregation can be achieved. More specifically, it examines how to accurately aggregate hierarchical lower-level (higher-resolution) models into the next higher level in order to reduce the complexity of the overall simulation model, with a focus on exploring the different aggregation techniques available for doing so. We develop aggregation procedures between two simulation levels (e.g., aggregation of engagement-level models into a mission-level model) to address how much and what information needs to pass from the high-resolution to the low-resolution model in order to preserve statistical fidelity. We present a mathematical representation of the simulation model based on network theory, together with procedures for simulation aggregation that are logical and executable. This research examines the effectiveness of several statistical techniques, including regression and three types of artificial neural networks, as aggregation techniques for predicting outputs of the lower-level model and evaluating their effect as inputs to the next higher-level model. The proposed process is a collection of conventional statistical and aggregation techniques, including one novel concept and extensions to the regression and neural network methods, which are compared to the truth simulation model, i.e., the model in which actual lower-level outputs are used as direct inputs into the next higher-level model. The aggregation methodology developed in this research provides an analytic foundation that formally defines the steps essential to appropriately and effectively simulating large hierarchical systems.
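A minimal sketch of this aggregation idea, with synthetic placeholder models: a regression metamodel is fitted to lower-level (engagement) outputs, its predictions are fed into a higher-level (mission) model, and the result is compared against the truth case where actual lower-level outputs are passed up.

```python
# Replace the engagement-level simulation with a fitted metamodel (here plain
# linear regression), feed its predictions into the mission-level model, and
# compare against the "truth" case. All models and data are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

def engagement_model(x):                 # high-resolution, expensive model (toy)
    return 3.0 * x[:, 0] - 2.0 * x[:, 1] + rng.normal(0, 0.1, len(x))

def mission_model(engagement_output):    # next higher-level model (toy)
    return engagement_output.mean()

X = rng.uniform(0, 1, size=(500, 2))     # engagement-level inputs (e.g., scenario factors)
y = engagement_model(X)

meta = LinearRegression().fit(X, y)      # aggregation metamodel of the lower level

X_new = rng.uniform(0, 1, size=(100, 2))
truth  = mission_model(engagement_model(X_new))   # actual lower-level outputs passed up
approx = mission_model(meta.predict(X_new))       # metamodel outputs passed up
print(f"truth={truth:.3f}  metamodel={approx:.3f}")
```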
Model Continuity in Discrete Event Simulation: A Framework for Model-Driven Development of Simulation Models.
Most of the well-known modeling and simulation methodologies state the importance of conceptual modeling in simulation studies, and they suggest the use of conceptual models during the simulation model development process. However, only a limited number of methodologies address how to move from a conceptual model to an executable simulation model. Moreover, existing modeling and simulation methodologies do not typically provide a formal method for model transformations between the models at different stages of the development process. Hence, in current M&S practice, model continuity is usually not achieved. In this article, a model-driven development framework for modeling and simulation is introduced in order to bridge the gap between the different stages of a simulation study and to obtain model continuity. The applicability of the framework is illustrated with a prototype modeling environment and a case study in the discrete event simulation domain.
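A toy illustration of such a transformation, assuming SimPy as the execution platform: a conceptual model expressed as plain data is transformed automatically into an executable discrete event simulation. The conceptual schema and the transformation rule are invented and are not the article's framework.

```python
# Conceptual model (plain data) -> executable SimPy process, as one tiny
# model-to-model transformation illustrating model continuity.
import simpy

# Conceptual model: an ordered list of activities with mean durations (hours).
conceptual_model = [
    {"activity": "check_in", "duration": 2.0},
    {"activity": "triage",   "duration": 1.0},
    {"activity": "treat",    "duration": 5.0},
]

def transform(conceptual):
    """Transformation: conceptual activities -> executable simulation process."""
    def patient(env, name):
        for step in conceptual:
            yield env.timeout(step["duration"])
            print(f"{env.now:5.1f}  {name} finished {step['activity']}")
    return patient

env = simpy.Environment()
process = transform(conceptual_model)
env.process(process(env, "patient-1"))
env.run()
```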
SDK development for bridging heterogeneous data sources through connect bridge platform
In this dissertation, we present an SDK for the creation of connectors that integrate with CB Server; the SDK accelerates development, promotes best practices, and simplifies the various activities and tasks of the development process. It provides a simple public API, supported by a set of tools built around that API, which facilitate the development process by exploiting the facilities the API offers. To analyse the correctness, feasibility, completeness, and accessibility of our solution, we present two examples and case studies. From the case studies, we derived a list of issues found in our solution and a set of proposals for improvement. To evaluate the usability of the API, a methodology based on several usability evaluation methods was established. A multiple case study serves as the main evaluation method, combining several research methods. The case study consists of three evaluation phases: a hands-on workshop, a heuristic evaluation, and a subjective analysis. It involved three computer science engineers (including novice and expert developers and evaluators). The applied methodology generated insights based on an inspection method, a user test, and interviews. We identified not only problems and flaws in the source code, but also runtime, structural, and documentation problems, as well as problems related to user experience. To help draw conclusions from the results, we describe the context of the study. Future work will include the development of new functionalities. Additionally, we aim to address the problems surfaced by the API usability evaluation methodology, namely problems and flaws in the source code (e.g. validations) and structural problems.
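For orientation only, a sketch of what a connector-facing SDK interface could look like in Python; the actual Connect Bridge / CB Server API is not described in this abstract, so every name and signature below is invented.

```python
# Purely illustrative connector interface; all names are hypothetical and do
# not reproduce the Connect Bridge SDK.
from abc import ABC, abstractmethod

class Connector(ABC):
    """Base class a connector developer would implement against the SDK."""

    @abstractmethod
    def connect(self, config: dict) -> None:
        """Open a session against the external data source."""

    @abstractmethod
    def execute(self, query: str) -> list[dict]:
        """Translate a SQL-like query into source-specific calls and return rows."""

    def close(self) -> None:
        """Optional cleanup hook with a default no-op implementation."""

class InMemoryConnector(Connector):
    def connect(self, config: dict) -> None:
        self.rows = [{"id": 1, "name": "demo"}]

    def execute(self, query: str) -> list[dict]:
        return self.rows            # a real connector would parse and dispatch the query

c = InMemoryConnector()
c.connect({})
print(c.execute("SELECT * FROM demo"))
```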