
    Model Continuity in Discrete Event Simulation: A Framework for Model-Driven Development of Simulation Models

    Most well-known modeling and simulation methodologies state the importance of conceptual modeling in simulation studies, and they suggest the use of conceptual models during the simulation model development process. However, only a limited number of methodologies refer to how to move from a conceptual model to an executable simulation model. Moreover, existing modeling and simulation methodologies do not typically provide a formal method for model transformations between the models in the different stages of the development process. Hence, in current M&S practice, model continuity is usually not achieved. In this article, a model-driven development framework for modeling and simulation is proposed in order to bridge the gap between the different stages of a simulation study and to obtain model continuity. The applicability of the framework is illustrated with a prototype modeling environment and a case study in the discrete event simulation domain.
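
    As a hedged illustration of the kind of model-to-model transformation such a framework requires (the metamodel, names, and engine below are invented for illustration, not the paper's actual framework), the sketch mechanically translates a small declarative conceptual model into executable event routines for a toy discrete-event engine:

```python
# Hypothetical sketch of a conceptual-model-to-executable transformation;
# the metamodel and engine are invented for illustration.
from dataclasses import dataclass
import heapq

@dataclass
class ConceptualActivity:
    name: str
    duration: float      # mean activity time taken from the conceptual model
    successor: str | None = None

@dataclass
class ConceptualModel:
    activities: dict
    start: str

def transform(cm: ConceptualModel) -> dict:
    """Mechanically map each declarative activity onto an executable
    event routine, preserving the conceptual model's structure."""
    def make_handler(act: ConceptualActivity):
        def handler(clock, schedule):
            if act.successor:  # completing this activity triggers the next one
                schedule(clock + act.duration, act.successor)
        return handler
    return {name: make_handler(act) for name, act in cm.activities.items()}

def run(cm: ConceptualModel, until: float = 20.0) -> None:
    handlers = transform(cm)
    agenda = [(0.0, cm.start)]                      # future event list
    while agenda and agenda[0][0] <= until:
        clock, name = heapq.heappop(agenda)
        print(f"t={clock:5.1f}  {name}")
        handlers[name](clock, lambda t, n: heapq.heappush(agenda, (t, n)))

run(ConceptualModel(
    activities={
        "arrive": ConceptualActivity("arrive", 2.0, "serve"),
        "serve":  ConceptualActivity("serve", 5.0, "depart"),
        "depart": ConceptualActivity("depart", 0.0),
    },
    start="arrive",
))
```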

    Model-Based Methodology for Building Confidence in a Dynamic Measuring System

    This thesis examines the special case in which a newly developed dynamic measurement system must be characterized when an accepted standard qualification procedure does not yet exist. To characterize this type of system, both physical experimentation and computational simulation methods are used to build trust in the measurement system. This process of establishing credibility is presented in the form of a proposed methodology, which utilizes the verification and validation methods applied within the simulation community as the foundation for a multi-faceted approach. The methodology establishes the relationships between four key elements: physical experimentation, conceptual modeling, computational simulations, and data processing. The combination of these activities provides a comprehensive characterization study of the system. To illustrate the methodology, a case study was performed on a dynamic force measurement system owned by Sandia National Laboratories. This system was designed to measure the force required to pull a specimen to failure in tension at a user-input velocity. The case study found that a significant measurement error occurred when the pull event involved large break loads and high velocities. 100 pull events were recorded using an experimental test assembly. At the highest load conditions, a force measurement error of over 100% was discovered. Using computational simulations, this measurement error was reduced to less than 10%. These simulations were designed to account for the inertial effects that skew the piezoelectric load cells. The thesis displays the raw data and the corrected data for five different pull settings; the simulations designed using the methodology significantly reduced the error in all five pull settings. In addition to the force analysis, the simulations provide insight into the complete system performance, including analysis of the maximum system velocity as well as several proposed design changes. The findings suggest that the dynamic measurement system has a maximum velocity of 28 fps, and that this maximum velocity is unaffected by the track length or the mass of the moving carriage.
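
    The inertial correction at the heart of those simulations can be sketched in a few lines (the fixture mass, signals, and correction form below are assumptions for illustration, not the thesis's actual model): the load cell reads the specimen force plus the inertial force of the accelerating fixture mass, so differentiating the measured velocity yields an acceleration estimate whose m·a contribution can be subtracted back out.

```python
# Illustrative inertial compensation for a piezoelectric load-cell reading;
# the fixture mass and signals below are synthetic, not the thesis data.
import numpy as np

def correct_force(f_measured, velocity, fixture_mass, dt):
    """Remove the inertial contribution m*a of the moving fixture
    from the raw load-cell signal (arrays sampled every dt seconds)."""
    accel = np.gradient(velocity, dt)          # estimate a(t) from v(t)
    return f_measured - fixture_mass * accel   # F_specimen = F_cell - m*a

# Synthetic example: a 2 kg carriage accelerating hard makes the raw
# reading overshoot the true 1 kN break load.
dt = 1e-4
t = np.arange(0, 0.05, dt)
true_force = 1000.0 * np.clip(t / 0.04, 0, 1)       # ramp to 1 kN
velocity = 8.0 * t / t[-1]                          # ramp to ~8 m/s
raw = true_force + 2.0 * np.gradient(velocity, dt)  # cell sees F + m*a
corrected = correct_force(raw, velocity, fixture_mass=2.0, dt=dt)
print(f"peak raw error:       {abs(raw - true_force).max():.1f} N")
print(f"peak corrected error: {abs(corrected - true_force).max():.1f} N")
```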

    Agent-based dynamic knowledge representation of Pseudomonas aeruginosa virulence activation in the stressed gut: Towards characterizing host-pathogen interactions in gut-derived sepsis

    Background: There is a growing realization that alterations in host-pathogen interactions (HPI) can generate disease phenotypes without pathogen invasion. The gut represents a prime region where such HPI can arise and manifest. Under normal conditions intestinal microbial communities maintain a stable, mutually beneficial ecosystem. However, host stress can lead to changes in environmental conditions that shift the nature of the host-microbe dialogue, resulting in escalation of virulence expression, immune activation and ultimately systemic disease. Effective modulation of these dynamics requires the ability to characterize the complexity of the HPI, and dynamic computational modeling can aid in this task. Agent-based modeling is a computational method that is suited to representing spatially diverse, dynamical systems. We propose that dynamic knowledge representation of gut HPI with agent-based modeling will aid in the investigation of the pathogenesis of gut-derived sepsis.
    Methodology/Principal Findings: An agent-based model (ABM) of virulence regulation in Pseudomonas aeruginosa was developed by translating bacterial and host cell sense-and-response mechanisms into behavioral rules for computational agents and integrated into a virtual environment representing the host-microbe interface in the gut. The resulting gut milieu ABM (GMABM) was used to: 1) investigate a potential clinically relevant laboratory experimental condition not yet developed, i.e. non-lethal transient segmental intestinal ischemia; 2) examine the sufficiency of existing hypotheses to explain experimental data, i.e. lethality in a model of major surgical insult and stress; and 3) produce behavior to potentially guide future experimental design, i.e. suggested sample points for a potential laboratory model of non-lethal transient intestinal ischemia. Furthermore, hypotheses were generated to explain certain discrepancies between the behaviors of the GMABM and biological experiments, and new investigatory avenues proposed to test those hypotheses.
    Conclusions/Significance: Agent-based modeling can account for the spatio-temporal dynamics of an HPI, and, even when carried out with a relatively high degree of abstraction, can be useful in the investigation of system-level consequences of putative mechanisms operating at the individual agent level. We suggest that an integrated and iterative heuristic relationship between computational modeling and more traditional laboratory and clinical investigations, with a focus on identifying useful and sufficient degrees of abstraction, will enhance the efficiency and translational productivity of biomedical research.
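
    As a rough sketch of the sense-and-respond rule translation described above (the grid, thresholds, and rules are illustrative stand-ins, not the published GMABM rules), each bacterial agent samples a local host-stress field and switches its virulence program on when the signal crosses a threshold:

```python
# Minimal agent-based sketch of sense-and-respond virulence activation.
# Grid size, thresholds, and rules are illustrative, not the GMABM's.
import random

GRID = 20
stress = [[0.0] * GRID for _ in range(GRID)]   # host stress signal field

class Bacterium:
    VIRULENCE_THRESHOLD = 0.6   # assumed activation level

    def __init__(self, x, y):
        self.x, self.y, self.virulent = x, y, False

    def step(self):
        # Sense: read the local host stress signal.
        local = stress[self.x][self.y]
        # Respond: switch the virulence program on or off.
        self.virulent = local > self.VIRULENCE_THRESHOLD
        # Move randomly within the grid (crude motility).
        self.x = max(0, min(GRID - 1, self.x + random.choice((-1, 0, 1))))
        self.y = max(0, min(GRID - 1, self.y + random.choice((-1, 0, 1))))

# Simulate transient ischemia: stress one gut segment, then reperfuse.
agents = [Bacterium(random.randrange(GRID), random.randrange(GRID))
          for _ in range(200)]
for tick in range(50):
    decay = 0.9 if tick > 25 else 1.0           # reperfusion after tick 25
    for i in range(GRID):
        for j in range(GRID):
            stress[i][j] = (0.8 if tick <= 25 and i < 5 else stress[i][j]) * decay
    for a in agents:
        a.step()
    if tick % 10 == 0:
        print(tick, sum(a.virulent for a in agents), "virulent agents")
```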

    An Ontology-Based, Risk-Oriented Model for Software Product Quality Certification

    This paper presents a model for software product quality certification based on an ontology that uses a risk-oriented framework. A literature review on testing and software quality, together with international standards such as the ISO series, provided the grounds for the research, which also involved a case study. A prototype was developed using ontology and software-agent technologies. The contribution of this work consists of a framework focused on risks for the specialization of a software quality model, and an ontology for representing the knowledge involved in the certification process.
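
    A minimal sketch of the risk-oriented specialization idea (the characteristic names echo common ISO/IEC software quality models, but the weights and scores are made up for illustration, not the paper's ontology): quality characteristics are weighted by the product's risk profile before an overall certification score is aggregated.

```python
# Illustrative risk-weighted quality scoring; the characteristics, weights,
# and scores below are hypothetical, not the paper's certification model.
RISK_WEIGHTS = {                    # assumed risk profile for one product
    "functional_suitability": 0.30,
    "reliability":            0.25,
    "security":               0.30,
    "usability":              0.15,
}

def certification_score(measured: dict) -> float:
    """Aggregate per-characteristic scores (0..1) into one
    risk-weighted certification score."""
    return sum(RISK_WEIGHTS[c] * s for c, s in measured.items())

scores = {"functional_suitability": 0.9, "reliability": 0.7,
          "security": 0.6, "usability": 0.8}
print(f"risk-weighted score: {certification_score(scores):.2f}")
```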

    The Advanced Framework for Evaluating Remote Agents (AFERA): A Framework for Digital Forensic Practitioners

    Digital forensics experts need a dependable method for evaluating evidence-gathering tools. Limited research and resources challenge this process, and the lack of multi-endpoint data validation hinders reliability in distributed digital forensics. A framework was designed to evaluate distributed agent-based forensic tools while enabling practitioners to self-evaluate and demonstrate evidence reliability as required by the courts. Grounded in Design Science, the framework features guidelines, data, criteria, and checklists. Expert review enhances its quality and practicality.

    Development Approaches Coupled with Verification and Validation Methodologies for Agent-Based Mission-Level Analytical Combat Simulations

    This research investigated the applicability of agent-based combat simulations to real-world combat operations. An agent-based simulation of the Allied offensive search for German U-Boats in the Bay of Biscay during World War II was constructed, extending the state-of-the-art in agent-based combat simulations and bridging the gap between the current level of agent-like combat simulations and the concept of agent-based simulations found in the broader literature. The proposed simulation advances agent-based combat simulations to “validateable” mission-level military operations. Simulation validation is a complex task with numerous, diverse techniques available, and levels of validation differ significantly among simulations and applications. This research presents a verification and validation taxonomy based on face validity, empirical validity, and theoretical validity, extending the verification and validation knowledge base to include techniques specific to agent-based models. The verification and validation techniques are demonstrated in a Bay of Biscay case study. Validating combat operations poses particular problems due to the infrequency of real-world occurrences that can serve as simulation validation cases; often just a single validation comparison can be made, which means comparisons to the underlying stochastic process are not possible without significant loss of statistical confidence. This research also presents a statistical validation methodology based on re-sampling historical outcomes which, when coupled with the traditional nonparametric sign test, allows comparison between a simulation and a historic operation, providing an improved validation indicator beyond a single pass-or-fail test.
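
    A hedged sketch of that resampling-plus-sign-test idea (all numbers invented, statistics simplified): draw resamples from the historical outcome record, pair each simulation replication with a resampled historical outcome, and apply the nonparametric sign test to the paired comparisons.

```python
# Sketch of validation by resampling historical outcomes plus a sign test.
# The data and pairing scheme are illustrative, not the study's.
import random
from math import comb

historical = [3, 5, 2, 6, 4, 3, 5, 4]                # outcome record (made up)
simulated  = [4, 4, 3, 5, 5, 2, 6, 4, 3, 5, 4, 5]    # replication outputs (made up)

def sign_test_p(pluses: int, minuses: int) -> float:
    """Two-sided sign test p-value under X ~ Binomial(n, 0.5)."""
    n = pluses + minuses
    k = max(pluses, minuses)
    tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2 * tail)

random.seed(1)
pluses = minuses = 0
for sim in simulated:
    hist = random.choice(historical)   # resample one historical outcome
    if sim > hist:
        pluses += 1
    elif sim < hist:
        minuses += 1                   # ties are dropped, per the sign test

p = sign_test_p(pluses, minuses)
print(f"+{pluses} / -{minuses}, sign-test p = {p:.3f}")
print("no significant divergence" if p > 0.05
      else "simulation diverges from history")
```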

    Cloud Computing Acceptance Factors in the City of Medellín: A Case Study

    Cloud computing is one of the fastest-growing technologies applied to the business sector, understood as a set of publicly available computing resources on the Internet that allows companies to outsource their infrastructure in order to improve the efficiency of their processes. However, the diffusion of a new technology does not ensure its use, acceptance, and usefulness in small and medium-sized enterprises (SMEs). It is therefore important to study the factors that condition the adoption of cloud computing in the SMEs of Medellín, in order to describe their level of knowledge, their motivations for incorporating new technologies, and how the companies that have already adopted it did so. The general objective of this research was to examine the acceptance factors of cloud computing in 5 SMEs of the service sector of the city of Medellín through the Technology Acceptance Model (TAM), with the aim of proposing internal business strategies to promote its use in SMEs. The TAM, which builds on the theory of the diffusion of innovations, is one of the key frameworks in the development of technological products and services and one of the models most widely employed in research to investigate the acceptance of information systems. A descriptive exploratory study was carried out to investigate the adoption factors of cloud computing in SMEs and to describe which companies have adopted the technology and how they are doing so, using a field study with access to primary information sources and a qualitative methodology based on semi-structured interviews. Among the results, the study found a field in continuous growth and of great dispersion, with highly dynamic evolution of its networks. Likewise, it found that the perception of cloud computing is focused on perceived usefulness, through factors that foster its adoption such as flexibility, easy adaptation, business expansion, and its value as a strategy to compete and remain in the market.