
    Semiautomatic generation of Web courses by means of an object-oriented simulation language

    This paper describes the procedure we have used to semiautomatically generate three different courses for the web. The simulations used in these courses have been written in our special-purpose object-oriented continuous simulation language (OOCSMP). A compiler we have written for this language automatically generates Java code and HTML pages, which must be completed manually with the text and images associated with each. This paper has been sponsored by the Spanish Interdepartmental Commission of Science and Technology (CICYT), project numbers TIC-96-0723-C02-01 and TEL97-030

    A ranking of the most known freeware and open source discrete-event simulation tools

    Freeware and open source simulation software can be of great relevance when applying simulation in companies that do not possess the monetary resources required to invest in traditional commercial software, since such software can be unaffordable. Even so, there is a lack of papers that contribute to the literature with a comparison of open source and freeware simulation tools. Furthermore, the existing papers fail to establish a proper assessment of these types of tools. In this regard, this paper proposes a study in which several freeware and open source discrete-event general-purpose simulation tools were selected and compared, in order to propose a ranking based on the tools' popularity, considering several criteria. For this purpose, 30 criteria were used to assess the score of each tool, leading to a podium composed of SimPy, JSim and JaamSim. Further conclusions and future work are discussed in the last section. This work has been supported by FCT – Fundação para a Ciência e Tecnologia within the Project Scope: UID/CEC/00319/2019 and by the Doctoral scholarship PDE/BDE/114566/2016 funded by FCT, the Portuguese Ministry of Science, Technology and Higher Education, through national funds, and co-financed by the European Social Fund (ESF) through the Operational Programme for Human Capital (POCH)
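    SimPy heads the ranking reported above. As a point of reference for what a minimal discrete-event model looks like in that tool, the short sketch below is a hypothetical example (not taken from the paper): jobs arrive at fixed intervals and queue for a single machine.

        # Minimal SimPy sketch (hypothetical example, not from the paper):
        # jobs arrive every 5 time units and occupy a single machine for 3 time units.
        import simpy

        def job_source(env, machine):
            job_id = 0
            while True:
                yield env.timeout(5)          # inter-arrival time
                job_id += 1
                env.process(serve(env, machine, job_id))

        def serve(env, machine, job_id):
            with machine.request() as req:    # wait in queue for the single machine
                yield req
                print(f"t={env.now}: job {job_id} starts")
                yield env.timeout(3)          # service time
                print(f"t={env.now}: job {job_id} done")

        env = simpy.Environment()
        machine = simpy.Resource(env, capacity=1)
        env.process(job_source(env, machine))
        env.run(until=30)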

    Freeform User Interfaces for Graphical Computing

    Report number: 甲15222 ; Date of degree conferral: 2000-03-29 ; Type of degree: doctorate by coursework ; Degree: Doctor of Engineering ; Diploma number: 博工第4717号 ; Graduate school / department: Graduate School of Engineering, Department of Information Engineering

    An Automated procedure for simulating complex arrival processes: A Web-based approach

    In industry, simulation is one of the most widely used probabilistic modeling tools for modeling highly complex systems. Major sources of complexity include the inputs that drive the logic of the model. Effective simulation input modeling requires the use of accurate and efficient input modeling procedures. This research focuses on nonstationary arrival processes. The fundamental stochastic model on which this study is conducted is the nonhomogeneous Poisson process (NHPP) which has successfully been used to characterize arrival processes where the arrival rate changes over time. Although a number of methods exist for modeling the rate and mean value functions that define the behavior of NHPPs, one of the most flexible is a multiresolution procedure that is used to model the mean value function for processes possessing long-term trends over time or asymmetric, multiple cyclic behavior. In this research, a statistical-estimation procedure for automating the multiresolution procedure is developed that involves the following steps at each resolution level corresponding to a basic cycle: (a) transforming the cumulative relative frequency of arrivals within the cycle to obtain a linear statistical model having normal residuals with homogeneous variance; (b) fitting specially formulated polynomials to the transformed arrival data; (c) performing a likelihood ratio test to determine the degree of the fitted polynomial; and (d) fitting a polynomial of the degree determined in (c) to the original (untransformed) arrival data. Next, an experimental performance evaluation is conducted to test the effectiveness of the estimation method. A web-based application for modeling NHPPs using the automated multiresolution procedure and generating realizations of the NHPP is developed. Finally, a web-based simulation infrastructure that integrates modeling, input analysis, verification, validation and output analysis is discussed
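    The abstract above describes fitting the NHPP mean value function and then generating realizations of the process. As background, the sketch below shows one standard way to simulate arrivals from an NHPP once a rate function has been fitted: Lewis–Shedler thinning, which is not necessarily the paper's multiresolution estimator. The rate function and its bound are invented for illustration.

        # Hypothetical illustration: simulate a nonhomogeneous Poisson process by
        # thinning (Lewis & Shedler). Assumes a fitted rate function rate(t) and a
        # bound rate_max >= rate(t) on [0, horizon]; both are invented for this sketch.
        import math
        import random

        def rate(t):
            # Example cyclic rate with a mild long-term trend (purely illustrative).
            return 5.0 + 0.02 * t + 3.0 * math.sin(2.0 * math.pi * t / 24.0)

        def simulate_nhpp(horizon, rate_max, seed=42):
            rng = random.Random(seed)
            arrivals = []
            t = 0.0
            while True:
                # Candidate arrival from a homogeneous process with rate rate_max.
                t += rng.expovariate(rate_max)
                if t > horizon:
                    break
                # Accept the candidate with probability rate(t) / rate_max.
                if rng.random() < rate(t) / rate_max:
                    arrivals.append(t)
            return arrivals

        if __name__ == "__main__":
            arrivals = simulate_nhpp(horizon=72.0, rate_max=10.0)
            print(f"{len(arrivals)} arrivals over 72 hours; first five: {arrivals[:5]}")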

    Detectability of the First Cosmic Explosions

    We present a fully self-consistent simulation of a synthetic survey of the furthermost cosmic explosions. The appearance of the first generation of stars (Population III) in the Universe represents a critical point during cosmic evolution, signaling the end of the dark ages, a period of absence of light sources. Despite their importance, there is no confirmed detection of Population III stars so far. A fraction of these primordial stars are expected to die as pair-instability supernovae (PISNe), and should be bright enough to be observed up to a few hundred million years after the big bang. While the quest for Population III stars continues, detailed theoretical models and computer simulations serve as a testbed for their observability. With the upcoming near-infrared missions, estimates of the feasibility of detecting PISNe are not only timely but imperative. To address this problem, we combine state-of-the-art cosmological and radiative simulations into a complete and self-consistent framework, which includes detailed features of the observational process. We show that a dedicated observational strategy using ≲ 8 per cent of the total allocation time of the James Webb Space Telescope mission can provide us with up to ∼9–15 detectable PISNe per year. Comment: 9 pages, 8 figures. Minor corrections added to match published version

    Resources Events Agents (REA), a text DSL for OMNIA Entities

    Numbersbelieve has been developing the OMNIA platform, a web application platform for building applications using Low-code principles and Agile approaches. Modeling Entities is an application used on the platform to create new entities. The OMNIA Entity concept has the following properties: Agents, Commitments, Documents, Events, Generic entities, Resources or Series. Most of these concepts are in accordance with the Resources Events Agents (REA) ontology but are not formalized. One of the goals of Numbersbelieve is a formalization of the REA concepts according to the ontology, first for the application that creates entities on the OMNIA platform and later for other applications. REA is an enterprise ontology developed by McCarthy (1979, 1982) that has its origin in accounting database systems. Later, Geerts and McCarthy (2002, 2006) extended the original model with new concepts. To formalize the concepts of the REA ontology, this research presents the development of a textual Domain-Specific Language (DSL) based on Model Driven Engineering (MDE), a development methodology that centres software development on models. This simplifies the engineering process, as it represents the actions and behaviors of a system even before the coding phase starts. The research is structured according to the Design Science Research Methodology (DSRM). Design Science (DS) is a methodology for solving problems that seeks to innovate by creating useful artifacts that define practices, projects and implementations, and is therefore suitable for this research. Three artifacts were developed for the formalization of the DSL: a meta-model (the abstract syntax), a textual language (the concrete syntax) and a JSON file for interaction with OMNIA. The first phase of the DSRM was to identify the problem mentioned above. The next phase focuses on the identification of requirements, which determined the REA concepts to be included in the meta-model and the textual language. Subsequently, the artifacts and the language editor were developed. The editor allows use cases provided by the Numbersbelieve team to be defined with the DSL, faults to be corrected and the language to be improved. The results were evaluated against the objectives and requirements, all of which were successfully met. Based on the analysis of the artifacts, the use of the language and the interaction with the OMNIA platform through the JSON file, it is concluded that the DSL is suitable for interacting with the OMNIA platform through its Application Programming Interface (API) and helped demonstrate that other applications on the platform could be modeled using a REA approach.
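    To make the REA core concrete, the following is a minimal, hypothetical Python sketch of the three central concepts (economic resources, events and agents) and the duality link between paired events. The class and field names are invented for illustration; they are not the meta-model, textual DSL or JSON format defined in the thesis.

        # Hypothetical sketch of REA core concepts (Resources, Events, Agents).
        # Names are illustrative only; they are not the OMNIA meta-model or API.
        from dataclasses import dataclass

        @dataclass
        class Agent:
            name: str                  # e.g. a company or a customer

        @dataclass
        class Resource:
            name: str                  # e.g. "Bicycle" or "Cash"
            quantity: float

        @dataclass
        class EconomicEvent:
            kind: str                  # e.g. "Sale" (decrement) or "Payment" (increment)
            resource: Resource
            provider: Agent            # agent giving up the resource
            receiver: Agent            # agent receiving the resource

        @dataclass
        class Exchange:
            """REA duality: a decrement event is paired with a compensating increment."""
            decrement: EconomicEvent
            increment: EconomicEvent

        # Usage: a simple sale paired with its payment.
        shop, customer = Agent("Shop"), Agent("Customer")
        sale = EconomicEvent("Sale", Resource("Bicycle", 1), shop, customer)
        payment = EconomicEvent("Payment", Resource("Cash", 250.0), customer, shop)
        deal = Exchange(decrement=sale, increment=payment)
        print(deal.decrement.resource.name, "exchanged for", deal.increment.resource.name)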

    DESP-C++: a discrete-event simulation package for C++


    Finding a suitable performance testing tool

    Abstract. The pursuit of finding the most suitable testing software for each project is a difficult task, as in the field of stress and load testing there are many tools that are effective at finding certain kinds of problems but completely miss others. A silver bullet solving all problems in a cost-effective and reliable way has not yet been found. This project was carried out as a systematic literature review to find whether there are documented solutions capable of testing everything in a cost-effective way. The document starts with an introduction to the task, originating from a real software testing company's suggestion of finding suitable test software that can, cost-effectively and reliably, fulfil the needs of the company. A history section describes why testing is important, the basics of testing and what others have found in their studies of the area. The research method is described in detail, followed by results describing the tools found during the research, divided into sections by license type. The sectioning by license type was chosen for the benefit of testing companies that are interested in further developing the tools found. Findings and answers to the research questions are presented and discussed, followed by possible implications and suggestions for further research for future scholars interested in the matter. The systematic literature review identified a total of 40 different tools during the data extraction process. One complete software system was available commercially, including extensive support and help functions for the customer. A different approach, linking open source and relatively inexpensive pieces of software together to achieve a composite solution, was also identified; this solution included the most common and most popular individual piece of software identified by the study. All tools found were listed and briefly commented on, mainly with information originating from the authors' home pages.

    The use of multilayer network analysis in animal behaviour

    Network analysis has driven key developments in research on animal behaviour by providing quantitative methods to study the social structures of animal groups and populations. A recent formalism, known as multilayer network analysis, has advanced the study of multifaceted networked systems in many disciplines. It offers novel ways to study and quantify animal behaviour as connected 'layers' of interactions. In this article, we review common questions in animal behaviour that can be studied using a multilayer approach, and we link these questions to specific analyses. We outline the types of behavioural data and questions that may be suitable to study using multilayer network analysis. We detail several multilayer methods, which can provide new insights into questions about animal sociality at individual, group, population, and evolutionary levels of organisation. We give examples of how to implement multilayer methods to demonstrate how taking a multilayer approach can alter inferences about social structure and the positions of individuals within such a structure. Finally, we discuss caveats to undertaking multilayer network analysis in the study of animal social networks, and we call attention to methodological challenges for the application of these approaches. Our aim is to instigate the study of new questions about animal sociality using the new toolbox of multilayer network analysis. Comment: Thoroughly revised; title changed slightly
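    As a toy illustration of how layers can change inferences, the hypothetical Python sketch below builds two interaction layers (say, grooming and foraging associations) over the same individuals with networkx and compares each individual's per-layer degree with its aggregate degree across layers. The individuals, layer names and edges are invented.

        # Hypothetical sketch: represent a two-layer social network as one networkx
        # graph per layer and compare per-layer degree with aggregate degree.
        import networkx as nx

        layers = {"grooming": nx.Graph(), "foraging": nx.Graph()}
        layers["grooming"].add_edges_from([("A", "B"), ("B", "C"), ("C", "D")])
        layers["foraging"].add_edges_from([("A", "C"), ("A", "D"), ("B", "D")])

        individuals = set().union(*(g.nodes for g in layers.values()))

        for ind in sorted(individuals):
            per_layer = {name: (g.degree(ind) if ind in g else 0) for name, g in layers.items()}
            aggregate = sum(per_layer.values())   # simple multilayer summary: summed degree
            print(f"{ind}: per-layer degree {per_layer}, aggregate degree {aggregate}")

    An individual that looks peripheral in a single layer can appear much more central once layers are combined, which is the kind of shift in inference about social structure that the review highlights.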