Leveraging simulation practice in industry through use of desktop grid middleware
This chapter focuses on the collaborative use of computing resources to support decision making in industry. Through the use of middleware for desktop grid computing, the idle CPU cycles available on existing computing resources can be harvested and used to speed up the execution of applications that have "non-trivial" processing requirements. The chapter focuses on the desktop grid middleware BOINC and Condor, and discusses the integration of commercial simulation software with free-to-download grid middleware so as to offer competitive advantage to organizations that opt for this technology. It is expected that the low-intervention integration approach presented in this chapter (meaning no changes to source code are required) will appeal both to simulation practitioners (as simulations can be executed faster, which in turn means that more replications and optimization runs are possible in the same amount of time) and to management (as it can potentially increase the return on investment on existing resources).
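As an illustration of how an off-the-shelf simulation executable can be farmed out with no source-code changes, the HTCondor submit description below queues 50 independent replications of one model. The executable name, arguments, and input file are assumptions made for this sketch, not taken from the chapter.

```
# Illustrative HTCondor submit description: 50 replications of one model.
universe   = vanilla
executable = run_model.sh           # wrapper around the simulation package
arguments  = --replication $(Process)
transfer_input_files = model.cfg
output     = rep_$(Process).out
error      = rep_$(Process).err
log        = experiment.log
queue 50
```

Submitted with `condor_submit`, each queued job receives a distinct `$(Process)` number (0 to 49), which the wrapper script can use to seed its replication.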
A survey of simulation techniques in commerce and defence
Despite the developments in Modelling and Simulation (M&S) tools and techniques over the past years, there has been a gap in M&S research and practice in healthcare on developing a toolkit to assist modellers and simulation practitioners with selecting an appropriate set of techniques. This study is a preliminary step towards this goal. This paper presents some results from a systematic literature survey on applications of M&S in the commerce and defence domains that could inspire some improvements in healthcare. Interim results show that in the commercial sector Discrete-Event Simulation (DES) has been the most widely used technique, with System Dynamics (SD) in second place. However, in the defence sector, SD has gained relatively more attention. SD has been found quite useful for qualitative and soft-factors analysis. From both surveys it becomes clear that there is a growing trend towards using hybrid M&S approaches.
Towards a simulation interoperability framework between an agent-based simulator and a BPMN engine using REST protocol
The current paradigm of a business process model is that it is a representation of a sequence of tasks that act on a data input to produce an output, with the aim of producing a new service or product. Although this is a valid way of interpreting a business process, it does not consider in detail the influence of external phenomena, for example human behaviour, communication, social interactions, and the organisational culture, which can have a significant effect on the efficiency of a business process.
As the dynamics of these external phenomena are non-linear, they can be interpreted as a complex system, that is, a system that behaves in such a way that it cannot be explained simply by looking at the behaviour of its individual parts. This holistic way of thinking about business processes opens the door to the possibility of combining different simulation methods to model the different aspects that influence a process.
Agent-based simulation (ABS) and BPMN are chosen as the two simulation methods for studying the potential of this integration in business processes, and our approach to combining them consists of modelling user behaviour in ABS and the business process itself in BPMN. Finally, the integration between the two simulation engines takes place during the course of the simulation through API invocations using the REST protocol, with the agents controlling the execution dynamics of the process in BPMN. This integration approach is validated by constructing an experiment whose objective is to determine whether the simulation results obtained are statistically coherent.
The current paradigm of a business process model is that it is a representation
of a sequence of tasks that act upon some data input, to produce an output,
with the aim of producing a new service or product to be delivered from a producer to a customer. Although this is a valid way of thinking, it neglects to consider in enough detail the influence of certain phenomena on inputs, e.g. human behaviour, communication, social interactions, and the organisational culture, which can have a significant effect on the output delivered by a business process. As the dynamics of these phenomena are non-linear, they can be interpreted as a complex system. This holistic way of thinking about business processes opens the door to the possibility of combining different simulation methods to model the different aspects that influence a process. A BPMN engine and an agent-based simulation (ABS) engine are chosen to serve as the basis of our framework. In its conception, we not only consider the technical aspects of the framework but also explore its management and organizational dimensions, with the intent of facilitating its adoption in enterprises as a tool to support decision making. We analyse how accurate the simulation results can be when using these two tools, as well as what considerations need to be taken into account within organizations.
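The agent-to-engine integration described above, in which agents invoke the BPMN engine's API over REST during the simulation run, can be sketched roughly as follows. The endpoint path, process key, and payload shape are illustrative assumptions; a real engine such as Camunda exposes similar but not identical routes.

```python
import json
import urllib.request

class BpmnClient:
    """Minimal REST client an agent can use to drive a BPMN engine.

    The endpoint path and payload shape below are hypothetical; real
    engines (e.g. Camunda) expose similar but not identical routes.
    """

    def __init__(self, base_url, opener=None):
        self.base_url = base_url.rstrip("/")
        # The opener is injectable so the client can be exercised offline.
        self._open = opener or urllib.request.urlopen

    def start_process(self, process_key, variables):
        """POST the process variables and return the engine's JSON reply."""
        payload = json.dumps({"variables": variables}).encode()
        request = urllib.request.Request(
            f"{self.base_url}/process/{process_key}/start",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with self._open(request) as response:
            return json.load(response)

class Agent:
    """An ABS agent whose action is to launch a business-process instance."""

    def __init__(self, name, client):
        self.name = name
        self.client = client

    def act(self):
        return self.client.start_process("order_handling", {"agent": self.name})
```

Injecting the HTTP opener keeps the agent logic testable without a running engine, which also matters when many agents drive process instances concurrently during a simulation run.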
A systematic methodology to analyse the performance and design configurations of business interoperability in cooperative industrial networks
This thesis proposes a methodology for modelling business interoperability in a context of cooperative industrial networks. The purpose is to develop a methodology that enables the design of cooperative industrial network platforms that are able to deliver business interoperability and the analysis of its impact on the performance of these platforms. To achieve the proposed objective, two modelling tools have been employed: the Axiomatic Design Theory for the design of interoperable platforms; and Agent-Based Simulation for the analysis of the impact of business interoperability. The sequence of the application of the two modelling tools depends on the scenario under analysis, i.e. whether the cooperative industrial network platform exists or not. If the cooperative industrial network platform does not exist, the methodology suggests first the application of the Axiomatic Design Theory to design different configurations of interoperable cooperative industrial network platforms, and then the use of Agent-Based Simulation to analyse or predict the business interoperability and operational performance of the designed configurations. Otherwise, one should start by analysing the performance of the existing platform and based on the achieved results, decide whether it is necessary to redesign it or not. If the redesign is needed, simulation is once again used to predict the performance of the redesigned platform. To explain how those two modelling tools can be applied in practice, a theoretical modelling framework, a theoretical Axiomatic Design model and a theoretical Agent-Based Simulation model are proposed. To demonstrate the applicability of the proposed methodology and/or to validate the proposed theoretical models, a case study regarding a Portuguese Reverse Logistics cooperative network (Valorpneu network) and a case study regarding a Portuguese construction project (Dam Baixo Sabor network) are presented. 
The findings of the application of the proposed methodology to these two case studies suggest that the Axiomatic Design Theory can indeed contribute effectively to the design of interoperable cooperative industrial network platforms and that Agent-Based Simulation provides an effective set of tools for analysing the impact of business interoperability on the performance of those platforms. However, these conclusions cannot be generalised, as only two case studies have been carried out. In terms of relevance to theory, this is the first time that the network effect is addressed in the analysis of the impact of business interoperability on the performance of networked companies, and also the first time that a holistic approach is proposed to design interoperable cooperative industrial network platforms. Regarding the practical implications, the proposed methodology is intended to provide industrial managers with a management tool that can guide them easily, in a practical and systematic way, through the design of configurations of interoperable cooperative industrial network platforms and/or the analysis of the impact of business interoperability on the performance of their companies and the networks in which their companies operate.
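To make the Axiomatic Design step concrete: the Independence Axiom is commonly checked by classifying the design matrix that maps functional requirements (FRs) to design parameters (DPs). The sketch below is a generic illustration of that classification, not code from the thesis.

```python
def classify_design(matrix):
    """Classify a square Axiomatic Design design matrix (rows = FRs,
    columns = DPs); a truthy entry means that DP affects that FR.

    Uncoupled (diagonal) satisfies the Independence Axiom outright,
    decoupled (triangular) satisfies it for a suitable DP ordering,
    and coupled designs violate it.
    """
    n = len(matrix)
    off_diagonal = [(i, j) for i in range(n) for j in range(n)
                    if i != j and matrix[i][j]]
    if not off_diagonal:
        return "uncoupled"
    if all(j < i for i, j in off_diagonal) or all(j > i for i, j in off_diagonal):
        return "decoupled"
    return "coupled"
```

In the methodology's terms, a platform configuration whose matrix comes out "coupled" is a candidate for redesign before its performance is analysed by simulation.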
A Framework For Workforce Management An Agent Based Simulation Approach
In today's advanced-technology world, enterprises are in a constant state of competition. As the intensity of competition increases, the need to continuously improve organizational performance has never been greater. Managers at all levels must be on a constant quest to find ways to maximize their enterprises' strategic resources. Enterprises can develop sustained competitiveness only if their activities create value in unique ways. There should be an emphasis on transferring this competitiveness to the resources the enterprise has on hand and the resources it can develop for use in this environment. The significance of human capital is even greater now, as the intangible value and the tacit knowledge of enterprises' resources should be strategically managed to achieve a greater level of continuous organizational success. This research effort seeks to provide managers with means for accurate decision making in their workforce management. A framework for modeling and managing human capital to achieve effective workforce planning strategies is built to assist enterprises in their long-term strategic organizational goals.
High Speed Simulation Analytics
Simulation, especially Discrete-Event Simulation (DES) and Agent-Based Simulation (ABS), is widely used in industry to support decision making. It is used to create predictive models or Digital Twins of systems, which are used to analyse what-if scenarios, perform sensitivity analytics on data and decisions, and even optimise the impact of decisions. Simulation-based Analytics, or just Simulation Analytics, therefore has a major role to play in Industry 4.0. However, a major issue in Simulation Analytics is speed. The extensive, continuous experimentation demanded by Industry 4.0 can take a significant time, especially if many replications are required. This is compounded by detailed models, as these can take a long time to simulate. Distributed Simulation (DS) techniques use multiple computers either to speed up the simulation of a single model by splitting it across the computers and/or to speed up experimentation by running experiments across multiple computers in parallel. This chapter discusses how DS and Simulation Analytics, as well as concepts from contemporary e-Science, can be combined to address the speed problem by creating a new approach called High Speed Simulation Analytics. We present a vision of High Speed Simulation Analytics to show how it might be integrated with the future of Industry 4.0.
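The experiment-level parallelism described above can be illustrated with a toy example: a small queueing-style model whose independent replications are farmed out across local CPU cores. The model, rates, and run length are invented for illustration; real DS deployments distribute work across machines rather than cores, but the idea of parallelising replications is the same.

```python
import random
from multiprocessing import Pool

def run_replication(seed):
    """One replication of a toy single-server queue (Lindley recursion).

    Rates and run length are invented for illustration; returns the
    mean waiting time observed over the run.
    """
    rng = random.Random(seed)
    wait, total = 0.0, 0.0
    for _ in range(1000):
        interarrival = rng.expovariate(1.2)  # arrival rate 1.2
        service = rng.expovariate(1.5)       # service rate 1.5
        wait = max(0.0, wait + service - interarrival)
        total += wait
    return total / 1000

def run_experiment(replications=8):
    """Farm the replications out across local CPU cores and average them."""
    with Pool() as pool:
        results = pool.map(run_replication, range(replications))
    return sum(results) / len(results)
```

Because each replication is seeded independently, the replications are embarrassingly parallel, which is exactly the property that lets DS speed up experimentation by adding computers.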
A new MDA-SOA based framework for intercloud interoperability
Cloud computing has been one of the most important topics in Information Technology, aiming to assure scalable and reliable on-demand services over the Internet. Expanding the application scope of cloud services would require cooperation between clouds from different providers that have heterogeneous functionalities. This collaboration between different cloud vendors can provide better Quality of Service (QoS) at a lower price. However, current cloud systems have been developed without concern for seamless cloud interconnection, and they do not actually support intercloud interoperability to enable collaboration between cloud service providers. Hence, this PhD work is motivated to address the interoperability issue between cloud providers as a challenging research objective.
This thesis proposes a new framework which supports inter-cloud interoperability in a heterogeneous computing resource cloud environment with the goal of dispatching the workload to the most effective clouds available at runtime.
Analysing the different methodologies that have been applied to resolve various problem scenarios related to interoperability led us to exploit Model Driven Architecture (MDA) and Service Oriented Architecture (SOA) methods as appropriate approaches for our inter-cloud framework. Moreover, since distributing the operations in a cloud-based environment is a nondeterministic polynomial time (NP-complete) problem, a Genetic Algorithm (GA) based job scheduler is proposed as part of the interoperability framework, offering workload migration with the best performance at the least cost. A new Agent Based Simulation (ABS) approach is proposed to model the inter-cloud environment with three types of agents: Cloud Subscriber agent, Cloud Provider agent, and Job agent. The ABS model is used to evaluate the proposed framework.
Funding: Fundação para a Ciência e a Tecnologia (FCT) (grant reference: SFRH / BD / 33965 / 2009) and the EC 7th Framework Programme under grant agreement n° 604674 FITMAN (http://www.fitman-fi.eu).
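The GA-based job scheduler can be illustrated with a deliberately small sketch that evolves job-to-cloud assignments against a cost matrix. The chromosome representation, operators, and parameters here are generic GA choices made for illustration, not the thesis's actual scheduler.

```python
import random

def schedule_jobs(costs, generations=200, pop_size=30, seed=42):
    """Toy GA for intercloud job dispatch.

    costs[j][c] is the (illustrative) cost of running job j on cloud c.
    A chromosome assigns one cloud index to each job; fitness is total cost.
    """
    rng = random.Random(seed)
    n_jobs, n_clouds = len(costs), len(costs[0])

    def total_cost(assignment):
        return sum(costs[j][c] for j, c in enumerate(assignment))

    # Random initial population of job-to-cloud assignments.
    pop = [[rng.randrange(n_clouds) for _ in range(n_jobs)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=total_cost)          # cheapest first
        survivors = pop[:pop_size // 2]   # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_jobs) if n_jobs > 1 else 0
            child = a[:cut] + b[cut:]     # one-point crossover
            if rng.random() < 0.1:        # occasional mutation
                child[rng.randrange(n_jobs)] = rng.randrange(n_clouds)
            children.append(child)
        pop = survivors + children
    return min(pop, key=total_cost)
```

Because the assignment problem is NP-complete, a GA of this shape trades optimality guarantees for good solutions in bounded time, which is why it suits runtime workload dispatch.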
Second Workshop on Modelling of Objects, Components and Agents
This report contains the proceedings of the workshop Modelling of Objects, Components, and Agents (MOCA'02), August 26-27, 2002. The workshop is organized by the 'Coloured Petri Net' Group at the University of Aarhus, Denmark, and the 'Theoretical Foundations of Computer Science' Group at the University of Hamburg, Germany. The homepage of the workshop is: http://www.daimi.au.dk/CPnets/workshop02