
    On The Foundations of Digital Games

    Computers have led to a revolution in the games we play, and, following this, an interest in computer-based games has been sparked in research communities. However, this easily leads to the perception of a one-way direction of influence between the field of game research and computer science. This historical investigation points towards a deep and intertwined relationship between research on games and the development of computers, giving a richer picture of both fields. In doing so, it presents an overview of early game research and argues that the distinction between digital games and non-digital games may be counter-productive to game research as a whole.

    A unifying mathematical definition enables the theoretical study of the algorithmic class of particle methods.

    Mathematical definitions provide a precise, unambiguous way to formulate concepts. They also provide a common language between disciplines. Thus, they are the basis for a well-founded scientific discussion. In addition, mathematical definitions allow for deeper insights into the defined subject based on mathematical theorems that are incontrovertible under the given definition. Besides their value in mathematics, mathematical definitions are indispensable in other sciences like physics, chemistry, and computer science. In computer science, they help to derive the expected behavior of a computer program and provide guidance for the design and testing of software. Therefore, mathematical definitions can be used to design and implement advanced algorithms. One class of widely used algorithms in computer science is the class of particle-based algorithms, also known as particle methods. Particle methods can solve complex problems in various fields, such as fluid dynamics, plasma physics, or granular flows, using diverse simulation methods, including Discrete Element Methods (DEM), Molecular Dynamics (MD), Reproducing Kernel Particle Methods (RKPM), Particle Strength Exchange (PSE), and Smoothed Particle Hydrodynamics (SPH). Despite the increasing use of particle methods driven by improved computing performance, the relation between these algorithms remains formally unclear. In particular, particle methods lack a unifying mathematical definition and precisely defined terminology. This prevents the determination of whether an algorithm belongs to the class and of what distinguishes the class. Here we present a rigorous mathematical definition of particle methods and demonstrate its importance by applying it to several canonical algorithms, as well as to algorithms not previously recognized as particle methods. Furthermore, we base proofs of theorems about parallelizability and computational power on it and use it to develop scientific computing software. Our definition unifies, for the first time, the hitherto loosely connected notions of particle methods.
    Thus, it marks the necessary starting point for a broad range of joint formal investigations and applications across fields.

    Contents:
    1 Introduction
      1.1 The Role of Mathematical Definitions
      1.2 Particle Methods
      1.3 Scope and Contributions of this Thesis
    2 Terminology and Notation
    3 A Formal Definition of Particle Methods
      3.1 Introduction
      3.2 Definition of Particle Methods
        3.2.1 Particle Method Algorithm
        3.2.2 Particle Method Instance
        3.2.3 Particle State Transition Function
      3.3 Explanation of the Definition of Particle Methods
        3.3.1 Illustrative Example
        3.3.2 Explanation of the Particle Method Algorithm
        3.3.3 Explanation of the Particle Method Instance
        3.3.4 Explanation of the State Transition Function
      3.4 Conclusion
    4 Algorithms as Particle Methods
      4.1 Introduction
      4.2 Perfectly Elastic Collision in Arbitrary Dimensions
      4.3 Particle Strength Exchange
      4.4 Smoothed Particle Hydrodynamics
      4.5 Lennard-Jones Molecular Dynamics
      4.6 Triangulation Refinement
      4.7 Conway's Game of Life
      4.8 Gaussian Elimination
      4.9 Conclusion
    5 Parallelizability of Particle Methods
      5.1 Introduction
      5.2 Particle Methods on Shared Memory Systems
        5.2.1 Parallelization Scheme
        5.2.2 Lemmata
        5.2.3 Parallelizability
        5.2.4 Time Complexity
        5.2.5 Application
      5.3 Particle Methods on Distributed Memory Systems
        5.3.1 Parallelization Scheme
        5.3.2 Lemmata
        5.3.3 Parallelizability
        5.3.4 Bounds on Time Complexity and Parallel Scalability
      5.4 Conclusion
    6 Turing Powerfulness and Halting Decidability
      6.1 Introduction
      6.2 Turing Machine
      6.3 Turing Powerfulness of Particle Methods Under a First Set of Constraints
      6.4 Turing Powerfulness of Particle Methods Under a Second Set of Constraints
      6.5 Halting Decidability of Particle Methods
      6.6 Conclusion
    7 Particle Methods as a Basis for Scientific Software Engineering
      7.1 Introduction
      7.2 Design of the Prototype
      7.3 Applications, Comparisons, Convergence Study, and Run-time Evaluations
      7.4 Conclusion
    8 Results, Discussion, Outlook, and Conclusion
      8.1 Problem
      8.2 Results
      8.3 Discussion
      8.4 Outlook
      8.5 Conclusion
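    The abstract does not reproduce the formal definition itself, so the Python sketch below is only a guess at its general shape: a particle method given as an instance (an initial particle population) plus a few user-supplied functions, here a pairwise interaction, a per-particle evolution that may create or destroy particles, and a stopping condition. All names and signatures are illustrative assumptions, not the thesis's notation.

```python
# A minimal sketch of a generic particle-method state transition. The
# interact/evolve/stop decomposition is an assumption made for illustration.

from dataclasses import dataclass
from typing import Callable, List, Tuple

Particle = Tuple[float, float]  # (position, strength) -- illustrative state


@dataclass
class ParticleMethod:
    interact: Callable[[Particle, Particle], Particle]  # update p from q
    evolve: Callable[[Particle], List[Particle]]        # may create/destroy
    stop: Callable[[int, List[Particle]], bool]         # termination test

    def run(self, particles: List[Particle]) -> List[Particle]:
        t = 0
        while not self.stop(t, particles):
            # Interaction step: every particle accumulates contributions
            # from all others (a neighborhood function would restrict this).
            interacted = []
            for i, p in enumerate(particles):
                for j, q in enumerate(particles):
                    if i != j:
                        p = self.interact(p, q)
                interacted.append(p)
            # Evolution step: each particle maps to zero or more particles.
            particles = [c for p in interacted for c in self.evolve(p)]
            t += 1
        return particles


# Example: a trivial diffusion-like exchange between particle strengths.
pm = ParticleMethod(
    interact=lambda p, q: (p[0], p[1] + 0.01 * (q[1] - p[1])),
    evolve=lambda p: [p],              # no creation or destruction here
    stop=lambda t, ps: t >= 100,
)
print(pm.run([(0.0, 1.0), (1.0, 0.0)]))
```

    Under such a reading, PSE, SPH, or molecular dynamics would differ only in the concrete interaction and evolution functions supplied, which is what makes a unifying definition useful.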

    Decoding algorithms using side-effect machines

    Bioinformatics applies computers to problems in molecular biology. Previous research has not addressed edit metric decoders, yet decoders for quaternary edit metric codes are finding use in bioinformatics problems with applications to DNA. By using side effect machines we hope to provide efficient decoding algorithms for this open problem. Two ideas for decoding algorithms are presented and examined. Both decoders use Side Effect Machines (SEMs), which are generalizations of finite state automata. Single Classifier Machines (SCMs) use a single side effect machine to classify all words within a code. Locking Side Effect Machines (LSEMs) use multiple side effect machines to create a tree structure of subclassification. The goal is to examine these techniques and provide new decoders for existing codes. Also presented are ideas for best practices for the creation of these two types of new edit metric decoders.
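    The abstract does not spell out the machinery, but side effect machines are commonly described as finite state automata whose "side effect" is a visit counter per state, so that running a word through the machine yields a feature vector usable for classification. The sketch below follows that reading; the transition table and alphabet are illustrative assumptions.

```python
# A minimal sketch of a side effect machine (SEM) over the DNA alphabet,
# assuming the common formulation: a finite state automaton that counts
# how often each state is visited while reading a word.

from typing import Dict, List


class SideEffectMachine:
    def __init__(self, transitions: Dict[int, Dict[str, int]], start: int = 0):
        self.transitions = transitions  # state -> (symbol -> next state)
        self.start = start

    def features(self, word: str) -> List[int]:
        counts = [0] * len(self.transitions)
        state = self.start
        for symbol in word:             # symbols drawn from "ACGT"
            state = self.transitions[state][symbol]
            counts[state] += 1          # side effect: count state visits
        return counts


# Toy 3-state machine over {A, C, G, T}; the table is arbitrary.
sem = SideEffectMachine({
    0: {"A": 1, "C": 2, "G": 0, "T": 1},
    1: {"A": 0, "C": 1, "G": 2, "T": 2},
    2: {"A": 2, "C": 0, "G": 1, "T": 0},
})
print(sem.features("ACGTACGT"))  # feature vector for one codeword
```

    A single classifier machine would use one such feature vector to classify every word against the whole code; locking SEMs would then arrange several machines into a tree of progressively finer subclassifications.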

    Quantum Cellular Automata Controlled Self-Organizing Networks


    Flexible high performance agent based modelling on graphics card hardware

    Agent Based Modelling is a technique for computational simulation of complex interacting systems, through the specification of the behaviour of a number of autonomous individuals acting simultaneously. This is a bottom-up approach, in contrast with the top-down one of modelling the behaviour of the whole system through dynamic mathematical equations. The focus on individuals is considerably more computationally demanding, but provides a natural and flexible environment for studying systems demonstrating emergent behaviour. Despite the obvious parallelism, traditional frameworks for Agent Based Modelling fail to exploit it and are often based on highly serialised mobile discrete agents. Such an approach has serious implications, placing stringent limitations on both the scale of models and the speed at which they may be simulated. Serial simulation frameworks are also unable to exploit multiple processor architectures, which have become essential in improving overall processing speed. This thesis demonstrates that it is possible to use the parallelism of graphics card hardware as a mechanism for high performance Agent Based Modelling. Such an approach is in contrast with alternative high performance architectures, such as distributed grids and specialist computing clusters, and is considerably more cost effective. The use of consumer hardware makes the techniques described available to a wide range of users, and the use of automatically generated simulation code abstracts the process of mapping algorithms to the specialist hardware. This approach avoids the steep learning curve associated with the graphics card hardware's data-parallel architecture, which has previously limited the uptake of this emerging technology. The performance and flexibility of this approach are considered through the use of benchmarking and case studies. The resulting speedup and locality of agent data within the graphics processor also allow real-time visualisation of computationally demanding, high-population models.
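    As a rough illustration of the data-parallel style that such frameworks map onto graphics hardware, the NumPy sketch below stores agent state as arrays and applies one update rule to the whole population at once, rather than iterating over serialised agent objects. The model, parameters, and array layout are illustrative assumptions, not the thesis's framework.

```python
# A minimal data-parallel agent update: state lives in arrays (structure of
# arrays), and one kernel-like rule updates every agent simultaneously.

import numpy as np

rng = np.random.default_rng(0)
n = 100_000                                  # population size
pos = rng.uniform(0.0, 1.0, size=(n, 2))    # agent positions
vel = rng.normal(0.0, 0.01, size=(n, 2))    # agent velocities


def step(pos, vel, dt=1.0):
    # The same rule is applied to all agents at once; on a GPU this maps
    # naturally to one thread per agent.
    pos = (pos + dt * vel) % 1.0             # move on a periodic unit square
    vel = 0.99 * vel + rng.normal(0.0, 1e-3, size=vel.shape)  # damped drift
    return pos, vel


for _ in range(100):
    pos, vel = step(pos, vel)
```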

    Simulation techniques in an artificial society model

    Artificial society refers to a generic class of agent-based simulation models used to discover global social structures and collective behavior produced by simple local rules and interaction mechanisms. Artificial society models are applicable in a variety of disciplines, including the modeling of chemical and biological processes, natural phenomena, and complex adaptive systems. We focus on the underlying simulation techniques used in artificial society discrete-event simulation models, including model time evolution and computational performance.

    Although for some applications synchronous time evolution is the correct modeling approach, many other applications are better represented using asynchronous time evolution. We claim that asynchronous time evolution can eliminate potential simulation artifacts produced using synchronous time evolution. Using an adaptation of a popular artificial society model, we show that very different output can result based solely on the choice of asynchronous or synchronous time evolution. Depending on the event list implementation chosen, the use of discrete-event simulation to incorporate asynchronous time evolution can incur a substantial loss in computational performance. Accordingly, we evaluate select event list implementations within the artificial society simulation model and demonstrate that acceptable performance can be achieved.

    In addition to the artificial society model, we show that transforming from a synchronous to an asynchronous system proves beneficial for scheduling resources in a parallel system. We focus on non-FCFS job scheduling policies that permit jobs to backfill, i.e., to move ahead in the queue, provided that they do not delay certain previously submitted jobs. Instead of using a single queue of jobs, we propose a simple yet effective backfilling scheduling policy that effectively separates short from long jobs by incorporating multiple queues. By monitoring system performance, our policy adapts its configuration parameters in response to severe changes in the job arrival pattern and/or resource demands. Detailed performance comparisons via simulation using actual parallel workload traces indicate that our proposed policy consistently outperforms traditional backfilling in a variety of contexts.
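    The following sketch illustrates the synchronous/asynchronous distinction the abstract turns on, using a toy binary-agent majority rule rather than the thesis's actual model: under synchronous evolution every agent reads the old state and all updates land at once, while under asynchronous evolution agents update one at a time and see each other's newest states.

```python
# Synchronous vs. asynchronous time evolution on a toy grid of binary
# agents under a majority rule; the model itself is illustrative only.

import numpy as np

rng = np.random.default_rng(1)


def neighbors_majority(grid, i, j):
    n, m = grid.shape
    s = sum(grid[(i + di) % n, (j + dj) % m]
            for di in (-1, 0, 1) for dj in (-1, 0, 1)
            if (di, dj) != (0, 0))
    return 1 if s >= 4 else 0


def step_synchronous(grid):
    # All agents read the *old* state; updates take effect simultaneously.
    new = grid.copy()
    for i in range(grid.shape[0]):
        for j in range(grid.shape[1]):
            new[i, j] = neighbors_majority(grid, i, j)
    return new


def step_asynchronous(grid):
    # Agents update one at a time, in random order, seeing the newest state;
    # a discrete-event simulator generalizes this with per-agent event times.
    cells = [(i, j) for i in range(grid.shape[0]) for j in range(grid.shape[1])]
    for k in rng.permutation(len(cells)):
        i, j = cells[k]
        grid[i, j] = neighbors_majority(grid, i, j)
    return grid


grid = rng.integers(0, 2, size=(32, 32))
print(step_synchronous(grid.copy()).sum(), step_asynchronous(grid.copy()).sum())
```

    Even on this toy rule the two schemes generally produce different trajectories from the same initial state, which is the kind of artifact sensitivity the abstract describes.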

    Stochastic models for biological evolution

    In this work, we deal with the problem of creating a model that describes a population of agents undergoing Darwinian Evolution, taking into account the basic phenomena of this process. According to the principles of evolutionary biology, Evolution occurs if there is selection and adaptation of phenotypes, mutation of genotypes, and the presence of physical space. The evolution of a biological population is then described by a system of stochastic ordinary differential equations; the basic model of dynamics represents the trend of a population divided into different types, with relative frequencies in a simplex. The law governing this dynamics is called Replicator Dynamics: the growth rate of type k is measured in terms of evolutionary advantage, i.e., its own fitness compared to the average fitness in the population. The replicator dynamics model turns into a stochastic process when we consider random mutations that can transform fractions of individuals of one type into another. The two main forces of Evolution, selection and mutation, act on different layers: the environment acts on the phenotype, selecting the fittest, while the randomness of the mutations affects the genotype. This difference is underlined in the model, where each genotype expresses a phenotype, and fitness influences emerging traits not explicitly encoded in genotypes. The presence of a potentially infinite space of available genomes ensures that variants of individuals with characteristics never seen before can be generated. In conclusion, numerical simulations are provided for some applications of the model, such as a variation of Conway's Game of Life.
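    For concreteness, deterministic replicator dynamics takes the form dx_k/dt = x_k (f_k - <f>), where x_k is the frequency of type k and <f> is the population-average fitness. The sketch below adds random mutation as simple noise via an Euler-Maruyama step; the fitness values, noise model, and projection back onto the simplex are illustrative assumptions, not the thesis's exact formulation.

```python
# A minimal Euler-Maruyama sketch of replicator dynamics with random
# mutation: type k grows at rate x_k * (f_k - mean fitness), plus noise.

import numpy as np

rng = np.random.default_rng(2)
f = np.array([1.0, 1.2, 0.9])      # fitness of each type (assumed values)
x = np.array([0.4, 0.3, 0.3])      # initial frequencies on the simplex
dt, sigma = 0.01, 0.02             # time step and mutation noise scale

for _ in range(5000):
    mean_fitness = x @ f
    drift = x * (f - mean_fitness)               # replicator dynamics
    noise = sigma * np.sqrt(dt) * rng.normal(size=x.size)
    x = np.clip(x + dt * drift + noise, 0.0, None)
    x /= x.sum()                                  # project back onto simplex

print(x)   # frequencies after selection plus random mutation
```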

    Organic information design

    Thesis (S.M.)--Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2000. Includes bibliographical references (p. 93-94).

    Design techniques for static information are well understood, their descriptions and discourse thorough and well-evolved. But these techniques fail when dynamic information is considered. There is a space of highly complex systems for which we lack deep understanding because few techniques exist for visualization of data whose structure and content are continually changing. To approach these problems, this thesis introduces a visualization process titled Organic Information Design. The resulting systems employ simulated organic properties in an interactive, visually refined environment to glean qualitative facts from large bodies of quantitative data generated by dynamic information sources.

    Benjamin Jotham Fry. S.M.
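    The abstract stays at the level of intent, but one common example of a "simulated organic property" in such visualizations is a spring-and-repulsion layout that lets the display settle into shape as the underlying data changes. The sketch below is a generic stand-in of that kind, not the system built in the thesis.

```python
# A minimal spring-and-repulsion layout: linked nodes attract, all nodes
# repel, and the arrangement relaxes organically over repeated steps.

import numpy as np

rng = np.random.default_rng(3)
n = 30
pos = rng.uniform(-1, 1, size=(n, 2))         # node positions
edges = [(i, (i + 1) % n) for i in range(n)]  # a ring of related items


def relax(pos, k_spring=0.05, k_repel=0.002, steps=200):
    for _ in range(steps):
        force = np.zeros_like(pos)
        for i, j in edges:                    # springs pull linked nodes
            d = pos[j] - pos[i]
            force[i] += k_spring * d
            force[j] -= k_spring * d
        for i in range(len(pos)):             # all pairs repel gently
            d = pos - pos[i]
            dist2 = (d ** 2).sum(axis=1) + 1e-6
            force[i] -= (k_repel * d / dist2[:, None]).sum(axis=0)
        pos = pos + force
    return pos


pos = relax(pos)    # in a live system, each data update would re-relax
```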

    Stochastic neural networks

    Artificial neural networks are brain-like models of parallel computation and cognitive phenomena. We sample some basic results about neural networks as they relate to stochastic and statistical processes. Given the explosive amount of material, only models bearing a stochastic component in their function or analysis are presented, such as Hopfield and feedforward nets, Boltzmann machines, and some recurrent networks. Basic learning algorithms such as backpropagation and gradient descent are sketched. A handful of applications (associative memories, pattern recognition, time series forecasting) are described. Finally, some current trends in the field are discussed.
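    As one concrete instance of a model bearing a stochastic component in its function, the sketch below implements a Hopfield network with Hebbian weights and stochastic (Glauber-style) asynchronous updates, which approaches the deterministic network as the temperature goes to zero. Pattern sizes and the temperature value are illustrative choices.

```python
# A minimal stochastic Hopfield network: Hebbian weights store patterns,
# and units flip with a sigmoid probability of their local field.

import numpy as np

rng = np.random.default_rng(4)
patterns = rng.choice([-1, 1], size=(3, 64))           # stored memories
W = (patterns.T @ patterns) / patterns.shape[1]        # Hebbian learning
np.fill_diagonal(W, 0.0)                               # no self-coupling


def recall(state, T=0.3, sweeps=20):
    for _ in range(sweeps):
        for i in rng.permutation(len(state)):          # asynchronous update
            h = W[i] @ state                           # local field
            p_up = 1.0 / (1.0 + np.exp(-2.0 * h / T))  # flip probability
            state[i] = 1 if rng.random() < p_up else -1
    return state


noisy = patterns[0] * rng.choice([1, 1, 1, -1], size=64)   # corrupt ~25%
print((recall(noisy.copy()) == patterns[0]).mean())        # overlap with memory
```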