100 research outputs found

    Multi-agent based simulation of self-governing knowledge commons

    The potential of user-generated sensor data for participatory sensing has motivated the formation of organisations focused on exploiting the collected information and the associated knowledge. Given the power and value of both the raw data and the derived knowledge, we advocate an open approach to data and intellectual-property rights. By treating user-generated content, as well as derived information and knowledge, as a common-pool resource, we hypothesise that all participants can be compensated fairly for their input. To test this hypothesis, we undertake an extensive review of experimental, commercial and social participatory-sensing applications, from which we identify that a decentralised, community-oriented governance model is required to support this open approach. We show that the Institutional Analysis and Design framework introduced by Elinor Ostrom, in conjunction with a framework for self-organising electronic institutions, can provide both an architectural and an algorithmic basis for the necessary governance model, in terms of operational and collective-choice rules specified in computational logic. As a basis for understanding the effect of governance on these applications, we develop a testbed which joins our logical formulation of the knowledge commons with a generic model of the participatory-sensing problem. This requires a multi-agent platform for the simulation of autonomous and dynamic agents, and a method of executing the logical calculus in which our electronic institution is specified. To this end, we first develop a general-purpose, high-performance platform for multi-agent based simulation, Presage2. Second, we propose a method for translating event-calculus axioms into rules compatible with business-rule engines, and provide an implementation for JBoss Drools along with a suite of modules for electronic institutions. Through our simulations we show that, when building electronic institutions for managing participatory sensing as a knowledge commons, proper enfranchisement of agents (as outlined in Ostrom's work) is key to striking a balance between endurance, fairness and the reduction of greedy behaviour. We conclude with a set of guidelines for engineering knowledge commons for the next generation of participatory-sensing applications.
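
    The translation from event-calculus axioms to forward-chaining rules can be pictured with a small sketch. The following is a minimal illustration only, not the paper's actual Drools implementation: the Event class, the fluent names and the INITIATES/TERMINATES tables are all invented for the example; the real system compiles such axioms into JBoss Drools rules.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Event:
            name: str
            agent: str

        # Hypothetical domain axioms: initiates(allocate, has_resource) and
        # terminates(revoke, has_resource), both invented for this sketch.
        INITIATES = {"allocate": "has_resource"}
        TERMINATES = {"revoke": "has_resource"}

        def step(fluents, event):
            """Fire the update rules for one event, as a rule engine would."""
            updated = set(fluents)
            if event.name in INITIATES:
                updated.add((INITIATES[event.name], event.agent))
            if event.name in TERMINATES:
                updated.discard((TERMINATES[event.name], event.agent))
            return updated

        # holdsAt(F, T) then corresponds to F being in the state after
        # replaying all events up to time T.
        state = set()
        for ev in (Event("allocate", "a1"), Event("revoke", "a1"), Event("allocate", "a2")):
            state = step(state, ev)
        print(state)  # {('has_resource', 'a2')}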

    So You Think You Can Model? A Guide to Building and Evaluating Archaeological Simulation Models of Dispersals

    With the current surge of simulation studies in archaeology there is growing concern about the lack of engagement and feedback between modellers and domain specialists. To facilitate this dialogue, I present a compact guide to the simulation-modelling process applied to a common research topic and the focus of this special issue of Human Biology: human dispersals. The process of developing a simulation is divided into nine steps grouped into three phases. The conceptual phase consists of identifying research questions (step 1), finding the most suitable method (step 2), designing the general framework and the resolution of the simulation (step 3), and then filling in that framework with the modelled entities and the rules of their interactions (step 4). This is followed by the technical phase of coding and testing (step 5), parameterising the simulation (step 6) and running it (step 7). In the final phase, the results of the simulation are analysed and re-contextualised (step 8) and the findings of the model are disseminated in publications and code repositories (step 9). Each step is defined and characterised, and then illustrated with examples of published human-dispersal simulation studies. While not aiming to be a comprehensive textbook-style guide to simulation, this overview of the process of modelling human dispersals should arm any non-modeller with enough understanding to evaluate the quality, strengths and weaknesses of any particular archaeological simulation, and provide a starting point for further exploration of this common scientific tool.
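
    To make the steps concrete, here is a deliberately toy dispersal model of the kind steps 3 through 8 would produce. Every entity, rule and parameter below (N_AGENTS, BIAS, N_STEPS, the 1-D landscape) is invented for illustration and stands in for the domain-specific choices the guide discusses.

        import random

        N_AGENTS = 100   # step 6: parameter choices; values invented here
        BIAS = 0.6       # probability of moving outward on each tick
        N_STEPS = 500    # step 7: run length

        random.seed(42)  # a fixed seed keeps the run reproducible for step 8

        positions = [0] * N_AGENTS   # steps 3-4: entities on a 1-D landscape
        for _ in range(N_STEPS):
            for i in range(N_AGENTS):
                # step 4: rule of interaction, a biased random step
                positions[i] += 1 if random.random() < BIAS else -1

        # step 8: a first-pass summary to compare against the empirical record
        print("mean front position:", sum(positions) / N_AGENTS)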

    A Distributed and Agent-Based Approach for Coupled Problems in Computational Engineering (Ein verteilter und agentenbasierter Ansatz für gekoppelte Probleme der rechnergestützten Ingenieurwissenschaften)

    Challenging questions in science and engineering often require decoupling a complex problem and focusing on isolated sub-problems first. The knowledge of those individual solutions can later be combined to obtain the result for the full question. A similar technique is applied in numerical modelling, where software solvers for subsets of the coupled problem might already exist and can be used directly. This thesis describes a software environment capable of combining multiple software solvers, the result being a new, combined model. Two design decisions were crucial at the outset: first, every sub-model keeps full control of its own execution; second, the source code of a sub-model requires only minimal adaptation. To achieve this, each sub-model is wrapped in a software agent and continues to behave as an independent program. The sub-models themselves choose when to issue communication calls, with no outer synchronisation mechanism required. The coupling of heterogeneous hardware is supported, as is the use of homogeneous compute clusters. Furthermore, the coupling framework allows sub-solvers to be written in different programming languages, and each sub-model may operate on its own spatial and temporal scales. The next challenge was to allow the coupling of thousands of software agents, in order to utilise today's petascale hardware. For this purpose, a new coupling framework was designed and implemented, combining the experience from the previous work with the additions required to cope with the targeted number of coupled sub-models. The large number of interacting models required a much more dynamic approach, in which the agents automatically detect their communication partners at runtime. This eliminates the need to specify the coupling graph explicitly a priori. Agents may enter (and leave) the simulation at any time, with the coupling graph changing accordingly.
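
    The runtime discovery of communication partners can be sketched as a simple broker with which agents register the data fields they publish. Everything below (the Broker class, its method names, the solver and field names) is an assumption for illustration, not the framework's actual API.

        class Broker:
            """Toy registry: agents announce what they publish; consumers
            look up whoever currently provides their inputs."""

            def __init__(self):
                self.publishers = {}   # field name -> set of publishing agents

            def register(self, agent, publishes):
                for field in publishes:
                    self.publishers.setdefault(field, set()).add(agent)

            def deregister(self, agent):
                # agents may leave a running simulation; the coupling adapts
                for subscribers in self.publishers.values():
                    subscribers.discard(agent)

            def partners(self, consumes):
                # a consumer's partners are whoever publishes its input fields
                return set().union(*(self.publishers.get(f, set()) for f in consumes))

        broker = Broker()
        broker.register("flow_solver", {"velocity"})
        broker.register("transport_solver", {"concentration"})
        print(broker.partners({"velocity"}))   # {'flow_solver'}
        broker.deregister("flow_solver")       # coupling graph changes at runtime
        print(broker.partners({"velocity"}))   # set()

    Because registration happens at runtime, no coupling graph has to be declared up front, which is the property that lets agents join or leave a running simulation.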

    Pertanika Journal of Science & Technology
