
    Solving the division of labour problem using stigmergy and evolved heterogeneity (abstract)

    Evolving cooperative teams is a research area with applications in the fields of robotics and software agents. Progress on this problem could also help us to understand the evolution of cooperation in natural systems such as the social insects. The overarching question is how cooperative teams should be represented in order to promote efficient evolutionary search. More specifically, what should serve as the basic unit of selection: the individual or the team? And how can the division-of-labour problem be solved? To answer these questions we have taken a benchmark problem from the genetic programming (GP) literature, the artificial ant problem, and extended it so that teams of ants must cooperate to complete the task. In this model, the ants are centrally placed in a bounded grid in which each square contains food. The goal of the team is to harvest all the food in the environment in as few moves as possible. In the initial version of the problem, the members of the team are all clones, each having exactly the same GP controller program. Many solutions perform poorly because the team members all behave in the same way and therefore fail to cover the grid efficiently. To perform better, the ants must evolve to exploit stigmergic interactions, breaking the symmetry of the problem and clearing the world of food efficiently. This division of labour through stigmergy is indeed what is seen to evolve during the simulations. A further extension assigns each member of the team an identity tag and adds the ability to execute different subtrees of the cloned controller based on this tag. When these operations are allowed, higher fitnesses are achieved than in the purely stigmergic situation above. During evolution, selection acts at the team level. We can therefore view the members of the team as equivalent to cells in a multicellular organism, with the identity-branching operation analogous to cell differentiation within this abstract organism. Under this scheme, the degree of differentiation is not specified a priori but is controlled by evolution, allowing the full continuum from purely homogeneous teams to entirely heterogeneous teams to be expressed. There is also the potential to use this method as a way of measuring the degree to which a task demands heterogeneous solutions. The relative importance of stigmergy and innate heterogeneity in achieving the necessary division of labour was compared using a third experimental manipulation: the ability of the ants to influence each other stigmergically was removed by placing each ant in its own world and tallying the pieces of food consumed by the team as a whole. In this scenario, the most efficient way to tackle the problem is for the team to evolve complete heterogeneity. We conclude that the division-of-labour problem in the evolution of cooperative teams can be solved both by stigmergic communication and by innate heterogeneity. Furthermore, the technique of allowing the level of heterogeneity of the team to be open to selection shows promise for future work.
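    To make the team-evaluation scheme concrete, the following Python sketch (not the authors' code; the grid size, move budget, and the turn-pair "subtrees" are illustrative assumptions standing in for evolved GP trees) shows a clonal team on a shared food grid. The identity tag selects which branch of the shared controller each ant executes, and stigmergy arises because teammates share the grid and remove food as they eat.

    import random

    GRID = 8                     # side length of the bounded grid
    MOVES = 200                  # move budget for the whole team
    HEADINGS = [(0, 1), (1, 0), (0, -1), (-1, 0)]   # N, E, S, W offsets

    def clamp(v):
        """Keep a coordinate inside the bounded grid."""
        return min(max(v, 0), GRID - 1)

    def make_branch():
        """Stand-in for one GP subtree: a pair of turns, one used when food
        lies ahead and one used otherwise (-1 left, 0 straight, 1 right)."""
        return (random.choice([-1, 0, 1]), random.choice([-1, 0, 1]))

    def evaluate_team(branches, team_size):
        """Return the team-level fitness (food eaten) of a team of clones
        whose behaviour differs only through their identity tags."""
        food = {(x, y) for x in range(GRID) for y in range(GRID)}
        ants = [{"pos": (GRID // 2, GRID // 2), "dir": i % 4, "tag": i}
                for i in range(team_size)]
        eaten = 0
        for _ in range(MOVES):
            for ant in ants:
                on_food, off_food = branches[ant["tag"] % len(branches)]  # identity branching
                x, y = ant["pos"]
                dx, dy = HEADINGS[ant["dir"]]
                ahead = (clamp(x + dx), clamp(y + dy))
                ant["dir"] = (ant["dir"] + (on_food if ahead in food else off_food)) % 4
                dx, dy = HEADINGS[ant["dir"]]
                ant["pos"] = (clamp(x + dx), clamp(y + dy))
                if ant["pos"] in food:          # eating changes what teammates
                    food.remove(ant["pos"])     # later perceive: stigmergy
                    eaten += 1
        return eaten

    # Homogeneous team: one branch shared by every tag.
    print(evaluate_team([make_branch()], team_size=4))
    # Fully heterogeneous team: one branch per identity tag.
    print(evaluate_team([make_branch() for _ in range(4)], team_size=4))

    Selecting on the value returned by evaluate_team is selection at the team level; in the work described above the per-tag branches would be evolved by a GP system rather than sampled at random as in this toy.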

    Architecture, Space and Information in Constructions Built by Humans and Social Insects: a Conceptual Review

    The similarities between the structures built by social insects and by humans have led to a convergence of interests between biologists and architects. This new, de facto interdisciplinary community of scholars needs a common terminology and theoretical framework in which to ground its work. In this conceptually oriented review paper, we examine the terms “information”, “space” and “architecture” to provide definitions that span biology and architecture. A framework is proposed within which interdisciplinary exchange may be better served, with the view that this will aid cross-fertilisation between the disciplines working on collective behaviour and on the analysis of the structures and edifices constructed by non-humans, and will help this area of study contribute more effectively to the field of architecture. We then use these definitions to discuss the informational content of constructions built by organisms and the influence these have on behaviour, and vice versa. We review how spatial constraints inform and influence the interaction between an organism and its environment, and examine the reciprocity of space and information in the construction and behaviour of humans and social insects.

    Evolutionary swarm robotics: a theoretical and methodological itinerary from individual neuro-controllers to collective behaviours

    In the last decade, swarm robotics has gathered much attention in the research community. By drawing inspiration from social insects and other self-organizing systems, it focuses on large robot groups featuring distributed control, adaptation, high robustness, and flexibility. Various reasons lie behind the interest in such multi-robot systems. Above all, inspiration comes from the observation of social activities, which are based on concepts like division of labor, cooperation, and communication. If societies are organized in this way in order to be more efficient, then robotic groups could also benefit from similar paradigms.

    A general architecture for robotic swarms

    Swarms are large groups of simplistic individuals that collectively solve disproportionately complex tasks. Individual swarm agents are limited in perception, mechanically simple, have no global knowledge, and are cheap, disposable and fallible. They rely exclusively on local observations and local communications. A swarm has no centralised control. These features are typified by eusocial insects such as ants and termites, which construct nests, forage and build complex societies composed of primitive agents. This project created the basis of a general swarm architecture for the control of insect-like robots. The Swarm Architecture is inspired by threshold models of insect behaviour and attempts to capture the salient features of the hive in a closely defined computer program that is hardware agnostic, indifferent to swarm size and intended to be applicable to a wide range of swarm tasks. This was achieved by exploiting the inherent limitations of swarm agents. Individual insects were modelled as machines capable only of perception, locomotion and manipulation. This approximation reduced the behaviour primitives to a fixed, tractable number and abstracted sensor interpretation. Cooperation was achieved through stigmergy, and decisions were made via a behaviour threshold model. The Architecture represents an advance on previous robotic swarms in its generality: swarm control software has often been tied to a single task and robot configuration. The Architecture's exclusive focus on swarms sets it apart from existing general cooperative systems, which are not usually explicitly swarm orientated. The Architecture was implemented successfully on both simulated and real-world swarms.
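    As a concrete illustration of the kind of threshold model referred to above, the Python sketch below implements the classical fixed response-threshold rule from the insect-behaviour literature (an assumed stand-in; the thesis's exact formulation may differ). An agent with threshold theta responds to a task with stimulus s with probability s^n / (s^n + theta^n), so agents with lower thresholds take up a task sooner.

    import random

    def engage_probability(stimulus, threshold, steepness=2):
        """Probability that an agent with this threshold responds to the stimulus."""
        return stimulus**steepness / (stimulus**steepness + threshold**steepness)

    def step(agents, stimuli):
        """One decision round: each agent picks at most one task to start."""
        choices = []
        for thresholds in agents:                 # one threshold per task, per agent
            task = None
            for j, s in enumerate(stimuli):
                if random.random() < engage_probability(s, thresholds[j]):
                    task = j
                    break
            choices.append(task)
        return choices

    # Two tasks, three agents with different sensitivities (lower threshold = keener).
    agents = [[1.0, 8.0], [8.0, 1.0], [4.0, 4.0]]
    print(step(agents, stimuli=[5.0, 5.0]))

    Heterogeneous thresholds across the group are what yield division of labour without any centralised control, which is why this family of models suits swarm agents with purely local information.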

    Reading the news through its structure: new hybrid connectivity based approaches

    In this thesis a solution to the problem of identifying the structure of news published by online newspapers is presented. This problem requires new approaches and algorithms that are capable of dealing with the massive number of online publications in existence (a number that will keep growing in the future). The fact that news documents present a high degree of interconnection makes this an interesting and hard problem to solve. The identification of the structure of the news is accomplished both by descriptive methods that expose the dimensionality of the relations between different news items, and by clustering the news into topic groups. To achieve this, the news system was studied as an integrated whole from several perspectives and approaches. After a preparatory data-collection phase, in which several online newspapers from different parts of the globe were gathered, two newspapers were chosen for the identification of news clusters and structure: the Portuguese daily newspaper Público and the British newspaper The Guardian. In the first case, it was shown how information theory (namely variation of information) combined with adaptive networks was able to identify topic clusters in the news published by the Portuguese online newspaper Público. In the second case, the structure of the news published by the British newspaper The Guardian is revealed through the construction of time series of news clustered by a k-means process. An unsupervised algorithm was then developed that filters out irrelevant news published online by taking into consideration the connectivity of the news labels entered by the journalists. This novel hybrid technique is based on Q-analysis for the construction of the filtered network, followed by a clustering technique to identify the topical clusters. Presently this work uses a modularity-optimisation clustering technique, but this step is general enough that other hybrid approaches can be used without losing generality. A novel second-order swarm intelligence algorithm based on Ant Colony Systems was also developed for the travelling salesman problem and consistently outperforms the traditional benchmarks. This algorithm is used to construct Hamiltonian paths over the published news, using the eccentricity of the different documents as a measure of distance. This approach allows easy navigation between published stories, guided by the connectivity of the underlying structure. The results presented in this work show the importance of treating topic detection in large corpora as a multitude of relations and connectivities that are not static. They also change the way multi-dimensional ensembles are viewed, by showing that including the high-dimensional connectivities gives better results for a particular problem, as was the case for the clustering of the news published online.
    In this work we solve the problem of identifying the structure of news published online by newspapers and news agencies. This problem requires new approaches and algorithms capable of dealing with the growing number of online publications (which is expected to keep growing in the future). This fact, together with the high degree of interconnection that news items exhibit, makes this an interesting and hard problem to solve. The identification of the structure of the news system was achieved both through descriptive methods that expose the dimensionality of the relations between different news items, and through algorithms that group them into topics. To reach this goal it was necessary to study this complex system from different perspectives and approaches. After a preparatory phase in which the data corpus was assembled from several newspapers published online, two newspapers were chosen in particular: Público and The Guardian. The choice of newspapers in different languages stems from the wish to find analysis strategies that are independent of prior knowledge about these systems. In a first analysis, an approach based on adaptive networks and information theory (namely variation of information) is employed to identify news topics published in the Portuguese newspaper Público. In a second approach we analyse the structure of the news published by the British newspaper The Guardian through the construction of time series of news, which were then grouped by a k-means process. In addition, an algorithm was developed that filters out, in an unsupervised way, irrelevant news with low connectivity to the remaining news, using Q-analysis followed by a clustering step. At present this method uses modularity optimisation, but the technique is general enough that other hybrid approaches can be used without loss of generality. A new algorithm based on ant colony systems was also developed for the travelling salesman problem that consistently produces better results than the traditional benchmarks. This algorithm was applied to the construction of Hamiltonian paths over the published news, using the eccentricity obtained from the connectivity of the studied system as the measure of distance between news items. This approach made it possible to build a navigation system over the published news that depends on the connectivity observed in the news structure found. The results presented in this work show the importance of analysing complex systems in their multitude of relations and connectivities, which are not static and which influence the way multi-dimensional systems are traditionally viewed. It is shown that the inclusion of these extra dimensions produces better results in identifying the structure underlying this problem of online news publication.
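    As an illustration of one of the ingredients above, the following sketch (assumed, not the thesis code) computes the variation of information between two clusterings of the same news items, VI(X, Y) = H(X) + H(Y) - 2 I(X, Y), which is zero exactly when the two clusterings agree.

    from collections import Counter
    from math import log

    def variation_of_information(labels_a, labels_b):
        """Compare two clusterings given as per-item cluster labels."""
        n = len(labels_a)
        pa = Counter(labels_a)                     # cluster sizes in clustering A
        pb = Counter(labels_b)                     # cluster sizes in clustering B
        joint = Counter(zip(labels_a, labels_b))   # co-occurrence counts
        h_a = -sum((c / n) * log(c / n) for c in pa.values())
        h_b = -sum((c / n) * log(c / n) for c in pb.values())
        mi = sum((c / n) * log((c / n) / ((pa[a] / n) * (pb[b] / n)))
                 for (a, b), c in joint.items())
        return h_a + h_b - 2 * mi

    # Two ways of grouping five news items into topics.
    print(variation_of_information([0, 0, 1, 1, 2], [0, 0, 1, 2, 2]))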

    Biological altruism, eusociality and the superorganism: a critical analysis of the role of biological altruism within eusociality research.

    In this thesis I critically assess the role of the concept of biological altruism (BA) within eusociality research by addressing the following questions: 1) Is the concept of BA a correct description of the behaviour of the non-reproductive castes in eusocial insect colonies? 2) Has the widespread use of the concept of BA been problematic for eusociality research? I argue that not only is the concept of BA unlikely to be the correct description of the behaviour of the non-reproductive castes, but its widespread use has been problematic for the field. The mainstream focus on BA led to viable alternatives, such as parental manipulation and the superorganism, receiving much less attention from researchers. However, current evidence supports the view that parental manipulation, not BA, was the likely cause of the evolution of the non-reproductive castes. Furthermore, I develop a novel organizational approach to the superorganism and argue that colonies of the most complex eusocial insects, e.g. honey bees, are biological individuals in their own right, and thus BA is not applicable to the individual insects in those colonies.

    Heterogeneous Ant Colony Optimisation Methods and their Application to the Travelling Salesman and PCB Drilling Problems

    Ant Colony Optimisation (ACO) is an optimisation algorithm inspired by the foraging behaviour of real ants as they locate food sources and transport food to their nest. It is designed as a population-based metaheuristic and has been successfully applied to various NP-hard problems such as the well-known Travelling Salesman Problem (TSP), the Vehicle Routing Problem (VRP) and many more. However, the majority of ACO studies have focused on homogeneous artificial ants, even though animal-behaviour researchers suggest that real ants exhibit heterogeneous behaviour, which improves the overall efficiency of their colonies. Equally important, most, if not all, optimisation algorithms require proper parameter tuning to achieve optimal performance. It is well known that parameters are problem-dependent: different problems, or even different instances of the same problem, have different optimal parameter settings. Parameter tuning through the testing of parameter combinations is a computationally expensive procedure that is infeasible on large-scale real-world problems. One way to mitigate this is to introduce heterogeneity by initialising the artificial agents with individual parameters rather than colony-level parameters, allowing the algorithm to discover good parameter settings, either actively or passively, during the search. The approach undertaken in this study is to randomly initialise the ants from uniform and Gaussian distributions respectively, within a predefined range of values. This is biologically plausible for ants that share similar roles but have differing behavioural traits drawn from a mathematical distribution. This study also introduces an adaptive approach to the heterogeneous ant colony population that evolves the alpha and beta control parameters of ACO to locate near-optimal solutions. The adaptive approach is able to modify the exploitation and exploration characteristics of the algorithm during the search to reflect its dynamic nature. An empirical analysis of the proposed algorithm on a range of TSP instances shows that the approach performs better than state-of-the-art algorithms from the literature.
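    A minimal Python sketch of the heterogeneous initialisation described above (assumed, not the thesis implementation; the parameter ranges and distribution parameters are illustrative): each ant draws its own alpha and beta from a uniform or clipped Gaussian distribution and applies them in the standard random-proportional transition rule. The adaptive variant mentioned in the abstract, which additionally evolves alpha and beta during the run, is omitted here.

    import random

    ALPHA_RANGE = (0.5, 3.0)      # illustrative bounds for the pheromone weight
    BETA_RANGE = (0.5, 5.0)       # illustrative bounds for the heuristic weight

    def make_ant(heterogeneity="uniform"):
        """Give each ant its own alpha/beta instead of colony-level values."""
        if heterogeneity == "uniform":
            alpha = random.uniform(*ALPHA_RANGE)
            beta = random.uniform(*BETA_RANGE)
        else:                                        # Gaussian, clipped to the range
            alpha = min(max(random.gauss(1.75, 0.5), ALPHA_RANGE[0]), ALPHA_RANGE[1])
            beta = min(max(random.gauss(2.75, 1.0), BETA_RANGE[0]), BETA_RANGE[1])
        return {"alpha": alpha, "beta": beta}

    def choose_next_city(ant, current, unvisited, pheromone, distance):
        """Standard random-proportional rule weighted by this ant's own parameters."""
        weights = [pheromone[current][j] ** ant["alpha"] *
                   (1.0 / distance[current][j]) ** ant["beta"] for j in unvisited]
        return random.choices(unvisited, weights=weights, k=1)[0]

    # Half the colony drawn uniformly, half from the Gaussian.
    colony = [make_ant("uniform") for _ in range(10)] + \
             [make_ant("gaussian") for _ in range(10)]

    # Tiny 3-city example: symmetric distances and uniform initial pheromone.
    distance = [[0, 2, 9], [2, 0, 6], [9, 6, 0]]
    pheromone = [[1.0] * 3 for _ in range(3)]
    print(choose_next_city(colony[0], current=0, unvisited=[1, 2],
                           pheromone=pheromone, distance=distance))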