899 research outputs found
Principles and Concepts of Agent-Based Modelling for Developing Geospatial Simulations
The aim of this paper is to outline fundamental concepts and principles of the Agent-Based Modelling (ABM) paradigm, with particular reference to the development of geospatial simulations. The paper begins with a brief definition of modelling, followed by a classification of model types and a comment regarding a shift (in certain circumstances) towards modelling systems at the individual level. Automata approaches (e.g. Cellular Automata, CA, and ABM) have been particularly popular, with ABM moving to the fore. A definition of agents and agent-based models is given, identifying their advantages and disadvantages, especially in relation to geospatial modelling. The potential use of agent-based models is discussed, and how-to instructions for developing an agent-based model are provided. Types of simulation/modelling systems available for ABM are defined, supplemented with criteria to consider before choosing a particular system for a modelling endeavour. Information pertaining to a selection of simulation/modelling systems (Swarm, MASON, Repast, StarLogo, NetLogo, OBEUS, AgentSheets and AnyLogic) is provided, categorised by licensing policy (open source, shareware/freeware and proprietary). The evaluation (i.e. verification, calibration, validation and analysis) of agent-based models and their output is examined, and noteworthy applications are discussed. Geographical Information Systems (GIS) are a particularly useful medium for representing model input and output of a geospatial nature. However, GIS are not well suited to dynamic modelling (e.g. ABM); in particular, problems of representing time and change within GIS are highlighted. Consequently, this paper explores the opportunity of linking (through coupling or integration/embedding) a GIS with a purpose-built simulation/modelling system better suited to supporting the requirements of ABM. The paper concludes with a synthesis of the discussion that has preceded.
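The build-it-yourself route the abstract describes can be illustrated with a minimal agent-based model written from scratch. The grid size, random-walk movement rule, and synchronous scheduler below are illustrative assumptions, not taken from any of the toolkits named above.

```python
import random

class Agent:
    """A minimal mobile agent on a 2D grid (illustrative only)."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def step(self, width, height):
        # Move one cell in a random cardinal direction, staying on the grid.
        dx, dy = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
        self.x = max(0, min(width - 1, self.x + dx))
        self.y = max(0, min(height - 1, self.y + dy))

def run(num_agents=10, width=20, height=20, steps=50, seed=1):
    random.seed(seed)
    agents = [Agent(random.randrange(width), random.randrange(height))
              for _ in range(num_agents)]
    for _ in range(steps):
        for agent in agents:          # simple synchronous scheduler
            agent.step(width, height)
    return agents

agents = run()
```

A real geospatial model would replace the bare grid with GIS-derived layers, which is exactly the coupling problem the abstract raises.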
Developing a prototype agent-based pedestrian evacuation model to explore the evacuation of King's Cross St Pancras underground station
London's King's Cross St. Pancras underground station has been the unfortunate location of two major incidents within the last twenty years. A fire in November 1987 and the terrorist bombing in July 2005 both resulted in the loss of lives and the injury of many people. The implementation of measures to mitigate or neutralise the effect of all possible future incidents at this site is unrealistic. The adoption of preparedness measures is therefore crucial for the emergency services to limit the loss of life and property, and to improve the response phase of an incident. King's Cross St. Pancras underground station is currently being redeveloped, partly to mitigate the remaining few operational and safety issues identified after the 1987 fire, and also to allow for a predicted increase in passenger use. Despite these modifications and improvements, both the station and the surrounding built environment will necessarily remain complex structures. The local emergency services have several duties placed upon them in the event of a major incident at this site, and a computer-based model capable of examining the effects of different incident assumptions or contingencies has been identified as potentially beneficial to the local National Health Service (NHS) resilience planning department. The specific aim of this paper is to provide the reader with an overview of this research project. To begin, the aims and deliverables are identified. In light of these, principles of pedestrian evacuation modelling are presented, highlighting a shift in approaches: from aggregate movement to individual-level movement and behavioural models. The feasibility of using a proprietary pedestrian evacuation model to achieve the research goal is discussed. This is followed by an agenda for developing an agent-based pedestrian evacuation model using the Repast toolkit. The paper concludes with the progress of the prototype model to date.
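The individual-level approach the abstract contrasts with aggregate models can be sketched as a toy loop that moves each pedestrian one grid cell per tick towards a single exit. The grid, the greedy movement rule, and the exit location are hypothetical simplifications for illustration, not the Repast prototype itself.

```python
def evacuate(positions, exit_pos=(0, 0), max_steps=100):
    """Greedy evacuation sketch: each tick, every pedestrian moves one
    cell toward the exit along each axis; returns the number of ticks
    until everyone has left, or None if max_steps is exceeded.
    (Illustrative only -- ignores congestion, behaviour, and geometry.)"""
    remaining = list(positions)
    for tick in range(1, max_steps + 1):
        still_inside = []
        for (x, y) in remaining:
            # Step one cell toward the exit on each axis.
            x += (exit_pos[0] > x) - (exit_pos[0] < x)
            y += (exit_pos[1] > y) - (exit_pos[1] < y)
            if (x, y) != exit_pos:      # pedestrians at the exit leave
                still_inside.append((x, y))
        remaining = still_inside
        if not remaining:
            return tick
    return None

print(evacuate([(3, 4), (1, 1), (5, 0)]))  # -> 5
```

Behavioural models layer decision rules (exit choice, crowd following, congestion avoidance) on top of this kind of per-agent movement loop.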
Trade-offs between causes of mortality in life history evolution: the case of cancers.
Little is known about the relative importance of different causes of death in driving the evolution of senescence and longevity across species. Here we argue that cause-specific mortality may be shaped by physiological trade-offs between mortality components, challenging the theoretical view that physiologically independent processes should senesce at the same rate, or that interactions between causes of death will make selection blind to the effects of specific causes of death. We review the evidence that the risk of cancer trades off against risks of mortality from other diseases, and investigate whether this might explain two of the most puzzling paradoxes in cancer evolution. First, among species, cancer prevalence is not a function of species' size and longevity, despite the fact that cancer incidence is known to be a function of the number of cell divisions (and therefore of size) per unit of time (and therefore of longevity). Second, within species, despite the fact that genomic instability is thought to be the proximal cause of both cancer incidence and senescence, mortality rates rise with age while cancer incidence decelerates and declines at old ages. Building on a relatively novel theory from cellular biology, we construct a preliminary model to reveal the degree to which the accumulation of senescent cells with age could explain this latter paradox. Diverting damaged stem cells towards a senescent state reduces their risk of becoming tumorous; conversely, however, the accumulation of senescent cells in tissues compromises their rejuvenation capacity and functioning, leading to organismal senescence. Accumulation of senescent cells with age may then be optimal because it reduces cancer mortality at the cost of faster senescence from other causes. Evolution will drive species towards a balance between these two sources of mortality.
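The optimality argument in the final sentences can be caricatured with a two-term mortality function: cancer mortality falls as a larger fraction p of damaged cells is diverted to senescence, while mortality from degraded tissue function rises with p. The functional forms and parameter values below are hypothetical illustrations, not the authors' preliminary model.

```python
def total_mortality(p, cancer_risk=0.5, senescence_cost=0.5):
    """Toy trade-off: diverting a fraction p of damaged stem cells into
    a senescent state lowers cancer mortality (linearly here) but raises
    senescence mortality (quadratically here). Both functional forms and
    both coefficients are hypothetical, chosen only to give an interior
    optimum."""
    cancer = cancer_risk * (1 - p)          # fewer cells become tumorous
    senescence = senescence_cost * p ** 2   # accumulating senescent cells
    return cancer + senescence

# Grid search: the optimum diverts some, but not all, damaged cells.
best_p = min((p / 100 for p in range(101)), key=total_mortality)
print(best_p)  # -> 0.5
```

With these toy parameters the minimum sits strictly between 0 and 1, mirroring the paper's claim that some accumulation of senescent cells can be the optimal balance between the two mortality sources.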
Supramolecular Coordination Complexes as Optical Biosensors
In recent years, luminescent supramolecular coordination complexes (SCCs), including 2D metallacycles and 3D metallacages, have been utilised for biomolecular analysis. Unlike small-molecule probes, the dimensions, size, shape, and flexibility of these complexes can easily be tuned by combining ligands designed with particular geometries, symmetries and denticity with metal ions with strong geometrical binding preferences. The well-defined cavities that result, in combination with the other non-covalent interactions that can be programmed into the ligand design, facilitate great selectivity towards guest binding. In this Review we discuss the application of luminescent metallacycles and cages in the binding and detection of a wide range of biomolecules, such as carbohydrates, proteins, amino acids, and biogenic amines. We aim to explore the effect of the structural diversity of SCCs on the extent of biomolecular sensing, expressed in terms of sensitivity, selectivity and detection range.
(Self) assembled news: recent highlights from the supramolecular chemistry literature (Quarter 2, 2023)
Immune-mediated competition in rodent malaria is most likely caused by induced changes in innate immune clearance of merozoites
Malarial infections are often genetically diverse, leading to competitive interactions between parasites. A quantitative understanding of the competition between strains is essential to understand a wide range of issues, including the evolution of virulence and drug resistance. In this study, we use dynamical-model based Bayesian inference to investigate the cause of competitive suppression of an avirulent clone of Plasmodium chabaudi (AS) by a virulent clone (AJ) in immuno-deficient and competent mice. We test whether competitive suppression is caused by clone-specific differences in one or more of the following processes: adaptive immune clearance of merozoites and parasitised red blood cells (RBCs), background loss of merozoites and parasitised RBCs, RBC age preference, RBC infection rate, burst size, and within-RBC interference. These processes were parameterised in dynamical mathematical models and fitted to experimental data. We found that just one parameter μ, the ratio of background loss rate of merozoites to invasion rate of mature RBCs, needed to be clone-specific to predict the data. Interestingly, μ was found to be the same for both clones in single-clone infections, but different between the clones in mixed infections. The size of this difference was largest in immuno-competent mice and smallest in immuno-deficient mice. This explains why competitive suppression was alleviated in immuno-deficient mice. We found that competitive suppression acts early in infection, even before the day of peak parasitaemia. These results lead us to argue that the innate immune response clearing merozoites is the most likely, but not necessarily the only, mediator of competitive interactions between virulent and avirulent clones. Moreover, in mixed infections we predict there to be an interaction between the clones and the innate immune response which induces changes in the strength of its clearance of merozoites. 
What this interaction is remains unknown, but future refinement of the model, challenged with other datasets, may lead to its discovery.
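The role of the single clone-specific parameter μ identified above can be sketched with a toy replication cycle in which merozoites are partitioned between invading red blood cells and background loss. The burst size and the partition formula below are illustrative assumptions, not the fitted dynamical model from the study.

```python
def next_generation(parasitised_rbcs, burst_size=8, mu=0.5):
    """One toy replication cycle for a single clone.

    mu is the ratio of the background merozoite loss rate to the
    invasion rate of mature RBCs -- the one parameter the study found
    had to be clone-specific in mixed infections. Merozoites are split
    between invasion and loss in proportion to those two rates
    (a hypothetical closed form, assuming RBCs are not limiting)."""
    merozoites = parasitised_rbcs * burst_size
    invading_fraction = 1.0 / (1.0 + mu)   # invasion vs. background loss
    return merozoites * invading_fraction  # newly parasitised RBCs

# An induced rise in mu -- as inferred for the suppressed clone in mixed
# infections -- reduces the clone's per-cycle growth:
alone = next_generation(100, mu=0.2)
suppressed = next_generation(100, mu=1.0)
```

In this caricature, an innate immune response that raises a clone's merozoite loss rate raises its μ and thereby suppresses its growth, which is the mechanism the abstract argues mediates the competition.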
Metal-organic ion transport systems
The design of synthetic membrane transporters for ions is a rapidly growing field of research, motivated by the potential medicinal applications of novel systems. Metal-organic coordination complexes provide access to a wide range of geometries, structures and properties, and their facile synthesis and tunability offer advantages in the design of ion transport systems. In this review, the application of metal-organic constructs as membrane-spanning channels and ion carriers is explored, and the roles of the metal coordination complexes within these functional assemblies are highlighted.
Search for new phenomena in final states with an energetic jet and large missing transverse momentum in pp collisions at √ s = 8 TeV with the ATLAS detector
Results of a search for new phenomena in final states with an energetic jet and large missing transverse momentum are reported. The search uses 20.3 fb^-1 of √s = 8 TeV data collected in 2012 with the ATLAS detector at the LHC. Events are required to have at least one jet with pT > 120 GeV and no leptons. Nine signal regions are considered, with increasing missing transverse momentum requirements between E_T^miss > 150 GeV and E_T^miss > 700 GeV. Good agreement is observed between the number of events in data and Standard Model expectations. The results are translated into exclusion limits on models with either large extra spatial dimensions, pair production of weakly interacting dark matter candidates, or production of very light gravitinos in a gauge-mediated supersymmetric model. In addition, limits on the production of an invisibly decaying Higgs-like boson leading to similar topologies in the final state are presented.
Guest Encapsulation within Surface-Adsorbed Self-Assembled Cages
Coordination cages encapsulate a wide variety of guests in the solution state. This ability renders them useful for applications such as catalysis and the sequestration of precious materials. A simple and general method for the immobilization of coordination cages on alumina is reported. Cage loadings are quantified via adsorption isotherms, and guest displacement assays demonstrate that the adsorbed cages retain the ability to encapsulate and separate guest and non-guest molecules. Finally, a system of two cages, adsorbed onto different regions of alumina, stabilizes and separates a pair of Diels-Alder reagents. The addition of a single competitive guest results in the controlled release of the reagents, thus triggering their reaction. This method of coordination cage immobilization on solid phases is envisaged to be applicable to the extensive library of reported cages, enabling new applications based upon selective solid-phase molecular encapsulation.
