272 research outputs found
Building cities from slime mould, agents and quantum field theory
Managing the unprecedented growth of cities, whilst ensuring that they are sustainable, healthy and equitable places to live, presents significant challenges. Our current thinking conceptualises cities as being driven by processes from the bottom up, with an emphasis on the role that individual decisions and behaviour play. Multiagent systems, and agent-based modelling in particular, are ideal frameworks for the analysis of such systems. However, identifying the important drivers within an urban system, translating key behaviours from data into rules, quantifying uncertainty and running models in real time all present significant challenges. We discuss how innovations in a diverse range of fields are influencing empirical agent-based models, and how models designed for the simplest biological systems might transform the ways that we understand and manage real cities.
An agent model of urban economics: Digging into emergence
This paper presents an agent-based ‘monocentric’ model: assuming only a fixed location for firms, outcomes closely parallel those found in classical urban economic models, but emerge through ‘bottom-up’ interaction in an agent-based model. Agents make buying and movement decisions based on a set of simple costs they face from their current location. These spatial costs are reduced to two types: the costs of moving people and goods across geographical distances and the costs (and benefits) of ‘being here’ (the effects of being at a particular location, such as land costs, amenities or disamenities). Two approaches to land cost are compared: landlords and a ‘density cost’ proxy. Emergent equilibrium outcomes are found to depend on the interaction of externalities and time. These findings are produced by looking at how agents react to changing four types of cost, two spatial and two non-spatial: commuting, wage, good cost and good delivery. The models explore equilibrium outcomes, the effect of changing costs and the impact of heterogeneous agents, before focusing on one example to find the source of emergence in the externalities of agent choice. The paper finishes by emphasising the importance of thinking about emergence as a tool, not an end in itself.
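The cost structure described in this abstract can be sketched as a toy agent-based model. This is a minimal illustration under invented assumptions, not the paper's actual model: location `0` stands in for the fixed firm location, the `commute` parameter for the cost of crossing geographical distance, and the `occupancy` count for the ‘density cost’ proxy for land price.

```python
import random

def location_cost(x, occupancy, commute=0.5, density=0.2):
    """Cost an agent faces at location x: commuting to the firm at x = 0
    plus a 'density cost' proxy for land price (the cost of 'being here')."""
    return commute * abs(x) + density * occupancy.get(x, 0)

def step(agents, occupancy):
    """Each agent compares its current location with one random alternative
    and moves if the alternative is cheaper."""
    for i, x in enumerate(agents):
        alt = random.randint(0, 20)
        if location_cost(alt, occupancy) < location_cost(x, occupancy):
            occupancy[x] -= 1
            agents[i] = alt
            occupancy[alt] = occupancy.get(alt, 0) + 1

random.seed(1)
agents = [random.randint(0, 20) for _ in range(200)]  # random initial homes
occupancy = {}
for x in agents:
    occupancy[x] = occupancy.get(x, 0) + 1
for _ in range(50):
    step(agents, occupancy)
# Emergent outcome: density declines with distance from the firm, with
# crowding limited by the density cost -- a 'bottom-up' monocentric pattern.
```

Even this crude sketch shows the interaction the paper studies: the equilibrium density gradient emerges from individual cost comparisons rather than from an imposed aggregate equation.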
Place-Based Simulation Modeling: Agent-Based Modeling and Virtual Environments
Since the earliest geographical explorations of criminal phenomena, scientists have come to the realization that crime occurrences can often be best explained by analysis at local scales. For example, the works of Guerry and Quetelet—which are often credited as being the first spatial studies of crime—analyzed data that had been aggregated to regions approximately similar to US states. The next major seminal work on spatial crime patterns came from the Chicago School in the 20th century and increased the spatial resolution of analysis to the census tract (an American administrative area that is designed to contain approximately 4,000 individual inhabitants). With the availability of higher-quality spatial data, as well as improvements in the computing infrastructure (particularly with respect to spatial analysis and mapping), more recent empirical spatial criminology work can operate at even higher resolutions; the “crime at places” literature regularly highlights the importance of analyzing crime at the street segment or at even finer scales. These empirical realizations—that crime patterns vary substantially at micro places—are well grounded in the core environmental criminology theories of routine activity theory, the geometric theory of crime, and the rational choice perspective. Each theory focuses on the individual-level nature of crime, the behavior and motivations of individual people, and the importance of the immediate surroundings. For example, routine activity theory stipulates that a crime is possible when an offender and a potential victim meet at the same time and place in the absence of a capable guardian. The geometric theory of crime suggests that individuals build up an awareness of their surroundings as they undertake their routine activities, and it is where these areas overlap with crime opportunities that crimes are most likely to occur.
Finally, the rational choice perspective suggests that the decision to commit a crime is partially a cost-benefit analysis of the risks and rewards. To properly understand or model crime under these three perspectives, it is important to capture the motivations, awareness, rationality, immediate surroundings, etc., of the individual and include a highly disaggregate representation of space (i.e. “micro-places”). Unfortunately, one of the most common methods for modeling crime, regression, is poorly suited to capturing these dynamics. As with most traditional modeling approaches, regression models represent the underlying system through mathematical aggregations. The resulting models are therefore well suited to systems that behave in a linear fashion (e.g., where a change in model input leads to a predictable change in the model output) and where low-level heterogeneity is not important (i.e., we can assume that everyone in a particular group of people will behave in the same way). However, as alluded to earlier, the crime system does not necessarily meet these assumptions. To really understand the dynamics of crime patterns, and to be able to properly represent the underlying theories, it is necessary to represent the behavior of the individual system components (i.e. people) directly. For this reason, many scientists from a variety of different disciplines are turning to individual-level modeling techniques such as agent-based modeling.
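A minimal sketch of what such an individual-level model looks like, under invented assumptions (the grid size, agent counts and random-walk movement rule are all illustrative, not the chapter's model): following routine activity theory, a crime opportunity arises only where an offender and a target converge without a capable guardian.

```python
import random

def step(agents, size=10):
    """Each agent takes one step of a random walk on the grid (a crude
    stand-in for 'routine activities'); positions are clipped to the grid."""
    for a in agents:
        a["pos"] = tuple(min(size - 1, max(0, p + random.choice((-1, 0, 1))))
                         for p in a["pos"])

def crime_cells(agents):
    """Count cells where an offender and a target co-occur with no
    guardian present -- the routine activity convergence condition."""
    cells = {}
    for a in agents:
        cells.setdefault(a["pos"], []).append(a["role"])
    return sum(1 for roles in cells.values()
               if "offender" in roles and "target" in roles
               and "guardian" not in roles)

random.seed(0)
agents = ([{"role": "offender", "pos": (5, 5)} for _ in range(5)]
          + [{"role": "target", "pos": (5, 5)} for _ in range(20)]
          + [{"role": "guardian", "pos": (5, 5)} for _ in range(5)])
total = 0
for _ in range(100):
    step(agents)
    total += crime_cells(agents)
```

The point of the sketch is the contrast with regression: heterogeneity and micro-place convergence are represented directly, rather than averaged into aggregate coefficients.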
Towards the development of societal twins
A digital twin is a virtual data-driven replica of a real-world system. Recently, digital twins have become popular in engineering and infrastructure planning, where they provide insights into complex physical systems or processes. Yet, to date, considerably less research has explored how digital replicas of social systems - representing the decisions, behaviors and interactions of individual people, and, in turn, their emergent outcomes - might be developed and integrated with those of physical systems. In this position paper we discuss the need for such societal twins, what they might look like, and set out key challenges that will need to be overcome if their benefits are to be realised.
Using graph structural information about flows to enhance short-term demand prediction in bike-sharing systems
Short-term demand prediction is important for managing transportation infrastructure, particularly in times of disruption or around new developments. Many bike-sharing schemes face the challenges of managing service provision and bike fleet rebalancing due to the “tidal flows” of travel and use. For them, it is crucial to have precise predictions of travel demand at fine spatiotemporal granularity. Despite recent advances in machine learning approaches (e.g. deep neural networks) and in short-term traffic demand prediction, relatively few studies have examined this issue using a feature engineering approach to inform model selection. This research extracts novel time-lagged variables describing graph structures and flow interactions from real-world bike usage datasets, including graph node out-strength, in-strength, out-degree, in-degree and PageRank. These are used as inputs to different machine learning algorithms to predict short-term bike demand. The results of the experiments indicate the graph-based attributes to be more important in demand prediction than more commonly used meteorological information. The results from the different machine learning approaches (XGBoost, MLP, LSTM) improve when time-lagged graph information is included. Deep neural networks were found to be better able to handle the sequences of time-lagged graph variables than the other approaches, resulting in more accurate forecasting. Thus, incorporating graph-based features can improve understanding and modelling of demand patterns in urban areas, supporting bike-sharing schemes and promoting sustainable transport. The proposed approach can be extended into many existing models using spatial data and can be readily transferred to other applications for predicting dynamics in mass transit systems. A number of limitations and areas of further work are discussed.
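The graph attributes named in this abstract can be computed directly from origin-destination trip counts. A small self-contained sketch follows; the station names and trip counts are invented, and the pure-Python PageRank is a generic power-iteration implementation rather than the paper's own pipeline.

```python
from collections import defaultdict

# Hypothetical origin-destination trip counts: (from_station, to_station, trips)
trips = [("A", "B", 30), ("B", "A", 12), ("B", "C", 25),
         ("C", "A", 8), ("A", "C", 15)]

out_strength = defaultdict(float)  # total departures, weighted by trip count
in_strength = defaultdict(float)   # total arrivals, weighted by trip count
out_degree = defaultdict(int)      # number of distinct destination stations
in_degree = defaultdict(int)       # number of distinct origin stations
for u, v, w in trips:
    out_strength[u] += w
    in_strength[v] += w
    out_degree[u] += 1
    in_degree[v] += 1

def pagerank(edges, damping=0.85, iters=50):
    """Weighted PageRank by power iteration over the trip graph."""
    nodes = {u for u, _, _ in edges} | {v for _, v, _ in edges}
    out_w = defaultdict(float)
    for u, _, w in edges:
        out_w[u] += w
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for u, v, w in edges:
            new[v] += damping * rank[u] * w / out_w[u]
        rank = new
    return rank

pr = pagerank(trips)
```

Features like these, computed on lagged windows of trips, become the extra columns fed to the predictive models alongside (or instead of) meteorological variables.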
A spatiotemporal and graph-based analysis of dockless bike sharing patterns to understand urban flows over the last mile
The recent emergence of dockless bike sharing systems has resulted in new patterns of urban transport. Users can begin and end trips from their origin and destination locations rather than docking stations. Analysis of changes in the spatiotemporal availability of such bikes has the ability to provide insights into urban dynamics at a finer granularity than is possible through analysis of travel card or dock-based bike scheme data. This study analyses dockless bike sharing in Nanchang, China over a period when a new metro line came into operation. It uses spatial statistics and graph-based approaches to quantify changes in travel behaviours and generates previously unobtainable insights about urban flow structures. Geostatistical analyses support understanding of large-scale changes in spatiotemporal travel behaviours, and graph-based approaches allow changes in local travel flows between individual locations to be quantified and characterized. The results show how the new metro service boosted nearby bike demand, but with considerable spatial variation, and changed the spatiotemporal patterns of bike travel behaviour. The analysis also quantifies the evolution of travel flow structures, indicating the resilience of dockless bike schemes and their ability to adapt to changes in travel behaviours. More widely, this study demonstrates how an enhanced understanding of urban dynamics over the “last mile” is supported by the analyses of dockless bike data. These allow changes in local spatiotemporal interdependencies between different transport systems to be evaluated, and support spatially detailed urban and transport planning. A number of areas of further work are identified to better understand interdependencies between different transit system components.
Dealing with uncertainty in agent-based models for short-term predictions
Agent-based models (ABMs) are gaining traction as one of the most powerful modelling tools within the social sciences, and are particularly suited to simulating complex systems. Despite many methodological advances within ABM, one of the major drawbacks is their inability to incorporate real-time data to make accurate short-term predictions. This paper presents an approach that allows ABMs to be dynamically optimized: through a combination of parameter calibration and data assimilation (DA), the accuracy of real-time model-based predictions is increased. We use the exemplar of a bus route system to explore these methods. The bus route ABMs developed in this research are examples of ABMs that can be dynamically optimized by a combination of parameter calibration and DA. The proposed model and framework are a novel and transferable approach that can be used in any passenger information system, or in intelligent transport systems, to provide forecasts of bus locations and arrival times.
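Data assimilation for a model like this can be illustrated with a bootstrap particle filter, one standard DA technique for ABMs. This is a sketch under invented assumptions (a one-dimensional route, Gaussian speed and GPS noise), not the paper's actual implementation: a cloud of model states is advanced, reweighted by each incoming observation, and resampled.

```python
import math
import random

def advance(x):
    """Hypothetical bus model: the bus moves along a 1-D route at a noisy speed."""
    return x + random.gauss(10.0, 2.0)

def assimilate(particles, observation, obs_sd=5.0):
    """One bootstrap particle-filter step: weight each particle by the
    Gaussian likelihood of the GPS observation, then resample with replacement."""
    weights = [math.exp(-(p - observation) ** 2 / (2 * obs_sd ** 2))
               for p in particles]
    total = sum(weights) or 1.0
    return random.choices(particles, weights=[w / total for w in weights],
                          k=len(particles))

random.seed(0)
particles = [random.uniform(0.0, 100.0) for _ in range(500)]  # uncertain prior
truth = 40.0
for _ in range(10):
    truth += 10.0                           # the real bus moves
    particles = [advance(p) for p in particles]
    obs = truth + random.gauss(0.0, 5.0)    # noisy real-time GPS reading
    particles = assimilate(particles, obs)

estimate = sum(particles) / len(particles)  # assimilated position estimate
```

Each assimilation cycle pulls the simulated ensemble back towards the observed system state, which is exactly what lets the model make useful short-term forecasts rather than drifting away from reality.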
Creating realistic synthetic populations at varying spatial scales: A comparative critique of population synthesis techniques
There are several established methodologies for generating synthetic populations. These include deterministic reweighting, conditional probability (Monte Carlo simulation) and simulated annealing. However, each of these approaches is limited by, for example, the level of geography to which it can be applied, or the number of characteristics of the real population that it can replicate. This research examines and critiques the performance of each of these methods over varying spatial scales. Results show that the most consistent and accurate populations generated over all the spatial scales are produced by the simulated annealing algorithm. The relative merits and limitations of each method are evaluated in the discussion.
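The simulated annealing approach can be sketched as repeatedly swapping individuals drawn from a microdata sample until the synthetic population matches small-area census marginals. The sample records, constraint tables and cooling schedule below are invented for illustration; real population synthesis uses survey microdata and published census tables.

```python
import math
import random

# Hypothetical microdata sample: (age_band, employed)
sample = [("16-34", True), ("16-34", False), ("35-64", True),
          ("35-64", False), ("65+", True), ("65+", False)]

# Hypothetical census marginals for one small area (person counts)
target_age = {"16-34": 40, "35-64": 45, "65+": 15}
target_emp = {True: 55, False: 45}

def fitness(pop):
    """Total absolute error between the synthetic population's marginals
    and the census constraints (lower is better)."""
    err = 0
    for band, want in target_age.items():
        err += abs(sum(1 for a, _ in pop if a == band) - want)
    for emp, want in target_emp.items():
        err += abs(sum(1 for _, e in pop if e == emp) - want)
    return err

random.seed(0)
pop = [random.choice(sample) for _ in range(100)]  # initial random draw
cur = fitness(pop)
temp = 10.0
for _ in range(20000):
    i = random.randrange(len(pop))
    old = pop[i]
    pop[i] = random.choice(sample)       # propose replacing one individual
    new = fitness(pop)
    # Accept improvements always; accept worsening moves with a probability
    # that shrinks as the temperature cools (escapes local optima early on)
    if new <= cur or random.random() < math.exp((cur - new) / temp):
        cur = new
    else:
        pop[i] = old                     # revert the rejected swap
    temp = max(0.01, temp * 0.9995)
```

The willingness to accept occasional worsening swaps at high temperature is what distinguishes simulated annealing from deterministic reweighting and helps it satisfy several constraint tables at once.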
The spatial economics of energy justice: modelling the trade impacts of increased transport costs in a low carbon transition and the implications for UK regional inequality
Spatial economic change is an energy justice issue (Bouzarovski and Simcock, 2017) - an essential consideration in how we choose to re-wire the economy for a carbon-free future. Nothing like the conscious system-wide change required has been attempted before. Rapid policy decisions risk embedding existing injustices or creating new ones unless steps are taken to ameliorate those risks. We present a model that takes a whole-system view of the UK spatial economy, examining how increasing distance costs (e.g. through fuel tax hikes) have unequal impacts on regions and sectors. The model establishes an important carbon transition policy principle: changes in spatial flows of internal trade, which are certain to occur rapidly during transition, have measurable energy justice implications. Peripheral economic regions, in rural and coastal areas and many city outskirts, are most vulnerable, as are petrochemical, agricultural and connected sectors. Policy must go beyond identifying the places and sectors most affected: it is the connections between them that matter most. The "push" of spatially aware fiscal policy needs to be combined with the "pull" of targeted interventions designed to promote low-carbon intermediate connections. This is not only just, but would help make a (potentially costly) transition more politically acceptable.
Cutaneous nociceptors lack sensitisation, but reveal μ-opioid receptor-mediated reduction in excitability to mechanical stimulation in neuropathy
Background: Peripheral nerve injuries often trigger a hypersensitivity to tactile stimulation. Behavioural studies demonstrated efficient and side effect-free analgesia mediated by opioid receptors on peripheral sensory neurons. However, mechanistic approaches addressing such opioid properties in painful neuropathies are lacking. Here we investigated whether opioids can directly inhibit primary afferent neuron transmission of mechanical stimuli in neuropathy. We analysed the mechanical thresholds, the firing rates and response latencies of sensory fibres to mechanical stimulation of their cutaneous receptive fields. Results: Two weeks following a chronic constriction injury of the saphenous nerve, mice developed a profound mechanical hypersensitivity in the paw innervated by the damaged nerve. Using an in vitro skin-nerve preparation we found no changes in the mechanical thresholds and latencies of sensory fibres from injured nerves. The firing rates to mechanical stimulation were unchanged or reduced following injury. Importantly, μ-opioid receptor agonist [D-Ala2,N-Me-Phe4,Gly5]-ol-enkephalin (DAMGO) significantly elevated the mechanical thresholds of nociceptive Aδ and C fibres. Furthermore, DAMGO substantially diminished the mechanically evoked discharges of C nociceptors in injured nerves. These effects were blocked by DAMGO washout and pre-treatment with the selective μ-opioid receptor antagonist Cys2-Tyr3-Orn5-Pen7-amide. DAMGO did not alter the responses of sensory fibres in uninjured nerves. Conclusions: Our findings suggest that behaviourally manifested neuropathy-induced mechanosensitivity does not require a sensitised state of cutaneous nociceptors in damaged nerves. Yet, nerve injury renders nociceptors sensitive to opioids. Prevention of action potential generation or propagation in nociceptors might represent a cellular mechanism underlying peripheral opioid-mediated alleviation of mechanical hypersensitivity in neuropathy.
© 2012 Schmidt et al.; licensee BioMed Central Ltd