612 research outputs found

    Network-based space-time scan statistics for detecting micro-scale hotspots

    Events recorded in urban areas are often confined by the micro-scale geography of street networks, yet existing spatial-analytical methods do not usually account for shortest-path distances along street networks. We propose space-time NetScan, a new spatio-temporal analytical method with improved accuracy for detecting patterns of concentration across space and time. It extends the notion of a scan-statistic-type search window by measuring space-time patterns along street networks, so that micro-scale concentrations of events can be detected at the street-address level with high accuracy. Performance tests with synthetic data demonstrate that space-time NetScan outperforms existing methods in detecting the location, shape, size and duration of hotspots. An empirical study of drug-related incidents shows how space-time NetScan can improve our understanding of the micro-scale geography of crime. Aside from some abrupt one-off incidents, many hotspots form recurrent hotbeds, implying that drug-related crimes tend to persist in specific problem places.
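
    A minimal sketch of the general idea, assuming a networkx street graph with edge lengths and event times recorded per street-address node: a candidate space-time window is the set of addresses within a given network distance of a centre address, crossed with a time interval, and it is scored with a standard Poisson likelihood ratio. This illustrates the scan-statistic principle only and is not the authors' exact algorithm.

        import math
        import networkx as nx

        def poisson_llr(c, n, C, N):
            # Kulldorff-style log-likelihood ratio: c events over baseline n inside the
            # window, against C events over baseline N in the whole study region.
            if n == 0 or N == n or c / n <= C / N:
                return 0.0
            llr = c * math.log(c / n)
            if C > c:
                llr += (C - c) * math.log((C - c) / (N - n))
            return llr - C * math.log(C / N)

        def scan_window(G, events, centre, radius_m, t0, t1, study_days, total_events):
            # Addresses reachable within radius_m metres of `centre` along the street network.
            near = nx.single_source_dijkstra_path_length(G, centre, cutoff=radius_m, weight="length")
            c = sum(1 for node in near for t in events.get(node, []) if t0 <= t < t1)
            n = len(near) * (t1 - t0)              # baseline: address-days inside the window
            N = G.number_of_nodes() * study_days   # baseline: address-days in the whole study
            return poisson_llr(c, n, total_events, N)

    In a full scan, every candidate centre, radius and time interval would be scored this way, with the highest-scoring windows assessed by Monte Carlo simulation, as is standard for scan statistics.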

    Graph deep learning model for network-based predictive hotspot mapping of sparse spatio-temporal events

    The predictive hotspot mapping of sparse spatio-temporal events (e.g., crime and traffic accidents) aims to forecast areas or locations with a higher average risk of event occurrence, which is important for informing preventative strategies. Although a network-based structure can better capture the micro-level variation of spatio-temporal events, existing deep learning methods for sparse event forecasting are based on area or grid units, owing to the data sparsity in both space and time and the complex network topology. To overcome these challenges, this paper develops the first deep learning (DL) model for network-based predictive mapping of sparse spatio-temporal events. Leveraging a graph-based representation of the network-structured data, a gated localised diffusion network (GLDNet) is introduced, which integrates a gated network to model temporal propagation and a novel localised diffusion network to model spatial propagation confined by the network topology. To deal with the sparsity issue, we reformulate the research problem as an imbalanced regression task and employ a weighted loss function to train the DL model. The framework is validated on a crime forecasting case study for South Chicago, USA, where it outperforms the state-of-the-art benchmark by 12% and 25% in terms of mean hit rate at the 10% and 20% coverage levels, respectively.
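
    A minimal PyTorch sketch of the ingredients described above (localised graph diffusion, a gated temporal unit, and a weighted loss for the imbalanced regression formulation); the layer sizes, tensor shapes and names are assumptions rather than the published GLDNet code.

        import torch
        import torch.nn as nn

        class LocalDiffusion(nn.Module):
            # Aggregates each node's features from its k-hop neighbourhood via powers of
            # a row-normalised adjacency matrix, confining propagation to the network.
            def __init__(self, a_hat, in_dim, out_dim, k=2):
                super().__init__()
                self.a_hat = a_hat                         # (N, N) row-normalised adjacency
                self.k = k
                self.lin = nn.Linear(in_dim * (k + 1), out_dim)

            def forward(self, x):                          # x: (batch, N, in_dim)
                hops, h = [x], x
                for _ in range(self.k):
                    h = torch.einsum("ij,bjd->bid", self.a_hat, h)
                    hops.append(h)
                return torch.relu(self.lin(torch.cat(hops, dim=-1)))

        class GLDNetSketch(nn.Module):
            def __init__(self, a_hat, in_dim, hidden=32):
                super().__init__()
                self.diffuse = LocalDiffusion(a_hat, in_dim, hidden)
                self.gru = nn.GRU(hidden, hidden, batch_first=True)   # gated temporal unit
                self.head = nn.Linear(hidden, 1)

            def forward(self, x_seq):                      # x_seq: (batch, T, N, in_dim)
                b, t, n, d = x_seq.shape
                h = self.diffuse(x_seq.reshape(b * t, n, d)).reshape(b, t, n, -1)
                h = h.permute(0, 2, 1, 3).reshape(b * n, t, -1)       # one sequence per node
                out, _ = self.gru(h)
                return self.head(out[:, -1]).reshape(b, n)            # next-step risk per node

        def weighted_mse(pred, target, pos_weight=10.0):
            # Up-weights the rare non-zero targets, treating forecasting as imbalanced regression.
            w = torch.where(target > 0, torch.full_like(target, pos_weight), torch.ones_like(target))
            return (w * (pred - target) ** 2).mean()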

    Complex Network Tools to Understand the Behavior of Criminality in Urban Areas

    Complex networks are nowadays employed in several applications. Modeling urban street networks is one of them, in particular to analyze criminal aspects of a city. Several research groups have focused on this application, but until now there has been no well-defined methodology for employing complex networks across a whole crime analysis process, i.e., from data preparation to in-depth analysis of criminal communities. Furthermore, the toolset available for such work is incomplete, lacking both techniques for maintaining up-to-date, complete crime datasets and proper assessment measures. In this sense, we propose a threefold methodology for employing complex networks in the detection of highly criminal areas within a city. Our methodology comprises three tasks: (i) Mapping of Urban Crimes; (ii) Criminal Community Identification; and (iii) Crime Analysis. Moreover, it provides a proper set of assessment measures for analyzing the intrinsic criminality of communities, especially when considering different crime types. We demonstrate our methodology by applying it to a real crime dataset from the city of San Francisco, CA, USA. The results confirm its effectiveness in identifying and analyzing high-criminality areas within a city. Hence, our contributions provide a basis for further developments on complex networks applied to crime analysis. (Comment: 7 pages, 2 figures; 14th International Conference on Information Technology: New Generations.)
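
    One plausible reading of the three tasks, sketched with networkx: crime counts are mapped onto street-network nodes, communities are detected by modularity optimisation, and each community is assessed by a simple crime-density measure. The concrete measure and function names here are assumptions, not the paper's own toolset.

        import networkx as nx
        from networkx.algorithms import community

        def crime_communities(G, crimes_per_node):
            # (i) Mapping of Urban Crimes: attach counts to the street graph
            nx.set_node_attributes(G, crimes_per_node, "crimes")

            # (ii) Criminal Community Identification: modularity-based communities
            comms = community.greedy_modularity_communities(G)

            # (iii) Crime Analysis: rank communities by crimes per node
            report = []
            for i, nodes in enumerate(comms):
                total = sum(G.nodes[v].get("crimes", 0) for v in nodes)
                report.append({"community": i, "nodes": len(nodes),
                               "crimes": total, "density": total / len(nodes)})
            return sorted(report, key=lambda r: r["density"], reverse=True)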

    The mortality rates and the space-time patterns of John Snow’s cholera epidemic map

    Background: Snow's work on the Broad Street map is widely known as a pioneering example of spatial epidemiology. It lacks, however, two significant attributes required in contemporary analyses of disease incidence: the population at risk and the progression of the epidemic over time. Although this has been repeatedly pointed out in the literature, no systematic investigation of these two aspects had previously been carried out. Using a series of historical documents, this study constructs its own data to revisit Snow's study, examining the mortality rate at each street location and the space-time pattern of the cholera outbreak. Methods: This study brings together records from a series of historical documents and prepares data on the estimated number of residents at each house location as well as the space-time data of the victims; these are processed in GIS to facilitate spatial-temporal analysis. Mortality rates and the space-time pattern of the victims' records are explored using kernel density estimation and the network-based scan statistic, a recently developed method that detects significant concentrations of records, such as the date and place of victims, with respect to their distance from others along the street network. The results are visualised in map form on a GIS platform. Results: Data on mortality rates and the space-time distribution of the victims were collected from various sources and successfully merged and digitised, allowing the production of new map outputs and a new interpretation of the 1854 cholera outbreak in London, covering more cases than Snow's original report and adding new insights into their space-time distribution. They confirm that areas in the immediate vicinity of the Broad Street pump indeed suffered excessively high mortality rates, which had been suspected for the past 160 years but remained unconfirmed. No distinctive pattern was found in the space-time distribution of the victims' locations. Conclusions: The high mortality rates identified around the Broad Street pump are consistent with Snow's theory that cholera was transmitted through contaminated water. The absence of a clear space-time pattern also points to the water-borne, rather than the then popular belief of an air-borne, nature of cholera. The GIS data constructed in this study have academic value and can support further research on Snow's map.
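
    A minimal sketch of the two basic quantities discussed above, assuming a table of per-address deaths and estimated residents and an array of victim coordinates; the column names and the SciPy kernel density call are illustrative stand-ins for the GIS workflow used in the study.

        import numpy as np
        import pandas as pd
        from scipy.stats import gaussian_kde

        def mortality_rates(df):
            # df columns assumed: 'address', 'deaths', 'residents' (estimated population at risk)
            out = df.copy()
            out["rate"] = out["deaths"] / out["residents"].clip(lower=1)
            return out.sort_values("rate", ascending=False)

        def death_density(xy, grid_x, grid_y):
            # xy: (2, n) array of victim coordinates; returns a density surface on a grid
            kde = gaussian_kde(xy)
            gx, gy = np.meshgrid(grid_x, grid_y)
            return kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)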

    CrimeTelescope: crime hotspot prediction based on urban and social media data fusion

    Crime is a complex social issue impacting a considerable number of individuals within a society. Preventing and reducing crime is a top priority in many countries. Given limited policing and crime reduction resources, it is often crucial to identify effective strategies for deploying the available resources. Towards this goal, crime hotspot prediction has previously been suggested. Crime hotspot prediction leverages past data in order to identify geographical areas susceptible to hosting crimes in the future. However, most existing techniques in crime hotspot prediction solely use historical crime records to identify crime hotspots, ignoring the predictive power of other data such as urban or social media data. In this paper, we propose CrimeTelescope, a platform that predicts and visualizes crime hotspots based on a fusion of different data types. Our platform continuously collects crime data as well as urban and social media data on the Web. It then extracts key features from the collected data based on both statistical and linguistic analysis. Finally, it identifies crime hotspots by leveraging the extracted features, and offers visualizations of the hotspots on an interactive map. Based on real-world data collected from New York City, we show that combining different types of data can effectively improve crime hotspot prediction accuracy (by up to 5.2%), compared to classical approaches based on historical crime records only. In addition, we demonstrate the usability of our platform through a System Usability Scale (SUS) survey on a full prototype of CrimeTelescope.
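
    A minimal sketch of the fusion idea, assuming per-cell features built from historical crime counts, urban points of interest and a text-derived score from geotagged posts; the feature names and the logistic-regression model are assumptions, not the CrimeTelescope implementation.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def build_features(cells):
            # Each cell dict is assumed to carry a past crime count, a venue (POI) count,
            # and a keyword/sentiment score mined from geotagged social media posts.
            return np.array([[c["past_crimes"], c["poi_count"], c["post_score"]]
                             for c in cells])

        def fit_hotspot_model(cells, labels):
            # labels: 1 if the cell recorded a crime in the following period, else 0
            model = LogisticRegression(class_weight="balanced", max_iter=1000)
            model.fit(build_features(cells), labels)
            return model

        def predict_hotspots(model, cells, top_frac=0.1):
            # Rank cells by predicted risk and return the indices of the top fraction.
            risk = model.predict_proba(build_features(cells))[:, 1]
            k = max(1, int(len(cells) * top_frac))
            return np.argsort(risk)[::-1][:k]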

    Predictive Crime Mapping: Arbitrary Grids or Street Networks?

    OBJECTIVES: Decades of empirical research demonstrate that crime is concentrated at a range of spatial scales, including street segments. Further, the degree of clustering at particular geographic units remains noticeably stable and consistent; a finding that Weisburd (Criminology 53:133–157, 2015) has recently termed the ‘law of crime concentration at places’. Such findings suggest that the future locations of crime should—to some extent at least—be predictable. To date, methods of forecasting where crime is most likely to next occur have focused either on area-level or grid-based predictions. No studies of which we are aware have developed and tested the accuracy of methods for predicting the future risk of crime at the street segment level. This is surprising given that it is at this level of place that many crimes are committed and policing resources are deployed. METHODS: Using data for property crimes for a large UK metropolitan police force area, we introduce and calibrate a network-based version of prospective crime mapping [e.g. Bowers et al. (Br J Criminol 44:641–658, 2004)], and compare its performance against grid-based alternatives. We also examine how measures of predictive accuracy can be translated to the network context, and show how differences in performance between the two cases can be quantified and tested. RESULTS: Findings demonstrate that the calibrated network-based model substantially outperforms a grid-based alternative in terms of predictive accuracy, with, for example, approximately 20 % more crime identified at a coverage level of 5 %. The improvement in accuracy is highly statistically significant at all coverage levels tested (from 1 to 10 %). CONCLUSIONS: This study suggests that, for property crime at least, network-based methods of crime forecasting are likely to outperform grid-based alternatives, and hence should be used in operational policing. More sophisticated variations of the model tested are possible and should be developed and tested in future research.
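
    A minimal sketch of prospective hotspot scoring transferred to a street network, assuming a networkx graph with edge lengths: each segment's risk is a sum of contributions from recent crimes that decay with network distance and elapsed time, and predictive accuracy is read off as the hit rate at a chosen coverage level. The kernel form and parameters are assumptions, not the calibrated model from the study.

        import networkx as nx

        def segment_risk(G, crimes, now, bandwidth_m=400.0, window_days=60.0):
            # crimes: list of (node, day) events; returns a risk score per network node.
            risk = {v: 0.0 for v in G.nodes}
            for node, day in crimes:
                age = now - day
                if age < 0 or age > window_days:
                    continue
                near = nx.single_source_dijkstra_path_length(
                    G, node, cutoff=bandwidth_m, weight="length")
                for v, d in near.items():
                    # linear decay in both network distance and elapsed time
                    risk[v] += (1.0 - d / bandwidth_m) * (1.0 - age / window_days)
            return risk

        def hit_rate(risk, future_crimes, coverage=0.05):
            # Share of future crimes falling on the top `coverage` fraction of segments.
            ranked = sorted(risk, key=risk.get, reverse=True)
            top = set(ranked[:max(1, int(len(ranked) * coverage))])
            hits = sum(1 for node, _ in future_crimes if node in top)
            return hits / max(1, len(future_crimes))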

    The genius loci of crime: revealing associations in time and space

    Until recently, in most police services the only spatial and temporal analysis of crime was conducted by statisticians at force headquarters, with little or no regard for short-term or localised patterns of crime. In recent years there has been a move towards a more decentralised, proactive style of British policing focused at the police divisional and community level. This has left an intelligence void, as force-level analysis techniques are neither appropriate nor subtle enough to elicit meaningful information at a local level from the mass of crime data generated within the police service. This thesis reveals patterns in community-level crime that have not previously been recognised, since traditional techniques of spatial and temporal investigation tend to lack the necessary analytical power. Current policing considerations are recognised and the thesis concentrates on three aspects of police crime concern: accurate temporal analysis, repeat victimisation, and the identification of hotspots. A number of new techniques are presented, designed with the needs of a crime analyst at a divisional police station in mind: an individual who has until now lacked the necessary analytical tools to perform the role effectively.

    Project 7708 understanding heritage crime in Kent and Medway – a data analytical approach

    Our research was concerned with the geographical areas of Kent and Medway and involved the spatial and temporal analysis of ‘heritage-specific offences’, ‘targeted heritage crime’ and ‘crime within, at or close to heritage sites’. The crime data consisted of offence type and location details for the 1,122,180 crimes recorded by Kent Police during the period under study. The geographical data we utilised included the locations of Conservation Areas, Listed Buildings, Scheduled Monuments, Registered Parks and Gardens, Registered Battlefields, World Heritage Sites, Protected Wreck Sites and ‘Heritage at Risk’ sites in Kent and Medway. Our best estimates suggest that currently approximately one in five Listed Buildings and one in four Places of Worship in Kent and Medway experience some form of crime each year. About one in ten Scheduled Monuments suffers crime either within it or nearby. Just over half of Registered Parks and Gardens have one or more crimes a year within them. For Conservation Areas the proportion is (not unexpectedly) much larger, at closer to four in five. We utilised local Moran’s I to identify spatial clusters at a regional level using LSOA-level crime data. This revealed several LSOAs on the fringes of high-crime areas that could be particularly vulnerable to the spread of crime; heritage-specific locations within these areas could therefore be managed to halt the ‘spread’ of crime towards the periphery of the town. In Kent and Medway, Places of Worship (mostly Christian churches) are experiencing increasing numbers of crimes, and this has particularly been the case since around summer 2016. The rate of increase appears higher than that of all other crimes in the same period, both in general and at other heritage locations. There is clear statistical evidence that metal thefts from churches have also been increasing markedly since around summer 2016; the rate of increase appears higher than that of most other crimes. There is a statistically significant correlation between metal thefts from churches in Kent and Medway and both the price of lead and the price of mixed brass. Finally, we found that machine learning shows promise as a method of heritage crime prevention.
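
    A minimal NumPy sketch of the local Moran's I statistic referred to above, for LSOA-level crime counts with a row-standardised spatial weights matrix; the permutation-based significance test used in practice is omitted for brevity.

        import numpy as np

        def local_morans_i(y, W):
            # y: (n,) crime counts per LSOA; W: (n, n) spatial weights with rows summing to 1.
            z = y - y.mean()
            m2 = (z ** 2).sum() / len(y)     # second moment of the deviations
            lag = W @ z                      # spatially lagged deviations
            return (z / m2) * lag            # I_i > 0 where a value resembles its neighbours

    A large positive I_i for a high-crime LSOA surrounded by other high-crime LSOAs marks a 'high-high' cluster; LSOAs adjacent to such clusters are the kind of fringe locations discussed above.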

    Determining the optimal spatial and temporal thresholds that maximise the predictive accuracy of the Prospective Space-Time Scan Statistic (PSTSS) hotspot method

    The spatial and temporal thresholds (K and T) are two key parameters that control the performance of the prospective space-time scan statistic (PSTSS) hotspot method. This study proposes an objective-function approach in which the optimal values of K and T that maximise the mean hit rate (a measure of predictive accuracy) are determined. The proposed approach involves sweeping through a range of values defined for each parameter and monitoring their impact on the mean hit rate. A case study of crime data sets from the South Chicago area is presented, in which 100 consecutive one-day predictions are carried out. Two aspects of the results are significant. First, there is a trade-off between the predictive accuracy obtainable from PSTSS and the level of hotspot coverage. Second, K is found to have more influence on the accuracy than T: as K increases in size, the accuracy level decreases, whereas T has no notable impact on the accuracy, particularly when T ≥ 30 days. This study also demonstrates the distinctiveness of PSTSS as a hotspot method compared with other conventional hotspot methods. Lastly, it is argued that the approach demonstrated in this study is not only applicable to crime hotspot prediction, but could also be used in many other domains where the PSTSS technique is applied.
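
    A minimal sketch of the objective-function approach: candidate values of K and T are swept, each pair is scored by the mean hit rate over a series of one-day predictions, and the best-scoring pair is kept. Here `run_pstss` is a hypothetical hook standing in for whatever PSTSS implementation is available, and the hit-rate definition is a simplified stand-in for the coverage-based measure used in the study.

        import itertools

        def mean_hit_rate(predictions, actuals):
            # Average share of next-day crime locations that fall inside the predicted hotspots.
            rates = [len(set(p) & set(a)) / max(1, len(a)) for p, a in zip(predictions, actuals)]
            return sum(rates) / len(rates)

        def sweep_thresholds(run_pstss, days, actuals, K_values, T_values):
            best = None
            for K, T in itertools.product(K_values, T_values):
                preds = [run_pstss(day, spatial=K, temporal=T) for day in days]
                score = mean_hit_rate(preds, actuals)
                if best is None or score > best[0]:
                    best = (score, K, T)
            return best      # (mean hit rate, optimal K, optimal T)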