
    Mining complex trees for hidden fruit : a graph–based computational solution to detect latent criminal networks : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Information Technology at Massey University, Albany, New Zealand.

    The detection of crime is a complex and difficult endeavour. Public and private organisations – focusing on law enforcement, intelligence, and compliance – commonly apply the rational isolated actor approach premised on observability and materiality. This is manifested largely as conducting entity-level risk management, sourcing ‘leads’ from reactive covert human intelligence sources and/or proactive sources by applying simple rules-based models. Focusing on discrete observable and material actors simply ignores that criminal activity exists within a complex system deriving its fundamental structural fabric from the complex interactions between actors, with the most unobservable actors likely to be both criminally proficient and influential. The graph-based computational solution developed to detect latent criminal networks is a response to the inadequacy of the rational isolated actor approach, which ignores the connectedness and complexity of criminality. The core computational solution, written in the R language, consists of novel entity resolution, link discovery, and knowledge discovery technology. Entity resolution enables the fusion of multiple datasets with high accuracy (mean F-measure of 0.986 versus competitors’ 0.872), generating an expressive graph-based view of the problem. Link discovery comprises link prediction and link inference, enabling the high-performance detection (accuracy of ~0.8 versus relevant published models’ ~0.45) of unobserved relationships such as identity fraud. Knowledge discovery applies the “GraphExtract” algorithm to the fused graph to create a set of subgraphs representing latent functional criminal groups, and a mesoscopic graph representing how this set of criminal groups is interconnected. Latent knowledge is generated from a range of metrics, including the “Super-broker” metric and attitude prediction.
The computational solution has been evaluated on a range of datasets that mimic an applied setting, demonstrating a scalable (tested on ~18 million node graphs) and performant (~33 hours runtime on a non-distributed platform) solution that successfully detects relevant latent functional criminal groups in around 90% of cases sampled and enables contextual understanding of the broader criminal system through the mesoscopic graph and associated metadata. The augmented data assets generated provide a multi-perspective systems view of criminal activity that enables advanced, informed decision-making across the microscopic, mesoscopic, and macroscopic spectrum.
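The abstract above benchmarks entity resolution by mean F-measure (0.986 versus competitors’ 0.872). The thesis’s implementation is in R and is not reproduced here; the following is only a minimal Python sketch of how pairwise entity-resolution output is typically scored with the F-measure, with all record identifiers invented for illustration.

```python
def f_measure(true_pairs, predicted_pairs):
    """Harmonic mean of precision and recall over matched record pairs."""
    true_pairs, predicted_pairs = set(true_pairs), set(predicted_pairs)
    tp = len(true_pairs & predicted_pairs)      # correctly matched pairs
    if tp == 0:
        return 0.0
    precision = tp / len(predicted_pairs)       # share of predictions that were right
    recall = tp / len(true_pairs)               # share of true matches that were found
    return 2 * precision * recall / (precision + recall)

# Toy example: 4 true duplicate pairs, 5 predicted, 4 correct.
truth = {("a1", "a2"), ("b1", "b2"), ("c1", "c2"), ("d1", "d2")}
pred = {("a1", "a2"), ("b1", "b2"), ("c1", "c2"), ("d1", "d2"), ("e1", "x9")}
print(round(f_measure(truth, pred), 3))  # → 0.889
```

A mean F-measure would average this score across evaluation datasets, which is presumably how the headline 0.986 figure is obtained.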

    The will-to-incapacitate: An experiment in actuarial justice in the period between 1970 and 1987 in the United States.

    This thesis interrogates incapacitation as it developed in the 1970s and 1980s in the United States to conduct a genealogy of the conditions of emergence of actuarial justice (Foucault, 1981; Feeley and Simon, 1992; 1994) as it is enacted within this particular knowledge-power formation. Incapacitation is a penal rationale that concentrates on anticipating future crimes and preventing offenders from committing them, effectively prioritizing public safety above all other considerations. My mapping of incapacitation demonstrates that it is recursively performed along two mutually conditioning poles that are illustrative of Foucault’s account of biopolitics and security (1978, 2003, 2007). These poles are: technocratic penal managerialism, which regulates the actions of diverse agents and authorities as they participate in a program of reducing recidivism within a mobile population of offenders; and danger management of this distributed population of offenders, driven by a desire to anticipate and selectively incapacitate the most dangerous offenders. This analysis supports the mapping of actuarial justice provided by Feeley and Simon; however, my typology uses Galloway’s (2004) concept of protocol to extend and refine their diagram of actuarial power. Given the high levels of scientific uncertainty about the efficacy of selective incapacitation as a penal policy, and the poor predictive powers of actuarial instruments in accurately classifying high-rate offenders in the early 1980s, my analysis demonstrates how protocollary power established the rules for modulating the participation of autonomous and diverse agents enlisted within the distributed networks of actuarial justice to propel its movement forward; this was the birth of evidence-based penal policy and practice.
This protocol projects an ontological view of recidivism derived from criminal career research that filters and experiments with probabilistic actuarial codes or profiles of risk. These biopolitical codes regulate future research into advancing knowledge, predicting and controlling levels of dangerousness, and auditing of governmental performance in reducing recidivism, all of which are contingent upon the anticipatory longitudinal tracking of an aleatory population of offenders within the penal environment. Protocol is a biopolitical form of management that is central in the logistical control of this penal network and its nodes of operation and decision-making, constantly mining data for new possibilities. At the same time, I demonstrate that this will-to-knowledge uses its technocratic expertise to distort, exaggerate, or conceal difference in its struggle for authority, given high levels of uncertainty about recidivism and how to control it.

    Good Onlife Governance: On Law, Spontaneous Orders, and Design


    The crime kaleidoscope: A cross-jurisdictional analysis of place features and crime in three urban environments

    Research identifies various place features (e.g., bars, schools, public transportation stops) that generate or attract crime. What is less clear is how the spatial influence of these place features compares across relatively similar environments, even for the same crime. In this study, risk terrain modeling (RTM), a geospatial crime forecasting and diagnostic tool, is utilized to identify place features that increase the risk of robbery and their particular spatial influence in Chicago, Illinois; Newark, New Jersey; and Kansas City, Missouri. The results show that the risk factors for robbery are similar between environments, but not necessarily identical. Further, some factors were riskier for robbery than others and affected their surrounding landscape in different ways. Consistent with crime pattern theory, the results suggest that the broader organization of the environmental backcloth affects how constituent place features relate to and influence crime. Implications are discussed with regard to research and practice.
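The core move in RTM is to treat each place feature as a risk map layer with its own spatial influence, then compose the layers into a risk surface. The following is a deliberately simplified sketch of that idea using binary proximity layers on a toy grid; the feature types, coordinates, and influence radii are invented and do not come from the study.

```python
import math

def risk_surface(cells, feature_layers):
    """Combine binary proximity layers into a composite risk score per cell.

    cells: list of (x, y) grid-cell centroids.
    feature_layers: dict mapping a factor name to (points, influence_radius);
    a cell scores 1 for a layer if any feature of that factor lies within
    the layer's radius, so higher totals mean more co-located risk factors.
    """
    scores = {}
    for cx, cy in cells:
        score = 0
        for points, radius in feature_layers.values():
            if any(math.hypot(cx - px, cy - py) <= radius for px, py in points):
                score += 1      # cell falls inside this factor's spatial influence
        scores[(cx, cy)] = score
    return scores

# Toy landscape: two factors with different spatial influence, echoing the
# finding that the same feature type can affect its surroundings differently.
layers = {
    "bars": ([(0.0, 0.0)], 1.0),      # short-range influence
    "stops": ([(2.0, 0.0)], 2.5),     # longer-range influence
}
print(risk_surface([(0.5, 0.0), (3.0, 0.0)], layers))
```

A full RTM analysis would instead test each layer's operationalization (proximity versus density) and distance band for statistical significance before composing the surface; this sketch only shows the layer-composition step.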

    Boundaries and Policing: Space, Jurisdictions, and Roles in the Collection of Official Crime Data

    The Uniform Crime Reporting (UCR) Program is a law enforcement statistical system open to unreported information due to its voluntary nature. As such, there needs to be a valid and accepted means to estimate official reports of crime for those different levels of geography where reporting may be incomplete. Current methods of imputing and modeling UCR data, which have not been updated since the 1960s, are based upon conceptualizations of law enforcement agencies that may no longer be valid. These older models do not appropriately represent the law enforcement assessment of space and place and its effects on discretionary recording behavior. The number of specialized agencies that share jurisdiction and population with primary law enforcement agencies has increased since early data modeling techniques were developed around the 1960s. This study explores the connection between policing and the collection of crime data to advance our understanding of how differences among types of law enforcement may impact the discretionary decision to record data. To explore this topic, I have divided this study into three papers touching on differing dimensions of place, scale, and uncertainty connected to the recording of law enforcement data. The data for these papers include national UCR Program data, as well as calls for service and recorded incident data from two law enforcement agencies in the mid-South—Knoxville Police Department and the University of Tennessee Police Department. Firstly, this research explores the influence of agency attributes to assess their possible impact on the treatment of missing data. The coefficient of variation (CV) is used to measure the internal variation of reported crime within various groups of agencies. The average CVs calculated with and without specialized agencies are compared using a jackknifing technique to test whether the presence of specialized agencies increases the internal variation within the group.
The comparison demonstrates that eliminating specialized agencies from the strata has a statistically significant effect on reducing internal variation for property crimes. For violent offenses, however, the results are more modest. While the average CV for violent crime does decrease with the elimination of specialized agencies, the improvements are not statistically significant. The results from this research point to a greater need to address these changing circumstances by incorporating the diversity of law enforcement agency types. Secondly, although researchers are interested in using calls for service (CFS) as a proxy for recorded incident information as more of this type of data is made available through open data initiatives, the assumption that CFS can serve as a proxy for incident information in spatial analysis is not supported by the evidence. Instead, there is some indication that law enforcement activities are mediated by the agency’s goals for its data, such as intelligence-led policing or fulfillment of Clery Act reporting, thus affecting the recording of incident information. Using data from two different types of law enforcement agency within the same community, CFS and incident reports for property crimes in April 2014 were tested for spatial association using both the Cross-K function and the Co-location Quotient. Findings from this study show there is a modest amount of detectable clustering of CFS for the agency that fits a model of traditional municipal law enforcement. However, the law enforcement agency serving a large university campus did not show any detectable spatial association for these events. The findings suggest that in the movement towards using open data, researchers will need to take greater care in the selection of data to understand whether underlying spatial assumptions about the data can be supported.
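The first paper's comparison rests on two standard quantities: the coefficient of variation within a stratum of agencies, and delete-one jackknife replicates of the average CV to test the effect of removing agencies. The following is a minimal sketch of both; the agency counts are invented for illustration and are not the study's data.

```python
import statistics

def coefficient_of_variation(counts):
    """CV: standard deviation of reported counts relative to their mean."""
    mean = statistics.mean(counts)
    return statistics.pstdev(counts) / mean if mean else float("inf")

def jackknife_means(values):
    """Delete-one jackknife replicates of the mean, used to gauge how
    sensitive a stratum's average CV is to any single member."""
    n, total = len(values), sum(values)
    return [(total - v) / (n - 1) for v in values]

# Toy stratum: similar municipal agencies, then the same stratum with a
# specialized agency whose reporting pattern is very different.
municipal = [100, 110, 95, 105]
with_special = municipal + [5]      # specialized agency inflates variation
print(round(coefficient_of_variation(municipal), 3))
print(round(coefficient_of_variation(with_special), 3))
```

Comparing the two CVs shows the direction of the paper's finding: dropping the atypical specialized agency shrinks internal variation within the stratum, and the jackknife replicates provide the variance estimate needed to judge whether that reduction is statistically significant.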
Thirdly, an increasing quantity of data is currently being made available by law enforcement agencies, but frequently that data is not at a consistent level of areal aggregation and scale. Factors such as the Modifiable Areal Unit Problem (MAUP) and the Uncertain Geographic Context Problem (UGCoP) make rectifying differing scales problematic. Central to this problem are the dynamics of recording crime data and whether law enforcement activity—specifically the concept of the patrol officer in a boundary role—is a key influence that should be accounted for in crime data models. With data from a midsized, southern municipal police department, two dasymetric allocation techniques using street networks and street networks weighted by calls for service are used to test potential improvements on the scale and aggregation problem through the introduction of law enforcement activity into allocation models for recorded crime data. Results demonstrate that the introduction of law enforcement activity—especially officer-initiated activity—improves the overall fit of the allocation of recorded crime into smaller subjurisdictional units. In addition, there is modest evidence to advocate for the use of law enforcement-generated subjurisdictional units (such as a precinct or beat) as opposed to population-based Census Tracts. These findings suggest that the production of crime statistics is subject to influences originating from law enforcement agency policy and the recording behavior of its officers. The findings of the three studies inform important discussions in the geographic community on the heterogeneous nature of law enforcement. More explicitly, combining the conclusions of these three papers contributes to an evolving understanding of the representations of place by geographic information science (GISc) and criminology, the construction of place through the roles and behaviors of individuals, and the increasing use of “Big Geodata”.
Future research with data collected from official police activities should consider the degree of uncertainty introduced by the nature of the activities themselves — especially given the growing use of, influence of, and reliance on georeferenced data produced by individuals not particularly informed about the nuances of geography.
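The third paper's core operation is dasymetric allocation: splitting an areally aggregated crime count across smaller units in proportion to an ancillary variable rather than assuming a uniform spread. The following is a minimal sketch under stated assumptions: the beat names and weights (street length scaled by a calls-for-service factor) are invented, and this is only the proportional-allocation step, not the study's fit evaluation.

```python
def dasymetric_allocate(area_count, subunit_weights):
    """Split an areal crime count across subunits in proportion to an
    ancillary weight (e.g., street length, optionally scaled by a
    calls-for-service activity factor)."""
    total = sum(subunit_weights.values())
    return {unit: area_count * w / total for unit, w in subunit_weights.items()}

# Toy tract with 60 recorded offenses and three beats; each weight is
# street length (km) times a hypothetical officer-activity factor.
weights = {
    "beat_A": 4.0 * 1.5,    # busy corridor: more streets, more CFS activity
    "beat_B": 4.0 * 1.0,
    "beat_C": 2.0 * 1.0,
}
print(dasymetric_allocate(60, weights))  # → {'beat_A': 30.0, 'beat_B': 20.0, 'beat_C': 10.0}
```

Swapping the activity factor in and out of the weights mirrors the study's comparison: if allocations weighted by law enforcement activity fit observed subjurisdictional counts better than street length alone, that supports treating officer behavior as part of the crime data model.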