
    Bad moon on the rise? Lunar cycles and incidents of crime

    Popular cultures in Western societies have long espoused the notion that phases of the moon influence human behavior. In particular, there is a common belief that the full moon increases incidents of aberrant, deviant, and criminal behavior. Using police, astronomical, and weather data from a major southwestern American city, this study assessed whether lunar cycles were related to rates of reported crime. The findings fail to support the popular lore that lunar phase influences the volume of crime reported to the police. Future research examining qualitative rather than quantitative aspects of this problem may further inform the understanding of whether lunar cycles appreciably influence demands for policing services.

    Murder Clearance Rates: Guest Editors' Introduction

    The journal Homicide Studies has long been devoted to empirical studies addressing issues pertinent to the study of homicide and violence. Although a large variety of theoretical papers, research summaries, and public policy reviews of issues concerning homicide and violence have been explored in the journal over the past 10 years, at least one issue has garnered relatively little attention—the law enforcement response to homicide. This special issue attempts to begin filling this gap in the literature.

    Clearing Murders: Is It about Time?

    This study uses data from the National Incident-Based Reporting System (NIBRS) to explore the impact of model selection on determining the association of victim-level and incident-level factors with the likelihood of homicide clearance. We compare traditional operationalizations of clearance rates as well as time to clearance as dependent variables in examinations of correlates of solvability in homicide cases. Using a different approach than most other analyses of this problem, the results affirm the consistency of some effects but also reveal important differences when time is factored into the model. Implications for analyses of the efficiency and effectiveness of police response to homicide, cold-case analyses, and other strategies for solving crime are discussed.


    Youthful Familicidal Offenders: Targeted Victims, Planned Attacks

    A nonrandom national sample of 16 familicides, involving 19 offenders (ages 14 to 21 years) who either killed or made a serious attempt to kill their families, was studied. The majority of offenders were Caucasian (78.91%) males (84.21%) with interpersonal family conflicts stemming from parental control, substance use, or physical violence. Prior to the murders, 50% of the offenders told others of their intent to kill their families. All 42 reported victims were specifically targeted, and most of the homicides were planned shooting attacks (75%) rather than spontaneous eruptions. Immediately following the homicides, 75% of the offenders stole money from their families, and in 50% of the cases they called their friends either to report the murders or to plan leisure activities. All offenders were immediate suspects, and 81.25% confessed to the homicides. Implications for furthering our understanding of this group of young offenders are offered.

    Performance of second order particle-in-cell methods on modern many-core architectures

    The emergence of modern many-core architectures offering an extreme level of parallelism makes methods that were previously infeasible due to computational expense now achievable. Particle-in-Cell (PIC) codes often fail to fully leverage this increased performance potential because of their heavy use of memory bandwidth. Higher order PIC methods may offer a solution by improving simulation accuracy significantly in exchange for an increase in computational intensity compared to their first order counterparts. This greater expense is accompanied by only a minor increase in the memory throughput required during the simulation. In this presentation we show the performance of a second order PIC algorithm. Our implementation uses second order finite elements and particles that are represented by a collection of surrounding ghost particles. These ghost particles each have associated weights and offsets around the true particle position and therefore represent a charge distribution. We test our PIC implementation against a first order algorithm on various modern compute architectures, including Intel’s Knights Landing (KNL) and NVIDIA’s Tesla P100. Our preliminary results show the viability of second order methods for PIC applications on these architectures when compared to previous generations of many-core hardware. Specifically, we see an order of magnitude improvement in performance for second order methods between the Kepler and Pascal GPU architectures, despite only a 4× improvement in theoretical peak performance between the architectures. Although these initial results show a large increase in runtime over first order methods, we hope to show improved scaling behaviour and increased simulation accuracy in the future.
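    The ghost-particle deposition described in the abstract above can be sketched in a few lines. The following 1-D Python is a hypothetical illustration only: the function names, the linear kernel, and the example cloud are assumptions, not the authors' implementation.

    ```python
    def deposit_linear(grid, x, q, dx):
        """First-order (linear) deposition of one delta-shaped particle:
        its charge q is shared between the two nearest grid points."""
        i = int(x // dx)
        f = x / dx - i
        grid[i] += q * (1.0 - f)
        grid[i + 1] += q * f

    def deposit_ghosts(grid, x, q, dx, offsets, weights):
        """Deposit one 'true' particle as a cloud of ghost particles with
        fixed offsets and weights. Extra arithmetic is done per particle,
        but the touched grid region (hence memory traffic) stays local."""
        for off, w in zip(offsets, weights):
            deposit_linear(grid, x + off, q * w, dx)

    # Hypothetical 3-ghost cloud around the true position (weights sum to 1).
    grid = [0.0] * 16
    offsets = [-0.05, 0.0, 0.05]
    weights = [0.25, 0.5, 0.25]
    deposit_ghosts(grid, x=0.8, q=1.0, dx=0.1, offsets=offsets, weights=weights)
    ```

    Because the ghost weights sum to one, the cloud deposits the particle's full charge while each ghost reuses the cheap first-order kernel, which is the trade of higher arithmetic intensity for roughly unchanged memory traffic that the abstract describes.
    
    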


    The Third Gravitational Lensing Accuracy Testing (GREAT3) Challenge Handbook

    The GRavitational lEnsing Accuracy Testing 3 (GREAT3) challenge is the third in a series of image analysis challenges, with the goal of testing and facilitating the development of methods for analyzing astronomical images that will be used to measure weak gravitational lensing. This measurement requires extremely precise estimation of very small galaxy shape distortions in the presence of far larger intrinsic galaxy shapes and distortions due to the blurring kernel caused by the atmosphere, telescope optics, and instrumental effects. The GREAT3 challenge is posed to the astronomy, machine learning, and statistics communities, and includes tests of three specific effects of immediate relevance to upcoming weak lensing surveys, two of which have never been tested in a community challenge before. These effects are realistically complex galaxy models based on high-resolution imaging from space; a spatially varying, physically motivated blurring kernel; and the combination of multiple different exposures. To facilitate entry by people new to the field, and for use as a diagnostic tool, the simulation software for the challenge is publicly available, though the exact parameters used for the challenge are blinded. Sample scripts to analyze the challenge data using existing methods will also be provided. See http://great3challenge.info and http://great3.projects.phys.ucl.ac.uk/leaderboard/ for more information.

    Beyond the social production of homicide rates: Extending social disorganization theory to explain homicide case outcomes

    This paper examines the intersection of social disorganization at the community level with responses to crime. In contrast to other works examining the impact of social disorganization on the production of crime rates, we examine the role of social disorganization theory in responses to crime (i.e., the arrest and conviction of perpetrators). To examine these dynamics, we use law enforcement data from Cleveland, Ohio, to explore the role of social disorganization in the ability of police and the courts to respond to homicide cases. Such an examination not only suggests how far the law extends in community responses to homicide but also extends social disorganization theory beyond its established role in explaining the production of crime rates.

    Higher-order particle representation for particle-in-cell simulations

    In this paper we present an alternative approach to the representation of simulation particles for unstructured electrostatic and electromagnetic PIC simulations. In our modified PIC algorithm we represent particles as having a smooth shape function limited by some specified finite radius. A unique feature of our approach is the representation of this shape by surrounding simulation particles with a set of virtual particles with delta shape, with fixed offsets and weights derived from Gaussian quadrature rules and the value of this radius. As the virtual particles are purely computational, they provide the additional benefit of increasing the arithmetic intensity of traditionally memory-bound particle kernels. The modified algorithm is implemented within Sandia National Laboratories' unstructured EMPIRE-PIC code, for electrostatic and electromagnetic simulations, using periodic boundary conditions. We show results for a representative set of benchmark problems, including an electron orbit, a transverse electromagnetic wave propagating through a plasma, numerical heating, and a plasma slab expansion. Good error reduction across all of the chosen problems is achieved as the particles are made progressively smoother, with the optimal particle radius appearing to be problem-dependent.
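    The Gaussian-quadrature construction of offsets and weights mentioned above can be illustrated with a small sketch. The code below is a hypothetical 1-D version that assumes a uniform shape over the radius (a real shape function would modulate the weights) and uses 3-point Gauss-Legendre quadrature; it is not the EMPIRE-PIC implementation.

    ```python
    import math

    def virtual_particles(x0, radius):
        """Represent one simulation particle at x0 as a cloud of virtual
        delta-shaped particles, with offsets and weights taken from
        3-point Gauss-Legendre quadrature mapped onto [x0 - radius,
        x0 + radius]. (Hypothetical 1-D sketch.)"""
        # 3-point Gauss-Legendre nodes and weights on [-1, 1].
        nodes = [-math.sqrt(3.0 / 5.0), 0.0, math.sqrt(3.0 / 5.0)]
        weights = [5.0 / 9.0, 8.0 / 9.0, 5.0 / 9.0]
        total = sum(weights)  # = 2, the length of the interval [-1, 1]
        # Map reference nodes to physical offsets around the true position.
        positions = [x0 + radius * n for n in nodes]
        # Normalise so the virtual particles carry the full charge.
        charges = [w / total for w in weights]
        return positions, charges

    positions, charges = virtual_particles(x0=0.5, radius=0.1)
    ```

    The normalised weights sum to one, so the cloud conserves the particle's total charge; a higher quadrature order would simply add more virtual particles per true particle.
    
    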