
    Improving the Creation of Hot Spot Policing Patrol Routes: Comparing Cognitive Heuristic Performance to an Automated Spatial Computation Approach

    Hot spot policing involves the deployment of police patrols to places where high levels of crime have previously concentrated. The creation of patrol routes in these hot spots is mainly a manual process that involves using the results from an analysis of spatial patterns of crime to identify the areas and draw the routes that police officers are required to patrol. In this article we introduce a computational approach for automating the creation of hot spot policing patrol routes. The computational techniques we introduce created patrol routes that covered areas with higher levels of crime than an equivalent manual approach and covered crime hot spots more efficiently. Although the evidence on hot spot policing interventions shows they are effective in decreasing crime, the findings from the current research suggest that the impact of these interventions can potentially be greater when the computational approaches we introduce are used to create the patrol routes.
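
    The abstract does not spell out the route-construction algorithm, so the following is only a minimal, hypothetical sketch of what an automated spatial computation approach can look like: it grows a patrol route over a grid of crime counts by always stepping to the adjacent unvisited cell with the most recorded crime. The grid representation, function name and budget parameter are illustrative assumptions, not the paper's method.

```python
# Hypothetical sketch of automated patrol-route creation: greedily grow a
# route across a crime-count grid. Not the paper's actual algorithm.
from typing import List, Tuple

Cell = Tuple[int, int]

def greedy_patrol_route(crime: List[List[int]], budget: int) -> List[Cell]:
    """Grow a route from the highest-crime cell, stepping to the adjacent
    unvisited cell with the most recorded crime until the budget is spent."""
    rows, cols = len(crime), len(crime[0])
    # Start at the single highest-crime cell (the strongest hot spot).
    start = max(((r, c) for r in range(rows) for c in range(cols)),
                key=lambda rc: crime[rc[0]][rc[1]])
    route, visited = [start], {start}
    while len(route) < budget:
        r, c = route[-1]
        steps = [(r + dr, c + dc)
                 for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                 if 0 <= r + dr < rows and 0 <= c + dc < cols
                 and (r + dr, c + dc) not in visited]
        if not steps:          # boxed in by visited cells or edges: stop early
            break
        nxt = max(steps, key=lambda rc: crime[rc[0]][rc[1]])
        route.append(nxt)
        visited.add(nxt)
    return route

# Example: weekly crime counts on a 4x4 grid, patrol budget of six cells.
grid = [[1, 2, 1, 0],
        [3, 9, 4, 1],
        [2, 8, 7, 2],
        [0, 3, 5, 1]]
print(greedy_patrol_route(grid, 6))   # [(1, 1), (2, 1), (2, 2), ...]
```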

    Determinants of the income velocity of money in Portugal: 1891–1998

    This paper performs a long-run time series analysis of the behaviour of the income velocity of money in Portugal between 1891 and 1998 by assessing the importance of both macroeconomic and institutional factors and looking for particularities in the Portuguese case. We estimate two cointegration vectors for the income velocity of money, macroeconomic variables and institutional variables. One of these vectors reflects the relationship between income velocity and macroeconomic variables, while the other reflects the relationship between income velocity and institutional variables. Moreover, a regression analysis reveals the usual U-shaped pattern, with a relatively late inflection point located around 1970, which is consistent with the Spanish case. It is further noted that this is a feature of countries with a late economic and institutional development process.
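
    As an illustration of the kind of estimation the abstract describes, the sketch below runs a Johansen cointegration test on a velocity series together with one macroeconomic and one institutional series, using statsmodels. The file name, column names and lag settings are hypothetical placeholders, not the paper's data or specification.

```python
# Illustrative sketch (not the paper's code): Johansen test for the number of
# cointegrating vectors among income velocity, a macroeconomic variable and an
# institutional variable. File and column names are hypothetical.
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen

df = pd.read_csv("portugal_1891_1998.csv")          # hypothetical data file
endog = df[["velocity", "real_income", "monetization"]]

# det_order=0: constant term; k_ar_diff=1: one lagged difference (assumed).
result = coint_johansen(endog, det_order=0, k_ar_diff=1)

# Trace statistics vs. 95% critical values: the count of statistics exceeding
# their critical value estimates the cointegration rank (two, in the paper).
for r, (trace, crit) in enumerate(zip(result.lr1, result.cvt[:, 1])):
    print(f"rank <= {r}: trace = {trace:.2f}, 95% crit = {crit:.2f}")

# Columns of result.evec are the estimated cointegrating vectors.
print(result.evec[:, :2])
```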

    Evaluation of qPCR-Based Assays for Leprosy Diagnosis Directly in Clinical Specimens

    The increased reliability and efficiency of the quantitative polymerase chain reaction (qPCR) makes it a promising tool for performing large-scale screening for infectious disease among high-risk individuals. To date, no study has evaluated the specificity and sensitivity of different qPCR assays for leprosy diagnosis using a range of clinical samples that could bias molecular results, such as difficult-to-diagnose cases. In this study, qPCR assays amplifying different M. leprae gene targets, sodA, 16S rRNA, RLEP and Ag 85B, were compared for leprosy differential diagnosis. qPCR assays were performed on frozen skin biopsy samples from a total of 62 patients: 21 untreated multibacillary (MB) and 26 untreated paucibacillary (PB) leprosy patients, as well as 10 patients suffering from other dermatological diseases and 5 healthy donors. To develop standardized protocols and to overcome the bias resulting from chromosome count cutoffs arbitrarily defined for different assays, decision tree classifiers were used to estimate optimum cutoffs and to evaluate the assays. As a result, we found a decreasing sensitivity for the Ag 85B (66.1%), 16S rRNA (62.9%) and sodA (59.7%) optimized assay classifiers, but with similar maximum specificity for leprosy diagnosis. Conversely, the RLEP assay proved to be the most sensitive (87.1%). Moreover, the RLEP assay was positive for 3 samples from patients not originally diagnosed as having leprosy, but who developed leprosy 5–10 years after the collection of the biopsy. In addition, 4 other samples from patients clinically classified as non-leprosy showed detectable chromosome counts by the RLEP assay, suggesting that those patients either had leprosy that was misdiagnosed or a subclinical state of leprosy. Overall, these results are encouraging and suggest that the RLEP assay could be useful as a sensitive diagnostic test to detect M. leprae infection before major clinical manifestations.
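
    As a hedged illustration of how a decision tree classifier can estimate an optimum cutoff for a single assay, the sketch below fits a depth-1 tree (a stump) to chromosome counts and reports sensitivity and specificity at the learned threshold. All values are invented placeholders, not the study's data.

```python
# Illustrative sketch: a depth-1 decision tree ("stump") estimates an optimal
# chromosome-count cutoff for one qPCR assay, as the study does per target.
# The arrays below are made-up placeholders, not study data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

counts = np.array([0, 2, 5, 40, 120, 800, 0, 1, 3, 60]).reshape(-1, 1)
leprosy = np.array([0, 0, 0, 1, 1, 1, 0, 0, 1, 1])   # 1 = leprosy patient

stump = DecisionTreeClassifier(max_depth=1).fit(counts, leprosy)
cutoff = stump.tree_.threshold[0]                     # learned count cutoff

pred = (counts.ravel() > cutoff).astype(int)          # positive above cutoff
sensitivity = (pred[leprosy == 1] == 1).mean()
specificity = (pred[leprosy == 0] == 0).mean()
print(f"cutoff={cutoff:.1f}, sensitivity={sensitivity:.2f}, "
      f"specificity={specificity:.2f}")
```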

    Brane-World Gravity

    The observable universe could be a 1+3-surface (the "brane") embedded in a 1+3+d-dimensional spacetime (the "bulk"), with Standard Model particles and fields trapped on the brane while gravity is free to access the bulk. At least one of the d extra spatial dimensions could be very large relative to the Planck scale, which lowers the fundamental gravity scale, possibly even down to the electroweak (∼ TeV) level. This revolutionary picture arises in the framework of recent developments in M theory. The 1+10-dimensional M theory encompasses the known 1+9-dimensional superstring theories, and is widely considered to be a promising potential route to quantum gravity. At low energies, gravity is localized at the brane and general relativity is recovered, but at high energies gravity "leaks" into the bulk, behaving in a truly higher-dimensional way. This introduces significant changes to gravitational dynamics and perturbations, with interesting and potentially testable implications for high-energy astrophysics, black holes, and cosmology. Brane-world models offer a phenomenological way to test some of the novel predictions and corrections to general relativity that are implied by M theory. This review analyzes the geometry, dynamics and perturbations of simple brane-world models for cosmology and astrophysics, mainly focusing on warped 5-dimensional brane-worlds based on the Randall–Sundrum models. We also cover the simplest brane-world models in which 4-dimensional gravity on the brane is modified at low energies: the 5-dimensional Dvali–Gabadadze–Porrati models. Then we discuss co-dimension two branes in 6-dimensional models.
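
    For orientation, the warped five-dimensional geometry underlying the Randall–Sundrum models takes the standard form below (quoted from the general literature, not from this review), where ℓ is the curvature scale of the anti-de Sitter bulk:

```latex
% Randall--Sundrum warped metric: 4D Minkowski slices warped along the extra
% dimension y, with \ell the curvature scale of the AdS_5 bulk.
\[
  ds^2 = e^{-2|y|/\ell}\,\eta_{\mu\nu}\,dx^\mu dx^\nu + dy^2
\]
```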

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise the description of findings and reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets. Methods Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall’s tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis. Results A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was found to be most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was found to be a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001). Conclusion We have shown that an operative difficulty scale can standardise the description of operative findings by multiple grades of surgeons to facilitate audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
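
    As a minimal, hypothetical illustration of the AUROC computation the study reports, the sketch below treats the ordinal Nassar grade itself as the score for a dichotomous outcome such as conversion to open surgery. The arrays are invented placeholders, not the CholeS or single-surgeon data.

```python
# Illustrative sketch: AUROC of an ordinal difficulty grade (1-5) for a
# dichotomous outcome. Data are made-up placeholders, not study data.
import numpy as np
from sklearn.metrics import roc_auc_score

grade = np.array([1, 1, 2, 2, 3, 3, 4, 4, 5, 5])       # Nassar grade per patient
converted = np.array([0, 0, 0, 0, 0, 1, 0, 1, 1, 1])   # 1 = converted to open

# The grade serves directly as the score; AUROC measures how well higher
# grades rank converted cases above non-converted ones.
print(f"AUROC = {roc_auc_score(converted, grade):.3f}")
```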

    Observation of associated near-side and away-side long-range correlations in √sNN=5.02  TeV proton-lead collisions with the ATLAS detector

    Two-particle correlations in relative azimuthal angle (Δϕ) and pseudorapidity (Δη) are measured in √sNN=5.02 TeV p+Pb collisions using the ATLAS detector at the LHC. The measurements are performed using approximately 1 μb⁻¹ of data as a function of transverse momentum (pT) and the transverse energy (ΣETPb) summed over 3.1<η<4.9 in the direction of the Pb beam. The correlation function, constructed from charged particles, exhibits a long-range (2<|Δη|<5) “near-side” (Δϕ∼0) correlation that grows rapidly with increasing ΣETPb. A long-range “away-side” (Δϕ∼π) correlation, obtained by subtracting the expected contributions from recoiling dijets and other sources estimated using events with small ΣETPb, is found to match the near-side correlation in magnitude, shape (in Δη and Δϕ) and ΣETPb dependence. The resultant Δϕ correlation is approximately symmetric about π/2, and is consistent with a dominant cos 2Δϕ modulation for all ΣETPb ranges and particle pT.
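
    Such a modulation is conventionally quantified through the standard Fourier decomposition of the long-range two-particle correlation (the general form used in these analyses, not an equation quoted from this paper):

```latex
% Fourier decomposition of the two-particle azimuthal correlation; a dominant
% v_{2,2} harmonic corresponds to the reported cos(2*Delta-phi) modulation.
\[
  C(\Delta\phi) \propto 1 + 2\sum_{n=1}^{\infty} v_{n,n}\cos(n\,\Delta\phi)
\]
```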
