524 research outputs found

    Improving Data Infrastructure to Reduce Firearms Violence

    In the fall of 2020, Arnold Ventures, a philanthropy dedicated to maximizing opportunity and minimizing injustice, and NORC at the University of Chicago, an objective, nonpartisan research institution, released the Blueprint for a US Firearms Infrastructure (Roman, 2020). The Blueprint is the consensus report of an expert panel of distinguished academics, trailblazing practitioners, and government leaders. It describes 17 critical reforms required to modernize how data about firearms violence of all types (intentional, accidental, and self-inflicted) are collected, integrated, and disseminated. This project, which is also supported by Arnold Ventures, takes the conceptual priorities described in the Blueprint and proposes specific new steps for implementation.

    The first step in building a better firearms data infrastructure is to acknowledge where we currently stand. In The State of Firearm Data in 2019 (Roman, 2019), the expert panel found that while a substantial number of data sources collect data on firearms violence, existing datasets and data collections are limited, particularly around intentional injuries. There is some surveillance data, but health data on firearms injuries are kept separately from data on crimes, and there are few straightforward ways to link those data. Data that provide context for a shooting (where the event took place, and what the relationship was between victim and shooter) are not available alongside data on the nature of injuries. Valuable data collections have been discontinued, data are restricted by policy, important data are not collected, data are often difficult to access, and contemporary data are often not released in a timely fashion or are not available outside of specialized settings. As a result, researchers face vast gaps in knowledge and are unable to leverage existing data to build the evidence base necessary to adequately answer key policy questions and inform firearms policymaking.

    In the Blueprint, the expert panel developed a set of recommendations organized around a reconceptualization of how data are collected and who collects them. The broad themes from the Blueprint are as follows:

    - Almost all surveillance data in health and criminal justice are generated locally. It is a high priority to provide information, technical assistance, implementation supports, and funding to state and local governments to improve their collections.
    - Comprehensive monitoring of all federal data collections is needed to ensure that important data elements are being collected, data gaps are being addressed, and quality issues are quickly resolved.
    - Timely dissemination of key data is important, including the development of guidelines to ensure consistency across collections and the provision of resources to speed reporting for collections with historical delays.
    - Strategic communication about the purpose and use of data needs to improve, directed at federal agencies, researchers, and the general public.

    The current report builds on the Blueprint by developing implementation guidance for key recommendations. Where the Blueprint included actionable recommendations, such as naming discontinued surveys that should be resurrected, this report develops specific recommendations for implementation. The report is centered on three topics that were the highest priority for the expert panel but that required additional research before guidance could be disseminated. The research findings from that additional investigation are reported here, and recommendations to facilitate implementation are described. The three topic areas are as follows:

    - The creation of a nonfatal firearms injury database
    - Increasing the quality, availability, and usefulness of firearms data for research and policy
    - Practical steps for building state capacity and infrastructure to use data for evidence-based decision-making

    Transmission in double quantum dots in the Kondo regime: Quantum-critical transitions and interference effects

    We study the transmission through a double quantum-dot system in the Kondo regime. An exact expression for the transmission coefficient in terms of fully interacting many-body Green's functions is obtained. By mapping the system onto an effective Anderson impurity model, one can determine the transmission using numerical renormalization-group methods. The transmission exhibits signatures of the different Kondo regimes of the effective model, including an unusual Kondo phase with split peaks in the spectral function, as well as a pseudogapped regime exhibiting a quantum critical transition between Kondo and unscreened phases.
    Comment: 4 pages, 3 figures; submitted to Physica E (EP2DS-17 proceedings, oral presentation), updated Ref

    Creating useful integrated data sets to inform public policy

    The costs of traditional primary data collection have risen dramatically over the past decade. For example, the cost of the decennial census of population and housing, conducted by the U.S. Census Bureau, rose from $6 billion in 2000 to an estimated $14.5 billion in 2010. Other surveys and censuses conducted by the government have also risen in cost. Yet some of the same data are collected by other federal agencies and contained in administrative records such as Medicare and tax records. Sharing administrative record data between federal agencies has the potential to increase the information available to policy makers while saving money. Significant policy issues related to safeguarding privacy and confidentiality, as well as questions about data quality, have created barriers that slow down or stop record sharing. But do the barriers address real or perceived problems? This research used two exploratory case studies to examine the creation of integrated data sets among three government agencies: the Internal Revenue Service (IRS), the U.S. Census Bureau (Census), and the Centers for Medicare and Medicaid Services (CMS). It identified the policy issues raised by the creation of such data pools and examined how these issues are approached in a decentralized governmental statistical system, such as that found in the United States. The creation of new, combined data sets and the related policy issues were examined through five dimensions: legal, technical, organizational, perceptual, and human. The case studies addressed the following research questions related to the sharing of administrative records between U.S. federal agencies: (1) What is the life cycle flow of administrative records data on individuals and businesses between the IRS, CMS, and the Census Bureau? (2) What are the significant issues, related to the need to protect privacy and confidentiality, that have arisen as a result of sharing administrative records? (3) What insights and potential solutions can be learned from the experience of those who have worked within the federal statistical system that would help address the significant data-sharing issues that have been identified?
    The study found that each agency involved in sharing administrative records is governed by a different set of statutes and regulations that only partially overlap. This patchwork of laws and regulations greatly slows the initiation of record-sharing projects. Participants at the agencies believe that privacy safeguards are adequate and effective, and they expend significant effort to ensure that data are protected as required by law and by interagency agreements. Each agency has its own distinct internal processes for approving and tracking record-sharing projects. There are no mature government-wide shared processes or criteria for reviewing or approving projects involving multiple agencies. The current processes are slow and burdensome and discourage the initiation of new projects.
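    The kind of cross-agency integration described above is often implemented by joining extracts on a pseudonymized shared key rather than on raw identifiers. The sketch below is a minimal, hypothetical illustration of that idea: the agency names, field names, salt, and records are all invented for the example and do not reflect any actual agency system or interagency agreement.

```python
import hashlib

# A salt agreed between the parties (hypothetical; real projects are
# governed by interagency agreements and stricter key-management rules).
SALT = b"shared-project-salt"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a salted SHA-256 digest."""
    return hashlib.sha256(SALT + identifier.encode()).hexdigest()

# Toy extracts from two record systems (entirely fictitious data).
tax_records = [
    {"tin": "111-22-3333", "agi": 52000},
    {"tin": "444-55-6666", "agi": 71000},
]
health_records = [
    {"tin": "111-22-3333", "enrolled": True},
]

# Each party hashes its own identifiers before any data are shared.
tax_by_key = {pseudonymize(r["tin"]): r["agi"] for r in tax_records}
health_by_key = {pseudonymize(r["tin"]): r["enrolled"] for r in health_records}

# The integrated data set carries only pseudonymous keys and payload fields.
linked = [
    {"key": k, "agi": agi, "enrolled": health_by_key[k]}
    for k, agi in tax_by_key.items()
    if k in health_by_key
]
print(len(linked))  # one record matched across the two extracts
```

    Because both sides apply the same salted hash, matching rows join correctly while raw identifiers never leave their home agency; the sketch omits the real-world complications (clerical name variation, probabilistic matching, disclosure review) the abstract's policy questions turn on.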

    Developing, Validating, and Obtaining Stakeholder Buy-In for Criteria for Applying Social Science to Policymaking

    This report outlines (a) the need for objective, transparent, and usable criteria for judging the decision-readiness of published research evidence, and (b) the many important research challenges associated with producing such criteria and ensuring their uptake in the scientific community and beyond. It was produced by Focus Group 2 at TECSS.

    What Protects the Autonomy of the Federal Statistical Agencies? An Assessment of the Procedures in Place to Protect the Independence and Objectivity of Official U.S. Statistics

    We assess the professional autonomy of the 13 principal U.S. federal statistical agencies. We define six components or measures of such autonomy and evaluate each of the 13 principal statistical agencies according to each measure. Our assessment yields three main findings: (a) Challenges to the objectivity, credibility, and utility of federal statistics arise largely as a consequence of insufficient autonomy. (b) There is remarkable variation in autonomy protections, and a surprising lack of statutory protections for many agencies on many of the proposed measures. (c) Many existing autonomy rules and guidelines are weakened by unclear or unactionable language. We conclude that a lack of professional autonomy unduly exposes the principal federal statistical agencies to efforts to undermine the objectivity of their products, and that agencies cannot completely rebuff these efforts. Our main recommendations are to strengthen the role of the OMB Chief Statistician and to legislate new statutory autonomy protections, including explicit authorization for the principal federal statistical agencies that currently have no recognition in statute. We also recommend periodic assessments of the health of the federal statistical system, covering not only autonomy protections and resources but also how well agencies are satisfying data needs for the public good and using best methods to do so.

    Nonresonant central exclusive production of charged-hadron pairs in proton-proton collisions at √s = 13 TeV

    The central exclusive production of charged-hadron pairs in pp collisions at a centre-of-mass energy of 13 TeV is examined, based on data collected in a special high-β* run of the LHC. The nonresonant continuum processes are studied with the invariant mass of the centrally produced two-pion system in the resonance-free region, m(π⁺π⁻) < 0.7 GeV or m(π⁺π⁻) > 1.8 GeV. Differential cross sections as functions of the azimuthal angle between the surviving protons, the squared exchanged four-momenta, and m(π⁺π⁻) are measured in a wide region of scattered proton transverse momenta, between 0.2 and 0.8 GeV, and for pion rapidities |y| < 2. A rich structure of interactions related to double-pomeron exchange is observed. A parabolic minimum in the distribution of the two-proton azimuthal angle is observed for the first time. It can be interpreted as an effect of additional pomeron exchanges between the protons, arising from the interference between the bare and the rescattered amplitudes. After model tuning, various physical quantities are determined that are related to the pomeron cross section, proton-pomeron and meson-pomeron form factors, pomeron trajectory and intercept, and coefficients of diffractive eigenstates of the proton.
