548 research outputs found

    The People’s Business: The Case for Amending New York Civil Rights Law Section 50-a

    For more than forty years, New York Civil Rights Law section 50-a has harmed New Yorkers by shielding police officers’ “personnel records” from release, including in the aftermath of substantiated complaints of misconduct. With the aid of numerous New York Court of Appeals decisions, the statute progressively transformed from a relatively nuanced protection for testifying officers at trial into an outright bar to virtually all public disclosure. In fact, the New York Court of Appeals has even held that section 50-a supersedes New York’s Freedom of Information Law (FOIL), thereby prohibiting even redacted FOIL disclosures. By prioritizing secrecy over accountability, the section 50-a regime threatens the public’s essential oversight power over police and harms those who fall victim to misconduct. These longstanding concerns are exacerbated by the fact that police departments are typically ineffective self-regulators and often foster cultures that perpetuate misconduct. Despite this cascade of public policy failures, the New York State Legislature largely ignored calls for change, altering course only in June 2020 after the historic public demonstrations that followed George Floyd’s death. Although inconceivable weeks earlier, the legislature promptly repealed section 50-a, and the governor signed the bill into law days later. Although rendered moot immediately prior to publication, this note proposes an unadopted alternative solution to the section 50-a problem: amending the statute and reasserting FOIL’s preeminence over the disclosure of police records in New York. This remedy is derived from a careful analysis of the varying disclosure laws adopted across the United States, represented here by Connecticut, Massachusetts, and California. With the goal of promoting a high level of public access to police records, this note urges the adoption of a system much like Connecticut’s, which prioritizes FOIL disclosures while carefully weighing law enforcement’s legitimate privacy concerns. After decades of secrecy, this straightforward approach would ensure that police records, like all government documents, belong to the people.

    Estimating Markov Chain Mixing Times: Convergence Rate Towards Equilibrium of a Stochastic Process Traffic Assignment Model

    Network equilibrium models have been used extensively for decades. The rationale for using equilibrium as a predictor is essentially that (i) a unique and globally stable equilibrium point is guaranteed to exist and (ii) the transient period over which a system adapts to a change is sufficiently short that it can be neglected. However, the literature contains transport problems without a unique and stable equilibrium. Even when an equilibrium exists, it is not certain how long the system takes to reach it after an external shock to the transport system, such as an infrastructure improvement or damage from a disaster. The day-to-day adjustment process must be analysed to answer these questions. Among several modelling approaches, the Markov chain approach has been claimed to be the most general and flexible. It is also advantageous in that a unique stationary distribution is guaranteed under mild conditions, even when a unique and stable equilibrium does not exist. In the present paper, we first aim to develop a methodology for estimating the Markov chain mixing time (MCMT), a worst-case assessment of the time a Markov chain needs to converge to its stationary distribution. The main tools are coupling and aggregation, which enable us to analyse MCMTs in large-scale transport systems. Our second aim is to conduct a preliminary examination of the relationships between MCMTs and critical properties of the system, such as travellers’ sensitivity to differences in travel cost and the frequency with which travellers revise their choices. Through analytical and numerical analyses, we found key relationships in a few transport problems, including some without a unique and stable equilibrium. We also showed that the proposed method, combined with coupling and aggregation, can be applied to larger transport models.
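
    To make the target quantity concrete, the sketch below (an illustrative assumption, not the paper's coupling-and-aggregation estimator) computes a mixing time for a small finite Markov chain directly: the smallest number of steps after which every starting state's distribution is within total-variation distance 1/4 of the stationary distribution. The lazy random-walk example is made up.

    ```python
    import numpy as np

    def mixing_time(P, eps=0.25, t_max=10_000):
        """Smallest t with max_x TV(P^t(x, .), pi) <= eps for transition matrix P."""
        n = P.shape[0]
        # Stationary distribution: left eigenvector of P for eigenvalue 1.
        vals, vecs = np.linalg.eig(P.T)
        pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
        pi = pi / pi.sum()
        Pt = np.eye(n)
        for t in range(1, t_max + 1):
            Pt = Pt @ P
            # Worst-case total-variation distance over all starting states.
            tv = 0.5 * np.abs(Pt - pi).sum(axis=1).max()
            if tv <= eps:
                return t
        return None  # did not mix within t_max steps

    # Hypothetical example: lazy random walk on a ring of 20 states.
    n = 20
    P = np.zeros((n, n))
    for i in range(n):
        P[i, i] = 0.5
        P[i, (i - 1) % n] = 0.25
        P[i, (i + 1) % n] = 0.25
    print(mixing_time(P))
    ```

    For chains whose state space is too large to enumerate, as in realistic route- and choice-adjustment models, this direct computation is infeasible, which is exactly where the coupling and aggregation arguments mentioned above come in.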

    Community Member Perspectives from Transgender Women and Men Who Have Sex with Men on Pre-Exposure Prophylaxis as an HIV Prevention Strategy: Implications for Implementation

    Background: An international randomized clinical trial (RCT) on pre-exposure prophylaxis (PrEP) as a human immunodeficiency virus (HIV) prevention intervention found that, taken on a daily basis, PrEP was safe and effective among men who have sex with men (MSM) and male-to-female transgender women. Within the context of the HIV epidemic in the United States (US), MSM and transgender women are the most appropriate groups to target for PrEP implementation at the population level; however, their perspectives on evidence-based biomedical research and the results of this large trial remain virtually unknown. In this study, we examined the acceptability of individual daily use of PrEP and assessed potential barriers to community uptake. Methods: We conducted semi-structured interviews with an ethnoracially diverse sample of thirty HIV-negative and unknown-status MSM (n = 24) and transgender women (n = 6) in three California metropolitan areas. Given the burden of disease among ethnoracial minorities in the US, we purposefully oversampled these groups. Thematic coding and analysis of the data were conducted using an approach rooted in grounded theory. Results: While participants expressed general interest in PrEP availability, the results demonstrate a lack of community awareness of PrEP and confusion about it; reservations about PrEP utilization, even when participants were informed of the efficacious RCT results; and concerns regarding equity and the manner in which a PrEP intervention could be packaged and marketed in their communities. Conclusions: To effectively reduce HIV health disparities at the population level, PrEP implementation must take into account the uptake concerns of the groups who would actually access and use this biomedical intervention as a prevention strategy. Recommendations addressing these concerns are provided.

    Characterization of edge damage induced on REBCO superconducting tape by mechanical slitting

    Rare-earth barium-copper-oxide (REBCO) superconductors are high-field superconductors fabricated in a tape geometry that can be utilized in magnet applications well in excess of 20 T. Due to the multilayer architecture of the tape, delamination is one cause of mechanical failure in REBCO tapes. During a mechanical slitting step in the manufacturing process, edge cracks can be introduced into the tape. These cracks are thought to be potential initiation sites for crack propagation in the tapes when subjected to stresses in the fabrication and operation of magnet systems. We sought to understand which layers were the mechanically weakest by locating the crack initiation layer and identifying the geometrical conditions of the slitter that promoted or suppressed crack formation. The described cracking was investigated by selectively etching and characterizing each layer with scanning electron microscopy, laser confocal microscopy, and digital image analysis. Our analysis showed that the average crack lengths in the REBCO, LaMnO3 (LMO), and Al2O3 layers were 34 μm, 28 μm, and 15 μm, respectively. The total number of cracks measured in 30 mm of wire length was between 3000 and 5700 depending on the layer, and their crack densities were 102 cracks mm⁻¹ for REBCO, 108 cracks mm⁻¹ for LMO, and 183 cracks mm⁻¹ for Al2O3. These results indicated that there are separate crack initiation mechanisms for the REBCO and the LMO layers, as detailed in the paper. With a better understanding of the crack growth behavior exhibited by REBCO tapes, the fabrication process can be improved to provide a more mechanically stable and cost-effective superconductor.
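
    As a quick consistency check on the figures above (simple arithmetic, not code from the study), multiplying each reported crack density by the 30 mm inspected length recovers per-layer crack counts in the quoted 3000–5700 range:

    ```python
    # Reported per-layer crack densities (cracks per mm) and inspected length.
    densities_per_mm = {"REBCO": 102, "LMO": 108, "Al2O3": 183}
    length_mm = 30.0
    for layer, d in densities_per_mm.items():
        print(f"{layer}: ~{d * length_mm:.0f} cracks in {length_mm:.0f} mm")
    # -> roughly 3060, 3240, and 5490 cracks, within the quoted 3000-5700 range.
    ```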

    Gridded and direct Epoch of Reionisation bispectrum estimates using the Murchison Widefield Array

    We apply two methods to estimate the 21 cm bispectrum from data taken within the Epoch of Reionisation (EoR) project of the Murchison Widefield Array (MWA). Using data acquired with the Phase II compact array allows a direct bispectrum estimate to be undertaken on the multiple redundantly-spaced triangles of antenna tiles, as well as an estimate based on data gridded to the uv-plane. The direct and gridded bispectrum estimators are applied to 21 hours of high-band (167–197 MHz; z = 6.2–7.5) data from the 2016 and 2017 observing seasons. Analytic predictions for the bispectrum bias and variance for point source foregrounds are derived. We compare the output of these approaches, the foreground contribution to the signal, and future prospects for measuring the bispectra with redundant and non-redundant arrays. We find that some triangle configurations yield bispectrum estimates that are consistent with the expected noise level after 10 hours, while equilateral configurations are strongly foreground-dominated. Careful choice of triangle configurations may be made to reduce the foreground bias that hinders power spectrum estimators, and the 21 cm bispectrum may be accessible in less time than the 21 cm power spectrum for some wave modes, with detections in hundreds of hours.
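
    For readers unfamiliar with the estimator, the core of the direct approach is the triple product of visibilities on a closed triangle of baselines, averaged over redundant triangles and time. The sketch below is a bare-bones illustration with made-up data, not the MWA analysis pipeline:

    ```python
    import numpy as np

    def direct_bispectrum(v12, v23, v31):
        """Average the closed-triangle triple product V(b1)V(b2)V(b3) over
        samples (redundant triangles and/or times) stacked along axis 0."""
        return np.mean(v12 * v23 * v31, axis=0)

    # Toy usage: a common signal plus independent noise on each baseline.
    rng = np.random.default_rng(0)
    signal = 0.5 * np.exp(2j * np.pi * rng.random())
    def noise(n=100_000):
        return (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
    v12, v23, v31 = (signal + noise() for _ in range(3))
    print(direct_bispectrum(v12, v23, v31))  # approaches signal**3 as n grows
    ```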

    The Murchison Widefield Array Transients Survey (MWATS). A search for low frequency variability in a bright Southern hemisphere sample

    We report on a search for low-frequency radio variability in 944 bright (> 4 Jy at 154 MHz) unresolved, extragalactic radio sources monitored monthly for several years with the Murchison Widefield Array. In the majority of sources we find very low levels of variability with typical modulation indices < 5%. We detect 15 candidate low frequency variables that show significant long-term variability (> 2.8 years) with time-averaged modulation indices M = 3.1–7.1%. With 7/15 of these variable sources having peaked spectral energy distributions, and only 5.7% of the overall sample having peaked spectra, we find an increase in the prevalence of variability in this spectral class. We conclude that the variability seen in this survey is most probably a consequence of refractive interstellar scintillation and that these objects must have the majority of their flux density contained within angular diameters less than 50 milli-arcsec (which we support with multi-wavelength data). At 154 MHz we demonstrate that interstellar scintillation time-scales become long (~decades) and have low modulation indices, whilst synchrotron-driven variability can only produce dynamic changes on time-scales of hundreds of years, with flux density changes less than one milli-jansky (without relativistic boosting). From this work we infer that the low frequency extragalactic southern sky, as seen by SKA-Low, will be non-variable on time-scales shorter than one year.
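
    For context, the modulation index quoted above is conventionally the ratio of the standard deviation to the mean of a source's monitored flux densities (a standard definition, not the survey's actual pipeline); the monthly light curve below is invented:

    ```python
    import numpy as np

    def modulation_index(flux_jy):
        """Modulation index M = 100 * std / mean of a light curve, in per cent."""
        s = np.asarray(flux_jy, dtype=float)
        return 100.0 * s.std(ddof=1) / s.mean()

    # Invented example: a ~5 Jy source observed monthly for three years.
    rng = np.random.default_rng(1)
    light_curve = 5.0 + rng.normal(scale=0.2, size=36)
    print(f"M = {modulation_index(light_curve):.1f}%")  # ~4%, i.e. below the 5% typical level
    ```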

    Comparison of selected foreign plans and practices for spent fuel and high-level waste management

    This report describes the major parameters for management of spent nuclear fuel and high-level radioactive wastes in selected foreign countries as of December 1989 and compares them with those in the United States. The foreign countries included in this study are Belgium, Canada, France, the Federal Republic of Germany, Japan, Sweden, Switzerland, and the United Kingdom. All the countries are planning for disposal of spent fuel and/or high-level wastes in deep geologic repositories. Most countries (except Canada and Sweden) plan to reprocess their spent fuel and vitrify the resultant high-level liquid wastes; in comparison, the US plans direct disposal of spent fuel. The US is planning to use a container for spent fuel as the primary engineered barrier. The US has the most developed repository concept and has one of the earliest scheduled repository startup dates. The repository environment presently being considered in the US is unique, being located in tuff above the water table. The US also has the most prescriptive regulations and performance requirements for the repository system and its components. 135 refs., 8 tabs.

    WSClean: an implementation of a fast, generic wide-field imager for radio astronomy

    Astronomical widefield imaging of interferometric radio data is computationally expensive, especially for the large data volumes created by modern non-coplanar many-element arrays. We present a new widefield interferometric imager that uses the w-stacking algorithm and can make use of the w-snapshot algorithm. The performance dependencies of CASA's w-projection and our new imager are analysed, and analytical functions are derived that describe the required computing cost for both imagers. On data from the Murchison Widefield Array, we find our new method to be an order of magnitude faster than w-projection, as well as being capable of full-sky imaging at full resolution and with correct polarisation correction. We predict the computing costs for several other arrays and estimate that our imager is a factor of 2–12 faster, depending on the array configuration. We estimate the computing cost for imaging the low-frequency Square Kilometre Array observations to be 60 PetaFLOPS with current techniques. We find that combining w-stacking with the w-snapshot algorithm does not significantly improve computing requirements over pure w-stacking. The source code of our new imager is publicly released.
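
    To illustrate the w-stacking idea in miniature (a heavily simplified sketch under assumed sign and gridding conventions, not WSClean's optimised implementation): visibilities are partitioned into discrete w-planes, each plane is gridded and inverse-Fourier-transformed separately, a per-plane phase factor exp(2*pi*i*w*(n-1)) is applied in the image domain, and the planes are summed.

    ```python
    import numpy as np

    def w_stack_dirty_image(u, v, w, vis, npix, cell_rad, n_wplanes=8):
        """Toy w-stacking: nearest-cell gridding, no weighting, taper, or clean."""
        # Image-plane direction cosines and the n = sqrt(1 - l^2 - m^2) term.
        l = (np.arange(npix) - npix // 2) * cell_rad
        ll, mm = np.meshgrid(l, l)
        n_term = np.sqrt(np.maximum(1.0 - ll**2 - mm**2, 0.0))

        image = np.zeros((npix, npix))
        w_edges = np.linspace(w.min(), w.max() + 1e-9, n_wplanes + 1)
        for k in range(n_wplanes):
            sel = (w >= w_edges[k]) & (w < w_edges[k + 1])
            if not sel.any():
                continue
            # Grid this w-plane's visibilities (uv cell size = 1 / field of view).
            grid = np.zeros((npix, npix), dtype=complex)
            iu = np.round(u[sel] * cell_rad * npix).astype(int) % npix
            iv = np.round(v[sel] * cell_rad * npix).astype(int) % npix
            np.add.at(grid, (iv, iu), vis[sel])
            dirty_k = np.fft.fftshift(np.fft.ifft2(grid))
            # Undo the w term at this plane's central w value.
            w_mid = 0.5 * (w_edges[k] + w_edges[k + 1])
            image += np.real(dirty_k * np.exp(2j * np.pi * w_mid * (n_term - 1.0)))
        return image
    ```

    The basic cost trade-off is visible even in this sketch: more w-planes mean more FFTs per imaging pass, but a smaller residual w error within each plane.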
    • …