
    Child Abuse Reporting: Rethinking Child Protection

The general public has been bewildered by the magnitude of sexual abuse cases and the widespread failure of pillars of the community to notify appropriate authorities. The crime of sexually abusing children is punishable in all jurisdictions, and this article examines the duty to report suspected cases borne by individuals in positions of trust over young people, such as in the church or university sports. The federal Child Abuse Prevention and Treatment Act (CAPTA) defines child maltreatment as an act, or failure to act, on the part of a parent or caregiver that results in death, serious physical or emotional harm, sexual abuse, or exploitation, and it establishes minimum federal standards. Each state has its own definitions of maltreatment, and every state identifies persons who are required to report child abuse. State law is therefore highly variable in defining who has a mandatory duty to report, and clergy and other individuals in close supervision of children (e.g., athletic coaches, scout leaders, volunteers in religious programs, and university officials) may or may not hold such a duty. The article outlines the strong moral reasons why the law should require all adults in close supervision of children to report any individual whom they have good reason to believe has abused a child, and it describes how to ensure prompt reporting of abuse while protecting respected individuals from false accusation.

    High Stakes Institutional Translation: Establishing North America’s First Government-Sanctioned Supervised Injection Site

Around the world, potentially effective responses to serious social problems are left untried because those responses are politically, culturally, or morally problematic in affected communities. I describe the process through which communities import such practices as "high-stakes institutional translation". Drawing on a study of North America's first supervised injection site for users of illegal drugs, I propose a process model of high-stakes institutional translation: a triggering period of public expressions of intense emotion, followed by waves of translation in which the controversial practice is constructed in discursive and material terms many times over.

    Delayed Sampling and Automatic Rao-Blackwellization of Probabilistic Programs

We introduce a dynamic mechanism for the solution of analytically tractable substructure in probabilistic programs, using conjugate priors and affine transformations to reduce variance in Monte Carlo estimators. For inference with Sequential Monte Carlo, this automatically yields improvements such as locally optimal proposals and Rao-Blackwellization. The mechanism maintains a directed graph alongside the running program that evolves dynamically as operations are triggered upon it. Nodes of the graph represent random variables; edges represent the analytically tractable relationships between them. Random variables remain in the graph for as long as possible, to be sampled only when they are used by the program in a way that cannot be resolved analytically. In the meantime, they are conditioned on as many observations as possible. We demonstrate the mechanism with a few pedagogical examples, as well as a linear-nonlinear state-space model with simulated data, and an epidemiological model with real data from a dengue outbreak in Micronesia. In all cases one or more variables are automatically marginalized out to significantly reduce variance in estimates of the marginal likelihood, in the final case facilitating a random-weight or pseudo-marginal-type importance sampler for parameter estimation. We have implemented the approach in Anglican and in a new probabilistic programming language called Birch. Comment: 13 pages, 4 figures.
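The delayed-sampling idea can be illustrated at its smallest scale with a single conjugate pair. The sketch below (a loose Python illustration, not the paper's Anglican or Birch implementation; all names are hypothetical) holds a Beta prior in symbolic form, absorbs Bernoulli observations by conjugate updates so that the exact marginal likelihood is returned, and only draws a concrete sample if the program later demands a value:

```python
import random

class DelayedBeta:
    """A Beta random variable held in marginalized (symbolic) form.

    Bernoulli observations are absorbed analytically via conjugacy; the
    variable is sampled ("realized") only if a concrete value is needed.
    """

    def __init__(self, alpha, beta):
        self.alpha, self.beta = alpha, beta
        self.realized = None  # stays None while the variable is marginalized

    def observe_bernoulli(self, x):
        """Condition on an observation x in {0, 1} and return its exact
        marginal likelihood (the Rao-Blackwellized weight)."""
        if self.realized is not None:
            return self.realized if x == 1 else 1.0 - self.realized
        mean = self.alpha / (self.alpha + self.beta)
        lik = mean if x == 1 else 1.0 - mean
        # Conjugate posterior update in place of sampling.
        if x == 1:
            self.alpha += 1.0
        else:
            self.beta += 1.0
        return lik

    def value(self):
        """Force a sample only when the host program needs the value."""
        if self.realized is None:
            self.realized = random.betavariate(self.alpha, self.beta)
        return self.realized

p = DelayedBeta(1.0, 1.0)
weight = 1.0
for x in [1, 1, 0, 1]:
    weight *= p.observe_bernoulli(x)   # p is never sampled here
print("exact marginal likelihood:", weight)  # 1/20 for this sequence
print("realized only on demand:", p.value())
```

In an importance sampler this replaces a noisy likelihood estimate with its exact expectation over the marginalized variable, which is the variance reduction the paper automates across whole graphs of conjugate and affine relationships.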

    Changing District Culture and Capacity: The Impact of the Merck Institute for Science Education Partnership

In 1993, Merck & Co., Inc. began an endeavor to make a significant and visible commitment to improving science education by creating the Merck Institute for Science Education (MISE), and supported the new venture with a 10-year, $20-million financial commitment. From its inception, MISE had two goals: to raise the interest, participation, and performance of public school students in science, and to demonstrate to other businesses that direct, focused involvement would hasten the improvement of science teaching and learning in the public schools. MISE initiated its work by forming partnerships with four public school districts (Linden, Rahway, and Readington Township in New Jersey, and North Penn in Pennsylvania) where Merck has major facilities. CPRE was contracted by MISE in 1993 to document the implementation of the initiative and assess its impact on districts, schools, classrooms, and students. Throughout the evaluation, CPRE conducted interviews with teachers, instructional leaders, and district personnel; surveyed teachers; developed case studies of schools; and examined student achievement data in order to provide feedback on the progress of the MISE Partnership.

    P-Code-Enhanced Encryption-Mode Processing of GPS Signals

A method of processing signals in a Global Positioning System (GPS) receiver has been invented to enable the receiver to recover some of the information that is otherwise lost when GPS signals are encrypted at the transmitters. The need for this method arises because, at the option of the military, the precision GPS code (P-code) is sometimes encrypted by a secret binary code, denoted the A-code. Authorized users can recover the full signal with knowledge of the A-code. Even in the absence of that knowledge, however, one can track the encrypted signal by using an estimate of the A-code. The present invention is a method of making and using such an estimate. In comparison with prior such methods, this method makes it possible to recover more of the lost information and to obtain greater accuracy.
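The abstract does not spell out the estimator, but the general semi-codeless idea it alludes to can be sketched numerically: because each secret encryption bit spans many P-code chips, correlating the received signal against the known P-code replica over one encryption-bit interval and taking the sign yields an estimate of that bit. The following Python sketch uses synthetic parameters throughout; the chip counts and noise levels are illustrative, not the actual GPS signal structure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: the encrypting code flips sign at a much
# lower rate than the P-code chips it multiplies.
CHIPS_PER_A_BIT = 20   # P-code chips spanned by one encryption bit (made up)
N_A_BITS = 500
NOISE_SIGMA = 2.0

# Known P-code replica (+/-1 chips) and the secret encryption bits.
p_code = rng.choice([-1.0, 1.0], size=N_A_BITS * CHIPS_PER_A_BIT)
a_bits = rng.choice([-1.0, 1.0], size=N_A_BITS)

# Received signal: P-code modulated by the unknown A-code, plus noise.
received = p_code * np.repeat(a_bits, CHIPS_PER_A_BIT)
received += NOISE_SIGMA * rng.standard_normal(received.size)

# Estimate: wipe off the known P-code, accumulate over each
# encryption-bit interval, and take the sign of the correlation.
corr = (received * p_code).reshape(N_A_BITS, CHIPS_PER_A_BIT).sum(axis=1)
a_est = np.sign(corr)

print("A-bit estimate accuracy:", np.mean(a_est == a_bits))
```

Accumulating over the full interval before the sign decision recovers signal power that a chip-by-chip decision would lose to noise, which is in the spirit of the improvement the abstract claims over prior methods.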

    The Scientific Reach of Multi-Ton Scale Dark Matter Direct Detection Experiments

The next generation of large-scale WIMP direct detection experiments has the potential to go beyond the discovery phase and reveal detailed information about both the particle physics and astrophysics of dark matter. We report here on early results arising from the development of a detailed numerical code modeling the proposed DARWIN detector, involving both liquid argon and xenon targets. We incorporate realistic detector physics, particle physics, and astrophysical uncertainties and demonstrate to what extent two targets with similar sensitivities can remove various degeneracies and allow a determination of dark matter cross sections and masses while also probing rough aspects of the dark matter phase space distribution. We find that, even assuming dominance of spin-independent scattering, multi-ton scale experiments still have degeneracies that depend sensitively on the dark matter mass, and on the possibility of isospin violation and inelasticity in interactions. We find that these experiments are best able to discriminate dark matter properties for dark matter masses less than around 200 GeV. In addition, and somewhat surprisingly, the use of two targets gives only a small improvement (aside from the advantage of different systematics associated with any claimed signal) in the ability to pin down dark matter parameters when compared with one target of larger exposure. Comment: 23 pages; updated to match the PRD version.
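For context on why the choice of target matters, the shape of the standard spin-independent recoil spectrum can be sketched directly. The snippet below is a simplified textbook-level calculation, not the detailed DARWIN detector code described above: it assumes benchmark halo parameters, a Helm form factor, and no escape-velocity cutoff, and shows how xenon and argon targets weight the same WIMP velocity distribution differently, which is the origin of the complementarity the paper studies:

```python
import numpy as np
from scipy.special import erf

V0, VE = 220.0, 232.0   # km/s: halo velocity dispersion and Earth's speed
C = 299792.458          # km/s
AMU = 0.9315            # GeV per atomic mass unit

def eta(vmin):
    """Mean inverse speed <1/v> for a simple (non-truncated) Maxwellian halo."""
    return (erf((vmin + VE) / V0) - erf((vmin - VE) / V0)) / (2.0 * VE)

def helm_f2(e_kev, a):
    """Helm form factor squared at recoil energy e_kev for mass number a."""
    m_n = a * AMU                          # nucleus mass, GeV
    q = np.sqrt(2e-6 * m_n * e_kev)        # momentum transfer, GeV
    q_fm = q / 0.1973                      # GeV -> fm^-1 via hbar*c
    r_n, s = 1.14 * a ** (1.0 / 3.0), 0.9  # nuclear radius and skin, fm
    x = np.maximum(q_fm * r_n, 1e-9)
    j1 = np.sin(x) / x**2 - np.cos(x) / x  # spherical Bessel function j1
    return (3.0 * j1 / x) ** 2 * np.exp(-(q_fm * s) ** 2)

def spectrum_shape(e_kev, m_chi, a):
    """dR/dE shape (arbitrary normalization): A^2 F^2(E) eta(v_min)."""
    m_n = a * AMU
    mu = m_chi * m_n / (m_chi + m_n)       # WIMP-nucleus reduced mass, GeV
    vmin = C * np.sqrt(0.5e-6 * m_n * e_kev) / mu
    return a**2 * helm_f2(e_kev, a) * eta(vmin)

e = np.array([5.0, 20.0, 50.0, 100.0])     # recoil energies, keV
for a, name in [(131, "Xe"), (40, "Ar")]:
    print(name, spectrum_shape(e, m_chi=100.0, a=a))
```

The stronger form-factor suppression of the heavy xenon nucleus makes its spectrum fall faster with recoil energy than argon's, so the two targets probe different slices of the mass, cross-section, and velocity-distribution parameter space.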

    Searching for Dark Matter at the LHC with a Mono-Z

We investigate a mono-Z process as a potential dark matter search strategy at the LHC. In this channel a single Z boson recoils against missing transverse momentum, attributed to dark matter particles, $\chi$, which escape the detector. This search strategy is related, and complementary, to monojet and monophoton searches. For illustrative purposes we consider the process $q\bar{q} \to \chi\chi Z$ in a toy dark matter model, where the Z boson is emitted from either the initial-state quarks or the internal propagator. Among the signatures of this process will be a pair of high-$p_T$ muons that reconstruct to the invariant mass of the Z, together with large missing transverse energy. Because the signal is purely electroweak, QCD and other Standard Model backgrounds are relatively easily removed with modest selection cuts. We compare the signal to Standard Model backgrounds and demonstrate that, even for conservative cuts, there exist regions of parameter space where the signal may be clearly visible above background in future LHC data, allowing either new discovery potential or the possibility of supplementing information about the dark sector beyond that available from other observable channels. Comment: 11 pages, 13 figures.
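The event selection described here is simple enough to sketch. The Python fragment below uses illustrative cut values (the paper's actual selection may differ) and keeps events with a muon pair reconstructing near the Z mass plus large missing transverse energy, which is the basic mono-Z signature:

```python
import math

M_Z = 91.19  # GeV

def inv_mass(p1, p2):
    """Invariant mass of two four-vectors (E, px, py, pz), in GeV."""
    e, px, py, pz = (p1[i] + p2[i] for i in range(4))
    return math.sqrt(max(e**2 - px**2 - py**2 - pz**2, 0.0))

def passes_mono_z(mu_plus, mu_minus, met, m_window=10.0, met_cut=100.0):
    """Z-window cut on the dimuon mass plus a missing-energy cut (GeV)."""
    return abs(inv_mass(mu_plus, mu_minus) - M_Z) < m_window and met > met_cut

# Toy event: a muon pair that reconstructs near m_Z (~92 GeV here),
# recoiling against 150 GeV of missing transverse energy.
mu1 = (50.0, 40.0, 30.0, 0.0)
mu2 = (45.0, -35.0, -20.0, 20.0)
print(passes_mono_z(mu1, mu2, met=150.0))  # True
```

Tightening the mass window and the missing-energy threshold trades signal efficiency against residual Drell-Yan and diboson backgrounds; the modest selection cuts mentioned in the abstract operate on exactly this kind of trade-off.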