    A mixed methods study of special education administrators in Tennessee on transition services for high school students with disabilities

    Students with disabilities face many challenges as they transition from high school to postsecondary education, enter the workforce, and begin independent living. The Individuals with Disabilities Education Act of 1990 (IDEA) initially addressed these challenges by mandating that transition planning be included as part of the Individualized Education Program (IEP) for a student with disabilities by the age of 16, or younger if approved by a state education department. The state of Tennessee requires transition planning to begin by the age of 14. The federal IDEA mandate does not specify the type of transition program(s) school districts must provide, nor does it require that a specific amount of federal money be spent on transition programs. This lack of federal guidance has proven problematic: transition programs are inconsistent across school districts and across states. Despite IDEA's transition requirements, students with disabilities have typically experienced lower graduation rates and higher drop-out rates than their non-disabled peers, and many graduate from high school unprepared for the world ahead of them. This mixed-methods study analyzed the perceptions of special education administrators and the provision of transition programs in 120 Tennessee K-12 school districts. Participants responded to survey questions about current transition programs and future plans for transition programs in their local school districts. Two special education administrators were interviewed to obtain an in-depth view of transition programs in their respective districts. Additionally, this study analyzed the effects of work-based learning opportunities and the hiring of transition coordinators on the graduation rates of students with disabilities, drop-out rates, and ACT scores of K-12 school districts in Tennessee. The major findings were that students who attend schools in more populated districts typically have access to more types of transition programs than those in smaller districts, and that offering work-based learning programs and employing transition coordinators had no measurable effect on ACT scores, drop-out rates, or attendance rates. Special education administrators understand the value and importance of transition services; however, budget and time constraints do not allow them to address transition services as fully as they would like. These findings may inform the Tennessee State Department of Education, Division of Special Populations, the Tennessee State Department of Education Transition Planning Team, and local school district administrators as the state continues to promote transition program improvements.

    Drosophila Muller F Elements Maintain a Distinct Set of Genomic Properties Over 40 Million Years of Evolution

    The Muller F element (4.2 Mb, ~80 protein-coding genes) is an unusual autosome of Drosophila melanogaster; it is mostly heterochromatic with a low recombination rate. To investigate how these properties impact the evolution of repeats and genes, we manually improved the sequence and annotated the genes on the D. erecta, D. mojavensis, and D. grimshawi F elements and euchromatic domains from the Muller D element. We find that F elements have greater transposon density (25–50%) than euchromatic reference regions (3–11%). Among the F elements, D. grimshawi has the lowest transposon density (particularly DINE-1: 2% vs. 11–27%). F element genes have larger coding spans, more coding exons, larger introns, and lower codon bias. Comparison of the Effective Number of Codons with the Codon Adaptation Index shows that, in contrast to the other species, codon bias in D. grimshawi F element genes can be attributed primarily to selection instead of mutational biases, suggesting that density and types of transposons affect the degree of local heterochromatin formation. F element genes have lower estimated DNA melting temperatures than D element genes, potentially facilitating transcription through heterochromatin. Most F element genes (~90%) have remained on that element, but the F element has smaller syntenic blocks than genome averages (3.4–3.6 vs. 8.4–8.8 genes per block), indicating greater rates of inversion despite lower rates of recombination. Overall, the F element has maintained characteristics that are distinct from other autosomes in the Drosophila lineage, illuminating the constraints imposed by a heterochromatic milieu.
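
    As a rough illustration of the transposon-density comparison above (25–50% on the F elements vs. 3–11% in the euchromatic reference regions), the fraction of a region covered by transposons can be computed by merging overlapping repeat-annotation intervals and dividing the covered length by the region length. The sketch below is a minimal, hypothetical Python example and is not the annotation pipeline used in the study.

    # Minimal sketch: fraction of a genomic region covered by transposon
    # annotations, given half-open (start, end) intervals in base pairs.
    # Illustrative only; not the annotation pipeline used in the study.
    def transposon_density(intervals, region_length):
        """Return the fraction of the region covered by the intervals."""
        merged = []
        for start, end in sorted(intervals):
            if merged and start <= merged[-1][1]:
                # Overlapping or adjacent annotation: extend the previous one.
                merged[-1][1] = max(merged[-1][1], end)
            else:
                merged.append([start, end])
        covered = sum(end - start for start, end in merged)
        return covered / region_length

    # Hypothetical example: three repeat annotations in a 10 kb window.
    repeats = [(100, 1200), (900, 2500), (6000, 7000)]
    print(f"density = {transposon_density(repeats, 10_000):.1%}")  # 34.0%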

    DUNE Offline Computing Conceptual Design Report

    This document describes Offline Software and Computing for the Deep Underground Neutrino Experiment (DUNE), in particular the conceptual design of the offline computing needed to accomplish its physics goals. Our emphasis in this document is the development of the computing infrastructure needed to acquire, catalog, reconstruct, simulate, and analyze the data from the DUNE experiment and its prototypes. In this effort, we concentrate on developing the tools and systems that facilitate the development and deployment of advanced algorithms. Rather than prescribing particular algorithms, our goal is to provide resources that are flexible and accessible enough to support creative software solutions as HEP computing evolves and to provide computing that achieves the physics goals of the DUNE experiment.

    Reconstruction of interactions in the ProtoDUNE-SP detector with Pandora

    The Pandora Software Development Kit and algorithm libraries provide pattern-recognition logic essential to the reconstruction of particle interactions in liquid argon time projection chamber detectors. Pandora is the primary event reconstruction software used at ProtoDUNE-SP, a prototype for the Deep Underground Neutrino Experiment far detector. ProtoDUNE-SP, located at CERN, is exposed to a charged-particle test beam. This paper gives an overview of the Pandora reconstruction algorithms and how they have been tailored for use at ProtoDUNE-SP. In complex events with numerous cosmic-ray and beam background particles, the simulated reconstruction and identification efficiency for triggered test-beam particles is above 80% for the majority of particle type and beam momentum combinations. Specifically, simulated 1 GeV/c charged pions and protons are correctly reconstructed and identified with efficiencies of 86.1 ± 0.6% and 84.1 ± 0.6%, respectively. The efficiencies measured for test-beam data are shown to be within 5% of those predicted by the simulation.
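
    The quoted values (e.g. 86.1 ± 0.6%) can be read as counting efficiencies: the fraction of triggered test-beam particles that are correctly reconstructed and identified, with a statistical uncertainty of roughly sqrt(eps * (1 - eps) / N) under a simple binomial treatment. The sketch below uses made-up counts for illustration; the paper's own uncertainty treatment may differ.

    # Illustrative only: efficiency with a simple binomial
    # (normal-approximation) uncertainty. The counts are hypothetical.
    import math

    def efficiency(n_selected, n_total):
        eps = n_selected / n_total
        sigma = math.sqrt(eps * (1.0 - eps) / n_total)
        return eps, sigma

    # Hypothetical sample: 2970 of 3450 simulated 1 GeV/c pions are
    # correctly reconstructed and identified.
    eps, sigma = efficiency(2970, 3450)
    print(f"efficiency = {100 * eps:.1f} +/- {100 * sigma:.1f} %")  # 86.1 +/- 0.6 %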

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version. The simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
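
    As an illustration of the Numba approach described above, a per-channel computation can be written as ordinary Python and compiled into a CUDA kernel with numba.cuda. The toy kernel below deposits Gaussian-weighted charge from energy deposits onto a 1D array of pixels; it is a hypothetical sketch of the programming pattern, not the actual DUNE Near Detector simulator, and it requires an NVIDIA GPU with a working CUDA installation.

    # Toy example of the Numba CUDA pattern: one thread per energy deposit,
    # accumulating Gaussian-weighted charge onto a pixel array. Hypothetical
    # sketch only; not the DUNE Near Detector simulation code.
    import math
    import numpy as np
    from numba import cuda

    @cuda.jit
    def induce_charge(dep_pos, dep_q, pixel_pitch, sigma, pixel_current):
        i = cuda.grid(1)  # global thread index = deposit index
        if i < dep_pos.size:
            for p in range(pixel_current.size):
                center = (p + 0.5) * pixel_pitch
                weight = math.exp(-0.5 * ((dep_pos[i] - center) / sigma) ** 2)
                # Many deposits can contribute to the same pixel: atomic add.
                cuda.atomic.add(pixel_current, p, dep_q[i] * weight)

    # Hypothetical inputs: 100,000 deposits read out by 1,000 pixels.
    n_deposits, n_pixels = 100_000, 1_000
    pitch, spread = np.float32(0.4), np.float32(0.2)  # cm, arbitrary toy values
    pos = cuda.to_device(np.random.uniform(0, n_pixels * 0.4, n_deposits).astype(np.float32))
    q = cuda.to_device(np.ones(n_deposits, dtype=np.float32))
    current = cuda.to_device(np.zeros(n_pixels, dtype=np.float32))

    threads = 128
    blocks = (n_deposits + threads - 1) // threads
    induce_charge[blocks, threads](pos, q, pitch, spread, current)
    print(current.copy_to_host()[:5])  # per-pixel accumulated charge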