
    Sterile neutrino search with KATRIN - modeling and design-criteria of a novel detector system

    A fundamental open question in particle physics is the nature of the invisible mass in our universe: Dark Matter. A promising candidate that could explain these observations is the sterile neutrino with a mass of several $\mathrm{keV}/c^2$. While sterile neutrinos are presumed not to interact via the weak force, they, due to their mass, still partake in neutrino oscillation. Consequently, it is experimentally possible to investigate their imprint in beta-decay experiments such as the Karlsruhe tritium neutrino experiment (KATRIN). A dedicated search for sterile neutrinos, however, entails a steep increase in the electron rate and thus requires the development of a new detector system, the TRISTAN detector. In addition, as the imprint of sterile neutrinos is presumably $<10^{-7}$, systematic uncertainties have to be understood and modeled with high precision. In this thesis, systematics prevalent at the detector and spectrometer section of KATRIN are discussed and their impact on the sterile neutrino sensitivity illuminated. The derived model is compared with data of the current KATRIN detector and with characterization measurements of the first TRISTAN prototype detectors, seven-pixel silicon drift detectors. It is shown that the final TRISTAN detector requires a sophisticated redesign of the KATRIN detector section. Moreover, the combined impact of the back-scattering and electron charge-sharing systematics leads to an optimal detector magnetic field of $B_\mathrm{det}=0.7\dots0.8\,\mathrm{T}$, which translates to a pixel radius of $r_\mathrm{px}=1.5\dots1.6\,\mathrm{mm}$. The sensitivity analysis discusses individual effects as well as the combined impact of systematic uncertainties. It is demonstrated that the individual effects can be largely mitigated by shifting the tritium beta-decay energy spectrum above the beta-decay endpoint.
    In contrast, their combined impact on the sensitivity leads to an overall degradation, and only mixing amplitudes of $\sin^2\theta_4<3\cdot10^{-6}$ would be reachable, even in an optimized case with a very low and homogeneous detector dead layer of $z_\mathrm{dl}=20\pm1\,\mathrm{nm}$. Assessing sterile neutrino mixing amplitudes of $\sin^2\theta_4<10^{-7}$ thus requires disentangling the systematic effects. In a future measurement this could, for example, be achieved by vetoing detector events with large signal rise times and small inter-event times.
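The kink signature such a search looks for can be illustrated with a toy spectrum. This is a minimal sketch using a simplified phase-space shape for the beta spectrum; the sterile mass, the (deliberately exaggerated) mixing amplitude, and the normalization are illustrative choices, not KATRIN values:

```python
import numpy as np

E0 = 18.6        # tritium beta endpoint [keV]
M4 = 10.0        # assumed sterile-neutrino mass [keV/c^2] (illustrative)
SIN2_T4 = 1e-2   # mixing amplitude, exaggerated so the kink is visible

def phase_space(E, m_nu):
    """Simplified beta-spectrum shape ~ (E0-E) * sqrt((E0-E)^2 - m^2), zero above E0 - m."""
    eps = E0 - E
    inside = np.clip(eps**2 - m_nu**2, 0.0, None)
    return np.where(eps > m_nu, eps * np.sqrt(inside), 0.0)

def spectrum(E):
    """Active + sterile superposition; the sterile branch imprints a kink at E = E0 - M4."""
    return (1 - SIN2_T4) * phase_space(E, 0.0) + SIN2_T4 * phase_space(E, M4)

E = np.linspace(0.0, E0, 1000)
dN = spectrum(E)  # above E0 - M4 only the active branch contributes
```

Above the kink energy `E0 - M4` the sterile branch vanishes, so the distortion relative to a pure active spectrum appears only below that point.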

    Incorporation of uncertainties in real-time catchment flood forecasting

    Floods have become the most prevalent and costly natural hazards in the U.S. When preparing real-time flood forecasts for a catchment flood warning and preparedness system, consideration must be given to four sources of uncertainty: natural, data, model parameters, and model structure. A general procedure has been developed for applying reliability analysis to evaluate the effects of the various sources of uncertainty on hydrologic models used for forecasting and prediction of catchment floods. Three reliability analysis methods -- Monte Carlo simulation and mean value and advanced first-order second-moment analyses (MVFOSM and AFOSM, respectively) -- were applied to the rainfall-runoff modeling reliability problem. Comparison of these methods indicates that the AFOSM method is probably best suited to the rainfall-runoff modeling reliability problem, with the MVFOSM showing some promise. The feasibility and utility of the reliability analysis procedure are shown for a case study employing as an example the HEC-1 and RORB rainfall-runoff watershed models to forecast flood events on the Vermilion River watershed at Pontiac, Illinois. The utility of the reliability analysis approach is demonstrated for four important hydrologic problems: 1) determination of forecast (or prediction) reliability; 2) determination of the flood level exceedance probability due to a current storm and development of "rules of thumb" for flood warning decision making considering this probabilistic information; 3) determination of the key sources of uncertainty influencing model forecast reliability; and 4) selection of hydrologic models based on comparison of model forecast reliability. Central to this demonstration is the reliability analysis methods' ability to estimate the exceedance probability for any hydrologic target level of interest and, hence, to produce forecast cumulative distribution functions and probability density functions.
    For typical hydrologic modeling cases, reduction of the underlying modeling uncertainties is the key to obtaining useful, reliable forecasts. Furthermore, determination of the rainfall excess is the primary source of uncertainty, especially in the estimation of the temporal and areal rainfall distributions. (Funding: U.S. Department of the Interior, U.S. Geological Survey.)
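The Monte Carlo branch of such a reliability analysis, estimating the exceedance probability of a flood target level under uncertain inputs, can be sketched as follows. The input distributions, parameter values, and the toy rational-method-style peak-flow model are illustrative assumptions, not values or models from the study:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical uncertain inputs (illustrative distributions):
rain = rng.lognormal(mean=np.log(50.0), sigma=0.3, size=N)          # storm depth [mm]
runoff_c = np.clip(rng.normal(0.45, 0.08, size=N), 0.05, 0.95)      # runoff coefficient [-]
area = 100.0                                                        # catchment area [km^2]

# Toy peak-flow model in rational-method form, Q ~ C * i * A (units schematic):
peak_flow = runoff_c * rain * area * 0.01

# Exceedance probability for a chosen hydrologic target level:
target = 30.0
p_exceed = np.mean(peak_flow > target)

# The same samples yield the full forecast distribution, e.g. an empirical CDF:
cdf_at_target = np.mean(peak_flow <= target)   # equals 1 - p_exceed
```

The key point is that one set of samples gives the exceedance probability for any target level of interest, which is exactly what a forecast cumulative distribution function encodes.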

    FABRICATION, MEASUREMENTS, AND MODELING OF SEMICONDUCTOR RADIATION DETECTORS FOR IMAGING AND DETECTOR RESPONSE FUNCTIONS

    In the first part of this dissertation, we cover the development of a diamond semiconductor alpha-tagging sensor for associated particle imaging to solve challenges with currently employed scintillators. The alpha-tagging sensor is a double-sided strip detector made from polycrystalline CVD diamond. The performance goals of the alpha-tagging sensor are 700-picosecond timing resolution and 0.5 mm spatial resolution. A literature review summarizes the methodology, goals, and challenges in associated particle imaging. The history and current state of alpha-tagging sensors, followed by the properties of diamond semiconductors, are discussed to close the literature review. The materials and methods used to calibrate the detector readout, fabricate the sensor, perform simulations, take measurements, and conduct data analysis are discussed. The results of our simulations and measurements are described with challenges and interpretations. The first part of the dissertation concludes with potential solutions to challenges with our diamond alpha-tagging sensor design and recommendations for work to help further verify or refute diamond's viability for alpha tagging in associated particle imaging. In the second part of this dissertation, we cover the development of a high-purity germanium detector response function for the Los Alamos National Laboratory Detector Response Function Toolkit. The goal is to accurately model the pulse-height spectra measured by semiconductor radiation detectors. The literature review provides information on high-purity germanium radiation detectors and semiconductor charge transport kinematics. The components of the electronic readout and their effect on radiation measurements are discussed. The literature review ends with a discussion of different methods for building detector response functions. In the methods section, we explain our methodology for building detector response functions.
    This includes models of radiation transport, electrostatics, charge transport, and electronic readout components. Within the methods section, there are results from individual components to demonstrate their functionality. The results section is reserved for demonstrating the use of the detector response function as a whole. We provide the modeled pulse-height spectra for different radiation sources and user input parameters. These are compared to experimentally measured datasets. The second part of the dissertation concludes with a discussion of the benefits, drawbacks, and future improvements that could be made.
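One core ingredient of a pulse-height detector response function, folding ideal full-energy lines with an energy-dependent Gaussian resolution, can be sketched as below. The resolution model's coefficients and the chosen gamma lines are illustrative, not fitted HPGe values or the toolkit's actual model:

```python
import numpy as np

def fwhm_kev(E):
    """Energy-dependent resolution, FWHM(E) = sqrt(a + b*E) in keV.
    Coefficients a, b are illustrative placeholders."""
    a, b = 1.0e-3, 1.5e-3
    return np.sqrt(a + b * E)

def response(lines_kev, intensities, bins):
    """Fold ideal gamma lines into a binned pulse-height spectrum via Gaussian broadening."""
    centers = 0.5 * (bins[:-1] + bins[1:])
    spec = np.zeros_like(centers)
    for E, I in zip(lines_kev, intensities):
        sigma = fwhm_kev(E) / 2.355                    # FWHM -> standard deviation
        pdf = np.exp(-0.5 * ((centers - E) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
        spec += I * pdf
    return spec * np.diff(bins)                        # counts per bin

bins = np.linspace(0.0, 1500.0, 3001)                  # 0.5 keV bins
spec = response([661.7, 1332.5], [1.0, 0.5], bins)     # e.g. Cs-137 and Co-60 lines
```

A full response function adds Compton continua, escape peaks, and readout effects on top of this broadening step, but the Gaussian fold is the piece that turns discrete energy depositions into realistic peak shapes.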

    The 10th Jubilee Conference of PhD Students in Computer Science


    Techniques for Low Voltage Scanning Electron Microscopy Linewidth Measurements

    Present scanning electron microscopy (SEM) linewidth measurement systems, although state of the art, require better-defined techniques for deriving operating parameters for precision measurements. Experiments were performed to check techniques used on cleaved and uncleaved specimens void of conductive coatings, to obtain optimum SEM operating parameters, and to assess the variations in results due to changes in system operating conditions. In addition, a method was devised to select and use different calibration standards and to evaluate SEM linewidth measurement systems.

    Modern Data Acquisition, System Design, and Analysis Techniques and their Impact on the Physics-based Understanding of Neutron Coincidence Counters used for International Safeguards (draft)

    Neutron coincidence counting is a technique widely used in the field of international safeguards for the mass quantification of a fissioning item. It can exploit either passive or active interrogation techniques to assay a wide range of plutonium, uranium, and mixed oxide items present in nuclear facilities worldwide. Because neutrons are highly penetrating and the time correlation between events provides an identifiable signature, the technique, often combined with gamma spectroscopy, has been used for nondestructive assay of special nuclear material for decades. When neutron coincidence counting was first established, a few system designs emerged as standards for assaying common containers. Over successive decades, new systems were developed for a wider variety of inspection assays. Simultaneously, new system characterization procedures, data acquisition technologies, and performance optimizations were made. The International Atomic Energy Agency has been using many of these original counters for decades, despite the large technological growth in recent years. This is both a testament and an opportunity. This dissertation explores several topics in which the performance of neutron coincidence counting systems is studied such that their behavior may be better understood from physical models, and their applications may be expanded to a greater field of interest. Using modern list mode data acquisition and analysis, procedures are developed, implemented, and exploited to expand the information obtained on both these systems and the sources in question in a common measurement. System parameters such as coincidence time windows, dead time, efficiency, die-away time, and non-ideal double pulsing are explored in new ways that are not possible using traditional shift register logic.
    In addition, modern amplifier electronics are retrofitted in one model, the Uranium Neutron Coincidence Collar, to allow a count rate-based source spatial response matrix to be measured, ultimately for the identification of diversion in a fresh fuel assembly. The testing, evaluation, and optimization of these electronics are described; they may serve as a more capable alternative to existing electronics used in IAEA systems. Finally, with a thorough understanding of the system characteristics and performance, neutron coincidence counters may be used to self-certify calibration sources with precision superior to that of national metrological laboratories.
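Offline, shift-register-style gating of list-mode timestamps, the kind of analysis such list-mode acquisition enables, can be sketched as follows. The predelay, gate width, long-delay offset, and the timestamp list are illustrative values, not parameters from the dissertation:

```python
import numpy as np

def multiplicity_in_gate(times, predelay, gate, delay=0.0):
    """For each trigger pulse at time t, count events in (t+delay+predelay, t+delay+predelay+gate].
    With delay=0 this mirrors the shift-register R+A gate; a long delay gives the A gate."""
    times = np.sort(np.asarray(times, dtype=float))
    starts = times + delay + predelay
    lo = np.searchsorted(times, starts, side="right")          # first event after gate opens
    hi = np.searchsorted(times, starts + gate, side="right")   # last event before gate closes
    return hi - lo                                             # gate multiplicity per trigger

# Illustrative timestamps [us]; in practice these come from the list-mode stream
t = np.array([0.0, 1.0, 2.5, 40.0, 41.0, 41.5, 200.0])
ra = multiplicity_in_gate(t, predelay=0.5, gate=8.0)             # R+A gate counts
a = multiplicity_in_gate(t, predelay=0.5, gate=8.0, delay=64.0)  # accidentals (A) gate counts
doubles_proxy = ra.sum() - a.sum()                               # R+A minus A
```

Because the full timestamp list is retained, the same data can be regated with any predelay or gate width after the fact, which is precisely what fixed shift-register hardware cannot do.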

    Building performance simulation in the brave new world of Artificial Intelligence and Digital Twins : a systematic review

    In an increasingly digital world, there are fast-paced developments in fields such as Artificial Intelligence, Machine Learning, Data Mining, Digital Twins, Cyber-Physical Systems and the Internet of Things. This paper reviews and discusses how these new emerging areas relate to the traditional domain of building performance simulation. It explores the boundaries between building simulation and these other fields in order to identify conceptual differences and similarities, and the strengths and limitations of each of these areas. The paper critiques common notions about these new domains and how they relate to building simulation, reviewing how the field of building performance simulation may evolve and benefit from the new developments.

    2009 International SWAT Conference Conference Proceedings
