
    The G0 Experiment: Apparatus for Parity-Violating Electron Scattering Measurements at Forward and Backward Angles

    In the G0 experiment, performed at Jefferson Lab, the parity-violating elastic scattering of electrons from protons and quasi-elastic scattering from deuterons are measured in order to determine the neutral weak currents of the nucleon. Asymmetries as small as 1 part per million in the scattering of a polarized electron beam are determined using a dedicated apparatus. It consists of specialized beam-monitoring and control systems, a cryogenic hydrogen (or deuterium) target, and a superconducting, toroidal magnetic spectrometer equipped with plastic scintillation and aerogel Cherenkov detectors, as well as fast readout electronics for the measurement of individual events. The overall design and performance of this experimental system are discussed. Comment: Submitted to Nuclear Instruments and Methods.
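
    The part-per-million asymmetries quoted above follow the standard helicity-asymmetry definition. Below is a minimal sketch of that calculation, not the experiment's actual analysis code; the 5 ppm value, count rates, and number of helicity pairs are illustrative only.

    ```python
    import numpy as np

    def raw_asymmetry(yield_plus, yield_minus):
        """Standard helicity asymmetry A = (Y+ - Y-) / (Y+ + Y-).

        yield_plus / yield_minus: detector yields (counts, normalized to
        beam charge) for the two beam-helicity states.
        """
        yp = np.asarray(yield_plus, dtype=float)
        ym = np.asarray(yield_minus, dtype=float)
        return (yp - ym) / (yp + ym)

    # Toy example: a ppm-scale asymmetry only emerges after averaging many
    # helicity pairs, since single-pair statistical noise (~1/sqrt(2N))
    # is far larger than the signal.
    rng = np.random.default_rng(0)
    true_A = 5e-6                      # hypothetical 5 ppm asymmetry
    n_pairs = 2_000_000
    base = 1e6                         # counts per helicity window
    yp = rng.poisson(base * (1 + true_A), n_pairs)
    ym = rng.poisson(base * (1 - true_A), n_pairs)
    A = raw_asymmetry(yp, ym)
    print(f"mean A = {A.mean():.2e} +/- {A.std() / np.sqrt(n_pairs):.2e}")
    ```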

    Performance Evaluation of Open Graded Base Course with Doweled and Non-Doweled Transverse Joints

    The objectives of this study were to investigate the performance of 20-year-old doweled/non-doweled and dense-graded/permeable base test sections on three concrete pavement segments in Wisconsin: USH 18/151 in Iowa and Dane Counties, STH 29 in Brown County, and USH 151 in Columbia and Dane Counties. Five pavement bases were placed: dense-graded, asphalt-stabilized permeable, cement-stabilized permeable, and untreated permeable in two gradation sizes. USH 18/151 test sections had similar performance (pavement distress index, PDI) for doweled unsealed pavement on dense and permeable bases. Distresses common to all segments included slight to moderate distressed joints/cracks and slight transverse faulting. Asphalt-stabilized permeable base had no slab breakup or surface distresses; however, it exhibited more severely distressed joints and cracks. Non-doweled sections with asphalt-stabilized permeable base and Transverse Inter Channel drains had better performance and ride than the other non-doweled sections. The International Roughness Index (IRI) was generally higher on non-doweled pavements, but many doweled sections were as rough as non-doweled sections. Sealed non-doweled joints produced a better-performing pavement; however, sealant did not appear to improve ride. On STH 29, unsealed sections performed better than the median PDI of the sealed sections. The sealed doweled pavement performed slightly better than the sealed non-doweled pavement, but the opposite held for the unsealed sections. Sealed doweled joints had a smoother ride than the other combinations. Among the USH 151 test sections, the finer-graded New Jersey permeable base had the smoothest ride of the permeable sections; asphalt-stabilized permeable base had the roughest ride, and unstabilized and cement-stabilized permeable bases had intermediate values. The average hydraulic conductivity of the unstabilized permeable base was 17,481 feet per day, with little apparent variation due to doweling or joint sealant. Deflection load-transfer results showed the expected high average values for the doweled sections and fair to poor values for the non-doweled sections. Slab support ratios varied with base type and joint reinforcement/sealant. Life-cycle cost analysis found dense-graded base to be the least expensive of all base alternatives, with a total estimated present-worth life-cycle cost of $665,133 per roadway mile; untreated and asphalt-stabilized permeable bases were more expensive by 13% and 27%, respectively. Other factors favoring dense-graded base over permeable base include the project drainage conditions set forth in the FDM guidelines and an anticipated increase in pavement surface roughness.
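
    The life-cycle cost comparison follows directly from the reported present-worth figure and percent premiums. A small worked sketch: only the $665,133 figure and the 13%/27% premiums come from the study; the script itself is illustrative.

    ```python
    # Worked comparison of the reported present-worth life-cycle costs
    # (per roadway mile). Dense-graded base is the reported baseline.
    dense_graded = 665_133                              # $/mile (reported)
    premiums = {"untreated permeable": 0.13,            # +13% (reported)
                "asphalt-stabilized permeable": 0.27}   # +27% (reported)

    for base, pct in premiums.items():
        cost = dense_graded * (1 + pct)
        print(f"{base}: ${cost:,.0f} per mile (+{pct:.0%})")
    # untreated permeable: $751,600 per mile (+13%)
    # asphalt-stabilized permeable: $844,719 per mile (+27%)
    ```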

    A novel method for subjective picture quality assessment and further studies of HDTV formats

    This paper proposes a novel method for the assessment of picture quality, called the triple stimulus continuous evaluation scale (TSCES), to allow the direct comparison of different HDTV formats. The method uses an upper picture quality anchor and a lower picture quality anchor with defined impairments. The HDTV format under test is evaluated in a subjective comparison with the upper and lower anchors. The method utilizes three displays in a particular vertical arrangement. In an initial series of tests with the novel method, the HDTV formats 1080p/50, 1080i/25, and 720p/50 were compared at various bit-rates and with seven different content types on three identical 1920 × 1080 pixel displays. The new method was found to provide stable and consistent results. The method was tested with 1080p/50, 1080i/25, and 720p/50 HDTV images that had been coded with the H.264/AVC High profile. The assessment showed that the progressive HDTV formats were rated higher by the assessors than the interlaced HDTV format. A system chain proposal is given for future media production and delivery to take advantage of this outcome. Recommendations for future research conclude the paper.
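
    The abstract does not give the TSCES scoring formula. Purely as a hypothetical illustration of anchored rating, a continuous score for the format under test could be normalized against the scores given to the two anchors; the function below is an assumption, not the method defined in the paper.

    ```python
    def anchored_score(test, lower, upper):
        """Hypothetical normalization of a continuous rating against the
        lower/upper picture-quality anchors (0 = lower anchor, 1 = upper).
        An illustration only, not the TSCES scoring from the paper."""
        if upper == lower:
            raise ValueError("anchor ratings must differ")
        return (test - lower) / (upper - lower)

    # e.g. a test clip rated 62 on a 0-100 scale, with anchors at 25 and 85
    print(anchored_score(62, 25, 85))   # 0.6167
    ```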

    Dynamic forecasts of qualitative variables: a Qual VAR model of U.S. recessions

    This article presents a new Qual VAR model for incorporating information from qualitative and/or discrete variables in vector autoregressions. With a Qual VAR, it is possible to create dynamic forecasts of the qualitative variable using standard VAR projections. Previous forecasting methods for qualitative variables, in contrast, produce only static forecasts. I apply the Qual VAR to forecasting the 2001 business recession out of sample and to analyzing the Romer and Romer (1989) narrative measure of monetary policy contractions as an endogenous variable in a VAR. Out of sample, the model predicts the timing of the 2001 recession quite well relative to the recession probabilities put forth at the time by professional forecasters. Qual VARs, which include information about the qualitative variable, can also enhance the quality of density forecasts of the other variables in the system. Keywords: forecasting; recessions; vector autoregression.
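
    A full Qual VAR estimates the latent index behind the qualitative variable by MCMC. As a greatly simplified sketch of the forecasting step only, the code below fits a VAR by OLS with the latent index taken as given and maps its dynamic forecast to a probability through the standard normal CDF (a probit convention). Function names and the unit-variance assumption are illustrative, not the article's implementation.

    ```python
    import numpy as np
    from scipy.stats import norm

    def fit_var(Y, p=1):
        """OLS estimate of a VAR(p) on the T x k data matrix Y.
        Returns the intercept vector c and a list of k x k lag matrices."""
        T, k = Y.shape
        X = np.hstack([np.ones((T - p, 1))] +
                      [Y[p - l - 1:T - l - 1] for l in range(p)])
        B, *_ = np.linalg.lstsq(X, Y[p:], rcond=None)
        c = B[0]
        A = [B[1 + l * k:1 + (l + 1) * k].T for l in range(p)]
        return c, A

    def recession_prob(Y, c, A, h, latent_col=0):
        """Iterate the VAR h steps ahead; map the forecast of the latent
        index (column latent_col) to a probability via the normal CDF,
        assuming a unit-variance latent innovation (probit convention)."""
        hist = [row for row in Y[-len(A):]]
        for _ in range(h):
            y_next = c + sum(A[l] @ hist[-1 - l] for l in range(len(A)))
            hist.append(y_next)
        return norm.cdf(hist[-1][latent_col])

    # Usage on simulated data: column 0 plays the latent recession index.
    rng = np.random.default_rng(1)
    Y = rng.standard_normal((200, 3)).cumsum(axis=0) * 0.05
    c, A = fit_var(Y, p=2)
    print(recession_prob(Y, c, A, h=4))   # P(latent index > 0), 4 steps out
    ```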

    Non-invasive vascular assessment using photoplethysmography

    Photoplethysmography (PPG) has become widely accepted as a valuable clinical tool for performing non-invasive biomedical monitoring. The dominant clinical application of PPG has been pulse oximetry, which uses spectral analysis of the peripheral blood supply to establish haemoglobin saturation. PPG has also found success, though to a limited degree, in screening for venous dysfunction. Arterial Disease (AD) is a condition in which blood flow in the arteries of the body is reduced, a condition known as ischaemia. Ischaemia can result in pain in the affected areas, such as chest pain for an ischaemic heart, but does not always produce symptoms. The most common form of AD is arteriosclerosis, which affects around 5% of the population over 50 years old. Arteriosclerosis, more commonly known as 'hardening of the arteries', is a condition that results in a gradual thickening, hardening and loss of elasticity in the walls of the arteries, reducing overall blood flow. This thesis investigates the possibility of employing PPG to perform vascular assessment, specifically arterial assessment, in two ways. First, PPG-based perfusion monitoring may allow identification of ischaemia in the periphery. To investigate this premise, prospective experimental trials were performed, initially to assess the viability of PPG-based perfusion monitoring and culminating in the development of a more objective method for determining the Ankle Brachial Pressure Index (ABPI) using PPG-based vascular assessment. Second, the PPG signal is generated by a complex interaction between the heart and the connective vasculature, detected at the measuring site. The haemodynamic properties of the vasculature affect the shape of the PPG waveform, characterising the PPG signal with the properties of the intermediary vasculature. This thesis therefore also investigates the feasibility of deriving quantitative vascular parameters from the PPG signal; a quantitative approach allows direct identification of pathology, simplifying vascular assessment. Both forward and inverse models are developed to investigate this topic. Application of the models in prospective experimental trials with both normal subjects and subjects suffering from PVD has shown encouraging results. It is concluded that the PPG signal contains information on the connective vasculature of the subject. PPG may be used to perform vascular assessment using either perfusion-based techniques, where the magnitude of the PPG signal is of interest, or by directly assessing the connective vasculature, where the shape of the PPG signal is of interest. It is argued that PPG perfusion-based techniques for performing the ABPI diagnosis protocol can offer greater sensitivity to the onset of PAD than more conventional methods. It is speculated that the PPG-based ABPI diagnosis protocol could provide enhanced PAD diagnosis, detecting the onset of the disease and allowing a treatment plan to be formed sooner than was previously possible. The determination of quantitative vascular parameters from PPG shape could allow direct vascular diagnosis, reducing subjectivity due to interpretation. The prospective trials investigating PPG shape analysis concentrated on PVD diagnosis, but it is speculated that quantitative PPG shape-based vascular assessment could be a powerful tool in the diagnosis of many vascular pathological conditions.
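
    ABPI itself is a simple ratio of systolic pressures; below is a minimal sketch of the standard calculation and the commonly cited clinical cut-off. The thesis's contribution is the PPG-based measurement feeding this ratio, which is not reproduced here, and its exact thresholds may differ.

    ```python
    def abpi(ankle_systolic_mmHg, brachial_systolic_mmHg):
        """Ankle Brachial Pressure Index: ankle systolic pressure divided
        by brachial (arm) systolic pressure."""
        return ankle_systolic_mmHg / brachial_systolic_mmHg

    def classify(index):
        # Conventional cut-off: ABPI below about 0.9 suggests peripheral
        # arterial disease (PAD); the thesis's protocol may use different
        # thresholds.
        return "suggests PAD" if index < 0.9 else "within normal range"

    value = abpi(ankle_systolic_mmHg=95, brachial_systolic_mmHg=120)
    print(f"ABPI = {value:.2f}: {classify(value)}")  # ABPI = 0.79: suggests PAD
    ```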

    Sterile neutrino search with KATRIN - modeling and design-criteria of a novel detector system

    A fundamental open problem in particle physics is the unexplained massive component of our universe: Dark Matter. A promising candidate that could explain these observations is the sterile neutrino with a mass of several $\mathrm{keV}/c^2$. While it is presumed that sterile neutrinos do not interact via the weak force, they still, owing to their mass, partake in neutrino oscillation. Consequently, it is experimentally possible to investigate their imprint in beta-decay experiments such as the Karlsruhe Tritium Neutrino experiment (KATRIN). A dedicated search for sterile neutrinos, however, entails a steep increase in the electron rate and thus requires the development of a new detector system, the TRISTAN detector. In addition, as the imprint of sterile neutrinos is presumably $<10^{-7}$, systematic uncertainties have to be understood and modeled with high precision. In this thesis, systematics prevalent in the detector and spectrometer sections of KATRIN are discussed and their impact on the sterile neutrino sensitivity illuminated. The derived model is compared with data from the current KATRIN detector and with characterization measurements of the first TRISTAN prototype detectors, seven-pixel silicon drift detectors. It is shown that the final TRISTAN detector requires a sophisticated redesign of the KATRIN detector section. Moreover, the combined impact of the back-scattering and electron charge-sharing systematics leads to an optimal detector magnetic field of $B_\mathrm{det}=0.7\dots0.8\,\mathrm{T}$, which translates to a pixel radius of $r_\mathrm{px}=1.5\dots1.6\,\mathrm{mm}$. The sensitivity analysis discusses individual effects as well as the combined impact of the systematic uncertainties. It is demonstrated that the individual effects can be largely mitigated by shifting the tritium beta-decay energy spectrum above the beta-decay endpoint. In contrast, their combined impact on the sensitivity leads to an overall degradation, and only mixing amplitudes of $\sin^2\theta_4<3\cdot10^{-6}$ would be reachable, even in an optimized case with a very low and homogeneous detection dead layer of $z_\mathrm{dl}=20\pm1\,\mathrm{nm}$. Assessing sterile neutrino mixing amplitudes of $\sin^2\theta_4<10^{-7}$ thus requires disentangling the systematic effects. In a future measurement this could be achieved, for example, by vetoing detector events with large signal rise times and small inter-event times.
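
    The sterile-neutrino imprint referred to above is a kink in the tritium beta spectrum: the measured spectrum is a superposition of an active-mass branch and a sterile branch weighted by the mixing amplitude $\sin^2\theta_4$. The sketch below shows that standard superposition using the bare phase-space factor only (electron momentum, Fermi function, and detector response deliberately omitted); all numbers are illustrative.

    ```python
    import numpy as np

    E0 = 18_574.0   # tritium beta endpoint in eV (approximate)

    def phase_space(E, m_nu):
        """Bare beta-spectrum phase-space factor for neutrino mass m_nu (eV):
        dGamma/dE ~ (E0 - E) * sqrt((E0 - E)^2 - m_nu^2), zero below the
        kinematic threshold. Electron momentum and Fermi-function factors
        are omitted for brevity."""
        eps = E0 - E
        out = np.zeros_like(E)
        ok = eps > m_nu
        out[ok] = eps[ok] * np.sqrt(eps[ok] ** 2 - m_nu ** 2)
        return out

    def spectrum(E, sin2_theta4, m_sterile, m_active=0.0):
        """Superposition (1 - sin^2 theta_4) * active branch
        + sin^2 theta_4 * sterile branch; the sterile branch switches on
        below E0 - m_sterile, producing the characteristic kink."""
        return ((1 - sin2_theta4) * phase_space(E, m_active)
                + sin2_theta4 * phase_space(E, m_sterile))

    E = np.linspace(0.0, E0, 1000)
    s = spectrum(E, sin2_theta4=1e-2, m_sterile=10_000.0)  # 10 keV sterile
    # the kink sits at E0 - m_sterile, i.e. around 8.6 keV here
    ```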

    Decision support system for forecasting product and waste disposal over time

    A current barrier to end-of-life product strategies is a lack of knowledge of the quantity and timing of product returns or waste generation. Consequently, a Decision Support System (DSS) to predict the timing and quantity of waste generation is needed. A web-based system that predicts the rate of products entering the waste stream has therefore been developed. The system accepts information relating to sales, reliability, storage, and disposal behavior, and uses these data to simulate the return waste flow. The DSS can be used to simulate the effect of different policies and to provide data for end-of-life and multi-lifecycle strategy formulation. Cathode Ray Tubes (CRTs) are evaluated as a test case: the number of CRT-based color televisions entering the waste stream is estimated using the forecasting software. The effects of parameters such as pre-disposal storage and disposal due to obsolescence of the televisions are computed. The effect of the introduction of televisions using newer flat-panel display technology is also estimated, as is the effect of a legal ban on the disposal of CRT-based televisions in the state of Massachusetts.
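
    The core of such a forecast is a convolution of historical sales with a time-to-disposal distribution. A minimal sketch of that idea, assuming discrete yearly time steps; the distribution and the numbers are illustrative, not the DSS's calibrated inputs.

    ```python
    import numpy as np

    def disposal_forecast(sales, lifetime_pmf):
        """Units entering the waste stream per period: sales convolved
        with the probability mass function of total time-to-disposal
        (product lifetime plus any pre-disposal storage)."""
        return np.convolve(sales, lifetime_pmf)

    # Illustrative inputs: yearly sales and a discretized time-to-disposal
    # distribution (e.g. most sets discarded 5-8 years after purchase).
    sales = np.array([100, 120, 150, 160, 140], dtype=float)  # units/year
    lifetime_pmf = np.array([0, 0, 0, 0.05, 0.10, 0.25, 0.30, 0.20, 0.10])
    assert abs(lifetime_pmf.sum() - 1.0) < 1e-9

    waste = disposal_forecast(sales, lifetime_pmf)
    for year, units in enumerate(waste):
        print(f"year {year}: {units:.1f} units discarded")
    ```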