
    CHANCE CONSTRAINED PROGRAMMING MODELS FOR RISK-BASED ECONOMIC AND POLICY ANALYSIS OF SOIL CONSERVATION

    The random nature of soil loss under alternative land-use practices should be an important consideration in soil conservation planning and analysis under risk. Chance constrained programming models can provide policy makers with information on the trade-offs among pre-determined tolerance levels of soil loss, the probability of satisfying those tolerance levels, and the economic profits or losses resulting from soil conservation. When using chance constrained programming models, the distribution of the constrained factors must be evaluated: if the random variables follow a log-normal distribution, the normality assumption generally used in chance constrained programming models can bias the results.
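
    As a minimal illustration (not taken from the paper), the sketch below contrasts the deterministic equivalent of a chance constraint P(soil loss <= T) >= alpha under the usual normality assumption with the correct treatment when soil loss is log-normal. The tolerance, probability level and distribution parameters are hypothetical.

        # Illustrative deterministic equivalents of P(soil_loss <= T) >= alpha.
        # All parameter values are hypothetical.
        import numpy as np
        from scipy.stats import norm, lognorm

        alpha = 0.95       # required probability of keeping soil loss within tolerance
        T = 16.0           # hypothetical soil-loss tolerance

        # Hypothetical log-normal soil loss S: ln(S) ~ N(m, s^2)
        m, s = 1.5, 0.8
        mean_S = np.exp(m + s**2 / 2)                # mean of S
        sd_S = mean_S * np.sqrt(np.exp(s**2) - 1)    # standard deviation of S
        z = norm.ppf(alpha)

        # Deterministic equivalent under the usual normality assumption on S:
        print("normal assumption satisfied:  ", mean_S + z * sd_S <= T)
        # Correct equivalent when S is log-normal (constrain ln S instead):
        print("log-normal treatment satisfied:", m + z * s <= np.log(T))
        # True probability of meeting the tolerance under the log-normal model:
        print("actual P(S <= T) =", round(lognorm(s, scale=np.exp(m)).cdf(T), 3))

    With these illustrative numbers the normal approximation declares the 95% constraint satisfied, while the actual probability of meeting the tolerance under the log-normal model is only about 0.94, which is the kind of bias the abstract warns about.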

    Secondary Electron Emission Beam Loss Monitor for LHC

    The Beam Loss Monitoring (BLM) system is a vital part of the active protection of the LHC accelerator's elements. It should provide the number of particles lost from the primary hadron beam by measuring the radiation field induced by their interaction with matter surrounding the beam pipe. The LHC BLM system will use ionization chambers as standard detectors, but in the areas where very high dose rates are expected, Secondary Emission Monitor (SEM) chambers will be employed because of their high linearity, low sensitivity and fast response. The SEM needs a high vacuum for proper operation and has to remain functional for up to 20 years; therefore all components were designed according to UHV requirements and a getter pump was included. The SEM electrodes are made of Ti because of its Secondary Emission Yield (SEY) stability. The sensitivity of the SEM was modeled in Geant4 via the Photo-Absorption Ionization module together with a custom parameterization of the very low energy secondary electron production. The prototypes were calibrated with proton beams in the CERN PS Booster dump line, the SPS transfer line and the PSI Optis line. The results were compared to the simulations.

    Abort Gap Cleaning using the Transverse Feedback System: Simulation and Measurements in the SPS for the LHC Beam Dump System

    The critical and delicate process of dumping the beams of the LHC requires very low particle densities within the 3 μs of the dump kicker rising edge. A high beam population in this so-called 'abort gap' might cause magnet quenches or even damage. Constant refilling due to diffusion processes is expected, which will be counteracted by an active abort gap cleaning system employing the transverse feedback kickers. In order to assess the feasibility and performance of such an abort gap cleaning system, simulations and measurements with beam in the SPS have been performed. Here we report on the results of these studies.

    Classification of the LHC BLM Ionization Chamber

    The LHC beam loss monitoring (BLM) system must prevent the superconducting magnets from quenching and protect the machine components from damage. The main monitor type is an ionization chamber; about 4000 of them will be installed around the ring. The lost beam particles initiate hadronic showers through the magnets and other machine components, and these shower particles are measured by the monitors installed on the outside of the accelerator equipment. For the calibration of the BLM system, the signal response of the ionization chamber is simulated in GEANT4 for all relevant particle types and energies (keV to TeV range). For validation, the simulations are compared to measurements using protons, neutrons, photons and mixed radiation fields at various energies and intensities. This paper focuses on the signal response of the ionization chamber to various particle types and energies, including space charge effects at high ionization densities.
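
    As a schematic of how a calibrated response of this kind is typically applied (not the paper's code), the sketch below folds a placeholder particle spectrum at a monitor with placeholder per-particle, per-energy response values to obtain an expected chamber signal.

        # Schematic folding of a particle spectrum with per-particle chamber responses.
        # Fluences and response values are placeholders, not simulated LHC values.
        fluence = {                       # particles reaching the monitor per loss event
            ("proton", "1 GeV"): 2.0e4,
            ("neutron", "10 MeV"): 5.0e5,
            ("photon", "1 MeV"): 1.2e6,
        }
        response = {                      # chamber charge per particle (placeholder units)
            ("proton", "1 GeV"): 3.0e-14,
            ("neutron", "10 MeV"): 4.0e-16,
            ("photon", "1 MeV"): 6.0e-17,
        }
        signal = sum(fluence[k] * response[k] for k in fluence)
        print(f"expected chamber signal: {signal:.3e} (placeholder charge units)")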

    Utility and lower limits of frequency detection in surface electrode stimulation for somatosensory brain-computer interface in humans

    Objective: Stimulation of the primary somatosensory cortex (S1) has been successful in evoking artificial somatosensation in both humans and animals, but much is unknown about the optimal stimulation parameters needed to generate robust percepts of somatosensation. In this study, the authors investigated frequency as an adjustable stimulation parameter for artificial somatosensation in a closed-loop brain-computer interface (BCI) system. Methods: Three epilepsy patients with subdural mini-electrocorticography grids over the hand area of S1 were asked to compare the percepts elicited with different stimulation frequencies. Amplitude, pulse width, and duration were held constant across all trials. In each trial, subjects experienced 2 stimuli and reported which they thought was given at a higher stimulation frequency. Two paradigms were used: first, 50 versus 100 Hz to establish the utility of comparing frequencies, and then 2, 5, 10, 20, 50, or 100 Hz were pseudorandomly compared. Results: As the magnitude of the stimulation frequency was increased, subjects described percepts that were “more intense” or “faster.” Cumulatively, the participants achieved 98.0% accuracy when comparing stimulation at 50 and 100 Hz. In the second paradigm, the corresponding overall accuracy was 73.3%. If both tested frequencies were less than or equal to 10 Hz, accuracy was 41.7% and increased to 79.4% when one frequency was greater than 10 Hz (p = 0.01). When both stimulation frequencies were 20 Hz or less, accuracy was 40.7% compared with 91.7% when one frequency was greater than 20 Hz (p < 0.001). Accuracy was 85% in trials in which 50 Hz was the higher stimulation frequency. Therefore, the lower limit of detection occurred at 20 Hz, and accuracy decreased significantly when lower frequencies were tested. In trials testing 10 Hz versus 20 Hz, accuracy was 16.7% compared with 85.7% in trials testing 20 Hz versus 50 Hz (p < 0.05). Accuracy was greater than chance at frequency differences greater than or equal to 30 Hz. Conclusions: Frequencies greater than 20 Hz may be used as an adjustable parameter to elicit distinguishable percepts. These findings may be useful in informing the settings and the degrees of freedom achievable in future BCI systems.
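
    The reported accuracies come from two-alternative forced-choice comparisons, where chance is 50%. A minimal sketch of how such an accuracy would be tested against chance with a binomial test is given below; the trial counts are hypothetical, since the study's exact numbers are not reproduced here.

        # Two-alternative forced choice: test accuracy against the 50% chance level.
        # Trial counts are hypothetical, not the study's actual numbers.
        from scipy.stats import binomtest

        correct, total = 47, 48    # e.g. ~98% accuracy on the 50 Hz vs 100 Hz comparisons
        result = binomtest(correct, total, p=0.5, alternative="greater")
        print(f"accuracy = {correct / total:.1%}, p = {result.pvalue:.2g}")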

    The key role of surface tension in the transport and quantification of plastic pollution in rivers

    Current riverine plastic monitoring best practices mainly consider surface observations, thus neglecting the underlying distribution of plastics in the water column. The resulting bias in plastic budget estimates hinders advances in modelling and predicting the fate of plastics. Here, we experimentally disclose the structure of plastic transport in surface water flows by investigating how thousands of samples of plastics commonly found in fluvial environments travel in turbulent river flows. We show for the first time that surface tension plays a key role in the transport of plastics, since its effects can be of the same magnitude as buoyancy and turbulence, thereby holding part of the dispersed buoyant plastics captive at the water surface. We investigate two types of transport: surfaced plastics (surface tension-turbulence-buoyancy dominated), in contact with the free surface, and suspended plastics (turbulence-buoyancy dominated). We prove that this duality in transport modes is a major source of error in the estimation of plastic budgets, which can be underestimated by 90% under current, well-established monitoring protocols if sampling is conducted solely at the water surface. Based on our empirical findings, we optimize physics-driven monitoring strategies for plastic fluxes in rivers, achieving over a ten-fold reduction in the bias and uncertainty of riverine plastic pollution estimates.
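
    The claim that surface tension can rival buoyancy can be made concrete with the Bond number, Bo = |Δρ| g L² / σ, the ratio of buoyancy to surface-tension forces. The sketch below evaluates it for a few illustrative fragment sizes and a polyethylene-like density; these values are illustrative and are not the paper's data or analysis.

        # Bond number Bo = |d_rho| * g * L^2 / sigma: buoyancy relative to surface tension.
        # Illustrative values only; this is not the paper's analysis.
        g = 9.81                  # m/s^2
        sigma = 0.072             # N/m, air-water surface tension
        rho_water = 1000.0        # kg/m^3
        rho_plastic = 920.0       # kg/m^3, e.g. a buoyant polyethylene fragment

        for L in (1e-3, 5e-3, 2e-2):     # characteristic fragment sizes in metres
            Bo = abs(rho_water - rho_plastic) * g * L**2 / sigma
            print(f"L = {L * 1e3:4.1f} mm -> Bo = {Bo:6.3f}")

    For millimetre-scale buoyant fragments Bo is well below one and only approaches unity at centimetre scale, so surface tension is indeed strong enough to hold such particles at the free surface.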

    LHC Beam Loss Detector Design: Simulations and Measurements

    The Beam Loss Monitoring (BLM) system is integrated in the active equipment protection system of the LHC. It determines the number of particles lost from the primary hadron beam by measuring the radiation field of the shower particles outside of the vacuum chamber. The LHC BLM system will use ionization chambers as its standard detectors, but in the areas where very high dose rates are expected, Secondary Emission Monitor (SEM) chambers will additionally be employed because of their high linearity, low sensitivity and fast response. The sensitivity of the SEM was modeled in Geant4 via the Photo-Absorption Ionization module together with a custom parameterization of the very low energy secondary electron production. The prototypes were calibrated with proton beams. For the calibration of the BLM system, the signal response of the ionization chamber is simulated in Geant4 for all relevant particle types and energies (keV to TeV range). The results are validated by comparing the simulations to measurements using protons, neutrons, photons and mixed radiation fields at various energies and intensities.

    Generation of 1.5 Million Beam Loss Threshold Values

    CERN's Large Hadron Collider will store an unprecedented amount of energy in its circulating beams. Beam loss monitoring (BLM) is therefore critical for machine protection: it must protect against the consequences of excessive beam loss (equipment damage, quenches of superconducting magnets). About 4000 monitors will be installed at critical loss locations. Each monitor has 384 associated beam abort thresholds, one for each of 12 integrated loss durations (40 μs to 83 s) and 32 beam energies (450 GeV to 7 TeV). Depending on monitor location, the thresholds vary by orders of magnitude. For simplification, the monitors are grouped into 'families': monitors of one family protect similar magnets against equivalent loss scenarios and are therefore given the same thresholds. The start-up calibration of the BLM system is required to be accurate within a factor of five, and the final accuracy should be within a factor of two. Simulations (backed up by control measurements) determine the relation between the BLM signal, the deposited energy and the critical energy deposition (coil temperature) for damage or quench. The paper presents the strategy for determining the 1.5 million threshold values.
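
    The 1.5 million figure follows from roughly 4000 monitors times 384 thresholds each (12 loss durations × 32 beam energies). The sketch below shows only that bookkeeping and the family-based assignment of identical threshold tables; the family names and threshold values are placeholders, not the LHC settings.

        # ~4000 monitors x 12 loss durations x 32 beam energies = 1,536,000 thresholds.
        # Family names and threshold values below are placeholders.
        import numpy as np

        n_monitors, n_durations, n_energies = 4000, 12, 32
        print(n_monitors * n_durations * n_energies)     # 1536000, i.e. ~1.5 million

        # One threshold table per family, shared by all monitors of that family:
        family_tables = {
            "arc_quadrupole": np.full((n_durations, n_energies), 1e-3),
            "arc_dipole":     np.full((n_durations, n_energies), 2e-3),
        }
        monitor_family = {f"BLM_{i:04d}": ("arc_quadrupole" if i % 2 else "arc_dipole")
                          for i in range(n_monitors)}
        thresholds = {name: family_tables[fam] for name, fam in monitor_family.items()}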

    Smartphone-recorded physical activity for estimating cardiorespiratory fitness

    While cardiorespiratory fitness is strongly associated with mortality and diverse outcomes, routine measurement is limited. We used smartphone-derived physical activity data to estimate fitness among 50 older adults. We recruited iPhone owners undergoing cardiac stress testing and collected recent iPhone physical activity data. Cardiorespiratory fitness was measured as peak metabolic equivalents of task (METs) achieved on the cardiac stress test. We then estimated peak METs using multivariable regression models incorporating iPhone physical activity data, and validated the models with bootstrapping. The individual smartphone variables most significantly correlated with peak METs (both p < 0.001) were daily peak gait speed averaged over the preceding 30 days (r = 0.63) and the root mean square of the successive differences of daily distance averaged over 365 days (r = 0.57). The best-performing multivariable regression model included the latter variable, as well as age and body mass index. This model explained 68% of the variability in observed METs (95% CI 46%, 81%) and estimated peak METs with a bootstrapped mean absolute error of 1.28 METs (95% CI 0.98, 1.60). Our model using smartphone physical activity data estimated cardiorespiratory fitness with high performance. Our results suggest that larger, independent samples might yield estimates accurate and precise enough for risk stratification and disease prognostication.
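
    As a rough illustration of the modelling approach described here, the sketch below computes the root mean square of successive differences (RMSSD) of daily distance and fits a linear regression with age and body mass index on synthetic data; the data, coefficients and error are not those of the study.

        # Feature construction and regression as described in the abstract, but fitted
        # on synthetic data; coefficients and errors are not those of the study.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)

        def rmssd(x):
            """Root mean square of the successive differences of a daily series."""
            return np.sqrt(np.mean(np.diff(x) ** 2))

        n = 50                                          # participants
        daily_km = rng.gamma(2.0, 2.0, size=(n, 365))   # synthetic daily distances
        X = np.column_stack([
            [rmssd(row) for row in daily_km],           # RMSSD of daily distance
            rng.uniform(60, 80, n),                     # age (years)
            rng.uniform(20, 35, n),                     # body mass index
        ])
        peak_mets = rng.normal(8, 2, n)                 # synthetic stress-test peak METs

        model = LinearRegression().fit(X, peak_mets)
        mae = np.mean(np.abs(model.predict(X) - peak_mets))
        print(f"in-sample MAE: {mae:.2f} METs")         # the study bootstrapped its MAE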