Analyses of risks associated with radiation exposure from past major solar particle events
Radiation exposures and cancer induction/mortality risks were investigated for several major solar particle events (SPEs). The SPEs included are: February 1956, November 1960, August 1972, October 1989, and the August, September, and October 1989 events combined. The three 1989 events were treated as one because all three could affect a single lunar or Mars mission. A baryon transport code was used to propagate particles through aluminum and tissue shield materials, and a free-space environment was assumed for all calculations. Results show that the 30-day blood-forming organ (BFO) limit of 25 rem was surpassed by all five events behind 10 g/sq cm of shielding. The BFO limit is based on a dose at a tissue depth of 5 cm, whereas here a more detailed shield distribution of the BFOs was also used; a comparison shows that the 5 cm depth-dose value is slightly higher than the dose found using the BFO shield distribution. The annual limit of 50 rem was exceeded by the August 1972 event, the October 1989 event, and the three combined 1989 events with 5 g/sq cm of shielding. Cancer mortality risks for a 45-year-old male ranged from 1.5 to 17 percent behind 1 g/sq cm and from 0.5 to 1.1 percent behind 10 g/sq cm of shielding for the five events. Secondary particles are shown to contribute about one third of the total risk at 10 g/sq cm of shielding. Using a computerized Space Shuttle shielding model to represent a typical spacecraft configuration in free space during the August 1972 SPE, average crew doses exceeded the BFO dose limit.
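The limit check described above can be sketched in a few lines. This is a minimal illustration of comparing per-event BFO doses against the 30-day limit of 25 rem; the dose values below are placeholders for illustration, not the doses computed in the study.

```python
# Illustrative sketch: flag SPEs whose 30-day BFO dose exceeds the 25 rem limit.
# The dose values are placeholders, NOT the study's computed results.
BFO_LIMIT_30_DAY_REM = 25.0

def events_exceeding_limit(bfo_doses_rem, limit=BFO_LIMIT_30_DAY_REM):
    """Return the names of events whose BFO dose surpasses the limit."""
    return [name for name, dose in bfo_doses_rem.items() if dose > limit]

# Hypothetical doses (rem) behind 10 g/sq cm of aluminum:
doses = {"Feb 1956": 57.0, "Nov 1960": 42.0, "Aug 1972": 110.0}
print(events_exceeding_limit(doses))  # all three hypothetical doses exceed 25 rem
```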
Single-mode regime in large-mode-area rare-earth-doped rod-type PCFs
In this paper, large-mode-area, double-cladding, rare-earth-doped photonic crystal fibers are investigated in order to understand how the refractive-index distribution and the mode competition introduced by the amplification can ensure single-mode propagation. Fibers with different core diameters, i.e., 35, 60, and 100 μm, are considered. The analysis of the mode effective index, overlap, effective area, gain, and power evolution along the doped fiber provides clear guidelines on the fiber physical characteristics to be matched in the fabrication process to obtain a truly or effectively single-mode output beam.
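One of the quantities analyzed above, the mode effective area, can be computed directly from a transverse field profile as A_eff = (∫|E|² dA)² / ∫|E|⁴ dA. The sketch below assumes a Gaussian fundamental mode (for which A_eff = πw² analytically); the mode-field radius is an illustrative value, not one of the paper's fiber designs.

```python
import numpy as np

# Mode effective area A_eff = (integral |E|^2)^2 / integral |E|^4,
# evaluated on a discrete grid. The Gaussian field is an assumption.
def effective_area(E, dx, dy):
    I = np.abs(E) ** 2
    return (I.sum() * dx * dy) ** 2 / ((I ** 2).sum() * dx * dy)

x = np.linspace(-100e-6, 100e-6, 801)  # transverse grid, meters
X, Y = np.meshgrid(x, x)
w = 17.5e-6                            # 1/e^2 field radius (illustrative)
E = np.exp(-(X ** 2 + Y ** 2) / w ** 2)

dx = x[1] - x[0]
ratio = effective_area(E, dx, dx) / (np.pi * w ** 2)
print(ratio)  # close to 1: numerical A_eff matches the analytic pi*w^2
```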
BRYNTRN: A baryon transport model
The development of an interaction database and a numerical solution to the transport of baryons through an arbitrary shield material, based on a straight-ahead approximation of the Boltzmann equation, are described. The code is most accurate for continuous energy boundary values, but gives reasonable results for discrete spectra at the boundary even with a relatively coarse energy grid (30 points) and large spatial increments (1 cm in H2O). The resulting computer code is self-contained, efficient, and ready to use, and requires only a very small fraction of the computer resources needed by Monte Carlo codes.
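The structure of a straight-ahead marching scheme can be sketched as follows: the fluence spectrum is attenuated slab by slab, with a crude uniform source term standing in for secondary production. The cross sections, spectrum, and the secondary-redistribution rule are all illustrative assumptions, not BRYNTRN's actual physics.

```python
import numpy as np

# Minimal straight-ahead transport sketch: march a fluence spectrum through
# slabs of thickness dx, removing sigma*phi*dx per step and (optionally)
# redistributing a fraction of it as a toy secondary source.
def march(phi, sigma, dx, nsteps, source_frac=0.0):
    """phi: fluence per energy bin; sigma: macroscopic cross section (1/cm)."""
    for _ in range(nsteps):
        removed = sigma * phi * dx
        phi = phi - removed + source_frac * removed.sum() / phi.size
    return phi

phi0 = np.array([1.0, 0.5, 0.2])     # boundary spectrum (arbitrary units)
sigma = np.array([0.1, 0.08, 0.05])  # 1/cm, hypothetical
print(march(phi0, sigma, dx=1.0, nsteps=10))  # attenuated spectrum after 10 cm
```

With `source_frac=0` this reduces to pure exponential-like attenuation, `phi0 * (1 - sigma*dx)**nsteps`, which makes the step size trade-off (the paper's 1 cm in H2O) easy to see.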
The ALTCRISS project on board the International Space Station
The ALTCRISS project aims to perform a long-term survey of the radiation environment on board the International Space Station. Measurements are being performed with active and passive devices at different locations and orientations in the Russian segment of the station. The goal is a detailed evaluation of the differences in particle fluence and nuclear composition due to the different shielding materials and the attitude of the station. The Sileye-3/Alteino detector is used to identify nuclei up to iron in the energy range above 60 MeV/n. Several passive dosimeters (TLDs, CR39) are also placed at the same location as the Sileye-3 detector. Polyethylene shielding is periodically interposed in front of the detectors to evaluate its effectiveness against the nuclear component of the cosmic radiation. The project was submitted to ESA in response to the 2004 Announcement of Opportunity in Life and Physical Sciences, and data taking began in December 2005. Dosimeters and data cards are rotated every six months: so far, three launches of dosimeters and data cards have been performed, with returns at the end of Expeditions 12 and 13.
Comment: Accepted for publication in Advances in Space Research
http://dx.doi.org/10.1016/j.asr.2007.04.03
Earth-Moon-Mars Radiation Environment Module framework
We are preparing to return humans to the Moon and setting the stage for exploration to Mars and beyond. However, it is unclear whether long missions outside of low-Earth orbit can be accomplished with acceptable risk. The central objective of a new modeling project, the Earth-Moon-Mars Radiation Environment Module (EMMREM), is to develop and validate a numerical module for characterizing time-dependent radiation exposure in the Earth-Moon-Mars and interplanetary space environments. EMMREM is being designed for broad use by researchers to predict radiation exposure by integrating over almost any incident particle distribution from interplanetary space. We detail here the overall structure of the EMMREM module and study the dose histories of the 2003 Halloween storm event and a June 2004 event. We show both the event histories measured at 1 AU and the evolution of these events at observer locations beyond 1 AU, and compare the results to observations at Ulysses. The model allows us to predict how the radiation environment evolves with radial distance from the Sun. The model comparison also suggests areas in which our understanding of the physics of particle propagation and energization needs to be improved to better forecast the radiation environment. Thus, we introduce the suite of EMMREM tools, which will be used to improve risk assessment models so that future human exploration missions can be adequately planned.
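The core operation of integrating over an incident particle distribution can be sketched as folding a differential flux with a response function, D = ∫ Φ(E) R(E) dE. The power-law flux and linear response below are illustrative assumptions, not EMMREM's actual spectra or dose-response models.

```python
import numpy as np

# Trapezoidal integration of flux(E) * response(E) over energy,
# the kind of fold used to turn an incident spectrum into a dose rate.
def dose_rate(E, flux, response):
    f = flux * response
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(E)))

E = np.logspace(1, 3, 200)   # MeV, 10 MeV to 1 GeV
flux = 1e4 * E ** -2.0       # particles / (cm^2 s MeV), hypothetical
response = 1e-8 * E          # dose per unit fluence, hypothetical

print(dose_rate(E, flux, response))  # analytically 1e-4 * ln(100), ~4.6e-4
```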
Threshold meson production and cosmic ray transport
An interesting accident of nature is that the peak of the cosmic-ray spectrum, for both protons and heavier nuclei, occurs near the pion production threshold. The Boltzmann transport equation contains a term that is the cosmic-ray flux multiplied by the cross section. Therefore, when considering pion and kaon production from proton-proton reactions, small cross sections at low energy can be as important as larger cross sections at higher energy. This is also true for subthreshold kaon production in nuclear collisions, but not for subthreshold pion production.
Comment: 9 pages, 1 figure
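The point about flux-weighted cross sections can be illustrated numerically: the production rate per target is rate = ∫ φ(E) σ(E) dE, so a small cross section that opens where the cosmic-ray flux peaks can outweigh a ten-times-larger one at high energy. Both the flux shape and the step-function cross sections below are schematic, not measured data.

```python
import numpy as np

# Flux-weighted production rates for two schematic channels.
E = np.linspace(200.0, 5000.0, 2000)         # MeV, proton kinetic energy
dE = E[1] - E[0]
flux = E * np.exp(-E / 400.0)                # schematic spectrum peaking ~400 MeV
sigma_low = np.where(E > 290.0, 0.5, 0.0)    # small, opens at the pion threshold
sigma_high = np.where(E > 2000.0, 5.0, 0.0)  # 10x larger, but high threshold

rate_low = float(np.sum(flux * sigma_low) * dE)
rate_high = float(np.sum(flux * sigma_high) * dE)
print(rate_low > rate_high)  # True: the low-energy channel dominates
```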
Cross Section Sensitivity and Propagated Errors in HZE Exposures
It has long been recognized that galactic cosmic rays are of such high energy that they tend to pass through available shielding materials, resulting in exposure of astronauts and equipment within space vehicles and habitats. Any protection provided by shielding materials results not so much from stopping such particles as from changing their physical character through interactions with shielding-material nuclei, forming, one hopes, less dangerous species. Clearly, the fidelity of the nuclear cross sections is essential to correct specification of shield design, and sensitivity to cross-section error is important in guiding experimental validation of cross-section models and databases. We examine the Boltzmann transport equation, which is used to calculate the dose equivalent (cSv/yr) during solar minimum behind various depths of shielding materials. The dose equivalent is a weighted sum of contributions from neutrons, protons, light ions, medium ions, and heavy ions. We investigate the sensitivity of dose-equivalent calculations to errors in nuclear fragmentation cross sections, performing this error analysis for all possible projectile-fragment combinations (14,365 such combinations) to estimate the sensitivity of the shielding calculations to cross-section errors. Numerical differentiation with respect to the cross sections is evaluated for a broad class of materials, including polyethylene, aluminum, and copper. We identify the most important cross sections for further experimental study and evaluate their impact on propagated errors in shielding estimates.
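The numerical differentiation step can be sketched as a central difference: perturb a single fragmentation cross section and measure the change in dose equivalent. Here `dose_equivalent()` is a toy stand-in for the transport calculation, not the paper's actual code, and the cross-section values are hypothetical.

```python
import numpy as np

# Toy model: each channel's dose contribution attenuates with its cross section.
def dose_equivalent(sigma):
    return float(np.sum(np.exp(-sigma)))

def sensitivity(sigma, j, h=1e-6):
    """Central difference dH/dsigma_j for projectile-fragment channel j."""
    up, dn = sigma.copy(), sigma.copy()
    up[j] += h
    dn[j] -= h
    return (dose_equivalent(up) - dose_equivalent(dn)) / (2.0 * h)

sigma = np.array([0.2, 0.5, 1.0])  # hypothetical cross sections
print(sensitivity(sigma, 1))       # analytically -exp(-0.5), about -0.607
```

In the paper's setting the same differentiation would be repeated over all 14,365 projectile-fragment channels to rank which cross sections dominate the propagated error.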
Period Estimation for Linux-based Edge Computing Virtualization with Strong Temporal Isolation
Virtualization of edge nodes is paramount to avoid their under-exploitation, allowing applications from different tenants to share the underlying computing platform. Nevertheless, enabling different applications to share the same hardware may expose them to uncontrolled mutual timing interference, as well as timing-related security attacks. Strong timing isolation through SCHED_DEADLINE reservations is an interesting solution to facilitate the safe and secure sharing of the processing platform. However, SCHED_DEADLINE reservations require proper parameter tuning, which can be hard to achieve, especially in highly dynamic environments characterized by workloads that must be served without accurate information about their timing. This paper presents an approach for estimating the periods of SCHED_DEADLINE reservations based on a spectral analysis of the activation pattern of the workload running in the reservation, which can be used to assign and refine reservation parameters in edge systems.
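The spectral idea can be sketched as follows: sample the workload's activation times onto a binary time series and take the dominant FFT peak as the activation period. The 10 ms task, jitter level, and sampling step are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Estimate a workload's activation period from its activation timestamps
# via the dominant peak of the FFT of the binarized activation signal.
def estimate_period(activations, t_end, dt=1e-4):
    n = int(t_end / dt)
    signal = np.zeros(n)
    idx = np.clip((np.asarray(activations) / dt).astype(int), 0, n - 1)
    signal[idx] = 1.0
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(n, dt)
    peak = 1 + np.argmax(spectrum[1:])  # skip the DC bin
    return 1.0 / freqs[peak]

# A task activating every 10 ms with ~1 ms of jitter:
rng = np.random.default_rng(0)
acts = np.arange(0.0, 1.0, 0.010) + rng.normal(0.0, 1e-3, 100)
print(estimate_period(acts, t_end=1.0))  # close to 0.010 s
```

The recovered period could then seed a SCHED_DEADLINE reservation's period parameter and be refined as the workload evolves.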
Operating System Noise in the Linux Kernel
As modern network infrastructure moves from hardware-based to software-based using Network Function Virtualization, a new set of requirements is raised for operating system developers. By using the real-time kernel options and advanced CPU isolation features common to HPC use cases, Linux is becoming a central building block for this new architecture, which aims to enable a new set of low-latency networked services. Tuning Linux for these applications is not an easy task, as it requires a deep understanding of the Linux execution model and a mix of user-space tooling and tracing features. This paper discusses the internal aspects of Linux that influence operating system noise from a timing perspective. It also presents Linux's osnoise tracer, an in-kernel tracer that enables the measurement of operating system noise as observed by a workload and the tracing of the sources of that noise in an integrated manner, facilitating the analysis and debugging of the system. Finally, this paper presents a series of experiments demonstrating both Linux's ability to deliver low OS noise (on the order of single-digit μs) and the ability of the proposed tool to provide precise information about the root cause of timing-related OS noise problems.
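The measurement principle behind osnoise can be sketched in user space: spin reading the clock and count gaps above a threshold as noise (time stolen by preemptions, IRQs, and similar interference). The threshold and duration below are illustrative; the real tracer performs this in-kernel with per-source accounting.

```python
import time

# Busy-loop noise detector: any gap between consecutive clock reads larger
# than the threshold is time the workload did not get to run.
def measure_noise(duration_s=0.2, threshold_ns=20_000):
    noise_events = []
    start = last = time.monotonic_ns()
    while last - start < duration_s * 1e9:
        now = time.monotonic_ns()
        gap = now - last
        if gap > threshold_ns:
            noise_events.append(gap)  # time stolen from the busy loop, in ns
        last = now
    return noise_events

events = measure_noise()
print(len(events), "gaps above 20 us")
```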
Priority-Driven Differentiated Performance for NoSQL Database-As-a-Service
Designing data stores for native Cloud Computing services brings a number of challenges, especially if the Cloud Provider wants to offer database services capable of controlling the response time for specific customers. These requests may come from heterogeneous data-driven applications with conflicting responsiveness requirements. For instance, a batch processing workload does not require the same level of responsiveness as a time-sensitive one, and their coexistence may interfere with the responsiveness of the time-sensitive workload, such as online video gaming, virtual reality, or cloud-based machine learning. This paper presents a modification to the popular MongoDB NoSQL database to enable differentiated per-user/per-request performance on a priority basis by leveraging CPU scheduling and synchronization mechanisms available within the operating system. This is achieved with minimally invasive changes to the source code and without affecting the performance and behavior of the database when the new feature is not in use. The proposed extension has been integrated with the access-control model of MongoDB for secure and controlled access to the new capability. Extensive experimentation with realistic workloads demonstrates how the proposed solution reduces the response times for high-priority users/requests, with respect to lower-priority ones, in scenarios with mixed-priority clients accessing the data store.
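The priority-based dispatching idea can be sketched with a simple priority queue: higher-priority clients' requests are dequeued first, with FIFO order preserved within a priority level. The class, priority levels, and request names below are illustrative, not MongoDB internals.

```python
import heapq

# Priority-driven request dispatcher: lower number = higher priority;
# a sequence counter keeps FIFO order among same-priority requests.
class PriorityDispatcher:
    def __init__(self):
        self._heap, self._seq = [], 0

    def submit(self, priority, request):
        heapq.heappush(self._heap, (priority, self._seq, request))
        self._seq += 1

    def next_request(self):
        return heapq.heappop(self._heap)[2]

d = PriorityDispatcher()
d.submit(1, "batch-scan")       # low priority
d.submit(0, "game-state-read")  # high priority (time-sensitive client)
d.submit(1, "batch-export")
print(d.next_request())  # "game-state-read": the high-priority request is served first
```

In the paper's design the analogous effect is obtained inside the database by mapping client priority onto OS-level CPU scheduling and synchronization, rather than by reordering a user-space queue.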
