Radioprotection calculations for the TRADE experiment
The TRADE project is based on the coupling, in a sub-critical configuration, of a 115 MeV, 2 mA proton cyclotron with a TRIGA research reactor at the ENEA Casaccia centre (Rome). Detailed radioprotection calculations using the FLUKA and EA-MC Monte Carlo codes were performed during the feasibility study. The study concentrated on dose rates due to beam losses in normal operating conditions and on the calculation of activation in the most sensitive components of the experiment. Results show that a shielding of 1.4 m of barytes concrete around the beam line will be sufficient to keep the effective doses below the level of 10 μSv/h, provided that the beam losses are at the level of 10 nA/m. The activation level around the beam line and in the water will be negligible, while the spallation target will reach an activation level comparable to that of a fuel element at maximum burnup.
Parity Violation in Neutron Resonances in 107,109Ag
Parity nonconservation (PNC) was studied in p-wave resonances in Ag by measuring the helicity dependence of the neutron total cross section. Transmission measurements on natural Ag were performed in the energy range 32 to 422 eV with the time-of-flight method at the Manuel Lujan Neutron Scattering Center at Los Alamos National Laboratory. A total of 15 p-wave neutron resonances were studied in 107Ag and nine p-wave resonances in 109Ag. Statistically significant asymmetries were observed for eight resonances in 107Ag and for four resonances in 109Ag. An analysis treating the PNC matrix elements as random variables yields a weak spreading width of Γw = (2.67 +2.65/−1.21) × 10⁻⁷ eV for 107Ag and Γw = (1.30 +2.49/−0.74) × 10⁻⁷ eV for 109Ag.
Cost of a population-based programme of chest x-ray screening for lung cancer.
Background. After the implementation of a population-based programme of chest x-ray (CXR) screening of smokers in Varese, Italy, lung cancer (LC) mortality was significantly reduced. Analysis of the incremental costs due to this type of screening programme is needed to evaluate its economic impact on the healthcare system.
Methods. In July 1997 a population-based cohort, consisting of all high-risk smokers (n=5,815) identified among 60,000 adult residents of the Varese province, was invited to a LC screening programme (an annual CXR for five years) in a general practice setting, and was observed through 2006. Invitees received National Health Service (NHS) usual care, with the addition of CXRs in screening participants. At the end of observation, among the 245 LCs diagnosed in the entire screening-invited cohort, the observed LC deaths were 38 fewer than expected. To estimate the incremental direct cost due to screening in the invited cohort for the period July 1997–2006, we compared the direct costs of screening administration, CXR screens and LC management in the invited cohort with those in the uninvited and unscreened controls in the NHS usual care setting.
Results. Over the 9.5 years, the total incremental direct healthcare costs (including screening organization/administration, CXR screens, additional procedures prompted by false-positive tests, and overdiagnosed LCs) were estimated to range from €607,440 to €618,370 (in euros as of 2012), equating to between €15,985 and €16,273 per LC death averted.
Conclusions. In a general practice setting, the incremental cost of a CXR screening programme targeted at all high-risk smokers in a population of 60,000 adults was estimated to be about €65,000 per annum, approximately €16,000 for each LC death averted.
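The per-annum and per-death-averted figures quoted above follow directly from dividing the total incremental cost over the 9.5-year observation window; a quick arithmetic check (variable names are illustrative):

```python
# Sanity check of the cost figures reported in the abstract (2012 euros).
total_cost_low = 607_440   # lower estimate of total incremental cost
total_cost_high = 618_370  # upper estimate of total incremental cost
years = 9.5                # July 1997 through 2006
deaths_averted = 38        # observed LC deaths fewer than expected

# Annualized cost: about 64,000-65,000 euros per annum
annual_low = total_cost_low / years
annual_high = total_cost_high / years

# Cost per LC death averted: 15,985-16,273 euros
per_death_low = round(total_cost_low / deaths_averted)
per_death_high = round(total_cost_high / deaths_averted)

print(annual_low, annual_high)          # ~63,941 and ~65,092
print(per_death_low, per_death_high)    # 15985 and 16273
```

The rounded per-death values reproduce the €15,985 and €16,273 quoted in the Results, and the annualized range is consistent with the "about €65,000 per annum" in the Conclusions.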
SHREC'16 Track: Retrieval of Human Subjects from Depth Sensor Data
In this paper we report the results of the SHREC 2016 contest on "Retrieval of human subjects from depth sensor data". The proposed task was created in order to verify the possibility of retrieving models of query human subjects from single shots of depth sensors, using shape information only. Depth acquisitions of the different subjects were performed under different illumination conditions, with different clothes and in three different poses. The resulting point clouds of the partial body-shape acquisitions were segmented, coupled with the skeleton provided by the OpenNI software, and provided to the participants together with derived triangulated meshes. No color information was provided. Retrieval scores of the different methods proposed were estimated on the submitted dissimilarity matrices, and the influence of the different acquisition conditions on the algorithms was also analyzed. The results obtained by the participants and by the baseline methods demonstrate that the proposed task is, as expected, quite difficult, especially due to the partiality of the shape information and the poor accuracy of the estimated skeleton, but they give useful insights on potential strategies that can be applied in similar retrieval procedures and derived practical applications. Categories and Subject Descriptors (according to ACM CCS): I.4.8 [IMAGE PROCESSING AND COMPUTER VISION]: Scene Analysis—Shape
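As a rough illustration of how a retrieval score can be derived from a submitted dissimilarity matrix: the exact SHREC evaluation measures are not detailed here, so nearest-neighbour accuracy is used purely as an example metric, and the matrix and labels below are hypothetical.

```python
import numpy as np

def nearest_neighbor_accuracy(dissimilarity, labels):
    """Fraction of queries whose closest other item shares the query's label.

    dissimilarity: (n, n) array where a smaller value means more similar.
    labels: length-n sequence of subject identities.
    """
    d = np.asarray(dissimilarity, dtype=float).copy()
    labels = np.asarray(labels)
    np.fill_diagonal(d, np.inf)   # exclude self-matches from retrieval
    nn = d.argmin(axis=1)         # index of the nearest neighbour per query
    return float((labels[nn] == labels).mean())

# Toy example: 4 scans of 2 subjects (hypothetical distances).
D = np.array([[0.0, 0.2, 0.9, 0.8],
              [0.2, 0.0, 0.7, 0.9],
              [0.9, 0.7, 0.0, 0.1],
              [0.8, 0.9, 0.1, 0.0]])
print(nearest_neighbor_accuracy(D, [0, 0, 1, 1]))  # 1.0
```

Ranking-based measures such as precision-recall curves are computed from the same matrix by sorting each row; only the per-query sort order matters, not the absolute distance values.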
Parity Violation in Neutron Resonances in 115In
Parity nonconservation (PNC) was studied in p-wave resonances in indium by measuring the helicity dependence of the neutron total cross section in the neutron energy range 6.0–316 eV with the time-of-flight method at LANSCE. A total of 36 p-wave neutron resonances were studied in 115In, and statistically significant asymmetries were observed in nine cases. An analysis treating the PNC matrix elements as random variables yields a weak matrix element of M = (0.67 +0.16/−0.12) meV and a weak spreading width of Γw = (1.30 +0.76/−0.43) × 10⁻⁷ eV.
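In the PNC literature the weak spreading width quoted in analyses of this kind is conventionally related to the root-mean-square weak matrix element and the mean level spacing of the compound-nucleus states; assuming the standard convention (the authors' exact definition is not restated here):

```latex
% Standard relation between the weak spreading width \Gamma_w, the
% root-mean-square PNC matrix element M, and the mean level spacing D
% of the compound-nucleus states (convention assumed, not stated above):
\Gamma_w = \frac{2\pi M^2}{D}
```

Under this convention the quoted M and Γw are not independent results: either one, together with the level spacing D extracted from the resonance data, determines the other.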
TRY plant trait database - enhanced coverage and open access
Plant traits - the morphological, anatomical, physiological, biochemical and phenological characteristics of plants - determine how plants respond to environmental factors, affect other trophic levels, and influence ecosystem properties and their benefits and detriments to people. Plant trait data thus represent the basis for a vast area of research spanning from evolutionary biology, community and functional ecology, to biodiversity conservation, ecosystem and landscape management, restoration, biogeography and earth system modelling. Since its foundation in 2007, the TRY database of plant traits has grown continuously. It now provides unprecedented data coverage under an open access data policy and is the main plant trait database used by the research community worldwide. Increasingly, the TRY database also supports new frontiers of trait-based plant research, including the identification of data gaps and the subsequent mobilization or measurement of new data. To support this development, in this article we evaluate the extent of the trait data compiled in TRY and analyse emerging patterns of data coverage and representativeness. The best species coverage is achieved for categorical traits, with almost complete coverage for 'plant growth form'. However, most traits relevant for ecology and vegetation modelling are characterized by continuous intraspecific variation and trait-environment relationships. These traits have to be measured on individual plants in their respective environments. Despite unprecedented data coverage, we observe a humbling lack of completeness and representativeness of these continuous traits in many aspects. We therefore conclude that reducing data gaps and biases in the TRY database remains a key challenge and requires a coordinated approach to data mobilization and trait measurements. This can only be achieved in collaboration with other initiatives.
A specific nanobody prevents amyloidogenesis of D76N β2-microglobulin in vitro and modifies its tissue distribution in vivo
Systemic amyloidosis is caused by misfolding and aggregation of globular proteins in vivo, and effective treatments are urgently needed. Inhibition of protein self-aggregation represents an attractive therapeutic strategy. Studies on the amyloidogenic variant of β2-microglobulin, D76N, which causes hereditary systemic amyloidosis, have become particularly relevant since its fibrils form in vitro under physiologically relevant conditions. Here we compare the potency of two previously described inhibitors of wild-type β2-microglobulin fibrillogenesis: doxycycline and single-domain antibodies (nanobodies). The β2-microglobulin-binding nanobody Nb24 inhibits D76N β2-microglobulin fibrillogenesis more potently than doxycycline, with complete abrogation of fibril formation. In β2-microglobulin knockout mice, the preformed D76N β2-microglobulin/Nb24 complex is cleared from the circulation at the same rate as the uncomplexed protein; however, analysis of tissue distribution reveals that the interaction with the antibody reduces the concentration of the variant protein in the heart but does not modify the tissue distribution of wild-type β2-microglobulin. These findings strongly support the potential therapeutic use of this antibody in the treatment of systemic amyloidosis.
Spheroid arrays for high-throughput single-cell analysis of spatial patterns and biomarker expression in 3D
We describe and share a device, methodology and image analysis algorithms which allow up to 66 spheroids to be arranged into a gel-based array directly from a culture plate for downstream processing and analysis. Compared to processing individual samples, the technique uses 11-fold less reagent, saves time and enables automated imaging. To illustrate the power of the technology, we showcase applications of the methodology for investigating 3D spheroid morphology and marker expression and for in vitro safety and efficacy screens. First, spheroid arrays of 11 cell lines were rapidly assessed for differences in spheroid morphology. Second, highly positive (SOX-2), moderately positive (Ki-67) and weakly positive (βIII-tubulin) protein targets were detected and quantified. Third, the arrays enabled screening of ten media compositions for inducing differentiation in human neurospheres. Finally, the application of spheroid microarrays to spheroid-based drug screens was demonstrated by quantifying the dose-dependent drop in proliferation and increase in differentiation in etoposide-treated neurospheres.
Spallation reactions. A successful interplay between modeling and applications
Spallation reactions are nuclear reactions that occur in space through the interaction of cosmic rays with interstellar bodies. The first spallation reactions induced with an accelerator took place in 1947 at the Berkeley cyclotron (University of California) with 200 MeV deuteron and 400 MeV alpha beams. They revealed the multiple emission of neutrons and charged particles and the production of a large number of residual nuclei very different from the target nuclei. The same year, R. Serber described the reaction in two steps: a first, fast one with high-energy particle emission leading to an excited remnant nucleus, and a second, much slower one, the de-excitation of the remnant. In 2010 the IAEA organized a workshop to present the results of the most widely used spallation codes within a benchmark of spallation models. While one of the goals was to understand the deficiencies, if any, in each code, one remarkable outcome was the overall high quality of some models, and hence the great improvements achieved since Serber. Particle transport codes can then rely on such spallation models to treat the reactions between a light particle and an atomic nucleus at energies spanning from a few tens of MeV up to some GeV. An overview of spallation reaction modeling is presented in order to point out the incomparable contribution of models based on basic physics to the numerous applications where such reactions occur. Validations and benchmarks, which are necessary steps in the improvement process, are also addressed, as well as potential future domains of development. Spallation reaction modeling is a representative case of continuous studies aiming at understanding a reaction mechanism, which end up producing a powerful tool.