Application of the EXtrapolated Efficiency Method (EXEM) to infer the gamma-cascade detection efficiency in the actinide region
The study of transfer-induced gamma-decay probabilities is very useful for
understanding the surrogate-reaction method and, more generally, for
constraining statistical-model calculations. One of the main difficulties in
the measurement of gamma-decay probabilities is the determination of the
gamma-cascade detection efficiency. In [Nucl. Instrum. Meth. A 700, 59 (2013)]
we developed the Extrapolated Efficiency Method (EXEM), a new method to measure
this quantity. In this work, we have applied, for the first time, the EXEM to
infer the gamma-cascade detection efficiency in the actinide region. In
particular, we have considered the 238U(d,p)239U and 238U(3He,d)239Np
reactions. We have performed Hauser-Feshbach calculations to interpret our
results and to verify the hypothesis on which the EXEM is based. The
determination of fission and gamma-decay probabilities of 239Np below the
neutron separation energy allowed us to validate the EXEM.
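To make the role of this quantity explicit: in such transfer-induced measurements the gamma-decay probability is typically obtained from (generic notation, not taken from the paper)

    P_\gamma(E^{*}) = \frac{N_{p\gamma}(E^{*})}{N_{p}(E^{*})\,\varepsilon_{\gamma}(E^{*})},

where N_p(E*) is the number of detected ejectiles selecting the excitation energy E*, N_pgamma(E*) the number of ejectile-gamma coincidences, and epsilon_gamma(E*) the gamma-cascade detection efficiency that the EXEM extrapolates into the region of interest.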
Preliminary results on the 233U capture cross section and alpha ratio measured at n_TOF (CERN) with the fission tagging technique
233U is of key importance among the fissile nuclei in the Th-U fuel cycle. A particularity of 233U is its small neutron capture cross-section, which is on average about one order of magnitude lower than the fission cross-section. The accuracy of a measurement of the 233U capture cross-section therefore depends crucially on an efficient capture-fission discrimination, so a combined set-up of fission and gamma-detectors is needed. A measurement of the 233U capture cross-section and capture-to-fission ratio was performed at the CERN n_TOF facility. The Total Absorption Calorimeter (TAC) of n_TOF was employed as gamma-detector, coupled with a novel compact ionization chamber as fission detector. A brief description of the experimental set-up is given, and essential parts of the analysis procedure, as well as the preliminary response of the set-up to capture, are presented and discussed.
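As a schematic illustration of the fission-tagging idea (generic symbols, not taken from the paper): TAC events in coincidence with the fission chamber tag the prompt-fission gamma background, which is scaled by the fission-chamber efficiency and subtracted,

    N_{\mathrm{capture}} \simeq N_{\mathrm{TAC}} - \frac{N_{\mathrm{TAC}\wedge\mathrm{FC}}}{\varepsilon_{\mathrm{FC}}},

so the achievable accuracy hinges on an efficient fission detector and a well-known tagging efficiency epsilon_FC.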
Chance-constrained programming with fuzzy stochastic coefficients
We consider fuzzy stochastic programming problems with a crisp objective function and linear constraints whose coefficients are fuzzy random variables, in particular of type L-R. To solve this type of problem, we formulate deterministic counterparts of chance-constrained programming with fuzzy stochastic coefficients by combining constraints on the probability of satisfying the constraints with constraints on their possibility and necessity. We discuss possible indices for comparing fuzzy quantities by putting together interval orders and statistical preference. We study the convexity of the set of feasible solutions under various assumptions. We also consider the case where fuzzy intervals are viewed as consonant random intervals. The particular cases of type L-R fuzzy Gaussian and discrete random variables are detailed.
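For orientation, the classical crisp special case of such a deterministic counterpart is the individual Gaussian chance constraint: P(a^T x <= b) >= alpha with a ~ N(mu, Sigma) becomes mu^T x + Phi^{-1}(alpha) sqrt(x^T Sigma x) <= b. The minimal Python sketch below checks only this crisp condition; the fuzzy L-R extension with possibility and necessity indices studied in the paper is not reproduced here, and all numbers are illustrative.

    import numpy as np
    from scipy.stats import norm

    def gaussian_chance_constraint_ok(x, mu, Sigma, b, alpha=0.95):
        """Deterministic equivalent of P(a^T x <= b) >= alpha for a ~ N(mu, Sigma)."""
        z = norm.ppf(alpha)                              # quantile Phi^{-1}(alpha)
        return mu @ x + z * np.sqrt(x @ Sigma @ x) <= b

    # Illustrative numbers only: two decision variables, correlated coefficients.
    mu = np.array([1.0, 2.0])
    Sigma = np.array([[0.10, 0.02],
                      [0.02, 0.05]])
    print(gaussian_chance_constraint_ok(np.array([1.0, 1.0]), mu, Sigma, b=4.0))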
Inferring Proteolytic Processes from Mass Spectrometry Time Series Data Using Degradation Graphs
Background: Proteases play an essential part in a variety of biological
processes. Besides their importance under healthy conditions, they are also
known to play a crucial role in complex diseases such as cancer. In recent years,
it has been shown that not only the fragments produced by proteases but also
their dynamics, especially ex vivo, can serve as biomarkers. So far, however, only
a few approaches have been taken to explicitly model the dynamics of proteolysis in
the context of mass spectrometry. Results: We introduce a new concept to model
proteolytic processes, the degradation graph. The degradation graph is an
extension of the cleavage graph, a data structure to reconstruct and visualize
the proteolytic process. In contrast to previous approaches, we extend the
model to incorporate endoproteolytic processes and present a method to
construct a degradation graph from mass spectrometry time series data. Based
on a degradation graph and the intensities extracted from the mass spectra it
is possible to estimate reaction rates of the underlying processes. We further
suggest a score to rate different degradation graphs in their ability to
explain the observed data. This score is used in an iterative heuristic to
improve the structure of the initially constructed degradation graph.
Conclusion: We show that the proposed method is able to recover all degraded
and generated peptides, the underlying reactions, and the reaction rates of
proteolytic processes based on mass spectrometry time series data. We use
simulated and real data to demonstrate that a given process can be
reconstructed even in the presence of extensive noise, isobaric signals and
false identifications. While the model is currently only validated on peptide
data, it is also applicable to proteins, as long as the necessary time series
data can be produced.
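To make the degradation-graph concept concrete, the minimal Python sketch below encodes peptides as nodes and cleavage reactions as rate-labelled edges, and simulates the first-order kinetics whose rates one would fit to the extracted mass-spectrometry intensities. Peptide names and rate constants are invented for illustration and are not taken from the paper.

    import numpy as np

    # Toy degradation graph: nodes are peptides, each edge is a cleavage
    # reaction "parent -> (left fragment, right fragment)" with rate k.
    # Sequences and rates are invented for illustration only.
    edges = [
        ("FIBPEP", ("FIB", "PEP"), 0.30),   # endoproteolytic cut
        ("FIB",    ("FI",  "B"),   0.10),   # further degradation of a fragment
    ]

    peptides = sorted({p for parent, (l, r), _ in edges for p in (parent, l, r)})
    idx = {p: i for i, p in enumerate(peptides)}

    def simulate(initial, t_end=10.0, dt=0.01):
        """First-order mass-action kinetics on the degradation graph (explicit Euler)."""
        c = np.zeros(len(peptides))
        for name, amount in initial.items():
            c[idx[name]] = amount
        for _ in range(int(t_end / dt)):
            dc = np.zeros_like(c)
            for parent, (left, right), k in edges:
                flux = k * c[idx[parent]]    # rate of this cleavage reaction
                dc[idx[parent]] -= flux
                dc[idx[left]] += flux
                dc[idx[right]] += flux
            c += dt * dc
        return dict(zip(peptides, np.round(c, 4)))

    print(simulate({"FIBPEP": 1.0}))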
Scissors resonance in the quasi-continuum of Th, Pa and U isotopes
The gamma-ray strength function in the quasi-continuum has been measured for
231-233Th, 232,233Pa and 237-239U using the Oslo method. All eight nuclei show
a pronounced increase in gamma strength at omega_SR approx 2.4 MeV, which is
interpreted as the low-energy M1 scissors resonance (SR). The total strength is
found to be B_SR = 9-11 mu_N^2 when integrated over the 1 - 4 MeV gamma-energy
region. The SR displays a double-hump structure that is theoretically not
understood. Our results are compared with data from (gamma, gamma') experiments
and theoretical sum-rule estimates for a nuclear rigid-body moment of inertia.
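For context, a resonance-like bump in the gamma-ray strength function such as the SR is often parametrized as a standard Lorentzian on top of the E1 tail (whether this exact form is used here is not stated in the abstract):

    f_{\mathrm{SR}}(E_\gamma) = \frac{1}{3\pi^{2}\hbar^{2}c^{2}}\,
    \frac{\sigma_{\mathrm{SR}}\,\Gamma_{\mathrm{SR}}^{2}\,E_\gamma}
         {\left(E_\gamma^{2}-\omega_{\mathrm{SR}}^{2}\right)^{2}+E_\gamma^{2}\,\Gamma_{\mathrm{SR}}^{2}},

with centroid omega_SR, width Gamma_SR and peak cross section sigma_SR; the quoted B_SR corresponds to the SR strength integrated over the 1-4 MeV window.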
The Restructuring of the Quartier De La Marine in Algiers: An Urban Opportunity
Like many other historic centres that have undergone profound changes, old Algiers has had to accommodate contemporary socio-economic, cultural and technological activities. The Casbah, the old town of Algiers, has been subjected to these regrettable and irreversible changes and is left today with a physically and functionally damaged urban fabric. This thesis is concerned with the coexistence and reconciliation of the old core with the present and future urban fabric, in order to ensure that its links with modern urban development are maintained. It is therefore a study of the old core of Algiers which explores its historical growth and urban structure, underlining the conflict between the old and new parts. It concludes by establishing a general framework for its future development.
Scalable bioinformatics via workflow conversion
Background: Reproducibility is one of the tenets of the scientific method.
Scientific experiments often comprise complex data flows, selection of
adequate parameters, and analysis and visualization of intermediate and end
results. Breaking down the complexity of such experiments into the joint
collaboration of small, repeatable, well-defined tasks, each with well-defined
inputs, parameters, and outputs, offers immediate benefits such as identifying
bottlenecks and pinpointing sections that could benefit from parallelization.
Workflows rest upon the notion of splitting complex work into the
joint effort of several manageable tasks. There are several engines that give
users the ability to design and execute workflows. Each engine was created to
address certain problems of a specific community, therefore each one has its
advantages and shortcomings. Furthermore, not all features of all workflow
engines are royalty-free, an aspect that could potentially drive away members
of the scientific community. Results: We have developed a set of tools that
enables the scientific community to benefit from workflow interoperability. We
developed a platform-free structured representation of parameters, inputs,
and outputs of command-line tools in so-called Common Tool Descriptor documents.
We have also overcome the shortcomings and combined the features of two
royalty-free workflow engines with a substantial user community: the Konstanz
Information Miner, an engine which we see as a formidable workflow editor, and
the Grid and User Support Environment, a web-based framework able to interact
with several high-performance computing resources. We have thus created a free
and highly accessible way to design workflows on a desktop computer and
execute them on high-performance computing resources. Conclusions: Our work
will not only reduce time spent on designing scientific workflows, but also
make executing workflows on remote high-performance computing resources more
accessible to technically inexperienced users. We strongly believe that our
efforts not only decrease the turnaround time to obtain scientific results but
also have a positive impact on reproducibility, thus elevating the quality of
the obtained scientific results.
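To illustrate what a platform-free, structured tool description can look like, here is a minimal Python sketch of a descriptor that holds a tool's parameters, inputs and outputs and renders a command line from them. The field names and the example tool are invented for illustration and do not follow the actual Common Tool Descriptor schema.

    from dataclasses import dataclass, field
    from typing import List

    # Schematic, platform-free description of a command-line tool, in the
    # spirit of the Common Tool Descriptor idea. Field names are illustrative
    # and do not follow the real CTD schema.

    @dataclass
    class Parameter:
        name: str
        kind: str                 # e.g. "input-file", "output-file", "float"
        default: str = ""
        description: str = ""

    @dataclass
    class ToolDescriptor:
        name: str
        executable: str
        parameters: List[Parameter] = field(default_factory=list)

        def to_command(self, values: dict) -> List[str]:
            """Render a concrete command line from parameter values."""
            cmd = [self.executable]
            for p in self.parameters:
                cmd += [f"--{p.name}", str(values.get(p.name, p.default))]
            return cmd

    # Hypothetical peak-picking tool with three parameters.
    tool = ToolDescriptor(
        name="PeakPicker",
        executable="peakpicker",
        parameters=[
            Parameter("in", "input-file"),
            Parameter("out", "output-file"),
            Parameter("signal_to_noise", "float", "1.0"),
        ],
    )
    print(tool.to_command({"in": "spectra.mzML", "out": "picked.mzML"}))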
A one-dimensional lattice model for a quantum mechanical free particle
Two types of particles, A and B, with their corresponding antiparticles, are
defined on a one-dimensional cyclic lattice with an odd number of sites. In
each step of time evolution, each particle acts as a source for the
polarization field of the other type of particle with nonlocal action but with
an effect decreasing with the distance: A --> ... \bar{B} B \bar{B} B \bar{B} ...;
B --> A \bar{A} A \bar{A} A ... . It is shown that the combined distribution
of these particles obeys the time evolution of a free particle as given by
quantum mechanics.
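The abstract does not give enough detail to reproduce the A/B polarization rule itself, but the benchmark it is compared against, free-particle quantum evolution on a cyclic one-dimensional lattice with an odd number of sites, can be sketched in a few lines of Python using a discrete-Laplacian (tight-binding) Hamiltonian; all numbers below are illustrative.

    import numpy as np

    # Benchmark for the lattice model: standard quantum evolution of a free
    # particle on a cyclic 1-D lattice with N (odd) sites, via a
    # discrete-Laplacian (tight-binding) Hamiltonian. This is NOT the A/B
    # polarization rule of the paper, only the reference dynamics it reproduces.
    N = 101                                   # odd number of sites
    H = np.zeros((N, N), dtype=complex)
    for j in range(N):
        H[j, j] = 2.0
        H[j, (j + 1) % N] = -1.0              # periodic (cyclic) hopping
        H[j, (j - 1) % N] = -1.0

    x = np.arange(N)
    psi0 = np.exp(-(x - N // 2) ** 2 / 20.0 + 1j * 0.5 * x)   # Gaussian packet with momentum
    psi0 /= np.linalg.norm(psi0)

    vals, vecs = np.linalg.eigh(H)            # exact diagonalization

    def evolve(psi, t):
        """Apply exp(-i H t) using the eigendecomposition of H."""
        return vecs @ (np.exp(-1j * vals * t) * (vecs.conj().T @ psi))

    prob = np.abs(evolve(psi0, t=30.0)) ** 2
    print(prob.argmax(), round(prob.max(), 4))  # the packet drifts around the ring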
