31 research outputs found

    An absolute measurement of the neutron production rate of a spent nuclear fuel sample used for depletion code validation

    A method to determine the neutron production rate of a spent nuclear fuel segment sample by means of non-destructive assay conducted under standard controlled-area conditions is described and demonstrated. A neutron well counter designed for routine nuclear safeguards applications is applied. The method relies on a transfer procedure that is adapted to the hot cell facilities at the Laboratory for High and Medium level Activity of SCK CEN in Belgium. Experiments with 252Cf(sf) sources, certified for their neutron emission rate, were carried out at the Joint Research Centre to determine the characteristics of the detection device. Measurements of a segment of a spent nuclear fuel rod were carried out at SCK CEN, resulting in an absolute and non-destructive measurement of the neutron production rate that avoids any reference to a representative spent nuclear fuel sample to calibrate the device. Results of these measurements were used to study the performance of the depletion codes ALEPH2, SCALE, and Serpent2. The study includes a code-to-code and code-to-experiment comparison using different nuclear data libraries.
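    The essence of such an absolute measurement is that the counter's efficiency is fixed with certified 252Cf(sf) sources and then transferred to the fuel segment, so no representative spent fuel standard is needed. The sketch below illustrates only that arithmetic; all numerical values, and the omission of geometry, multiplication and dead-time corrections, are illustrative assumptions and not figures or procedures from the paper.

    ```python
    # Hedged sketch: absolute neutron production rate via efficiency transfer.
    # All numbers are placeholders, not data from the study.

    # Step 1: detector efficiency from a certified 252Cf(sf) source.
    cf_emission_rate = 2.41e5     # certified neutron emission rate [n/s] (assumed)
    cf_net_count_rate = 7.8e3     # background-corrected count rate with the Cf source [1/s] (assumed)
    efficiency = cf_net_count_rate / cf_emission_rate   # counts registered per emitted neutron

    # Step 2: transfer the efficiency to the spent fuel segment measurement.
    fuel_gross_rate = 1.9e3       # count rate with the fuel segment in the counter [1/s] (assumed)
    background_rate = 1.2e1       # ambient background count rate [1/s] (assumed)
    fuel_net_rate = fuel_gross_rate - background_rate

    # Absolute neutron production rate of the segment.
    neutron_production_rate = fuel_net_rate / efficiency
    print(f"neutron production rate ~ {neutron_production_rate:.3e} n/s")
    ```

    The role of the certified 252Cf sources is thus only to pin down the characteristics of the detection device, which is why the fuel measurement never has to be compared against a "representative" spent fuel calibration sample.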

    The joint evaluated fission and fusion nuclear data library, JEFF-3.3

    The joint evaluated fission and fusion nuclear data library 3.3 is described. New evaluations for neutron-induced interactions with the major actinides 235U, 238U and 239Pu, on 241Am and 23Na, 59Ni, Cr, Cu, Zr, Cd, Hf, W, Au, Pb and Bi are presented. It includes new fission yields, prompt fission neutron spectra and average numbers of neutrons per fission. In addition, new data for radioactive decay, thermal neutron scattering, gamma-ray emission, neutron activation, delayed neutrons and displacement damage are presented. JEFF-3.3 was complemented by files from the TENDL project; the libraries for photon, proton, deuteron, triton, helion and alpha-particle induced reactions are from TENDL-2017. The demand for uncertainty quantification in modeling led to many new covariance data being included in the evaluations. A comparison between results from model calculations using the JEFF-3.3 library and those from benchmark experiments for criticality, delayed neutron yields, shielding and decay heat reveals that JEFF-3.3 performs very well for a wide range of nuclear technology applications, in particular nuclear energy.

    Guidelines for the use and interpretation of assays for monitoring autophagy (4th edition)

    In 2008, we published the first set of guidelines for standardizing research in autophagy. Since then, this topic has received increasing attention, and many scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Thus, it is important to formulate updated guidelines for monitoring autophagy in different organisms on a regular basis. Despite numerous reviews, there continues to be confusion regarding acceptable methods to evaluate autophagy, especially in multicellular eukaryotes. Here, we present a set of guidelines for investigators to select and interpret methods to examine autophagy and related processes, and for reviewers to provide realistic and reasonable critiques of reports that are focused on these processes. These guidelines are not meant to be a dogmatic set of rules, because the appropriateness of any assay largely depends on the question being asked and the system being used. Moreover, no individual assay is perfect for every situation, calling for the use of multiple techniques to properly monitor autophagy in each experimental setting. Finally, several core components of the autophagy machinery have been implicated in distinct autophagic processes (canonical and noncanonical autophagy), implying that genetic approaches to block autophagy should rely on targeting two or more autophagy-related genes that ideally participate in distinct steps of the pathway. Along similar lines, because multiple proteins involved in autophagy also regulate other cellular pathways including apoptosis, not all of them can be used as specific markers for bona fide autophagic responses. Here, we critically discuss current methods of assessing autophagy and the information they can, or cannot, provide. Our ultimate goal is to encourage intellectual and technical innovation in the field.

    An experimental comparison of some heuristics for cardinality constrained bin packing problem

    Background: Bin packing is an NP-hard optimization problem of packing items of given sizes into a minimum number of capacity-limited bins. Besides the basic problem, numerous other variants of bin packing exist. Cardinality constrained bin packing adds the additional constraint that the number of items in a bin must not exceed a given limit Nmax. Objectives: The goal of the paper is to present a preliminary experimental study which demonstrates adaptations of the new algorithms to the general cardinality constrained bin packing problem. Methods/Approach: Straightforward modifications of First Fit Decreasing (FFD), Refined First Fit (RFF) and the algorithm by Zhang et al. for the bin packing problem are compared to four algorithms specific to the cardinality constrained bin packing problem on random lists of items with 0%, 10%, 30% and 50% of large items. The behaviour of all algorithms when the cardinality constraint Nmax increases is also studied. Results: Results show that all specific algorithms outperform the general algorithms on lists with a low percentage of big items. Conclusions: One of the specific algorithms performs better or equally well even on lists with a high percentage of big items and is therefore of significant interest. Its behaviour when Nmax increases shows that the specific algorithms can also be used to solve the general bin packing problem.
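    As an illustration of the kind of "straightforward modification" the abstract refers to, the sketch below adapts First Fit Decreasing to the cardinality constraint: an item goes into the first open bin that has both enough remaining capacity and fewer than Nmax items. This is a generic textbook-style adaptation, not the authors' implementation; the function and variable names are assumptions of this sketch.

    ```python
    def ffd_cardinality(items, capacity, n_max):
        """First Fit Decreasing with a cardinality constraint.

        Each bin holds items with total size at most `capacity`
        and at most `n_max` items. Returns a list of bins, where
        each bin is a list of the item sizes placed in it.
        """
        bins = []  # each entry: [remaining_capacity, [items...]]
        for item in sorted(items, reverse=True):           # consider largest items first
            for b in bins:
                if item <= b[0] and len(b[1]) < n_max:     # fits by size and by item count
                    b[0] -= item
                    b[1].append(item)
                    break
            else:                                          # no open bin accepts it: open a new one
                bins.append([capacity - item, [item]])
        return [b[1] for b in bins]

    # Example: items in (0, 1], unit-capacity bins, at most 3 items per bin.
    print(ffd_cardinality([0.6, 0.5, 0.4, 0.3, 0.2, 0.2, 0.1], capacity=1.0, n_max=3))
    ```

    Setting n_max to the number of items recovers plain FFD, which is why such cardinality-specific heuristics can also be run on the general bin packing problem.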

    On the Nyquist frequency of random sampled signals

    In modern industry, the wide use of reliable and sophisticated sensors and their connection to the internet have introduced the phenomenon of Big Data, especially in the field of condition monitoring systems (CMSs) in e-maintenance applications. In particular, in the case of vibration signals, high-performance acquisition systems are required, characterized by anti-aliasing filtering and a high uniform sampling rate, in order to properly digitize the meaningful frequency content of the signals. In this context, the capability of non-uniform random sampling (RS) is assessed in this work. While in other fields, such as astronomy and structural and biomedical studies, RS is a problem to be resolved, due to the unavailability of samples at specific instants (the missing data problem), in the field of fault detection and diagnosis (FDD) RS can be the chosen sampling method thanks to its advantages: an anti-aliasing property and a low average sampling rate. Therefore, this paper focuses on studying the anti-aliasing property of randomly sampled data, verifying the criterion proposed in the literature for establishing the Nyquist frequency, and analyzing its sensitivity to the sampling parameters. This study is carried out using simulated signals and computing the spectral window, giving the Nyquist frequency for different random sampling parameters; moreover, a spectral analysis method, the Schuster periodogram, is used to verify when the spectrum is actually free of aliases. The results show that the Nyquist frequency depends on the numerical accuracy of the randomly generated time instants.
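    The Schuster periodogram mentioned above evaluates the spectrum directly at the nonuniform sample instants, P(f) = |Σ_n x_n exp(-2πi f t_n)|² / N, so it applies to randomly sampled data without resampling. A minimal sketch follows; the signal, sampling-rate figures and frequency grid are illustrative assumptions, not the parameters used in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Randomly sampled sinusoid: ~100 samples/s on average over 10 s (assumed values).
    duration, avg_rate = 10.0, 100.0
    t = np.sort(rng.uniform(0.0, duration, size=int(duration * avg_rate)))  # random sampling instants
    f0 = 180.0                                   # signal frequency above avg_rate / 2
    x = np.sin(2 * np.pi * f0 * t)

    # Schuster periodogram: P(f) = |sum_n x_n * exp(-2j*pi*f*t_n)|^2 / N
    freqs = np.linspace(0.0, 400.0, 4001)
    dft = np.exp(-2j * np.pi * freqs[:, None] * t[None, :]) @ x
    periodogram = np.abs(dft) ** 2 / len(t)

    print("peak found at %.1f Hz" % freqs[np.argmax(periodogram)])  # ~180 Hz despite the low average rate
    ```

    The spectral window the abstract refers to is the same quantity computed with all x_n set to 1, i.e. the periodogram of the sampling pattern itself, which reveals how strongly the random instants suppress alias peaks.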