Phase editing as a signal pre-processing step for automated bearing fault detection
Scheduled maintenance and inspection of bearing elements in industrial machinery contribute significantly to operating costs. Savings can be made through automatic vibration-based damage detection and prognostics, permitting condition-based maintenance. However, automation of the detection process is difficult due to the complexity of vibration signals in realistic operating environments. The sensitivity of existing methods to the choice of parameters imposes a requirement for oversight from a skilled operator. This paper presents a novel approach to the removal of unwanted vibrational components from the signal: phase editing. The approach uses a computationally efficient full-band demodulation and requires very little oversight. Its effectiveness is tested on experimental data sets from three different test rigs, and comparisons are made with two state-of-the-art processing techniques: spectral kurtosis and cepstral pre-whitening. The results from the phase editing technique show a 10% improvement in damage detection rates compared to the state of the art, while simultaneously improving the degree of automation. This outcome represents a significant contribution in the pursuit of fully automatic fault detection.
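The abstract does not give the phase-editing algorithm itself, but one of the two comparison techniques it names, cepstral pre-whitening, can be sketched in a few lines: the magnitude spectrum is flattened while the phase is kept, which suppresses dominant deterministic tones and leaves impulsive fault signatures intact. A minimal sketch (signal parameters are illustrative, not from the paper):

```python
import numpy as np

def cepstral_prewhitening(x):
    """Flatten the magnitude spectrum while keeping the phase,
    suppressing strong tonal components relative to impulses."""
    X = np.fft.fft(x)
    eps = 1e-12  # avoid division by zero in near-empty bins
    return np.real(np.fft.ifft(X / (np.abs(X) + eps)))

# toy usage: a strong sinusoid masking a weak periodic impulse train
fs = 1000
t = np.arange(fs) / fs
impulses = np.zeros_like(t)
impulses[::100] = 1.0                          # "fault" impacts every 0.1 s
signal = 10 * np.sin(2 * np.pi * 50 * t) + impulses
whitened = cepstral_prewhitening(signal)
```

After whitening, every spectral bin has magnitude at most one, so the 50 Hz tone no longer dominates the impulsive content.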
A time-varying SIRD model for dynamic vaccination strategies against COVID-19
The COVID-19 pandemic has demonstrated that the optimal allocation of the limited vaccine doses available is one of the most effective measures for mitigating transmission of the infection and reducing the associated mortality, especially at an early stage of a pandemic. The use of a compartmental model allows us to understand which population groups to vaccinate, and to what extent, depending on the health or social objective to be achieved.
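The compartmental dynamics referred to above can be illustrated with a minimal time-varying SIRD integrator; the parameter values and the intervention schedule below are illustrative, not taken from the paper:

```python
def simulate_sird(beta, gamma, mu, s0, i0, days, dt=0.1):
    """Forward-Euler integration of the SIRD equations with a
    time-varying transmission rate beta(t)."""
    s, i, r, d = float(s0), float(i0), 0.0, 0.0
    n = s0 + i0
    for k in range(int(days / dt)):
        b = beta(k * dt)
        new_inf = b * s * i / n            # dS/dt = -beta(t) S I / N
        ds = -new_inf
        di = new_inf - gamma * i - mu * i  # infections minus removals
        dr = gamma * i                     # recoveries
        dd = mu * i                        # deaths
        s, i, r, d = s + dt * ds, i + dt * di, r + dt * dr, d + dt * dd
    return s, i, r, d

# baseline: constant transmission; intervention: beta halved from day 10
base = simulate_sird(lambda t: 0.3, 0.1, 0.01, 990, 10, 180)
mitigated = simulate_sird(lambda t: 0.3 if t < 10 else 0.15,
                          0.1, 0.01, 990, 10, 180)
```

The earlier and stronger the reduction in beta(t), the lower the final death count, which is the kind of trade-off a dose-allocation strategy explores.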
Experimental study of picosecond laser plasma formation in thin foils
A high-performance, fully controlled picosecond laser system has been designed and built with the aid of a numerical code capable of simulating the temporal behavior of the laser system, including each active and passive component. The laser performance was characterized with an optical streak camera, equivalent-plane monitor, and calorimeter measurements. The laser pulse was focused on 150-nm-thick foils to investigate plasma formation and the related transmittivity of the laser light. The experimental data are in very good agreement with the predictions of a simple 2D analytical model that takes into account the actual shot-to-shot features of the laser pulse. The temporal profile of the pulse and the intensity distribution in the focal spot were found to play a key role in determining the transmission properties of the laser-irradiated foil. This work may be relevant to a wide class of laser-exploded-foil plasma experiments.
A comparative analysis of predictive models of morbidity in intensive care unit after cardiac surgery – Part II: an illustrative example
Background: Popular predictive models for estimating morbidity probability after heart surgery are compared critically in a unitary framework. The study is divided into two parts. In the first part, modelling techniques and the intrinsic strengths and weaknesses of different approaches were discussed from a theoretical point of view. In this second part the performances of the same models are evaluated in an illustrative example.
Methods: Eight models were developed: Bayes linear and quadratic models, k-nearest neighbour model, logistic regression model, Higgins and direct scoring systems, and two feed-forward artificial neural networks with one and two layers. Cardiovascular, respiratory, neurological, renal, infectious and hemorrhagic complications were defined as morbidity. Training and testing sets of 545 cases each were used. The optimal set of predictors was chosen, among a collection of 78 preoperative, intraoperative and postoperative variables, by a stepwise procedure. Discrimination and calibration were evaluated by the area under the receiver operating characteristic curve and the Hosmer-Lemeshow goodness-of-fit test, respectively.
Results: Scoring systems and the logistic regression model required the largest set of predictors, while Bayesian and k-nearest neighbour models were much more parsimonious. On testing data, all models showed acceptable discrimination capacity; however, the Bayes quadratic model, using only three predictors, provided the best performance. All models showed satisfactory generalization ability: again the Bayes quadratic model exhibited the best generalization, while artificial neural networks and scoring systems gave the worst results. Finally, poor calibration was obtained when using scoring systems, the k-nearest neighbour model and artificial neural networks, while the Bayes models (after recalibration) and the logistic regression model gave adequate results.
Conclusion: Although all the predictive models showed acceptable discrimination performance in the example considered, the Bayes and logistic regression models seemed better than the others, because they also had good generalization and calibration. The Bayes quadratic model seemed to be a convincing alternative to the much more usual Bayes linear and logistic regression models. It showed its capacity to identify a minimum core of predictors generally recognized as essential for pragmatically evaluating the risk of developing morbidity after heart surgery.
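The discrimination measure used above, the area under the ROC curve, can be computed directly from the rank-sum identity without tracing the curve: it equals the probability that a randomly chosen positive case scores higher than a randomly chosen negative one. A minimal sketch with made-up scores (the data are illustrative, not from the study):

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney identity:
    fraction of positive/negative pairs ranked correctly, ties half."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels)
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

# toy morbidity scores: higher score should mean higher risk
labels = np.array([0, 0, 0, 1, 1, 0, 1, 0])
scores = np.array([0.1, 0.3, 0.2, 0.8, 0.7, 0.4, 0.9, 0.35])
# here every positive outranks every negative, so AUC = 1.0
```

An AUC of 0.5 corresponds to chance-level discrimination, which is why the text treats values comfortably above it as "acceptable".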
A bootstrap approach for assessing the uncertainty of outcome probabilities when using a scoring system
Background: Scoring systems are a very attractive family of clinical predictive models, because the patient score can be calculated without using any data processing system. Their weakness lies in the difficulty of associating a reliable prognostic probability with each score. In this study a bootstrap approach for estimating confidence intervals of outcome probabilities is described and applied to design and optimize the performance of a scoring system for morbidity in intensive care units after heart surgery.
Methods: The bias-corrected and accelerated bootstrap method was used to estimate the 95% confidence intervals of outcome probabilities associated with a scoring system. These confidence intervals were calculated for each score and each step of the scoring-system design by means of one thousand bootstrapped samples. 1090 consecutive adult patients who underwent coronary artery bypass graft were assigned at random to two groups of equal size, so as to define random training and testing sets with equal percentage morbidities. A collection of 78 preoperative, intraoperative and postoperative variables were considered as likely morbidity predictors.
Results: Several competing scoring systems were compared on the basis of discrimination, generalization and uncertainty associated with the prognostic probabilities. The results showed that confidence intervals corresponding to different scores often overlapped, making it convenient to unite and thus reduce the score classes. After uniting two adjacent classes, a model with six score groups not only gave a satisfactory trade-off between discrimination and generalization, but also enabled patients to be allocated to classes, most of which were characterized by well separated confidence intervals of prognostic probabilities.
Conclusions: Scoring systems are often designed solely on the basis of discrimination and generalization characteristics, to the detriment of a trustworthy outcome-probability prediction. The present example demonstrates that using a bootstrap method to estimate outcome-probability confidence intervals provides useful additional information about score-class statistics, guiding physicians towards the most convenient model for predicting morbidity outcomes in their clinical context.
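The resampling core of the approach can be sketched as follows. For brevity this uses the plain percentile bootstrap, whereas the paper applies the bias-corrected and accelerated (BCa) variant, which adjusts the chosen percentiles but resamples in the same way; the counts below are illustrative:

```python
import numpy as np

def bootstrap_ci(successes, n, n_boot=1000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for the outcome
    probability observed as `successes` events out of `n` patients
    in one score class. (BCa would shift these percentiles to
    correct for bias and skewness.)"""
    rng = np.random.default_rng(seed)
    outcomes = np.zeros(n)
    outcomes[:successes] = 1.0
    stats = np.array([
        rng.choice(outcomes, size=n, replace=True).mean()
        for _ in range(n_boot)
    ])
    return np.quantile(stats, alpha / 2), np.quantile(stats, 1 - alpha / 2)

# e.g. one score class with 30 morbidity events among 120 patients
lo, hi = bootstrap_ci(successes=30, n=120)
```

If the intervals of two adjacent score classes overlap in this way, the classes carry little distinct prognostic information, which is the criterion used above for merging them.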
The PLASMONX Project for advanced beam physics experiments
The PLASMONX project is progressing well through its design phase and has also entered its second phase of procurement of the main components. The project foresees the installation at LNF of a Ti:Sa laser system (peak power > 170 TW), synchronized with the high-brightness electron beam produced by the SPARC photo-injector. The status of the procurement of this laser system is reported, as well as the construction plans for a new building at LNF to host a dedicated laboratory for high-intensity photon-beam experiments (High Intensity Laser Laboratory). Several experiments are foreseen at this complex facility, mainly in the field of high-gradient plasma acceleration and in that of monochromatic ultra-fast X-ray pulse generation via Thomson back-scattering. Detailed numerical simulations have been carried out to study the generation of tightly focused electron bunches to collide with laser pulses in the Thomson source; results on the emitted X-ray spectra are presented.
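For reference, the on-axis photon energy produced by Thomson back-scattering follows the standard relation below (a textbook result, not quoted from the abstract), with $\gamma$ the electron Lorentz factor, $\omega_L$ the laser frequency, $a_0$ the normalized laser amplitude, and $\theta$ the observation angle:

```latex
% Head-on Thomson back-scattering, linear-to-moderate a_0 regime:
\[
  \hbar\omega_X \;\simeq\;
  \frac{4\gamma^2\,\hbar\omega_L}{1 + a_0^2/2 + \gamma^2\theta^2}
\]
% e.g. gamma ~ 60 (about 30 MeV electrons) and a Ti:Sa photon of
% 1.55 eV give on-axis X-rays of roughly 4 * 60^2 * 1.55 eV ~ 22 keV.
```

The quadratic dependence on $\gamma$ is what makes a moderate-energy photo-injector beam sufficient for keV-range monochromatic X-ray production.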
The influence of the dechanneling process on photon emission by an ultra-relativistic positron channeling in a periodically bent crystal
We investigate, both analytically and numerically, the influence of the dechanneling process on the parameters of undulator radiation generated by ultra-relativistic positrons channelling along a periodically bent crystal plane. The bending might be due either to the propagation of a transverse acoustic wave through the crystal, or to the static strain that occurs in superlattices. In either case the periodically bent crystal serves as an undulator which allows the generation of X-ray and gamma radiation. We propose a scheme for the accurate quantitative treatment of the radiation in the presence of dechanneling. The scheme includes (i) an analytic expression for the spectral-angular distribution which contains the dechanneling length as a parameter, and (ii) a simulation procedure for the dechanneling process of a positron in periodically bent crystals. Using these, we calculate the dechanneling lengths of 5 GeV positrons channelling in Si, Ge and W crystals, and the spectral-angular and spectral distributions of the undulator radiation over broad photon ranges. The calculations are performed for various parameters of the channel bending. Published in J. Phys. G: Nucl. Part. Phys. 27 (2001) 95-125.
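The undulator relation underlying such spectra is the standard on-axis one, with the crystal bending period $\lambda_u$ playing the role of the undulator period and $a$ the bending amplitude (a textbook expression, not quoted from the abstract):

```latex
% On-axis wavelength of the n-th undulator harmonic:
\[
  \lambda_n \;=\; \frac{\lambda_u}{2 n \gamma^2}
  \left(1 + \frac{K^2}{2}\right),
  \qquad
  K \;=\; \frac{2\pi \gamma a}{\lambda_u},
\]
% gamma: positron Lorentz factor; K: undulator parameter.
```

Dechanneling limits the length over which a positron contributes coherently to this emission, which is why the dechanneling length enters the spectral-angular distribution as a parameter.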
A comparative analysis of predictive models of morbidity in intensive care unit after cardiac surgery – Part I: model planning
Background: Different methods have recently been proposed for predicting morbidity in intensive care units (ICU). The aim of the present study was to critically review a number of approaches for developing models capable of estimating the probability of morbidity in the ICU after heart surgery. The study is divided into two parts. In this first part, popular models used to estimate the probability of class membership are grouped into distinct categories according to their underlying mathematical principles. Modelling techniques and the intrinsic strengths and weaknesses of each model are analysed and discussed from a theoretical point of view, in consideration of clinical applications.
Methods: Models based on Bayes rule, the k-nearest neighbour algorithm, logistic regression, scoring systems and artificial neural networks are investigated. Key issues for model design are described. The mathematical treatment of some aspects of model structure is also included for readers interested in developing models, though a full understanding of the mathematical relationships is not necessary if the reader is only interested in the practical meaning of model assumptions, weaknesses and strengths from a user's point of view.
Results: Scoring systems are very attractive due to their simplicity of use, although this may undermine their predictive capacity. Logistic regression models are trustworthy tools, although they suffer from the principal limitations of most regression procedures. Bayesian models seem to be a good compromise between complexity and predictive performance, but model recalibration is generally necessary. k-nearest neighbour may be a valid non-parametric technique, though computational cost and the need for large data storage are major weaknesses of this approach. Artificial neural networks have intrinsic advantages with respect to common statistical models, though the training process may be problematic.
Conclusion: Knowledge of model assumptions and of the theoretical strengths and weaknesses of the different approaches is fundamental for designing models to estimate the probability of morbidity after heart surgery. However, a rational choice also requires evaluation and comparison of the actual performance of locally developed competing models in the clinical scenario, to obtain satisfactory agreement between local needs and model response. In the second part of this study the above predictive models will therefore be tested on real data acquired in a specialized ICU.
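Of the model families listed, the Bayes model admits a particularly compact sketch: one Gaussian per class with its own mean and covariance (the quadratic variant), combined through Bayes' rule. The data below are synthetic and purely illustrative:

```python
import numpy as np

def fit_gaussian_bayes(X, y):
    """Quadratic Bayes classifier: each class gets its own Gaussian
    (prior, mean, covariance); unequal covariances give quadratic
    decision boundaries."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (
            len(Xc) / len(X),                                 # prior P(c)
            Xc.mean(axis=0),                                  # class mean
            np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1]),  # reg. cov
        )
    return params

def predict_proba(params, x):
    """Posterior P(c | x) by Bayes rule over the Gaussian densities."""
    dens = {}
    for c, (prior, mu, cov) in params.items():
        d = x - mu
        log_lik = -0.5 * (d @ np.linalg.inv(cov) @ d
                          + np.linalg.slogdet(cov)[1]
                          + len(x) * np.log(2 * np.pi))
        dens[c] = prior * np.exp(log_lik)
    z = sum(dens.values())
    return {c: v / z for c, v in dens.items()}

# toy example: two well-separated "risk groups" in two features
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
model = fit_gaussian_bayes(X, y)
post = predict_proba(model, np.array([4.0, 4.0]))
```

Recalibration, mentioned above as generally necessary for Bayesian models, would map these posteriors onto observed event rates without changing the class ranking.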
Alpine ethnobotany in Italy: traditional knowledge of gastronomic and medicinal plants among the Occitans of the upper Varaita valley, Piedmont
A gastronomic and medical ethnobotanical study was conducted among the Occitan communities living in Blins/Bellino and Chianale, in the upper Val Varaita in the Piedmontese Alps, North-Western Italy, and the traditional uses of 88 botanical taxa were recorded. Comparison with other ethnobotanical studies previously carried out in other Piedmontese and surrounding areas shows that approximately one fourth of the botanical taxa quoted in this survey are also known in other surrounding Occitan valleys. It is also evident that traditional knowledge in the Varaita valley has been heavily eroded. The study also examined the local legal framework for the gathering of botanical taxa and the potential utilization of the most frequently quoted medicinal and wild food herbs in the local market. It suggests that the continuing widespread local collection from the wild of the aerial parts of Alpine wormwood (Artemisia genipi, A. glacialis, and A. umbelliformis) for preparing liqueurs should be seriously reconsidered in terms of sustainability, given the limited availability of these species, even though their collection is culturally salient in the entire study area.