    Videolaryngoscopy improves intubation condition in morbidly obese patients

    Background and objective: Tracheal intubation may be more difficult in morbidly obese patients (body mass index > 35 kg/m²) than in the non-obese. Recently, new video-assisted intubation devices have been developed. After some experience with videolaryngoscopy, we hypothesized that it could improve the laryngoscopic view in this specific population and therefore facilitate intubation. The aim of this study was to assess the benefit of a videolaryngoscope on the grade of laryngoscopy in morbid obesity. Methods: We studied 80 morbidly obese patients undergoing bariatric surgery, randomly assigned to one of two groups: one group was intubated with the help of the videolaryngoscope, while in the control group the screen of the videolaryngoscope was hidden from the intubating anaesthesiologist. The primary end-point was the Cormack and Lehane grade of laryngoscopy under direct and indirect vision in both groups. The duration of intubation, the number of attempts needed and the minimal SpO2 reached during the intubation process were also measured. Results: The grade of laryngoscopy was significantly lower with the videolaryngoscope than with direct vision (P < 0.001). When the grade of laryngoscopy was higher than 1 under direct laryngoscopy (n = 30), it was lower with the videolaryngoscope in 28 cases (93%) and remained the same in only two cases (P < 0.001). The minimal SpO2 reached during intubation was higher with the videolaryngoscope, but the difference did not reach statistical significance. Conclusions: In morbidly obese patients, the use of the videolaryngoscope significantly improves the visualization of the larynx and thereby facilitates intubation. Systematic use of this device could therefore reduce the incidence of difficult intubation and its consequences in this patient population.

    Global Sensitivity Analysis of Stochastic Computer Models with joint metamodels

    Global sensitivity analysis, used to quantify the influence of uncertain input variables on the response variability of a numerical model, is applicable to deterministic computer codes (for which the same set of input variables always gives the same output value). This paper proposes a global sensitivity analysis methodology for stochastic computer codes (whose output variability is induced by uncontrollable variables). The framework of joint modeling of the mean and dispersion of heteroscedastic data is used. To deal with the complexity of computer-experiment outputs, nonparametric joint models (based on Generalized Additive Models and Gaussian processes) are discussed. The relevance of these new models is analyzed in terms of the resulting variance-based sensitivity indices on two case studies. Results show that the joint modeling approach leads to accurate sensitivity index estimates even in the presence of clear heteroscedasticity.
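
    As a rough illustration of the idea, the sketch below fits a joint metamodel to a toy heteroscedastic simulator in Python: one Gaussian-process regressor for the conditional mean and one for the log-dispersion of the residuals, followed by a pick-freeze Monte Carlo estimate of a first-order Sobol' index on the mean component. The simulator, the GP-only joint model (the paper also discusses Generalized Additive Models) and all numerical settings are assumptions made for this sketch.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(0)

        def simulator(x):
            # toy stochastic code: the noise level depends on x[1] (heteroscedastic)
            return np.sin(np.pi * x[0]) + (0.1 + 0.5 * x[1] ** 2) * rng.standard_normal()

        # design of experiments on [-1, 1]^2
        X = rng.uniform(-1.0, 1.0, size=(300, 2))
        y = np.array([simulator(x) for x in X])

        # joint model: one GP for the mean (the WhiteKernel absorbs the noise),
        # one GP for the log-dispersion of the residuals
        gp_mean = GaussianProcessRegressor(kernel=RBF(0.5) + WhiteKernel(), normalize_y=True).fit(X, y)
        log_res2 = np.log((y - gp_mean.predict(X)) ** 2 + 1e-12)
        gp_disp = GaussianProcessRegressor(kernel=RBF(0.5) + WhiteKernel(), normalize_y=True).fit(X, log_res2)

        # first-order Sobol' index of x0 on the mean component (pick-freeze)
        n = 20000
        A = rng.uniform(-1.0, 1.0, size=(n, 2))
        B = rng.uniform(-1.0, 1.0, size=(n, 2))
        B[:, 0] = A[:, 0]                  # freeze x0, resample the other input
        yA, yB = gp_mean.predict(A), gp_mean.predict(B)
        print("S_x0 on the mean component:", np.cov(yA, yB)[0, 1] / yA.var())

    The same pick-freeze estimate applied to gp_disp gives indices on the dispersion component, which is precisely what the joint model adds over a mean-only metamodel.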

    Sensitivity analysis with dependence and variance-based measures for spatio-temporal numerical simulators

    In the event of a radioactive release into the environment, modeling the atmospheric dispersion of radionuclides is particularly useful for emergency response procedures and risk assessment. To this end, the CEA has developed a numerical simulator, called Ceres-Mithra (CM), to predict spatial maps of radionuclide concentrations at different instants. This computer code depends on many uncertain scalar and temporal parameters describing the radionuclide, the release and the weather characteristics. The purpose is to detect the input parameters whose uncertainties strongly affect the predicted concentrations and to quantify their influence. To this end, we present various measures for the sensitivity analysis of a spatial model. Some of them lead to as many analyses as there are spatial locations (site sensitivity indices), while others consider a single analysis with respect to the whole spatial domain (block sensitivity indices). For both categories, variance-based and dependence measures are considered, following recent literature. All of these sensitivity measures are applied to the CM computer code and compared to each other, showing the complementarity of block and site sensitivity analyses. Finally, a sensitivity analysis summarizing the input uncertainty contributions over the entire spatio-temporal domain is proposed.
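
    A minimal numerical sketch of the site/block distinction (using a toy map simulator, not the Ceres-Mithra code): site indices are computed per grid cell by pick-freeze Monte Carlo, and a block index aggregates them over the whole domain as the ratio of summed covariances to summed variances, one common definition from the literature on aggregated Sobol' indices. Everything here is an illustrative assumption.

        import numpy as np

        rng = np.random.default_rng(1)
        grid = np.linspace(0.0, 1.0, 20)
        GX, GY = np.meshgrid(grid, grid)

        def simulator(x):
            # toy spatial simulator: a 20x20 concentration-like map from 2 inputs
            plume = x[0] * np.exp(-5.0 * ((GX - 0.3) ** 2 + (GY - 0.3) ** 2))
            return plume + x[1] * GX

        n = 2000
        A = rng.uniform(0.0, 1.0, size=(n, 2))
        B = rng.uniform(0.0, 1.0, size=(n, 2))
        YA = np.array([simulator(x) for x in A])     # shape (n, 20, 20)

        def first_order_indices(i):
            Bi = B.copy()
            Bi[:, i] = A[:, i]                       # pick-freeze input i
            YB = np.array([simulator(x) for x in Bi])
            cov = ((YA - YA.mean(0)) * (YB - YB.mean(0))).mean(0)
            var = YA.var(0)
            site = cov / var                         # one index per location
            block = cov.sum() / var.sum()            # one index for the domain
            return site, block

        site_map, block_index = first_order_indices(0)
        print("block index of x0:", block_index)

    The same site/block split applies to dependence measures by replacing the pick-freeze covariance with, for instance, an HSIC estimate per cell.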

    New improvements in the use of dependence measures for sensitivity analysis and screening

    Physical phenomena are commonly modeled by numerical simulators. Such codes can take a large number of uncertain parameters as input, and it is important to identify their influence via a global sensitivity analysis (GSA). However, these codes can be time-consuming, which rules out a GSA based on the classical Sobol' indices, as these require too many simulations; this is especially true when the number of inputs is large. To address this limitation, we consider recent advances in dependence measures, focusing on the distance correlation and the Hilbert-Schmidt independence criterion (HSIC). Our objective is to study these indices and use them for screening. Numerical tests reveal some differences between dependence measures and classical Sobol' indices, and preliminary answers to "Which sensitivity indices for which situation?" are derived. Two approaches are then proposed for using dependence measures for screening. The first directly uses these indices with independence tests; asymptotic tests and their spectral extensions exist and are detailed. For higher accuracy with small samples, we propose a non-asymptotic version based on bootstrap sampling. The second approach is based on a linear model associating two simulations, which explains their output difference as a weighted sum of their input differences. From this, a bootstrap method is proposed for selecting the influential inputs. We also propose a heuristic approach for calibrating the HSIC Lasso method. Numerical experiments show the potential of these approaches for screening when many inputs are not influential.
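
    The sketch below implements the HSIC with Gaussian kernels (bandwidths from the median heuristic) and a non-asymptotic independence test; a permutation null is used here as a close stand-in for the bootstrap version proposed in the paper. The toy screening problem at the end is an assumption made for the example.

        import numpy as np

        def _gram(v, sigma):
            d2 = (v[:, None] - v[None, :]) ** 2
            return np.exp(-d2 / (2.0 * sigma ** 2))

        def hsic(x, y):
            # biased V-statistic HSIC estimator with Gaussian kernels
            n = len(x)
            sx = np.median(np.abs(x[:, None] - x[None, :])) + 1e-12
            sy = np.median(np.abs(y[:, None] - y[None, :])) + 1e-12
            H = np.eye(n) - np.ones((n, n)) / n
            return np.trace(_gram(x, sx) @ H @ _gram(y, sy) @ H) / (n - 1) ** 2

        def independence_pvalue(x, y, n_perm=200, seed=2):
            # non-asymptotic test: compare observed HSIC to a permutation null
            rng = np.random.default_rng(seed)
            obs = hsic(x, y)
            null = np.array([hsic(x, rng.permutation(y)) for _ in range(n_perm)])
            return (np.sum(null >= obs) + 1.0) / (n_perm + 1.0)

        # screening example: keep only inputs whose p-value is below 5%
        rng = np.random.default_rng(3)
        X = rng.uniform(-1.0, 1.0, size=(200, 3))
        Y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(200)  # x1, x2 inert
        kept = [i for i in range(3) if independence_pvalue(X[:, i], Y) < 0.05]
        print("screened influential inputs:", kept)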

    Uncertainty quantification for functional dependent random variables

    This paper proposes a new methodology to quantify the uncertainties associated with multiple dependent functional random variables that are linked to a quantity of interest, called the covariate. The proposed methodology consists of two main steps. First, the functional random variables are decomposed on a functional basis, computed by the proposed Simultaneous Partial Least Squares algorithm, which decomposes all the functional variables simultaneously. Second, the joint probability density function of the decomposition coefficients associated with the functional variables is modelled by a Gaussian mixture model. A new method for estimating the parameters of the Gaussian mixture model, based on a Lasso penalization algorithm, is proposed. This algorithm yields sparse covariance matrices, reducing the number of model parameters to be estimated. Several criteria are proposed to assess the efficiency of the methodology. Finally, its performance is demonstrated on an analytical example and on a nuclear reliability test case.
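
    The following sketch walks through the two steps on synthetic data, with two substitutions clearly flagged: ordinary PCA on the stacked curves stands in for the paper's Simultaneous PLS, and a standard EM-fitted Gaussian mixture stands in for the Lasso-penalized (sparse-covariance) estimation.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(4)

        # two dependent functional variables sampled on a common 100-point grid
        t = np.linspace(0.0, 1.0, 100)
        n = 500
        a = rng.normal(1.0, 0.3, n)                 # shared latent factor
        f1 = a[:, None] * np.sin(2 * np.pi * t)
        f2 = (a ** 2)[:, None] * np.cos(2 * np.pi * t) + 0.05 * rng.standard_normal((n, 100))

        # step 1: joint decomposition of both variables on a common basis
        # (PCA on the stacked curves, standing in for Simultaneous PLS)
        coeffs = PCA(n_components=5).fit_transform(np.hstack([f1, f2]))

        # step 2: Gaussian mixture model of the joint coefficient density
        # (the paper additionally sparsifies the covariances via a Lasso penalty)
        gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(coeffs)
        new_coeffs, _ = gmm.sample(10)   # fresh coefficient draws for propagation
        print("log-likelihood per sample:", gmm.score(coeffs))

    New curves for uncertainty propagation are then recovered by mapping the sampled coefficients back through the basis.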

    Global sensitivity analysis for models with spatially dependent outputs

    The global sensitivity analysis of a complex numerical model often calls for the estimation of variance-based importance measures, named Sobol' indices. Metamodel-based techniques have been developed to replace the CPU-time-expensive computer code with an inexpensive mathematical function that predicts the computer code output. Common metamodel-based sensitivity analysis methods are well suited to computer codes with scalar outputs. However, in the environmental domain, as in many areas of application, numerical model outputs are often spatial maps, which may also vary with time. In this paper, we introduce a method to obtain a spatial map of Sobol' indices with a minimal number of numerical model computations. It is based on the functional decomposition of the spatial output onto a wavelet basis and Gaussian-process metamodeling of the wavelet coefficients. An analytical example is presented to clarify the various steps of the methodology. The technique is then applied to a real hydrogeological case: for each model input variable, a spatial map of Sobol' indices is obtained.
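
    The sketch below reproduces the pipeline on a toy 16x16 map with PyWavelets and scikit-learn: each output map is decomposed on a Haar wavelet basis, the most variable coefficients each get a Gaussian-process metamodel, and predicted coefficient vectors are inverted back to maps. The simulator, the Haar basis and the 20-coefficient truncation are assumptions made for the illustration.

        import numpy as np
        import pywt
        from sklearn.gaussian_process import GaussianProcessRegressor

        rng = np.random.default_rng(5)
        grid = np.linspace(0.0, 1.0, 16)
        GX, GY = np.meshgrid(grid, grid)

        def simulator(x):
            # toy model producing a 16x16 spatial map from two scalar inputs
            return x[0] * np.exp(-4.0 * ((GX - x[1]) ** 2 + GY ** 2))

        # small design of experiments and the corresponding output maps
        X = rng.uniform(0.0, 1.0, size=(80, 2))
        maps = [simulator(x) for x in X]

        # decompose every output map on a wavelet basis
        decomps = [pywt.coeffs_to_array(pywt.wavedec2(m, "haar")) for m in maps]
        arr0, slices = decomps[0]
        C = np.array([d[0].ravel() for d in decomps])   # (80, 256) coefficients

        # one Gaussian-process metamodel per leading wavelet coefficient
        keep = np.argsort(C.var(axis=0))[::-1][:20]
        gps = {j: GaussianProcessRegressor(normalize_y=True).fit(X, C[:, j]) for j in keep}

        def predict_map(x):
            # predict the retained coefficients, zero the rest, invert the transform
            c = np.zeros(C.shape[1])
            for j, gp in gps.items():
                c[j] = gp.predict(x.reshape(1, -1))[0]
            coeffs = pywt.array_to_coeffs(c.reshape(arr0.shape), slices, output_format="wavedec2")
            return pywt.waverec2(coeffs, "haar")

    A spatial map of Sobol' indices then follows by running a pick-freeze estimator on predict_map, cell by cell, at metamodel cost only.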

    OpenTURNS: An industrial software for uncertainty quantification in simulation

    The need to assess robust performance for complex systems and to comply with tighter regulatory processes (security, safety, environmental control, health impacts, etc.) has led to the emergence of a new industrial simulation challenge: taking uncertainties into account when dealing with complex numerical simulation frameworks. A generic methodology has therefore emerged from the joint effort of several industrial companies and academic institutions. EDF R&D, Airbus Group and Phimeca Engineering started a collaboration at the beginning of 2005, joined by IMACS in 2014, to develop an open-source software platform dedicated to uncertainty propagation by probabilistic methods, named OpenTURNS for Open source Treatment of Uncertainty, Risk 'N Statistics. OpenTURNS addresses the specific industrial challenges attached to uncertainties: transparency, genericity, modularity and multi-accessibility. This paper focuses on OpenTURNS and presents its main features: OpenTURNS is open-source software under the LGPL license, presented as a C++ library with a Python textual user interface (TUI), and runs under Linux and Windows. All the methodological tools are described in the different sections of this paper: uncertainty quantification, uncertainty propagation, sensitivity analysis and metamodeling. A section also explains the generic wrapper mechanism for linking OpenTURNS to any external code. The paper illustrates the methodological tools as much as possible on an educational example that simulates the height of a river and compares it to the height of a dyke protecting industrial facilities. Finally, it gives an overview of the main developments planned for the next few years.
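
    As a taste of the Python TUI, here is a hedged sketch of a river-height study in the spirit of the educational example: the height formula, the distribution choices and every numerical value are assumptions made for this sketch, not the paper's exact settings.

        import openturns as ot

        # simplified river model: water height H from flow rate Q, Strickler
        # coefficient Ks, downstream level Zv and upstream level Zm
        def height(x):
            Q, Ks, Zv, Zm = x
            return [(Q / (Ks * 300.0 * ((Zm - Zv) / 5000.0) ** 0.5)) ** 0.6]

        g = ot.PythonFunction(4, 1, height)

        # illustrative input distributions, truncated where negative values
        # would be unphysical
        inputs = ot.ComposedDistribution([
            ot.TruncatedDistribution(ot.Normal(1013.0, 558.0), 0.0,
                                     ot.TruncatedDistribution.LOWER),  # Q [m3/s]
            ot.TruncatedDistribution(ot.Normal(30.0, 7.5), 0.0,
                                     ot.TruncatedDistribution.LOWER),  # Ks
            ot.Uniform(49.0, 51.0),                                    # Zv [m]
            ot.Uniform(54.0, 56.0),                                    # Zm [m]
        ])

        # uncertainty propagation by plain Monte Carlo
        sample = inputs.getSample(10000)
        heights = g(sample)
        print("mean height:", heights.computeMean())
        print("95% quantile:", heights.computeQuantilePerComponent(0.95))

    Comparing the estimated quantile with the dyke height then gives the kind of exceedance diagnostic the educational example is built around.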

    High resolution spectroscopy of methyltrioxorhenium: towards the observation of parity violation in chiral molecules

    Originating from the weak interaction, parity violation in chiral molecules has been considered a possible origin of biohomochirality. It was predicted in 1974 but has never been observed. Parity violation should lead to a very tiny frequency difference between the rovibrational spectra of the two enantiomers of a chiral molecule. We have proposed to observe this predicted frequency difference using the two-photon Ramsey fringes technique on a supersonic beam. Promising candidates for this experiment are chiral oxorhenium complexes, which exhibit a large effect, can be synthesized in large quantities and in enantiopure form, and can be seeded in a molecular beam. As a first step towards this objective, a detailed spectroscopic study of methyltrioxorhenium (MTO) has been undertaken; as the achiral parent molecule of the chiral candidates for the parity-violation experiment, it is an ideal test molecule. For the 187Re MTO isotopologue, a combined analysis of Fourier-transform microwave and infrared spectra, as well as ultra-high-resolution CO2 laser absorption spectra, enabled the assignment of 28 rotational lines and 71 rovibrational lines, some of them with a resolved hyperfine structure. A set of spectroscopic parameters in the ground and first excited states, including hyperfine structure constants, was obtained for the antisymmetric Re=O stretching mode of this molecule. This result validates the experimental approach to be followed once a chiral derivative of MTO has been synthesized, and shows the benefit of combining several spectroscopic techniques in different spectral regions, with different set-ups and resolutions. First high-resolution spectra of jet-cooled MTO, obtained on the set-up being developed for the observation of molecular parity violation, are shown, which constitutes a major step towards the targeted objective.