Hyperoxemia and excess oxygen use in early acute respiratory distress syndrome: Insights from the LUNG SAFE study
Background: Concerns exist regarding the prevalence and impact of unnecessary oxygen use in patients with acute respiratory distress syndrome (ARDS). We examined this issue in patients with ARDS enrolled in the Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE).
Methods: In this secondary analysis of the LUNG SAFE study, we determined the prevalence of, and the outcomes associated with, hyperoxemia on day 1, sustained hyperoxemia, and excessive oxygen use in patients with early ARDS. Patients who fulfilled the criteria for ARDS on day 1 and day 2 of acute hypoxemic respiratory failure were categorized based on the presence of hyperoxemia (PaO2 > 100 mmHg) on day 1, sustained hyperoxemia (i.e., present on day 1 and day 2), or excessive oxygen use (FIO2 ≥ 0.60 during hyperoxemia).
Results: Of the 2005 patients who met the inclusion criteria, 131 (6.5%) were hypoxemic (PaO2 < 55 mmHg), 607 (30%) had hyperoxemia on day 1, and 250 (12%) had sustained hyperoxemia. Excess FIO2 use occurred in 400 (66%) of the 607 patients with hyperoxemia. Excess FIO2 use decreased from day 1 to day 2 of ARDS, with most hyperoxemic patients on day 2 receiving relatively low FIO2. Multivariate analyses found no independent relationship between day 1 hyperoxemia, sustained hyperoxemia, or excess FIO2 use and adverse clinical outcomes. Mortality was 42% in patients with excess FIO2 use, compared to 39% in a propensity-matched sample of normoxemic (PaO2 55-100 mmHg) patients (P = 0.47).
Conclusions: Hyperoxemia and excess oxygen use are both prevalent in early ARDS but are most often non-sustained. No relationship was found between hyperoxemia or excessive oxygen use and patient outcome in this cohort.
Trial registration: LUNG SAFE is registered with ClinicalTrials.gov, NCT02010073.
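The oxygenation categories defined in this abstract (hypoxemia, normoxemia, hyperoxemia, sustained hyperoxemia, and excess FIO2 use) follow simple threshold rules that can be sketched as code. This is an illustrative sketch only: the thresholds come from the abstract, but the function and field names are hypothetical and are not from the LUNG SAFE analysis.

```python
def categorize_day(pao2_mmHg: float, fio2: float) -> dict:
    """Classify one day's oxygenation status using the abstract's thresholds.

    Thresholds: hypoxemia PaO2 < 55 mmHg; normoxemia 55-100 mmHg;
    hyperoxemia PaO2 > 100 mmHg; excess O2 use = FIO2 >= 0.60 while
    hyperoxemic. All names here are illustrative, not study code.
    """
    hyperoxemia = pao2_mmHg > 100
    return {
        "hypoxemia": pao2_mmHg < 55,
        "normoxemia": 55 <= pao2_mmHg <= 100,
        "hyperoxemia": hyperoxemia,
        "excess_o2": hyperoxemia and fio2 >= 0.60,
    }

def sustained_hyperoxemia(day1: dict, day2: dict) -> bool:
    # "Sustained" means hyperoxemia present on both day 1 and day 2.
    return day1["hyperoxemia"] and day2["hyperoxemia"]

# Example patient: hyperoxemic with excess FIO2 on day 1,
# still hyperoxemic but on lower FIO2 on day 2.
d1 = categorize_day(pao2_mmHg=120, fio2=0.65)
d2 = categorize_day(pao2_mmHg=110, fio2=0.40)
```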
New legislation for water intended for human consumption
The French Public Health Code, Articles R. 1321-1 to 62 and the associated annexes, brings together the French legislation on water intended for human consumption. This new text transposes the 1998 European directive into French law, but also takes up the parameters proposed by the World Health Organization in 1999. It substantially modifies the previous water regulations, notably:
- the definition of water intended for human consumption
- the place where water quality is checked
Moreover, it clearly sets out the self-monitoring requirements and the water supplier's responsibilities, and it allows for the possibility of derogations. Finally, a global quality management approach is proposed, drawing heavily on the ISO 9000 (version 2000) quality assurance framework.
Montiel Antoine J. Une législation nouvelle pour les eaux destinées à la consommation humaine. In: L'eau et le monde vivant. 28èmes Journées de l'Hydraulique. Congrès de la Société Hydrotechnique de France. Paris, 12 et 13 octobre 2004. 2004
2. - Quality of water intended for human consumption
To protect public health against waterborne risks, hygienists have been developing rules for more than a century.
Decree no. 89-3 of 3 January 1989, as amended, redefines these rules and sets out the applicable quality standards.
When a standard is exceeded, several measures other than interrupting the water supply can be implemented. Such a decision requires a clear distinction between the different meanings of the values that can be associated with a parameter.
Tricard D., Montiel Antoine. 2. - Qualité de l'eau destinée à la consommation humaine. In: L'avenir de l'eau. Quelques réponses des sciences hydrotechniques à une inquiétude mondiale. Vingt-deuxièmes journées de l'hydraulique. Paris, 15-17 septembre 1992. Tome 2, 1992
Study and monitoring of bromate ions during drinking water treatment
The object of this study is to evaluate the impact of drinking water treatment, under usual operating conditions, on bromate and bromide (a bromate precursor) concentrations in three waterworks in the Paris region. An analytical method based on ion chromatography coupled with a conductivity detector was developed; the quantification limits for bromate and bromide are 2 and 5 µg/L, respectively. The results show that preozonation does not contribute to bromate formation. Bromate ions are formed during post-ozonation and are introduced during the chlorination step. Because bromate concentrations in the water produced by these three waterworks sometimes exceed the quality limits set by French regulation, especially in summer, plant-scale trials were undertaken to minimize bromate formation without compromising disinfection. Taken together, the trials show the difficulty, or even the impossibility, of meeting the potability criteria for the bromate parameter. The last part of the study is dedicated to evaluating the possible impact of amino acids (glycine, aspartic acid, glutamic acid and isoleucine) on bromate formation. The first results show that the impact of amino acids on bromate formation seems to depend on the amino acid/bromide molar ratio, the pH and the nature of the amino acid.
Variations in trihalomethane levels in three French water distribution systems and the development of a predictive model.
Epidemiological studies have demonstrated that chlorination by-products in drinking water may cause some types of cancer in humans. However, due to differences in methodology between the various studies, it is not possible to establish a dose-response relationship. This shortcoming is due primarily to uncertainties about how exposure is measured (made difficult by the great number of compounds present), the exposure routes involved, and the variation in concentrations in water distribution systems. This is especially true for trihalomethanes, whose concentrations can double between the water treatment plant and the consumer tap. The aim of this study is to describe the behaviour of trihalomethanes in three French water distribution systems and to develop a mathematical model to predict concentrations in the distribution system using data collected from treated water at the plant (i.e., the entrance of the distribution system). In 2006 and 2007, samples were taken successively from treated water at the plant and at several points in the water distribution system in three French cities. In addition to the concentrations of the four trihalomethanes (chloroform, dichlorobromomethane, chlorodibromomethane, bromoform), many other parameters involved in their formation and affecting their concentration were also measured. The average trihalomethane concentration in the three water distribution systems ranged from 21.6 μg/L to 59.9 μg/L. The increase in trihalomethanes between the treated water at the plant and a given point in the water distribution system varied by a factor of 1.1-5.7 over all of the samples. A log-log linear regression model was constructed to predict THM concentrations in the water distribution system.
The five variables used were the trihalomethane concentration and free residual chlorine of the treated water at the plant, two variables characterizing the reactivity of organic matter (specific UV absorbance (SUVA), and δ, an indicator developed for free chlorine consumption in the treatment plant before distribution), and the water residence time in the distribution system. French regulations impose a maximum trihalomethane level for drinking water, and most tests are performed on treated water at the plant. Applied in this context, the model developed here helps to better understand trihalomethane exposure in the French population, which is particularly useful for epidemiological studies.
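A log-log linear regression of the kind described in this abstract can be sketched as ordinary least squares on log-transformed predictors. The sketch below uses entirely synthetic data and illustrative coefficients; it is not the fitted French model, and the variable names are assumptions.

```python
import numpy as np

# Log-log linear model of the general form used above:
#   log(THM_network) = b0 + b1*log(THM_plant) + b2*log(free_Cl2)
#                    + b3*log(SUVA) + b4*log(delta) + b5*log(residence_time)
# Data, coefficients and names below are synthetic/illustrative.
rng = np.random.default_rng(0)
n = 200
X = rng.uniform(0.5, 5.0, size=(n, 5))                # 5 positive predictors
true_b = np.array([0.3, 0.9, 0.2, 0.15, 0.1, 0.25])   # intercept + 5 slopes

A = np.column_stack([np.ones(n), np.log(X)])          # design matrix with intercept
log_y = A @ true_b + rng.normal(0.0, 0.05, n)         # noisy log-concentrations

# Ordinary least squares fit of the log-log model.
b_hat, *_ = np.linalg.lstsq(A, log_y, rcond=None)

# Back-transform to the concentration scale (e.g. µg/L).
pred = np.exp(A @ b_hat)
```

With enough samples and modest noise, the recovered coefficients `b_hat` closely track the generating coefficients, which is the property such a model relies on when extrapolating from plant measurements to points in the network.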
Immunocompromised patients with acute respiratory distress syndrome: Secondary analysis of the LUNG SAFE database
Background: The aim of this study was to describe data on epidemiology, ventilatory management, and outcome of acute respiratory distress syndrome (ARDS) in immunocompromised patients. Methods: We performed a post hoc analysis on the cohort of immunocompromised patients enrolled in the Large Observational Study to Understand the Global Impact of Severe Acute Respiratory Failure (LUNG SAFE) study. The LUNG SAFE study was an international, prospective study including hypoxemic patients in 459 ICUs from 50 countries across 5 continents. Results: Of 2813 patients with ARDS, 584 (20.8%) were immunocompromised, 38.9% of whom had an unspecified cause. Pneumonia, nonpulmonary sepsis, and noncardiogenic shock were their most common risk factors for ARDS. Hospital mortality was higher in immunocompromised than in immunocompetent patients (52.4% vs 36.2%; p < 0.0001), despite similar severity of ARDS. Decisions regarding limiting life-sustaining measures were significantly more frequent in immunocompromised patients (27.1% vs 18.6%; p < 0.0001). Use of noninvasive ventilation (NIV) as first-line treatment was higher in immunocompromised patients (20.9% vs 15.9%; p = 0.0048), and immunodeficiency remained independently associated with the use of NIV after adjustment for confounders. Forty-eight percent of the patients treated with NIV were intubated, and their mortality was not different from that of the patients invasively ventilated ab initio. Conclusions: Immunosuppression is frequent in patients with ARDS, and infections are the main risk factors for ARDS in these immunocompromised patients. Their management differs from that of immunocompetent patients, particularly the greater use of NIV as first-line ventilation strategy. Compared with immunocompetent subjects, they have higher mortality regardless of ARDS severity as well as a higher frequency of limitation of life-sustaining measures. Nonetheless, nearly half of these patients survive to hospital discharge. 
Trial registration: ClinicalTrials.gov, NCT02010073. Registered on 12 December 2013
Outcome of acute hypoxaemic respiratory failure: insights from the LUNG SAFE Study
Background: Current incidence and outcome of patients with acute hypoxaemic respiratory failure requiring mechanical ventilation in the intensive care unit (ICU) are unknown, especially for patients not meeting criteria for acute respiratory distress syndrome (ARDS).
Methods: An international, multicentre, prospective cohort study of patients presenting with hypoxaemia early in the course of mechanical ventilation, conducted during four consecutive weeks in the winter of 2014 in 459 ICUs from 50 countries (LUNG SAFE). Patients were enrolled with arterial oxygen tension/inspiratory oxygen fraction ratio ≤300 mmHg, new pulmonary infiltrates and need for mechanical ventilation with a positive end-expiratory pressure of ≥5 cmH2O. ICU prevalence, causes of hypoxaemia, hospital survival and factors associated with hospital mortality were measured. Patients with unilateral versus bilateral opacities were compared.
Findings: 12 906 critically ill patients received mechanical ventilation; the 34.9% with hypoxaemia and new infiltrates were enrolled and separated into ARDS (69.0%), unilateral infiltrate (22.7%) and congestive heart failure (CHF; 8.2%). The overall hospital mortality was 38.6%. CHF patients had a mortality comparable to that of ARDS patients (44.1% versus 40.4%). Patients with a unilateral infiltrate had lower unadjusted mortality, but similar adjusted mortality, compared to those with ARDS. The number of quadrants involved on chest imaging was associated with an increased risk of death. There was no difference in mortality between patients with a unilateral infiltrate and patients with ARDS with only two quadrants involved.
Interpretation: More than one-third of patients receiving mechanical ventilation have hypoxaemia and new infiltrates, with a hospital mortality of 38.6%. Survival depends on the degree of pulmonary involvement, whether or not ARDS criteria are met.
Validation and utility of ARDS subphenotypes identified by machine-learning models using clinical data: an observational, multicohort, retrospective analysis
Background: Two acute respiratory distress syndrome (ARDS) subphenotypes (hyperinflammatory and hypoinflammatory), with distinct clinical and biological features and differential treatment responses, have been identified using latent class analysis (LCA) in seven individual cohorts. To facilitate bedside identification of subphenotypes, clinical classifier models using readily available clinical variables have been described in four randomised controlled trials. We aimed to assess the performance of these models in observational cohorts of ARDS. Methods: In this observational, multicohort, retrospective study, we validated two machine-learning clinical classifier models for assigning ARDS subphenotypes in two observational cohorts of patients with ARDS: Early Assessment of Renal and Lung Injury (EARLI; n=335) and Validating Acute Lung Injury Markers for Diagnosis (VALID; n=452), with LCA-derived subphenotypes as the gold standard. The primary model comprised only vital signs and laboratory variables; the secondary model comprised all predictors in the primary model, with the addition of ventilatory variables and demographics. Model performance was assessed by calculating the area under the receiver operating characteristic curve (AUC) and calibration plots, and by assigning subphenotypes using a probability cutoff value of 0·5 to determine the sensitivity, specificity, and accuracy of the assignments. We also assessed the performance of the primary model in EARLI using data automatically extracted from an electronic health record (EHR; EHR-derived EARLI cohort).
In the Large Observational Study to Understand the Global Impact of Severe Acute Respiratory Failure (LUNG SAFE; n=2813), a multinational, observational ARDS cohort, we applied a custom classifier model (with fewer variables than the primary model) to determine the prognostic value of the subphenotypes and tested their interaction with the positive end-expiratory pressure (PEEP) strategy, with 90-day mortality as the dependent variable. Findings: The primary clinical classifier model had an AUC of 0·92 (95% CI 0·90–0·95) in EARLI and 0·88 (0·84–0·91) in VALID. Performance of the primary model was similar when using exclusively EHR-derived predictors compared with manually curated predictors (AUC=0·88 [95% CI 0·81–0·94] vs 0·92 [0·88–0·97]). In LUNG SAFE, 90-day mortality was higher in patients assigned the hyperinflammatory subphenotype than in those with the hypoinflammatory subphenotype (414 [57%] of 725 vs 694 [33%] of 2088; p<0·0001). There was a significant treatment interaction between PEEP strategy and ARDS subphenotype (p=0·041), with lower 90-day mortality in the high PEEP group of patients with the hyperinflammatory subphenotype (hyperinflammatory subphenotype: 169 [54%] of 313 patients in the high PEEP group vs 127 [62%] of 205 patients in the low PEEP group; hypoinflammatory subphenotype: 231 [34%] of 675 patients in the high PEEP group vs 233 [32%] of 734 patients in the low PEEP group). Interpretation: Classifier models using clinical variables alone can accurately assign ARDS subphenotypes in observational cohorts. Application of these models can provide valuable prognostic information and could inform management strategies for personalised treatment, including application of PEEP, once prospectively validated. Funding: US National Institutes of Health and European Society of Intensive Care Medicine
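The evaluation scheme described in this abstract (assign a subphenotype when the model probability crosses a 0·5 cutoff, then score discrimination with the AUC) can be sketched in a few lines. The sketch below uses synthetic labels and probabilities and a generic rank-based AUC; it is not the published classifier, and all names are illustrative.

```python
import numpy as np

# Synthetic stand-in for LCA-derived labels (1 = hyperinflammatory)
# and model-predicted probabilities; not study data.
rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=300)
p = np.clip(0.7 * y + 0.3 * rng.random(300), 0.0, 1.0)

# Subphenotype assignment with a probability cutoff of 0.5.
assigned = (p >= 0.5).astype(int)
accuracy = float((assigned == y).mean())

def auc(labels, scores):
    """AUC via the Mann-Whitney U statistic (rank formulation),
    equivalent to the area under the ROC curve when there are no ties."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores), dtype=float)
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

model_auc = auc(y, p)
```

On this toy data the classes are separable by construction, so the cutoff assignment and AUC are both perfect; real classifier probabilities overlap, which is why the abstract reports AUCs below 1 together with calibration plots.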