Effect of inhaled nitric oxide on pulmonary function in cystic fibrosis
Abstract: Concentrations of nitric oxide (NO) have been found to be reduced in both the upper and lower airways of patients with cystic fibrosis (CF). As NO modulates bronchomuscular tone, low NO levels may contribute to the obstructive lung disease in these patients. To assess whether increasing inspiratory NO concentrations has any impact on lung function, we studied 13 CF patients aged 14–38 years in a clinically stable condition and nine healthy controls. NO was applied via a mixing chamber for 5 min at concentrations of 100 parts per billion, and 1 and 40 parts per million. Spirometry was performed at baseline and after inhalation on each occasion. There were no clinical side-effects at any NO concentration, and no changes in oxygen saturation were observed. Lung function remained unchanged in all subjects throughout the study period. Sputum nitrate and nitrite concentrations before and after inhalation of the highest NO concentration (40 ppm) in eight CF patients showed no significant changes, although a tendency towards higher nitrate levels was observed (399 ± 231 vs. 556 ± 474 μmol l⁻¹). Therefore, inhaled NO, at either the physiological levels present in the upper airway of normal individuals or those used therapeutically to treat pulmonary hypertension, has no immediate effect on bronchomuscular tone in patients with cystic fibrosis.
A compact laboratory transmission X-ray microscope for the water window
In the water window (2.2–4.4 nm) the attenuation of radiation in water is significantly smaller than in organic material. Therefore, intact biological specimens (e.g. cells) can be investigated in their natural environment. To make this technique accessible to users in a laboratory environment, a Full-Field Laboratory Transmission X-ray Microscope (L-TXM) has been developed. The L-TXM is operated with a nitrogen laser plasma source employing an InnoSlab high-power laser system for plasma generation. For microscopy, the Ly-α emission of highly ionized nitrogen at 2.48 nm is used. A laser plasma brightness of 5 × 10^11 photons/(s · sr · μm²) in the line at 2.48 nm is demonstrated at a laser power of 70 W. In combination with a state-of-the-art Cr/V multilayer condenser mirror, the sample is illuminated with 10^6 photons/(μm² · s). Using objective zone plates, 35–40 nm lines can be resolved with exposure times < 60 s. The exposure time can be further reduced to 20 s by using new multilayer condenser optics and operating the laser at its full power of 130 W. These exposure times enable cryo tomography in a laboratory environment.
Solving ill-posed bilevel programs
This paper deals with ill-posed bilevel programs, i.e., problems admitting multiple lower-level solutions for some upper-level parameters. Many publications have been devoted to the standard optimistic case of this problem, where the difficulty is essentially moved from the objective function to the feasible set. This new problem is simpler, but there is no guarantee of obtaining local optimal solutions for the original optimistic problem by this process. Considering the intrinsic non-convexity of bilevel programs, computing local optimal solutions is the best one can hope for in most cases. To achieve this goal, we start by establishing an equivalence between the original optimistic problem and a certain set-valued optimization problem. Next, we develop optimality conditions for the latter problem and show that they generalize all the results currently known in the literature on optimistic bilevel optimization. Our approach is then extended to multiobjective bilevel optimization, and completely new results are derived for problems with vector-valued upper- and lower-level objective functions. Numerical implementations of the results of this paper are provided on some examples, in order to demonstrate how the original optimistic problem can be solved in practice by means of a special set-valued optimization problem.
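For concreteness, the optimistic bilevel program referred to above can be sketched in standard notation (not necessarily the paper's own):

```latex
% Lower-level solution set for an upper-level choice x:
S(x) \;=\; \operatorname*{argmin}_{y \in K(x)} \, f(x, y)
% Optimistic bilevel program: among the lower-level solutions,
% pick one most favorable to the upper-level objective F:
\min_{x \in X} \; \varphi_o(x),
\qquad
\varphi_o(x) \;=\; \min_{y \in S(x)} F(x, y)
```

The problem is ill-posed precisely when S(x) contains more than one point for some feasible x; the optimistic value function φ_o then hides a second, inner minimization, which is what makes the feasible-set reformulation mentioned in the abstract nontrivial.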
Towards Machine Wald
The past century has seen a steady increase in the need of estimating and
predicting complex systems and making (possibly critical) decisions with
limited information. Although computers have made possible the numerical
evaluation of sophisticated statistical models, these models are still designed
\emph{by humans} because there is currently no known recipe or algorithm for
dividing the design of a statistical model into a sequence of arithmetic
operations. Indeed, enabling computers to \emph{think} as \emph{humans} have the
ability to do when faced with uncertainty is challenging in several major ways:
(1) Finding optimal statistical models remains to be formulated as a well-posed
problem when information on the system of interest is incomplete and comes in
the form of a complex combination of sample data, partial knowledge of
constitutive relations and a limited description of the distribution of input
random variables. (2) The space of admissible scenarios along with the space of
relevant information, assumptions, and/or beliefs, tend to be infinite
dimensional, whereas calculus on a computer is necessarily discrete and finite.
With this purpose in mind, this paper explores the foundations of a rigorous framework
for the scientific computation of optimal statistical estimators/models and
reviews their connections with Decision Theory, Machine Learning, Bayesian
Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty
Quantification, and Information-Based Complexity.
Ethical principles and recommendations for the medical management of differences of sex development (DSD)/intersex in children and adolescents
Distributionally robust L1-estimation in multiple linear regression
Linear regression is one of the most important and widely used techniques in data analysis, for which a key step is the estimation of the unknown parameters. However, it is often carried out under the assumption that full information about the error distribution is available, which is clearly unrealistic in practice. In this paper, we propose a distributionally robust formulation of the L1-estimation (or least absolute value estimation) problem, in which the only knowledge of the error distribution is that it belongs to a well-defined ambiguity set. Using duality theory, we then reformulate the estimation problem as a computationally tractable conic optimization problem. Finally, a numerical example is solved as a conic optimization problem to demonstrate the effectiveness of the proposed approach.
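To make the nominal (non-robust) L1-estimation problem concrete, here is a minimal pure-Python sketch for the special slope-only case y ≈ a·x. It is not the paper's conic reformulation; it uses the classical fact that minimizing Σᵢ|yᵢ − a·xᵢ| = Σᵢ|xᵢ|·|yᵢ/xᵢ − a| (for xᵢ ≠ 0) reduces to a weighted median of the ratios yᵢ/xᵢ, and it illustrates why L1 estimation is robust to outliers in the first place.

```python
def weighted_median(values, weights):
    """Smallest value whose cumulative weight reaches half the total weight."""
    pairs = sorted(zip(values, weights))
    total = sum(weights)
    acc = 0.0
    for v, w in pairs:
        acc += w
        if acc >= total / 2:
            return v

def l1_slope(xs, ys):
    """Slope-only L1 estimate: argmin_a sum_i |y_i - a*x_i| for x_i != 0.

    The objective equals sum_i |x_i| * |y_i/x_i - a|, so the minimizer is
    a weighted median of the ratios y_i/x_i with weights |x_i|.
    """
    ratios = [y / x for x, y in zip(xs, ys)]
    weights = [abs(x) for x in xs]
    return weighted_median(ratios, weights)

# One grossly corrupted observation barely moves the L1 estimate:
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.0, 1000.0]  # true slope is about 2; last point is an outlier
print(l1_slope(xs, ys))        # close to 2, unlike a least-squares fit
```

A least-squares fit on the same data would be dragged far above 2 by the outlier; the L1 estimator's insensitivity to heavy-tailed errors is exactly what the distributionally robust formulation then extends to whole families of error distributions.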
Markers of Myocardial Damage Predict Mortality in Patients With Aortic Stenosis
Background: Cardiovascular magnetic resonance (CMR) is increasingly used for risk stratification in aortic stenosis (AS). However, the relative prognostic power of CMR markers and their respective thresholds remains undefined.
Objectives: Using machine learning, the study aimed to identify prognostically important CMR markers in AS and the thresholds at which they predict mortality.
Methods: Patients with severe AS undergoing aortic valve replacement (AVR) (n = 440, derivation cohort; n = 359, validation cohort) were prospectively enrolled across 13 international sites (median 3.8 years' follow-up). CMR was performed shortly before surgical or transcatheter AVR. A random survival forest model was built using 29 variables (13 CMR) with post-AVR death as the outcome.
Results: There were 52 deaths in the derivation cohort and 51 in the validation cohort. The 4 most predictive CMR markers were extracellular volume fraction, late gadolinium enhancement, indexed left ventricular end-diastolic volume (LVEDVi), and right ventricular ejection fraction. Across the whole cohort and in asymptomatic patients, risk-adjusted predicted mortality increased strongly once extracellular volume fraction exceeded 27%, while late gadolinium enhancement >2% showed persistently high risk. Increased mortality was also observed with both large (LVEDVi >80 mL/m²) and small (LVEDVi ≤55 mL/m²) ventricles, and with high (>80%) and low (≤50%) right ventricular ejection fraction. Predictive performance improved when these 4 markers were added to clinical factors (3-year C-index: 0.778 vs 0.739). The prognostic thresholds and risk stratification by CMR variables were reproduced in the validation cohort.
Conclusions: Machine learning identified myocardial fibrosis and biventricular remodeling markers as the top predictors of survival in AS and highlighted their nonlinear association with mortality. These markers may have potential for optimizing decisions about AVR.
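As a purely didactic illustration, the univariate cutoffs quoted in the Results can be encoded as a simple flagging function. The function name, structure, and return format are mine, not the study's; the actual model is a random survival forest over 29 variables, not a rule list.

```python
def cmr_risk_flags(ecv_pct, lge_pct, lvedvi_ml_m2, rvef_pct):
    """Flag the abstract's reported high-risk thresholds for the four
    top CMR markers. A didactic restatement of the published cutoffs,
    not the study's random-survival-forest model."""
    flags = []
    if ecv_pct > 27:
        flags.append("extracellular volume fraction > 27%")
    if lge_pct > 2:
        flags.append("late gadolinium enhancement > 2%")
    if lvedvi_ml_m2 > 80 or lvedvi_ml_m2 <= 55:
        flags.append("LVEDVi outside 55-80 mL/m^2")  # both large and small ventricles carry risk
    if rvef_pct > 80 or rvef_pct <= 50:
        flags.append("RVEF outside 50-80%")          # nonlinear: both extremes flagged
    return flags

# A patient with elevated ECV but otherwise mid-range markers triggers one flag:
print(cmr_risk_flags(ecv_pct=30, lge_pct=1, lvedvi_ml_m2=70, rvef_pct=60))
```

Note how the LVEDVi and RVEF rules flag both tails of their ranges, which is the "nonlinear association with mortality" the Conclusions emphasize.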