Long-Term Efficacy of Pulmonary Rehabilitation in Patients with Occupational Respiratory Diseases
Background: Pulmonary rehabilitation is a well-recognized treatment option in chronic obstructive lung disease, improving exercise performance, respiratory symptoms and quality of life. For occupational respiratory diseases, which can be cost-intensive owing to compensation needs, very little information is available. Objectives: This study evaluates the usefulness of pulmonary rehabilitation in patients with occupational respiratory diseases, some involving complex alterations of lung function, and the sustainability of its effects. Methods: We studied 263 patients with occupational respiratory diseases (asthma, silicosis, asbestosis, chronic obstructive pulmonary disease) who underwent a 4-week inpatient rehabilitation program, with follow-up examinations 3 and 12 months later. The outcomes evaluated were lung function, 6-min walking distance (6MWD), maximum exercise capacity (Wmax), skeletal muscle strength, respiratory symptoms, exacerbations and associated medical consultations, quality of life (SF-36, SGRQ), anxiety/depression (HADS) and Medical Research Council and Baseline and Transition Dyspnea Index scores. Results: Compared to baseline, there were significant (p < 0.05) improvements in 6MWD, Wmax and muscle strength immediately after rehabilitation, and these were maintained over 12 months (p < 0.05). Effects were less pronounced in asbestosis. Overall, within 12 months after rehabilitation there was a significant reduction in the rate of exacerbations by 35%, in antibiotic therapy by 27% and in the use of health care services by 17%. No changes were seen in the questionnaire outcomes. Conclusions: Pulmonary rehabilitation is effective even in the complex settings of occupational respiratory diseases, providing sustained improvement of functional capacity and reducing health care utilization. Copyright (C) 2012 S. Karger AG, Basel
A general purpose HyperTransport-based Application Accelerator Framework
HyperTransport provides a flexible, low-latency and high-bandwidth interconnection between processors, as well as between processors and peripheral components. The interconnect is therefore no longer a performance bottleneck when integrating application-specific accelerators into modern computing systems. Current FPGAs provide huge computational power and permit the acceleration of compute-intensive kernels. We therefore present a general-purpose architecture based on HyperTransport and modern FPGAs to accelerate time-consuming computations, together with a prototypical implementation of this architecture. We used an AMD Opteron-based system with the HTX Board [6] to demonstrate that common applications can benefit from available hardware accelerators. A cryptographic example showed that the encryption of files larger than 50 kByte can be successfully accelerated
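The file-size threshold reported for the cryptographic example follows from a simple amortization argument: each offloaded call pays a fixed setup cost (dispatch, DMA, transfer initiation), which must be recovered by the accelerator's higher throughput. A minimal sketch of this break-even model, where the overhead and both throughput figures are made-up illustrative numbers and not measurements from the paper:

```python
def break_even(overhead_s=1e-3, cpu_rate=50e6, accel_rate=400e6):
    """File size (bytes) at which offloading matches host-only execution.

    Solves overhead_s + n / accel_rate = n / cpu_rate for n.
    All rates (bytes/s) and the per-call overhead are assumed values.
    """
    return overhead_s / (1.0 / cpu_rate - 1.0 / accel_rate)

print(round(break_even()))  # → 57143 bytes with these assumed numbers
```

Below the break-even size the fixed overhead dominates and the host CPU wins; well above it, the speedup approaches the ratio of the two throughputs.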
Exploiting the HTX-Board as a Coprocessor for Exact Arithmetics
Certain numerical computations benefit from dedicated computation units, e.g. units providing increased computation accuracy. By exploiting current interconnection technologies and advances in reconfigurable logic, the restrictions and drawbacks of past approaches to application-specific units can be overcome. This paper presents our implementation of an FPGA-based hardware unit for exact arithmetic. The unit is tightly integrated into the host system using state-of-the-art HyperTransport technology. A corresponding runtime system provides OS-level support, including dynamic function resolution. The approach demonstrates the suitability and applicability of the chosen technologies, paving the way toward broader acceptance of reconfigurable coprocessor technology for application-specific computing
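The accuracy motivation for such a unit can be illustrated in software: fixed-precision floating point suffers catastrophic cancellation that exact arithmetic avoids. The sketch below uses Python's `Fraction` as a software analogue of an exact accumulator; it illustrates the problem class, not the paper's hardware design:

```python
from fractions import Fraction

# Catastrophic cancellation: the small term is absorbed by the large ones
# in double precision, so the float result is off by a visible margin.
terms = [1e16, 3.14, -1e16]

float_sum = sum(terms)                       # 4.0 — the 3.14 is lost
exact_sum = sum(Fraction(t) for t in terms)  # exact rational arithmetic

print(float_sum, float(exact_sum))  # → 4.0 3.14
```

A hardware exact-arithmetic unit performs the equivalent of the exact accumulation at full speed, which is what makes tight coprocessor integration attractive.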
A Bayesian geostatistical approach for evaluating the uncertainty of contaminant mass discharges from point sources
Polynomial Response Surfaces for Probabilistic Risk Assessment and Risk Control via Robust Design
Comparison of data-driven uncertainty quantification methods for a carbon dioxide storage benchmark scenario
A variety of methods is available to quantify uncertainties arising within
the modeling of flow and transport in carbon dioxide storage, but there is a
lack of thorough comparisons. Usually, raw data from such storage sites can
hardly be described by theoretical statistical distributions since only very
limited data is available. Hence, exact information on distribution shapes for
all uncertain parameters is very rare in realistic applications. We discuss and
compare four different methods tested for data-driven uncertainty
quantification based on a benchmark scenario of carbon dioxide storage. In the
benchmark, for which we provide data and code, carbon dioxide is injected into
a saline aquifer modeled by the nonlinear capillarity-free fractional flow
formulation for two incompressible fluid phases, namely carbon dioxide and
brine. To cover different aspects of uncertainty quantification, we incorporate
various sources of uncertainty such as uncertainty of boundary conditions, of
conceptual model definitions and of material properties. We consider recent
versions of the following non-intrusive and intrusive uncertainty
quantification methods: arbitrary polynomial chaos, spatially adaptive sparse
grids, kernel-based greedy interpolation and hybrid stochastic Galerkin. The
performance of each approach is demonstrated by assessing the expectation value
standard deviation of the carbon dioxide saturation against a reference
statistic based on Monte Carlo sampling. We compare the convergence of all
methods reporting on accuracy with respect to the number of model runs and
resolution. Finally, we offer suggestions about the methods' advantages and
disadvantages that can guide the modeler for uncertainty quantification in
carbon dioxide storage and beyond
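The Monte Carlo reference statistic used as the benchmark's ground truth can be sketched in a few lines: sample the uncertain parameters, run the model per sample, and estimate expectation and standard deviation of the output. The one-parameter "saturation" function below is a hypothetical stand-in for the fractional-flow simulator, and the uniform parameter range is an illustrative assumption:

```python
import math
import random

def saturation(perm):
    # Hypothetical toy response: CO2 saturation at a fixed location as a
    # function of an uncertain permeability-like parameter. NOT the
    # benchmark's two-phase flow model.
    return 1.0 - math.exp(-2.0 * perm)

def monte_carlo_stats(n, seed=42):
    """Reference expectation and standard deviation via plain MC sampling."""
    rng = random.Random(seed)
    vals = [saturation(rng.uniform(0.5, 1.5)) for _ in range(n)]
    mean = sum(vals) / n
    var = sum((v - mean) ** 2 for v in vals) / (n - 1)
    return mean, math.sqrt(var)

mean, std = monte_carlo_stats(50_000)
```

Monte Carlo converges at roughly O(1/sqrt(n)) regardless of dimension, which is why it serves as a robust reference while the surrogate methods under comparison try to reach the same statistics with far fewer model runs.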
Factorization of Numbers with the temporal Talbot effect: Optical implementation by a sequence of shaped ultrashort pulses
We report on the successful operation of an analogue computer designed to
factor numbers. Our device relies solely on the interference of classical light
and brings together the field of ultrashort laser pulses with number theory.
Indeed, the frequency component of the electric field corresponding to a
sequence of appropriately shaped femtosecond pulses is determined by a Gauss
sum which allows us to find the factors of a number
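The Gauss-sum criterion behind this analogue scheme can be sketched numerically: the truncated sum s_N(ℓ) = (1/M) Σ_{m=0}^{M-1} exp(2πi m² N/ℓ) has magnitude exactly 1 when ℓ divides N (every phase is then an integer multiple of 2π) and is suppressed otherwise. A minimal software sketch; the truncation M and the acceptance threshold are illustrative choices, not parameters of the optical experiment:

```python
import cmath

def gauss_sum(N, ell, M):
    """Truncated Gauss sum s_N(ell); |s| = 1 exactly when ell divides N."""
    return sum(cmath.exp(2j * cmath.pi * m * m * N / ell)
               for m in range(M)) / M

def trial_factors(N, M=10, threshold=0.9):
    """Trial divisors up to sqrt(N) whose Gauss-sum magnitude is large."""
    return [ell for ell in range(2, int(N ** 0.5) + 1)
            if abs(gauss_sum(N, ell, M)) > threshold]

print(trial_factors(105))  # → [3, 5, 7]
```

In the optical implementation the same sum is realized physically: each shaped femtosecond pulse contributes one quadratic-phase term, and the interference of the pulse train evaluates s_N(ℓ) in the electric field.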