Does Urban Noise Represent a Hazard to Health?
The problem of noise as a potential health hazard to urban man has been raised. The literature was used to establish two premises: that cities are noisy environments, and that noise-free societies have less coronary artery disease (CAD) than do industrialized sections of the world. These differences also hold for rural and urban areas of the United States.
Geographical questions concerning rate differentials for CAD have been addressed by numerous disciplines. Subsequently, social, psychological, and physical explanations have been put forth. Throughout this paper the emphasis has been placed on the physical aspects of noise exposure. The conceptual frame utilizes noise-load, overload, stress and deformation. Noise was described as a force capable of eliciting a predictable physiological response from the human organism. Noise was further conceptualized as a by-product of technology which exerts a stressor effect upon the cardiovascular system of man.
The investigation, from which the data were generated, was a micro-view of physiological effects, in that the only measurement taken was heart rate change in hospitalized patients in response to noise.
Heart rate was calculated under low-noise conditions, and comparisons were subsequently made to heart rate during noise. In addition, the noise climate for each of two coronary care units (CCUs) was tabulated over a 24-hour period. Noise levels in the CCUs were generally higher than might be found in a man's own home. Only between 3:00 and 4:00 in the morning was ambient noise equal to or below the suggested level (45 dBA) for any sustained period of time.
Conditions of noise elicited heart rate change in 30 of 37 subjects (P=.001). This finding relates to the presence of a change and does not speak to the extent or meaningfulness of that change. Patients with heart attacks responded to noise conditions (n=18, P=.01), in that 17 of the 18 patients experienced a change in heart rate when noise was introduced. No differences were noted across categories by site of infarction.
It was further hypothesized that the extent of heart rate response (HRR) would be a function of the gap between low-noise and high-noise conditions. A regression analysis showed the response to be significantly correlated with noise gap for the total population (N=37, P=.05); however, the correlation was minimal (r=.4528), with slightly less than 21 per cent of the variation in HRR explained by the variation in noise gap. Those subjects more than 60 years of age (n=20) also showed a significant correlation (r=.5173), with 26 per cent of the variation in HRR explained by the variation in noise gap. The highest correlation (r=.7373) was obtained for ten persons with a past history of heart disease (r²=.5436, P=.05).
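The relationship between a correlation coefficient and "variance explained" reported above (r=.4528 explaining about 21 per cent of variation, since r² ≈ .205) can be sketched in Python. The data below are hypothetical, generated to mimic a weak noise-gap/heart-rate-response relationship; they are not the study's data.

```python
import math
import random

def pearson_r(xs, ys):
    # Pearson product-moment correlation between paired samples
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

random.seed(0)
# Hypothetical sample of 37: noise gap (dBA) vs. heart rate response,
# weakly correlated by construction
gap = [random.uniform(5, 30) for _ in range(37)]
hrr = [0.3 * g + random.gauss(0, 5) for g in gap]

r = pearson_r(gap, hrr)
print(f"r = {r:.4f}, variance explained (r^2) = {r * r:.1%}")
```

Squaring r converts a correlation into the proportion of outcome variance attributable to the predictor, which is why a "significant" r of .45 still leaves nearly 80 per cent of the variation unexplained.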
The implications for site planning and structure are many, particularly for hospitals, nursing homes, and convalescent homes where older persons with heart disease are housed. Site planning should give attention to the noise environment, and structural planning to sound-proofing. Interviews with architects and hospital builders showed this goal to be attainable mechanically, if somewhat costly.
It was agreed by those interviewed that such costs as evolve from noise-reduction or noise-proofing in hospitals would most certainly be passed on to the consumer and be reflected in his health care costs.
Additional research is needed that focuses on the effects of noise on the cardiovascular system over time, using standardized criteria for cardiovascular health and cardiovascular disease. Other research might focus on larger samples of patients hospitalized with CAD, in an effort to identify an index of physiological and psychological responses to noise.
Educational Needs of Dislocated Workers in Minnesota.
Economic changes in the 1980s in the United States caused job losses in a number of major industries. Dislocated workers from four Minnesota industries--manufacturing, mining, lumber, and agriculture--were interviewed about their job goals, plans for retraining, and needs for improved basic skills in reading and mathematics. This report includes policy recommendations as to what unions, companies, government, and educational institutions can do to aid dislocated workers. A summary of the study appeared in the June 1988 CURA Reporter. A project of the Interactive Research Grants Program, Center for Urban and Regional Affairs and the Office of the Vice President for Academic Affairs, University of Minnesota.
Survey of sampling-based methods for uncertainty and sensitivity analysis.
Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition
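Steps (2), (3), and (5) of the sampling-based workflow above, with rank transformation as the sensitivity measure, can be sketched in Python. The two-input model is a toy stand-in chosen for illustration, not an example from the survey.

```python
import random

def ranks(xs):
    # Rank transform: 1 = smallest value (no ties expected for continuous samples)
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(xs, ys):
    # Spearman rank correlation: Pearson correlation of the rank-transformed data.
    # For untied ranks both rank vectors share the same variance, so sxy/sxx suffices.
    rx, ry = ranks(xs), ranks(ys)
    m = (len(xs) + 1) / 2  # mean rank
    sxy = sum((a - m) * (b - m) for a, b in zip(rx, ry))
    sxx = sum((a - m) ** 2 for a in rx)
    return sxy / sxx

random.seed(1)
# Step (2): sample the uncertain inputs
x1 = [random.uniform(0, 1) for _ in range(200)]
x2 = [random.uniform(0, 1) for _ in range(200)]
# Step (3): propagate samples through a toy model in which x1 dominates
y = [a + 0.1 * b ** 3 for a, b in zip(x1, x2)]
# Step (5): rank correlations identify the influential input
print("x1:", round(spearman(x1, y), 2), " x2:", round(spearman(x2, y), 2))
```

The rank transformation makes the measure robust to monotone nonlinearity (such as the cubic term above), which is why it features prominently in the survey's list of techniques.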
A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.
Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost
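The basic idea of a sampling-based propagation of evidence theory can be sketched as follows. Epistemic uncertainty in an input is specified as focal elements (intervals) carrying basic probability masses; sampling within each focal element bounds the image of the model over that element, and belief/plausibility of an event are then mass sums over elements certainly/possibly inside the event. The model, focal elements, and masses below are hypothetical, and this naive per-element sampling is only a schematic of the more sophisticated strategy the abstract describes.

```python
import random

def model(x):
    # Toy monotone model standing in for an expensive simulation
    return x ** 2

# Evidence-theory input specification: focal elements (intervals) with
# basic probability masses summing to 1 (hypothetical values)
focal = [((0.0, 1.0), 0.5), ((0.5, 1.5), 0.3), ((1.0, 2.0), 0.2)]

def bel_pl(threshold, n=1000, seed=2):
    # Sampling-based belief and plausibility of the event {model(x) <= threshold}
    rng = random.Random(seed)
    bel = pl = 0.0
    for (lo, hi), mass in focal:
        ys = [model(rng.uniform(lo, hi)) for _ in range(n)]
        ymin, ymax = min(ys), max(ys)  # sampled bounds of the model image
        if ymax <= threshold:
            bel += mass  # focal element lies entirely inside the event
        if ymin <= threshold:
            pl += mass   # focal element intersects the event
    return bel, pl

b, p = bel_pl(1.0)
print(f"Bel = {b:.2f}, Pl = {p:.2f}")
```

The gap between belief and plausibility (here 0.5 vs. 0.8) is exactly the "less restrictive specification of uncertainty" the abstract refers to: probability theory would force a single number between the two.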
Measuring the impact of ambulatory red blood cell transfusion on home functional status: study protocol for a pilot randomized controlled trial
SPIRIT 2013: SPIRIT (Standard Protocol Items: Recommendations for Interventional Trials) Checklist for clinical trial protocols. (DOCX 65 kb)
A comparison of approximation techniques for variance-based sensitivity analysis of biochemical reaction systems
Background: Sensitivity analysis is an indispensable tool for the analysis of complex systems. In a recent paper, we introduced a thermodynamically consistent variance-based sensitivity analysis approach for studying the robustness and fragility properties of biochemical reaction systems under uncertainty in the standard chemical potentials of the activated complexes of the reactions and the standard chemical potentials of the molecular species. In that approach, key sensitivity indices were estimated by Monte Carlo sampling, which is computationally very demanding and impractical for large biochemical reaction systems. Computationally efficient algorithms are needed to make variance-based sensitivity analysis applicable to realistic cellular networks, modeled by biochemical reaction systems that consist of a large number of reactions and molecular species.

Results: We present four techniques, derivative approximation (DA), polynomial approximation (PA), Gauss-Hermite integration (GHI), and orthonormal Hermite approximation (OHA), for analytically approximating the variance-based sensitivity indices associated with a biochemical reaction system. Using a well-known model of the mitogen-activated protein kinase signaling cascade as a case study, we numerically compare the approximation quality of these techniques against traditional Monte Carlo sampling. Our results indicate that, although DA is computationally the most attractive technique, special care should be exercised when using it for sensitivity analysis, since it may only be accurate at low levels of uncertainty. On the other hand, PA, GHI, and OHA are computationally more demanding than DA but can work well at high levels of uncertainty. GHI results in slightly better accuracy than PA, but it is more difficult to implement. OHA produces the most accurate approximation results and can be implemented in a straightforward manner. It turns out that the computational cost of the four approximation techniques considered in this paper is orders of magnitude smaller than that of traditional Monte Carlo estimation. Software, coded in MATLAB®, which implements all sensitivity analysis techniques discussed in this paper, is available free of charge.

Conclusions: Estimating variance-based sensitivity indices of a large biochemical reaction system is a computationally challenging task that can only be addressed via approximations. Among the methods presented in this paper, a technique based on orthonormal Hermite polynomials seems to be an acceptable candidate for the job, producing very good approximation results for a wide range of uncertainty levels in a fraction of the time required by traditional Monte Carlo sampling.
Global sensitivity analysis of stochastic computer models with joint metamodels
Global sensitivity analysis, used to quantify the influence of uncertain input variables on the variability of numerical model responses, has already been applied to deterministic computer codes; deterministic means here that the same set of input variables always gives the same output value. This paper proposes a global sensitivity analysis methodology for stochastic computer codes, for which the result of each code run is itself random. The framework of the joint modeling of the mean and dispersion of heteroscedastic data is used. To deal with the complexity of computer experiment outputs, nonparametric joint models are discussed and a new Gaussian process-based joint model is proposed. The relevance of these models is analyzed based upon two case studies. Results show that the joint modeling approach yields accurate sensitivity index estimators even when heteroscedasticity is strong.
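The setting the paper addresses, a stochastic code whose output mean and dispersion both vary with the input, can be illustrated with a toy simulator in Python. This sketch only exhibits the heteroscedasticity that the joint (mean, dispersion) model must capture, using brute-force replication at each design point; the paper's contribution is fitting both components with far fewer runs via nonparametric and Gaussian-process joint models.

```python
import random
import statistics

def stochastic_code(x, rng):
    # Hypothetical stochastic simulator: repeated runs at the same x differ,
    # and the noise level (dispersion) itself grows with x
    return 2.0 * x + rng.gauss(0.0, 0.2 + x)

rng = random.Random(4)
design = [i / 10 for i in range(11)]  # design points in [0, 1]
mean_hat, disp_hat = [], []
for x in design:
    ys = [stochastic_code(x, rng) for _ in range(200)]  # replicated runs
    mean_hat.append(statistics.fmean(ys))  # component 1: mean model target
    disp_hat.append(statistics.stdev(ys))  # component 2: dispersion model target

print(f"dispersion at x=0: {disp_hat[0]:.2f}, at x=1: {disp_hat[-1]:.2f}")
```

The dispersion roughly sextuples across the design range; a single homoscedastic metamodel would misattribute that extra spread, which is the motivation for modeling mean and dispersion jointly.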