Risk Assessment Model Applied on Building Physics: Statistical Data Acquisition and Stochastic Modeling of Indoor Moisture Supply in Swedish Multi-family Dwellings
Though risk assessment is highly appreciated and asked for by practitioners, there is a lack of tools to perform proper risk assessment and risk management procedures in the area of building physics. Many of the influential variables, such as outdoor temperature and indoor moisture supply, vary stochastically, so a general approach to risk assessment is complicated. The aim of this study is to define risk concepts in building physics and to develop a risk assessment model for use in the field. The study is based on hazard identification tools used in the process industry, such as What-if, HAZOP, FMEA and VMEA. The tools are compared and used in the modeling process, which leads to the identification of noise factors during design, construction and service life. A literature survey is conducted to find statistical input data for the applicability study, which is based on stochastic simulations and air flow path modeling in CONTAM. Combining the hazards and safeguards in a scenario with Monte Carlo simulations gives results with a distribution that depends on the variability of the noise factors. The applicability study shows good correspondence with measurements of the indoor moisture supply in Swedish multi-family dwellings. Risk and safe scenarios are defined by comparing the result of the scenario with an allowed level of consequences. By implementing risk management in building physics design, it is possible to identify critical points and avoid unwanted extra costs. In addition, risks concerning indoor climate, health and durability are clarified.
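The Monte Carlo procedure the abstract describes (sample stochastic noise factors, combine them into a consequence, and compare the resulting distribution with an allowed level) can be sketched as follows. The distributions, parameter values, and the threshold here are illustrative assumptions, not the study's actual inputs.

```python
import random

def simulate_risk(n_samples=100_000, allowed_level=9.0, seed=1):
    """Toy Monte Carlo risk assessment: sample noise factors, combine
    them into a consequence, and estimate the probability that the
    consequence exceeds the allowed level.  All distributions and the
    threshold are illustrative assumptions."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_samples):
        # Hypothetical noise factors (g/m^3): indoor moisture supply
        # and an outdoor-climate contribution, both stochastic.
        moisture_supply = rng.gauss(4.0, 1.5)
        outdoor_effect = rng.gauss(3.0, 1.0)
        consequence = moisture_supply + outdoor_effect
        if consequence > allowed_level:
            exceed += 1
    return exceed / n_samples

p_fail = simulate_risk()
# A scenario counts as "safe" when the exceedance probability stays
# below a chosen acceptance criterion, and as a "risk" scenario otherwise.
```

With these assumed normals the combined consequence is itself normal, so the estimated exceedance probability converges to the corresponding tail probability as the sample count grows.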
Estimates of Electronic Medical Records in U.S. Emergency Departments
Background: Policymakers advocate universal electronic medical records (EMRs) and propose incentives for “meaningful use” of EMRs. Though emergency departments (EDs) are particularly sensitive to the benefits and unintended consequences of EMR adoption, surveillance has been limited. We analyze data from a nationally representative sample of US EDs to ascertain the adoption of various EMR functionalities. Methodology/Principal Findings: We analyzed data from the National Hospital Ambulatory Medical Care Survey, after pooling data from 2005 and 2006, reporting proportions with 95% confidence intervals (95% CI). In addition to reporting adoption of various EMR functionalities, we used logistic regression to ascertain patient and hospital characteristics predicting “meaningful use,” defined as a “basic” system (managing demographic information, computerized provider order entry, and lab and imaging results). We found that 46% (95% CI 39–53%) of US EDs reported having adopted EMRs. Computerized provider order entry was present in 21% (95% CI 16–27%), and only 15% (95% CI 10–20%) had warnings for drug interactions or contraindications. The “basic” definition of “meaningful use” was met by 17% (95% CI 13–21%) of EDs. Rural EDs were substantially less likely to have a “basic” EMR system than urban EDs (odds ratio 0.19, 95% CI 0.06–0.57, p = 0.003), and Midwestern (odds ratio 0.37, 95% CI 0.16–0.84, p = 0.018) and Southern (odds ratio 0.47, 95% CI 0.26–0.84, p = 0.011) EDs were substantially less likely than Northeastern EDs to have a “basic” system. Conclusions/Significance: EMRs are becoming more prevalent in US EDs, though only a minority use EMRs in a “meaningful” way, no matter how “meaningful” is defined. Rural EDs are less likely to have an EMR than metropolitan EDs, and Midwestern and Southern EDs are less likely to have an EMR than Northeastern EDs. We discuss the nuances of how to define “meaningful use,” and the importance of considering not only adoption, but also full implementation and consequences.
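The proportions with 95% confidence intervals reported above can be illustrated with a simple Wald interval for a proportion. Note that this is only a sketch: NHAMCS is a complex survey, so the paper's actual intervals require design-based variance estimates, and the sample size used below is a made-up illustration.

```python
import math

def wald_ci(p_hat, n, z=1.96):
    """95% Wald confidence interval for a proportion under simple
    random sampling.  Illustrative only: a complex survey design such
    as NHAMCS requires design-based standard errors instead."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

# Illustrative: 46% EMR adoption among a hypothetical n = 300 sampled EDs.
lo, hi = wald_ci(0.46, 300)
```

The simple-random-sampling interval is narrower than the design-based 39–53% interval reported in the abstract, which is exactly why survey weights and clustering cannot be ignored in such analyses.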
Investigation of top mass measurements with the ATLAS detector at LHC
Several methods for the determination of the mass of the top quark with the ATLAS detector at the LHC are presented. All dominant decay channels of the top quark can be explored. The measurements are in most cases dominated by systematic uncertainties. New methods have been developed to control those related to the detector. The results indicate that a total error on the top mass at the level of 1 GeV should be achievable.
Comment: 47 pages, 40 figures
Mass corrections in decay and the role of distribution amplitudes
We consider mass correction effects on the polar angular distribution of a baryon--antibaryon pair created in the chain decay process, generalizing a previous analysis of Carimalo. We show the relevance of the features of the baryon distribution amplitudes and estimate the electromagnetic corrections to the QCD results.
Comment: 26 pages + 3 figures, REVTEX 3.0, figures appended as uuencoded, tar-compressed postscript file
Ratio of Hadronic Decay Rates of J/\psi and \psi(2S) and the \rho\pi Puzzle
The so-called \rho\pi puzzle of J/\psi and \psi(2S) decays is examined using the experimental data available to date. Two different approaches were taken to estimate the ratio of J/\psi and \psi(2S) hadronic decay rates. While one of the estimates could not yield the exact ratio of \psi(2S) to J/\psi inclusive hadronic decay rates, the other, based on a computation of the inclusive ggg decay rate for \psi(2S) (J/\psi) by subtracting other decay rates from the total decay rate, differs by two standard deviations from the naive prediction of perturbative QCD, even though its central value is nearly twice as large as what was naively expected. A comparison between this ratio, upon making corrections for specific exclusive two-body decay modes, and the corresponding experimental data confirms the puzzles in J/\psi and \psi(2S) decays. We find from our analysis that the exclusively reconstructed hadronic decays of the \psi(2S) account for only a small fraction of its total decays, and a ratio exceeding the above estimate should be expected to occur for a considerable number of the remaining decay channels. We also show that the recent new results from the BES experiment provide crucial tests of various theoretical models proposed to explain the puzzle.
Comment: 8 pages, no figures, 4 tables
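The "naive prediction of perturbative QCD" mentioned above is the so-called 14% rule: since both the ggg and the e+e- widths scale with the charmonium wave function at the origin, the ratio of \psi(2S) to J/\psi hadronic branching fractions should roughly equal the ratio of their leptonic ones. A back-of-the-envelope check, using approximate PDG-like leptonic branching fractions assumed here rather than taken from the paper:

```python
# Approximate leptonic branching fractions (assumed PDG-like values,
# not taken from the paper under discussion).
br_psi2s_ee = 7.9e-3   # BR(psi(2S) -> e+ e-)
br_jpsi_ee = 5.97e-2   # BR(J/psi  -> e+ e-)

# Naive pQCD expectation for BR(psi(2S) -> h) / BR(J/psi -> h):
q_naive = br_psi2s_ee / br_jpsi_ee
# q_naive comes out near 0.13, i.e. the "14% rule"; channels such as
# rho-pi fall far below this expectation, which is the puzzle.
```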
Modelling the nucleon wave function from soft and hard processes
Current light-cone wave functions for the nucleon are unsatisfactory since they are in conflict with the data on the nucleon's Dirac form factor at large momentum transfer. Therefore, we attempt a determination of a new wave function respecting theoretical ideas on its parameterization and satisfying the following constraints: it should provide a soft Feynman contribution to the proton's form factor in agreement with data; it should be consistent with current parameterizations of the valence quark distribution functions; and lastly it should provide an acceptable value for the J/\psi \to N \bar{N} decay width. The latter process is calculated within the modified perturbative approach to hard exclusive reactions. A simultaneous fit to the three sets of data leads to a wave function whose x-dependent part, the distribution amplitude, shows the same type of asymmetry as those distribution amplitudes constrained by QCD sum rules. The asymmetry is, however, much more moderate than in those amplitudes. Our distribution amplitude resembles the asymptotic one in shape, but the position of the maximum is somewhat shifted.
Comment: 32 pages RevTex + PS-file with 5 figures in uu-encoded, compressed file
Search for CP Violation in the Decay Z -> b (b bar) g
About three million hadronic decays of the Z collected by ALEPH in the years 1991-1994 are used to search for anomalous CP violation beyond the Standard Model in the decay Z -> b \bar{b} g. The study is performed by analyzing angular correlations between the two quarks and the gluon in three-jet events and by measuring the differential two-jet rate. No signal of CP violation is found. For the combinations of anomalous CP-violating couplings, limits of \hat{h}_b < 0.59 and h^{\ast}_b < 3.02 are given at 95% CL.
Comment: 8 pages, 1 postscript figure, uses here.sty, epsfig.sty
Search for the glueball candidates f0(1500) and fJ(1710) in gamma gamma collisions
Data taken with the ALEPH detector at LEP1 have been used to search for gamma gamma production of the glueball candidates f0(1500) and fJ(1710) via their decay to pi+pi-. No signal is observed, and upper limits on the product of gamma gamma width and pi+pi- branching ratio have been measured to be Gamma(gamma gamma -> f0(1500)) x BR(f0(1500) -> pi+pi-) < 0.31 keV and Gamma(gamma gamma -> fJ(1710)) x BR(fJ(1710) -> pi+pi-) < 0.55 keV at 95% confidence level.
Comment: 10 pages, 3 figures
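The 95% CL upper limits quoted above follow from observing no signal. For a counting experiment with zero observed events and negligible background, the classical Poisson upper limit on the expected signal count is -ln(0.05), roughly 3 events. This is a generic sketch of that textbook result, not ALEPH's actual limit-setting procedure, which also folds in efficiency and luminosity:

```python
import math

def poisson_upper_limit_zero(cl=0.95):
    """Classical upper limit on a Poisson mean when zero events are
    observed: solve exp(-mu) = 1 - cl for mu."""
    return -math.log(1.0 - cl)

n_up = poisson_upper_limit_zero()  # about 3.0 signal events at 95% CL
# Dividing this event-count limit by selection efficiency and
# integrated luminosity (times the relevant branching ratio) converts
# it into a cross-section or partial-width limit like those above.
```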