Does IT Spending Matter on Hospital Financial Performance and Quality?
This research explored the impact of IT spending on hospital financial performance and hospital quality. We developed two research hypotheses accordingly. The first hypothesis was that IT spending would be positively related to hospital financial performance, and the second hypothesis was that hospitals with higher IT spending would have better quality metrics. We used the 2017 American Hospital Association Survey data and the HCAHPS dataset from the Medicare website. We tested three hospital financial measures and three quality measures. We employed t-tests and ANOVA models to test the hypotheses. Results were inconclusive for both hypotheses. Evidence showed statistical significance in two out of seven tests.
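The abstract names the tests but not the exact setup; the following is a minimal sketch, assuming hypothetical column names ("it_spending", "operating_margin") and a median/tercile grouping, of how a two-sample t-test and a one-way ANOVA could compare a financial measure across IT-spending groups.

```python
# Minimal sketch (not the authors' code): compare a hypothetical financial
# metric across IT-spending groups. File and column names are assumptions.
import pandas as pd
from scipy import stats

hospitals = pd.read_csv("aha_hcahps_merged.csv")  # hypothetical merged dataset

# Split hospitals at the median of IT spending (one plausible grouping).
median_it = hospitals["it_spending"].median()
high = hospitals.loc[hospitals["it_spending"] > median_it, "operating_margin"].dropna()
low = hospitals.loc[hospitals["it_spending"] <= median_it, "operating_margin"].dropna()

t_stat, t_p = stats.ttest_ind(high, low, equal_var=False)  # Welch's t-test
print(f"t = {t_stat:.2f}, p = {t_p:.3f}")

# One-way ANOVA across IT-spending terciles.
terciles = pd.qcut(hospitals["it_spending"], 3, labels=["low", "mid", "high"])
groups = [hospitals.loc[terciles == g, "operating_margin"].dropna()
          for g in ["low", "mid", "high"]]
f_stat, f_p = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {f_p:.3f}")
```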
Semiautomated Skeletonization of the Pulmonary Arterial Tree in Micro-CT Images
We present a simple and robust approach that utilizes planar images at different angular rotations combined with unfiltered back-projection to locate the central axes of the pulmonary arterial tree. Three-dimensional points are selected interactively by the user. The computer calculates a sub-volume unfiltered back-projection orthogonal to the vector connecting the two points and centered on the first point. Because more x-rays are absorbed at the thickest portion of the vessel, in the unfiltered back-projection, the darkest pixel is assumed to be the center of the vessel. The computer replaces this point with the newly computer-calculated point. A second back-projection is calculated around the original point orthogonal to a vector connecting the newly calculated first point and the user-determined second point. The darkest pixel within the reconstruction is determined. The computer then replaces the second point with the XYZ coordinates of the darkest pixel within this second reconstruction. Following a vector based on a moving average of previously determined 3-dimensional points along the vessel's axis, the computer continues this skeletonization process until stopped by the user. The computer estimates the vessel diameter along the set of previously determined points using a method similar to the full width-half max algorithm. On all subsequent vessels, the process works the same way except that at each point, distances between the current point and all previously determined points along different vessels are determined. If the distance is less than the previously estimated diameter, the vessels are assumed to branch. This user/computer interaction continues until the vascular tree has been skeletonized.
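A simplified sketch of the core tracking loop described above follows; it is not the authors' implementation. The helper reconstruct_subvolume() is a hypothetical stand-in for the planar-image back-projection step, and the branch check and full width-half max diameter estimate are omitted.

```python
# Minimal sketch: step along a vessel by taking the darkest voxel of a small
# unfiltered back-projection orthogonal to the current tracking direction.
import numpy as np

def darkest_point(subvolume, origin, spacing):
    """World coordinates of the darkest voxel in a sub-volume."""
    idx = np.unravel_index(np.argmin(subvolume), subvolume.shape)
    return origin + np.asarray(idx) * spacing

def track_vessel(seed, second_point, reconstruct_subvolume,
                 n_average=5, max_steps=500):
    """Follow a vessel centreline from two user-selected points (np arrays)."""
    points = [np.asarray(seed, dtype=float)]
    direction = np.asarray(second_point, dtype=float) - points[0]
    direction /= np.linalg.norm(direction)
    for _ in range(max_steps):  # in the paper the user stops the process
        # Sub-volume back-projection orthogonal to the current direction
        # (hypothetical helper returning the volume, its origin and spacing).
        sub, origin, spacing = reconstruct_subvolume(points[-1], direction)
        points.append(darkest_point(sub, origin, spacing))
        # Moving average of recent points defines the next stepping direction.
        recent = np.asarray(points[-n_average:])
        step = recent[-1] - recent[0]
        norm = np.linalg.norm(step)
        if norm == 0:
            break
        direction = step / norm
    return np.asarray(points)
```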
Re-Assessing the U.S. Quality Adjustment to Computer Prices: The Role of Durability and Changing Software
In the second half of the 1990s, the positive impact of information technology on productivity growth for the United States became apparent. The measurement of this productivity improvement depends on hedonic procedures adopted by the Bureau of Labor Statistics (BLS) and Bureau of Economic Analysis (BEA). In this paper we suggest a new reason why conventional hedonic methods may overstate the price decline of personal computers. We model computers as a durable good and suppose that software changes over time, which influences the efficiency of a computer. Anticipating future increases in software, purchasers may "overbuy" characteristics, in the sense that the purchased bundle of characteristics is not fully utilized in the first months or year that a computer is owned. In this case, we argue that hedonic procedures do not provide valid bounds on the true price of computer services at the time the machine is purchased with the concurrent level of software. To assess these theoretical results we estimate the model and find that before 2000 the hedonic price index constructed with BLS methods overstates the fall in computer prices. After 2000, however, the BLS hedonic index falls more slowly, reflecting the reduced marginal cost of acquiring (and therefore the reduced marginal benefit to users of) characteristics such as RAM, hard disk space or speed.
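For reference, the conventional approach being re-assessed is a hedonic regression of price on characteristics. The sketch below, with an illustrative data file and column names (not BLS/BEA code or data), shows a time-dummy variant: log price is regressed on log characteristics plus period dummies, and the exponentiated period coefficients trace a quality-adjusted price index.

```python
# Minimal sketch of a time-dummy hedonic price index for PCs; all inputs are
# illustrative assumptions, not the procedures used by BLS or BEA.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

pcs = pd.read_csv("pc_transactions.csv")  # hypothetical: price, ram_mb, disk_gb, speed_mhz, period

model = smf.ols(
    "np.log(price) ~ np.log(ram_mb) + np.log(disk_gb) + np.log(speed_mhz) + C(period)",
    data=pcs,
).fit()

# Exponentiated period-dummy coefficients relative to the base period.
periods = sorted(pcs["period"].unique())
index = {periods[0]: 100.0}
for p in periods[1:]:
    coef = model.params.get(f"C(period)[T.{p}]", 0.0)
    index[p] = 100.0 * np.exp(coef)
print(index)
```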
The Interplay of Cluster and Galaxy Evolution
We review here the interplay of cluster and galaxy evolution. As a case study, we consider the Butcher-Oemler effect and propose that it is the result of the changing rate of cluster merger events in a hierarchical universe. This case study highlights the need for new catalogs of clusters and groups that possess quantified morphologies. We present such a sample here, namely the Sloan Digital Sky Survey (SDSS) C4 Catalog, which has been objectively selected from the SDSS spectroscopic galaxy sample. We outline here the C4 algorithm and present first results based on the SDSS Early Data Release, including an X-ray luminosity-velocity dispersion (L_x-sigma) scaling relationship (as a function of cluster morphology), and the density-SFR relation of galaxies within C4 clusters (Gomez et al. 2003). We also discuss the merger of Coma and the NGC4839 group, and its effect on the galaxy populations in these systems. We finish with a brief discussion of a new sample of Hdelta-selected galaxies (i.e., k+a, post-starburst galaxies) obtained from the SDSS spectroscopic survey.
Comment: Invited review at the JENAM 2002 Workshop on "Galaxy Evolution in Groups and Clusters", Porto, Sep 5-7 2002, eds. Lobo, Serote-Roos and Biviano, Kluwer, in press.
Acoustic Oscillations in the Early Universe and Today
During its first ~100,000 years, the universe was a fully ionized plasma with a tight coupling by Thomson scattering between the photons and matter. The trade-off between gravitational collapse and photon pressure causes acoustic oscillations in this primordial fluid. These oscillations will leave predictable imprints in the spectra of the cosmic microwave background and the present-day matter-density distribution. Recently, the BOOMERANG and MAXIMA teams announced the detection of these acoustic oscillations in the cosmic microwave background (observed at redshift ~1000). Here, we compare these CMB detections with the corresponding acoustic oscillations in the matter-density power spectrum (observed at redshift ~0.1). These consistent results, from two different cosmological epochs, provide further support for our standard Hot Big Bang model of the universe.
Comment: To appear in the journal Science. 6 pages, 1 color figure.
Detecting the Baryons in Matter Power Spectra
We examine power spectra from the Abell/ACO rich cluster survey and the 2dF Galaxy Redshift Survey (2dFGRS) for observational evidence of features produced by the baryons. A non-negligible baryon fraction produces relatively sharp oscillatory features at specific wavenumbers in the matter power spectrum. However, the mere existence of baryons will also produce a global suppression of the power spectrum. We look for both of these features using the false discovery rate (FDR) statistic. We show that the window effects on the Abell/ACO power spectrum are minimal, which has allowed for the discovery of discrete oscillatory features in the power spectrum. On the other hand, there are no statistically significant oscillatory features in the 2dFGRS power spectrum, which is expected from the survey's broad window function. After accounting for window effects, we apply a scale-independent bias to the 2dFGRS power spectrum, P_{Abell}(k) = b^2 P_{2dF}(k) with b = 3.2. We find that the overall shapes of the Abell/ACO and the biased 2dFGRS power spectra are entirely consistent over the range 0.02 <= k <= 0.15 h Mpc^-1. We examine the range of Omega_{matter} and baryon fraction for which these surveys could detect significant suppression in power. The reported baryon fractions for both the Abell/ACO and 2dFGRS surveys are high enough to cause a detectable suppression in power (after accounting for errors, windows and k-space sampling). Using the same technique, we also examine, given the best-fit baryon density obtained from BBN, whether it is possible to detect additional suppression due to dark matter-baryon interaction. We find that the limits on dark matter cross section/mass derived from these surveys are the same as those ruled out in a recent study by Chen, Hannestad and Scherrer.
Comment: 11 pages of text, 6 figures. Submitted to Ap
Quasi-Exact Helical Cone Beam Reconstruction for Micro CT
A cone beam micro-CT system is set up to collect truncated helical cone beam data. This system includes a micro-focal X-ray source, a precision computer-controlled X-Y-Z-theta stage, and an image intensifier coupled to a large-format CCD detector. The helical scanning mode is implemented by rotating and translating the stage while keeping the X-ray source and detector stationary. A chunk of bone and a mouse leg are scanned and quasi-exact reconstruction is performed using the approach proposed in J. Hu et al. (2001). This approach introduced the original idea of accessory paths with upper and lower virtual detectors having infinite axial extent. It has a filtered backprojection structure, which is desirable in practice, and possesses the advantages of being simple to implement and computationally efficient compared to other quasi-exact helical cone beam algorithms for the long object problem.
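Rotating and translating the specimen stage while the source and detector stay fixed is geometrically equivalent to moving the source along a helix around a stationary object. The sketch below, with assumed parameter values rather than the authors' scan settings, generates that equivalent helical source trajectory in the object frame.

```python
# Minimal sketch: equivalent helical source path for a stage that rotates
# (theta) and translates (z) under a fixed source and detector.
import numpy as np

def helical_source_path(radius_mm, pitch_mm, turns, views_per_turn):
    """Source positions (x, y, z) in the object frame for a helical scan."""
    angles = np.linspace(0.0, 2.0 * np.pi * turns,
                         int(turns * views_per_turn), endpoint=False)
    x = radius_mm * np.cos(angles)
    y = radius_mm * np.sin(angles)
    z = pitch_mm * angles / (2.0 * np.pi)  # axial advance per revolution
    return np.column_stack([x, y, z]), angles

# Example with assumed values: 3 turns, 400 views per turn,
# 50 mm source-to-axis distance, 2 mm pitch.
path, angles = helical_source_path(radius_mm=50.0, pitch_mm=2.0,
                                   turns=3, views_per_turn=400)
print(path.shape)  # (1200, 3)
```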