Evaluation of the Performance of Smoothing Functions in Generalized Additive Models for Spatial Variation in Disease
Generalized additive models (GAMs) with bivariate smoothing functions have been applied to estimate spatial variation in risk for many types of cancers. Only a handful of studies have evaluated the performance of smoothing functions applied in GAMs with regard to different geographical areas of elevated risk and different risk levels. This study evaluates the ability of different smoothing functions to detect overall spatial variation of risk and elevated risk in diverse geographical areas at various risk levels using a simulation study. We created five scenarios with different true risk area shapes (circle, triangle, linear) in a square study region. We applied four different smoothing functions in the GAMs, including two types of thin plate regression splines (TPRS) and two versions of locally weighted scatterplot smoothing (loess). We tested the null hypothesis of constant risk and detected areas of elevated risk using analysis of deviance with permutation methods and assessed the performance of the smoothing methods based on the spatial detection rate, sensitivity, accuracy, precision, power, and false-positive rate. The results showed that all methods had a higher sensitivity and a consistently moderate-to-high accuracy rate when the true disease risk was higher. The models generally performed better in detecting elevated risk areas than detecting overall spatial variation. One of the loess methods had the highest precision in detecting overall spatial variation across scenarios and outperformed the other methods in detecting a linear elevated risk area. The TPRS methods outperformed loess in detecting elevated risk in two circular areas.
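As a hedged sketch of the setup this abstract describes (not the paper's actual GAM code, which the abstract does not provide), one can simulate a square study region with a smooth circular elevated-risk area and apply a bivariate smoother; here SciPy's `SmoothBivariateSpline` stands in for a TPRS or loess term, and all names and values are illustrative:

```python
import numpy as np
from scipy.interpolate import SmoothBivariateSpline

rng = np.random.default_rng(0)

# Simulated unit-square study region with a smooth circular area of elevated
# risk centered at (0.5, 0.5); all values are illustrative.
n = 400
x, y = rng.uniform(0, 1, n), rng.uniform(0, 1, n)
true_risk = 0.1 + 0.2 * np.exp(-((x - 0.5) ** 2 + (y - 0.5) ** 2) / (2 * 0.15 ** 2))
observed = true_risk + rng.normal(0, 0.02, n)   # noisy observed risk surface

# Bivariate spline smoother as a stand-in for the GAM's TPRS/loess term.
smoother = SmoothBivariateSpline(x, y, observed, s=n * 0.02 ** 2)
fitted_center = smoother(0.5, 0.5)[0, 0]   # inside the elevated-risk area
fitted_edge = smoother(0.5, 0.9)[0, 0]     # background area
print(round(fitted_center, 3), round(fitted_edge, 3))
```

Detection then amounts to asking whether the fitted surface is elevated where the true risk is elevated, which the permutation-based analysis of deviance in the paper formalizes.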
Evaluating Geographically Weighted Regression Models for Environmental Chemical Risk Analysis
In the evaluation of cancer risk related to environmental chemical exposures, the effect of many correlated chemicals on disease is often of interest. The relationship between correlated environmental chemicals and health effects is not always constant across a study area, as exposure levels may change spatially due to various environmental factors. Geographically weighted regression (GWR) has been proposed to model spatially varying effects. However, concerns about collinearity effects, including regression coefficient sign reversal (i.e., the reversal paradox), may limit the applicability of GWR for environmental chemical risk analysis. A penalized version of GWR, the geographically weighted lasso (GWL), has been proposed to remediate the collinearity effects in GWR models. Our focus in this study was on assessing through a simulation study the ability of GWR and GWL to correctly identify spatially varying chemical effects for a mixture of correlated chemicals within a study area. Our results showed that GWR suffered from the reversal paradox, while GWL overpenalized the effects for the chemical most strongly related to the outcome.
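The core GWR step is a weighted least-squares fit at each location, with weights from a spatial kernel. A minimal sketch under illustrative assumptions (toy one-dimensional coordinates and a made-up spatially varying slope; not the study's implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: one exposure whose effect varies smoothly along a 1-D transect.
n = 200
coords = rng.uniform(0, 10, n)
x = rng.normal(size=n)
beta_true = 0.5 + 0.2 * coords            # spatially varying slope
y = beta_true * x + rng.normal(0, 0.1, n)

def gwr_at(s, bandwidth=1.0):
    """Weighted least-squares fit at location s: the basic GWR step."""
    w = np.exp(-0.5 * ((coords - s) / bandwidth) ** 2)  # Gaussian kernel
    X = np.column_stack([np.ones(n), x])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[1]  # local slope for the exposure

print(round(gwr_at(2.0), 2), round(gwr_at(8.0), 2))
```

With several correlated exposures in `X`, the same local fit is where collinearity (and the sign reversals the abstract mentions) enters; GWL adds an L1 penalty to each local fit.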
Assessment of Weighted Quantile Sum Regression for Modeling Chemical Mixtures and Cancer Risk
In evaluation of cancer risk related to environmental chemical exposures, the effect of many chemicals on disease is ultimately of interest. However, because of potentially strong correlations among chemicals that occur together, traditional regression methods suffer from collinearity effects, including regression coefficient sign reversal and variance inflation. In addition, penalized regression methods designed to remediate collinearity may have limitations in selecting the truly bad actors among many correlated components. The recently proposed method of weighted quantile sum (WQS) regression attempts to overcome these problems by estimating a body burden index, which identifies important chemicals in a mixture of correlated environmental chemicals. Our focus was on assessing through simulation studies the accuracy of WQS regression in detecting subsets of chemicals associated with health outcomes (binary and continuous) in site-specific analyses and in non-site-specific analyses. We also evaluated the performance of the penalized regression methods of lasso, adaptive lasso, and elastic net in correctly classifying chemicals as bad actors or unrelated to the outcome. We based the simulation study on data from the National Cancer Institute Surveillance Epidemiology and End Results Program (NCI-SEER) case–control study of non-Hodgkin lymphoma (NHL) to achieve realistic exposure situations. Our results showed that WQS regression had good sensitivity and specificity across a variety of conditions considered in this study. The shrinkage methods had a tendency to incorrectly identify a large number of components, especially in the case of strong association with the outcome.
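WQS regression scores each exposure into quantiles and estimates nonnegative weights, summing to one, for a single weighted index whose coefficient measures the mixture effect. A hedged sketch of that estimation step on toy data (not the NCI-SEER analysis; the constrained least-squares fit below is a simplification of the full WQS bootstrap procedure):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# Toy mixture: 5 correlated chemicals, only the first two are "bad actors".
n, p = 500, 5
z = rng.normal(size=(n, 1))
X = z + rng.normal(size=(n, p))            # correlated exposures
q = np.argsort(np.argsort(X, axis=0), axis=0) // (n // 4)  # quartile scores 0..3
y = 0.6 * q[:, 0] + 0.4 * q[:, 1] + rng.normal(0, 1, n)

def sse(params):
    """SSE for y ~ b0 + b1 * (q @ w), with w the simplex-constrained weights."""
    b0, b1, w = params[0], params[1], params[2:]
    return np.sum((y - b0 - b1 * (q @ w)) ** 2)

cons = {"type": "eq", "fun": lambda par: np.sum(par[2:]) - 1.0}
bounds = [(None, None), (None, None)] + [(0.0, 1.0)] * p
res = minimize(sse, x0=np.r_[0.0, 1.0, np.full(p, 1 / p)],
               bounds=bounds, constraints=cons)
weights = res.x[2:]
print(np.round(weights, 2))  # bulk of the weight should land on chemicals 0 and 1
```

The simplex constraint on the weights is what lets WQS concentrate the index on the truly associated components rather than spreading coefficients across all correlated exposures.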
Catchment Area Analysis Using Bayesian Regression Modeling
A catchment area (CA) is the geographic area and population from which a cancer center draws patients. Defining a CA allows a cancer center to describe its primary patient population and assess how well it meets the needs of cancer patients within the CA. A CA definition is required for cancer centers applying for National Cancer Institute (NCI)-designated cancer center status. In this research, we constructed both diagnosis and diagnosis/treatment CAs for the Massey Cancer Center (MCC) at Virginia Commonwealth University. We constructed diagnosis CAs for all cancers based on Virginia state cancer registry data and Bayesian hierarchical logistic regression models. We constructed a diagnosis/treatment CA using billing data from MCC and a Bayesian hierarchical Poisson regression model. To define CAs, we used exceedance probabilities for county random effects to assess unusual spatial clustering of patients diagnosed or treated at MCC after adjusting for important demographic covariates. We used the MCC CAs to compare patient characteristics inside and outside the CAs. Among cancer patients living within the MCC CA, patients diagnosed at MCC were more likely to be minority, female, uninsured, or on Medicaid.
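Given posterior draws of the county random effects, the exceedance-probability rule described here reduces to a tail-probability computation per county. A sketch with simulated draws standing in for a fitted hierarchical model (county effects, cutoff, and draw counts are all illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated posterior draws for six county random effects, in place of output
# from a fitted Bayesian hierarchical model.
n_draws, n_counties = 4000, 6
post = rng.normal(loc=[0.8, 0.1, -0.3, 0.0, 0.5, -0.6],
                  scale=0.25, size=(n_draws, n_counties))

# Exceedance probability Pr(u_j > 0 | data): counties above a chosen cutoff
# (0.9 here, as an illustration) are flagged as part of the catchment area.
exceed = (post > 0).mean(axis=0)
in_ca = exceed > 0.9
print(np.round(exceed, 2), in_ca)
```

The cutoff trades sensitivity against specificity of the CA boundary, much like a significance threshold in cluster detection.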
Intercomparison of standard resolution and high resolution TOVS soundings with radiosonde, lidar, and surface temperature/humidity data
One objective of the FIRE Cirrus IFO is to characterize relationships between cloud properties inferred from satellite observations at various scales to those obtained directly or inferred from very high resolution measurements. Satellite-derived NOAA-9 high and standard resolution TIROS Operational Vertical Sounder (TOVS) soundings are compared with directly measured lidar, surface temperature, humidity, and vertical radiosonde profiles associated with the Ft. McCoy site. The results of this intercomparison should be useful in planning future cloud experiments.
Visualizing Spacetime Curvature via Frame-Drag Vortexes and Tidal Tendexes I. General Theory and Weak-Gravity Applications
When one splits spacetime into space plus time, the Weyl curvature tensor
(vacuum Riemann tensor) gets split into two spatial, symmetric, and trace-free
(STF) tensors: (i) the Weyl tensor's so-called "electric" part or tidal field,
and (ii) the Weyl tensor's so-called "magnetic" part or frame-drag field. Being
STF, the tidal field and frame-drag field each have three orthogonal
eigenvector fields which can be depicted by their integral curves. We call the
integral curves of the tidal field's eigenvectors tendex lines, we call each
tendex line's eigenvalue its tendicity, and we give the name tendex to a
collection of tendex lines with large tendicity. The analogous quantities for
the frame-drag field are vortex lines, their vorticities, and vortexes. We
build up physical intuition into these concepts by applying them to a variety
of weak-gravity phenomena: a spinning, gravitating point particle, two such
particles side by side, a plane gravitational wave, a point particle with a
dynamical current-quadrupole moment or dynamical mass-quadrupole moment, and a
slow-motion binary system made of nonspinning point particles. [Abstract is
abbreviated; full abstract also mentions additional results.]
Comment: 25 pages, 20 figures, matches the published version
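For intuition, the simplest concrete case is the Newtonian tidal field of a point mass, E_jk = (M/r^3)(delta_jk - 3 n_j n_k): its eigenvectors give one radial tendex line with tendicity -2M/r^3 (stretching) and two transverse ones with tendicity +M/r^3 (squeezing). A short numerical check of this, in units G = c = 1 and with illustrative values:

```python
import numpy as np

# Tidal field of a point mass M at the origin, evaluated at position r_vec
# (Newtonian limit, G = c = 1): E_jk = (M / r^3) * (delta_jk - 3 n_j n_k).
def tidal_field(M, r_vec):
    r = np.linalg.norm(r_vec)
    n = r_vec / r
    return (M / r**3) * (np.eye(3) - 3 * np.outer(n, n))

E = tidal_field(M=1.0, r_vec=np.array([0.0, 0.0, 10.0]))
tendicities, axes = np.linalg.eigh(E)  # eigenvalues = tendicities

# Expect one radial tendicity of -2M/r^3 and two transverse ones of +M/r^3;
# the trace vanishes, as it must for an STF tensor.
print(np.round(tendicities, 5))
print(np.isclose(np.trace(E), 0.0))
```

Tracing the eigenvector fields of such a tensor through space yields the tendex lines the abstract describes.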
Information preserving structures: A general framework for quantum zero-error information
Quantum systems carry information. Quantum theory supports at least two
distinct kinds of information (classical and quantum), and a variety of
different ways to encode and preserve information in physical systems. A
system's ability to carry information is constrained and defined by the noise
in its dynamics. This paper introduces an operational framework, using
information-preserving structures to classify all the kinds of information that
can be perfectly (i.e., with zero error) preserved by quantum dynamics. We
prove that every perfectly preserved code has the same structure as a matrix
algebra, and that preserved information can always be corrected. We also
classify distinct operational criteria for preservation (e.g., "noiseless",
"unitarily correctible", etc.) and introduce two new and natural criteria for
measurement-stabilized and unconditionally preserved codes. Finally, for
several of these operational criteria, we present efficient (polynomial in the
state-space dimension) algorithms to find all of a channel's
information-preserving structures.
Comment: 29 pages, 19 examples. Contains complete proofs for all the theorems
in arXiv:0705.428
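A minimal example of an information-preserving structure is the fully dephasing qubit channel, which preserves with zero error exactly the classical information encoded in the computational basis; the preserved structure is the commutative matrix algebra of diagonal matrices. A quick numerical illustration (this specific channel is our example, not one worked in the paper's abstract):

```python
import numpy as np

# Fully dephasing qubit channel: rho -> P0 rho P0 + P1 rho P1.
# Its information-preserving structure is the algebra of diagonal matrices:
# diagonal (classical) states pass through with zero error.
P0 = np.diag([1.0, 0.0])
P1 = np.diag([0.0, 1.0])

def dephase(rho):
    return P0 @ rho @ P0 + P1 @ rho @ P1

diag_state = np.diag([0.7, 0.3])   # classical mixture: preserved exactly
plus = np.full((2, 2), 0.5)        # |+><+|: off-diagonal coherences destroyed

print(np.allclose(dephase(diag_state), diag_state))  # True
print(np.allclose(dephase(plus), plus))              # False
```

Preserving a noncommutative matrix algebra, by contrast, corresponds to protecting genuinely quantum information, e.g. a decoherence-free subspace.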
Randomized trial comparing proactive, high-dose versus reactive, low-dose intravenous iron supplementation in hemodialysis (PIVOTAL): Study design and baseline data
Background: Intravenous (IV) iron supplementation is a standard maintenance treatment for hemodialysis (HD) patients, but the optimum dosing regimen is unknown. Methods: PIVOTAL (Proactive IV irOn Therapy in hemodiALysis patients) is a multicenter, open-label, blinded endpoint, randomized controlled (PROBE) trial. Incident HD adults receiving erythropoiesis-stimulating agents (ESA) were randomized to a proactive, high-dose IV iron arm (iron sucrose administered unless serum ferritin >700 μg/L and/or TSAT ≥40%) or a reactive, low-dose IV iron arm (iron sucrose administered if ferritin <200 μg/L or TSAT <20%). We hypothesized that proactive, high-dose IV iron would be noninferior to reactive, low-dose IV iron for the primary outcome of first occurrence of nonfatal myocardial infarction (MI), nonfatal stroke, hospitalization for heart failure, or death from any cause. If noninferiority is confirmed with a noninferiority limit of 1.25 for the hazard ratio of the proactive strategy relative to the reactive strategy, a test for superiority will be carried out. Secondary outcomes include infection-related endpoints, ESA dose requirements, and quality-of-life measures. As an event-driven trial, the study will continue until at least 631 primary outcome events have accrued, but the expected duration of follow-up is 2-4 years. Results: Of the 2,589 patients screened across 50 UK sites, 2,141 (83%) were randomized. At baseline, 65.3% were male, the median age was 65 years, and 79% were white. According to eligibility criteria, all patients were on ESA at screening. Prior stroke and MI were present in 8% and 9% of the cohort, respectively, and 44% of patients had diabetes at baseline. Baseline data for the randomized cohort were generally concordant with recent data from the UK Renal Registry. Conclusions: PIVOTAL will provide important information about the optimum dosing of IV iron in HD patients representative of usual clinical practice. Trial Registration: EudraCT number: 2013-002267-25.
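The noninferiority-then-superiority rule stated in this design amounts to comparing the upper confidence limit of the hazard ratio first with the margin of 1.25 and then with 1.0. A sketch with illustrative numbers (not trial results):

```python
import math

# Noninferiority check on a hazard ratio, given a point estimate and its
# standard error on the log scale. The numbers below are illustrative only.
def hr_ci(hr, se_log_hr, z=1.96):
    log_hr = math.log(hr)
    return (math.exp(log_hr - z * se_log_hr), math.exp(log_hr + z * se_log_hr))

margin = 1.25
lo, hi = hr_ci(hr=0.90, se_log_hr=0.08)
noninferior = hi < margin
superior = noninferior and hi < 1.0   # hierarchical test: only after noninferiority
print(round(lo, 2), round(hi, 2), noninferior, superior)
```

With these illustrative inputs the proactive strategy would be declared noninferior but not superior, since the upper limit clears 1.25 but not 1.0.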
BOSS-LDG: A Novel Computational Framework that Brings Together Blue Waters, Open Science Grid, Shifter and the LIGO Data Grid to Accelerate Gravitational Wave Discovery
We present a novel computational framework that connects Blue Waters, the
NSF-supported, leadership-class supercomputer operated by NCSA, to the Laser
Interferometer Gravitational-Wave Observatory (LIGO) Data Grid via Open Science
Grid technology. To enable this computational infrastructure, we configured,
for the first time, a LIGO Data Grid Tier-1 Center that can submit
heterogeneous LIGO workflows using Open Science Grid facilities. In order to
enable a seamless connection between the LIGO Data Grid and Blue Waters via
Open Science Grid, we utilize Shifter to containerize LIGO's workflow software.
This work represents the first time Open Science Grid, Shifter, and Blue Waters
are unified to tackle a scientific problem and, in particular, it is the first
time a framework of this nature is used in the context of large scale
gravitational wave data analysis. This new framework has been used in the last
several weeks of LIGO's second discovery campaign to run the most
computationally demanding gravitational wave search workflows on Blue Waters,
and accelerate discovery in the emergent field of gravitational wave
astrophysics. We discuss the implications of this novel framework for a wider
ecosystem of High Performance Computing users.
Comment: 10 pages, 10 figures. Accepted as a Full Research Paper to the 13th
IEEE International Conference on eScience
