Exact active subspace Metropolis-Hastings, with applications to the Lorenz-96 system
We consider the application of active subspaces to inform a
Metropolis-Hastings algorithm, thereby aggressively reducing the computational
dimension of the sampling problem. We show that the original formulation, as
proposed by Constantine, Kent, and Bui-Thanh (SIAM J. Sci. Comput.,
38(5):A2779-A2805, 2016), possesses asymptotic bias. Using pseudo-marginal
arguments, we develop an asymptotically unbiased variant. Our algorithm is
applied to a synthetic multimodal target distribution as well as a Bayesian
formulation of a parameter inference problem for a Lorenz-96 system.
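The abstract does not spell out the unbiased variant, but the pseudo-marginal idea it invokes can be illustrated with a minimal sketch. Everything below is an assumption for illustration: a toy bimodal 2D target (not the Lorenz-96 posterior), a hand-picked one-dimensional "active" direction (the actual method derives it from gradient information), and a plain Monte Carlo estimator of the smoothed marginal along that direction.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # toy bimodal 2D density (a stand-in, not the paper's Lorenz-96 posterior)
    return np.logaddexp(-0.5*np.sum((x - 1.5)**2), -0.5*np.sum((x + 1.5)**2))

# Hand-picked active direction and its complement; in the actual method
# these come from a gradient-based active subspace computation.
w  = np.array([1.0, 0.0])
w2 = np.array([0.0, 1.0])

def marginal_estimate(y, n_inner=32):
    # Unbiased Monte Carlo estimate of the (Gaussian-smoothed) marginal of
    # the target along the active direction, averaging over inactive draws.
    z = rng.standard_normal(n_inner)
    vals = [np.exp(log_target(y*w + zi*w2)) for zi in z]
    return float(np.mean(vals))

def pseudo_marginal_mh(n_steps=2000, step=1.0):
    y, p = 0.0, marginal_estimate(0.0)
    chain = np.empty(n_steps)
    for i in range(n_steps):
        y_new = y + step*rng.standard_normal()
        p_new = marginal_estimate(y_new)
        # Accept/reject using the *estimates*; recycling the current
        # estimate p (rather than re-estimating it) is the pseudo-marginal
        # trick that keeps the chain exact despite the noisy likelihood.
        if rng.random() < p_new/p:
            y, p = y_new, p_new
        chain[i] = y
    return chain

chain = pseudo_marginal_mh()
```

The chain lives entirely in the one active coordinate, which is the dimension reduction the abstract describes; each step costs `n_inner` target evaluations instead of an intractable marginalization.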
A broad symmetry criterion for nonparametric validity of parametrically-based tests in randomized trials
Summary. Pilot phases of a randomized clinical trial often suggest that a parametric model may be an accurate description of the trial's longitudinal trajectories. However, parametric models are often not used for fear that they may invalidate tests of null hypotheses of equality between the experimental groups. Existing work has shown that when, for some types of data, certain parametric models are used, the validity for testing the null is preserved even if the parametric models are incorrect. Here, we provide a broader and easier-to-check characterization of parametric models that can be used to (a) preserve nonparametric validity of testing the null hypothesis, i.e., even when the models are incorrect, and (b) increase power compared to the non- or semiparametric bounds when the models are close to correct. We demonstrate our results in a clinical trial of depression in Alzheimer's patients.
Disparities and Trends in Indoor Exposure to Secondhand Smoke among U.S. Adolescents: 2000-2009
Introduction: Secondhand smoke (SHS) exposure causes disease and death among nonsmokers. With a plethora of smoke-free legislation implemented and a steady decrease in cigarette consumption noted over the past decade in the U.S., this study assessed trends in indoor SHS exposure among U.S. adolescents in grades 6–12 during 2000–2009. Methods: Data were obtained from the 2000–2009 National Youth Tobacco Survey – a national survey of U.S. middle and high school students. SHS exposure within an indoor area within the past seven days was self-reported. Trends in indoor SHS exposure during 2000–2009 were assessed overall and by socio-demographic characteristics, using the Wald test in a binary logistic regression. Within-group comparisons were performed using chi-squared statistics (p<0.05). Results: The proportion of U.S. middle and high school students who were exposed to indoor SHS declined from 65.5% in 2000 to 40.5% in 2009 (p<0.05 for linear trend). Significant declines were also observed across all population subgroups. Between 2000 and 2009, prevalence of indoor SHS exposure declined significantly among both middle (58.5% to 34.3%) and high school (71.5% to 45.4%) students. Prevalence of indoor SHS exposure was significantly higher among girls (44.0% in 2009) compared to boys (37.2% in 2009) during each survey year. Similarly, prevalence of indoor SHS exposure during 2000–2009 was highest among non-Hispanic whites (44.2% in 2009) and lowest among non-Hispanic Asians (30.2% in 2009). During each survey year, prevalence was highest among the oldest age group (≥18 years) and lowest among the youngest (9–11 years). Also, prevalence was significantly higher among current cigarette smokers (83.8% in 2009) compared to nonsmokers (34.0% in 2009). Conclusion: Significant declines in indoor SHS exposure among U.S. middle and high school students occurred during 2000–2009.
While the results are encouraging, additional efforts are needed to further reduce youth indoor SHS exposure.
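The trend analysis described above (a Wald test on the year coefficient of a binary logistic regression) can be sketched with synthetic data. Everything below is an assumption for illustration: the simulated prevalences are loosely modeled on the reported 65.5%-to-40.5% decline, and the fit is a plain Newton-Raphson logistic regression rather than the survey-weighted analysis the study would actually require.

```python
import numpy as np

def fit_logistic(X, y, n_iter=25):
    # Newton-Raphson fit of a logistic regression; returns the coefficient
    # vector and standard errors from the inverse observed information.
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0/(1.0 + np.exp(-X @ beta))
        W = p*(1 - p)
        H = X.T @ (W[:, None]*X)          # observed information matrix
        beta += np.linalg.solve(H, X.T @ (y - p))
    se = np.sqrt(np.diag(np.linalg.inv(H)))
    return beta, se

# Hypothetical survey data: exposure prevalence declining over the decade.
rng = np.random.default_rng(0)
years = rng.integers(0, 10, size=5000)      # 0 = year 2000, ..., 9 = 2009
p_true = 0.655 - 0.025*years                # roughly 65.5% down to ~43%
y = (rng.random(5000) < p_true).astype(float)

X = np.column_stack([np.ones(5000), years])
beta, se = fit_logistic(X, y)
wald_z = beta[1]/se[1]                      # Wald statistic for linear trend
```

A large negative `wald_z` corresponds to the significant declining linear trend the abstract reports.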
Stochastic models which separate fractal dimension and Hurst effect
Fractal behavior and long-range dependence have been observed in an
astonishing number of physical systems. Either phenomenon has been modeled by
self-similar random functions, thereby implying a linear relationship between
fractal dimension, a measure of roughness, and Hurst coefficient, a measure of
long-memory dependence. This letter introduces simple stochastic models which
allow for any combination of fractal dimension and Hurst exponent. We
synthesize images from these models, with arbitrary fractal properties and
power-law correlations, and propose a test for self-similarity.
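The abstract does not name its models, but a well-known covariance family with exactly this decoupling property is the Cauchy class, in which (for a 1D profile) the fractal dimension D = 2 − α/2 and the Hurst coefficient H = 1 − β/2 are governed by separate parameters. A minimal simulation sketch, assuming that family and a simple Cholesky-based Gaussian process sampler:

```python
import numpy as np

def cauchy_cov(h, alpha, beta):
    # Cauchy-type covariance: alpha in (0, 2] controls local roughness
    # (fractal dimension), beta > 0 controls the rate of long-range decay
    # (Hurst effect) -- they can be chosen independently.
    return (1.0 + np.abs(h)**alpha) ** (-beta/alpha)

def simulate_path(n=200, alpha=1.0, beta=0.5, seed=0):
    # Sample a stationary Gaussian process on a grid via Cholesky
    # factorisation of its covariance matrix; a small diagonal jitter
    # keeps the factorisation numerically stable.
    t = np.linspace(0.0, 10.0, n)
    C = cauchy_cov(t[:, None] - t[None, :], alpha, beta)
    L = np.linalg.cholesky(C + 1e-10*np.eye(n))
    return t, L @ np.random.default_rng(seed).standard_normal(n)

t, path = simulate_path()
```

Varying `alpha` with `beta` fixed changes the roughness of the sampled path without altering its long-memory behaviour, which is the separation the letter advertises; a self-similar model would force the two to move together.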
Investing in Mobility: Freight Transport in the Hudson Region
Proposes a framework for assessing alternative investments in freight rail, highway, and transit capacity that would increase the ability to improve mobility and air quality in the New York metropolitan area
Choosing profile double-sampling designs for survival estimation with application to PEPFAR evaluation
Most studies that follow subjects over time are challenged by having some subjects who dropout. Double sampling is a design that selects and devotes resources to intensively pursue and find a subset of these dropouts, then uses data obtained from these to adjust naïve estimates, which are potentially biased by the dropout. Existing methods to estimate survival from double sampling assume a random sample. In limited-resource settings, however, generating accurate estimates using a minimum of resources is important. We propose using double-sampling designs that oversample certain profiles of dropouts as more efficient alternatives to random designs. First, we develop a framework to estimate the survival function under these profile double-sampling designs. We then derive the precision of these designs as a function of the rule for selecting different profiles, in order to identify more efficient designs. We illustrate using data from the United States President's Emergency Plan for AIDS Relief-funded HIV care and treatment program in western Kenya. Our results show why and how more efficient designs should oversample patients with shorter dropout times. Further, our work suggests generalizable practice for more efficient double-sampling designs, which can help maximize efficiency in resource-limited settings
Estimating effects by combining instrumental variables with case-control designs: the role of principal stratification
The instrumental variable framework is commonly used in the estimation of causal effects from cohort samples. In the case of more efficient designs such as the case-control study, however, the combination of the instrumental variable and complex sampling designs requires new methodological consideration. As the prevalence of Mendelian randomization studies is increasing and the cost of genotyping and expression data can be high, the analysis of data gathered from more cost-effective sampling designs is of prime interest. We show that the standard instrumental variable analysis is not applicable to the case-control design and can lead to erroneous estimation and inference. We also propose a method based on principal stratification for the analysis of data arising from the combination of case-control sampling and instrumental variable design and illustrate it with a study in oncology
Management of digital libraries: challenges and opportunities redefining the contemporary information professional’s role
This paper examines digital libraries principally from the management perspective. To ground the discussion, it begins with a comprehensive review of definitions, followed by the basic principles pertaining to digital libraries. It then surveys the wide-ranging reasons why digital libraries are proliferating, predominantly in the developed world but also in a few developing countries, and why such libraries need deliberate management. Core competencies expected of digital librarians are outlined in light of a continuously changing technological environment. The paper stresses the need for a paradigm shift in information management strategies where digital libraries are concerned, a shift that is crucial if information professionals are to succeed in their mission of satisfying evolving user needs. Urgent attention ought to be directed towards the management of digital libraries, as a means of enabling contemporary information professionals to assert their unique role in society, not only as information gatekeepers but as information gateways as well.
A Lanczos Method for Approximating Composite Functions
We seek to approximate a composite function h(x) = g(f(x)) with a global
polynomial. The standard approach chooses points x in the domain of f and
computes h(x) at each point, which requires an evaluation of f and an
evaluation of g. We present a Lanczos-based procedure that implicitly
approximates g with a polynomial of f. By constructing a quadrature rule for
the density function of f, we can approximate h(x) using many fewer evaluations
of g. The savings is particularly dramatic when g is much more expensive than f
or the dimension of x is large. We demonstrate this procedure with two
numerical examples: (i) an exponential function composed with a rational
function and (ii) a Navier-Stokes model of fluid flow with a scalar input
parameter that depends on multiple physical quantities.
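The core mechanism can be sketched in a few lines: build a Gauss quadrature rule for the empirical measure of f by running Lanczos on the diagonal matrix of f-samples, then evaluate g only at the few quadrature nodes. The toy choices below (f(x) = x², g = exp, a uniform input) are illustrative assumptions, not the paper's examples.

```python
import numpy as np

def lanczos_quadrature(f_samples, k):
    # k-point Gauss quadrature for the empirical measure of f, obtained by
    # running k Lanczos steps on diag(f_samples) with a uniform start vector.
    N = len(f_samples)
    v = np.full(N, 1.0/np.sqrt(N))
    v_prev = np.zeros(N)
    beta = 0.0
    alphas, betas = [], []
    for _ in range(k):
        u = f_samples*v - beta*v_prev
        alpha = v @ u
        u -= alpha*v
        beta = np.linalg.norm(u)
        alphas.append(alpha)
        betas.append(beta)
        v_prev, v = v, u/beta
    # Jacobi matrix: eigenvalues give nodes, squared first eigenvector
    # components give weights (Golub-Welsch).
    T = np.diag(alphas) + np.diag(betas[:-1], 1) + np.diag(betas[:-1], -1)
    nodes, vecs = np.linalg.eigh(T)
    return nodes, vecs[0]**2

rng = np.random.default_rng(1)
x = rng.random(1000)                       # samples of the input
f_vals = x**2                              # cheap f, evaluated everywhere
nodes, weights = lanczos_quadrature(f_vals, 8)
approx = np.sum(weights*np.exp(nodes))     # only 8 evaluations of g = exp
exact = np.mean(np.exp(f_vals))            # baseline: 1000 evaluations of g
```

Here `approx` matches the 1000-evaluation average to near machine precision while touching g only eight times, which is where the advertised savings come from when g is the expensive factor.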