
    Exact active subspace Metropolis-Hastings, with applications to the Lorenz-96 system

    We consider the application of active subspaces to inform a Metropolis-Hastings algorithm, thereby aggressively reducing the computational dimension of the sampling problem. We show that the original formulation, as proposed by Constantine, Kent, and Bui-Thanh (SIAM J. Sci. Comput., 38(5):A2779-A2805, 2016), possesses asymptotic bias. Using pseudo-marginal arguments, we develop an asymptotically unbiased variant. Our algorithm is applied to a synthetic multimodal target distribution as well as a Bayesian formulation of a parameter inference problem for a Lorenz-96 system.
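The abstract above builds on the basic Metropolis-Hastings sampler. A minimal random-walk sketch of that baseline is below; the target density, step size, and all numbers are hypothetical, and the active-subspace and pseudo-marginal machinery the paper contributes is entirely omitted.

```python
import math
import random

def log_target(x):
    # Hypothetical 1-D bimodal target: equal mixture of N(2, 1) and N(-2, 1).
    return math.log(0.5 * math.exp(-0.5 * (x - 2.0) ** 2)
                    + 0.5 * math.exp(-0.5 * (x + 2.0) ** 2))

def metropolis_hastings(n_steps, step=1.0, seed=0):
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, pi(proposal) / pi(x));
        # the symmetric Gaussian proposal cancels in the ratio.
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis_hastings(20000)
```

With a well-chosen step size the chain visits both modes; the paper's point is that in high dimensions one can run such a chain in a low-dimensional active subspace instead.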

    A broad symmetry criterion for nonparametric validity of parametrically-based tests in randomized trials

    Summary. Pilot phases of a randomized clinical trial often suggest that a parametric model may be an accurate description of the trial's longitudinal trajectories. However, parametric models are often not used for fear that they may invalidate tests of null hypotheses of equality between the experimental groups. Existing work has shown that when, for some types of data, certain parametric models are used, the validity for testing the null is preserved even if the parametric models are incorrect. Here, we provide a broader and easier-to-check characterization of parametric models that can be used to (a) preserve nonparametric validity of testing the null hypothesis, i.e., even when the models are incorrect, and (b) increase power compared to the non- or semiparametric bounds when the models are close to correct. We demonstrate our results in a clinical trial of depression in Alzheimer's patients.

    Stochastic models which separate fractal dimension and Hurst effect

    Fractal behavior and long-range dependence have been observed in an astonishing number of physical systems. Either phenomenon has been modeled by self-similar random functions, thereby implying a linear relationship between fractal dimension, a measure of roughness, and Hurst coefficient, a measure of long-memory dependence. This letter introduces simple stochastic models which allow for any combination of fractal dimension and Hurst exponent. We synthesize images from these models, with arbitrary fractal properties and power-law correlations, and propose a test for self-similarity.
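One well-known correlation family with the decoupling property this abstract describes is the Cauchy class, in which one parameter controls local roughness and another controls long-range decay. The sketch below is a simplified illustration with hypothetical parameter values, not the paper's full model construction.

```python
def cauchy_correlation(h, alpha, beta):
    """Cauchy-class correlation c(h) = (1 + |h|**alpha) ** (-beta / alpha),
    with 0 < alpha <= 2 and beta > 0."""
    return (1.0 + abs(h) ** alpha) ** (-beta / alpha)

# alpha governs behavior at small lags (fractal dimension / roughness),
# while beta governs the power-law decay at large lags (Hurst-type long
# memory when 0 < beta < 1) -- the two can be chosen independently.
c0 = cauchy_correlation(0.0, 1.5, 0.5)   # correlation at lag zero is 1.0
c1 = cauchy_correlation(1.0, 1.5, 0.5)
c2 = cauchy_correlation(2.0, 1.5, 0.5)
```

Because `alpha` and `beta` enter the formula separately, fractal dimension and Hurst effect are no longer tied together as they are for self-similar models.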

    Investing in Mobility: Freight Transport in the Hudson Region

    Proposes a framework for assessing alternative investments in freight rail, highway, and transit capacity that could improve mobility and air quality in the New York metropolitan area.

    Choosing profile double-sampling designs for survival estimation with application to PEPFAR evaluation

    Most studies that follow subjects over time are challenged by having some subjects who drop out. Double sampling is a design that selects and devotes resources to intensively pursue and find a subset of these dropouts, then uses data obtained from these to adjust naïve estimates, which are potentially biased by the dropout. Existing methods to estimate survival from double sampling assume a random sample. In limited-resource settings, however, generating accurate estimates using a minimum of resources is important. We propose using double-sampling designs that oversample certain profiles of dropouts as more efficient alternatives to random designs. First, we develop a framework to estimate the survival function under these profile double-sampling designs. We then derive the precision of these designs as a function of the rule for selecting different profiles, in order to identify more efficient designs. We illustrate using data from the United States President's Emergency Plan for AIDS Relief-funded HIV care and treatment program in western Kenya. Our results show why and how more efficient designs should oversample patients with shorter dropout times. Further, our work suggests generalizable practice for more efficient double-sampling designs, which can help maximize efficiency in resource-limited settings.
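The core adjustment idea in double sampling can be illustrated with a toy calculation: outcomes observed in the intensively pursued subset of dropouts stand in for all dropouts when correcting the naïve estimate. All counts below are hypothetical, and this simple proportion stands in for the survival-function estimators the paper actually develops.

```python
# Hypothetical cohort: 1000 enrolled, 300 drop out before the endpoint.
n_total, n_dropout = 1000, 300
deaths_followed = 70                  # deaths among the 700 fully-followed

# Double sample: 60 dropouts are intensively pursued; 15 are found to have died.
double_sampled, deaths_ds = 60, 15

# Naive estimate: ignores dropouts entirely (biased if dropouts differ).
naive = deaths_followed / (n_total - n_dropout)

# Adjusted estimate: impute the dropout stratum's death rate from the
# pursued subset, then recombine over the whole cohort.
dropout_rate = deaths_ds / double_sampled
adjusted = (deaths_followed + dropout_rate * n_dropout) / n_total
```

Here the naïve death proportion is 0.10 while the adjusted proportion is 0.145; the paper's contribution is choosing *which* dropout profiles to pursue so this kind of adjustment is as precise as possible per unit of tracing effort.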

    Estimating effects by combining instrumental variables with case-control designs: the role of principal stratification

    The instrumental variable framework is commonly used in the estimation of causal effects from cohort samples. In the case of more efficient designs such as the case-control study, however, the combination of the instrumental variable and complex sampling designs requires new methodological consideration. As the prevalence of Mendelian randomization studies is increasing and the cost of genotyping and expression data can be high, the analysis of data gathered from more cost-effective sampling designs is of prime interest. We show that the standard instrumental variable analysis is not applicable to the case-control design and can lead to erroneous estimation and inference. We also propose a method based on principal stratification for the analysis of data arising from the combination of case-control sampling and instrumental variable design, and illustrate it with a study in oncology.

    Management of digital libraries: challenges and opportunities redefining the contemporary information professional's role

    This paper examines digital libraries principally from the management perspective. To ground the concepts involved, it begins with a comprehensive discussion of definitions, followed by basic principles pertaining to digital libraries. Next, it surveys the wide-ranging reasons why digital libraries are proliferating, predominantly in the developed world but also in a few developing countries, together with the reasons such libraries must be actively managed. Core competencies expected of digital librarians are outlined in light of a continuously changing technological landscape. The paper stresses the need for a paradigm shift in information management strategies where digital libraries are concerned; this is crucial if information professionals are to fulfill their mission of satisfying evolving user needs. Urgent attention ought to be directed toward the management of digital libraries, enabling contemporary information professionals to assert their unique role in society not only as information gatekeepers but as information gateways as well.

    A Lanczos Method for Approximating Composite Functions

    We seek to approximate a composite function h(x) = g(f(x)) with a global polynomial. The standard approach chooses points x in the domain of f and computes h(x) at each point, which requires an evaluation of f and an evaluation of g. We present a Lanczos-based procedure that implicitly approximates g with a polynomial of f. By constructing a quadrature rule for the density function of f, we can approximate h(x) using many fewer evaluations of g. The savings are particularly dramatic when g is much more expensive than f or the dimension of x is large. We demonstrate this procedure with two numerical examples: (i) an exponential function composed with a rational function and (ii) a Navier-Stokes model of fluid flow with a scalar input parameter that depends on multiple physical quantities.
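The economy this abstract describes, that a polynomial surrogate for g over the range of f needs only a handful of g evaluations, reusable at every x, can be sketched with plain Chebyshev interpolation. The functions f and g below are hypothetical placeholders, and Chebyshev interpolation stands in for the paper's Lanczos-based quadrature construction.

```python
import math

def f(x):           # inexpensive inner function (hypothetical example)
    return math.sin(x)

def g(y):           # "expensive" outer function (hypothetical example)
    return math.exp(y)

def chebyshev_interpolant(func, a, b, degree):
    """Polynomial interpolant of func on [a, b] at Chebyshev nodes.
    func is evaluated only degree + 1 times, once per node."""
    nodes = [0.5 * (a + b) + 0.5 * (b - a)
             * math.cos((2 * k + 1) * math.pi / (2 * (degree + 1)))
             for k in range(degree + 1)]
    vals = [func(t) for t in nodes]

    def p(y):
        # Lagrange evaluation; adequate at low degree with distinct nodes.
        total = 0.0
        for j, (tj, vj) in enumerate(zip(nodes, vals)):
            w = 1.0
            for m, tm in enumerate(nodes):
                if m != j:
                    w *= (y - tm) / (tj - tm)
            total += vj * w
        return total

    return p

# 9 evaluations of g build the surrogate; h(x) ~ p(f(x)) thereafter uses none.
p = chebyshev_interpolant(g, -1.0, 1.0, 8)
err = max(abs(p(f(x)) - g(f(x))) for x in [i / 10 for i in range(-30, 31)])
```

The range [-1, 1] is chosen here to cover the image of f; the paper's quadrature rule adapts the nodes to the density of f rather than fixing them in advance.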