
    Generalized Wald-type Tests based on Minimum Density Power Divergence Estimators

    In hypothesis testing, the robustness of the tests is an important concern. Maximum likelihood-based tests are generally the most efficient under standard regularity conditions, but they are highly non-robust even under small deviations from the assumed model. In this paper we propose generalized Wald-type tests based on minimum density power divergence estimators for parametric hypotheses. This method avoids nonparametric density estimation and bandwidth selection. The trade-off between efficiency and robustness is controlled by a tuning parameter β. The asymptotic distributions of the test statistics are chi-square with appropriate degrees of freedom. The performance of the proposed tests is explored through simulations and real data analysis.
    Comment: 26 pages, 10 figures. arXiv admin note: substantial text overlap with arXiv:1403.033
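    As a rough numerical illustration of the minimum density power divergence idea (a sketch, not the paper's code; the integral term uses the closed form available for a normal density), the following estimates a normal mean by grid-minimizing the density power divergence objective. As β approaches 0 the estimator approaches maximum likelihood, while larger β downweights outliers:

```python
import numpy as np

def mdpde_normal_mean(x, beta, sigma=1.0):
    """Grid-search minimum density power divergence estimate of a normal
    mean with known sigma (illustrative sketch).  Larger beta trades a
    little efficiency for robustness to outliers."""
    mus = np.linspace(x.min(), x.max(), 4001)
    # model density at every data point, for every candidate mu
    f = np.exp(-(x[:, None] - mus[None, :]) ** 2 / (2 * sigma**2)) \
        / np.sqrt(2 * np.pi * sigma**2)
    # DPD objective: integral of f^(1+beta) (closed form for the normal)
    # minus (1 + 1/beta) times the sample mean of f^beta
    int_term = (2 * np.pi * sigma**2) ** (-beta / 2) / np.sqrt(1 + beta)
    obj = int_term - (1 + 1 / beta) * np.mean(f**beta, axis=0)
    return mus[np.argmin(obj)]

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 95), rng.normal(10, 1, 5)])  # 5% outliers
mle = x.mean()                           # pulled toward the outliers
robust = mdpde_normal_mean(x, beta=0.5)  # stays near the inlier mean
print(mle, robust)
```

    The contaminated sample mean is dragged toward the outlying cluster, while the β = 0.5 estimate remains close to the bulk of the data, which is the efficiency/robustness trade-off the tuning parameter controls.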

    Social Network Analysis with sna

    Modern social network analysis---the analysis of relational data arising from social systems---is a computationally intensive area of research. Here we provide an overview of a software package that supports a range of network analytic functionality within the R statistical computing environment. General categories of currently supported functionality are described, and brief examples of package syntax and usage are shown.
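    The package operates on adjacency-matrix representations of relational data. As a language-neutral sketch of the most basic functionality it covers (written in Python here rather than the package's R interface), degree centrality on a small directed toy network:

```python
import numpy as np

# Toy directed adjacency matrix for 5 actors: A[i, j] = 1 means
# actor i sends a tie to actor j.
A = np.array([[0, 1, 1, 0, 0],
              [0, 0, 1, 0, 0],
              [1, 0, 0, 1, 0],
              [0, 0, 0, 0, 1],
              [0, 0, 0, 0, 0]])

outdeg = A.sum(axis=1)   # ties sent by each actor (row sums)
indeg = A.sum(axis=0)    # ties received (column sums)
total = outdeg + indeg   # total degree for a digraph
print(total)
```

    Most centrality scores in this family reduce to similar aggregations over the adjacency matrix, which is why matrix representations are the package's natural input format.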

    Mixtures of g-priors in Generalized Linear Models

    Mixtures of Zellner's g-priors have been studied extensively in linear models and have been shown to have numerous desirable properties for Bayesian variable selection and model averaging. Several extensions of g-priors to Generalized Linear Models (GLMs) have been proposed in the literature; however, the choice of prior distribution of g and resulting properties for inference have received considerably less attention. In this paper, we unify mixtures of g-priors in GLMs by assigning the truncated Compound Confluent Hypergeometric (tCCH) distribution to 1/(1 + g), which encompasses as special cases several mixtures of g-priors in the literature, such as the hyper-g, Beta-prime, truncated Gamma, incomplete inverse-Gamma, benchmark, robust, hyper-g/n, and intrinsic priors. Through an integrated Laplace approximation, the posterior distribution of 1/(1 + g) is in turn a tCCH distribution, and approximate marginal likelihoods are thus available analytically, leading to "Compound Hypergeometric Information Criteria" for model selection. We discuss the local geometric properties of the g-prior in GLMs and show how the desiderata for model selection proposed by Bayarri et al., such as asymptotic model selection consistency, intrinsic consistency, and measurement invariance, may be used to justify the prior and specific choices of the hyperparameters. We illustrate inference using these priors and contrast them to other approaches via simulation and real data examples. The methodology is implemented in the R package BAS and freely available on CRAN.
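    As a small numerical illustration (not code from the paper): in the hyper-g special case with hyperparameter a, the prior on u = 1/(1 + g) is Beta(a/2 - 1, 1), equivalently the shrinkage factor g/(1 + g) follows Beta(1, a/2 - 1), so the prior shrinkage implied by a given a can be simulated directly:

```python
import numpy as np

rng = np.random.default_rng(1)
a = 3.0  # hyper-g hyperparameter; 2 < a <= 4 is the usual range

# Hyper-g prior: u = 1/(1+g) ~ Beta(a/2 - 1, 1)
u = rng.beta(a / 2 - 1, 1, size=100_000)
g = (1 - u) / u              # back-transform from u to g
shrink = g / (1 + g)         # shrinkage factor, distributed Beta(1, a/2 - 1)

# Beta(1, a/2 - 1) has mean 1 / (a/2), i.e. 2/3 when a = 3
print(shrink.mean())
```

    Smaller a puts more prior mass on shrinkage factors near 1 (little shrinkage toward the null model), which is one of the hyperparameter trade-offs the paper's desiderata are meant to adjudicate.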

    Contributions to Mediation Analysis and First Principles Modeling for Mechanistic Statistical Analysis

    This thesis contains three projects that propose novel methods for studying mechanisms that explain statistical relationships. The ultimate goal of each of these methods is to help researchers describe how or why complex relationships between observed variables exist. The first project proposes and studies a method for recovering mediation structure in high dimensions. We take a dimension reduction approach that generalizes the "product of coefficients" concept from univariate mediation analysis through the optimization of a loss function. We devise an efficient algorithm for optimizing the product-of-coefficients-inspired loss function. Through extensive simulation studies, we show that the method is capable of consistently identifying mediation structure. Finally, two case studies are presented that demonstrate how the method can be used to conduct multivariate mediation analysis. The second project uses tools from conditional inference to improve the calibration of tests of univariate mediation hypotheses. The key insight of the project is that the non-Euclidean geometry of the null parameter space causes the test statistic's sampling distribution to depend on a nuisance parameter. After identifying a statistic that is both sufficient for the nuisance parameter and approximately ancillary for the parameter of interest, we derive the test statistic's limiting conditional sampling distribution. We additionally develop a non-standard bootstrap procedure for calibration in finite samples. We demonstrate through simulation studies that improved evidence calibration leads to substantial power increases over existing methods. This project suggests that conditional inference might be a useful tool in evidence calibration for other non-standard or otherwise challenging problems. In the last project, we present a methodological contribution to a pharmaceutical science study of in vivo ibuprofen pharmacokinetics. We demonstrate how model misspecification in a first-principles analysis can be addressed by augmenting the model to include a term corresponding to an omitted source of variation. In previously used first-principles models, gastric emptying, which is pulsatile and stochastic, is modeled as first-order diffusion for simplicity. However, analyses suggest that the actual gastric emptying process is expected to be a unimodal smooth function, with phase and amplitude varying by subject. Therefore, we adopt a flexible approach in which a highly idealized parametric version of gastric emptying is combined with a Gaussian process to capture deviations from the idealized form. These functions are characterized by their distributions, which allows us to learn their common and unique features across subjects even though these features are not directly observed. Through simulation studies, we show that the proposed approach is able to identify certain features of latent function distributions.
    PhD thesis, Statistics, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/163026/1/josephdi_1.pd
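    The univariate "product of coefficients" idea that the first project generalizes can be sketched on simulated data (an illustrative Python example, not the thesis's code): the indirect effect of an exposure X on an outcome Y through a mediator M is estimated as the product of the X-to-M coefficient and the M-to-Y coefficient adjusted for X:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
x = rng.normal(size=n)
m = 0.6 * x + rng.normal(size=n)             # mediator model: a = 0.6
y = 0.8 * m + 0.3 * x + rng.normal(size=n)   # outcome: b = 0.8, direct c = 0.3

# a-path: least-squares regression of M on (1, X)
a_hat = np.linalg.lstsq(np.column_stack([np.ones(n), x]), m,
                        rcond=None)[0][1]
# b-path: least-squares regression of Y on (1, M, X)
b_hat = np.linalg.lstsq(np.column_stack([np.ones(n), m, x]), y,
                        rcond=None)[0][1]

indirect = a_hat * b_hat  # product-of-coefficients estimate, truth 0.48
print(indirect)
```

    The non-Euclidean geometry the second project addresses arises from exactly this product: the null hypothesis "a·b = 0" holds on the union of the two coordinate axes in (a, b)-space, so the behavior of the test statistic depends on where on that union the truth lies.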