Sampling from Dirichlet process mixture models with unknown concentration parameter: mixing issues in large data implementations
We consider the question of Markov chain Monte Carlo sampling from a general stick-breaking Dirichlet process mixture model with concentration parameter α. This paper introduces a Gibbs sampling algorithm that combines the slice sampling approach of Walker (Communications in Statistics - Simulation and Computation 36:45-54, 2007) and the retrospective sampling approach of Papaspiliopoulos and Roberts (Biometrika 95(1):169-186, 2008). Our general algorithm is implemented as efficient open-source C++ software, available as an R package, and is based on a blocking strategy similar to that suggested by Papaspiliopoulos (A note on posterior sampling from Dirichlet mixture models, 2008) and implemented by Yau et al. (Journal of the Royal Statistical Society, Series B (Statistical Methodology) 73:37-57, 2011). We discuss the difficulties of achieving good mixing in MCMC samplers of this nature on large data sets and investigate sensitivity to initialisation. We additionally consider the challenges that arise when an extra layer of hierarchy is added so that joint inference is to be made on α. We introduce a new label-switching move and compute the marginal partition posterior to help surmount these difficulties. Our work is illustrated with a profile regression (Molitor et al. Biostatistics 11(3):484-498, 2010) application, where we demonstrate good mixing behaviour for both synthetic and real examples. © 2014 The Author(s)
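The stick-breaking construction mentioned in the abstract can be illustrated with a short sketch. This is not the paper's sampler (which combines slice and retrospective sampling in C++); it is only a minimal NumPy illustration of how the mixture weights depend on the concentration parameter α, with all names chosen here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking_weights(alpha, truncation):
    """Draw mixture weights from a truncated stick-breaking construction.

    Each v_k ~ Beta(1, alpha); w_k = v_k * prod_{j<k} (1 - v_j).
    Larger alpha spreads mass over more components.
    """
    v = rng.beta(1.0, alpha, size=truncation)
    # Mass remaining before breaking the k-th stick.
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return v * remaining

# The weights are non-negative and sum to just under 1 at finite truncation.
w = stick_breaking_weights(alpha=2.0, truncation=50)
```

In a full sampler these weights would index cluster-specific parameters; slice sampling avoids fixing the truncation level in advance by introducing auxiliary uniform variables.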
How are Statistical Journals linked? A Network Analysis
The exploratory analysis developed in this paper relies on the hypothesis that each editor possesses some power in the definition of the editorial policy of her journal. Consequently, if the same scholar sits on the boards of two journals, those journals could have some common elements in their editorial policies. The proximity of the editorial policies of two scientific journals can therefore be assessed by the number of common editors sitting on their boards. A database of all editors of the journals classified as “Statistics & Probability” in the Journal Citation Reports by ISI-Thomson is used. The structure of the network generated by this interlocking editorship is explored using the tools of network analysis. Evidence is found of a very compact network. This is interpreted as the result of a common perspective about the appropriate methods for investigating the problems and constructing the theories in the domain of statistics.
Keywords: Networks; Journals; Editorial boards; Interlocking editorship; Statisticians
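The interlocking-editorship network the abstract describes is a projection of a bipartite editor-journal relation: journals become nodes, and an edge weight counts shared board members. A minimal sketch, with hypothetical journal and editor names (none taken from the paper's database):

```python
from itertools import combinations
from collections import Counter

# Toy rosters (illustrative names): journal -> set of board members.
boards = {
    "J. Stat. A": {"alice", "bob", "carol"},
    "J. Stat. B": {"bob", "carol", "dave"},
    "J. Prob. C": {"eve", "frank"},
}

def interlock_edges(boards):
    """Weighted journal-journal edges: weight = number of shared editors."""
    edges = Counter()
    for (j1, m1), (j2, m2) in combinations(sorted(boards.items()), 2):
        shared = len(m1 & m2)  # editors sitting on both boards
        if shared:
            edges[(j1, j2)] = shared
    return dict(edges)

# Here only "J. Stat. A" and "J. Stat. B" are linked (bob and carol).
edges = interlock_edges(boards)
```

On the resulting weighted graph, standard network-analysis measures (density, components, centrality) can then be computed to assess how compact the journal network is.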
Maximum likelihood estimators from discrete data modeled by mixed fractional Brownian motion with application to the Nordic stock markets
Mixed fractional Brownian motion is a linear combination of Brownian motion and independent fractional Brownian motion that is extensively used for option pricing. Considering the mixed process makes it possible to capture the long-range dependence property that financial time series exhibit. This paper examines the problem of deriving simultaneously the estimators of all the unknown parameters of a model driven by mixed fractional Brownian motion using the maximum likelihood estimation method. The consistency and asymptotic normality properties of these estimators are provided. The performance of the methodology is tested on simulated data sets, and the outcomes illustrate that the maximum likelihood technique is efficient and reliable. An empirical application of the proposed method is also made to real financial data from four Nordic stock market indices.
© 2020 Taylor & Francis Group, LLC. This is an Accepted Manuscript of an article published by Taylor & Francis in Communications in Statistics: Simulation and Computation on 30 May 2020, available online: http://www.tandfonline.com/10.1080/03610918.2020.1764581.
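The mixed process a·B(t) + b·B_H(t) is Gaussian, so a path can be simulated exactly from its covariance. The sketch below is not the paper's estimation procedure; it only shows how simulated data sets of the kind used in the paper's experiments could be generated (function names and defaults are this sketch's own assumptions).

```python
import numpy as np

rng = np.random.default_rng(1)

def mixed_fbm_path(n, H, a=1.0, b=1.0, T=1.0):
    """Sample a mixed fBm path a*B(t) + b*B_H(t) on an even grid of n points.

    Cov of B: min(s, t).  Cov of B_H: (s^2H + t^2H - |t-s|^2H) / 2.
    The independent sum has the sum of the two covariances; we draw from
    the exact Gaussian law via a Cholesky factor (O(n^3), fine for small n).
    """
    t = np.linspace(T / n, T, n)
    s, u = np.meshgrid(t, t)
    cov_bm = np.minimum(s, u)
    cov_fbm = 0.5 * (s**(2 * H) + u**(2 * H) - np.abs(s - u)**(2 * H))
    cov = a**2 * cov_bm + b**2 * cov_fbm
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))  # jitter for stability
    return t, L @ rng.standard_normal(n)

t, x = mixed_fbm_path(n=64, H=0.7)
```

Maximum likelihood estimation of (a, b, H) would then maximize the Gaussian log-likelihood of the observed increments over this parameterized covariance.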
Shrinkage Estimation and Prediction for Joint Type-II Censored Data from Two Burr-XII Populations
The main objective of this paper is to apply linear and pretest shrinkage
estimation techniques to estimating the parameters of two two-parameter Burr-XII
distributions. Furthermore, predictions for future observations are made using
both classical and Bayesian methods within a joint type-II censoring scheme.
The efficiency of the shrinkage estimates is compared to maximum likelihood and
Bayesian estimates obtained through the expectation-maximization algorithm and
the importance sampling method, as developed by Akbari Bargoshadi et al. (2023)
in "Statistical inference under joint type-II censoring data from two Burr-XII
populations", published in Communications in Statistics - Simulation and
Computation. For the Bayesian estimation, both informative and non-informative
prior distributions are considered. Additionally, various loss functions,
including squared error, linear-exponential, and generalized entropy, are taken
into account. Approximate confidence, credible, and highest probability density
intervals are calculated. To evaluate the performance of the estimation
methods, a Monte Carlo simulation study is conducted. Two real
datasets are also utilized to illustrate the proposed methods.
Comment: 33 pages and 33 tables
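The linear shrinkage idea underlying the abstract can be sketched in a few lines: pull a sample-based estimator toward a prior guess with a fixed weight. This is a generic illustration, not the paper's Burr-XII procedure; the exponential example and all names are this sketch's own assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def linear_shrinkage(theta_mle, theta_0, k):
    """Linear shrinkage: pull the MLE toward a prior guess theta_0.

    k in [0, 1] is the shrinkage weight; k = 0 returns the MLE unchanged,
    k = 1 returns the prior guess.
    """
    return k * theta_0 + (1.0 - k) * theta_mle

# Toy illustration with an exponential rate: true rate 2.0, prior guess 1.8.
x = rng.exponential(scale=1.0 / 2.0, size=30)
mle = 1.0 / x.mean()                      # MLE of the rate
shrunk = linear_shrinkage(mle, theta_0=1.8, k=0.3)
```

A pretest shrinkage estimator adds a preliminary hypothesis test on theta_0 and shrinks only when the test does not reject, which is the refinement the paper compares against plain maximum likelihood and Bayesian estimates.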