6 research outputs found

    Bayesian calibration for multiple source regression model

    In a large variety of practical applications, it is reasonable to use information from different sources or different kinds of data. The problem of studying multiple-source data can be represented as a multi-task learning problem, so that information from one source helps in learning from another by extracting a shared common structure. On the other hand, parameter estimates obtained from different sources can be inconsistent and conflicting. This paper proposes a Bayesian approach to calibrate data obtained from different sources and to solve the nonlinear regression problem in the presence of heteroscedasticity in the multiple-source model. An efficient algorithm is developed for implementation. Analytical and simulation studies show that the proposed Bayesian calibration improves the convergence rate of the algorithm and the precision of the model. The theoretical results are supported by a synthetic example and a real-world problem, namely, modelling the unsteady pitching moment coefficient of an aircraft, for which a recurrent neural network is constructed.
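    The abstract does not spell out the calibration procedure. As a minimal illustrative sketch (not the paper's method), Bayesian pooling of per-source estimates of a common parameter, assuming independent Gaussian likelihoods and a conjugate Gaussian prior, reduces to precision weighting:

```python
import numpy as np

def combine_sources(estimates, variances, prior_mean=0.0, prior_var=1e6):
    """Posterior mean and variance of a scalar parameter reported by
    several noisy sources, assuming independent Gaussian likelihoods
    and a Gaussian prior (standard conjugate update)."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    # Posterior precision is the sum of the prior and per-source precisions.
    post_prec = 1.0 / prior_var + np.sum(1.0 / variances)
    post_mean = (prior_mean / prior_var + np.sum(estimates / variances)) / post_prec
    return post_mean, 1.0 / post_prec

# Two sources reporting the same parameter with equal noise levels;
# with a nearly flat prior the posterior mean is their average.
mean, var = combine_sources([1.0, 3.0], [1.0, 1.0])
```

    Conflicting sources with large reported variances are automatically down-weighted, which is the basic mechanism behind calibrating multiple-source estimates.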

    Linear Mixed Models with Marginally Symmetric Nonparametric Random Effects

    Linear mixed models (LMMs) are an important tool in the data analysis of repeated measures and longitudinal studies. The most common form of LMM utilizes a normal distribution to model the random effects. Such an assumption can lead to misspecification errors when the random effects are not normal. One approach to remedy these errors is to utilize a point-mass distribution to model the random effects; this is known as the nonparametric maximum likelihood (NPML) model. The NPML model is flexible but requires a large number of parameters to characterize the random-effects distribution. It is often natural to assume that the random-effects distribution is at least marginally symmetric. The marginally symmetric NPML (MSNPML) random-effects model is introduced, which assumes a marginally symmetric point-mass distribution for the random effects. Under the symmetry assumption, the MSNPML model utilizes half the number of parameters to characterize the same number of point masses as the NPML model; thus the model confers an advantage in economy and parsimony. An EM-type algorithm is presented for the maximum likelihood (ML) estimation of LMMs with MSNPML random effects; the algorithm is shown to monotonically increase the log-likelihood and is proved to converge to a stationary point of the log-likelihood function in the case of convergence. Furthermore, it is shown that the ML estimator is consistent and asymptotically normal under certain conditions, and the estimation of quantities such as the random-effects covariance matrix and individual a posteriori expectations is demonstrated.
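    To make the idea concrete, here is a hedged sketch (a hypothetical illustration, not the paper's general K-point algorithm) of an EM fit for the simplest marginally symmetric point-mass case: a random-intercept model whose random effect takes the values +b and -b with equal probability, so one parameter b characterizes two point masses:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate y_ij = mu + u_i + e_ij, where the random intercept u_i is
# +b or -b with probability 1/2 each (a two-point marginally symmetric
# distribution) and e_ij is Gaussian noise.
m, n = 200, 5                                  # groups, obs per group
mu_true, b_true, sigma_true = 1.0, 2.0, 0.5
u = b_true * rng.choice([-1.0, 1.0], size=m)
y = mu_true + u[:, None] + sigma_true * rng.standard_normal((m, n))

# EM for (mu, b, sigma^2) with conditional M-step updates.
mu, b, s2 = y.mean(), y.mean(axis=1).std(), y.var()
for _ in range(200):
    # E-step: posterior probability that u_i = +b, from the residual
    # sums of squares under each sign of the random effect.
    ssp = ((y - mu - b) ** 2).sum(axis=1)      # if u_i = +b
    ssm = ((y - mu + b) ** 2).sum(axis=1)      # if u_i = -b
    w = 1.0 / (1.0 + np.exp(np.clip((ssp - ssm) / (2 * s2), -700, 700)))
    # M-step: closed-form updates given the responsibilities w.
    ybar = y.mean(axis=1)
    mu = (y.sum() - b * n * (2 * w - 1).sum()) / y.size
    b = (n * (2 * w - 1) * (ybar - mu)).sum() / y.size
    s2 = (w * ((y - mu - b) ** 2).sum(axis=1)
          + (1 - w) * ((y - mu + b) ** 2).sum(axis=1)).sum() / y.size
```

    The symmetry assumption is what lets a single parameter b place two point masses; the paper's MSNPML model generalizes this economy to larger numbers of support points.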

    A multivariate linear regression analysis using finite mixtures of t distributions

    Recently, finite mixture models have been used to model the distribution of the error terms in multivariate linear regression analysis. In particular, Gaussian mixture models have been employed. A novel approach that assumes that the error terms follow a finite mixture of t distributions is introduced. This assumption allows for an extension of multivariate linear regression models, making these models more versatile and robust against the presence of outliers in the error term distribution. The issues of model identifiability and maximum likelihood estimation are addressed. In particular, identifiability conditions are provided and an Expectation–Maximisation algorithm for estimating the model parameters is developed. Properties of the estimators of the regression coefficients are evaluated through Monte Carlo experiments and compared to the estimators from the Gaussian mixture models. Results from the analysis of two real datasets are presented.
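    The robustness mechanism can be sketched in the single-component special case (a hypothetical univariate illustration, not the paper's finite-mixture model): regression with Student-t errors fitted by EM via the scale-mixture representation of the t distribution, where the E-step weight (nu + 1) / (nu + r^2) down-weights large residuals:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a linear regression and contaminate a few responses with
# large outliers that would bias ordinary least squares.
n, nu = 300, 4.0                               # sample size, fixed t dof
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + 0.5 * rng.standard_normal(n)
y[:15] += 10.0                                 # 5% gross outliers

# EM for t-distributed errors: iteratively reweighted least squares.
beta = np.linalg.lstsq(X, y, rcond=None)[0]
s2 = np.mean((y - X @ beta) ** 2)
for _ in range(100):
    r2 = (y - X @ beta) ** 2 / s2
    w = (nu + 1.0) / (nu + r2)                 # E-step: outliers get tiny w
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)   # weighted LS
    s2 = np.sum(w * (y - X @ beta) ** 2) / n
```

    A mixture of such t components, as in the paper, additionally lets the error distribution be multimodal while retaining this per-observation down-weighting.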