
    The Hachemeister Regression Model

    In this article we obtain, as in the classical credibility model, a credibility solution in the form of a linear combination of the individual estimate (based on the data of a particular state) and the collective estimate (based on aggregate USA data). Mathematics Subject Classification: 62P05. Keywords: linearized regression credibility premium, structural parameters, unbiased estimators.
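    The core idea above, a premium that linearly blends an individual and a collective estimate, can be sketched as follows. This is a hypothetical scalar illustration with assumed names; it is not the paper's full Hachemeister regression model, which replaces the scalar means with regression estimates.

```python
# Hypothetical sketch of a credibility-weighted premium.  The scalar
# form and the function name are illustrative assumptions, not the
# Hachemeister regression model itself.

def credibility_premium(individual_estimate, collective_estimate, z):
    """Linear combination of the individual and collective estimates.

    z is the credibility factor in [0, 1]: z = 1 trusts only the
    state's own data, z = 0 trusts only the aggregate USA data.
    """
    if not 0.0 <= z <= 1.0:
        raise ValueError("credibility factor z must lie in [0, 1]")
    return z * individual_estimate + (1.0 - z) * collective_estimate

# A state averaging 120 with credibility 0.7 against a collective
# average of 100 gets premium 0.7 * 120 + 0.3 * 100 = 114.
premium = credibility_premium(120.0, 100.0, 0.7)
```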

    Developing Multi Linear Regression Models for Estimation of Marshall Stability

    Asphalt roads are exposed to increasing traffic loads. Obtaining a high-quality, durable asphalt road surface is important given the conditions of our country, where freight and passenger transport are carried out by road. One of the most important issues in asphalt road design is determining the optimum bitumen percentage, for which the Marshall stability test is used. In this work, Multi Linear Regression (MLR) models are developed as an alternative to the long and laborious Marshall test procedure; the models predict the Marshall stability from previously measured test parameters. These parameters are the bitumen penetration (P), the weight of the sample in air (H), the temperature (C), the bitumen weight (G), the sample height (Y), the bitumen percentage (W), the weight of the sample in water (S), and the stability (ST) as the predicted quantity. Model performance is evaluated with the correlation coefficient (R), the mean percentage error (MPE), and the mean square error (MSE). The best-performing model is a six-variable MLR model, with R = 0.571, MSE = 14841.81, and MPE = 9.58.
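    The workflow the abstract describes, fitting a multi-linear regression and scoring it with R, MPE, and MSE, can be sketched as below. The synthetic data, the six-predictor layout, and the absolute-value form of MPE are assumptions for illustration; the paper's actual data set and exact error definitions are not reproduced.

```python
import numpy as np

# Hedged sketch: ordinary least squares on six synthetic predictors
# (standing in for e.g. P, H, C, G, Y, W), scored with the three
# measures named in the abstract.  Data are simulated, not the
# paper's Marshall test measurements.

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 6))
true_coefs = np.array([2.0, -1.0, 0.5, 0.0, 1.5, -0.5])
y = X @ true_coefs + 50.0 + rng.normal(scale=1.0, size=n)  # stability ST

# Least-squares fit with an intercept column.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ beta

# Performance measures from the abstract (MPE here uses absolute
# percentage errors -- an assumption about the exact definition).
r = np.corrcoef(y, y_hat)[0, 1]                 # correlation coefficient R
mse = np.mean((y - y_hat) ** 2)                 # mean square error
mpe = np.mean(np.abs((y - y_hat) / y)) * 100.0  # mean percentage error
```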

    The Infinite Hierarchical Factor Regression Model

    We propose a nonparametric Bayesian factor regression model that accounts for uncertainty in the number of factors and the relationships between factors. To accomplish this, we propose a sparse variant of the Indian Buffet Process and couple it with a hierarchical model over factors based on Kingman's coalescent. We apply this model to two problems (factor analysis and factor regression) in gene-expression data analysis.
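    The Indian Buffet Process mentioned above is a prior over binary feature (factor-loading) matrices with an unbounded number of columns. The sketch below samples from the standard IBP; the paper's sparse variant and the coalescent hierarchy over factors are not reproduced here.

```python
import numpy as np

def sample_ibp(n_customers, alpha, rng):
    """Sample a binary feature matrix from the standard Indian
    Buffet Process (customers = rows, dishes = latent features).

    This is the plain IBP, used only to illustrate the prior the
    paper builds its sparse variant on.
    """
    counts = []  # m_k: how many customers have taken dish k
    rows = []
    for i in range(n_customers):
        # Existing dishes are taken with probability m_k / (i + 1).
        row = [rng.random() < m / (i + 1) for m in counts]
        for k, taken in enumerate(row):
            if taken:
                counts[k] += 1
        # The customer also samples Poisson(alpha / (i + 1)) new dishes.
        n_new = rng.poisson(alpha / (i + 1))
        counts.extend([1] * n_new)
        row.extend([True] * n_new)
        rows.append(row)
    K = len(counts)  # realised number of features
    return np.array([r + [False] * (K - len(r)) for r in rows], dtype=bool)

Z = sample_ibp(20, 2.0, np.random.default_rng(1))
```

The number of columns in `Z` is random, which is exactly how the model expresses uncertainty in the number of factors.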

    Component Selection in the Additive Regression Model

    Similar to variable selection in the linear regression model, selecting significant components in the popular additive regression model is of great interest. However, such components are unknown smooth functions of independent variables, which are unobservable, so some approximation is needed. In this paper, we suggest a combination of penalized regression spline approximation and group variable selection, called the lasso-type spline method (LSM), to handle this component selection problem with a diverging number of strongly correlated variables in each group. It is shown that the proposed method can select significant components and estimate the nonparametric additive component functions simultaneously, with an optimal convergence rate. To make the LSM stable in computation and able to adapt its estimators to the level of smoothness of the component functions, weighted power spline bases and projected weighted power spline bases are proposed. Their performance is examined in simulation studies across two set-ups, with independent predictors and with correlated predictors, and appears superior to that of competing methods. The proposed method is extended to a partial linear regression model analysis with real data, and gives reliable results.