The Combined Forecasts Using the Akaike Weights
The focus of the paper is on the information criteria approach, especially the Akaike information criterion, which is used to obtain the Akaike weights. This approach yields not a single best model but several plausible models, which can be ranked using the Akaike weights. This set of candidate models is the basis for calculating individual forecasts and then for combining forecasts using the Akaike weights. A procedure for obtaining combined forecasts using the AIC weights is proposed. The performance of combining forecasts with the AIC weights and with equal weights is compared, in a simulation experiment, with that of individual forecasts obtained from models selected by the AIC criterion and by the a posteriori selection method. The conditions under which the Akaike weights are worth using in combining forecasts are indicated. The use of the information criteria approach to obtain combined forecasts is recommended as an alternative to formal hypothesis testing.
Keywords: combining forecasts, weighting schemes, information criteria.
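As a rough sketch of the weighting scheme described in the abstract (not the paper's own code), the Akaike weights follow from the AIC values of the candidate models as w_i = exp(-delta_i/2) / sum_j exp(-delta_j/2), where delta_i = AIC_i - min_j AIC_j, and the combined forecast is the weight-averaged individual forecast. Function names and the numbers below are illustrative.

```python
import numpy as np

def akaike_weights(aic_values):
    """Convert AIC values of candidate models into Akaike weights."""
    aic = np.asarray(aic_values, dtype=float)
    delta = aic - aic.min()                 # AIC differences relative to the best model
    rel_likelihood = np.exp(-0.5 * delta)   # relative likelihood of each model
    return rel_likelihood / rel_likelihood.sum()

def combine_forecasts(forecasts, aic_values):
    """Weight-average individual point forecasts using the Akaike weights."""
    weights = akaike_weights(aic_values)
    return float(np.dot(weights, np.asarray(forecasts, dtype=float)))

# Illustrative values: three candidate models, their AICs and point forecasts
aics = [102.3, 103.1, 107.8]
point_forecasts = [54.2, 55.0, 52.7]
print(akaike_weights(aics))                       # ranking of the plausible models
print(combine_forecasts(point_forecasts, aics))   # AIC-weighted combined forecast
```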
Akaike's Information Criterion (AIC) Untuk Seleksi Optimal Pada Model Neural Network = Akaike's Information Criterion (AIC) For The Optimal Selection Of Neural Network Models
ABSTRACT
During the last twenty years, Akaike's Information Criterion (AIC) has had a fundamental impact on statistical model evaluation problems. This paper studies the general theory of Akaike's Information Criterion (AIC) to determine the optimal architecture of a neural network model. Neural networks have been used to resolve a variety of classification problems. The computational properties of many of the possible network designs have been analyzed, but the decision as to which of several competing network architectures is "best" for a given problem remains subjective.
A relationship between the optimal neural network and statistical model identification is described. A derivative of Akaike's Information Criterion (AIC) is given.
Key words: neural network, Multi-Layered Perceptrons, Maximum Likelihood, Kullback-Leibler Information, Entropy, Akaike's Information Criterion
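The abstract does not spell out the criterion's exact form; a common version for a network fitted under a squared-error (Gaussian) assumption is AIC = n ln(SSE/n) + 2k, with k the number of free weights and biases. The sketch below, with illustrative names and made-up SSE values, compares candidate feed-forward architectures on that basis.

```python
import numpy as np

def n_parameters(layer_sizes):
    """Count the weights and biases of a fully connected feed-forward network."""
    return sum(layer_sizes[i] * layer_sizes[i + 1] + layer_sizes[i + 1]
               for i in range(len(layer_sizes) - 1))

def aic_gaussian(sse, n_obs, n_params):
    """AIC for a model with Gaussian errors, up to an additive constant."""
    return n_obs * np.log(sse / n_obs) + 2 * n_params

# Illustrative comparison of two candidate architectures; the SSE values stand in
# for errors obtained by training each network on the same data.
candidates = {
    (4, 3, 1): 12.8,   # layer sizes -> sum of squared errors
    (4, 8, 1): 11.9,
}
n_obs = 200
scores = {arch: aic_gaussian(sse, n_obs, n_parameters(arch))
          for arch, sse in candidates.items()}
best = min(scores, key=scores.get)
print(scores, "-> preferred architecture:", best)
```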
Principal Component Regression Analysis of CO2 Emission
In this study, a principal component regression (PCR) model is developed for predicting and forecasting the abundance of CO2 emissions, the most important greenhouse gas in the atmosphere contributing to global warming. The model was compared with a supervised principal component regression (SPCR) model and was found to have greater predictive power, judging by the Akaike information criterion (AIC) and Schwarz information criterion (SIC) values of the models.
Keywords: Global warming, CO2, Principal component regression (PCR), Supervised principal component regression (SPCR), Akaike information criterion (AIC), Schwarz information criterion (SIC)
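A minimal sketch of the modelling approach described above, using synthetic data in place of the emission series (this is not the study's code): principal component regression projects the predictors onto their leading principal components and regresses the response on those scores, and competing component counts can then be compared by their AIC and SIC values.

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Principal component regression: PCA on X, then OLS on the leading scores."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T               # projections onto leading PCs
    Z = np.column_stack([np.ones(len(y)), scores])  # add an intercept column
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return beta, y - Z @ beta

def aic_sic(resid, n_params):
    """Gaussian AIC and SIC (Schwarz/BIC), up to additive constants."""
    n = len(resid)
    rss = float(resid @ resid)
    return n * np.log(rss / n) + 2 * n_params, n * np.log(rss / n) + n_params * np.log(n)

# Synthetic stand-in for the predictor matrix and the CO2 series
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 5))
y = X @ np.array([1.0, 0.5, 0.0, 0.0, 0.0]) + rng.normal(scale=0.3, size=120)
for k in (1, 2, 3):
    _, resid = pcr_fit(X, y, k)
    print(k, aic_sic(resid, n_params=k + 1))  # lower values indicate the preferred model
```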
Model selection in continuous test norming with GAMLSS
To compute norms from reference group test scores, continuous norming is preferred over traditional norming. A suitable continuous norming approach for continuous data is the Box–Cox Power Exponential model, which is part of the generalized additive models for location, scale, and shape (GAMLSS) framework. Applying the Box–Cox Power Exponential model to test norming requires model selection, but it is unknown how well this can be done with an automatic selection procedure. In a simulation study, we compared the performance of two stepwise model selection procedures combined with four model-fit criteria (Akaike information criterion, Bayesian information criterion, generalized Akaike information criterion (3), cross-validation), varying data complexity, sampling design, and sample size in a fully crossed design. The new procedure combined with one of the generalized Akaike information criteria was the most efficient model selection procedure (i.e., it required the smallest sample size). The advocated model selection procedure is illustrated with norming data from an intelligence test.
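The model-fit criteria mentioned above can all be written as a generalized Akaike information criterion, GAIC(k) = -2 log-likelihood + k * df, where k = 2 gives the AIC, k = log(n) the BIC, and k = 3 the GAIC(3) variant. The study itself works within the GAMLSS framework, so the sketch below, with hypothetical fitted-model values, only illustrates how the criteria rank candidate models.

```python
import numpy as np

def gaic(log_likelihood, effective_df, penalty):
    """Generalized Akaike information criterion: -2*loglik + penalty*df."""
    return -2.0 * log_likelihood + penalty * effective_df

# Hypothetical fitted BCPE candidate models: (log-likelihood, effective df)
candidates = {"simple": (-1540.2, 8.0), "flexible": (-1528.7, 16.5)}
n = 500  # assumed norming sample size
for name, (loglik, df) in candidates.items():
    print(name,
          "AIC", round(gaic(loglik, df, 2), 1),
          "BIC", round(gaic(loglik, df, np.log(n)), 1),
          "GAIC(3)", round(gaic(loglik, df, 3), 1))
```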
Information criteria for astrophysical model selection
Model selection is the problem of distinguishing competing models, perhaps featuring different numbers of parameters. The statistics literature contains two distinct sets of tools: those based on information theory, such as the Akaike Information Criterion (AIC), and those based on Bayesian inference, such as the Bayesian evidence and the Bayesian Information Criterion (BIC). The Deviance Information Criterion combines ideas from both heritages; it is readily computed from Monte Carlo posterior samples and, unlike the AIC and BIC, allows for parameter degeneracy. I describe the properties of the information criteria and, as an example, compute them from WMAP3 data for several cosmological models. I find that at present the information theory and Bayesian approaches give significantly different conclusions from that data.
Comment: 5 pages, no figures. Update to match version accepted by MNRAS Letters. Extra references, minor changes to discussion, no change to conclusion.
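As a hedged illustration of how the three criteria relate (not the paper's code), AIC = -2 ln L_max + 2k and BIC = -2 ln L_max + k ln N use the maximized likelihood, while the Deviance Information Criterion can be estimated from posterior samples as DIC = (mean deviance) + p_D, with p_D = (mean deviance) - (deviance at the posterior mean) and deviance D = -2 ln L. The toy Gaussian likelihood and sample values below are assumptions made only to give a runnable example.

```python
import numpy as np

def aic(max_loglike, n_params):
    return -2.0 * max_loglike + 2.0 * n_params

def bic(max_loglike, n_params, n_data):
    return -2.0 * max_loglike + n_params * np.log(n_data)

def dic(loglike_fn, posterior_samples):
    """DIC from Monte Carlo posterior samples: mean deviance plus the effective
    number of parameters p_D = mean deviance - deviance at the posterior mean."""
    deviances = np.array([-2.0 * loglike_fn(theta) for theta in posterior_samples])
    mean_dev = deviances.mean()
    dev_at_mean = -2.0 * loglike_fn(posterior_samples.mean(axis=0))
    return mean_dev + (mean_dev - dev_at_mean)

# Toy example: unit-variance Gaussian likelihood with one free parameter (the mean)
data = np.array([0.9, 1.1, 1.3, 0.7, 1.0])
loglike = lambda theta: -0.5 * np.sum((data - theta[0]) ** 2)  # up to a constant
samples = np.random.default_rng(1).normal(data.mean(), 0.4, size=(2000, 1))
print(aic(loglike([data.mean()]), 1),
      bic(loglike([data.mean()]), 1, len(data)),
      dic(loglike, samples))
```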