8 research outputs found

    Equivalent number of degrees of freedom for neural networks.

    The notion of an equivalent number of degrees of freedom (e.d.f.) to be used in neural network modeling from small datasets was introduced in Ingrassia and Morlini (2005). It is much smaller than the total number of parameters and does not depend on the number of input variables. We generalize our previous results and discuss the use of the e.d.f. in the general framework of multivariate nonparametric model selection. Through numerical simulations, we also investigate the behavior of model selection criteria such as AIC, GCV and BIC/SBC when the e.d.f. is used instead of the total number of adaptive parameters in the model.
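    As an illustration of how an e.d.f. slots into these criteria, the sketch below computes Gaussian-error forms of AIC, BIC/SBC and GCV with an e.d.f. value in place of the raw parameter count. The formulas are the standard ones; the e.d.f. value itself (the 7.3 here is hypothetical) would come from the authors' estimator, which the abstract does not reproduce.

        import numpy as np

        def selection_criteria(residuals, edf):
            # Gaussian-error forms of AIC, BIC/SBC and GCV, with the
            # equivalent degrees of freedom (edf) replacing the raw
            # parameter count k in the complexity penalty.
            n = len(residuals)
            rss = float(np.sum(np.asarray(residuals) ** 2))
            sigma2 = rss / n                             # ML noise variance estimate
            aic = n * np.log(sigma2) + 2 * edf           # Akaike information criterion
            bic = n * np.log(sigma2) + edf * np.log(n)   # Schwarz/Bayesian criterion
            gcv = sigma2 / (1.0 - edf / n) ** 2          # generalized cross-validation
            return {"AIC": aic, "BIC": bic, "GCV": gcv}

        # Hypothetical comparison: the same residuals penalized by a small
        # e.d.f. (7.3) versus the network's raw parameter count (61).
        rng = np.random.default_rng(0)
        res = rng.normal(scale=0.5, size=100)
        print(selection_criteria(res, edf=7.3))
        print(selection_criteria(res, edf=61.0))

    With a small sample, penalizing by the raw parameter count inflates all three criteria and pushes selection toward underfitted models, which is the practical motivation for using the much smaller e.d.f.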

    Improving prediction interval quality: a genetic algorithm-based method applied to neural networks

    The delta technique has been proposed in the literature for constructing prediction intervals for targets estimated by neural networks. The quality of intervals constructed with this technique depends strongly on the characteristics of the neural network. Unfortunately, the literature offers no guidance on how these dependencies can be managed to optimize prediction intervals. This study attempts to optimize the length and coverage probability of prediction intervals by modifying the structure and parameters of the underlying neural networks. In an evolutionary optimization, a genetic algorithm is applied to find optimal values of the network size and training hyperparameters. The applicability and efficiency of the proposed optimization technique are examined and demonstrated using a real case study. It is shown that applying the proposed optimization technique significantly improves the quality of the constructed prediction intervals in terms of length and coverage probability.
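    For context, the sketch below shows the standard delta-technique interval (point prediction plus a t-quantile times a variance term built from the Jacobian of the network outputs with respect to its weights), together with a hypothetical GA fitness that trades interval width against a coverage shortfall penalty. The function names and the eta penalty weight are illustrative assumptions; the abstract does not give the paper's exact objective.

        import numpy as np
        from scipy import stats

        def delta_prediction_interval(y_hat, grad, jacobian, residuals, alpha=0.05):
            # Standard delta-technique interval for one query point:
            #   y_hat +/- t_{1-alpha/2, n-p} * s * sqrt(1 + g'(J'J)^{-1} g)
            # grad:     gradient of the output w.r.t. the p weights, shape (p,)
            # jacobian: Jacobian of the n training predictions, shape (n, p)
            n, p = jacobian.shape
            s2 = float(residuals @ residuals) / (n - p)   # noise variance estimate
            jtj_inv = np.linalg.pinv(jacobian.T @ jacobian)
            var_term = 1.0 + grad @ jtj_inv @ grad
            half = stats.t.ppf(1 - alpha / 2, n - p) * np.sqrt(s2 * var_term)
            return y_hat - half, y_hat + half

        def interval_fitness(lowers, uppers, y_true, nominal=0.95, eta=50.0):
            # Hypothetical GA objective (smaller is better): mean interval
            # width, inflated exponentially when empirical coverage falls
            # below the nominal level (a soft constraint on coverage).
            covered = np.mean((y_true >= lowers) & (y_true <= uppers))
            width = np.mean(uppers - lowers)
            return width * np.exp(eta * max(0.0, nominal - covered))

    A GA in this setting would evolve candidate network sizes and training hyperparameters, retrain each candidate, compute its intervals on a validation set, and rank it by a fitness of this kind.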