
    Discrete and fuzzy dynamical genetic programming in the XCSF learning classifier system

    A number of representation schemes have been presented for use within learning classifier systems, ranging from binary encodings to neural networks. This paper presents results from an investigation into using discrete and fuzzy dynamical system representations within the XCSF learning classifier system. In particular, asynchronous random Boolean networks are used to represent the traditional condition-action production system rules in the discrete case, and asynchronous fuzzy logic networks in the continuous-valued case. It is shown that self-adaptive, open-ended evolution can be used to design an ensemble of such dynamical systems within XCSF to solve a number of well-known test problems.
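    To make the discrete representation concrete, the sketch below (an illustration, not the paper's implementation) builds a small asynchronous random Boolean network: each node reads K randomly chosen inputs through a random Boolean lookup table, and nodes are updated one at a time in random order so that later updates see the new values of earlier ones. The network size, connectivity, and update schedule are assumptions chosen for brevity.

        import random

        # Minimal sketch of an asynchronous random Boolean network (RBN):
        # every node reads K randomly chosen inputs through a random Boolean
        # lookup table, and nodes are updated one at a time in random order.
        # N, K, and the update schedule are illustrative assumptions.
        N, K = 8, 2
        random.seed(0)
        inputs = [[random.randrange(N) for _ in range(K)] for _ in range(N)]
        tables = [[random.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]
        state = [random.randint(0, 1) for _ in range(N)]

        def async_step(state):
            # Asynchronous update: visit nodes in a random order, writing each
            # new value in place so later nodes see the updated earlier ones.
            for node in random.sample(range(N), N):
                idx = 0
                for src in inputs[node]:
                    idx = (idx << 1) | state[src]   # encode the input pattern as a table index
                state[node] = tables[node][idx]
            return state

        for _ in range(5):
            state = async_step(state)
        print(state)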

    Comparison of modelling techniques for milk-production forecasting

    The objective of this study was to assess the suitability of 3 different modeling techniques for the prediction of total daily herd milk yield from a herd of 140 lactating pasture-based dairy cows over varying forecast horizons. A nonlinear auto-regressive model with exogenous input, a static artificial neural network, and a multiple linear regression model were developed using 3 yr of historical milk-production data. The models predicted the total daily herd milk yield over a full season using a 305-d forecast horizon and 50-, 30-, and 10-d moving piecewise horizons to test the accuracy of the models over long- and short-term periods. All 3 models predicted the daily production levels for a full lactation of 305 d with a percentage root mean square error (RMSE) of ≤12.03%. However, the nonlinear auto-regressive model with exogenous input was capable of increasing its prediction accuracy as the horizon was shortened from 305 to 50, 30, and 10 d [RMSE (%) = 8.59, 8.1, 6.77, 5.84], whereas the static artificial neural network [RMSE (%) = 12.03, 12.15, 11.74, 10.7] and the multiple linear regression model [RMSE (%) = 10.62, 10.68, 10.62, 10.54] were not able to reduce their forecast error over the same horizons to the same extent. For this particular application the nonlinear auto-regressive model with exogenous input can be presented as a more accurate alternative to conventional regression modeling techniques, especially for short-term milk-yield predictions.
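    The percentage RMSE used to compare the models can be sketched as follows. The snippet assumes the common convention of normalising the RMSE by the mean observed yield, which may differ in detail from the paper's exact definition, and the daily yield figures are invented purely for illustration.

        import math

        def rmse_percent(actual, predicted):
            # Percentage RMSE: root mean square error normalised by the mean
            # observed value (one common convention; the paper's definition
            # may differ in detail).
            n = len(actual)
            rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
            return 100.0 * rmse / (sum(actual) / n)

        # Toy daily herd-yield figures (litres), invented for illustration only.
        actual = [2100, 2150, 2200, 2180, 2160]
        predicted = [2050, 2170, 2230, 2150, 2120]
        print(f"RMSE = {rmse_percent(actual, predicted):.2f}%")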

    Feature-based time-series analysis

    This work presents an introduction to feature-based time-series analysis. The time series as a data type is first described, along with an overview of the interdisciplinary time-series analysis literature. I then summarize the range of feature-based representations for time series that have been developed to aid interpretable insights into time-series structure. Particular emphasis is given to emerging research that facilitates wide comparison of feature-based representations that allow us to understand the properties of a time-series dataset that make it suited to a particular feature-based representation or analysis algorithm. The future of time-series analysis is likely to embrace approaches that exploit machine learning methods to partially automate human learning to aid understanding of the complex dynamical patterns in the time series we measure from the world.
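    To make the idea of a feature-based representation concrete, the short sketch below (an illustration, not code from the paper) maps a time series to a three-dimensional feature vector of mean, standard deviation, and lag-1 autocorrelation, so that very different series can be compared in a common feature space; the feature sets surveyed in the paper are far larger.

        import math

        def features(x):
            # Map a series to a small, interpretable feature vector:
            # mean, standard deviation and lag-1 autocorrelation. Real
            # feature sets are far larger; these three are for illustration.
            n = len(x)
            mean = sum(x) / n
            var = sum((v - mean) ** 2 for v in x) / n
            acf1 = (sum((x[t] - mean) * (x[t + 1] - mean) for t in range(n - 1))
                    / (n * var)) if var > 0 else 0.0
            return {"mean": mean, "std": math.sqrt(var), "acf1": acf1}

        # Two very different toy series become points in the same 3-D feature
        # space, which is what allows comparison across large collections.
        print(features([1, 2, 3, 4, 5, 4, 3, 2, 1, 2, 3, 4]))
        print(features([5, 1, 4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))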

    Supervised classification and mathematical optimization

    Data Mining techniques often ask for the resolution of optimization problems. Supervised Classification, and, in particular, Support Vector Machines, can be seen as a paradigmatic instance. In this paper, some links between Mathematical Optimization methods and Supervised Classification are emphasized. It is shown that many different areas of Mathematical Optimization play a central role in off-the-shelf Supervised Classification methods. Moreover, Mathematical Optimization turns out to be extremely useful to address important issues in Classification, such as identifying relevant variables, improving the interpretability of classifiers or dealing with vagueness/noise in the data. [Funding: Ministerio de Ciencia e Innovación; Junta de Andalucía]
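    As a concrete illustration of the link the abstract draws between classification and optimization, the sketch below fits a linear soft-margin SVM by subgradient descent on the regularised hinge loss. This is a didactic sketch rather than any method from the paper; the toy data, step size, and regularisation constant are arbitrary illustrative choices.

        def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
            # Soft-margin linear SVM fitted by subgradient descent on the
            # regularised hinge loss: (lam/2)*||w||^2 + mean(max(0, 1 - y*(w.x + b))).
            # A didactic sketch of "classification as optimization", not a production solver.
            d = len(X[0])
            w, b = [0.0] * d, 0.0
            for _ in range(epochs):
                for xi, yi in zip(X, y):
                    margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
                    for j in range(d):
                        # The regulariser always contributes; the hinge term only when margin < 1.
                        hinge = -yi * xi[j] if margin < 1 else 0.0
                        w[j] -= lr * (lam * w[j] + hinge)
                    if margin < 1:
                        b += lr * yi
            return w, b

        # Toy 2-D data: label +1 roughly above the line x1 + x2 = 1, else -1.
        X = [[0.0, 0.2], [0.3, 0.1], [0.2, 0.3], [0.9, 0.8], [1.0, 1.2], [1.1, 0.7]]
        y = [-1, -1, -1, 1, 1, 1]
        w, b = train_linear_svm(X, y)
        print("w =", w, "b =", b)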
