Tikhonov regularization as a complexity measure in multiobjective genetic programming
In this paper, we propose the use of Tikhonov regularization, in conjunction with node count, as a general complexity measure in multiobjective genetic programming. We demonstrate that employing this general complexity measure yields mean squared test errors over a range of regression problems that are typically superior to those obtained with conventional node count alone (and never statistically worse). We also analyze why the new method outperforms the conventional complexity measure and conclude that it forms a decision mechanism that balances both syntactic and semantic information.
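As an illustration of how such a measure might be computed, the sketch below approximates a first-order Tikhonov functional (the mean squared norm of a model's numerical gradient over the training inputs) and pairs it with node count as a complexity objective. The function names, the finite-difference approximation, and the way the two quantities are combined are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def tikhonov_complexity(model, X, eps=1e-4):
    """Approximate a first-order Tikhonov functional: the mean squared
    norm of the model's numerical gradient over the training inputs.
    `model` is a callable mapping an input vector to a scalar prediction."""
    grads = []
    for x in X:
        g = np.array([
            (model(x + eps * e) - model(x - eps * e)) / (2 * eps)
            for e in np.eye(len(x))
        ])
        grads.append(np.dot(g, g))
    return float(np.mean(grads))

def mogp_objectives(model, node_count, X, y):
    """Objectives for a multiobjective GP selection step: training MSE
    plus a complexity pair combining syntactic (node count) and
    semantic (Tikhonov) information. How the pair is aggregated is a
    design choice left open here."""
    preds = np.array([model(x) for x in X])
    mse = float(np.mean((preds - y) ** 2))
    return mse, (node_count, tikhonov_complexity(model, X))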
Training genetic programming classifiers by vicinal-risk minimization
We propose and motivate the use of vicinal-risk minimization (VRM) for training genetic programming classifiers. We demonstrate that VRM has a number of attractive properties and that it correlates better with generalization error than empirical risk minimization (ERM), so it is more likely to lead to better generalization performance. From the results of statistical tests over a range of real and synthetic datasets, we further demonstrate that VRM yields consistently superior generalization errors compared to conventional ERM.
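A minimal sketch of the contrast between ERM and VRM, assuming Gaussian vicinity functions estimated by Monte Carlo sampling; the loss function, vicinity width sigma, and sample count are illustrative choices rather than the paper's exact setup.

import numpy as np

def empirical_risk(f, X, y, loss):
    """Standard ERM: average loss evaluated on the training points themselves."""
    return float(np.mean([loss(f(x), t) for x, t in zip(X, y)]))

def vicinal_risk(f, X, y, loss, sigma=0.1, n_samples=20, rng=None):
    """VRM with Gaussian vicinity functions: each training point is replaced
    by a local distribution N(x_i, sigma^2 I) and the loss is averaged over
    Monte Carlo draws from those vicinities."""
    rng = rng or np.random.default_rng(0)
    total = 0.0
    for x, t in zip(X, y):
        draws = rng.normal(loc=x, scale=sigma, size=(n_samples, len(x)))
        total += np.mean([loss(f(z), t) for z in draws])
    return total / len(X)

# Example loss for a classifier whose output sign gives the class label.
hinge = lambda score, label: max(0.0, 1.0 - label * score)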
Learning from life-logging data by hybrid HMM: a case study on active states prediction
In this paper, we propose employing a hybrid classifier-hidden Markov model (HMM) as a supervised learning approach to recognizing daily active states from sequential life-logging data collected from wearable sensors. We generate synthetic data from the real dataset to cope with noise and incompleteness during training and, in conjunction with the HMM, propose using a multiobjective genetic programming (MOGP) classifier in comparison with a support vector machine (SVM) with various kernels. We demonstrate that the system works effectively with either algorithm to recognize personal active states with respect to the medical reference. We also show that MOGP generally yields better results than the SVM without requiring an ad hoc kernel.
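The temporal smoothing step of such a hybrid can be sketched as a Viterbi decode over per-window classifier outputs, assuming the classifier's state posteriors serve as emission scores and that the transition matrix and priors are estimated separately; this is an illustrative reading of a classifier-HMM hybrid, not the paper's exact pipeline.

import numpy as np

def viterbi_decode(posteriors, transitions, priors):
    """Smooth per-window classifier posteriors into the most likely sequence
    of active states using HMM dynamics.
    posteriors: (T, K) array, classifier probability of state k at window t.
    transitions: (K, K) state transition matrix; priors: (K,) initial probs."""
    T, K = posteriors.shape
    log_post = np.log(posteriors + 1e-12)
    log_trans = np.log(transitions + 1e-12)
    delta = np.log(priors + 1e-12) + log_post[0]
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + log_trans      # scores[i, j]: best path ending i -> j
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + log_post[t]
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]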
The use of an analytic quotient operator in genetic programming
We propose replacing the division operator used in genetic programming with an analytic quotient operator. We demonstrate that this analytic quotient operator systematically yields lower mean squared errors over a range of regression tasks, principally because it removes the discontinuities or singularities that often result from either protected or unprotected division. Further, the analytic quotient operator is differentiable. We also show that the new operator stabilizes the variance of intermediate quantities in the tree.
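The analytic quotient is commonly defined as AQ(a, b) = a / sqrt(1 + b^2), which behaves like ordinary division for large |b| but is smooth, bounded, and differentiable everywhere, with no singularity at b = 0. A minimal sketch contrasting it with conventional protected division follows; the fallback value used for protected division is an illustrative convention.

import math

def protected_division(a, b, fallback=1.0):
    """Conventional protected division: returns a fixed value when the
    denominator is zero, leaving a discontinuity near b = 0."""
    return a / b if b != 0 else fallback

def analytic_quotient(a, b):
    """Analytic quotient: approximates a/b for large |b| but is defined
    and differentiable for every b, including b = 0."""
    return a / math.sqrt(1.0 + b * b)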
