
    Performance of a localized tree splitting criterion in tree averaging

    This paper explores the performance of the local splitting criterion devised by Bremner & Taplin for classification and regression trees when multiple trees are averaged to improve performance. The criterion is compared with the deviance used in Clark & Pregibon's method, a global splitting criterion typically used to grow trees. The paper considers multiple trees generated by randomly selecting splits with probability proportional to the likelihood for the split, and by bagging, where bootstrap samples from the data are used to grow trees. For six datasets, the superiority of the localized splitting criterion often persists when multiple trees are grown and averaged. Tree averaging is known to be advantageous when the trees being averaged produce different predictions, and this can be achieved by choosing splits where the splitting criterion is locally optimal. The paper shows that the use of locally optimal splits gives promising results in conjunction with both local and global splitting criteria, and with and without random selection of splits. The paper also extends the local splitting criterion to accommodate categorical predictors.
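    As a rough illustration of the tree-averaging framework described above (and not of the Bremner & Taplin criterion itself, which off-the-shelf libraries do not implement), the Python sketch below grows standard regression trees on bootstrap resamples and averages their predictions; the function name and parameters are placeholders.

        import numpy as np
        from sklearn.tree import DecisionTreeRegressor

        def bagged_tree_predictions(X, y, X_new, n_trees=50, seed=0):
            # Average the predictions of trees grown on bootstrap resamples of (X, y).
            rng = np.random.default_rng(seed)
            n = len(y)
            preds = np.zeros((n_trees, len(X_new)))
            for t in range(n_trees):
                idx = rng.integers(0, n, size=n)              # bootstrap sample, drawn with replacement
                tree = DecisionTreeRegressor(random_state=seed + t)  # standard global splitting criterion
                tree.fit(X[idx], y[idx])
                preds[t] = tree.predict(X_new)
            return preds.mean(axis=0)                         # tree-averaged prediction

    Randomly selecting splits in proportion to their likelihood, as the paper does, would require a custom tree grower; the sketch only shows the bagging-and-averaging step.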

    Performance of localized regression tree splitting criteria on data with discontinuities

    Properties of the localized regression tree splitting criterion described in Bremner & Taplin (2002), referred to as the BT method, are explored in this paper and compared with those of Clark & Pregibon's (1992) criterion (the CP method). These properties indicate why the BT method can result in superior trees. The paper shows that the BT method exhibits a weak bias towards edge splits, whereas the CP method exhibits a strong bias towards central splits in the presence of main effects. A third criterion, called the SM method, which exhibits no bias towards a particular split position, is introduced. The SM method is a modification of the BT method that uses more symmetric local means. Because of their relatively low bias towards particular split positions, the BT and SM methods are more likely than the CP method to split at a discontinuity. The paper shows that the BT and SM methods can be used to discover discontinuities in the data, and that they offer a way of producing a variety of different trees for examination or for tree-averaging methods.
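    The published BT and SM formulas are not reproduced in this abstract, so the following Python sketch is only a toy contrast, under assumed definitions, between a global CP-style score (residual sum of squares about whole-side means) and a localized score that looks only at the k observations adjacent to a candidate split; it merely illustrates why a local score reacts to a discontinuity wherever it falls in the node.

        import numpy as np

        def cp_style_score(y, i):
            # Global criterion: reduction in residual sum of squares when the node
            # is split at position i and each side is fitted by its overall mean.
            left, right = y[:i], y[i:]
            total = np.sum((y - y.mean()) ** 2)
            within = np.sum((left - left.mean()) ** 2) + np.sum((right - right.mean()) ** 2)
            return total - within

        def local_style_score(y, i, k=3):
            # Stand-in for a localized criterion: judge the split only by the
            # observations adjacent to it, so a local jump scores highly
            # regardless of where it sits in the node.
            left, right = y[max(0, i - k):i], y[i:i + k]
            return (left.mean() - right.mean()) ** 2

        y = np.array([1.0, 1.1, 0.9, 1.0, 5.1, 5.0, 4.9, 5.2])  # jump between positions 4 and 5
        best_global = max(range(1, len(y)), key=lambda i: cp_style_score(y, i))
        best_local = max(range(1, len(y)), key=lambda i: local_style_score(y, i))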

    Modified classification and regression tree splitting criteria for data with interactions

    This paper proposes modified splitting criteria for classification and regression trees by redefining the deviance. The modified deviance is based on local averaging instead of global averaging and is more successful at modelling data with interactions. The paper shows that the modified criteria result in much simpler trees for pure interaction data (no main effects) and can produce trees with fewer errors and lower residual mean deviances than those produced by Clark & Pregibon's (1992) method when applied to real datasets with strong interaction effects.
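    As a hedged sketch of the idea of replacing global averaging with local averaging in the deviance (the exact modified definition is not given in this abstract), the Python fragment below contrasts the usual node deviance with a hypothetical locally averaged version; the neighbourhood size k and the ordering by a single predictor x are illustrative assumptions.

        import numpy as np

        def global_deviance(y):
            # Standard node deviance: squared deviations from the single node mean.
            return float(np.sum((y - y.mean()) ** 2))

        def locally_averaged_deviance(y, x, k=5):
            # Illustrative modified deviance: each observation is compared with a
            # local average of its neighbours (ordered by the predictor x) rather
            # than the global node mean, so a node whose response varies with x
            # (e.g. under a pure interaction) is not penalised as heavily.
            order = np.argsort(x)
            ys = y[order]
            half = k // 2
            dev = 0.0
            for j in range(len(ys)):
                lo, hi = max(0, j - half), min(len(ys), j + half + 1)
                dev += (ys[j] - ys[lo:hi].mean()) ** 2
            return float(dev)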

    Region of eye contact of humanoid Nao robot is similar to that of a human

    Eye contact is an important social cue in human-human interaction, but it is unclear how readily it carries over to humanoid robots. In this study we investigated whether the tolerance for making eye contact with the Nao robot is similar to that with human lookers. We measured the region of eye contact (REC) in three conditions (sitting, standing and eye height) and found that the REC of the Nao robot is similar to that of human lookers. We also compared the centre of the REC with the robot's gaze direction when looking straight at the observer's nose bridge, and found that the nose bridge lies slightly above the computed centre of the REC. This can be interpreted as a downward asymmetry of the REC. Taken together, these results enable us to model eye contact and the tolerance for breaking eye contact with the Nao robot.
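    A minimal sketch of how such a model might look, assuming the REC is approximated by an ellipse of gaze angles centred slightly below the nose bridge; all numeric extents below are placeholders rather than values reported by the study.

        def is_eye_contact(gaze_h_deg, gaze_v_deg,
                           half_width_deg=8.0, half_height_deg=6.0,
                           centre_offset_deg=-1.0):
            # Toy elliptical model of a region of eye contact (REC): gaze angles
            # are measured relative to the observer's nose bridge, and the ellipse
            # centre sits slightly below it to reflect the reported downward
            # asymmetry. The extents and offset are made-up placeholder values.
            h = gaze_h_deg / half_width_deg
            v = (gaze_v_deg - centre_offset_deg) / half_height_deg
            return h ** 2 + v ** 2 <= 1.0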