    Subgroup Preference Neural Network.

    Subgroup label ranking, a new problem in preference learning, aims to rank groups of labels using a single ranking model. This paper introduces the Subgroup Preference Neural Network (SGPNN), which combines multiple networks with different activation functions, learning rates, and output layers into one artificial neural network (ANN) to discover the hidden relations between the subgroups' multi-labels. The SGPNN is a feedforward (FF), partially connected network with a single middle layer that uses a stairstep (SS) multi-valued activation function to enhance the prediction probability and accelerate ranking convergence. The novel structure of the proposed SGPNN places a multi-activation function neuron (MAFN) in the middle layer to rank each subgroup independently. The SGPNN uses gradient ascent to maximize the Spearman ranking correlation between the groups of labels. Each label is represented by an output neuron with a single SS function. The proposed SGPNN, trained on a conjoint dataset, outperforms other label ranking methods that use each dataset individually: it achieves an average accuracy of 91.4% on the conjoint dataset, compared with supervised clustering, decision trees, multilayer perceptron label ranking, and label ranking forests, which achieve average accuracies of 60%, 84.8%, 69.2%, and 73%, respectively, on the individual datasets.
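
    The abstract above rests on two ingredients: a stairstep multi-valued activation and a Spearman-correlation objective maximized by gradient ascent. The sketch below illustrates only those two ideas; the step count, the quantisation rule, the function names, and the toy forward pass are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: a multi-valued stairstep (SS) activation and a
# Spearman-correlation objective of the kind the SGPNN abstract describes.
# Step count, quantisation rule, and toy data are assumptions, not the paper's.
import numpy as np
from scipy.stats import spearmanr

def stairstep(x, n_steps=4):
    """Quantise activations into n_steps discrete levels in [0, 1]."""
    return np.clip(np.floor(x * n_steps) / (n_steps - 1), 0.0, 1.0)

def spearman_objective(predicted_scores, true_ranks):
    """Quantity to maximise by gradient ascent: rank correlation between
    predicted label scores and the ground-truth label ranking."""
    rho, _ = spearmanr(predicted_scores, true_ranks)
    return rho

# Toy forward pass for one subgroup of three labels.
x = np.array([0.2, 0.8, 0.5])                    # input features
w = np.array([[0.9, 0.1, 0.4],                   # weights to 3 output neurons
              [0.2, 0.7, 0.3],
              [0.5, 0.6, 0.8]])
scores = stairstep(x @ w)                        # discrete-valued label scores
print(scores, spearman_objective(scores, np.array([1, 3, 2])))
```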

    Preference Neural Network

    This paper proposes a preference neural network (PNN) to address the problem of indifference preference orders using a new activation function. PNN also solves the multi-label ranking problem, where labels may have indifference preference orders or subgroups may be equally ranked. PNN follows a multi-layer feedforward architecture with fully connected neurons. Each neuron contains a novel smooth stairstep activation function whose shape depends on the number of preference orders. PNN inputs represent data features and output neurons represent label indexes. The proposed PNN is evaluated on a new preference mining dataset that contains repeated label values, which has not been experimented with before. PNN outperforms five previously proposed methods for strict label ranking in terms of accuracy, with high computational efficiency.
    Comment: The current content is inappropriate and needs to be comprehensively reviewed again.
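
    The PNN abstract above hinges on a smooth stairstep activation whose number of plateaus matches the number of preference orders. Below is a minimal sketch of one common way to build such a function, as a sum of shifted sigmoids; the step spacing, steepness, and function name are assumptions for illustration, not the paper's exact definition.

```python
# Hedged sketch of a "smooth stairstep" activation of the kind the PNN
# abstract describes: a sum of shifted sigmoids yields one smooth plateau per
# preference order. Step spacing and steepness below are assumptions.
import numpy as np

def smooth_stairstep(x, n_orders=3, steepness=10.0):
    """Smooth staircase with n_orders plateaus, differentiable everywhere
    so it can be trained with standard gradient-based updates."""
    steps = np.arange(1, n_orders)            # step locations 1, ..., n_orders-1
    return sum(1.0 / (1.0 + np.exp(-steepness * (x - s))) for s in steps)

# The output plateaus near 0, 1, ..., n_orders-1: one level per preference order.
xs = np.linspace(0, 3, 7)
print(np.round(smooth_stairstep(xs), 3))
```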

    What Socrates Began: An Examination of Intellect Vol. 1

    Walter E. Russell Endowed Chair in Philosophy and Education Symposium, 1988
    https://digitalcommons.usm.maine.edu/facbooks/1237/thumbnail.jp