    Energy efficient cosine similarity measures according to a convex cost function

    We propose a new family of vector similarity measures. Each measure is associated with a convex cost function. Given two vectors, we determine the surface normals of the convex function at those vectors; the angle between the two surface normals is the similarity measure. The convex cost function can be the negative entropy function, the total variation (TV) function, or a filtered variation function constructed from wavelets. The convex cost functions need not be differentiable everywhere. In general, we compute the gradient of the cost function to obtain the surface normals. If the gradient does not exist at a given vector, the sub-gradients can be used, and the normal producing the smallest angle between the two surface normals is chosen to compute the similarity measure. The proposed measures are compared experimentally to other nonlinear similarity measures and to the ordinary cosine similarity measure. The TV-based vector product is more energy efficient than the ordinary inner product because it does not require any multiplications. © 2016, Springer-Verlag London
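
    A minimal sketch of the construction described above, assuming the negative-entropy cost f(v) = sum_i v_i log v_i (gradient log v_i + 1) and, as a baseline, the quadratic cost 0.5*||v||^2 whose gradient is v (which recovers ordinary cosine similarity); function names are illustrative, not the paper's.

    import numpy as np

    def neg_entropy_grad(v, eps=1e-12):
        # Gradient of the negative-entropy cost f(v) = sum(v_i * log(v_i)).
        return np.log(np.maximum(v, eps)) + 1.0

    def convex_cost_similarity(x, y, grad=neg_entropy_grad):
        # Cosine of the angle between the surface normals (gradients) at x and y.
        gx, gy = grad(x), grad(y)
        return float(gx @ gy / (np.linalg.norm(gx) * np.linalg.norm(gy)))

    x = np.array([0.2, 0.3, 0.5])
    y = np.array([0.1, 0.4, 0.5])
    print(convex_cost_similarity(x, y))                    # negative-entropy measure
    print(convex_cost_similarity(x, y, grad=lambda v: v))  # ordinary cosine similarity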

    Multiplication-free Neural Networks [Çarpmasız Yapay Sinir Ağı]

    In this article, a multiplication-free artificial neural network (ANN) structure is proposed. Inner products between the input vectors and the ANN weights are implemented using a multiplication-free vector operator. Training of the new artificial neural network structure is carried out using the sign-LMS algorithm. The proposed ANN system can be used in applications that require low power consumption or that run on microprocessors with limited processing power. © 2015 IEEE
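
    As a rough illustration only (the paper's exact operator and update rule may differ), the sketch below uses an assumed multiplication-free product that combines sign agreement with absolute-value additions, and a sign-LMS style weight update of the assumed sign-error form w <- w + mu * sign(e) * x; mf_dot and train_sign_lms are hypothetical names.

    import numpy as np

    def mf_dot(x, w):
        # Assumed multiplication-free "inner product": sign(x_i)*sign(w_i)*(|x_i|+|w_i|),
        # summed over components. The sign product is only a sign comparison, so the
        # operator needs sign checks and additions rather than multiplications.
        return float(np.sum(np.sign(x) * np.sign(w) * (np.abs(x) + np.abs(w))))

    def train_sign_lms(X, d, mu=0.01, epochs=20):
        # Adapt a single neuron's weights with the (assumed) sign-error LMS update.
        w = np.zeros(X.shape[1])
        for _ in range(epochs):
            for x, target in zip(X, d):
                e = target - mf_dot(x, w)
                w += mu * np.sign(e) * x
        return w

    # Toy usage: fit one multiplication-free neuron to targets generated by mf_dot.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 4))
    d = np.array([mf_dot(x, np.array([0.5, -1.0, 0.3, 0.8])) for x in X])
    print(train_sign_lms(X, d))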