
    Logistic Regression: Tight Bounds for Stochastic and Online Optimization

    The logistic loss function is often advocated in machine learning and statistics as a smooth and strictly convex surrogate for the 0-1 loss. In this paper we investigate whether these smoothness and convexity properties make the logistic loss preferable to other widely considered options such as the hinge loss. We show that, in contrast to known asymptotic bounds, as long as the number of prediction/optimization iterations is sub-exponential, the logistic loss provides no improvement over a generic non-smooth loss function such as the hinge loss. In particular, we show that the convergence rate of stochastic logistic optimization is bounded from below by a polynomial in the diameter of the decision set and the number of prediction iterations, and we provide a matching tight upper bound. This resolves the COLT open problem of McMahan and Streeter (2012).
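
    To make the comparison concrete, here is a minimal Python sketch of the two surrogate losses in the standard margin formulation z = y·⟨w, x⟩; the function names and sample margins are illustrative and not taken from the paper.

    ```python
    # Minimal sketch, assuming the usual margin formulation z = y * <w, x>.
    # Illustrative only; not the paper's experimental setup.
    import numpy as np

    def logistic_loss(z):
        # Smooth, strictly convex surrogate for the 0-1 loss.
        return np.log1p(np.exp(-z))

    def hinge_loss(z):
        # Non-smooth surrogate: convex, but not differentiable at z = 1.
        return np.maximum(0.0, 1.0 - z)

    for z in np.linspace(-2.0, 3.0, 6):
        print(f"z={z:+.2f}  logistic={logistic_loss(z):.4f}  hinge={hinge_loss(z):.4f}")
    ```

    The smoothness visible in logistic_loss is exactly the property whose optimization benefit the paper shows to vanish for sub-exponentially many iterations.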

    Distance-Dependent Kronecker Graphs for Modeling Social Networks

    This paper focuses on a generalization of stochastic Kronecker graphs, introducing a Kronecker-like operator and defining a family of generator matrices H that depend on distances between nodes in a specified graph embedding. We prove that any lattice-based network model with sufficiently small distance-dependent connection probability has a Poisson degree distribution, and we provide a general framework for proving searchability of such networks. Using this framework, we focus on the specific example of an expanding hypercube and discuss the similarities and differences between such a model and recently proposed network models based on a hidden metric space. We also prove that a greedy forwarding algorithm can find very short paths, of length O((log log n)^2), on the hypercube with n nodes, demonstrating that distance-dependent Kronecker graphs can generate searchable network models.
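
    For intuition about the greedy forwarding rule itself, here is a minimal Python sketch on a plain d-dimensional hypercube, where each node is a d-bit string adjacent to the d nodes at Hamming distance 1. The paper's distance-dependent Kronecker links, which are what shorten greedy paths to O((log log n)^2), are not modeled here; all names are illustrative.

    ```python
    # Minimal sketch of greedy forwarding on a plain d-dimensional hypercube.
    # The paper's model adds long-range, distance-dependent Kronecker links;
    # this sketch shows only the greedy rule on the base topology.

    def hamming(a, b):
        # Number of bit positions where the two node labels differ.
        return bin(a ^ b).count("1")

    def greedy_forward(source, target, d):
        """Repeatedly forward to the neighbor closest to the target."""
        path = [source]
        current = source
        while current != target:
            # Neighbors of `current` are obtained by flipping one of d bits.
            neighbors = [current ^ (1 << i) for i in range(d)]
            current = min(neighbors, key=lambda v: hamming(v, target))
            path.append(current)
        return path

    d = 4  # hypercube with n = 2**d = 16 nodes
    path = greedy_forward(0b0000, 0b1011, d)
    print([format(v, "04b") for v in path])  # each hop flips one differing bit
    ```

    On the bare hypercube this rule takes exactly Hamming-distance many hops; the point of the distance-dependent construction is that added shortcut edges let the same greedy rule do far better.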