
    Kirchhoff Index As a Measure of Edge Centrality in Weighted Networks: Nearly Linear Time Algorithms

    Most previous work on centrality focuses on metrics of vertex importance and methods for identifying influential vertices; related work for edges is much scarcer, especially for weighted networks, due to the computational challenge. In this paper, we propose to use the well-known Kirchhoff index as a measure of edge centrality in weighted networks, called θ-Kirchhoff edge centrality. The Kirchhoff index of a network is defined as the sum of effective resistances over all vertex pairs. The centrality of an edge e is reflected in the increase of the Kirchhoff index of the network when the edge e is partially deactivated, characterized by a parameter θ. We define two equivalent measures for θ-Kirchhoff edge centrality. Both are global metrics and have better discriminating power than commonly used measures based on local or partial structural information of networks, e.g., edge betweenness and spanning edge centrality. Despite the strong advantages of the Kirchhoff index as a centrality measure and its wide applications, computing the exact value of Kirchhoff edge centrality for every edge in a graph is computationally demanding. To solve this problem, for each of the θ-Kirchhoff edge centrality metrics, we present an efficient algorithm that computes its ε-approximation for all m edges in time nearly linear in m. The proposed θ-Kirchhoff edge centrality is the first global metric of edge importance that can be provably approximated in nearly linear time. Moreover, based on the θ-Kirchhoff edge centrality, we present a θ-Kirchhoff vertex centrality measure, as well as a fast algorithm that computes an ε-approximate Kirchhoff vertex centrality for all n vertices in time nearly linear in m.
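
    To illustrate the definitions above, the following is a minimal brute-force sketch in Python (NumPy). It computes the Kirchhoff index via the pseudoinverse of the weighted Laplacian, using the identity Kf(G) = n · tr(L⁺), and a hypothetical edge centrality in which "partial deactivation" is modelled as scaling the edge weight by (1 − θ); the paper's exact parameterization may differ, and this O(n³) dense computation is only for intuition, not the nearly linear time algorithms the paper develops.

        import numpy as np

        def kirchhoff_index(W):
            """Kirchhoff index of a weighted graph with weight matrix W:
            Kf(G) = n * tr(pinv(L)), where L is the weighted Laplacian.
            This equals the sum of effective resistances over all pairs."""
            n = W.shape[0]
            L = np.diag(W.sum(axis=1)) - W          # weighted Laplacian
            return n * np.trace(np.linalg.pinv(L))  # Kf = n * tr(L^+)

        def edge_centrality(W, u, v, theta=0.5):
            """Hypothetical theta-Kirchhoff edge centrality: the increase
            in the Kirchhoff index when edge (u, v) is partially
            deactivated, here modelled as scaling its weight by
            (1 - theta). Assumption: the paper's rule may differ."""
            W2 = W.copy()
            W2[u, v] *= (1 - theta)
            W2[v, u] *= (1 - theta)
            return kirchhoff_index(W2) - kirchhoff_index(W)

        # Toy weighted graph: a triangle (0, 1, 2) with pendant vertex 3.
        W = np.array([[0., 2., 1., 0.],
                      [2., 0., 1., 0.],
                      [1., 1., 0., 3.],
                      [0., 0., 3., 0.]])
        for (u, v) in [(0, 1), (0, 2), (2, 3)]:
            print((u, v), edge_centrality(W, u, v))

    On this toy graph one would expect the bridge edge (2, 3) to score highest: weakening the only path to the pendant vertex raises the effective resistance of every pair involving it, while the triangle edges have alternative paths.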

    Convergence of the Exponentiated Gradient Method with Armijo Line Search

    Consider the problem of minimizing a convex differentiable function on the probability simplex, the spectrahedron, or the set of quantum density matrices. We prove that the exponentiated gradient method with Armijo line search always converges to the optimum, provided the sequence of iterates possesses a strictly positive limit point (element-wise for the vector case, and with respect to the Löwner partial ordering for the matrix case). To the best of our knowledge, this is the first convergence result for a mirror descent-type method that only requires differentiability. The proof exploits the self-concordant likeness of the log-partition function, which is of independent interest.
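
    To make the method concrete, here is a minimal sketch of the vector (probability simplex) case in Python (NumPy). The multiplicative update x⁺ ∝ x · exp(−η ∇f(x)) is the standard exponentiated gradient step; the backtracking loop shown enforces the usual Armijo sufficient-decrease condition f(x⁺) ≤ f(x) + c ⟨∇f(x), x⁺ − x⟩, which may differ in detail from the rule analyzed in the paper. The objective and data in the usage example are made up for illustration.

        import numpy as np

        def eg_step(x, g, eta):
            """One exponentiated gradient step on the probability simplex."""
            y = x * np.exp(-eta * g)
            return y / y.sum()

        def eg_armijo(f, grad, x0, eta0=1.0, rho=0.5, c=1e-4, iters=100):
            """Exponentiated gradient with backtracking (Armijo) line search.
            Minimal sketch; the paper's precise line-search rule may differ."""
            x = x0
            for _ in range(iters):
                g = grad(x)
                eta = eta0
                while True:
                    x_new = eg_step(x, g, eta)
                    # Armijo sufficient-decrease test along the EG path.
                    if f(x_new) <= f(x) + c * g.dot(x_new - x) or eta < 1e-12:
                        break
                    eta *= rho  # shrink the step and retry
                x = x_new
            return x

        # Example: minimize f(x) = 0.5 * ||A x - b||^2 over the simplex
        # (A and b are illustrative random data, not from the paper).
        rng = np.random.default_rng(0)
        A, b = rng.standard_normal((5, 3)), rng.standard_normal(5)
        f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
        grad = lambda x: A.T @ (A @ x - b)
        x = eg_armijo(f, grad, np.ones(3) / 3)
        print(x, f(x))

    Note that starting from a strictly positive point keeps every iterate in the relative interior of the simplex, since the multiplicative update never zeroes a coordinate; this is where the abstract's assumption of a strictly positive limit point connects to the algorithm.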