    Some inequalities involving the distance signless Laplacian eigenvalues of graphs

    Given a simple connected graph G, the distance signless Laplacian DQ(G) = Tr(G) + D(G) is the sum of the vertex transmission matrix Tr(G) (the diagonal matrix of vertex transmissions) and the distance matrix D(G). In this paper, exploiting the symmetry of DQ(G), we obtain novel sharp bounds on the distance signless Laplacian eigenvalues of G, and in particular on the distance signless Laplacian spectral radius. The bounds are expressed in terms of the graph diameter, vertex covering number, edge covering number, clique number, independence number, domination number, and extremal transmission degrees. The graphs attaining the corresponding bounds are characterized. In addition, we investigate the distance signless Laplacian spectrum of the Indu-Bala product, the Cartesian product, and the extended double cover graph.
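
    To make the definition above concrete, here is a minimal computational sketch (not from the paper; it assumes networkx and numpy are available) that builds DQ(G) = Tr(G) + D(G) for a small example graph and reads off its spectral radius.

        import networkx as nx
        import numpy as np

        # Example graph: the 5-cycle C_5 (any connected graph works here).
        G = nx.cycle_graph(5)

        # D(G): matrix of shortest-path distances between all vertex pairs.
        D = nx.floyd_warshall_numpy(G)

        # Tr(G): diagonal matrix of vertex transmissions (row sums of D).
        Tr = np.diag(D.sum(axis=1))

        # Distance signless Laplacian and its spectrum; DQ is symmetric,
        # so eigvalsh returns real eigenvalues.
        DQ = Tr + D
        eigenvalues = np.linalg.eigvalsh(DQ)
        spectral_radius = eigenvalues.max()
        print(np.sort(eigenvalues)[::-1], spectral_radius)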

    The additive-multiplicative distance matrix of a graph, and a novel third invariant

    Graham showed, with Pollak and with Hoffman-Hosoya, that for any directed graph G with strong blocks G_e, the determinant det(D_G) and cofactor-sum cof(D_G) of the distance matrix D_G can be computed from the same quantities for the blocks G_e. This was extended to trees - and in our recent work to any graph - with multiplicative and q-distance matrices. For trees, we went further and unified all previous variants, with weights in a unital commutative ring, into a distance matrix with additive and multiplicative edge-data. In this work: (1) We introduce the additive-multiplicative distance matrix D_G of every strongly connected graph G, using what we term the additive-multiplicative block-datum 𝒢. This subsumes the previously studied additive, multiplicative, and q-distances for all graphs. (2) We introduce an invariant κ(D_G) that seems novel to date, and use it to show "master" Graham-Hoffman-Hosoya (GHH) identities, which express det(D_G) and cof(D_G) in terms of the blocks G_e. We show how these imply all previous variants. (3) We show that det(.), cof(.), and κ(.) depend only on the block-data, not just for D_G but also for several minors of D_G. This was not studied in any setting to date; we show it in the "most general" additive-multiplicative setting, hence in all known settings. (4) We compute D_G^{-1} in closed form; this specializes to all known variants. In particular, we recover our previous formula for D_T^{-1} for additive-multiplicative trees (which itself specializes to a result of Graham-Lovasz and answers a 2006 question of Bapat-Lal-Pati). (5) We also show that not the Laplacian, but a closely related matrix, is the "correct" one to use in D_G^{-1} - for the most general additive-multiplicative matrix D_G of each G. As examples, we compute in closed form det(D_G), cof(D_G), κ(D_G), and D_G^{-1} for hypertrees.
    Comment: 27 pages, LaTeX
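
    As a point of reference for the unweighted case, the following sketch (mine, not from the paper; networkx and numpy assumed) computes det(D_G) and the cofactor-sum cof(D_G) numerically for a small tree and compares them with the classical values det(D_T) = (-1)^{n-1}(n-1)2^{n-2} (Graham-Pollak) and cof(D_T) = (-2)^{n-1}.

        import networkx as nx
        import numpy as np

        def det_and_cof(D):
            """Return (det(D), cof(D)). The cofactor-sum satisfies
            cof(A) = det(A + J) - det(A) for the all-ones matrix J,
            since J is a rank-one perturbation."""
            n = D.shape[0]
            J = np.ones((n, n))
            d = np.linalg.det(D)
            return d, np.linalg.det(D + J) - d

        T = nx.path_graph(7)                 # a path is the simplest tree
        n = T.number_of_nodes()
        D = nx.floyd_warshall_numpy(T)       # ordinary (unit-weight) distance matrix

        det_D, cof_D = det_and_cof(D)
        print(round(det_D), (-1) ** (n - 1) * (n - 1) * 2 ** (n - 2))  # 192, 192
        print(round(cof_D), (-2) ** (n - 1))                           # 64, 64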

    QESK: Quantum-based Entropic Subtree Kernels for Graph Classification

    In this paper, we propose a novel graph kernel, namely the Quantum-based Entropic Subtree Kernel (QESK), for graph classification. To this end, we commence by computing the Average Mixing Matrix (AMM) of the Continuous-time Quantum Walk (CTQW) evolved on each graph structure. Moreover, we show how this AMM can be employed to compute a series of entropic subtree representations associated with the classical Weisfeiler-Lehman (WL) algorithm. For a pair of graphs, the QESK kernel is defined by exponentiating the negative Euclidean distance between their entropic subtree representations, theoretically resulting in a positive definite graph kernel. We show that the proposed QESK kernel not only encapsulates complicated intrinsic quantum-based structural characteristics of graph structures through the CTQW, but also theoretically addresses the shortcoming of ignoring the effects of unshared substructures that arises in state-of-the-art R-convolution graph kernels. Moreover, unlike the classical R-convolution kernels, the proposed QESK can discriminate between isomorphic subtrees in terms of the global graph structure, theoretically explaining its effectiveness. Experiments indicate that the proposed QESK kernel can significantly outperform state-of-the-art graph kernels and graph deep learning methods on graph classification problems.
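
    A minimal sketch of the final kernel computation (mine, not the authors' code): once the entropic subtree representations have been obtained from the AMM/WL pipeline described above (not reproduced here), the kernel value for a pair of graphs is the exponential of the negative Euclidean distance between their representation vectors. The vectors below are placeholders standing in for the real features.

        import numpy as np

        def qesk_value(phi1, phi2):
            """Kernel value from two precomputed entropic subtree representations."""
            return float(np.exp(-np.linalg.norm(phi1 - phi2)))

        def qesk_gram(reps):
            """Pairwise kernel (Gram) matrix over a dataset of representations."""
            n = len(reps)
            K = np.empty((n, n))
            for i in range(n):
                for j in range(i, n):
                    K[i, j] = K[j, i] = qesk_value(reps[i], reps[j])
            return K

        # Placeholder vectors standing in for the real AMM/WL-based features.
        rng = np.random.default_rng(0)
        reps = [rng.standard_normal(16) for _ in range(4)]
        print(qesk_gram(reps))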