
    Distance-Restricted Folklore Weisfeiler-Leman GNNs with Provable Cycle Counting Power

    The ability of graph neural networks (GNNs) to count certain graph substructures, especially cycles, is important for the success of GNNs on a wide range of tasks. It has been recently used as a popular metric for evaluating the expressive power of GNNs. Many of the proposed GNN models with provable cycle counting power are based on subgraph GNNs, i.e., extracting a bag of subgraphs from the input graph, generating representations for each subgraph, and using them to augment the representation of the input graph. However, those methods require heavy preprocessing, and suffer from high time and memory costs. In this paper, we overcome the aforementioned limitations of subgraph GNNs by proposing a novel class of GNNs -- d-Distance-Restricted FWL(2) GNNs, or d-DRFWL(2) GNNs. d-DRFWL(2) GNNs use node pairs whose mutual distances are at most d as the units for message passing to balance the expressive power and complexity. By performing message passing among distance-restricted node pairs in the original graph, d-DRFWL(2) GNNs avoid the expensive subgraph extraction operations in subgraph GNNs, making both the time and space complexity lower. We theoretically show that the discriminative power of d-DRFWL(2) GNNs strictly increases as d increases. More importantly, d-DRFWL(2) GNNs have provably strong cycle counting power even with d = 2: they can count all 3, 4, 5, 6-cycles. Since 6-cycles (e.g., benzene rings) are ubiquitous in organic molecules, being able to detect and count them is crucial for achieving robust and generalizable performance on molecular tasks. Experiments on both synthetic datasets and molecular datasets verify our theory. To the best of our knowledge, our model is the most efficient GNN model to date (both theoretically and empirically) that can count up to 6-cycles.
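    As a minimal illustration of the abstract's central idea (a sketch, not the authors' implementation), the message-passing units of a d-DRFWL(2) GNN are simply the node pairs at shortest-path distance at most d, which can be enumerated with a depth-limited breadth-first search from each node:

    ```python
    from collections import deque

    def distance_restricted_pairs(adj, d):
        """Enumerate unordered node pairs (u, v) whose shortest-path
        distance is at most d (including u == v) -- the units on which
        a d-DRFWL(2) GNN would perform message passing."""
        pairs = set()
        for src in adj:
            # BFS from src, stopping at depth d.
            dist = {src: 0}
            queue = deque([src])
            while queue:
                u = queue.popleft()
                if dist[u] == d:
                    continue
                for v in adj[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        queue.append(v)
            for v in dist:
                pairs.add((min(src, v), max(src, v)))
        return pairs

    # A 6-cycle (the benzene-ring skeleton mentioned in the abstract):
    # with d = 2, every pair is a unit except the three "opposite"
    # pairs (0,3), (1,4), (2,5), which sit at distance 3.
    cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
    units = distance_restricted_pairs(cycle6, 2)
    ```

    Because only pairs within distance d are materialized, the number of units grows with local neighbourhood size rather than with the number of extracted subgraphs, which is the source of the complexity advantage the abstract claims over subgraph GNNs.
    
    
    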

    SG-Net: Syntax-Guided Machine Reading Comprehension

    For machine reading comprehension, the capacity to effectively model the linguistic knowledge in detail-riddled, lengthy passages while filtering out noise is essential for good performance. Traditional attentive models attend to all words without explicit constraint, which results in inaccurate concentration on dispensable words. In this work, we propose using syntax to guide text modeling by incorporating explicit syntactic constraints into the attention mechanism for better linguistically motivated word representations. In detail, for the self-attention network (SAN) that powers a Transformer-based encoder, we introduce a syntactic dependency of interest (SDOI) design into the SAN to form an SDOI-SAN with syntax-guided self-attention. The syntax-guided network (SG-Net) is then composed of this extra SDOI-SAN and the SAN from the original Transformer encoder through a dual contextual architecture for better linguistically inspired representation. To verify its effectiveness, the proposed SG-Net is applied to the typical pre-trained language model BERT, which is itself based on a Transformer encoder. Extensive experiments on popular benchmarks including SQuAD 2.0 and RACE show that the proposed SG-Net design helps achieve substantial performance improvement over strong baselines. (Comment: Thirty-Fourth AAAI Conference on Artificial Intelligence, AAAI-2020.)
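    A hypothetical sketch of the core mechanism, assuming (as the abstract suggests) that each word's SDOI consists of itself plus its ancestors in the dependency parse: attention is restricted by a mask built from parse heads, and disallowed positions receive zero weight. The function names and the toy parse are illustrative, not the paper's implementation:

    ```python
    import math

    def sdoi_mask(heads):
        """Build a syntax mask: token i may attend to token j only if
        j is i itself or one of i's ancestors in the dependency tree.
        heads[i] is the parse head of token i (-1 for the root)."""
        n = len(heads)
        mask = [[False] * n for _ in range(n)]
        for i in range(n):
            j = i
            while j != -1:
                mask[i][j] = True
                j = heads[j]
        return mask

    def masked_attention(scores, mask):
        """Row-wise softmax over raw attention scores, with positions
        outside the mask forced to zero weight."""
        out = []
        for row, allow in zip(scores, mask):
            exps = [math.exp(s) if a else 0.0 for s, a in zip(row, allow)]
            z = sum(exps)
            out.append([e / z for e in exps])
        return out

    # Toy 3-token sentence: token 1 is the root; tokens 0 and 2 attach to it.
    heads = [1, -1, 1]
    mask = sdoi_mask(heads)
    # With uniform raw scores, each token spreads weight only over its SDOI.
    attn = masked_attention([[0.0] * 3 for _ in range(3)], mask)
    ```

    In the full model this masked attention would run alongside the unconstrained SAN of the original encoder, matching the dual contextual architecture the abstract describes.
    
    
    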

    Vibration and instability of a fluid-conveying nanotube resting on elastic foundation subjected to a magnetic field

    Using the nonlocal Euler-Bernoulli beam model, this paper investigates the vibration and instability of a single-walled carbon nanotube (SWCNT) conveying fluid and subjected to a longitudinal magnetic field. The nanobeam, with clamped-clamped boundary conditions, rests on a Pasternak foundation. Hamilton’s principle is applied to derive the fluid-structure interaction (FSI) governing equation and the corresponding boundary conditions. The differential transformation method (DTM) is then used to solve the equations of motion. The influences of the nonlocal parameter, the longitudinal magnetic field, and the Pasternak foundation on the critical divergence velocity of the nanotube are studied.
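    As a self-contained illustration of the solution technique only (not the paper's FSI equations), the differential transformation method converts a differential equation into a recurrence on Taylor coefficients. For the standard test problem y' = y with y(0) = 1, the transform Y(k) of y(x) about x = 0 satisfies (k + 1) Y(k + 1) = Y(k), and summing the truncated series recovers e^x:

    ```python
    def dtm_solve(n_terms):
        """DTM for the test problem y' = y, y(0) = 1.
        The recurrence (k + 1) * Y(k + 1) = Y(k) yields the Taylor
        coefficients Y(k) = 1 / k! about x = 0."""
        Y = [1.0]  # Y(0) = y(0)
        for k in range(n_terms - 1):
            Y.append(Y[k] / (k + 1))
        return Y

    def dtm_eval(Y, x):
        """Reassemble the truncated series: y(x) ~ sum_k Y(k) * x**k."""
        return sum(c * x ** k for k, c in enumerate(Y))

    Y = dtm_solve(20)
    approx_e = dtm_eval(Y, 1.0)  # close to e = 2.71828...
    ```

    For the beam problem in the paper, the same idea applies term by term to the higher-order governing equation, with the clamped-clamped boundary conditions imposed on the transformed coefficients.
    
    
    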