A Parallel Template for Implementing Filters for Biological Correlation Networks
High throughput biological experiments are critical for their role in systems biology – the ability to survey the state of cellular mechanisms on a broad scale opens possibilities for the scientific researcher to understand how multiple components come together, and what goes wrong in disease states. However, the data returned from these experiments is massive and heterogeneous, and requires intuitive and clever computational algorithms for analysis. The correlation network model has been proposed as a tool for modeling and analyzing this high throughput data; structures within the model identified by graph theory have been found to represent key players in major cellular pathways. Previous work has found that network filtering using graph theoretic structural concepts can reduce noise and strengthen biological signals in these networks. However, the process of filtering biological networks using such filters is computationally intensive, and the filtered networks remain large. In this research, we develop a parallel template for these network filters to improve runtime, and use this high performance environment to show that parallelization does not affect network structure or the biological function of that structure.
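The abstract does not specify which graph-theoretic filters are used, but the parallelization claim can be illustrated with a minimal sketch: split the edge list across workers, apply the same structural test in each chunk, and observe that the surviving edge set matches a serial pass. The common-neighbor filter, the function names, and the thread-based chunking below are all illustrative assumptions, not the paper's implementation.

```python
from concurrent.futures import ThreadPoolExecutor

def filter_edges_chunk(adj, edges, min_common=1):
    # Keep edges whose endpoints share at least `min_common` common
    # neighbors -- a simple stand-in for the (unspecified) graph-theoretic
    # filters the abstract mentions.
    return [(u, v) for u, v in edges
            if len(adj[u] & adj[v]) >= min_common]

def parallel_filter(adj, workers=4, min_common=1):
    # Split the edge list into chunks and filter each chunk in parallel.
    # Because each edge is judged independently, the surviving edges are
    # identical to a serial pass: parallelization does not change which
    # edges the filter keeps, only how fast it runs.
    edges = [(u, v) for u in adj for v in adj[u] if u < v]
    chunks = [edges[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as ex:
        results = ex.map(
            lambda chunk: filter_edges_chunk(adj, chunk, min_common), chunks)
    return sorted(e for chunk in results for e in chunk)

# Triangle 0-1-2 plus pendant edge 2-3: every triangle edge has a common
# neighbor, so only the pendant edge 2-3 is filtered out.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
kept = parallel_filter(adj, workers=2)
```

The deterministic, per-edge nature of the filter is what makes the "parallelization does not affect network structure" claim plausible; filters with global state would need a more careful decomposition.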
Boosting the Cycle Counting Power of Graph Neural Networks with I-GNNs
Message Passing Neural Networks (MPNNs) are a widely used class of Graph Neural Networks (GNNs). The limited representational power of MPNNs inspires the study of provably powerful GNN architectures. However, knowing that one model is more powerful than another gives little insight into which functions they can or cannot express. It is still unclear whether these models can approximate specific functions such as counting certain graph substructures, which is essential for applications in biology, chemistry and social network analysis. Motivated by this, we propose to study the counting power of Subgraph MPNNs, a recent and popular class of powerful GNN models that extract a rooted subgraph for each node, assign the root node a unique identifier and encode the root node's representation within its rooted subgraph. Specifically, we prove that Subgraph MPNNs fail to count more-than-4-cycles at node level, implying that node representations cannot correctly encode surrounding substructures such as ring systems with more than four atoms. To overcome this limitation, we propose I-GNNs, which extend Subgraph MPNNs by assigning different identifiers to the root node and its neighbors in each subgraph. The discriminative power of I-GNNs is shown to be strictly stronger than that of Subgraph MPNNs and partially stronger than the 3-WL test. More importantly, I-GNNs are proven capable of counting all 3-, 4-, 5- and 6-cycles, covering common substructures like benzene rings in organic chemistry, while retaining linear complexity. To the best of our knowledge, this is the first linear-time GNN model that can count 6-cycles with theoretical guarantees. We validate its counting power on cycle counting tasks and demonstrate its competitive performance on molecular prediction benchmarks.
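The subgraph-extraction-plus-identifier step that the abstract describes can be sketched without any GNN machinery: for each node, take its k-hop rooted subgraph by BFS and attach identifier features that distinguish the root (and, in the I-GNN variant, its neighbors) from other nodes. The function name, the 0/1/2 labeling scheme, and the plain-dict graph representation below are all hypothetical illustrations, not the paper's actual architecture.

```python
from collections import deque

def rooted_subgraph_ids(adj, root, hops=2):
    # BFS out to `hops` steps from `root` over a plain adjacency dict
    # {node: set_of_neighbors}. Return {reached_node: identifier} where
    # the root gets 1, the root's direct neighbors get 2, and every other
    # reached node gets 0 -- one possible reading of "assigning different
    # identifiers to the root node and its neighbors in each subgraph".
    ids = {root: 1}
    dist = {root: 0}
    frontier = deque([root])
    while frontier:
        u = frontier.popleft()
        if dist[u] == hops:
            continue  # do not expand past the subgraph radius
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                ids[w] = 2 if w in adj[root] else 0
                frontier.append(w)
    return ids

# 6-cycle 0-1-2-3-4-5-0: the 2-hop subgraph rooted at 0 reaches
# {0, 1, 2, 4, 5}, with neighbors 1 and 5 carrying identifier 2.
cycle6 = {i: {(i - 1) % 6, (i + 1) % 6} for i in range(6)}
labels = rooted_subgraph_ids(cycle6, root=0, hops=2)
```

The extra identifiers matter because a plain root-only labeling can leave distant parts of a long cycle indistinguishable inside the subgraph; marking the root's neighbors as well gives the message-passing layers more symmetry-breaking information, which is the intuition behind the stronger cycle-counting guarantees.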
Proceedings of the 8th Cologne-Twente Workshop on Graphs and Combinatorial Optimization
The Cologne-Twente Workshop (CTW) on Graphs and Combinatorial Optimization started off as a series of workshops organized bi-annually by either Köln University or Twente University. As its importance grew over time, it re-centered its geographical focus by including northern Italy (CTW04 in Menaggio, on Lake Como, and CTW08 in Gargnano, on Lake Garda). This year, CTW (in its eighth edition) will be staged in France for the first time: more precisely in the heart of Paris, at the Conservatoire National d'Arts et Métiers (CNAM), between 2nd and 4th June 2009, by a mixed organizing committee with members from LIX, École Polytechnique and CEDRIC, CNAM.
LIPIcs, Volume 274, ESA 2023, Complete Volume
LIPIcs, Volume 248, ISAAC 2022, Complete Volume
LIPIcs, Volume 261, ICALP 2023, Complete Volume