Graph Neural Network (GNN) research has concentrated on improving convolutional layers, with little attention paid to developing graph pooling layers. Yet pooling layers can enable GNNs to reason over abstracted groups of nodes instead of single nodes, thus increasing their generalization potential. To close this gap, we propose a graph pooling layer relying on the notion of edge contraction: EdgePool learns a localized and sparse pooling transform. We evaluate it on four datasets, finding that it increases performance on the three largest. We also show that EdgePool can be integrated into existing GNN architectures without adding any additional losses or regularization.
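To make the edge-contraction idea concrete, below is a minimal NumPy sketch of a pooling step in the spirit of EdgePool: edges are scored from the concatenated features of their endpoints, a greedy set of non-adjacent edges is contracted, and each merged node's feature is gated by its edge score. The exact scoring and normalization details of the paper are simplified here, and all function and variable names (`edge_contraction_pool`, `w`, `b`, etc.) are our own illustrative choices, not the authors' implementation.

```python
# Illustrative sketch only: a simplified edge-contraction pooling step.
import numpy as np

def edge_contraction_pool(x, edges, w, b):
    """Contract a greedy set of non-adjacent edges, merging node pairs.

    x:     (n, d) node feature matrix
    edges: list of (i, j) undirected edges
    w:     (2 * d,) weight vector of the edge-scoring layer (assumed learned)
    b:     scalar bias of the edge-scoring layer
    Returns pooled node features and the remapped edge list.
    """
    n = x.shape[0]
    # Score each edge from the concatenated features of its endpoints.
    scores = [float(np.concatenate([x[i], x[j]]) @ w + b) for i, j in edges]

    # Greedily contract edges in order of descending score; every node may
    # be merged at most once, which keeps the transform sparse and localized.
    order = np.argsort(scores)[::-1]
    merged = set()
    cluster = {}          # old node id -> new node id
    feats = []
    for k in order:
        i, j = edges[k]
        if i in merged or j in merged:
            continue
        merged.update((i, j))
        cluster[i] = cluster[j] = len(feats)
        # Gate the merged feature by a sigmoid of the edge score so that
        # gradients can flow back into the scoring layer during training.
        gate = 1.0 / (1.0 + np.exp(-scores[k]))
        feats.append(gate * (x[i] + x[j]))

    # Nodes left unmatched survive as their own clusters.
    for v in range(n):
        if v not in merged:
            cluster[v] = len(feats)
            feats.append(x[v])

    # Remap edges onto clusters, dropping contracted edges and duplicates.
    new_edges = {tuple(sorted((cluster[i], cluster[j])))
                 for i, j in edges if cluster[i] != cluster[j]}
    return np.stack(feats), sorted(new_edges)

# Usage: pooling roughly halves the node count on a small path graph.
x = np.random.randn(4, 3)
w, b = np.random.randn(6), 0.0
pooled_x, pooled_edges = edge_contraction_pool(x, [(0, 1), (1, 2), (2, 3)], w, b)
```

Because each node participates in at most one contraction, the output graph has at least half as many nodes as the input, and no extra losses or regularizers are needed to train the scoring layer.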