A new major release, introducing TorchScript support, memory-efficient aggregations, bipartite GNN modules, static graphs and much more!
Major Features
- TorchScript support, see here for the accompanying tutorial (thanks to @lgray and @liaopeiyuan)
- Memory-efficient aggregations via `torch_sparse.SparseTensor`, see here for the accompanying tutorial
- Most GNN modules can now operate on bipartite graphs (and some of them can also operate on different feature dimensionalities for source and target nodes), which is useful for neighbor sampling or heterogeneous graphs:
```python
conv = SAGEConv(in_channels=(32, 64), out_channels=64)
out = conv((x_src, x_dst), edge_index)
```
- Static graph support:
```python
conv = GCNConv(in_channels=32, out_channels=64)
x = torch.randn(batch_size, num_nodes, in_channels)
out = conv(x, edge_index)
print(out.size())
>>> torch.Size([batch_size, num_nodes, out_channels])
```
Additional Features
- `PNAConv` (thanks to @lukecavabarrett and @gcorso)
- Pre-trained `DimeNet` on QM9
- SEAL link prediction example (thanks to @muhanzhang)
- `ClusterGCNConv`
- Cluster-GCN PPI example (thanks to @CFF-Dream)
- `WeightedEdgeSampler` for GraphSAINT (thanks to @KiddoZhu)
- Better `num_workers` support for GraphSAINT
- The automatic addition of self-loops can now be disabled via the `add_self_loops` argument, e.g., for `GCNConv`
Breaking Changes
- Memory-efficient `RGCNConv`: the old `RGCNConv` implementation has been moved to `FastRGCNConv`
Complementary Frameworks
- DeepSNAP: A PyTorch library that bridges between graph libraries such as NetworkX and PyTorch Geometric
- PyTorch Geometric Temporal: A temporal GNN library built upon PyTorch Geometric
Datasets
- `GNNBenchmarkDataset` suite from the "Benchmarking Graph Neural Networks" paper
- `WordNet18`
Bugfixes
- Fixed a bug in the `VGAE` KL-loss computation (thanks to @GuillaumeSalha)