github pyg-team/pytorch_geometric 1.5.0


This release is a big one thanks to many wonderful contributors. You guys are awesome!

Breaking Changes and Highlights

  • NeighborSampler has been completely revamped: it is now much faster, supports parallel sampling, and makes it easy to apply skip-connections or self-loops. See examples/reddit.py or the newly introduced OGB examples (examples/ogbn_products_sage.py and examples/ogbn_products_gat.py). The latter also sets a new SOTA on the OGB leaderboards (reaching 0.7945 ± 0.0059 test accuracy)
  • SAGEConv now uses concat=True by default, and the option to disable it has been removed
  • Node2Vec now supports parallel sampling; as a result, its API has changed slightly
  • MetaPath2Vec: The first model in PyG able to operate on heterogeneous graphs
  • GNNExplainer: Generating explanations for graph neural networks
  • GraphSAINT: A graph sampling based inductive learning method
  • SchNet model for learning on molecular graphs, comes with pre-trained weights for each target of the QM9 dataset (thanks to @Nyuten)
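The core idea behind the revamped NeighborSampler is layer-wise neighbor sampling: starting from a set of seed nodes, a fixed number of neighbors is drawn per node at each hop, so each mini-batch only touches a small subgraph. The pure-Python sketch below illustrates that idea on an adjacency dict; the function name, signature, and data layout are illustrative assumptions, not the PyG API.

```python
import random

def sample_neighbors(adj, seeds, sizes, seed=0):
    """Hypothetical sketch of layer-wise neighbor sampling.

    adj:   dict mapping each node to a list of its neighbors.
    seeds: iterable of seed nodes for the mini-batch.
    sizes: sizes[i] = max number of neighbors sampled per node at hop i.
    Returns one sorted node list per layer, seeds first.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    layers = [sorted(seeds)]
    for size in sizes:
        nxt = set()
        for node in layers[-1]:
            neighbors = adj.get(node, [])
            # Sample at most `size` neighbors of this node.
            k = min(size, len(neighbors))
            nxt.update(rng.sample(neighbors, k))
        layers.append(sorted(nxt))
    return layers

# Toy graph: a 4-cycle 0-1-3-2-0.
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
layers = sample_neighbors(adj, [0], sizes=[2, 2])
```

With `sizes=[2, 2]` every neighbor of the toy graph is kept, so `layers` is `[[0], [1, 2], [0, 3]]`; with smaller sizes the sampler would keep a random subset per hop instead.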
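Both Node2Vec and MetaPath2Vec are built on sampling random walks and feeding them to a skip-gram objective. As a rough illustration of the walk-sampling step, here is a first-order random walk in pure Python; note that the actual Node2Vec model uses biased second-order walks (controlled by its p and q parameters), and this helper name and signature are assumptions for the sketch.

```python
import random

def random_walks(adj, start_nodes, walk_length, seed=0):
    """Sample one unbiased (first-order) random walk per start node.

    adj: dict mapping each node to a list of its neighbors.
    Returns walks as lists of nodes, each of length walk_length + 1
    (shorter if a node with no neighbors is reached).
    """
    rng = random.Random(seed)
    walks = []
    for start in start_nodes:
        walk = [start]
        for _ in range(walk_length):
            nbrs = adj.get(walk[-1], [])
            if not nbrs:  # dead end: stop this walk early
                break
            walk.append(rng.choice(nbrs))
        walks.append(walk)
    return walks

adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
walks = random_walks(adj, start_nodes=[0, 1, 2, 3], walk_length=4)
```

In the heterogeneous setting of MetaPath2Vec, the neighbor choice at each step would additionally be restricted to the node type prescribed by the metapath.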

Additional Features

Datasets

Minor changes

  • GATConv can now return attention weights via the return_attention_weights argument (thanks to @douglasrizzo)
  • InMemoryDataset now has a copy method that converts sliced datasets back into a contiguous memory layout
  • Planetoid now lets users choose between different splitting methods (thanks to @dongkwan-kim)
  • k_hop_subgraph: Computes the k-hop subgraph around a subset of nodes
  • geodesic_distance: Geodesic distances can now be computed in parallel (thanks to @jannessm)
  • tree_decomposition: The tree decomposition algorithm for generating junction trees from molecules
  • SortPool benchmark script now uses 1-D convolutions after pooling, leading to better performance (thanks to @muhanzhang)
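The idea behind k_hop_subgraph is a breadth-first search of depth k from a set of seed nodes, returning the induced subgraph around them. A minimal pure-Python sketch of that computation, assuming an adjacency-dict graph representation (the real PyG utility operates on edge_index tensors and also returns relabeled edges):

```python
from collections import deque

def k_hop_nodes(adj, seeds, k):
    """Return the sorted set of nodes within k hops of the seed nodes.

    adj: dict mapping each node to a list of its neighbors.
    Uses BFS, tracking each node's hop distance from the seed set.
    """
    dist = {s: 0 for s in seeds}
    queue = deque(seeds)
    while queue:
        node = queue.popleft()
        if dist[node] == k:  # do not expand beyond k hops
            continue
        for nbr in adj.get(node, []):
            if nbr not in dist:
                dist[nbr] = dist[node] + 1
                queue.append(nbr)
    return sorted(dist)

# Path graph 0-1-2-3-4: the 2-hop neighborhood of node 0 is {0, 1, 2}.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
subset = k_hop_nodes(adj, seeds=[0], k=2)
```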

Bugfixes
