dgl.nn (PyTorch)

Conv Layers

GraphConv

Graph convolutional layer from Semi-Supervised Classification with Graph Convolutional Networks

EdgeWeightNorm

This module normalizes positive scalar edge weights on a graph following the form in GCN.

RelGraphConv

Relational graph convolution layer from Modeling Relational Data with Graph Convolutional Networks

TAGConv

Topology Adaptive Graph Convolutional layer from Topology Adaptive Graph Convolutional Networks

GATConv

Graph attention layer from Graph Attention Network

GATv2Conv

GATv2 from How Attentive are Graph Attention Networks?

EGATConv

Graph attention layer that handles edge features, from Rossmann-Toolbox (see supplementary data)

EdgeGATConv

Graph attention layer with edge features, from SCENE

EdgeConv

EdgeConv layer from Dynamic Graph CNN for Learning on Point Clouds

SAGEConv

GraphSAGE layer from Inductive Representation Learning on Large Graphs

SGConv

SGC layer from Simplifying Graph Convolutional Networks

APPNPConv

Approximate Personalized Propagation of Neural Predictions layer from Predict then Propagate: Graph Neural Networks meet Personalized PageRank

GINConv

Graph Isomorphism Network layer from How Powerful are Graph Neural Networks?

GINEConv

Graph Isomorphism Network with edge features, introduced by Strategies for Pre-training Graph Neural Networks

GatedGraphConv

Gated graph convolution layer from Gated Graph Sequence Neural Networks

GatedGCNConv

Gated graph convolutional layer from Benchmarking Graph Neural Networks

GMMConv

Gaussian Mixture Model convolution layer from Geometric Deep Learning on Graphs and Manifolds using Mixture Model CNNs

ChebConv

Chebyshev spectral graph convolution layer from Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering

AGNNConv

Attention-based graph neural network layer from Attention-based Graph Neural Network for Semi-Supervised Learning

NNConv

Graph convolution layer from Neural Message Passing for Quantum Chemistry

AtomicConv

Atomic convolution layer from Atomic Convolutional Networks for Predicting Protein-Ligand Binding Affinity

CFConv

CFConv from SchNet: A continuous-filter convolutional neural network for modeling quantum interactions

DotGatConv

Applies a dot-product version of the self-attention in Graph Attention Network

TWIRLSConv

Convolution combined with iteratively reweighted least squares, from Graph Neural Networks Inspired by Classical Iterative Algorithms

TWIRLSUnfoldingAndAttention

Combines propagation and attention together.

GCN2Conv

Graph convolutional network with initial residual and identity mapping (GCNII), from Simple and Deep Graph Convolutional Networks

HGTConv

Heterogeneous graph transformer convolution from Heterogeneous Graph Transformer

GroupRevRes

Grouped reversible residual connections for GNNs, as introduced in Training Graph Neural Networks with 1000 Layers

EGNNConv

Equivariant graph convolutional layer from E(n) Equivariant Graph Neural Networks

PNAConv

Principal Neighbourhood Aggregation layer from Principal Neighbourhood Aggregation for Graph Nets

DGNConv

Directional Graph Network layer from Directional Graph Networks

CuGraph Conv Layers

CuGraphRelGraphConv

An accelerated relational graph convolution layer from Modeling Relational Data with Graph Convolutional Networks that leverages the highly optimized aggregation primitives in cugraph-ops.

CuGraphGATConv

Graph attention layer from Graph Attention Networks, with sparse aggregation accelerated by cugraph-ops.

CuGraphSAGEConv

An accelerated GraphSAGE layer from Inductive Representation Learning on Large Graphs that leverages the highly optimized aggregation primitives in cugraph-ops

Dense Conv Layers

DenseGraphConv

Graph convolutional layer from Semi-Supervised Classification with Graph Convolutional Networks

DenseSAGEConv

GraphSAGE layer from Inductive Representation Learning on Large Graphs

DenseChebConv

Chebyshev spectral graph convolution layer from Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering

Global Pooling Layers

SumPooling

Apply sum pooling over the nodes in a graph.

AvgPooling

Apply average pooling over the nodes in a graph.

MaxPooling

Apply max pooling over the nodes in a graph.

SortPooling

Sort pooling from An End-to-End Deep Learning Architecture for Graph Classification

WeightAndSum

Compute importance weights for atoms and perform a weighted sum.

GlobalAttentionPooling

Global attention pooling from Gated Graph Sequence Neural Networks

Set2Set

Set2Set operator from Order Matters: Sequence to sequence for sets

SetTransformerEncoder

The encoder module from Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks

SetTransformerDecoder

The decoder module from Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks

Heterogeneous Learning Modules

HeteroGraphConv

A generic module for computing convolution on heterogeneous graphs.

HeteroLinear

Apply linear transformations on heterogeneous inputs.

HeteroEmbedding

Create a heterogeneous embedding table.

TypedLinear

Linear transformation according to types.

Utility Modules

Sequential

A sequential container for stacking graph neural network modules

WeightBasis

Basis decomposition from Modeling Relational Data with Graph Convolutional Networks

KNNGraph

Layer that transforms one point set into a graph, or a batch of point sets with the same number of points into a batched union of those graphs.

SegmentedKNNGraph

Layer that transforms one point set into a graph, or a batch of point sets with different numbers of points into a batched union of those graphs.

RadiusGraph

Layer that transforms one point set into a bidirected graph with neighbors within a given distance.

JumpingKnowledge

The Jumping Knowledge aggregation module from Representation Learning on Graphs with Jumping Knowledge Networks

NodeEmbedding

Class for storing node embeddings.

GNNExplainer

GNNExplainer model from GNNExplainer: Generating Explanations for Graph Neural Networks

HeteroGNNExplainer

GNNExplainer model from GNNExplainer: Generating Explanations for Graph Neural Networks, adapted for heterogeneous graphs

SubgraphX

SubgraphX from On Explainability of Graph Neural Networks via Subgraph Explorations <https://arxiv.org/abs/2102.05152>

HeteroSubgraphX

SubgraphX from On Explainability of Graph Neural Networks via Subgraph Explorations, adapted for heterogeneous graphs

PGExplainer

PGExplainer from Parameterized Explainer for Graph Neural Network <https://arxiv.org/pdf/2011.04573>

HeteroPGExplainer

PGExplainer from Parameterized Explainer for Graph Neural Network, adapted for heterogeneous graphs

LabelPropagation

Label propagation from Learning from Labeled and Unlabeled Data with Label Propagation

Network Embedding Modules

DeepWalk

DeepWalk module from DeepWalk: Online Learning of Social Representations

MetaPath2Vec

metapath2vec module from metapath2vec: Scalable Representation Learning for Heterogeneous Networks

Utility Modules for Graph Transformer

DegreeEncoder

Degree encoder, as introduced in Do Transformers Really Perform Bad for Graph Representation?

LapPosEncoder

Laplacian positional encoder (LPE), as introduced in GraphGPS: General Powerful Scalable Graph Transformers

PathEncoder

Path encoder, as introduced in the edge encoding of Do Transformers Really Perform Bad for Graph Representation?

SpatialEncoder

Spatial encoder, as introduced in Do Transformers Really Perform Bad for Graph Representation?

SpatialEncoder3d

3D spatial encoder, as introduced in One Transformer Can Understand Both 2D & 3D Molecular Data

BiasedMHA

Dense multi-head attention module with graph attention bias.

GraphormerLayer

Graphormer layer with dense multi-head attention, as introduced in Do Transformers Really Perform Bad for Graph Representation?

EGTLayer

EGT layer of the Edge-augmented Graph Transformer (EGT), as introduced in Global Self-Attention as a Replacement for Graph Convolution <https://arxiv.org/pdf/2108.03348.pdf>