cogdl.models.nn.pyg_gin¶
Module Contents¶
Classes¶
GINLayer: Graph Isomorphism Network layer from paper “How Powerful are Graph Neural Networks?”
GINMLP: Multilayer perceptron with batch normalization
GIN: Graph Isomorphism Network from paper “How Powerful are Graph Neural Networks?”
class cogdl.models.nn.pyg_gin.GINLayer(apply_func=None, eps=0, train_eps=True)[source]¶
Bases: torch.nn.Module
Graph Isomorphism Network layer from paper “How Powerful are Graph Neural Networks?”.
\[h_i^{(l+1)} = f_\Theta \left((1 + \epsilon) h_i^{l} + \mathrm{sum}\left(\left\{h_j^{l}, j\in\mathcal{N}(i) \right\}\right)\right)\]
- apply_func : callable, optional
Layer or function applied to update node features.
- eps : float32, optional
Initial epsilon value.
- train_eps : bool, optional
If True, epsilon will be a learnable parameter.
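As a minimal illustration of the update rule above (a sketch, not the cogdl implementation), the following pure-Python function applies the GIN aggregation to scalar node features over an adjacency list, with `apply_func` defaulting to the identity:

```python
# Sketch of the GIN layer update:
#   h_i' = f_Theta((1 + eps) * h_i + sum of neighbor features)
# Scalar node features and an adjacency list keep the example self-contained.

def gin_layer_update(h, neighbors, eps=0.0, apply_func=None):
    """h: list of scalar node features; neighbors: adjacency list."""
    f = apply_func if apply_func is not None else (lambda x: x)
    out = []
    for i, h_i in enumerate(h):
        agg = sum(h[j] for j in neighbors[i])  # sum aggregation over N(i)
        out.append(f((1 + eps) * h_i + agg))
    return out

# Triangle graph: every node is a neighbor of the other two.
h = [1.0, 2.0, 3.0]
neighbors = [[1, 2], [0, 2], [0, 1]]
print(gin_layer_update(h, neighbors, eps=0.0))  # [6.0, 6.0, 6.0]
```

With `eps=0` each node's new feature is simply its own value plus the sum of its neighbors'; a learnable epsilon (the `train_eps` option) lets the model weight the node's own feature relative to the aggregated neighborhood.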
-
class cogdl.models.nn.pyg_gin.GINMLP(in_feats, out_feats, hidden_dim, num_layers, use_bn=True, activation=None)[source]¶
Bases: torch.nn.Module
Multilayer perceptron with batch normalization
\[x^{(i+1)} = \sigma(W^{i}x^{(i)})\]
- in_feats : int
Size of each input sample.
- out_feats : int
Size of each output sample.
- hidden_dim : int
Size of hidden layer dimension.
- use_bn : bool, optional
Apply batch normalization if True, default: True.
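The recursion x^{(i+1)} = σ(W^{i} x^{(i)}) can be sketched in plain Python (an illustration only, with ReLU standing in for σ and batch normalization omitted; the weights below are arbitrary, not trained parameters):

```python
# Sketch of the MLP forward pass x^{(i+1)} = sigma(W^{i} x^{(i)}).

def relu(v):
    return [max(0.0, x) for x in v]

def matvec(W, x):
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def mlp_forward(weights, x):
    """Apply each weight matrix in turn, with ReLU after every layer."""
    for W in weights:
        x = relu(matvec(W, x))
    return x

# Two layers mapping 2 -> 3 -> 1 (illustrative weights).
W0 = [[1.0, 0.0], [0.0, 1.0], [1.0, -1.0]]
W1 = [[1.0, 1.0, 1.0]]
print(mlp_forward([W0, W1], [2.0, 1.0]))  # [4.0]
```

In the actual module, `in_feats`, `hidden_dim`, and `out_feats` fix the shapes of the first, intermediate, and last weight matrices, and `use_bn` inserts a batch-normalization step between layers.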
-
class cogdl.models.nn.pyg_gin.GIN(num_layers, in_feats, out_feats, hidden_dim, num_mlp_layers, eps=0, pooling='sum', train_eps=False, dropout=0.5)[source]¶
Bases: cogdl.models.BaseModel
Graph Isomorphism Network from paper “How Powerful are Graph Neural Networks?”.
Args:
- num_layers : int
Number of GIN layers.
- in_feats : int
Size of each input sample.
- out_feats : int
Size of each output sample.
- hidden_dim : int
Size of each hidden layer dimension.
- num_mlp_layers : int
Number of MLP layers.
- eps : float32, optional
Initial epsilon value, default: 0.
- pooling : str, optional
Aggregator type to use, default: sum.
- train_eps : bool, optional
If True, epsilon will be a learnable parameter, default: False.
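To clarify the `pooling` argument, the sketch below (not the cogdl code) shows a graph-level readout that collapses per-node representations into one graph vector, with "sum" and "mean" as illustrative aggregator choices:

```python
# Sketch of a graph-level readout selected by a `pooling` argument:
# per-node feature vectors are aggregated into a single graph vector.

def graph_pool(node_feats, pooling="sum"):
    dim = len(node_feats[0])
    sums = [sum(h[d] for h in node_feats) for d in range(dim)]
    if pooling == "sum":
        return sums
    if pooling == "mean":
        return [s / len(node_feats) for s in sums]
    raise ValueError(f"unknown pooling: {pooling}")

feats = [[1.0, 2.0], [3.0, 4.0]]
print(graph_pool(feats, "sum"))   # [4.0, 6.0]
print(graph_pool(feats, "mean"))  # [2.0, 3.0]
```

Sum pooling is the default here because, as argued in the GIN paper, summation preserves multiset information that mean or max aggregators can lose.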