TopoTune
TopoBench includes TopoTune, a comprehensive framework for easily defining and training new, general TDL models on any domain, using any (graph) neural network ω as a backbone. The pre-print detailing this framework is TopoTune: A Framework for Generalized Combinatorial Complex Neural Networks. In a GCCN (pictured below), the input complex is represented as an ensemble of strictly augmented Hasse graphs, one per neighborhood of the complex. Each of these Hasse graphs is processed by a submodel ω, and the outputs are aggregated rank-wise between layers.
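The layer structure described above can be sketched in plain Python (an illustrative toy, not the TopoBench implementation; `gccn_layer` and `mean_neighbor_omega` are hypothetical names): features live on cells identified by `(rank, index)` pairs, each neighborhood's augmented Hasse graph is processed by a shared backbone ω, and the per-graph outputs are summed rank-wise.

```python
from collections import defaultdict

def gccn_layer(hasse_graphs, features, omega):
    """One GCCN layer (illustrative sketch).

    hasse_graphs: list of adjacency dicts {node: [neighbors]}, one per
                  neighborhood; nodes are (rank, index) pairs.
    features:     dict {node: float} of current cell features.
    omega:        backbone applied to each Hasse graph independently.
    """
    # Process each augmented Hasse graph with the shared backbone omega.
    per_graph_outputs = [omega(g, features) for g in hasse_graphs]

    # Rank-wise aggregation: sum the backbone outputs for every cell.
    aggregated = defaultdict(float)
    for out in per_graph_outputs:
        for node, value in out.items():
            aggregated[node] += value
    return dict(aggregated)

def mean_neighbor_omega(graph, features):
    """Toy backbone: each node takes the mean of its neighbors' features."""
    return {
        node: sum(features[n] for n in nbrs) / len(nbrs) if nbrs else features[node]
        for node, nbrs in graph.items()
    }
```

In the actual GCCN, ω is a (graph) neural network rather than a fixed function, but the structure is the same: one backbone pass per neighborhood, followed by rank-wise aggregation.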
Figure: Generalized Combinatorial Complex Network (GCCN).
To implement and train a GCCN, run the following command line with the desired choice of dataset, lifting domain (e.g. `cell`, `simplicial`), PyTorch Geometric backbone model (e.g. `GCN`, `GIN`, `GAT`, `GraphSAGE`) and parameters (e.g. `model.backbone.GNN.num_layers=2`), neighborhood structure (routes), and other hyperparameters.
```shell
python -m topobench \
    dataset=graph/PROTEINS \
    dataset.split_params.data_seed=1 \
    model=cell/topotune \
    model.tune_gnn=GCN \
    model.backbone.GNN.num_layers=2 \
    model.backbone.neighborhoods=\[1-up_laplacian-0,1-down_incidence-2\] \
    model.backbone.layers=4 \
    model.feature_encoder.out_channels=32 \
    model.feature_encoder.proj_dropout=0.3 \
    model.readout.readout_name=PropagateSignalDown \
    logger.wandb.project=TopoTune_cell \
    trainer.max_epochs=1000 \
    callbacks.early_stopping.patience=50
```
To use a single augmented Hasse graph expansion, use `model={domain}/topotune_onehasse` instead of `model={domain}/topotune`.
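As an illustrative sketch (reusing the dataset and hyperparameters of the command above, which are assumptions here rather than recommended values), the single-expansion variant in the cell domain would be invoked as:

```shell
python -m topobench \
    dataset=graph/PROTEINS \
    model=cell/topotune_onehasse \
    model.tune_gnn=GCN \
    model.backbone.GNN.num_layers=2 \
    model.backbone.layers=4
```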
To specify a set of neighborhoods on the complex, use a list of neighborhoods, each specified as a string of the form `r-{neighborhood}-k`, where `{neighborhood}` is the type of neighborhood, `k` is the rank of the cells it acts on, and `r` is the offset to the rank it considers. Currently, the following options for `{neighborhood}` are supported:
- `up_laplacian`, between cells of rank $k$ through $k+r$ cells.
- `down_laplacian`, between cells of rank $k$ through $k-r$ cells.
- `hodge_laplacian`, between cells of rank $k$ through both $k-r$ and $k+r$ cells.
- `up_adjacency`, between cells of rank $k$ through $k+r$ cells.
- `down_adjacency`, between cells of rank $k$ through $k-r$ cells.
- `up_incidence`, from rank $k$ to $k+r$.
- `down_incidence`, from rank $k$ to $k-r$.
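As an illustration of this naming convention (a hypothetical helper, not TopoBench's actual parser), a route string can be decomposed into its offset, neighborhood type, and rank:

```python
def parse_route(route: str):
    """Parse a neighborhood string of the form 'r-{neighborhood}-k'.

    The leading offset r may be omitted, in which case it defaults to 1,
    so 'up_incidence-0' is read the same as '1-up_incidence-0'.
    """
    parts = route.split("-")
    if parts[0].isdigit():
        # Explicit offset, e.g. '1-up_laplacian-0'.
        return int(parts[0]), "-".join(parts[1:-1]), int(parts[-1])
    # Offset omitted, e.g. 'up_incidence-0'.
    return 1, "-".join(parts[:-1]), int(parts[-1])
```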
When the offset `r` is omitted, it defaults to 1; for example, `up_incidence-k` represents the incidence from rank $k$ to $k+1$.
By default, backbone models are imported from `torch_geometric.nn.models`. To import and specify a backbone model from any other package, such as `torch.nn.Transformer` or `dgl.nn.GATConv`, it is sufficient to 1) make sure the package is installed and 2) specify in the command line:

```shell
model.tune_gnn={backbone_model}
model.backbone.GNN._target_={package}.{backbone_model}
```
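Under the hood, Hydra resolves the dotted `_target_` path to a class and instantiates it. A minimal sketch of that resolution (using `importlib`; `instantiate_backbone` is a hypothetical name, and the stdlib class in the example below stands in for a real backbone):

```python
import importlib

def instantiate_backbone(target: str, **kwargs):
    """Resolve a dotted path like '{package}.{backbone_model}' and
    instantiate the class it names, mirroring what Hydra does with
    a '_target_' entry."""
    module_path, class_name = target.rsplit(".", 1)
    cls = getattr(importlib.import_module(module_path), class_name)
    return cls(**kwargs)
```

For example, `instantiate_backbone("collections.OrderedDict")` imports `collections` and returns a fresh `OrderedDict`; with a backbone installed, the same mechanism works for `"dgl.nn.GATConv"` and its constructor arguments.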
We provide scripts to reproduce experiments on a broad class of GCCNs in `scripts/topotune`, and to reproduce iterations of existing neural networks in `scripts/topotune/existing_models`, as reported in the TopoTune paper.

We invite users interested in running extensive sweeps on new GCCNs to use the `--multirun` flag in the scripts. This flag is a shortcut for running every possible combination of the specified parameters in a single command.
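As a sketch of such a sweep (the swept values here are illustrative, not those used in the paper), Hydra's `--multirun` accepts comma-separated values and launches one run per combination:

```shell
python -m topobench --multirun \
    dataset=graph/PROTEINS \
    model=cell/topotune \
    model.tune_gnn=GCN,GIN \
    model.backbone.layers=2,4
```

This single command launches four runs (GCN and GIN, each with 2 and 4 layers).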