
Commit 17b5dba

wanglouis49 and diegolascasas authored and committed
Update README.md.
PiperOrigin-RevId: 390606818
1 parent 1243baa commit 17b5dba

File tree

2 files changed: +53 −15 lines


wikigraphs/README.md

+52-14
@@ -11,12 +11,12 @@ conditioned on graph and generate graphs given text.
 
 [Jax](https://github.com/google/jax#installation),
 [Haiku](https://github.com/deepmind/dm-haiku#installation),
-[Optax](https://github.com/deepmind/dm-haiku#installation), and
+[Optax](https://optax.readthedocs.io/en/latest/#installation), and
 [Jraph](https://github.com/deepmind/jraph) are needed for this package. It has
 been developed and tested on python 3 with the following packages:
 
 * Jax==0.2.13
-* Haiku==0.0.5
+* Haiku==0.0.5.dev
 * Optax==0.0.6
 * Jraph==0.0.1.dev
 
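For a quick sanity check that a local environment matches the tested pins above, one can compare installed distributions against the list. A minimal sketch, assuming the PyPI distribution names are `jax`, `dm-haiku`, `optax`, and `jraph` (the README lists import-style names, so that mapping is an assumption on our part):

```python
from importlib import metadata

# Versions the README lists as tested; treat them as a compatibility hint,
# not a hard requirement.
TESTED = {
    "jax": "0.2.13",
    "dm-haiku": "0.0.5.dev",
    "optax": "0.0.6",
    "jraph": "0.0.1.dev",
}

def check_pins(tested):
    """Map each dependency to (installed_version_or_None, tested_version)."""
    report = {}
    for dist, want in tested.items():
        try:
            have = metadata.version(dist)
        except metadata.PackageNotFoundError:
            have = None  # distribution not installed in this environment
        report[dist] = (have, want)
    return report
```

Running `check_pins(TESTED)` makes version drift visible before a long training run fails late with an API mismatch.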

@@ -167,38 +167,76 @@ it elsewhere.
167167

168168
## Run baseline models
169169

170-
Note: our code supports training with multiple GPUs.
171-
172-
To run the default baseline GNN-based TransformerXL on Wikigraphs with 8
173-
GPUs:
170+
To quickly test-run a small model with 1 GPU:
174171

175172
```base
176173
python main.py --model_type=graph2text \
177174
--dataset=freebase2wikitext \
178175
--checkpoint_dir=/tmp/graph2text \
179176
--job_mode=train \
177+
--train_batch_size=2 \
178+
--gnn_num_layers=1 \
179+
--num_gpus=1
180+
```
181+
182+
To run the default baseline unconditional TransformerXL on Wikigraphs with 8
183+
GPUs:
184+
185+
```base
186+
python main.py --model_type=text \
187+
--dataset=freebase2wikitext \
188+
--checkpoint_dir=/tmp/text \
189+
--job_mode=train \
190+
--train_batch_size=64 \
191+
--gnn_num_layers=1 \
192+
--num_gpus=8
193+
```
194+
195+
To run the default baseline BoW-based TransformerXL on Wikigraphs with 8
196+
GPUs:
197+
198+
```base
199+
python main.py --model_type=bow2text \
200+
--dataset=freebase2wikitext \
201+
--checkpoint_dir=/tmp/bow2text \
202+
--job_mode=train \
180203
--train_batch_size=64 \
181204
--gnn_num_layers=1 \
182205
--num_gpus=8
183206
```
184207

185-
We ran our experiments in the paper using 8 Nvidia V100 GPUs. To allow for
186-
batch parallization for the GNN-based (graph2text) model, we pad graphs to
187-
the largest graph in the batch. The full run takes almost 4 days. BoW- and
188-
nodes-based models can be trained within 14 hours because there is no
189-
additional padding.
208+
To run the default baseline Nodes-only GNN-based TransformerXL on Wikigraphs
209+
with 8 GPUs:
210+
211+
```base
212+
python main.py --model_type=bow2text \
213+
--dataset=freebase2wikitext \
214+
--checkpoint_dir=/tmp/bow2text \
215+
--job_mode=train \
216+
--train_batch_size=64 \
217+
--gnn_num_layers=0 \
218+
--num_gpus=8
219+
```
190220

191-
Or to quickly test-run a small model:
221+
To run the default baseline GNN-based TransformerXL on Wikigraphs with 8
222+
GPUs:
192223

193224
```base
194225
python main.py --model_type=graph2text \
195226
--dataset=freebase2wikitext \
196227
--checkpoint_dir=/tmp/graph2text \
197228
--job_mode=train \
198-
--train_batch_size=2 \
199-
--gnn_num_layers=1
229+
--train_batch_size=64 \
230+
--gnn_num_layers=1 \
231+
--num_gpus=8
200232
```
201233

234+
We ran our experiments in the paper using 8 Nvidia V100 GPUs. Reduce the batch
235+
size if the model does not fit into memory. To allow for batch parallization for
236+
the GNN-based (graph2text) model, we pad graphs to the largest graph in the
237+
batch. The full run takes almost 4 days. BoW- and nodes-based models can be
238+
trained within 14 hours because there is no additional padding.
239+
202240
To evaluate the model on the validation set (this only uses 1 GPU):
203241

204242
```base
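The padding note above (graphs padded to the largest graph in the batch so the GNN can be batch-parallelized) can be illustrated with a standalone sketch in plain Python. This is not the actual wikigraphs implementation; it pads per-graph node-feature lists to a common length and returns a mask marking the real nodes:

```python
def pad_graph_batch(graphs, pad_value=0.0):
    """Pad each graph's node-feature list to the largest graph in the batch.

    `graphs` is a list of lists of per-node feature values. Returns the
    padded batch plus a parallel 0/1 mask marking real (non-padding) nodes,
    so downstream aggregation can ignore the padding.
    """
    max_nodes = max(len(g) for g in graphs)
    padded, mask = [], []
    for g in graphs:
        pad = max_nodes - len(g)
        padded.append(list(g) + [pad_value] * pad)
        mask.append([1] * len(g) + [0] * pad)
    return padded, mask

batch, mask = pad_graph_batch([[1.0, 2.0], [3.0], [4.0, 5.0, 6.0]])
# batch → [[1.0, 2.0, 0.0], [3.0, 0.0, 0.0], [4.0, 5.0, 6.0]]
# mask  → [[1, 1, 0],       [1, 0, 0],       [1, 1, 1]]
```

This also shows why the BoW- and nodes-based models train faster: their fixed-size inputs need no per-batch padding, so no compute is spent on padded entries.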

wikigraphs/setup.py

+1-1
@@ -33,7 +33,7 @@
 
 setup(
     name='wikigraphs',
-    version='0.0.2',
+    version='0.1.0',
     description='A Wikipedia - knowledge graph paired dataset.',
     url='https://github.com/deepmind/deepmind-research/tree/master/wikigraphs',
    author='DeepMind',
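The bump from 0.0.2 to 0.1.0 is a minor-version step under the usual `major.minor.patch` ordering, so installers treat it as the newer release. A small illustrative helper (hypothetical, not part of the package; real tooling would use `packaging.version` per PEP 440):

```python
def version_key(version):
    """Map a 'major.minor.patch' string to a tuple that sorts numerically."""
    return tuple(int(part) for part in version.split("."))

# The bump in this commit: 0.1.0 sorts after 0.0.2.
assert version_key("0.1.0") > version_key("0.0.2")
```

Tuple comparison also avoids the classic string-comparison pitfall where "0.0.10" would sort before "0.0.2".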
