Commit

fix a bug
pengzhiliang committed Nov 22, 2021
1 parent 40780d6 commit 92c2327
Showing 2 changed files with 4 additions and 3 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -97,7 +97,7 @@ python run_mae_vis.py ${IMAGE_PATH} ${OUTPUT_DIR} ${MODEL_PATH}
 | vit-base | 400e | 100e | 83.1% | [pretrain](files/pretrain_base_0.75_400e.txt) [finetune](files/pretrain_base_0.75_400e_finetune_100e.txt)| [Google drive](https://drive.google.com/drive/folders/182F5SLwJnGVngkzguTelja4PztYLTXfa?usp=sharing) |
 | vit-large | 400e | 50e | 84.5% | [pretrain](files/pretrain_large_0.75_400e.txt) [finetune](files/pretrain_large_0.75_400e_finetune_50e.txt) | unavailable |
 
-Due to the limited gpus, it's really a chanllenge for us to pretrain with larger model or longer schedule mentioned in the paper. (the pretraining and end-to-end fine-tuning process of vit-large model are fininshed by [a enthusiastic handsome guy](https://github.com/sunsmarterjie) with many v100, but the weights are unavailable)
+Due to the limited gpus, it's really a chanllenge for us to pretrain with larger model or longer schedule mentioned in the paper. (the pretraining and end-to-end fine-tuning process of vit-large model are fininshed by [this enthusiastic handsome guy](https://github.com/sunsmarterjie) with many v100s, but the weights are unavailable)
 
 So if one can fininsh it, please feel free to report it in the issue or push a PR, thank you!

5 changes: 3 additions & 2 deletions modeling_finetune.py
@@ -274,9 +274,10 @@ def forward_features(self, x):
         for blk in self.blocks:
             x = blk(x)
 
+        x = self.norm(x)
         if self.fc_norm is not None:
-            t = x[:, 1:, :]
-            return self.fc_norm(t.mean(1))
+            # return self.fc_norm(x[:, 1:].mean(1))
+            return self.fc_norm(x.mean(1))
         else:
             return x[:, 0]
 
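The modeling_finetune.py change replaces mean pooling over the patch tokens only (`x[:, 1:, :].mean(1)`) with mean pooling over all tokens, cls token included (`x.mean(1)`), before `fc_norm`. A minimal NumPy sketch of the difference between the two pooling variants (the tensor shapes and variable names here are illustrative, not the repository's actual code, which operates on PyTorch tensors):

```python
import numpy as np

# Hypothetical token tensor: batch of 2, 1 cls token + 4 patch tokens, dim 3.
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 5, 3))

# Old behaviour (the deleted lines): average the patch tokens, skipping cls.
pooled_patches = x[:, 1:, :].mean(axis=1)

# New behaviour (the added line): average over all tokens, cls included.
pooled_all = x.mean(axis=1)

# Both produce one pooled vector per example, but the values differ
# because the cls token now contributes to the mean.
print(pooled_patches.shape, pooled_all.shape)  # (2, 3) (2, 3)
```

Either pooled vector would then be passed through `fc_norm` and the classification head; the commit only changes which tokens enter the average.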
