Official PyTorch inference code of our CVPR 2024 paper, "3D Human Pose Perception from Egocentric Stereo Videos".
For any questions, please contact the first author, Hiroyasu Akada [hakada@mpi-inf.mpg.de].
[Project Page] [Benchmark Challenge]
@inproceedings{hakada2024unrealego2,
title = {3D Human Pose Perception from Egocentric Stereo Videos},
author = {Akada, Hiroyasu and Wang, Jian and Golyanik, Vladislav and Theobalt, Christian},
booktitle = {Computer Vision and Pattern Recognition (CVPR)},
year = {2024}
}
You can download the UnrealEgo2/UnrealEgo-RW datasets on our benchmark challenge page.
Note that UnrealEgo2 is fully compatible with UnrealEgo. The test data of UnrealEgo is publicly available, including 72 joint annotations (32 body joints and 40 hand joints), whereas the test data of UnrealEgo2 and UnrealEgo-RW is not (see the "Evaluation" section of our benchmark challenge page).
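As a purely illustrative aside on the joint layout above, the sketch below splits a single array of 72 3D joints into its 32 body joints and 40 hand joints. The file name, the `.npy` format, the (72, 3) shape, and the body-before-hands ordering are assumptions made for the example, not the documented annotation layout.

```python
import numpy as np

NUM_BODY_JOINTS = 32   # per the annotation counts stated above
NUM_HAND_JOINTS = 40   # 32 body + 40 hand = 72 joints in total

# Hypothetical file name, storage format, and shape; the real annotation
# files may be organized differently.
joints = np.load("example_pose_gt.npy")
assert joints.shape == (NUM_BODY_JOINTS + NUM_HAND_JOINTS, 3)

body_joints = joints[:NUM_BODY_JOINTS]   # assumes body joints come first,
hand_joints = joints[NUM_BODY_JOINTS:]   # followed by the hand joints
print(body_joints.shape, hand_joints.shape)   # (32, 3) (40, 3)
```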
You can download the depth data from SfM/Metashape described in our paper:
- Depth from the UnrealEgo2 test split: `bash download_unrealego2_test_sfm.sh`
- Depth from the UnrealEgo-RW test split: `bash download_unrealego_rw_test_sfm.sh`
Note that these depth data differ from the synthetic pixel-perfect depth maps available on our benchmark challenge page.
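If you want to sanity-check the downloaded depth maps before running inference, a minimal sketch is given below. The file path and the `.npy` format are assumptions made for illustration; inspect the downloaded archive for the actual layout.

```python
import numpy as np

# Hypothetical path and file format; check the downloaded archive for the real layout.
depth = np.load("UnrealEgoData2_test_sfm/example_depth.npy")

# SfM/Metashape depth is noisier and sparser than the synthetic pixel-perfect maps,
# so it is worth checking for invalid pixels (e.g., zeros or NaNs) before use.
valid = np.isfinite(depth) & (depth > 0)
print("shape:", depth.shape)
print("valid pixels: {:.1f}%".format(100.0 * valid.mean()))
print("depth range: [{:.3f}, {:.3f}]".format(depth[valid].min(), depth[valid].max()))
```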
We tested our code with the following dependencies:
- Python 3.9
- Ubuntu 18.04
- PyTorch 2.0.0
- CUDA 11.7
Please install other dependencies:
pip install -r requirements.txt
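As a quick sanity check of the environment, the snippet below prints the installed PyTorch and CUDA versions (the versions listed above are simply what we tested with; other combinations may also work):

```python
import sys
import torch

print("Python :", sys.version.split()[0])       # tested with 3.9
print("PyTorch:", torch.__version__)            # tested with 2.0.0
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("CUDA version:", torch.version.cuda)  # tested with 11.7
    print("GPU:", torch.cuda.get_device_name(0))
```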
Our inference code automatically generates submittable zip files of the predictions, as described on our benchmark challenge page.
You can download our trained models. Please save them in `./log/(experiment_name)`.
To run inference on the UnrealEgo2 test split:
```bash
bash scripts/test/unrealego2_pose-qa-avg-df_data-ue2_seq5_skip3_B32_lr2-4_pred-seq_local-device_pad.sh \
    --data_dir [path to the `UnrealEgoData2_test_rgb` dir] \
    --metadata_dir [path to the `UnrealEgoData2_test_sfm` dir]
```
Please modify the arguments above. The pose predictions will be saved in `./results/UnrealEgoData2_test_pose` (raw and zip versions).
To run inference on the UnrealEgo-RW test split, two trained models are available:
- Model without pre-training on UnrealEgo2:
```bash
bash scripts/test/unrealego2_pose-qa-avg-df_data-ue-rw_seq5_skip3_B32_lr2-4_pred-seq_local-device_pad.sh \
    --data_dir [path to the `UnrealEgoData_rw_test_rgb` dir] \
    --metadata_dir [path to the `UnrealEgoData_rw_test_sfm` dir]
```
- Model with pre-training on UnrealEgo2:
```bash
bash scripts/test/unrealego2_pose-qa-avg-df_data-ue2_seq5_skip3_B32_lr2-4_pred-seq_local-device_pad_finetuning_epoch5-5.sh \
    --data_dir [path to the `UnrealEgoData_rw_test_rgb` dir] \
    --metadata_dir [path to the `UnrealEgoData_rw_test_sfm` dir]
```
Please modify the arguments above. The pose predictions will be saved in `./results/UnrealEgoData_rw_test_pose` (raw and zip versions).
For quantitative results of your method, please follow the instructions on our benchmark challenge page and submit the zip version of the predictions.
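The inference scripts above already produce the submittable zip automatically; if you ever need to repack the raw predictions yourself, a minimal sketch using Python's standard `zipfile` module is shown below. The directory name mirrors the UnrealEgo2 output path above, but the required internal layout of the archive is defined on the benchmark challenge page, not by this sketch.

```python
import zipfile
from pathlib import Path

# Illustrative repacking of the raw predictions into a zip archive.
results_dir = Path("./results/UnrealEgoData2_test_pose")
archive_path = results_dir.with_suffix(".zip")

with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
    for file in sorted(results_dir.rglob("*")):
        if file.is_file():
            # Store paths relative to the results directory.
            zf.write(file, file.relative_to(results_dir))

print("Wrote", archive_path)
```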
Also, note that since UnrealEgo2 is fully compatible with UnrealEgo, you can train your method on UnrealEgo2 and test it on UnrealEgo, and vice versa (see the note on test data availability above).