
AffordPose: A Large-scale Dataset of Hand-Object Interactions with Affordance-driven Hand Pose

ICCV, 2023
Juntao Jian1 · Xiuping Liu1 · Manyi Li2,* · Ruizhen Hu3 · Jian Liu4,*

1 Dalian University of Technology    2 Shandong University
3 Shenzhen University    4 Tsinghua University
* Corresponding author

Paper PDF · ArXiv · Project Page · YouTube Video



Download Datasets

  1. Download the AffordPose dataset from the AffordPose Project Page. You can download specific categories or all of the data, according to your needs. The data are saved under the path AffordPose/Object_class/Object_id/affordance/xxx.json, which looks like:

     .
     └── AffordPose
         ├── bottle
         │   ├── 3415
         │   │   ├── 3415_Twist
         │   │   │   ├── 1.json
         │   │   │   ├── ...
         │   │   │   └── 28.json
         │   │   │
         │   │   └── 3415_Wrap-grasp
         │   │       ├── 1.json
         │   │       ├── ...
         │   │       └── 28.json
         │   │
         │   └── ...
         │
         └── ...
    
  2. The structure of each xxx.json file is as follows (see the loading sketch after this list):

     .
     └── xxx.json
         ├── rhand_mesh            # the hand mesh
         ├── dofs                  # the joint configurations of the hand
         ├── rhand_trans           # the translation of the palm
         ├── rhand_quat            # the rotation of the palm
         ├── object_mesh           # the object mesh; its vertices are annotated with affordance labels
         ├── trans_obj             # with the default value: (0, 0, 0)
         ├── quat_obj              # with the default value: (1, 0, 0, 0)
         ├── afford_name           # the object affordance corresponding to the interaction
         └── class_name            # the object class
    
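As a quick check after downloading, the sketch below walks the extracted directory layout from step 1, loads one interaction file, and prints the annotation fields from step 2. It is a minimal sketch using only the Python standard library; the field names come from the structure above, but the exact value formats are not documented here, so treat the printout as exploratory.

     import glob
     import json
     import os

     # Hypothetical root of the extracted dataset; adjust to your download location.
     root = "AffordPose"

     # Step 1: collect all interaction files, following the layout
     # AffordPose/Object_class/Object_id/affordance/xxx.json
     json_paths = sorted(glob.glob(os.path.join(root, "*", "*", "*", "*.json")))
     print(f"found {len(json_paths)} interaction files")

     # Step 2: load one file and inspect its annotation fields.
     with open(json_paths[0]) as f:
         data = json.load(f)

     print("class_name: ", data["class_name"])    # object class, e.g. "bottle"
     print("afford_name:", data["afford_name"])   # affordance, e.g. "Twist"
     print("trans_obj:  ", data["trans_obj"])     # default (0, 0, 0)
     print("quat_obj:   ", data["quat_obj"])      # default (1, 0, 0, 0)
     print("rhand_trans:", data["rhand_trans"])   # translation of the palm
     print("rhand_quat: ", data["rhand_quat"])    # rotation of the palm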

Data visualization

  • If you want to visualize the hand mesh, a feasible way is to save the value of "rhand_mesh" from the xxx.json as a xxx.obj file and visualize it in MeshLab; the same applies to the object mesh (a dumping sketch follows this list).

  • The hand model we use follows the ObMan dataset, which ports the MANO hand model to the GraspIt! simulator.

  • We used GraspIt! to collect the xxx.xml data and ran ManoHand_xml2mesh.py to obtain the hand mesh in 'mm'. Please note that you cannot obtain a correct hand mesh in 'm' by simply changing the 'scale' parameter in this Python file.

    $ python ./ManoHand_xml2mesh.py --xml_path PATH_TO_DATA.xml --mesh_path PATH_TO_SAVE_DATA.obj --part_path DIRPATH_TO_SAVE_HAND_PARTS
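
For the MeshLab route in the first bullet above, a minimal dumping sketch is shown below. It assumes the "rhand_mesh" and "object_mesh" values hold OBJ-format text, as that bullet suggests; the input and output file names are placeholders. If the meshes are instead stored as vertex/face arrays, you would need to serialize them into OBJ lines yourself.

     import json

     # Load one interaction file (placeholder name).
     with open("1.json") as f:
         data = json.load(f)

     # Dump the hand and object meshes to .obj files for MeshLab.
     for key, out_path in [("rhand_mesh", "hand.obj"), ("object_mesh", "object.obj")]:
         with open(out_path, "w") as obj_file:
             obj_file.write(data[key])
         print(f"wrote {out_path}")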

Citation

If you find the AffordPose dataset useful for your research, please consider citing us:

@InProceedings{Jian_2023_ICCV,
  author    = {Jian, Juntao and Liu, Xiuping and Li, Manyi and Hu, Ruizhen and Liu, Jian},
  title     = {AffordPose: A Large-Scale Dataset of Hand-Object Interactions with Affordance-Driven Hand Pose},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  month     = {October},
  year      = {2023},
  pages     = {14713-14724}
}
