TIN: Transferable Interactiveness Network

Code for our CVPR2019 paper "Transferable Interactiveness Knowledge for Human-Object Interaction Detection".

Created by Yong-Lu Li, Siyuan Zhou, Xijie Huang, Liang Xu, Ze Ma, Hao-Shu Fang, Yan-Feng Wang, Cewu Lu.

Link: [arXiv]

Citation

If you find our work useful in your research, please consider citing:

@article{li2018transferable,
  title={Transferable Interactiveness Prior for Human-Object Interaction Detection},
  author={Li, Yong-Lu and Zhou, Siyuan and Huang, Xijie and Xu, Liang and Ma, Ze and Fang, Hao-Shu and Wang, Yan-Feng and Lu, Cewu},
  journal={arXiv preprint arXiv:1811.08264},
  year={2018}
}

Introduction

Interactiveness knowledge indicates whether a human and an object interact with each other. It can be learned across HOI datasets, regardless of HOI category settings. We exploit an interactiveness network to learn general interactiveness knowledge from multiple HOI datasets and perform Non-Interaction Suppression before HOI classification at inference time. Because interactiveness generalizes well, our TIN (Transferable Interactiveness Network) is a transferable knowledge learner that can be combined with any HOI detection model to achieve desirable results. TIN outperforms state-of-the-art HOI detection methods by a large margin, verifying its efficacy and flexibility.
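The Non-Interaction Suppression step can be sketched as follows. This is a minimal illustration, not the repository's actual implementation: the threshold value and the multiplicative score-fusion form are assumptions made for the example.

```python
import numpy as np

def non_interaction_suppression(hoi_scores, interactiveness, threshold=0.1):
    """Suppress HOI classification scores for human-object pairs that the
    interactiveness network deems non-interactive.

    hoi_scores:      (num_pairs, num_hoi_classes) raw HOI classification scores
    interactiveness: (num_pairs,) predicted probability that each pair interacts
    threshold:       pairs below this interactiveness are suppressed
                     (the value 0.1 is illustrative, not from the paper)
    """
    keep = interactiveness >= threshold
    # Fuse the interactiveness score into every HOI class score of each pair,
    # then zero out pairs judged non-interactive.
    fused = hoi_scores * interactiveness[:, None]
    fused[~keep] = 0.0
    return fused
```

A pair with high interactiveness keeps (re-weighted) HOI scores, while a clearly non-interactive pair contributes nothing to the final detections.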

Overview of Our Framework

Results on HICO-DET and V-COCO

Our Results on HICO-DET dataset

| Method | Full (def) | Rare (def) | Non-Rare (def) | Full (ko) | Rare (ko) | Non-Rare (ko) |
| :-- | :-: | :-: | :-: | :-: | :-: | :-: |
| RCD | 13.75 | 10.23 | 15.45 | 15.34 | 10.98 | 17.02 |
| RPDCD | 17.03 | 13.42 | 18.11 | 19.17 | 15.51 | 20.26 |
| RCT | 10.61 | 7.78 | 11.45 | 12.47 | 8.87 | 13.54 |
| RPT1CD | 16.91 | 13.32 | 17.99 | 19.05 | 15.22 | 20.19 |
| RPT2CD | 17.22 | 13.51 | 18.32 | 19.38 | 15.38 | 20.57 |
| RPT2CD (optimized) | 17.54 | 13.80 | 18.65 | 19.75 | 15.70 | 20.96 |

Our Results on V-COCO dataset

| Method | Full (def) |
| :-- | :-: |
| RCD | 43.2 |
| RPDCD | 47.8 |
| RCT | 38.5 |
| RPT1CD | 48.3 |
| RPT2CD | 49.0 |

Please note that we have reimplemented and refined our code, so the results here are better than those in our paper [arXiv].

You may also be interested in our new work HAKE [website]. HAKE is a new large-scale knowledge base and engine for human activity understanding. It provides elaborate and abundant body part state labels for active human instances in a large number of images and videos. With HAKE, we boost HOI recognition performance on HICO and other widely used human activity benchmarks. We are still enlarging and enriching it, and look forward to working with outstanding researchers around the world on its applications and further improvements. If you have any advice or interest, please feel free to contact Yong-Lu Li ([email protected]).

Getting Started

Installation

1. Clone this repository.

git clone https://github.com/DirtyHarryLYL/Transferable-Interactiveness-Network.git

2. Download the dataset and set up the evaluation code and API. (The detection results (person and object bounding boxes) are collected from iCAN: Instance-Centric Attention Network for Human-Object Interaction Detection [website].)

chmod +x ./script/Dataset_download.sh 
./script/Dataset_download.sh

3. Install Python dependencies.

pip install -r requirements.txt

If you have trouble installing the requirements, try updating pip or using conda/virtualenv.

4. Download our pre-trained weights (optional).

python script/Download_data.py 1f_w7HQxTfXGxOPrkriu7jTyCTC-KPEH3 Weights/TIN_HICO.zip
python script/Download_data.py 1iU9dN9rLtekcHX2MT_zU_df3Yf0paL9s Weights/TIN_VCOCO.zip

Training

1. Train on the HICO-DET dataset

python tools/Train_TIN_HICO.py --num_iteration 2000000 --model TIN_HICO_test

2. Train on the V-COCO dataset

python tools/Train_TIN_VCOCO.py --num_iteration 20000 --model TIN_VCOCO_test

Testing

1. Test on the HICO-DET dataset

python tools/Test_TIN_HICO.py --num_iteration 1700000 --model TIN_HICO

2. Test on the V-COCO dataset

python tools/Test_TIN_VCOCO.py --num_iteration 6000 --model TIN_VCOCO

Acknowledgement

Some of the code is built upon iCAN: Instance-Centric Attention Network for Human-Object Interaction Detection [website]. Thanks for their great work! The pose estimation results are obtained from AlphaPose, an accurate multi-person pose estimator that is the first real-time open-source system to achieve 70+ mAP (72.3 mAP) on the COCO dataset and 80+ mAP (82.1 mAP) on the MPII dataset. You may also use your own pose estimation results to train the interactiveness predictor: directly download the train and test pkl files from iCAN [website] and insert your pose results.
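Inserting your own pose results into the downloaded detection pkl files might look like the sketch below. The pkl layout (a dict mapping image id to a list of per-detection dicts) and the `pose` key name are assumptions for illustration; inspect the actual iCAN files before adapting this.

```python
import pickle

def insert_pose_results(detection_pkl, pose_lookup, out_pkl, pose_key="pose"):
    """Attach pose-estimation results to per-image detection entries.

    detection_pkl: path to an iCAN-style detection pickle
                   (assumed layout: dict of image id -> list of detection dicts)
    pose_lookup:   dict of image id -> pose result to attach
                   (e.g. keypoints from AlphaPose or your own estimator)
    out_pkl:       path to write the augmented pickle
    """
    with open(detection_pkl, "rb") as f:
        detections = pickle.load(f)
    for image_id, entries in detections.items():
        if image_id in pose_lookup:
            # Attach the same per-image pose result to every detection entry;
            # a real pipeline would match poses to individual human boxes.
            for entry in entries:
                entry[pose_key] = pose_lookup[image_id]
    with open(out_pkl, "wb") as f:
        pickle.dump(detections, f)
```

The existing detection fields are left untouched, so the augmented pkl stays usable by code that expects the original keys.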

If you run into any problems or find any bugs, don't hesitate to open an issue on GitHub or make a pull request!

TIN (Transferable Interactiveness Network) is freely available for non-commercial use and may be redistributed under these conditions. For commercial queries, please drop us an e-mail; we will send you the detailed agreement.
