The official implementation of the CVPR 2022 paper MixFormer: End-to-End Tracking with Iterative Mixed Attention
[Models and Raw results] (Google Drive) [Models and Raw results] (Baidu Drive, extraction code: hmuv)
MixFormer is composed of a backbone based on the target-search mixed attention module (MAM) and a simple corner head,
yielding a compact tracking pipeline without an explicit integration module.
MixFormer is an end-to-end tracking framework without post-processing. Compared with other transformer trackers, MixFormer
does not use positional embeddings, attention masks, or a multi-layer feature aggregation strategy.
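The core idea of mixed attention is to concatenate target and search tokens and let a single attention operation perform feature extraction and target-search interaction jointly. The following is a minimal NumPy sketch of that idea, not the repo's implementation: the learned Q/K/V projections of a real MAM are replaced by identity mappings for brevity.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def mixed_attention(target_tokens, search_tokens, d_k):
    """Illustrative joint attention over concatenated target+search tokens.

    target_tokens: (Nt, d) array, search_tokens: (Ns, d) array.
    Returns mixed features of shape (Nt+Ns, d).
    """
    # Concatenating both streams lets every token attend to every other,
    # mixing feature extraction with target-search information flow.
    tokens = np.concatenate([target_tokens, search_tokens], axis=0)
    # Identity projections stand in for learned Q/K/V weight matrices.
    q, k, v = tokens, tokens, tokens
    attn = softmax(q @ k.T / np.sqrt(d_k))  # (Nt+Ns, Nt+Ns) joint attention map
    return attn @ v
```

In the actual MAM, asymmetric variants can prune the target-to-search attention to keep templates stable and reduce cost; this sketch keeps the fully symmetric form for clarity.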
Tracker | VOT2020 (EAO) | LaSOT (NP) | GOT-10K (AO) | TrackingNet (NP) |
---|---|---|---|---|
MixFormer | 0.555 | 79.9 | 70.7 | 88.9 |
ToMP101* (CVPR2022) | - | 79.2 | - | 86.4 |
SBT-large* (CVPR2022) | 0.529 | - | 70.4 | - |
SwinTrack* (Arxiv2021) | - | 78.6 | 69.4 | 88.2 |
Sim-L/14* (Arxiv2022) | - | 79.7 | 69.8 | 87.4 |
STARK (ICCV2021) | 0.505 | 77.0 | 68.8 | 86.9 |
KeepTrack (ICCV2021) | - | 77.2 | - | - |
TransT (CVPR2021) | 0.495 | 73.8 | 67.1 | 86.7 |
TrDiMP (CVPR2021) | - | - | 67.1 | 83.3 |
Siam R-CNN (CVPR2020) | - | 72.2 | 64.9 | 85.4 |
TREG (Arxiv2021) | - | 74.1 | 66.8 | 83.8 |
Set up the environment with Anaconda:
conda create -n mixformer python=3.6
conda activate mixformer
bash install_pytorch17.sh
Put the tracking datasets in ./data. It should look like:
${MixFormer_ROOT}
-- data
-- lasot
|-- airplane
|-- basketball
|-- bear
...
-- got10k
|-- test
|-- train
|-- val
-- coco
|-- annotations
|-- train2017
-- trackingnet
|-- TRAIN_0
|-- TRAIN_1
...
|-- TRAIN_11
|-- TEST
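A quick way to catch path mistakes before training is to verify that the expected top-level folders exist. This is a small helper written for this README's layout, not a script shipped with the repo:

```python
import os

# Expected dataset folders under ./data, taken from the tree above.
EXPECTED = {
    "lasot": [],
    "got10k": ["test", "train", "val"],
    "coco": ["annotations", "train2017"],
    "trackingnet": ["TEST"],
}

def missing_dataset_dirs(data_root):
    """Return the relative paths under data_root that are absent."""
    missing = []
    for dataset, subdirs in EXPECTED.items():
        for rel in [dataset] + [os.path.join(dataset, s) for s in subdirs]:
            if not os.path.isdir(os.path.join(data_root, rel)):
                missing.append(rel)
    return missing

if __name__ == "__main__":
    print(missing_dataset_dirs("./data") or "all dataset folders found")
```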
Run the following command to set the paths for this project:
python tracking/create_default_local_file.py --workspace_dir . --data_dir ./data --save_dir .
After running this command, you can also modify the paths by editing these two files:
lib/train/admin/local.py # paths about training
lib/test/evaluation/local.py # paths about testing
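In PyTracking-style codebases, these local files usually define an environment-settings class whose attributes point at the dataset and output directories. The sketch below is hypothetical: the attribute names are illustrative, so check the file generated by `create_default_local_file.py` for the real ones.

```python
# Hypothetical sketch of lib/train/admin/local.py after path generation.
# Attribute names below are assumptions, not verified against the repo.
class EnvironmentSettings:
    def __init__(self):
        self.workspace_dir = '.'                     # checkpoints and logs are saved here
        self.lasot_dir = './data/lasot'              # LaSOT training split
        self.got10k_dir = './data/got10k/train'      # GOT-10k training split
        self.coco_dir = './data/coco'                # COCO images + annotations
        self.trackingnet_dir = './data/trackingnet'  # TrackingNet chunks
```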
Training uses multiple GPUs with DDP. More details of
the other training settings can be found in tracking/train_mixformer.sh.
# MixFormer
bash tracking/train_mixformer.sh
The test settings can be found in tracking/test_mixformer.sh. Run:
bash tracking/test_mixformer.sh
Before evaluating on VOT2020, configure the tracker by editing trackers.ini under
external/vot20/. Then run:
cd external/vot20/<workspace_dir>
vot evaluate --workspace . MixFormerPython
# generating analysis results
vot analysis --workspace . --nocache
# run the demo on a video
bash tracking/run_video_demo.sh
# profile the model
bash tracking/profile_mixformer.sh
# visualize the mixed attention maps
bash tracking/vis_mixformer_attn.sh
The trained models and the raw tracking results are provided in the [Models and Raw results] (Google Drive) or
[Models and Raw results] (Baidu Drive, extraction code: hmuv).
Yutao Cui: cuiyutao@smail.nju.edu.cn
Cheng Jiang: mg1933027@smail.nju.edu.cn
If you find this project helpful, please feel free to leave a star ⭐️ and cite our paper:
@inproceedings{cui2022mixformer,
title={Mixformer: End-to-end tracking with iterative mixed attention},
author={Cui, Yutao and Jiang, Cheng and Wang, Limin and Wu, Gangshan},
booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
pages={13608--13618},
year={2022}
}
@misc{cui2023mixformer,
title={MixFormer: End-to-End Tracking with Iterative Mixed Attention},
author={Yutao Cui and Cheng Jiang and Gangshan Wu and Limin Wang},
year={2023},
eprint={2302.02814},
archivePrefix={arXiv}
}