This repository contains the official PyTorch implementation of the training & evaluation code for ColonFormer.
Create a conda environment:

conda create -n ColonFormer

We use CUDA 11.1 and PyTorch 1.7.1. Install the remaining dependencies:

pip install -r requirements.txt
Downloading necessary data:

For Experiment 1 in our paper:

- Download the testing dataset and move it into ./data/TestDataset/, which can be found in this download link (Google Drive).
- Download the training dataset and move it into ./data/TrainDataset/, which can be found in this download link (Google Drive).

For Experiment 2 and Experiment 3:
Download MiT's pretrained weights on ImageNet-1K, and put them in a folder pretrained/.
Configure the hyper-parameters and run train.py for training. For example:
python train.py --backbone b3 --train_path ./data/TrainDataset --train_save ColonFormerB3
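As a rough illustration of the command-line flags shown above, a minimal argument parser could look like the sketch below. This is a hypothetical reconstruction, not the actual code in train.py, which may define additional options:

```python
import argparse


def build_parser():
    # Hypothetical sketch of the CLI flags shown above; the real
    # train.py in the repository may differ.
    parser = argparse.ArgumentParser(description="Train ColonFormer")
    parser.add_argument("--backbone", type=str, default="b3",
                        help="MiT backbone variant, e.g. b1/b2/b3")
    parser.add_argument("--train_path", type=str,
                        default="./data/TrainDataset",
                        help="folder with training images and masks")
    parser.add_argument("--train_save", type=str, default="ColonFormerB3",
                        help="subfolder under ./snapshots/ for checkpoints")
    return parser


if __name__ == "__main__":
    args = build_parser().parse_args(
        ["--backbone", "b3",
         "--train_path", "./data/TrainDataset",
         "--train_save", "ColonFormerB3"])
    print(args.backbone, args.train_save)
```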
Here is an example in Google Colab.
For evaluation, specify your backbone version, the weight path, and the dataset, then run test.py. For example:
python test.py --backbone b3 --weight ./snapshots/ColonFormerB3/last.pth --test_path ./data/TestDataset
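Polyp segmentation models like ColonFormer are commonly scored with overlap metrics such as the Dice coefficient. The helper below is a generic, dependency-free sketch for binary masks flattened to 0/1 sequences; it is an illustration of the metric, not the evaluation code shipped in test.py:

```python
def dice_score(pred, target, eps=1e-8):
    """Dice coefficient for two binary masks given as flat 0/1 sequences.

    Generic illustration only; the repository's test.py may compute
    its metrics differently.
    """
    assert len(pred) == len(target), "masks must have the same size"
    inter = sum(p * t for p, t in zip(pred, target))  # overlapping pixels
    total = sum(pred) + sum(target)                   # pixels in each mask
    return (2.0 * inter + eps) / (total + eps)


# Example: two 2x2 masks that overlap in a single pixel.
pred = [1, 1, 0, 0]
target = [0, 1, 1, 0]
print(round(dice_score(pred, target), 3))  # 0.5
```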
We provide some pretrained weights in case you need them.