To expedite the training process, we'll begin by pretraining the DETR component of the model. Pretraining the DETR model on a specific dataset (e.g., DanceTrack or SportsMOT) is typically fast, taking approximately a few hours.
💾 Similar to many other methods (e.g., MOTR and MeMOTR), we also use COCO pretrained DETR weights for initialization. You can obtain them from the following link:
- Deformable DETR: [official repo] [our drive]
- DAB-Deformable DETR: [official repo]
All our pre-training scripts follow the template below. You'll need to fill in the `<placeholders>` according to your requirements:
```bash
python -m torch.distributed.run --nproc_per_node=8 main.py --mode train --use-distributed True --use-wandb False --config-path <config file path> --data-root <DATADIR> --outputs-dir <outputs dir>
```
For example, you can pre-train a Deformable-DETR model on DanceTrack as follows:
```bash
python -m torch.distributed.run --nproc_per_node=8 main.py --mode train --use-distributed True --use-wandb False --config-path ./configs/pretrain_r50_deformable_detr_dancetrack.yaml --data-root ./datasets/ --outputs-dir ./outputs/pretrain_r50_deformable_detr_dancetrack/
```
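If you only have a single GPU, a non-distributed run might look like the sketch below. This is an assumption based on the flags shown above (`--use-distributed`, `--config-path`, etc.), not a documented invocation, so adjust it to your environment:

```bash
# Hypothetical single-GPU pre-training run (sketch; assumes the same CLI flags
# accept --use-distributed False when launched without torch.distributed.run):
python main.py --mode train --use-distributed False --use-wandb False \
  --config-path ./configs/pretrain_r50_deformable_detr_dancetrack.yaml \
  --data-root ./datasets/ \
  --outputs-dir ./outputs/pretrain_r50_deformable_detr_dancetrack/
```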
Please refer to here for more information.
💾 You can directly download the pretrained DETR weights we used in our experiments from Google Drive ☁️, then put them into the ./pretrains/ directory.
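For reference, a minimal setup might look like the sketch below; the checkpoint filename is illustrative only and depends on which weights you download:

```bash
# Create the pretrains directory and place the downloaded COCO-pretrained
# DETR checkpoint inside it (the filename below is hypothetical):
mkdir -p ./pretrains/
# ./pretrains/
# └── r50_deformable_detr_coco.pth   # example name; use the actual downloaded file
```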