## Description
This is an implementation of DiffusionDet based on MMDetection, MMCV, and MMEngine.

## Usage

### Comparison of results
1. Download the DiffusionDet released model.

2. Convert the model from the DiffusionDet version to the MMDetection version. We provide a sample script to convert the DiffusionDet ResNet-50 model; users can download the corresponding models from the official model zoo. (A hypothetical sketch of what such a conversion involves is given below, after the note.)

   ```shell
   python projects/DiffusionDet/model_converters/diffusiondet_resnet_to_mmdet.py ${DiffusionDet ckpt path} ${MMDetection ckpt path}
   ```

3. Test the converted model in MMDetection.

   ```shell
   python tools/test.py projects/DiffusionDet/configs/diffusiondet_r50_fpn_500-proposals_1-step_crop-ms-480-800-450k_coco.py ${CHECKPOINT_PATH}
   ```
**Note**: At inference time, DiffusionDet randomly generates noisy boxes, which may affect the AP results. If users want to get the same result on every run, fixing the random seed is a good way (see the `seed=0` snippet below the comparison table).
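For readers curious about what the conversion in step 2 involves, the sketch below shows the general pattern of re-keying a checkpoint: load the released weights, rename parameters to the names MMDetection expects, and save an MMDetection-style checkpoint. This is only an illustration under assumed conventions; the `'model'` nesting and the `KEY_MAPPING` entries are hypothetical placeholders, and the authoritative mapping lives in `projects/DiffusionDet/model_converters/diffusiondet_resnet_to_mmdet.py`.

```python
# Hypothetical illustration of a checkpoint converter. The real key mapping is
# defined in projects/DiffusionDet/model_converters/diffusiondet_resnet_to_mmdet.py;
# the prefixes below are placeholders, not the actual DiffusionDet names.
import argparse

import torch

# Assumed example rename (hypothetical): source prefix -> MMDetection-style prefix.
KEY_MAPPING = {
    'backbone.bottom_up.': 'backbone.',
}


def convert(src_path: str, dst_path: str) -> None:
    ckpt = torch.load(src_path, map_location='cpu')
    # Assumption: the released checkpoint may nest its weights under 'model'.
    src_state = ckpt.get('model', ckpt)

    dst_state = {}
    for name, tensor in src_state.items():
        new_name = name
        for old_prefix, new_prefix in KEY_MAPPING.items():
            if new_name.startswith(old_prefix):
                new_name = new_prefix + new_name[len(old_prefix):]
        dst_state[new_name] = tensor

    # MMDetection checkpoints keep the weights under 'state_dict'.
    torch.save({'state_dict': dst_state}, dst_path)


if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('src', help='path to the released DiffusionDet checkpoint')
    parser.add_argument('dst', help='path for the converted MMDetection checkpoint')
    args = parser.parse_args()
    convert(args.src, args.dst)
```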
The table below compares the inference results of the ResNet-50 model with 500 proposals between DiffusionDet and MMDetection.
| Config | Step | AP |
| :--- | :---: | :---: |
| DiffusionDet (released results) | 1 | 45.5 |
| DiffusionDet (seed=0) | 1 | 45.66 |
| MMDetection (seed=0) | 1 | 45.7 |
| MMDetection (random seed) | 1 | 45.6~45.8 |
| DiffusionDet (released results) | 4 | 46.1 |
| DiffusionDet (seed=0) | 4 | 46.38 |
| MMDetection (seed=0) | 4 | 46.4 |
| MMDetection (random seed) | 4 | 46.2~46.4 |
`seed=0` means the seed is hard-set before generating the random boxes:

```python
# hard set seed=0 before generating random boxes
seed = 0
random.seed(seed)
torch.manual_seed(seed)
# torch.cuda.manual_seed(seed)
torch.cuda.manual_seed_all(seed)
...
noise_bboxes_raw = torch.randn(
    (self.num_proposals, 4), device=device)
...
```

`random seed` means the seed is not hard-set before generating the random boxes.
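If the goal is only run-to-run reproducibility rather than matching the exact `seed=0` rows above, one option (an assumption, not part of the original setup) is to set MMEngine's global randomness seed in the config. This seeds everything when the runner is built, which is not identical to hard-setting the seed immediately before box generation, so the resulting AP may differ slightly from the table.

```python
# Add to the config file (configs are Python modules read by MMEngine).
# Assumption: this seeds Python/NumPy/PyTorch globally at runner start-up,
# which is not the same as hard-setting the seed right before the noisy
# boxes are generated, so AP may not exactly match the seed=0 rows.
randomness = dict(seed=0)
```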
### Training commands
In MMDetection's root directory, run the following command to train the model:
```shell
python tools/train.py projects/DiffusionDet/configs/diffusiondet_r50_fpn_500-proposals_1-step_crop-ms-480-800-450k_coco.py
```
For multi-GPU training, run:

```shell
python -m torch.distributed.launch --nnodes=1 --node_rank=0 --nproc_per_node=${NUM_GPUS} --master_port=29506 --master_addr="127.0.0.1" tools/train.py projects/DiffusionDet/configs/diffusiondet_r50_fpn_500-proposals_1-step_crop-ms-480-800-450k_coco.py
```
### Testing commands
In MMDetection's root directory, run the following command to test the model:
```shell
# test command for 1-step inference
python tools/test.py projects/DiffusionDet/configs/diffusiondet_r50_fpn_500-proposals_1-step_crop-ms-480-800-450k_coco.py ${CHECKPOINT_PATH}

# test command for 4-step inference
python tools/test.py projects/DiffusionDet/configs/diffusiondet_r50_fpn_500-proposals_1-step_crop-ms-480-800-450k_coco.py ${CHECKPOINT_PATH} --cfg-options model.bbox_head.sampling_timesteps=4
```
**Note**: The number of sampling steps makes no difference at training time; the same trained model supports 1-step and multi-step inference. Users can choose the number of steps at inference time through `--cfg-options model.bbox_head.sampling_timesteps=${STEPS}`, but a larger `sampling_timesteps` increases the inference time.
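If you drive evaluation from Python instead of the command line, the same override can be applied to the loaded config. A minimal sketch using MMEngine's `Config` (runner construction and dataset setup are omitted):

```python
from mmengine.config import Config

# Load the DiffusionDet config shipped with this project.
cfg = Config.fromfile(
    'projects/DiffusionDet/configs/'
    'diffusiondet_r50_fpn_500-proposals_1-step_crop-ms-480-800-450k_coco.py')

# Equivalent to `--cfg-options model.bbox_head.sampling_timesteps=4`.
# More sampling steps can improve AP slightly but slow down inference.
cfg.model.bbox_head.sampling_timesteps = 4
```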
## Results
Here we provide the baseline version of DiffusionDet with a ResNet-50 backbone.
To find more variants, please visit the official model zoo.
| Backbone | Style | Lr schd | AP (Step=1) | AP (Step=4) | Config | Download |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| R-50 | PyTorch | 450k | 44.5 | 46.2 | config | model \| log |
## License
DiffusionDet is under the CC-BY-NC 4.0 license. Users should be careful about adopting these features for any commercial use.
## Citation
If you find DiffusionDet useful in your research or applications, please consider giving a star 🌟 to the official repository and citing DiffusionDet with the following BibTeX entry.
```latex
@article{chen2022diffusiondet,
  title={DiffusionDet: Diffusion Model for Object Detection},
  author={Chen, Shoufa and Sun, Peize and Song, Yibing and Luo, Ping},
  journal={arXiv preprint arXiv:2211.09788},
  year={2022}
}
```
## Checklist

- Milestone 1: PR-ready, and acceptable to be one of the `projects/`.
  - Finish the code
  - Basic docstrings & proper citation
  - Test-time correctness
  - A full README
- Milestone 2: Indicates a successful model implementation.
  - Training-time correctness
- Milestone 3: Good to be a part of our core package!
  - Type hints and docstrings
  - Unit tests
  - Code polishing
  - Metafile.yml
  - Move your modules into the core package following the codebase's file hierarchy structure.
  - Refactor your modules into the core package following the codebase's file hierarchy structure.