Image Classification
FBAGSTM committed on
Commit 5e0a031 · verified · 1 Parent(s): 95af511

Update ST Model Zoo

Files changed (1)
  1. README.md +25 -35
README.md CHANGED
@@ -1,9 +1,3 @@
- ---
- license: other
- license_name: sla0044
- license_link: >-
- https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/LICENSE.md
- ---
  # EfficientNet v2

  ## **Use case** : `Image classification`
@@ -11,7 +5,7 @@ license_link: >-
  # Model description


- EfficientNet v2 family is one of the best topology for image classification. It has been obtained through neural architecture search with a special care given to training time
  and number of parameters reduction.

  This family of networks comprises various subtypes: B0 (224x224), B1 (240x240), B2 (260x260), B3 (300x300), S (384x384) ranked by depth and width increasing order.
@@ -69,40 +63,37 @@ For an image resolution of NxM and P classes
  ## Metrics

  * Measures are done with default STM32Cube.AI configuration with enabled input / output allocated option.
-
  * `fft` stands for "full fine-tuning", meaning that the full model weights were initialized from a transfer learning pre-trained model, and all the layers were unfrozen during the training.

-
-
  ### Reference **NPU** memory footprint on food-101 and ImageNet dataset (see Accuracy for details on dataset)
  |Model | Dataset | Format | Resolution | Series | Internal RAM (KiB) | External RAM (KiB) | Weights Flash (KiB) | STM32Cube.AI version | STEdgeAI Core version |
- |----------|------------------|--------|-------------|------------------|------------------|---------------------|-------|----------------------|-------------------------|
- | [efficientnet_v2B0_224_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B0_224_fft/efficientnet_v2B0_224_fft_qdq_int8.onnx) | food-101 | Int8 | 224x224x3 | STM32N6 | 1834.44 |0.0| 7553.77 | 10.0.0 | 2.0.0 |
- | [efficientnet_v2B1_240_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B1_240_fft/efficientnet_v2B1_240_fft_qdq_int8.onnx) | food-101 | Int8 | 240x240x3 | STM32N6 | 2589.97 |0.0| 8924.78 | 10.0.0 | 2.0.0 |
- | [efficientnet_v2B2_260_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B2_260_fft/efficientnet_v2B2_260_fft_qdq_int8.onnx) | food-101 | Int8 | 260x260x3 | STM32N6 | 2629.56 |528.12| 11212.75| 10.0.0 | 2.0.0 |
- | [efficientnet_v2S_384_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2S_384_fft/efficientnet_v2S_384_fft_qdq_int8.onnx) | food-10 | Int8 | 384x384x3 | STM32N6 | 2700 | 6912 | 25756.92 | 10.0.0 | 2.0.0 |
- | [efficientnet_v2B0_224 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2B0_224/efficientnet_v2B0_224_qdq_int8.onnx) | ImageNet | Int8 | 224x224x3 | STM32N6 | 1834.44 | 0.0 | 8680.39 | 10.0.0 | 2.0.0 |
- | [efficientnet_v2B1_240 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2B1_240/efficientnet_v2B1_240_qdq_int8.onnx) | ImageNet | Int8 | 240x240x3 | STM32N6 | 2589.97 | 0.0 | 10051.7 | 10.0.0 | 2.0.0 |
- | [efficientnet_v2B2_260 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2B2_260/efficientnet_v2B2_260_qdq_int8.onnx) | ImageNet | Int8 | 260x260x3 | STM32N6 | 2629.56 | 528.12 | 12451.77 | 10.0.0 | 2.0.0 |
- | [efficientnet_v2S_384 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2S_384/efficientnet_v2S_384_qdq_int8.onnx) | ImageNet | Int8 | 384x384x3 | STM32N6 | 2700 | 6912 | 26884.47 | 10.0.0 | 2.0.0 |


  ### Reference **NPU** inference time on food-101 and ImageNet dataset (see Accuracy for details on dataset)
- | Model | Dataset | Format | Resolution | Board | Execution Engine | Inference time (ms) | Inf / sec | STM32Cube.AI version | STEdgeAI Core version |
- |--------|------------------|--------|-------------|------------------|------------------|---------------------|-------|----------------------|-------------------------|
- | [efficientnet_v2B0_224_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B0_224_fft/efficientnet_v2B0_224_fft_qdq_int8.onnx) | food-101 | Int8 | 224x224x3 | STM32N6570-DK | NPU/MCU | 54.32 | 18.41 | 10.0.0 | 2.0.0 |
- | [efficientnet_v2B1_240_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B1_240_fft/efficientnet_v2B1_240_fft_qdq_int8.onnx) | food-101 | Int8 | 240x240x3 | STM32N6570-DK | NPU/MCU | 73.89 | 13.53 | 10.0.0 | 2.0.0 |
- | [efficientnet_v2B2_260_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B2_260_fft/efficientnet_v2B2_260_fft_qdq_int8.onnx) | food-101 | Int8 | 260x260x3 | STM32N6570-DK | NPU/MCU | 146.01 | 6.85 | 10.0.0 | 2.0.0 |
- | [efficientnet_v2S_384_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2S_384_fft/efficientnet_v2S_384_fft_qdq_int8.onnx) | food-101 | Int8 | 384x384x3 | STM32N6570-DK | NPU/MCU | 842 | 1.19 | 10.0.0 | 2.0.0 |
- | [efficientnet_v2B0_224 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2B0_224/efficientnet_v2B0_224_qdq_int8.onnx) | ImageNet | Int8 | 224x224x3 | STM32N6570-DK | NPU/MCU | 57.5 | 17.39 | 10.0.0 | 2.0.0 |
- | [efficientnet_v2B1_240 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2B1_240/efficientnet_v2B1_240_qdq_int8.onnx) | ImageNet | Int8 | 240x240x3 | STM32N6570-DK | NPU/MCU | 77.25 | 12.94 | 10.0.0 | 2.0.0 |
- | [efficientnet_v2B2_260 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2B2_260/efficientnet_v2B2_260_qdq_int8.onnx) | ImageNet | Int8 | 260x260x3 | STM32N6570-DK | NPU/MCU | 148.78 | 6.72 | 10.0.0 | 2.0.0 |
- | [efficientnet_v2S_384 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2S_384/efficientnet_v2S_384_qdq_int8.onnx) | ImageNet | Int8 | 384x384x3 | STM32N6570-DK | NPU/MCU | 809.73 | 1.23 | 10.0.0 | 2.0.0 |

  * The deployment of all the models listed in the table is supported, except for the efficientnet_v2S_384 model, for which support is coming soon.

  ### Accuracy with Food-101 dataset

- Dataset details: [link](https://data.vision.ee.ethz.ch/cvl/datasets_extra/food-101/) , Quotation[[3]](#3) , Number of classes: 101 , Number of images: 101 000

  | Model | Format | Resolution | Top 1 Accuracy |
  |--------------------------------------------------------------------------------------------------------------------------------------------------|--------|-----------|----------------|
@@ -110,7 +101,7 @@ Dataset details: [link](https://data.vision.ee.ethz.ch/cvl/datasets_extra/food-1
  | [efficientnet_v2B0_224_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B0_224_fft/efficientnet_v2B0_224_fft_qdq_int8.onnx) | Int8 | 224x224x3 | 81.1 % |
  | [efficientnet_v2B1_240_fft](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B1_240_fft/efficientnet_v2B1_240_fft.h5) | Float | 240x240x3 | 83.23 % |
  | [efficientnet_v2B1_240_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B1_240_fft/efficientnet_v2B1_240_fft_qdq_int8.onnx) | Int8 | 240x240x3 | 82.95 % |
- | [efficientnet_v2B2_260_fft](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B2_260_fft/efficientnet_v2B2_260_fft.h5) | Float | 260x260x3 | 84.37 % |
  | [efficientnet_v2B2_260_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B2_260_fft/efficientnet_v2B2_260_fft_qdq_int8.onnx) | Int8 | 260x260x3 | 84.04 % |
  | [efficientnet_v2S_384_fft](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2S_384_fft/efficientnet_v2S_384_fft.h5) | Float | 384x384x3 | 88.16 % |
  | [efficientnet_v2S_384_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2S_384_fft/efficientnet_v2S_384_fft_qdq_int8.onnx) | Int8 | 384x384x3 | 87.34 % |
@@ -118,7 +109,7 @@ Dataset details: [link](https://data.vision.ee.ethz.ch/cvl/datasets_extra/food-1

  ### Accuracy with ImageNet

- Dataset details: [link](https://www.image-net.org), Quotation[[4]](#4)
  Number of classes: 1000.
  To perform the quantization, we calibrated the activations with a random subset of the training set.
  For the sake of simplicity, the accuracy reported here was estimated on the 10000 labelled images of the validation set.
@@ -131,7 +122,7 @@ For the sake of simplicity, the accuracy reported here was estimated on the 1000
  | [efficientnet_v2B1_240 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2B1_240/efficientnet_v2B1_240_qdq_int8.onnx) | Int8 | 240x240x3 | 75.5 % |
  | [efficientnet_v2B2_260](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2B2_260/efficientnet_v2B2_260.h5) | Float | 260x260x3 | 76.58 % |
  | [efficientnet_v2B2_260 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2B2_260/efficientnet_v2B2_260_qdq_int8.onnx) | Int8 | 260x260x3 | 76.26 % |
- | [efficientnet_v2S_384](./Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2S_384/efficientnet_v2S_384.h5) | Float | 384x384x3 | 83.52 % |
  | [efficientnet_v2S_384 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2S_384/efficientnet_v2S_384_qdq_int8.onnx) | Int8 | 384x384x3 | 83.07 % |
@@ -153,5 +144,4 @@ L. Bossard, M. Guillaumin, and L. Van Gool, "Food-101 -- Mining Discriminative C

  <a id="4">[4]</a>
  Olga Russakovsky*, Jia Deng*, Hao Su, Jonathan Krause, Sanjeev Satheesh, Sean Ma, Zhiheng Huang, Andrej Karpathy, Aditya Khosla, Michael Bernstein, Alexander C. Berg and Li Fei-Fei.
- (* = equal contribution) ImageNet Large Scale Visual Recognition Challenge.
-
 
  # EfficientNet v2

  ## **Use case** : `Image classification`

  # Model description


+ The EfficientNet v2 family is one of the best-performing topologies for image classification. It was obtained through neural architecture search, with special care given to reducing training time and the number of parameters.

  This family of networks comprises various subtypes: B0 (224x224), B1 (240x240), B2 (260x260), B3 (300x300), and S (384x384), ranked in increasing order of depth and width.
 
  ## Metrics

  * Measures are done with the default STM32Cube.AI configuration, with the input / output allocated option enabled.
  * `fft` stands for "full fine-tuning": the full model weights were initialized from a transfer-learning pre-trained model, and all layers were unfrozen during training.
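
The `fft` recipe above boils down to one step after loading pre-trained weights: unfreeze every layer. A minimal sketch of that freeze/unfreeze logic; the `Layer`/`Model` classes below are hypothetical stand-ins mimicking the Keras `layer.trainable` attribute, not actual Model Zoo code.

```python
# Minimal sketch of "full fine-tuning" (fft): weights start from a
# transfer-learning pre-trained model, then EVERY layer is unfrozen so
# all weights are updated during training. The classes are stand-ins
# mimicking the Keras `trainable` flag.

class Layer:
    def __init__(self, name):
        self.name = name
        self.trainable = False   # frozen, as in classic transfer learning

class Model:
    def __init__(self, layer_names):
        self.layers = [Layer(n) for n in layer_names]

model = Model(["stem", "block1", "block2", "head"])

# Plain transfer learning would unfreeze only the head;
# full fine-tuning unfreezes everything:
for layer in model.layers:
    layer.trainable = True
```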

  ### Reference **NPU** memory footprint on food-101 and ImageNet dataset (see Accuracy for details on dataset)

  |Model | Dataset | Format | Resolution | Series | Internal RAM (KiB) | External RAM (KiB) | Weights Flash (KiB) | STM32Cube.AI version | STEdgeAI Core version |
+ |----------|------------------|--------|-------------|------------------|------------------|---------------------|---------------------|----------------------|-------------------------|
+ | [efficientnet_v2B0_224_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B0_224_fft/efficientnet_v2B0_224_fft_qdq_int8.onnx) | food-101 | Int8 | 224x224x3 | STM32N6 | 1834.44 |0.0| 7552.02 | 10.2.0 | 2.2.0 |
+ | [efficientnet_v2B1_240_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B1_240_fft/efficientnet_v2B1_240_fft_qdq_int8.onnx) | food-101 | Int8 | 240x240x3 | STM32N6 | 2589.97 |0.0| 8332.27 | 10.2.0 | 2.2.0 |
+ | [efficientnet_v2B2_260_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B2_260_fft/efficientnet_v2B2_260_fft_qdq_int8.onnx) | food-101 | Int8 | 260x260x3 | STM32N6 | 2629.56 |528.12| 10525.95 | 10.2.0 | 2.2.0 |
+ | [efficientnet_v2S_384_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2S_384_fft/efficientnet_v2S_384_fft_qdq_int8.onnx) | food-101 | Int8 | 384x384x3 | STM32N6 | 2700 | 6912 | 24451.31 | 10.2.0 | 2.2.0 |
+ | [efficientnet_v2B0_224 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2B0_224/efficientnet_v2B0_224_qdq_int8.onnx) | ImageNet | Int8 | 224x224x3 | STM32N6 | 1834.44 | 0.0 | 8179.67 | 10.2.0 | 2.2.0 |
+ | [efficientnet_v2B1_240 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2B1_240/efficientnet_v2B1_240_qdq_int8.onnx) | ImageNet | Int8 | 240x240x3 | STM32N6 | 2589.97 | 0.0 | 9459.92 | 10.2.0 | 2.2.0 |
+ | [efficientnet_v2B2_260 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2B2_260/efficientnet_v2B2_260_qdq_int8.onnx) | ImageNet | Int8 | 260x260x3 | STM32N6 | 2629.56 | 528.12 | 11765.99 | 10.2.0 | 2.2.0 |
+ | [efficientnet_v2S_384 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2S_384/efficientnet_v2S_384_qdq_int8.onnx) | ImageNet | Int8 | 384x384x3 | STM32N6 | 2700 | 6912 | 25579.03 | 10.2.0 | 2.2.0 |


  ### Reference **NPU** inference time on food-101 and ImageNet dataset (see Accuracy for details on dataset)

+ | Model | Dataset | Format | Resolution | Board | Execution Engine | Inference time (ms) | Inf / sec | STM32Cube.AI version | STEdgeAI Core version |
+ |--------|------------------|--------|-------------|------------------|------------------|---------------------|-----------|----------------------|-------------------------|
+ | [efficientnet_v2B0_224_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B0_224_fft/efficientnet_v2B0_224_fft_qdq_int8.onnx) | food-101 | Int8 | 224x224x3 | STM32N6570-DK | NPU/MCU | 52.05 | 19.21 | 10.2.0 | 2.2.0 |
+ | [efficientnet_v2B1_240_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B1_240_fft/efficientnet_v2B1_240_fft_qdq_int8.onnx) | food-101 | Int8 | 240x240x3 | STM32N6570-DK | NPU/MCU | 70.91 | 14.1 | 10.2.0 | 2.2.0 |
+ | [efficientnet_v2B2_260_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B2_260_fft/efficientnet_v2B2_260_fft_qdq_int8.onnx) | food-101 | Int8 | 260x260x3 | STM32N6570-DK | NPU/MCU | 142.62 | 7.01 | 10.2.0 | 2.2.0 |
+ | [efficientnet_v2S_384_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2S_384_fft/efficientnet_v2S_384_fft_qdq_int8.onnx) | food-101 | Int8 | 384x384x3 | STM32N6570-DK | NPU/MCU | 816.34 | 1.22 | 10.2.0 | 2.2.0 |
+ | [efficientnet_v2B0_224 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2B0_224/efficientnet_v2B0_224_qdq_int8.onnx) | ImageNet | Int8 | 224x224x3 | STM32N6570-DK | NPU/MCU | 55.27 | 18.09 | 10.2.0 | 2.2.0 |
+ | [efficientnet_v2B1_240 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2B1_240/efficientnet_v2B1_240_qdq_int8.onnx) | ImageNet | Int8 | 240x240x3 | STM32N6570-DK | NPU/MCU | 74.48 | 13.34 | 10.2.0 | 2.2.0 |
+ | [efficientnet_v2B2_260 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2B2_260/efficientnet_v2B2_260_qdq_int8.onnx) | ImageNet | Int8 | 260x260x3 | STM32N6570-DK | NPU/MCU | 145.27 | 6.88 | 10.2.0 | 2.2.0 |
+ | [efficientnet_v2S_384 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2S_384/efficientnet_v2S_384_qdq_int8.onnx) | ImageNet | Int8 | 384x384x3 | STM32N6570-DK | NPU/MCU | 785.01 | 1.27 | 10.2.0 | 2.2.0 |
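
As a quick sanity check on the inference-time table above, the "Inf / sec" column is simply the reciprocal of the inference time in milliseconds (1000 / t). A minimal sketch, with times copied from the food-101 rows:

```python
def throughput(inference_time_ms: float) -> float:
    """Inferences per second from an inference time in milliseconds."""
    return 1000.0 / inference_time_ms

# Inference times (ms) copied from the food-101 rows of the table above.
times_ms = {"B0_224": 52.05, "B1_240": 70.91, "B2_260": 142.62, "S_384": 816.34}
for name, t in times_ms.items():
    print(f"{name}: {throughput(t):.2f} inf/sec")
```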

  * The deployment of all the models listed in the table is supported, except for the efficientnet_v2S_384 model, for which support is coming soon.

  ### Accuracy with Food-101 dataset

+ Dataset details: [link](https://data.vision.ee.ethz.ch/cvl/datasets_extra/food-101/), Quotation [[3]](#3). Number of classes: 101. Number of images: 101,000.

  | Model | Format | Resolution | Top 1 Accuracy |
  |--------------------------------------------------------------------------------------------------------------------------------------------------|--------|-----------|----------------|
 
  | [efficientnet_v2B0_224_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B0_224_fft/efficientnet_v2B0_224_fft_qdq_int8.onnx) | Int8 | 224x224x3 | 81.1 % |
  | [efficientnet_v2B1_240_fft](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B1_240_fft/efficientnet_v2B1_240_fft.h5) | Float | 240x240x3 | 83.23 % |
  | [efficientnet_v2B1_240_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B1_240_fft/efficientnet_v2B1_240_fft_qdq_int8.onnx) | Int8 | 240x240x3 | 82.95 % |
+ | [efficientnet_v2B2_260_fft](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B2_260_fft/efficientnet_v2B2_260_fft.h5) | Float | 260x260x3 | 84.35 % |
  | [efficientnet_v2B2_260_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2B2_260_fft/efficientnet_v2B2_260_fft_qdq_int8.onnx) | Int8 | 260x260x3 | 84.04 % |
  | [efficientnet_v2S_384_fft](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2S_384_fft/efficientnet_v2S_384_fft.h5) | Float | 384x384x3 | 88.16 % |
  | [efficientnet_v2S_384_fft onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/ST_pretrainedmodel_public_dataset/food-101/efficientnet_v2S_384_fft/efficientnet_v2S_384_fft_qdq_int8.onnx) | Int8 | 384x384x3 | 87.34 % |
 

  ### Accuracy with ImageNet

+ Dataset details: [link](https://www.image-net.org), Quotation [[4]](#4).
  Number of classes: 1000.
  To perform the quantization, we calibrated the activations with a random subset of the training set.
  For the sake of simplicity, the accuracy reported here was estimated on the 10,000 labelled images of the validation set.
 
  | [efficientnet_v2B1_240 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2B1_240/efficientnet_v2B1_240_qdq_int8.onnx) | Int8 | 240x240x3 | 75.5 % |
  | [efficientnet_v2B2_260](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2B2_260/efficientnet_v2B2_260.h5) | Float | 260x260x3 | 76.58 % |
  | [efficientnet_v2B2_260 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2B2_260/efficientnet_v2B2_260_qdq_int8.onnx) | Int8 | 260x260x3 | 76.26 % |
+ | [efficientnet_v2S_384](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2S_384/efficientnet_v2S_384.h5) | Float | 384x384x3 | 83.52 % |
  | [efficientnet_v2S_384 onnx](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/efficientnetv2/Public_pretrainedmodel_public_dataset/ImageNet/efficientnet_v2S_384/efficientnet_v2S_384_qdq_int8.onnx) | Int8 | 384x384x3 | 83.07 % |
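
The calibration step mentioned under "Accuracy with ImageNet" (quantizing with activations calibrated on a random subset of the training set) can be sketched as follows. This is a minimal illustration, not the Model Zoo's actual quantization script: the subset selection is the real logic, the file names are hypothetical, and the QDQ int8 conversion itself (e.g. onnxruntime's `quantize_static`) is only referenced in a comment.

```python
import random

def pick_calibration_subset(training_images, n=100, seed=42):
    """Randomly sample n training images to calibrate activation ranges
    before int8 (QDQ) quantization."""
    rng = random.Random(seed)            # fixed seed for reproducibility
    return rng.sample(training_images, min(n, len(training_images)))

# Hypothetical file list standing in for the ImageNet training set.
train_images = [f"train_{i:07d}.jpg" for i in range(1000)]
subset = pick_calibration_subset(train_images, n=100)

# The subset would then feed a calibration data reader for static
# quantization, e.g. onnxruntime.quantization.quantize_static(...),
# producing a *_qdq_int8.onnx model like those linked above.
```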


  <a id="4">[4]</a>
  Olga Russakovsky*, Jia Deng*, Hao Su, Jonathan Krause, Sanjeev Satheesh, Sean Ma, Zhiheng Huang, Andrej Karpathy, Aditya Khosla, Michael Bernstein, Alexander C. Berg and Li Fei-Fei.
+ (* = equal contribution) ImageNet Large Scale Visual Recognition Challenge.