---
license: gpl-3.0
---

## U-Net: Semantic Segmentation

U-Net is a convolutional neural network designed for biomedical image segmentation, introduced by Olaf Ronneberger et al. in 2015. The model gets its name from its U-shaped architecture, featuring a symmetrical encoder-decoder structure. The encoder part extracts features from the image through a series of convolutions and downsampling operations, while the decoder part restores the spatial resolution through upsampling, combining the extracted features to accurately locate and segment objects within the image. U-Net uses skip connections that pass feature maps from the encoder directly to the decoder, aiding in the recovery of fine details. This design makes U-Net highly effective for tasks requiring precise localization, such as medical image segmentation, and it is widely applied in other areas like remote sensing, autonomous driving, and image denoising.
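To make the encoder-decoder shape and the skip connections concrete, here is a minimal two-level U-Net sketch in PyTorch. This is an illustrative toy, not the milesial/Pytorch-UNet implementation: the class names (`DoubleConv`, `TinyUNet`) and the channel widths are chosen for brevity.

```python
import torch
import torch.nn as nn

class DoubleConv(nn.Module):
    """Two 3x3 convolutions, each followed by batch norm and ReLU."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)

class TinyUNet(nn.Module):
    """Minimal two-level U-Net: the encoder downsamples, the decoder
    upsamples, and a skip connection concatenates encoder features
    into the decoder to recover fine spatial detail."""
    def __init__(self, in_ch=3, n_classes=2, base=16):
        super().__init__()
        self.enc1 = DoubleConv(in_ch, base)
        self.pool = nn.MaxPool2d(2)
        self.enc2 = DoubleConv(base, base * 2)
        self.up = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        self.dec1 = DoubleConv(base * 2, base)  # base (skip) + base (upsampled)
        self.head = nn.Conv2d(base, n_classes, 1)

    def forward(self, x):
        s1 = self.enc1(x)              # full-resolution encoder features (skip)
        s2 = self.enc2(self.pool(s1))  # downsampled bottleneck features
        u = self.up(s2)                # upsample back to input resolution
        u = torch.cat([s1, u], dim=1)  # skip connection: concat encoder features
        return self.head(self.dec1(u))

model = TinyUNet()
out = model(torch.randn(1, 3, 64, 128))
print(out.shape)  # torch.Size([1, 2, 64, 128])
```

Because every convolution is padded and the single up-convolution exactly undoes the single pooling step, the output keeps the input's spatial size, with one channel of logits per class.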

### Source model

- Input shape: 640x1280
- Number of parameters: 29.6M
- Model size: 118.4M
- Output shape: 1x2x640x1280

Source model repository: [U-Net](https://github.com/milesial/Pytorch-UNet)
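The output shape 1x2x640x1280 is (batch, classes, height, width): one logit map per class at the full input resolution. A common post-processing step, sketched below with random values standing in for real model output, is an argmax over the class axis to get a per-pixel label mask. The assumption that the two channels correspond to background/foreground is illustrative.

```python
import numpy as np

# Simulated model output: (batch, classes, height, width) = (1, 2, 640, 1280).
# In practice this would come from the model's forward pass.
logits = np.random.randn(1, 2, 640, 1280).astype(np.float32)

# Argmax over the class axis picks the highest-scoring class per pixel,
# producing a label map the same height and width as the input.
mask = logits.argmax(axis=1)[0]  # shape (640, 1280), values in {0, 1}

print(mask.shape)  # (640, 1280)
```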

## Performance Reference

Please search for the model by name in [Model Farm](https://aiot.aidlux.com/en/models)

## Inference & Model Conversion

Please search for the model by name in [Model Farm](https://aiot.aidlux.com/en/models)

## License

- Source Model: [GPL-3.0](https://github.com/milesial/Pytorch-UNet/blob/master/LICENSE)
- Deployable Model: [GPL-3.0](https://github.com/milesial/Pytorch-UNet/blob/master/LICENSE)