---
datasets:
- deepghs/real_face_detection
- deepghs/anime_face_detection
pipeline_tag: object-detection
library_name: dghs-imgutils
tags:
- art
- anime
- photo
- face
---

These models are trained on [deepghs/anime_face_detection](https://huggingface.co/deepghs/anime_face_detection) and open-source real-photo datasets, so both anime images and real photos are supported.

Use them with the `dghs-realutils` library:

```shell
pip install dghs-realutils
```

```python
from realutils.detect import detect_faces

print(detect_faces('yolo/solo.jpg'))
# [((157, 94, 252, 208), 'face', 0.8836570382118225)]
print(detect_faces('yolo/2girls.jpg'))
# [((718, 154, 1110, 728), 'face', 0.8841166496276855), ((157, 275, 519, 715), 'face', 0.8668240904808044)]
print(detect_faces('yolo/3+cosplay.jpg'))
# [((349, 227, 413, 305), 'face', 0.8543888330459595), ((383, 61, 432, 117), 'face', 0.8080574870109558), ((194, 107, 245, 162), 'face', 0.8035706877708435)]
print(detect_faces('yolo/multiple.jpg'))
# [((1070, 728, 1259, 985), 'face', 0.8765808939933777), ((548, 286, 760, 558), 'face', 0.8693087697029114), ((896, 315, 1067, 520), 'face', 0.8671919107437134), ((1198, 220, 1342, 406), 'face', 0.8485829830169678), ((1376, 526, 1546, 719), 'face', 0.8469308018684387)]
```

For more information, see [documentation of realutils](https://dghs-realutils.deepghs.org/main/api_doc/detect/face.html).
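As the output above shows, each detection is a tuple of `((x0, y0, x1, y1), label, score)`. A minimal sketch of post-processing such results (the helper names and the `0.87` cutoff are illustrative choices, not part of the library; only the tuple format comes from the output above):

```python
# Each detection from detect_faces() is ((x0, y0, x1, y1), label, score).
# These helpers are illustrative; only the tuple format comes from the library output.

def filter_by_score(detections, min_score=0.85):
    """Keep only detections at or above the given confidence score."""
    return [d for d in detections if d[2] >= min_score]

def box_area(detection):
    """Pixel area of a detection's bounding box."""
    (x0, y0, x1, y1), _, _ = detection
    return (x1 - x0) * (y1 - y0)

# Example output from detect_faces('yolo/2girls.jpg') above:
detections = [
    ((718, 154, 1110, 728), 'face', 0.8841166496276855),
    ((157, 275, 519, 715), 'face', 0.8668240904808044),
]

confident = filter_by_score(detections, min_score=0.87)
print(confident)                 # only the first face passes the 0.87 cutoff
print(box_area(detections[0]))   # 392 * 574 = 225008 pixels
```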

|         Model         |  Type  |  FLOPS  |  Params  |  F1 Score  |  Threshold  |  precision(B)  |  recall(B)  |  mAP50(B)  |  mAP50-95(B)  |                                                 F1 Plot                                                 |                                                            Confusion                                                            |  Labels  |
|:---------------------:|:------:|:-------:|:--------:|:----------:|:-----------:|:--------------:|:-----------:|:----------:|:-------------:|:-------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------------------:|:--------:|
| face_detect_v0_s_yv12 |  yolo  |  21.5G  |  9.25M   |    0.74    |    0.272    |    0.86931     |   0.6404    |  0.73074   |    0.42652    | [plot](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_s_yv12/F1_curve.png) | [confusion](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_s_yv12/confusion_matrix_normalized.png) |  `face`  |
| face_detect_v0_n_yv12 |  yolo  |  6.48G  |  2.57M   |    0.7     |    0.258    |    0.85246     |   0.59089   |   0.6793   |    0.39182    | [plot](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_n_yv12/F1_curve.png) | [confusion](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_n_yv12/confusion_matrix_normalized.png) |  `face`  |
| face_detect_v0_l_yv11 |  yolo  |  87.3G  |  25.3M   |    0.77    |    0.291    |    0.88458     |   0.67474   |  0.76666   |    0.45722    | [plot](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_l_yv11/F1_curve.png) | [confusion](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_l_yv11/confusion_matrix_normalized.png) |  `face`  |
| face_detect_v0_m_yv11 |  yolo  |  68.2G  |  20.1M   |    0.76    |    0.262    |    0.87947     |   0.67315   |  0.76073   |    0.45288    | [plot](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_m_yv11/F1_curve.png) | [confusion](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_m_yv11/confusion_matrix_normalized.png) |  `face`  |
| face_detect_v0_s_yv11 |  yolo  |  21.5G  |  9.43M   |    0.73    |    0.271    |    0.87001     |   0.63572   |  0.72683   |    0.42706    | [plot](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_s_yv11/F1_curve.png) | [confusion](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_s_yv11/confusion_matrix_normalized.png) |  `face`  |
| face_detect_v0_n_yv11 |  yolo  |  6.44G  |  2.59M   |    0.7     |    0.263    |    0.86044     |   0.58577   |  0.67641   |    0.38975    | [plot](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_n_yv11/F1_curve.png) | [confusion](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_n_yv11/confusion_matrix_normalized.png) |  `face`  |
|   face_detect_v0_l    |  yolo  |  165G   |  43.6M   |    0.76    |    0.277    |    0.87894     |   0.67335   |  0.76313   |    0.4532     |   [plot](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_l/F1_curve.png)    |   [confusion](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_l/confusion_matrix_normalized.png)    |  `face`  |
|   face_detect_v0_m    |  yolo  |  79.1G  |  25.9M   |    0.75    |    0.277    |    0.87687     |   0.66265   |  0.75114   |    0.44262    |   [plot](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_m/F1_curve.png)    |   [confusion](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_m/confusion_matrix_normalized.png)    |  `face`  |
|   face_detect_v0_s    |  yolo  |  28.6G  |  11.1M   |    0.73    |    0.282    |    0.86932     |   0.63557   |  0.72494   |    0.42219    |   [plot](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_s/F1_curve.png)    |   [confusion](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_s/confusion_matrix_normalized.png)    |  `face`  |
|   face_detect_v0_n    |  yolo  |  8.19G  |  3.01M   |    0.7     |    0.257    |    0.85337     |   0.58877   |  0.67471   |    0.38692    |   [plot](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_n/F1_curve.png)    |   [confusion](https://huggingface.co/deepghs/real_face_detection/blob/main/face_detect_v0_n/confusion_matrix_normalized.png)    |  `face`  |