Update README.md
README.md CHANGED

@@ -96,13 +96,12 @@ The benchmark result in [MTVQA](https://github.com/bytedance/MTVQA/tree/main)
 | MiniCPM-V2.5 | ✓ | 15.3 |
 | InternVL-V1.5 | ✗ | 12.4 |
 
-## OpenCompass Benchmark
 <!--
 <div align="center">
 <img src="radar_chart.png" width="400"/>
 </div> -->
 
-We evaluate Vintern-1B-v2 on [VLMEvalKit](https://github.com/open-compass/VLMEvalKit). (We use GPT4o-mini for some judge model)
+<!-- We evaluate Vintern-1B-v2 on [VLMEvalKit](https://github.com/open-compass/VLMEvalKit). (We use GPT4o-mini for some judge model)
 
 The current results are at a quite good level, and we are expanding the training set in English and other languages to approach models within a comparable parameter range.
 
@@ -125,7 +124,7 @@ The current results are at a quite good level, and we are expanding the training
 | HallBenchavg | 38.0 | 36.1 | 41.7 | - |
 | MathVistatestmini| 46.0 | 39.8 | 43.0 | 32.9 |
 
-We are still working on more detailed benchmarks.
+We are still working on more detailed benchmarks. -->
 
 ## Examples
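For reference, the VLMEvalKit evaluation mentioned in the section being commented out is normally launched through the toolkit's `run.py` script. Below is a minimal sketch of such a run driven from Python; the model registry name `Vintern-1B-v2`, the dataset selection, the local repo path, and the judge configuration via `OPENAI_API_KEY` are assumptions for illustration, not settings taken from this README.

```python
# Minimal sketch (assumptions noted above): launch a VLMEvalKit run from Python.
import os
import subprocess

env = dict(os.environ)
# GPT-judged benchmarks in VLMEvalKit read the OpenAI key from the environment;
# which judge model is used (e.g. GPT-4o-mini) is configured on the toolkit side.
env.setdefault("OPENAI_API_KEY", "sk-...")  # placeholder key

subprocess.run(
    [
        "python", "run.py",
        "--data", "MMBench_DEV_EN", "MathVista_MINI", "HallusionBench",
        "--model", "Vintern-1B-v2",  # assumed name in VLMEvalKit's model registry
        "--verbose",
    ],
    cwd="VLMEvalKit",  # path to a local clone of the toolkit (assumption)
    env=env,
    check=True,
)
```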