pj-mathematician committed
Commit 523c335 · verified · Parent(s): 3039722

Add files using upload-large-folder tool

Files changed (50)
  1. .gitattributes +5 -0
  2. README.md +1444 -3
  3. checkpoint-4000/README.md +1436 -0
  4. checkpoint-4000/modules.json +20 -0
  5. checkpoint-4000/sentencepiece.bpe.model +3 -0
  6. checkpoint-4000/tokenizer.json +3 -0
  7. checkpoint-4200/config.json +27 -0
  8. checkpoint-4200/config_sentence_transformers.json +10 -0
  9. checkpoint-4200/special_tokens_map.json +51 -0
  10. checkpoint-4200/tokenizer.json +3 -0
  11. checkpoint-4200/tokenizer_config.json +56 -0
  12. checkpoint-4200/trainer_state.json +0 -0
  13. checkpoint-4400/1_Pooling/config.json +10 -0
  14. checkpoint-4400/README.md +1440 -0
  15. checkpoint-4400/config_sentence_transformers.json +10 -0
  16. checkpoint-4400/rng_state.pth +3 -0
  17. checkpoint-4400/sentence_bert_config.json +4 -0
  18. checkpoint-4400/sentencepiece.bpe.model +3 -0
  19. checkpoint-4400/special_tokens_map.json +51 -0
  20. checkpoint-4400/tokenizer.json +3 -0
  21. checkpoint-4400/tokenizer_config.json +56 -0
  22. checkpoint-4600/1_Pooling/config.json +10 -0
  23. checkpoint-4600/README.md +1442 -0
  24. checkpoint-4600/config.json +27 -0
  25. checkpoint-4600/config_sentence_transformers.json +10 -0
  26. checkpoint-4600/modules.json +20 -0
  27. checkpoint-4600/rng_state.pth +3 -0
  28. checkpoint-4600/scaler.pt +3 -0
  29. checkpoint-4600/scheduler.pt +3 -0
  30. checkpoint-4600/sentence_bert_config.json +4 -0
  31. checkpoint-4600/tokenizer.json +3 -0
  32. checkpoint-4600/tokenizer_config.json +56 -0
  33. checkpoint-4600/trainer_state.json +0 -0
  34. checkpoint-4600/training_args.bin +3 -0
  35. checkpoint-4800/README.md +1444 -0
  36. checkpoint-4800/config.json +27 -0
  37. checkpoint-4800/config_sentence_transformers.json +10 -0
  38. checkpoint-4800/modules.json +20 -0
  39. checkpoint-4800/scheduler.pt +3 -0
  40. checkpoint-4800/sentence_bert_config.json +4 -0
  41. checkpoint-4800/special_tokens_map.json +51 -0
  42. checkpoint-4800/tokenizer.json +3 -0
  43. checkpoint-4800/tokenizer_config.json +56 -0
  44. checkpoint-4800/trainer_state.json +0 -0
  45. eval/Information-Retrieval_evaluation_full_de_results.csv +25 -0
  46. eval/Information-Retrieval_evaluation_full_en_results.csv +25 -0
  47. eval/Information-Retrieval_evaluation_full_es_results.csv +25 -0
  48. eval/Information-Retrieval_evaluation_full_zh_results.csv +25 -0
  49. eval/Information-Retrieval_evaluation_mix_de_results.csv +25 -0
  50. eval/Information-Retrieval_evaluation_mix_es_results.csv +25 -0
.gitattributes CHANGED
@@ -33,3 +33,8 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ checkpoint-4800/tokenizer.json filter=lfs diff=lfs merge=lfs -text
+ checkpoint-4000/tokenizer.json filter=lfs diff=lfs merge=lfs -text
+ checkpoint-4200/tokenizer.json filter=lfs diff=lfs merge=lfs -text
+ checkpoint-4600/tokenizer.json filter=lfs diff=lfs merge=lfs -text
+ checkpoint-4400/tokenizer.json filter=lfs diff=lfs merge=lfs -text
README.md CHANGED
@@ -1,3 +1,1444 @@
- ---
- license: apache-2.0
- ---
+ ---
+ tags:
+ - sentence-transformers
+ - sentence-similarity
+ - feature-extraction
+ - generated_from_trainer
+ - dataset_size:124788
+ - loss:GISTEmbedLoss
+ base_model: BAAI/bge-m3
+ widget:
+ - source_sentence: 其他机械、设备和有形货物租赁服务代表
+   sentences:
+   - 其他机械和设备租赁服务工作人员
+   - 电子和电信设备及零部件物流经理
+   - 工业主厨
+ - source_sentence: 公交车司机
+   sentences:
+   - 表演灯光设计师
+   - 乙烯基地板安装工
+   - 国际巴士司机
+ - source_sentence: online communication manager
+   sentences:
+   - trades union official
+   - social media manager
+   - budget manager
+ - source_sentence: Projektmanagerin
+   sentences:
+   - Projektmanager/Projektmanagerin
+   - Category-Manager
+   - Infanterist
+ - source_sentence: Volksvertreter
+   sentences:
+   - Parlamentarier
+   - Oberbürgermeister
+   - Konsul
+ pipeline_tag: sentence-similarity
+ library_name: sentence-transformers
+ metrics:
+ - cosine_accuracy@1
+ - cosine_accuracy@20
+ - cosine_accuracy@50
+ - cosine_accuracy@100
+ - cosine_accuracy@150
+ - cosine_accuracy@200
+ - cosine_precision@1
+ - cosine_precision@20
+ - cosine_precision@50
+ - cosine_precision@100
+ - cosine_precision@150
+ - cosine_precision@200
+ - cosine_recall@1
+ - cosine_recall@20
+ - cosine_recall@50
+ - cosine_recall@100
+ - cosine_recall@150
+ - cosine_recall@200
+ - cosine_ndcg@1
+ - cosine_ndcg@20
+ - cosine_ndcg@50
+ - cosine_ndcg@100
+ - cosine_ndcg@150
+ - cosine_ndcg@200
+ - cosine_mrr@1
+ - cosine_mrr@20
+ - cosine_mrr@50
+ - cosine_mrr@100
+ - cosine_mrr@150
+ - cosine_mrr@200
+ - cosine_map@1
+ - cosine_map@20
+ - cosine_map@50
+ - cosine_map@100
+ - cosine_map@150
+ - cosine_map@200
+ - cosine_map@500
+ model-index:
+ - name: SentenceTransformer based on BAAI/bge-m3
+   results:
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: full en
+       type: full_en
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.6476190476190476
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 0.9904761904761905
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 0.9904761904761905
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 0.9904761904761905
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 0.9904761904761905
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 0.9904761904761905
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.6476190476190476
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.5061904761904762
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.30647619047619057
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.1858095238095238
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.13250793650793652
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.10247619047619047
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.06690172806447445
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.5391510592522911
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.7199711948587544
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.8253770621157605
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.8719997123512196
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.9006382758109558
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.6476190476190476
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.6822066814233797
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.6975329548006446
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.7519637922809941
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.7724946802449859
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.7827357067553371
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.6476190476190476
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.7999999999999998
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.7999999999999998
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.7999999999999998
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.7999999999999998
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.7999999999999998
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.6476190476190476
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.5391784054866918
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.5258287715484311
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.5580109313638075
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.5665715227835532
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.569529009182472
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.5743595458034346
+       name: Cosine Map@500
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: full es
+       type: full_es
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.11351351351351352
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 1.0
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 1.0
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 1.0
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 1.0
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 1.0
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.11351351351351352
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.5667567567567567
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.3902702702702703
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.25254054054054054
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.19005405405405407
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.1507837837837838
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.0035155918996302815
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.37958552840441906
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.5635730197468752
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.672698242387141
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.7360036980055802
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.7697561816436992
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.11351351351351352
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.6136401766234348
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.5908459924766464
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.6168063266629416
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.6488575731321932
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.665316090087272
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.11351351351351352
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.5536036036036036
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.5536036036036036
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.5536036036036036
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.5536036036036036
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.5536036036036036
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.11351351351351352
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.48095830339282386
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.43038606337879926
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.4335284717646407
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.44851036812148526
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.4550924585301385
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.4677023132311536
+       name: Cosine Map@500
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: full de
+       type: full_de
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.2955665024630542
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 0.9852216748768473
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 0.9901477832512315
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 0.9901477832512315
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 0.9901477832512315
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 0.9901477832512315
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.2955665024630542
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.5403940886699506
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.38275862068965516
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.2503448275862069
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.187816091954023
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.15027093596059116
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.01108543831680986
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.3432684453555553
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.5339871522541048
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.6498636280219438
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.7100921836539074
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.7513351913056898
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.2955665024630542
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.5647628262992046
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.5522057083055792
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.5796033728499559
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.6111851705889818
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.6309313367878393
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.2955665024630542
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.5164425017655958
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.516559790060224
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.516559790060224
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.516559790060224
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.516559790060224
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.2955665024630542
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.4221760589983628
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.37913413777890953
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.3829298798486122
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.39811624371681004
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.40559711033541546
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.4188841643667456
+       name: Cosine Map@500
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: full zh
+       type: full_zh
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.6796116504854369
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 0.9902912621359223
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 0.9902912621359223
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 0.9902912621359223
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 0.9902912621359223
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 0.9902912621359223
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.6796116504854369
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.470873786407767
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.28038834951456315
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.17320388349514557
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.12394822006472495
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.09766990291262137
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.06427555485009323
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.5119331913488326
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.6726577129232287
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.788021792964523
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.8328962977521837
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.8687397875786594
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.6796116504854369
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.6515292076635256
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.6598571989751485
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.7157338182976709
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.7357126940189814
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.7500853808896866
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.6796116504854369
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.8216828478964402
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.8216828478964402
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.8216828478964402
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.8216828478964402
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.8216828478964402
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.6796116504854369
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.5012149610968577
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.48128476255481567
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.5105374388587102
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.518381647971727
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.5228375783347256
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.52765377953199
+       name: Cosine Map@500
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: mix es
+       type: mix_es
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.7394695787831513
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 0.9635985439417577
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 0.982839313572543
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 0.9927197087883516
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 0.9947997919916797
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 0.9963598543941757
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.7394695787831513
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.12488299531981278
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.05174206968278733
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.02629225169006761
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.017635638758883684
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.013281331253250133
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.28537503404898107
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.9225949037961519
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.9548015253943491
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.970532154619518
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.9766337320159473
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.9810747096550528
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.7394695787831513
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.8119072371250002
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.8208055075822587
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.8242798548838444
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.8254601712767063
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.826231823086538
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.7394695787831513
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.8059183822863336
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.8065662458714291
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.8067209669800003
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.8067371899834064
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.8067455244059942
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.7394695787831513
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.7439811728319751
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.7464542457655368
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.7469341154545359
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.7470471963812441
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.7471010455519603
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.7471920688836787
+       name: Cosine Map@500
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: mix de
+       type: mix_de
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.6926677067082684
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 0.9641185647425897
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 0.983879355174207
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 0.9921996879875195
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 0.9932397295891836
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 0.9942797711908476
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.6926677067082684
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.12797711908476336
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.053281331253250144
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.027051482059282376
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.018110591090310275
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.013619344773790953
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.2603830819899463
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.928479805858901
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.9650286011440458
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.9796325186340786
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.9837060149072628
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.9862194487779511
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.6926677067082684
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.7967328692326251
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.8068705787791701
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.810158579950017
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.8109641919896999
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.8114360342473703
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.6926677067082684
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.7766838069642311
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.7773792960985305
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.7775026273925645
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.7775124036000293
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.7775182983569378
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.6926677067082684
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.7210301157895639
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.7237555751939095
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.7242426468613273
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.7243265313145111
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.7243628241480395
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.7244144669299598
+       name: Cosine Map@500
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: mix zh
+       type: mix_zh
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.17888715548621945
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 1.0
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 1.0
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 1.0
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 1.0
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 1.0
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.17888715548621945
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.15439417576703063
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.0617576703068123
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.03087883515340615
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.020585890102270757
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.015439417576703075
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.05768764083896689
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 1.0
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 1.0
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 1.0
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 1.0
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 1.0
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.17888715548621945
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.5443156532634228
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.5443156532634228
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.5443156532634228
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.5443156532634228
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.5443156532634228
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.17888715548621945
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.4002437442375043
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.4002437442375043
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.4002437442375043
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.4002437442375043
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.4002437442375043
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.17888715548621945
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.32718437256695937
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.32718437256695937
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.32718437256695937
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.32718437256695937
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.32718437256695937
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.32718437256695937
+       name: Cosine Map@500
+ ---
906
+
# Job-to-job matching finetuned from BAAI/bge-m3

Top-performing model on [TalentCLEF 2025](https://talentclef.github.io/talentclef/) Task A. Use it for multilingual job title matching.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) <!-- at revision 5617a9f61b028005a4858fdac845db406aefb181 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Datasets:**
  - full_en
  - full_de
  - full_es
  - full_zh
  - mix
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```
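Because the final `Normalize()` module makes every embedding unit-length, cosine similarity between two embeddings reduces to a plain dot product. A minimal numpy sketch with hypothetical unit vectors standing in for real model outputs:

```python
import numpy as np

# Toy stand-ins for model outputs: the Normalize() module above makes every
# embedding unit-length, so cosine similarity is just a dot product.
emb = np.array([[0.6, 0.8, 0.0],
                [0.8, 0.6, 0.0],
                [0.0, 0.0, 1.0]])  # each row has L2 norm 1

cosine = emb @ emb.T               # equivalent to full cosine similarity here
print(cosine[0, 1])                # similarity of rows 0 and 1 -> 0.96
```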
## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    'Volksvertreter',
    'Parlamentarier',
    'Oberbürgermeister',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
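For retrieval-style matching (one query title against many candidates), ranking by similarity score is a one-liner. The embeddings below are hypothetical stand-ins; with real output from `model.encode()` the ranking step is identical:

```python
import numpy as np

# Hypothetical (approximately unit-norm) embeddings: `query` is one job title,
# each row of `candidates` is a candidate title.
query = np.array([1.0, 0.0, 0.0])
candidates = np.array([[0.9, 0.436, 0.0],
                       [0.1, 0.995, 0.0],
                       [0.6, 0.8, 0.0]])

scores = candidates @ query          # cosine scores (normalized embeddings)
top_k = np.argsort(-scores)[:2]      # indices of the 2 best matches
print(top_k)                         # -> [0 2]
```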

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Information Retrieval

* Datasets: `full_en`, `full_es`, `full_de`, `full_zh`, `mix_es`, `mix_de` and `mix_zh`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric | full_en | full_es | full_de | full_zh | mix_es | mix_de | mix_zh |
|:---------------------|:-----------|:-----------|:-----------|:-----------|:-----------|:-----------|:-----------|
| cosine_accuracy@1 | 0.6476 | 0.1135 | 0.2956 | 0.6796 | 0.7395 | 0.6927 | 0.1789 |
| cosine_accuracy@20 | 0.9905 | 1.0 | 0.9852 | 0.9903 | 0.9636 | 0.9641 | 1.0 |
| cosine_accuracy@50 | 0.9905 | 1.0 | 0.9901 | 0.9903 | 0.9828 | 0.9839 | 1.0 |
| cosine_accuracy@100 | 0.9905 | 1.0 | 0.9901 | 0.9903 | 0.9927 | 0.9922 | 1.0 |
| cosine_accuracy@150 | 0.9905 | 1.0 | 0.9901 | 0.9903 | 0.9948 | 0.9932 | 1.0 |
| cosine_accuracy@200 | 0.9905 | 1.0 | 0.9901 | 0.9903 | 0.9964 | 0.9943 | 1.0 |
| cosine_precision@1 | 0.6476 | 0.1135 | 0.2956 | 0.6796 | 0.7395 | 0.6927 | 0.1789 |
| cosine_precision@20 | 0.5062 | 0.5668 | 0.5404 | 0.4709 | 0.1249 | 0.128 | 0.1544 |
| cosine_precision@50 | 0.3065 | 0.3903 | 0.3828 | 0.2804 | 0.0517 | 0.0533 | 0.0618 |
| cosine_precision@100 | 0.1858 | 0.2525 | 0.2503 | 0.1732 | 0.0263 | 0.0271 | 0.0309 |
| cosine_precision@150 | 0.1325 | 0.1901 | 0.1878 | 0.1239 | 0.0176 | 0.0181 | 0.0206 |
| cosine_precision@200 | 0.1025 | 0.1508 | 0.1503 | 0.0977 | 0.0133 | 0.0136 | 0.0154 |
| cosine_recall@1 | 0.0669 | 0.0035 | 0.0111 | 0.0643 | 0.2854 | 0.2604 | 0.0577 |
| cosine_recall@20 | 0.5392 | 0.3796 | 0.3433 | 0.5119 | 0.9226 | 0.9285 | 1.0 |
| cosine_recall@50 | 0.72 | 0.5636 | 0.534 | 0.6727 | 0.9548 | 0.965 | 1.0 |
| cosine_recall@100 | 0.8254 | 0.6727 | 0.6499 | 0.788 | 0.9705 | 0.9796 | 1.0 |
| cosine_recall@150 | 0.872 | 0.736 | 0.7101 | 0.8329 | 0.9766 | 0.9837 | 1.0 |
| cosine_recall@200 | 0.9006 | 0.7698 | 0.7513 | 0.8687 | 0.9811 | 0.9862 | 1.0 |
| cosine_ndcg@1 | 0.6476 | 0.1135 | 0.2956 | 0.6796 | 0.7395 | 0.6927 | 0.1789 |
| cosine_ndcg@20 | 0.6822 | 0.6136 | 0.5648 | 0.6515 | 0.8119 | 0.7967 | 0.5443 |
| cosine_ndcg@50 | 0.6975 | 0.5908 | 0.5522 | 0.6599 | 0.8208 | 0.8069 | 0.5443 |
| cosine_ndcg@100 | 0.752 | 0.6168 | 0.5796 | 0.7157 | 0.8243 | 0.8102 | 0.5443 |
| cosine_ndcg@150 | 0.7725 | 0.6489 | 0.6112 | 0.7357 | 0.8255 | 0.811 | 0.5443 |
| **cosine_ndcg@200** | **0.7827** | **0.6653** | **0.6309** | **0.7501** | **0.8262** | **0.8114** | **0.5443** |
| cosine_mrr@1 | 0.6476 | 0.1135 | 0.2956 | 0.6796 | 0.7395 | 0.6927 | 0.1789 |
| cosine_mrr@20 | 0.8 | 0.5536 | 0.5164 | 0.8217 | 0.8059 | 0.7767 | 0.4002 |
| cosine_mrr@50 | 0.8 | 0.5536 | 0.5166 | 0.8217 | 0.8066 | 0.7774 | 0.4002 |
| cosine_mrr@100 | 0.8 | 0.5536 | 0.5166 | 0.8217 | 0.8067 | 0.7775 | 0.4002 |
| cosine_mrr@150 | 0.8 | 0.5536 | 0.5166 | 0.8217 | 0.8067 | 0.7775 | 0.4002 |
| cosine_mrr@200 | 0.8 | 0.5536 | 0.5166 | 0.8217 | 0.8067 | 0.7775 | 0.4002 |
| cosine_map@1 | 0.6476 | 0.1135 | 0.2956 | 0.6796 | 0.7395 | 0.6927 | 0.1789 |
| cosine_map@20 | 0.5392 | 0.481 | 0.4222 | 0.5012 | 0.744 | 0.721 | 0.3272 |
| cosine_map@50 | 0.5258 | 0.4304 | 0.3791 | 0.4813 | 0.7465 | 0.7238 | 0.3272 |
| cosine_map@100 | 0.558 | 0.4335 | 0.3829 | 0.5105 | 0.7469 | 0.7242 | 0.3272 |
| cosine_map@150 | 0.5666 | 0.4485 | 0.3981 | 0.5184 | 0.747 | 0.7243 | 0.3272 |
| cosine_map@200 | 0.5695 | 0.4551 | 0.4056 | 0.5228 | 0.7471 | 0.7244 | 0.3272 |
| cosine_map@500 | 0.5744 | 0.4677 | 0.4189 | 0.5277 | 0.7472 | 0.7244 | 0.3272 |
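The `cosine_mrr@k` and `cosine_ndcg@k` rows above are standard ranking measures. A minimal sketch of both for a single query with binary relevance, using a hypothetical ranked list (this illustrates the metric definitions, not the evaluator's implementation):

```python
import math

def mrr_at_k(ranked, relevant, k):
    """Reciprocal rank of the first relevant hit within the top k, else 0."""
    for i, doc in enumerate(ranked[:k]):
        if doc in relevant:
            return 1.0 / (i + 1)
    return 0.0

def ndcg_at_k(ranked, relevant, k):
    """Binary-relevance NDCG@k: DCG of this ranking over DCG of an ideal one."""
    dcg = sum(1.0 / math.log2(i + 2)
              for i, doc in enumerate(ranked[:k]) if doc in relevant)
    ideal = sum(1.0 / math.log2(i + 2) for i in range(min(k, len(relevant))))
    return dcg / ideal if ideal > 0 else 0.0

# Hypothetical retrieval result for one query: docs ranked by cosine score.
ranked = ["d3", "d7", "d1", "d9"]
relevant = {"d7", "d9"}
print(mrr_at_k(ranked, relevant, 4))              # -> 0.5 (first hit at rank 2)
print(round(ndcg_at_k(ranked, relevant, 4), 4))   # -> 0.6509
```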

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Datasets
<details><summary>full_en</summary>

#### full_en

* Dataset: full_en
* Size: 28,880 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type | string | string |
  | details | <ul><li>min: 3 tokens</li><li>mean: 5.68 tokens</li><li>max: 11 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 5.76 tokens</li><li>max: 12 tokens</li></ul> |
* Samples:
  | anchor | positive |
  |:-----------------------------------------|:-----------------------------------------|
  | <code>air commodore</code> | <code>flight lieutenant</code> |
  | <code>command and control officer</code> | <code>flight officer</code> |
  | <code>air commodore</code> | <code>command and control officer</code> |
* Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
    (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
  ```
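GISTEmbedLoss uses the guide model to drop in-batch candidates that are likely false negatives: with `margin_strategy: 'absolute'` and `margin: 0.0`, a candidate is excluded from an anchor's negatives when the guide scores it above the anchor's own positive. A toy numpy illustration of that masking rule (a sketch of the idea, not the library's implementation):

```python
import numpy as np

# Toy guide-model similarity matrix for a batch of 3 (anchor_i vs. candidate_j;
# the diagonal holds each anchor's true positive).
guide_sim = np.array([[0.9, 0.2, 0.95],
                      [0.1, 0.8, 0.3],
                      [0.4, 0.5, 0.7]])
margin = 0.0

# Absolute strategy: mask candidate j for anchor i when
# guide_sim[i, j] > guide_sim[i, i] - margin.
thresholds = np.diag(guide_sim) - margin
false_neg = guide_sim > thresholds[:, None]
false_neg &= ~np.eye(3, dtype=bool)   # never mask the true positive itself
print(false_neg)
# Only guide_sim[0, 2] = 0.95 exceeds its anchor's positive score 0.9,
# so candidate 2 is dropped from anchor 0's negatives.
```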
</details>
<details><summary>full_de</summary>

#### full_de

* Dataset: full_de
* Size: 23,023 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type | string | string |
  | details | <ul><li>min: 3 tokens</li><li>mean: 7.99 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.19 tokens</li><li>max: 30 tokens</li></ul> |
* Samples:
  | anchor | positive |
  |:----------------------------------|:-----------------------------------------------------|
  | <code>Staffelkommandantin</code> | <code>Kommodore</code> |
  | <code>Luftwaffenoffizierin</code> | <code>Luftwaffenoffizier/Luftwaffenoffizierin</code> |
  | <code>Staffelkommandantin</code> | <code>Luftwaffenoffizierin</code> |
* Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
    (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
  ```
</details>
<details><summary>full_es</summary>

#### full_es

* Dataset: full_es
* Size: 20,724 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type | string | string |
  | details | <ul><li>min: 3 tokens</li><li>mean: 9.13 tokens</li><li>max: 32 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.84 tokens</li><li>max: 32 tokens</li></ul> |
* Samples:
  | anchor | positive |
  |:------------------------------------|:-------------------------------------------|
  | <code>jefe de escuadrón</code> | <code>instructor</code> |
  | <code>comandante de aeronave</code> | <code>instructor de simulador</code> |
  | <code>instructor</code> | <code>oficial del Ejército del Aire</code> |
* Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
    (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
  ```
</details>
<details><summary>full_zh</summary>

#### full_zh

* Dataset: full_zh
* Size: 30,401 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type | string | string |
  | details | <ul><li>min: 5 tokens</li><li>mean: 7.15 tokens</li><li>max: 14 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 7.46 tokens</li><li>max: 21 tokens</li></ul> |
* Samples:
  | anchor | positive |
  |:------------------|:---------------------|
  | <code>技术总监</code> | <code>技术和运营总监</code> |
  | <code>技术总监</code> | <code>技术主管</code> |
  | <code>技术总监</code> | <code>技术艺术总监</code> |
* Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
    (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
  ```
</details>
<details><summary>mix</summary>

#### mix

* Dataset: mix
* Size: 21,760 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type | string | string |
  | details | <ul><li>min: 2 tokens</li><li>mean: 6.71 tokens</li><li>max: 19 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 7.69 tokens</li><li>max: 19 tokens</li></ul> |
* Samples:
  | anchor | positive |
  |:------------------------------------------|:----------------------------------------------------------------|
  | <code>technical manager</code> | <code>Technischer Direktor für Bühne, Film und Fernsehen</code> |
  | <code>head of technical</code> | <code>directora técnica</code> |
  | <code>head of technical department</code> | <code>技术艺术总监</code> |
* Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
    (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
  ```
</details>

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 128
- `gradient_accumulation_steps`: 2
- `num_train_epochs`: 5
- `warmup_ratio`: 0.05
- `log_on_each_node`: False
- `fp16`: True
- `dataloader_num_workers`: 4
- `ddp_find_unused_parameters`: True
- `batch_sampler`: no_duplicates
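Two quantities implied by the list above: with `per_device_train_batch_size` 64 and `gradient_accumulation_steps` 2, the effective batch per optimizer step is 128 per device, and `warmup_ratio` 0.05 warms the linear scheduler up over the first 5% of steps. A small sketch (the total step count is illustrative, not taken from the training run):

```python
per_device_train_batch_size = 64
gradient_accumulation_steps = 2
warmup_ratio = 0.05
total_steps = 4800  # illustrative total number of optimizer steps

effective_batch_per_device = per_device_train_batch_size * gradient_accumulation_steps
warmup_steps = int(total_steps * warmup_ratio)

def linear_lr(step, base_lr=5e-05):
    """Linear schedule with warmup, as selected by lr_scheduler_type 'linear'."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(effective_batch_per_device)  # -> 128
print(warmup_steps)                # -> 240
```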

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 128
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 2
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 5
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.05
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: False
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: True
- `dataloader_num_workers`: 4
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `tp_size`: 0
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: True
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>

### Training Logs
| Epoch | Step | Training Loss | full_en_cosine_ndcg@200 | full_es_cosine_ndcg@200 | full_de_cosine_ndcg@200 | full_zh_cosine_ndcg@200 | mix_es_cosine_ndcg@200 | mix_de_cosine_ndcg@200 | mix_zh_cosine_ndcg@200 |
|:------:|:----:|:-------------:|:-----------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:----------------------:|:----------------------:|:----------------------:|
| -1 | -1 | - | 0.6856 | 0.5207 | 0.4655 | 0.6713 | 0.6224 | 0.5604 | 0.5548 |
| 0.0010 | 1 | 5.3354 | - | - | - | - | - | - | - |
| 0.1027 | 100 | 2.665 | - | - | - | - | - | - | - |
| 0.2053 | 200 | 1.3375 | 0.7691 | 0.6530 | 0.6298 | 0.7517 | 0.7513 | 0.7393 | 0.5490 |
| 0.3080 | 300 | 1.1101 | - | - | - | - | - | - | - |
| 0.4107 | 400 | 0.9453 | 0.7802 | 0.6643 | 0.6246 | 0.7531 | 0.7610 | 0.7441 | 0.5493 |
| 0.5133 | 500 | 0.9202 | - | - | - | - | - | - | - |
| 0.6160 | 600 | 0.7887 | 0.7741 | 0.6549 | 0.6171 | 0.7542 | 0.7672 | 0.7540 | 0.5482 |
| 0.7187 | 700 | 0.7604 | - | - | - | - | - | - | - |
| 0.8214 | 800 | 0.7219 | 0.7846 | 0.6674 | 0.6244 | 0.7648 | 0.7741 | 0.7592 | 0.5497 |
| 0.9240 | 900 | 0.6965 | - | - | - | - | - | - | - |
| 1.0267 | 1000 | 0.6253 | 0.7646 | 0.6391 | 0.6122 | 0.7503 | 0.7825 | 0.7704 | 0.5463 |
| 1.1294 | 1100 | 0.4737 | - | - | - | - | - | - | - |
| 1.2320 | 1200 | 0.5055 | 0.7758 | 0.6582 | 0.6178 | 0.7514 | 0.7857 | 0.7764 | 0.5501 |
| 1.3347 | 1300 | 0.5042 | - | - | - | - | - | - | - |
| 1.4374 | 1400 | 0.5073 | 0.7613 | 0.6578 | 0.6178 | 0.7505 | 0.7829 | 0.7762 | 0.5452 |
| 1.5400 | 1500 | 0.4975 | - | - | - | - | - | - | - |
| 1.6427 | 1600 | 0.5242 | 0.7736 | 0.6673 | 0.6279 | 0.7555 | 0.7940 | 0.7859 | 0.5477 |
| 1.7454 | 1700 | 0.4713 | - | - | - | - | - | - | - |
| 1.8480 | 1800 | 0.4814 | 0.7845 | 0.6733 | 0.6285 | 0.7642 | 0.7992 | 0.7904 | 0.5449 |
| 1.9507 | 1900 | 0.4526 | - | - | - | - | - | - | - |
| 2.0544 | 2000 | 0.36 | 0.7790 | 0.6639 | 0.6252 | 0.7500 | 0.8032 | 0.7888 | 0.5499 |
| 2.1571 | 2100 | 0.3744 | - | - | - | - | - | - | - |
| 2.2598 | 2200 | 0.3031 | 0.7787 | 0.6614 | 0.6190 | 0.7537 | 0.7993 | 0.7811 | 0.5476 |
| 2.3624 | 2300 | 0.3638 | - | - | - | - | - | - | - |
| 2.4651 | 2400 | 0.358 | 0.7798 | 0.6615 | 0.6258 | 0.7497 | 0.8018 | 0.7828 | 0.5481 |
| 2.5678 | 2500 | 0.3247 | - | - | - | - | - | - | - |
| 2.6704 | 2600 | 0.3247 | 0.7854 | 0.6663 | 0.6248 | 0.7560 | 0.8081 | 0.7835 | 0.5452 |
| 2.7731 | 2700 | 0.3263 | - | - | - | - | - | - | - |
| 2.8758 | 2800 | 0.3212 | 0.7761 | 0.6681 | 0.6250 | 0.7517 | 0.8121 | 0.7927 | 0.5458 |
| 2.9784 | 2900 | 0.3291 | - | - | - | - | - | - | - |
| 3.0821 | 3000 | 0.2816 | 0.7727 | 0.6604 | 0.6163 | 0.7370 | 0.8163 | 0.7985 | 0.5473 |
| 3.1848 | 3100 | 0.2698 | - | - | - | - | - | - | - |
| 3.2875 | 3200 | 0.2657 | 0.7757 | 0.6615 | 0.6247 | 0.7417 | 0.8117 | 0.8004 | 0.5436 |
| 3.3901 | 3300 | 0.2724 | - | - | - | - | - | - | - |
| 3.4928 | 3400 | 0.2584 | 0.7850 | 0.6583 | 0.6320 | 0.7458 | 0.8120 | 0.7980 | 0.5454 |
| 3.5955 | 3500 | 0.2573 | - | - | - | - | - | - | - |
| 3.6982 | 3600 | 0.2744 | 0.7796 | 0.6552 | 0.6237 | 0.7409 | 0.8193 | 0.8018 | 0.5466 |
| 3.8008 | 3700 | 0.3054 | - | - | - | - | - | - | - |
| 3.9035 | 3800 | 0.2727 | 0.7825 | 0.6642 | 0.6293 | 0.7504 | 0.8213 | 0.8058 | 0.5463 |
| 4.0062 | 3900 | 0.2353 | - | - | - | - | - | - | - |
| 4.1088 | 4000 | 0.2353 | 0.7747 | 0.6628 | 0.6263 | 0.7384 | 0.8239 | 0.8065 | 0.5447 |
| 4.2115 | 4100 | 0.2385 | - | - | - | - | - | - | - |
| 4.3142 | 4200 | 0.231 | 0.7811 | 0.6608 | 0.6254 | 0.7463 | 0.8226 | 0.8051 | 0.5442 |
| 4.4168 | 4300 | 0.2115 | - | - | - | - | - | - | - |
| 4.5195 | 4400 | 0.2151 | 0.7815 | 0.6634 | 0.6301 | 0.7489 | 0.8251 | 0.8101 | 0.5450 |
| 4.6222 | 4500 | 0.2496 | - | - | - | - | - | - | - |
| 4.7248 | 4600 | 0.2146 | 0.7814 | 0.6654 | 0.6294 | 0.7523 | 0.8258 | 0.8104 | 0.5436 |
| 4.8275 | 4700 | 0.2535 | - | - | - | - | - | - | - |
| 4.9302 | 4800 | 0.2058 | 0.7827 | 0.6653 | 0.6309 | 0.7501 | 0.8262 | 0.8114 | 0.5443 |


### Framework Versions
- Python: 3.11.11
- Sentence Transformers: 4.1.0
- Transformers: 4.51.2
- PyTorch: 2.6.0+cu124
- Accelerate: 1.6.0
- Datasets: 3.5.0
- Tokenizers: 0.21.1

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### GISTEmbedLoss
```bibtex
@misc{solatorio2024gistembed,
    title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
    author={Aivin V. Solatorio},
    year={2024},
    eprint={2402.16829},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
checkpoint-4000/README.md ADDED
@@ -0,0 +1,1436 @@
+ ---
+ tags:
+ - sentence-transformers
+ - sentence-similarity
+ - feature-extraction
+ - generated_from_trainer
+ - dataset_size:124788
+ - loss:GISTEmbedLoss
+ base_model: BAAI/bge-m3
+ widget:
+ - source_sentence: 其他机械、设备和有形货物租赁服务代表
+   sentences:
+   - 其他机械和设备租赁服务工作人员
+   - 电子和电信设备及零部件物流经理
+   - 工业主厨
+ - source_sentence: 公交车司机
+   sentences:
+   - 表演灯光设计师
+   - 乙烯基地板安装工
+   - 国际巴士司机
+ - source_sentence: online communication manager
+   sentences:
+   - trades union official
+   - social media manager
+   - budget manager
+ - source_sentence: Projektmanagerin
+   sentences:
+   - Projektmanager/Projektmanagerin
+   - Category-Manager
+   - Infanterist
+ - source_sentence: Volksvertreter
+   sentences:
+   - Parlamentarier
+   - Oberbürgermeister
+   - Konsul
+ pipeline_tag: sentence-similarity
+ library_name: sentence-transformers
+ metrics:
+ - cosine_accuracy@1
+ - cosine_accuracy@20
+ - cosine_accuracy@50
+ - cosine_accuracy@100
+ - cosine_accuracy@150
+ - cosine_accuracy@200
+ - cosine_precision@1
+ - cosine_precision@20
+ - cosine_precision@50
+ - cosine_precision@100
+ - cosine_precision@150
+ - cosine_precision@200
+ - cosine_recall@1
+ - cosine_recall@20
+ - cosine_recall@50
+ - cosine_recall@100
+ - cosine_recall@150
+ - cosine_recall@200
+ - cosine_ndcg@1
+ - cosine_ndcg@20
+ - cosine_ndcg@50
+ - cosine_ndcg@100
+ - cosine_ndcg@150
+ - cosine_ndcg@200
+ - cosine_mrr@1
+ - cosine_mrr@20
+ - cosine_mrr@50
+ - cosine_mrr@100
+ - cosine_mrr@150
+ - cosine_mrr@200
+ - cosine_map@1
+ - cosine_map@20
+ - cosine_map@50
+ - cosine_map@100
+ - cosine_map@150
+ - cosine_map@200
+ - cosine_map@500
+ model-index:
+ - name: SentenceTransformer based on BAAI/bge-m3
+   results:
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: full en
+       type: full_en
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.6476190476190476
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 0.9904761904761905
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 0.9904761904761905
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 0.9904761904761905
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 0.9904761904761905
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 0.9904761904761905
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.6476190476190476
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.499047619047619
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.30266666666666664
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.18447619047619046
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.13155555555555554
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.10171428571428573
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.06690172806447445
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.5288155255988508
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.7128731386766649
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.821589853989195
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.8669290529739844
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.8881772271562451
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.6476190476190476
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.6737021289484512
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.6897381539459008
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.7455379155828873
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.7657730626526685
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.7746920852324353
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.6476190476190476
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.7969444444444443
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.7969444444444443
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.7969444444444443
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.7969444444444443
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.7969444444444443
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.6476190476190476
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.5299368408688423
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.5170402457535271
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.549577105065989
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.5580348324082148
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.5609705433942662
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.5664835460503455
+       name: Cosine Map@500
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: full es
+       type: full_es
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.12432432432432433
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 1.0
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 1.0
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 1.0
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 1.0
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 1.0
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.12432432432432433
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.5718918918918918
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.38832432432432434
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.25135135135135134
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.1886486486486487
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.15083783783783786
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.0036542148230633313
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.3813088657975513
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.5589819018381946
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.6712879484837694
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.7296378671854172
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.7646529145750729
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.12432432432432433
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.6162786673767947
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.5875500387824142
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.6146487956773306
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.6449661586574366
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.6628313427507618
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.12432432432432433
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.5585585585585586
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.5585585585585586
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.5585585585585586
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.5585585585585586
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.5585585585585586
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.12432432432432433
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.4830935685993706
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.4268637780839156
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.43032040469750343
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.4449589410699155
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.4523102942291434
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.4643631946508736
+       name: Cosine Map@500
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: full de
+       type: full_de
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.2955665024630542
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 0.9753694581280788
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 0.9852216748768473
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 0.9852216748768473
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 0.9901477832512315
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 0.9901477832512315
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.2955665024630542
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.5399014778325123
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.3829556650246305
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.25098522167487686
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.18742200328407224
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.14911330049261085
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.01108543831680986
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.33926725064737134
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.5319613376214742
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.6497082600959269
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.7094703332321319
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.7445597670438818
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.2955665024630542
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.5621043185251402
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.5505636839954736
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.5784375922614946
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.6091764880384499
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.6263384735475871
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.2955665024630542
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.5127296895769795
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.5130763416477695
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.5130763416477695
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.5131188080992728
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.5131188080992728
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.2955665024630542
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.42085554479107096
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.3779379416896035
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.38163165810143573
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.3961646378244818
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.40295816570523324
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.4167002568710484
+       name: Cosine Map@500
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: full zh
+       type: full_zh
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.6407766990291263
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 0.9902912621359223
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 0.9902912621359223
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 0.9902912621359223
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 0.9902912621359223
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 0.9902912621359223
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.6407766990291263
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.46504854368932047
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.27611650485436895
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.17097087378640777
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.12291262135922332
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.0969417475728155
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.05744396078263393
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.4978573021507442
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.6611813069264482
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.7796553453979224
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.8271677009796732
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.8637730394316714
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.6407766990291263
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.6374339653798218
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.6458466090741598
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.7026844413104963
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.7238302410564206
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.7383757321568225
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.6407766990291263
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.7983818770226538
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.7983818770226538
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.7983818770226538
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.7983818770226538
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.7983818770226538
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.6407766990291263
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.4902515378001179
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.46828607843970593
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.49742002930709256
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.5055517135202557
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.5100267276205871
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.5152273086702759
+       name: Cosine Map@500
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: mix es
+       type: mix_es
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.7358294331773271
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 0.9625585023400937
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 0.9802392095683827
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 0.9927197087883516
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 0.9947997919916797
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 0.9958398335933437
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.7358294331773271
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.12438897555902236
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.05158606344253771
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.026224648985959446
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.017628705148205928
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.013268330733229333
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.28403164698016486
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.9190414283237995
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.952244756456925
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.9685820766163981
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.9762870514820593
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.9801872074882996
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.7358294331773271
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.8089516774866639
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.8181299102768375
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.8217009899252086
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.8232345422421572
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.8239096085290897
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.7358294331773271
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.8035232306901704
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.8041564269676074
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.8043491602665708
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.8043649132860833
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.8043707455995762
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.7358294331773271
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.7407296211762635
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.7433011890905112
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.7437599072934008
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.7439220951644092
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.7439677461223776
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.7440630263326289
+       name: Cosine Map@500
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: mix de
+       type: mix_de
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.6947477899115965
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 0.967758710348414
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 0.984399375975039
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 0.9901196047841914
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 0.9932397295891836
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 0.9932397295891836
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.6947477899115965
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.12769110764430577
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.05316692667706709
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.026978679147165893
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.018082856647599233
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.013595943837753513
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.26064309239036226
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.9266163979892529
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.9632518634078697
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.9771190847633905
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.982232622638239
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.984659386375455
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.6947477899115965
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.7916550876560119
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.8018356667177752
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.8049830038156018
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.8060041518104935
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.8064526867706615
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.6947477899115965
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.775106319970792
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.7756762344136855
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.7757636235577245
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.7757917238264626
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.7757917238264626
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.6947477899115965
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.7123386461179687
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.7151736057555711
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.7156740227134941
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.7157705885677804
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.7158097678043102
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.7158747359338941
+       name: Cosine Map@500
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: mix zh
+       type: mix_zh
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.1814872594903796
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 1.0
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 1.0
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 1.0
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 1.0
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 1.0
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.1814872594903796
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.15439417576703063
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.0617576703068123
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.03087883515340615
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.020585890102270757
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.015439417576703075
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.058722729861575416
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 1.0
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 1.0
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 1.0
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 1.0
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 1.0
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.1814872594903796
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.5447038314336347
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.5447038314336347
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.5447038314336347
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.5447038314336347
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.5447038314336347
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.1814872594903796
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.40366659543726713
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.40366659543726713
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.40366659543726713
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.40366659543726713
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.40366659543726713
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.1814872594903796
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.32665499722442
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.32665499722442
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.32665499722442
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.32665499722442
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.32665499722442
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.32665499722442
+       name: Cosine Map@500
+ ---
+
+ # SentenceTransformer based on BAAI/bge-m3
+
+ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) on the full_en, full_de, full_es, full_zh and mix datasets. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
+
+ ## Model Details
+
+ ### Model Description
+ - **Model Type:** Sentence Transformer
+ - **Base model:** [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) <!-- at revision 5617a9f61b028005a4858fdac845db406aefb181 -->
+ - **Maximum Sequence Length:** 512 tokens
+ - **Output Dimensionality:** 1024 dimensions
+ - **Similarity Function:** Cosine Similarity
+ - **Training Datasets:**
+     - full_en
+     - full_de
+     - full_es
+     - full_zh
+     - mix
+ <!-- - **Language:** Unknown -->
+ <!-- - **License:** Unknown -->
+
+ ### Model Sources
+
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
+ - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
+
+ ### Full Model Architecture
+
+ ```
+ SentenceTransformer(
+   (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
+   (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
+   (2): Normalize()
+ )
+ ```
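The Pooling and Normalize modules above can be illustrated with a minimal sketch; the token vectors here are toy 2-d values, not real model activations. CLS pooling (`pooling_mode_cls_token: True`) keeps the first token's embedding, and `Normalize()` scales it to unit length so that dot products equal cosine similarities.

```python
import math

def cls_pool_and_normalize(token_embeddings):
    """Toy version of the card's Pooling (CLS mode) + Normalize steps.

    token_embeddings: list of per-token vectors (tokens x dims).
    """
    cls = token_embeddings[0]  # CLS pooling: take the first token's vector
    norm = math.sqrt(sum(x * x for x in cls))
    return [x / norm for x in cls]  # Normalize(): L2-normalize to unit length

# Two dummy "token" vectors; only the first (CLS) contributes to the output.
sentence_vec = cls_pool_and_normalize([[3.0, 4.0], [1.0, 0.0]])
```

Because every sentence vector has unit norm, the model's cosine similarity reduces to a plain dot product between embeddings.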
+
+ ## Usage
+
+ ### Direct Usage (Sentence Transformers)
+
+ First install the Sentence Transformers library:
+
+ ```bash
+ pip install -U sentence-transformers
+ ```
+
+ Then you can load this model and run inference.
+ ```python
+ from sentence_transformers import SentenceTransformer
+
+ # Download from the 🤗 Hub
+ model = SentenceTransformer("sentence_transformers_model_id")
+ # Run inference
+ sentences = [
+     'Volksvertreter',
+     'Parlamentarier',
+     'Oberbürgermeister',
+ ]
+ embeddings = model.encode(sentences)
+ print(embeddings.shape)
+ # [3, 1024]
+
+ # Get the similarity scores for the embeddings
+ similarities = model.similarity(embeddings, embeddings)
+ print(similarities.shape)
+ # [3, 3]
+ ```
975
+
976
+ <!--
977
+ ### Direct Usage (Transformers)
978
+
979
+ <details><summary>Click to see the direct usage in Transformers</summary>
980
+
981
+ </details>
982
+ -->
983
+
984
+ <!--
985
+ ### Downstream Usage (Sentence Transformers)
986
+
987
+ You can finetune this model on your own dataset.
988
+
989
+ <details><summary>Click to expand</summary>
990
+
991
+ </details>
992
+ -->
993
+
994
+ <!--
995
+ ### Out-of-Scope Use
996
+
997
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
998
+ -->
999
+
1000
+ ## Evaluation
1001
+
1002
+ ### Metrics
1003
+
1004
+ #### Information Retrieval
1005
+
1006
+ * Datasets: `full_en`, `full_es`, `full_de`, `full_zh`, `mix_es`, `mix_de` and `mix_zh`
1007
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
1008
+
1009
+ | Metric | full_en | full_es | full_de | full_zh | mix_es | mix_de | mix_zh |
1010
+ |:---------------------|:-----------|:-----------|:-----------|:-----------|:-----------|:-----------|:-----------|
1011
+ | cosine_accuracy@1 | 0.6476 | 0.1243 | 0.2956 | 0.6408 | 0.7358 | 0.6947 | 0.1815 |
1012
+ | cosine_accuracy@20 | 0.9905 | 1.0 | 0.9754 | 0.9903 | 0.9626 | 0.9678 | 1.0 |
1013
+ | cosine_accuracy@50 | 0.9905 | 1.0 | 0.9852 | 0.9903 | 0.9802 | 0.9844 | 1.0 |
1014
+ | cosine_accuracy@100 | 0.9905 | 1.0 | 0.9852 | 0.9903 | 0.9927 | 0.9901 | 1.0 |
+ | cosine_accuracy@150 | 0.9905 | 1.0 | 0.9901 | 0.9903 | 0.9948 | 0.9932 | 1.0 |
+ | cosine_accuracy@200 | 0.9905 | 1.0 | 0.9901 | 0.9903 | 0.9958 | 0.9932 | 1.0 |
+ | cosine_precision@1 | 0.6476 | 0.1243 | 0.2956 | 0.6408 | 0.7358 | 0.6947 | 0.1815 |
+ | cosine_precision@20 | 0.499 | 0.5719 | 0.5399 | 0.465 | 0.1244 | 0.1277 | 0.1544 |
+ | cosine_precision@50 | 0.3027 | 0.3883 | 0.383 | 0.2761 | 0.0516 | 0.0532 | 0.0618 |
+ | cosine_precision@100 | 0.1845 | 0.2514 | 0.251 | 0.171 | 0.0262 | 0.027 | 0.0309 |
+ | cosine_precision@150 | 0.1316 | 0.1886 | 0.1874 | 0.1229 | 0.0176 | 0.0181 | 0.0206 |
+ | cosine_precision@200 | 0.1017 | 0.1508 | 0.1491 | 0.0969 | 0.0133 | 0.0136 | 0.0154 |
+ | cosine_recall@1 | 0.0669 | 0.0037 | 0.0111 | 0.0574 | 0.284 | 0.2606 | 0.0587 |
+ | cosine_recall@20 | 0.5288 | 0.3813 | 0.3393 | 0.4979 | 0.919 | 0.9266 | 1.0 |
+ | cosine_recall@50 | 0.7129 | 0.559 | 0.532 | 0.6612 | 0.9522 | 0.9633 | 1.0 |
+ | cosine_recall@100 | 0.8216 | 0.6713 | 0.6497 | 0.7797 | 0.9686 | 0.9771 | 1.0 |
+ | cosine_recall@150 | 0.8669 | 0.7296 | 0.7095 | 0.8272 | 0.9763 | 0.9822 | 1.0 |
+ | cosine_recall@200 | 0.8882 | 0.7647 | 0.7446 | 0.8638 | 0.9802 | 0.9847 | 1.0 |
+ | cosine_ndcg@1 | 0.6476 | 0.1243 | 0.2956 | 0.6408 | 0.7358 | 0.6947 | 0.1815 |
+ | cosine_ndcg@20 | 0.6737 | 0.6163 | 0.5621 | 0.6374 | 0.809 | 0.7917 | 0.5447 |
+ | cosine_ndcg@50 | 0.6897 | 0.5876 | 0.5506 | 0.6458 | 0.8181 | 0.8018 | 0.5447 |
+ | cosine_ndcg@100 | 0.7455 | 0.6146 | 0.5784 | 0.7027 | 0.8217 | 0.805 | 0.5447 |
+ | cosine_ndcg@150 | 0.7658 | 0.645 | 0.6092 | 0.7238 | 0.8232 | 0.806 | 0.5447 |
+ | **cosine_ndcg@200** | **0.7747** | **0.6628** | **0.6263** | **0.7384** | **0.8239** | **0.8065** | **0.5447** |
+ | cosine_mrr@1 | 0.6476 | 0.1243 | 0.2956 | 0.6408 | 0.7358 | 0.6947 | 0.1815 |
+ | cosine_mrr@20 | 0.7969 | 0.5586 | 0.5127 | 0.7984 | 0.8035 | 0.7751 | 0.4037 |
+ | cosine_mrr@50 | 0.7969 | 0.5586 | 0.5131 | 0.7984 | 0.8042 | 0.7757 | 0.4037 |
+ | cosine_mrr@100 | 0.7969 | 0.5586 | 0.5131 | 0.7984 | 0.8043 | 0.7758 | 0.4037 |
+ | cosine_mrr@150 | 0.7969 | 0.5586 | 0.5131 | 0.7984 | 0.8044 | 0.7758 | 0.4037 |
+ | cosine_mrr@200 | 0.7969 | 0.5586 | 0.5131 | 0.7984 | 0.8044 | 0.7758 | 0.4037 |
+ | cosine_map@1 | 0.6476 | 0.1243 | 0.2956 | 0.6408 | 0.7358 | 0.6947 | 0.1815 |
+ | cosine_map@20 | 0.5299 | 0.4831 | 0.4209 | 0.4903 | 0.7407 | 0.7123 | 0.3267 |
+ | cosine_map@50 | 0.517 | 0.4269 | 0.3779 | 0.4683 | 0.7433 | 0.7152 | 0.3267 |
+ | cosine_map@100 | 0.5496 | 0.4303 | 0.3816 | 0.4974 | 0.7438 | 0.7157 | 0.3267 |
+ | cosine_map@150 | 0.558 | 0.445 | 0.3962 | 0.5056 | 0.7439 | 0.7158 | 0.3267 |
+ | cosine_map@200 | 0.561 | 0.4523 | 0.403 | 0.51 | 0.744 | 0.7158 | 0.3267 |
+ | cosine_map@500 | 0.5665 | 0.4644 | 0.4167 | 0.5152 | 0.7441 | 0.7159 | 0.3267 |
+
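The table above reports standard retrieval metrics at several cutoffs. As a reminder of how these are defined, here is a minimal pure-Python sketch (the ranking and relevance sets are toy values invented for illustration) computing precision@k, recall@k, and MRR for a single query:

```python
def precision_at_k(ranked, relevant, k):
    """Fraction of the top-k ranked items that are relevant."""
    top_k = ranked[:k]
    return sum(1 for doc in top_k if doc in relevant) / k

def recall_at_k(ranked, relevant, k):
    """Fraction of all relevant items that appear in the top k."""
    top_k = ranked[:k]
    return sum(1 for doc in top_k if doc in relevant) / len(relevant)

def mrr(ranked, relevant):
    """Reciprocal rank of the first relevant item (0 if none is retrieved)."""
    for rank, doc in enumerate(ranked, start=1):
        if doc in relevant:
            return 1.0 / rank
    return 0.0

# Toy example: 6 retrieved documents, 3 of which are relevant overall.
ranked = ["d3", "d7", "d1", "d9", "d2", "d5"]
relevant = {"d1", "d2", "d8"}

print(precision_at_k(ranked, relevant, 3))  # 1 relevant in top 3 -> 0.333...
print(recall_at_k(ranked, relevant, 3))     # 1 of 3 relevant found -> 0.333...
print(mrr(ranked, relevant))                # first hit at rank 3 -> 0.333...
```

Note how, as in the table, MRR saturates once the first relevant hit is within the cutoff, while recall keeps growing with k.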
1049
+ <!--
+ ## Bias, Risks and Limitations
+
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
+ -->
+
+ <!--
+ ### Recommendations
+
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
+ -->
+
+ ## Training Details
+
+ ### Training Datasets
+ <details><summary>full_en</summary>
+
+ #### full_en
+
+ * Dataset: full_en
+ * Size: 28,880 training samples
+ * Columns: <code>anchor</code> and <code>positive</code>
+ * Approximate statistics based on the first 1000 samples:
+   | | anchor | positive |
+   |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
+   | type | string | string |
+   | details | <ul><li>min: 3 tokens</li><li>mean: 5.68 tokens</li><li>max: 11 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 5.76 tokens</li><li>max: 12 tokens</li></ul> |
+ * Samples:
+   | anchor | positive |
+   |:-----------------------------------------|:-----------------------------------------|
+   | <code>air commodore</code> | <code>flight lieutenant</code> |
+   | <code>command and control officer</code> | <code>flight officer</code> |
+   | <code>air commodore</code> | <code>command and control officer</code> |
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
+   ```json
+   {'guide': SentenceTransformer(
+     (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
+     (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
+     (2): Normalize()
+   ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
+   ```
+ </details>
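GISTEmbedLoss differs from a plain in-batch-negatives loss by using the guide model to veto likely false negatives: any in-batch candidate the guide scores at least as similar to the anchor as the labeled positive is masked out before the contrastive softmax. With `margin_strategy: absolute` and `margin: 0.0`, as configured here, the veto threshold is exactly the positive's guide similarity. The following is a minimal pure-Python illustration of that masking rule (function name and toy numbers are invented; this is not the library implementation):

```python
def mask_false_negatives(guide_sims, positive_idx):
    """Given one anchor's guide-model similarities to every in-batch
    candidate, return the candidate indices kept as negatives.
    Candidates the guide rates >= the labeled positive are dropped,
    since they are likely false negatives (e.g. near-synonym job titles).
    With an 'absolute' margin strategy, the threshold is
    sim(anchor, positive) - margin; margin is 0.0 here.
    """
    threshold = guide_sims[positive_idx]  # margin = 0.0
    return [
        i for i, sim in enumerate(guide_sims)
        if i != positive_idx and sim < threshold
    ]

# Toy batch of 5 candidates; index 2 is the labeled positive.
guide_sims = [0.31, 0.95, 0.88, 0.40, 0.90]
keep = mask_false_negatives(guide_sims, positive_idx=2)
print(keep)  # candidates 1 and 4 beat the positive -> only [0, 3] remain
```

The low `temperature` (0.01) then sharpens the softmax over the surviving negatives, so the loss concentrates on the hardest ones that passed the guide's filter.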
1091
+ <details><summary>full_de</summary>
+
+ #### full_de
+
+ * Dataset: full_de
+ * Size: 23,023 training samples
+ * Columns: <code>anchor</code> and <code>positive</code>
+ * Approximate statistics based on the first 1000 samples:
+   | | anchor | positive |
+   |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
+   | type | string | string |
+   | details | <ul><li>min: 3 tokens</li><li>mean: 7.99 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.19 tokens</li><li>max: 30 tokens</li></ul> |
+ * Samples:
+   | anchor | positive |
+   |:----------------------------------|:-----------------------------------------------------|
+   | <code>Staffelkommandantin</code> | <code>Kommodore</code> |
+   | <code>Luftwaffenoffizierin</code> | <code>Luftwaffenoffizier/Luftwaffenoffizierin</code> |
+   | <code>Staffelkommandantin</code> | <code>Luftwaffenoffizierin</code> |
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
+   ```json
+   {'guide': SentenceTransformer(
+     (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
+     (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
+     (2): Normalize()
+   ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
+   ```
+ </details>
+ <details><summary>full_es</summary>
+
+ #### full_es
+
+ * Dataset: full_es
+ * Size: 20,724 training samples
+ * Columns: <code>anchor</code> and <code>positive</code>
+ * Approximate statistics based on the first 1000 samples:
+   | | anchor | positive |
+   |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
+   | type | string | string |
+   | details | <ul><li>min: 3 tokens</li><li>mean: 9.13 tokens</li><li>max: 32 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.84 tokens</li><li>max: 32 tokens</li></ul> |
+ * Samples:
+   | anchor | positive |
+   |:------------------------------------|:-------------------------------------------|
+   | <code>jefe de escuadrón</code> | <code>instructor</code> |
+   | <code>comandante de aeronave</code> | <code>instructor de simulador</code> |
+   | <code>instructor</code> | <code>oficial del Ejército del Aire</code> |
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
+   ```json
+   {'guide': SentenceTransformer(
+     (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
+     (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
+     (2): Normalize()
+   ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
+   ```
+ </details>
+ <details><summary>full_zh</summary>
+
+ #### full_zh
+
+ * Dataset: full_zh
+ * Size: 30,401 training samples
+ * Columns: <code>anchor</code> and <code>positive</code>
+ * Approximate statistics based on the first 1000 samples:
+   | | anchor | positive |
+   |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
+   | type | string | string |
+   | details | <ul><li>min: 5 tokens</li><li>mean: 7.15 tokens</li><li>max: 14 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 7.46 tokens</li><li>max: 21 tokens</li></ul> |
+ * Samples:
+   | anchor | positive |
+   |:------------------|:---------------------|
+   | <code>技术总监</code> | <code>技术和运营总监</code> |
+   | <code>技术总监</code> | <code>技术主管</code> |
+   | <code>技术总监</code> | <code>技术艺术总监</code> |
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
+   ```json
+   {'guide': SentenceTransformer(
+     (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
+     (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
+     (2): Normalize()
+   ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
+   ```
+ </details>
+ <details><summary>mix</summary>
+
+ #### mix
+
+ * Dataset: mix
+ * Size: 21,760 training samples
+ * Columns: <code>anchor</code> and <code>positive</code>
+ * Approximate statistics based on the first 1000 samples:
+   | | anchor | positive |
+   |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
+   | type | string | string |
+   | details | <ul><li>min: 2 tokens</li><li>mean: 6.71 tokens</li><li>max: 19 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 7.69 tokens</li><li>max: 19 tokens</li></ul> |
+ * Samples:
+   | anchor | positive |
+   |:------------------------------------------|:----------------------------------------------------------------|
+   | <code>technical manager</code> | <code>Technischer Direktor für Bühne, Film und Fernsehen</code> |
+   | <code>head of technical</code> | <code>directora técnica</code> |
+   | <code>head of technical department</code> | <code>技术艺术总监</code> |
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
+   ```json
+   {'guide': SentenceTransformer(
+     (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
+     (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
+     (2): Normalize()
+   ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
+   ```
+ </details>
+
1200
+ ### Training Hyperparameters
+ #### Non-Default Hyperparameters
+
+ - `eval_strategy`: steps
+ - `per_device_train_batch_size`: 64
+ - `per_device_eval_batch_size`: 128
+ - `gradient_accumulation_steps`: 2
+ - `num_train_epochs`: 5
+ - `warmup_ratio`: 0.05
+ - `log_on_each_node`: False
+ - `fp16`: True
+ - `dataloader_num_workers`: 4
+ - `ddp_find_unused_parameters`: True
+ - `batch_sampler`: no_duplicates
+
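Two of these settings interact: `per_device_train_batch_size: 64` with `gradient_accumulation_steps: 2` gives 128 samples per device per optimizer step (times the number of GPUs under DDP), and `warmup_ratio: 0.05` with the default `linear` scheduler ramps the learning rate to its 5e-5 peak over the first 5% of steps before decaying to zero. A small sketch of that schedule (the function name is ours; it mirrors, rather than calls, the Hugging Face `linear` scheduler):

```python
def linear_schedule_lr(step, total_steps, peak_lr=5e-5, warmup_ratio=0.05):
    """Linear warmup to peak_lr over the first warmup_ratio of training,
    then linear decay to zero."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return peak_lr * step / max(1, warmup_steps)
    return peak_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

effective_batch = 64 * 2  # per-device batch x gradient accumulation
print(effective_batch)                 # 128 samples per device per optimizer step
print(linear_schedule_lr(0, 1000))     # 0.0 at the start of warmup
print(linear_schedule_lr(50, 1000))    # 5e-05 at the end of warmup
print(linear_schedule_lr(1000, 1000))  # 0.0 at the end of training
```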
1215
+ #### All Hyperparameters
+ <details><summary>Click to expand</summary>
+
+ - `overwrite_output_dir`: False
+ - `do_predict`: False
+ - `eval_strategy`: steps
+ - `prediction_loss_only`: True
+ - `per_device_train_batch_size`: 64
+ - `per_device_eval_batch_size`: 128
+ - `per_gpu_train_batch_size`: None
+ - `per_gpu_eval_batch_size`: None
+ - `gradient_accumulation_steps`: 2
+ - `eval_accumulation_steps`: None
+ - `torch_empty_cache_steps`: None
+ - `learning_rate`: 5e-05
+ - `weight_decay`: 0.0
+ - `adam_beta1`: 0.9
+ - `adam_beta2`: 0.999
+ - `adam_epsilon`: 1e-08
+ - `max_grad_norm`: 1.0
+ - `num_train_epochs`: 5
+ - `max_steps`: -1
+ - `lr_scheduler_type`: linear
+ - `lr_scheduler_kwargs`: {}
+ - `warmup_ratio`: 0.05
+ - `warmup_steps`: 0
+ - `log_level`: passive
+ - `log_level_replica`: warning
+ - `log_on_each_node`: False
+ - `logging_nan_inf_filter`: True
+ - `save_safetensors`: True
+ - `save_on_each_node`: False
+ - `save_only_model`: False
+ - `restore_callback_states_from_checkpoint`: False
+ - `no_cuda`: False
+ - `use_cpu`: False
+ - `use_mps_device`: False
+ - `seed`: 42
+ - `data_seed`: None
+ - `jit_mode_eval`: False
+ - `use_ipex`: False
+ - `bf16`: False
+ - `fp16`: True
+ - `fp16_opt_level`: O1
+ - `half_precision_backend`: auto
+ - `bf16_full_eval`: False
+ - `fp16_full_eval`: False
+ - `tf32`: None
+ - `local_rank`: 0
+ - `ddp_backend`: None
+ - `tpu_num_cores`: None
+ - `tpu_metrics_debug`: False
+ - `debug`: []
+ - `dataloader_drop_last`: True
+ - `dataloader_num_workers`: 4
+ - `dataloader_prefetch_factor`: None
+ - `past_index`: -1
+ - `disable_tqdm`: False
+ - `remove_unused_columns`: True
+ - `label_names`: None
+ - `load_best_model_at_end`: False
+ - `ignore_data_skip`: False
+ - `fsdp`: []
+ - `fsdp_min_num_params`: 0
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
+ - `tp_size`: 0
+ - `fsdp_transformer_layer_cls_to_wrap`: None
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
+ - `deepspeed`: None
+ - `label_smoothing_factor`: 0.0
+ - `optim`: adamw_torch
+ - `optim_args`: None
+ - `adafactor`: False
+ - `group_by_length`: False
+ - `length_column_name`: length
+ - `ddp_find_unused_parameters`: True
+ - `ddp_bucket_cap_mb`: None
+ - `ddp_broadcast_buffers`: False
+ - `dataloader_pin_memory`: True
+ - `dataloader_persistent_workers`: False
+ - `skip_memory_metrics`: True
+ - `use_legacy_prediction_loop`: False
+ - `push_to_hub`: False
+ - `resume_from_checkpoint`: None
+ - `hub_model_id`: None
+ - `hub_strategy`: every_save
+ - `hub_private_repo`: None
+ - `hub_always_push`: False
+ - `gradient_checkpointing`: False
+ - `gradient_checkpointing_kwargs`: None
+ - `include_inputs_for_metrics`: False
+ - `include_for_metrics`: []
+ - `eval_do_concat_batches`: True
+ - `fp16_backend`: auto
+ - `push_to_hub_model_id`: None
+ - `push_to_hub_organization`: None
+ - `mp_parameters`:
+ - `auto_find_batch_size`: False
+ - `full_determinism`: False
+ - `torchdynamo`: None
+ - `ray_scope`: last
+ - `ddp_timeout`: 1800
+ - `torch_compile`: False
+ - `torch_compile_backend`: None
+ - `torch_compile_mode`: None
+ - `include_tokens_per_second`: False
+ - `include_num_input_tokens_seen`: False
+ - `neftune_noise_alpha`: None
+ - `optim_target_modules`: None
+ - `batch_eval_metrics`: False
+ - `eval_on_start`: False
+ - `use_liger_kernel`: False
+ - `eval_use_gather_object`: False
+ - `average_tokens_across_devices`: False
+ - `prompts`: None
+ - `batch_sampler`: no_duplicates
+ - `multi_dataset_batch_sampler`: proportional
+
+ </details>
+
1335
+ ### Training Logs
+ | Epoch | Step | Training Loss | full_en_cosine_ndcg@200 | full_es_cosine_ndcg@200 | full_de_cosine_ndcg@200 | full_zh_cosine_ndcg@200 | mix_es_cosine_ndcg@200 | mix_de_cosine_ndcg@200 | mix_zh_cosine_ndcg@200 |
+ |:------:|:----:|:-------------:|:-----------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:----------------------:|:----------------------:|:----------------------:|
+ | -1 | -1 | - | 0.6856 | 0.5207 | 0.4655 | 0.6713 | 0.6224 | 0.5604 | 0.5548 |
+ | 0.0010 | 1 | 5.3354 | - | - | - | - | - | - | - |
+ | 0.1027 | 100 | 2.665 | - | - | - | - | - | - | - |
+ | 0.2053 | 200 | 1.3375 | 0.7691 | 0.6530 | 0.6298 | 0.7517 | 0.7513 | 0.7393 | 0.5490 |
+ | 0.3080 | 300 | 1.1101 | - | - | - | - | - | - | - |
+ | 0.4107 | 400 | 0.9453 | 0.7802 | 0.6643 | 0.6246 | 0.7531 | 0.7610 | 0.7441 | 0.5493 |
+ | 0.5133 | 500 | 0.9202 | - | - | - | - | - | - | - |
+ | 0.6160 | 600 | 0.7887 | 0.7741 | 0.6549 | 0.6171 | 0.7542 | 0.7672 | 0.7540 | 0.5482 |
+ | 0.7187 | 700 | 0.7604 | - | - | - | - | - | - | - |
+ | 0.8214 | 800 | 0.7219 | 0.7846 | 0.6674 | 0.6244 | 0.7648 | 0.7741 | 0.7592 | 0.5497 |
+ | 0.9240 | 900 | 0.6965 | - | - | - | - | - | - | - |
+ | 1.0267 | 1000 | 0.6253 | 0.7646 | 0.6391 | 0.6122 | 0.7503 | 0.7825 | 0.7704 | 0.5463 |
+ | 1.1294 | 1100 | 0.4737 | - | - | - | - | - | - | - |
+ | 1.2320 | 1200 | 0.5055 | 0.7758 | 0.6582 | 0.6178 | 0.7514 | 0.7857 | 0.7764 | 0.5501 |
+ | 1.3347 | 1300 | 0.5042 | - | - | - | - | - | - | - |
+ | 1.4374 | 1400 | 0.5073 | 0.7613 | 0.6578 | 0.6178 | 0.7505 | 0.7829 | 0.7762 | 0.5452 |
+ | 1.5400 | 1500 | 0.4975 | - | - | - | - | - | - | - |
+ | 1.6427 | 1600 | 0.5242 | 0.7736 | 0.6673 | 0.6279 | 0.7555 | 0.7940 | 0.7859 | 0.5477 |
+ | 1.7454 | 1700 | 0.4713 | - | - | - | - | - | - | - |
+ | 1.8480 | 1800 | 0.4814 | 0.7845 | 0.6733 | 0.6285 | 0.7642 | 0.7992 | 0.7904 | 0.5449 |
+ | 1.9507 | 1900 | 0.4526 | - | - | - | - | - | - | - |
+ | 2.0544 | 2000 | 0.36 | 0.7790 | 0.6639 | 0.6252 | 0.7500 | 0.8032 | 0.7888 | 0.5499 |
+ | 2.1571 | 2100 | 0.3744 | - | - | - | - | - | - | - |
+ | 2.2598 | 2200 | 0.3031 | 0.7787 | 0.6614 | 0.6190 | 0.7537 | 0.7993 | 0.7811 | 0.5476 |
+ | 2.3624 | 2300 | 0.3638 | - | - | - | - | - | - | - |
+ | 2.4651 | 2400 | 0.358 | 0.7798 | 0.6615 | 0.6258 | 0.7497 | 0.8018 | 0.7828 | 0.5481 |
+ | 2.5678 | 2500 | 0.3247 | - | - | - | - | - | - | - |
+ | 2.6704 | 2600 | 0.3247 | 0.7854 | 0.6663 | 0.6248 | 0.7560 | 0.8081 | 0.7835 | 0.5452 |
+ | 2.7731 | 2700 | 0.3263 | - | - | - | - | - | - | - |
+ | 2.8758 | 2800 | 0.3212 | 0.7761 | 0.6681 | 0.6250 | 0.7517 | 0.8121 | 0.7927 | 0.5458 |
+ | 2.9784 | 2900 | 0.3291 | - | - | - | - | - | - | - |
+ | 3.0821 | 3000 | 0.2816 | 0.7727 | 0.6604 | 0.6163 | 0.7370 | 0.8163 | 0.7985 | 0.5473 |
+ | 3.1848 | 3100 | 0.2698 | - | - | - | - | - | - | - |
+ | 3.2875 | 3200 | 0.2657 | 0.7757 | 0.6615 | 0.6247 | 0.7417 | 0.8117 | 0.8004 | 0.5436 |
+ | 3.3901 | 3300 | 0.2724 | - | - | - | - | - | - | - |
+ | 3.4928 | 3400 | 0.2584 | 0.7850 | 0.6583 | 0.6320 | 0.7458 | 0.8120 | 0.7980 | 0.5454 |
+ | 3.5955 | 3500 | 0.2573 | - | - | - | - | - | - | - |
+ | 3.6982 | 3600 | 0.2744 | 0.7796 | 0.6552 | 0.6237 | 0.7409 | 0.8193 | 0.8018 | 0.5466 |
+ | 3.8008 | 3700 | 0.3054 | - | - | - | - | - | - | - |
+ | 3.9035 | 3800 | 0.2727 | 0.7825 | 0.6642 | 0.6293 | 0.7504 | 0.8213 | 0.8058 | 0.5463 |
+ | 4.0062 | 3900 | 0.2353 | - | - | - | - | - | - | - |
+ | 4.1088 | 4000 | 0.2353 | 0.7747 | 0.6628 | 0.6263 | 0.7384 | 0.8239 | 0.8065 | 0.5447 |
+
+
1382
+ ### Framework Versions
+ - Python: 3.11.11
+ - Sentence Transformers: 4.1.0
+ - Transformers: 4.51.2
+ - PyTorch: 2.6.0+cu124
+ - Accelerate: 1.6.0
+ - Datasets: 3.5.0
+ - Tokenizers: 0.21.1
+
+ ## Citation
+
+ ### BibTeX
+
+ #### Sentence Transformers
+ ```bibtex
+ @inproceedings{reimers-2019-sentence-bert,
+     title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
+     author = "Reimers, Nils and Gurevych, Iryna",
+     booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
+     month = "11",
+     year = "2019",
+     publisher = "Association for Computational Linguistics",
+     url = "https://arxiv.org/abs/1908.10084",
+ }
+ ```
+
+ #### GISTEmbedLoss
+ ```bibtex
+ @misc{solatorio2024gistembed,
+     title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
+     author={Aivin V. Solatorio},
+     year={2024},
+     eprint={2402.16829},
+     archivePrefix={arXiv},
+     primaryClass={cs.LG}
+ }
+ ```
+
+ <!--
+ ## Glossary
+
+ *Clearly define terms in order to be accessible across audiences.*
+ -->
+
+ <!--
+ ## Model Card Authors
+
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
+ -->
+
+ <!--
+ ## Model Card Contact
+
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
+ -->
checkpoint-4000/modules.json ADDED
@@ -0,0 +1,20 @@
+ [
+   {
+     "idx": 0,
+     "name": "0",
+     "path": "",
+     "type": "sentence_transformers.models.Transformer"
+   },
+   {
+     "idx": 1,
+     "name": "1",
+     "path": "1_Pooling",
+     "type": "sentence_transformers.models.Pooling"
+   },
+   {
+     "idx": 2,
+     "name": "2",
+     "path": "2_Normalize",
+     "type": "sentence_transformers.models.Normalize"
+   }
+ ]
checkpoint-4000/sentencepiece.bpe.model ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:cfc8146abe2a0488e9e2a0c56de7952f7c11ab059eca145a0a727afce0db2865
+ size 5069051
checkpoint-4000/tokenizer.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d9a6af42442a3e3e9f05f618eae0bb2d98ca4f6a6406cb80ef7a4fa865204d61
+ size 17083052
checkpoint-4200/config.json ADDED
@@ -0,0 +1,27 @@
+ {
+   "architectures": [
+     "XLMRobertaModel"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "bos_token_id": 0,
+   "classifier_dropout": null,
+   "eos_token_id": 2,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 1024,
+   "initializer_range": 0.02,
+   "intermediate_size": 4096,
+   "layer_norm_eps": 1e-05,
+   "max_position_embeddings": 8194,
+   "model_type": "xlm-roberta",
+   "num_attention_heads": 16,
+   "num_hidden_layers": 24,
+   "output_past": true,
+   "pad_token_id": 1,
+   "position_embedding_type": "absolute",
+   "torch_dtype": "float32",
+   "transformers_version": "4.51.2",
+   "type_vocab_size": 1,
+   "use_cache": true,
+   "vocab_size": 250002
+ }
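This config describes the XLM-RoBERTa-large-style encoder that BAAI/bge-m3 is built on: 24 layers, hidden size 1024, intermediate size 4096, a 250,002-entry vocabulary, and 8,194 positions. A back-of-the-envelope parameter count falls out of the config alone (small terms such as LayerNorm scales/biases and the pooler are ignored, so this is an approximation):

```python
# Rough parameter count from the config above.
hidden, inter, layers = 1024, 4096, 24
vocab, positions, type_vocab = 250002, 8194, 1

# Word + position + token-type embedding tables.
embeddings = (vocab + positions + type_vocab) * hidden
# Q, K, V and output projections, each with a bias.
attention = 4 * (hidden * hidden + hidden)
# Two feed-forward projections with biases.
ffn = hidden * inter + inter + inter * hidden + hidden
per_layer = attention + ffn
total = embeddings + layers * per_layer

print(f"{total / 1e6:.0f}M parameters")  # -> 567M parameters
```

That puts the encoder a bit over half a billion parameters, which is why each full-precision checkpoint directory here is on the order of gigabytes.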
checkpoint-4200/config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
+ {
+   "__version__": {
+     "sentence_transformers": "4.1.0",
+     "transformers": "4.51.2",
+     "pytorch": "2.6.0+cu124"
+   },
+   "prompts": {},
+   "default_prompt_name": null,
+   "similarity_fn_name": "cosine"
+ }
checkpoint-4200/special_tokens_map.json ADDED
@@ -0,0 +1,51 @@
+ {
+   "bos_token": {
+     "content": "<s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "cls_token": {
+     "content": "<s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "</s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "mask_token": {
+     "content": "<mask>",
+     "lstrip": true,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": {
+     "content": "<pad>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "sep_token": {
+     "content": "</s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "unk_token": {
+     "content": "<unk>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
checkpoint-4200/tokenizer.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d9a6af42442a3e3e9f05f618eae0bb2d98ca4f6a6406cb80ef7a4fa865204d61
+ size 17083052
checkpoint-4200/tokenizer_config.json ADDED
@@ -0,0 +1,56 @@
+ {
+   "added_tokens_decoder": {
+     "0": {
+       "content": "<s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "1": {
+       "content": "<pad>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "2": {
+       "content": "</s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "3": {
+       "content": "<unk>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "250001": {
+       "content": "<mask>",
+       "lstrip": true,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "bos_token": "<s>",
+   "clean_up_tokenization_spaces": true,
+   "cls_token": "<s>",
+   "eos_token": "</s>",
+   "extra_special_tokens": {},
+   "mask_token": "<mask>",
+   "model_max_length": 8192,
+   "pad_token": "<pad>",
+   "sep_token": "</s>",
+   "sp_model_kwargs": {},
+   "tokenizer_class": "XLMRobertaTokenizer",
+   "unk_token": "<unk>"
+ }
checkpoint-4200/trainer_state.json ADDED
The diff for this file is too large to render. See raw diff
 
checkpoint-4400/1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
+ {
+   "word_embedding_dimension": 1024,
+   "pooling_mode_cls_token": true,
+   "pooling_mode_mean_tokens": false,
+   "pooling_mode_max_tokens": false,
+   "pooling_mode_mean_sqrt_len_tokens": false,
+   "pooling_mode_weightedmean_tokens": false,
+   "pooling_mode_lasttoken": false,
+   "include_prompt": true
+ }
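Per this pooling config (`pooling_mode_cls_token: true`) and the `2_Normalize` module in `modules.json`, the checkpoint produces a sentence embedding by taking the CLS (first) token's 1024-dimensional vector and L2-normalizing it, so dot products between embeddings equal cosine similarity (matching `similarity_fn_name: cosine`). A minimal pure-Python sketch with toy 4-dimensional vectors (real vectors are 1024-dimensional):

```python
import math

def cls_pool_and_normalize(token_embeddings):
    """Take the first (CLS) token's vector, as configured by
    pooling_mode_cls_token=true, then L2-normalize it so that dot
    products between sentence embeddings are cosine similarities."""
    cls = token_embeddings[0]
    norm = math.sqrt(sum(x * x for x in cls))
    return [x / norm for x in cls]

# Toy sequence of 3 token embeddings with dimension 4.
tokens = [
    [3.0, 4.0, 0.0, 0.0],  # CLS token
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 2.0, 0.0, 1.0],
]
print(cls_pool_and_normalize(tokens))  # [0.6, 0.8, 0.0, 0.0], unit length
```

Note this differs from the guide model used in training, which mean-pools over all tokens; the fine-tuned model itself pools only the CLS token.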
checkpoint-4400/README.md ADDED
@@ -0,0 +1,1440 @@
+ ---
+ tags:
+ - sentence-transformers
+ - sentence-similarity
+ - feature-extraction
+ - generated_from_trainer
+ - dataset_size:124788
+ - loss:GISTEmbedLoss
+ base_model: BAAI/bge-m3
+ widget:
+ - source_sentence: 其他机械、设备和有形货物租赁服务代表
+   sentences:
+   - 其他机械和设备租赁服务工作人员
+   - 电子和电信设备及零部件物流经理
+   - 工业主厨
+ - source_sentence: 公交车司机
+   sentences:
+   - 表演灯光设计师
+   - 乙烯基地板安装工
+   - 国际巴士司机
+ - source_sentence: online communication manager
+   sentences:
+   - trades union official
+   - social media manager
+   - budget manager
+ - source_sentence: Projektmanagerin
+   sentences:
+   - Projektmanager/Projektmanagerin
+   - Category-Manager
+   - Infanterist
+ - source_sentence: Volksvertreter
+   sentences:
+   - Parlamentarier
+   - Oberbürgermeister
+   - Konsul
+ pipeline_tag: sentence-similarity
+ library_name: sentence-transformers
+ metrics:
+ - cosine_accuracy@1
+ - cosine_accuracy@20
+ - cosine_accuracy@50
+ - cosine_accuracy@100
+ - cosine_accuracy@150
+ - cosine_accuracy@200
+ - cosine_precision@1
+ - cosine_precision@20
+ - cosine_precision@50
+ - cosine_precision@100
+ - cosine_precision@150
+ - cosine_precision@200
+ - cosine_recall@1
+ - cosine_recall@20
+ - cosine_recall@50
+ - cosine_recall@100
+ - cosine_recall@150
+ - cosine_recall@200
+ - cosine_ndcg@1
+ - cosine_ndcg@20
+ - cosine_ndcg@50
+ - cosine_ndcg@100
+ - cosine_ndcg@150
+ - cosine_ndcg@200
+ - cosine_mrr@1
+ - cosine_mrr@20
+ - cosine_mrr@50
+ - cosine_mrr@100
+ - cosine_mrr@150
+ - cosine_mrr@200
+ - cosine_map@1
+ - cosine_map@20
+ - cosine_map@50
+ - cosine_map@100
+ - cosine_map@150
+ - cosine_map@200
+ - cosine_map@500
+ model-index:
+ - name: SentenceTransformer based on BAAI/bge-m3
+   results:
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: full en
+       type: full_en
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.6571428571428571
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 0.9904761904761905
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 0.9904761904761905
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 0.9904761904761905
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 0.9904761904761905
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 0.9904761904761905
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.6571428571428571
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.501904761904762
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.30514285714285716
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.18476190476190474
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.13238095238095238
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.10223809523809524
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.06749696615971254
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.5348166179254283
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.7176194992567407
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.8203546241789754
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.8712408549365904
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.8993000584751492
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.6571428571428571
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.6791929962471466
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.6958143211009435
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.7493655431536407
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.7715718645271473
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.7814931000676181
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.6571428571428571
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.8026984126984127
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.8026984126984127
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.8026984126984127
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.8026984126984127
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.8026984126984127
+       name: Cosine Mrr@200
176
+ - type: cosine_map@1
177
+ value: 0.6571428571428571
178
+ name: Cosine Map@1
179
+ - type: cosine_map@20
180
+ value: 0.5371258373378305
181
+ name: Cosine Map@20
182
+ - type: cosine_map@50
183
+ value: 0.5243155763407285
184
+ name: Cosine Map@50
185
+ - type: cosine_map@100
186
+ value: 0.5561427452138551
187
+ name: Cosine Map@100
188
+ - type: cosine_map@150
189
+ value: 0.5652920456249697
190
+ name: Cosine Map@150
191
+ - type: cosine_map@200
192
+ value: 0.5681007357520309
193
+ name: Cosine Map@200
194
+ - type: cosine_map@500
195
+ value: 0.5730541345190991
196
+ name: Cosine Map@500
197
+ - task:
198
+ type: information-retrieval
199
+ name: Information Retrieval
200
+ dataset:
201
+ name: full es
202
+ type: full_es
203
+ metrics:
204
+ - type: cosine_accuracy@1
205
+ value: 0.10810810810810811
206
+ name: Cosine Accuracy@1
207
+ - type: cosine_accuracy@20
208
+ value: 1.0
209
+ name: Cosine Accuracy@20
210
+ - type: cosine_accuracy@50
211
+ value: 1.0
212
+ name: Cosine Accuracy@50
213
+ - type: cosine_accuracy@100
214
+ value: 1.0
215
+ name: Cosine Accuracy@100
216
+ - type: cosine_accuracy@150
217
+ value: 1.0
218
+ name: Cosine Accuracy@150
219
+ - type: cosine_accuracy@200
220
+ value: 1.0
221
+ name: Cosine Accuracy@200
222
+ - type: cosine_precision@1
223
+ value: 0.10810810810810811
224
+ name: Cosine Precision@1
225
+ - type: cosine_precision@20
226
+ value: 0.5667567567567569
227
+ name: Cosine Precision@20
228
+ - type: cosine_precision@50
229
+ value: 0.3877837837837838
230
+ name: Cosine Precision@50
231
+ - type: cosine_precision@100
232
+ value: 0.25156756756756754
233
+ name: Cosine Precision@100
234
+ - type: cosine_precision@150
235
+ value: 0.18954954954954953
236
+ name: Cosine Precision@150
237
+ - type: cosine_precision@200
238
+ value: 0.15067567567567566
239
+ name: Cosine Precision@200
240
+ - type: cosine_recall@1
241
+ value: 0.0033677005752683685
242
+ name: Cosine Recall@1
243
+ - type: cosine_recall@20
244
+ value: 0.3790230473715137
245
+ name: Cosine Recall@20
246
+ - type: cosine_recall@50
247
+ value: 0.5587328778405388
248
+ name: Cosine Recall@50
249
+ - type: cosine_recall@100
250
+ value: 0.670664457795493
251
+ name: Cosine Recall@100
252
+ - type: cosine_recall@150
253
+ value: 0.7335635895457856
254
+ name: Cosine Recall@150
255
+ - type: cosine_recall@200
256
+ value: 0.766278425246947
257
+ name: Cosine Recall@200
258
+ - type: cosine_ndcg@1
259
+ value: 0.10810810810810811
260
+ name: Cosine Ndcg@1
261
+ - type: cosine_ndcg@20
262
+ value: 0.613008635976177
263
+ name: Cosine Ndcg@20
264
+ - type: cosine_ndcg@50
265
+ value: 0.5878242736285791
266
+ name: Cosine Ndcg@50
267
+ - type: cosine_ndcg@100
268
+ value: 0.6148703843706662
269
+ name: Cosine Ndcg@100
270
+ - type: cosine_ndcg@150
271
+ value: 0.6471060871986968
272
+ name: Cosine Ndcg@150
273
+ - type: cosine_ndcg@200
274
+ value: 0.6634453873788777
275
+ name: Cosine Ndcg@200
276
+ - type: cosine_mrr@1
277
+ value: 0.10810810810810811
278
+ name: Cosine Mrr@1
279
+ - type: cosine_mrr@20
280
+ value: 0.5509009009009009
281
+ name: Cosine Mrr@20
282
+ - type: cosine_mrr@50
283
+ value: 0.5509009009009009
284
+ name: Cosine Mrr@50
285
+ - type: cosine_mrr@100
286
+ value: 0.5509009009009009
287
+ name: Cosine Mrr@100
288
+ - type: cosine_mrr@150
289
+ value: 0.5509009009009009
290
+ name: Cosine Mrr@150
291
+ - type: cosine_mrr@200
292
+ value: 0.5509009009009009
293
+ name: Cosine Mrr@200
294
+ - type: cosine_map@1
295
+ value: 0.10810810810810811
296
+ name: Cosine Map@1
297
+ - type: cosine_map@20
298
+ value: 0.48105434805966624
299
+ name: Cosine Map@20
300
+ - type: cosine_map@50
301
+ value: 0.42917908716630376
302
+ name: Cosine Map@50
303
+ - type: cosine_map@100
304
+ value: 0.4322285035959748
305
+ name: Cosine Map@100
306
+ - type: cosine_map@150
307
+ value: 0.4473320611795549
308
+ name: Cosine Map@150
309
+ - type: cosine_map@200
310
+ value: 0.45413116686066823
311
+ name: Cosine Map@200
312
+ - type: cosine_map@500
313
+ value: 0.4666628908850396
314
+ name: Cosine Map@500
315
+ - task:
316
+ type: information-retrieval
317
+ name: Information Retrieval
318
+ dataset:
319
+ name: full de
320
+ type: full_de
321
+ metrics:
322
+ - type: cosine_accuracy@1
323
+ value: 0.2955665024630542
324
+ name: Cosine Accuracy@1
325
+ - type: cosine_accuracy@20
326
+ value: 0.9852216748768473
327
+ name: Cosine Accuracy@20
328
+ - type: cosine_accuracy@50
329
+ value: 0.9852216748768473
330
+ name: Cosine Accuracy@50
331
+ - type: cosine_accuracy@100
332
+ value: 0.9901477832512315
333
+ name: Cosine Accuracy@100
334
+ - type: cosine_accuracy@150
335
+ value: 0.9901477832512315
336
+ name: Cosine Accuracy@150
337
+ - type: cosine_accuracy@200
338
+ value: 0.9901477832512315
339
+ name: Cosine Accuracy@200
340
+ - type: cosine_precision@1
341
+ value: 0.2955665024630542
342
+ name: Cosine Precision@1
343
+ - type: cosine_precision@20
344
+ value: 0.5438423645320197
345
+ name: Cosine Precision@20
346
+ - type: cosine_precision@50
347
+ value: 0.3827586206896551
348
+ name: Cosine Precision@50
349
+ - type: cosine_precision@100
350
+ value: 0.2493103448275862
351
+ name: Cosine Precision@100
352
+ - type: cosine_precision@150
353
+ value: 0.1868965517241379
354
+ name: Cosine Precision@150
355
+ - type: cosine_precision@200
356
+ value: 0.150320197044335
357
+ name: Cosine Precision@200
358
+ - type: cosine_recall@1
359
+ value: 0.01108543831680986
360
+ name: Cosine Recall@1
361
+ - type: cosine_recall@20
362
+ value: 0.3450860009022403
363
+ name: Cosine Recall@20
364
+ - type: cosine_recall@50
365
+ value: 0.5334236440941986
366
+ name: Cosine Recall@50
367
+ - type: cosine_recall@100
368
+ value: 0.6498536020861698
369
+ name: Cosine Recall@100
370
+ - type: cosine_recall@150
371
+ value: 0.7091695139240046
372
+ name: Cosine Recall@150
373
+ - type: cosine_recall@200
374
+ value: 0.7496224791667186
375
+ name: Cosine Recall@200
376
+ - type: cosine_ndcg@1
377
+ value: 0.2955665024630542
378
+ name: Cosine Ndcg@1
379
+ - type: cosine_ndcg@20
380
+ value: 0.567054203369494
381
+ name: Cosine Ndcg@20
382
+ - type: cosine_ndcg@50
383
+ value: 0.5519557348354142
384
+ name: Cosine Ndcg@50
385
+ - type: cosine_ndcg@100
386
+ value: 0.5786968752325107
387
+ name: Cosine Ndcg@100
388
+ - type: cosine_ndcg@150
389
+ value: 0.6099446866772629
390
+ name: Cosine Ndcg@150
391
+ - type: cosine_ndcg@200
392
+ value: 0.6301254755200327
393
+ name: Cosine Ndcg@200
394
+ - type: cosine_mrr@1
395
+ value: 0.2955665024630542
396
+ name: Cosine Mrr@1
397
+ - type: cosine_mrr@20
398
+ value: 0.5163441238564384
399
+ name: Cosine Mrr@20
400
+ - type: cosine_mrr@50
401
+ value: 0.5163441238564384
402
+ name: Cosine Mrr@50
403
+ - type: cosine_mrr@100
404
+ value: 0.5164370692974646
405
+ name: Cosine Mrr@100
406
+ - type: cosine_mrr@150
407
+ value: 0.5164370692974646
408
+ name: Cosine Mrr@150
409
+ - type: cosine_mrr@200
410
+ value: 0.5164370692974646
411
+ name: Cosine Mrr@200
412
+ - type: cosine_map@1
413
+ value: 0.2955665024630542
414
+ name: Cosine Map@1
415
+ - type: cosine_map@20
416
+ value: 0.4243293426584066
417
+ name: Cosine Map@20
418
+ - type: cosine_map@50
419
+ value: 0.37874837593471367
420
+ name: Cosine Map@50
421
+ - type: cosine_map@100
422
+ value: 0.3817891460614099
423
+ name: Cosine Map@100
424
+ - type: cosine_map@150
425
+ value: 0.39643664920094024
426
+ name: Cosine Map@150
427
+ - type: cosine_map@200
428
+ value: 0.40443608704984707
429
+ name: Cosine Map@200
430
+ - type: cosine_map@500
431
+ value: 0.4176754500966089
432
+ name: Cosine Map@500
433
+ - task:
434
+ type: information-retrieval
435
+ name: Information Retrieval
436
+ dataset:
437
+ name: full zh
438
+ type: full_zh
439
+ metrics:
440
+ - type: cosine_accuracy@1
441
+ value: 0.6601941747572816
442
+ name: Cosine Accuracy@1
443
+ - type: cosine_accuracy@20
444
+ value: 0.9902912621359223
445
+ name: Cosine Accuracy@20
446
+ - type: cosine_accuracy@50
447
+ value: 0.9902912621359223
448
+ name: Cosine Accuracy@50
449
+ - type: cosine_accuracy@100
450
+ value: 0.9902912621359223
451
+ name: Cosine Accuracy@100
452
+ - type: cosine_accuracy@150
453
+ value: 0.9902912621359223
454
+ name: Cosine Accuracy@150
455
+ - type: cosine_accuracy@200
456
+ value: 0.9902912621359223
457
+ name: Cosine Accuracy@200
458
+ - type: cosine_precision@1
459
+ value: 0.6601941747572816
460
+ name: Cosine Precision@1
461
+ - type: cosine_precision@20
462
+ value: 0.47038834951456326
463
+ name: Cosine Precision@20
464
+ - type: cosine_precision@50
465
+ value: 0.27941747572815534
466
+ name: Cosine Precision@50
467
+ - type: cosine_precision@100
468
+ value: 0.17242718446601943
469
+ name: Cosine Precision@100
470
+ - type: cosine_precision@150
471
+ value: 0.1239482200647249
472
+ name: Cosine Precision@150
473
+ - type: cosine_precision@200
474
+ value: 0.09762135922330097
475
+ name: Cosine Precision@200
476
+ - type: cosine_recall@1
477
+ value: 0.06553457566936532
478
+ name: Cosine Recall@1
479
+ - type: cosine_recall@20
480
+ value: 0.5048889923213504
481
+ name: Cosine Recall@20
482
+ - type: cosine_recall@50
483
+ value: 0.6723480900580502
484
+ name: Cosine Recall@50
485
+ - type: cosine_recall@100
486
+ value: 0.7839824295594963
487
+ name: Cosine Recall@100
488
+ - type: cosine_recall@150
489
+ value: 0.8346078714936033
490
+ name: Cosine Recall@150
491
+ - type: cosine_recall@200
492
+ value: 0.868005364909913
493
+ name: Cosine Recall@200
494
+ - type: cosine_ndcg@1
495
+ value: 0.6601941747572816
496
+ name: Cosine Ndcg@1
497
+ - type: cosine_ndcg@20
498
+ value: 0.6489154249968472
499
+ name: Cosine Ndcg@20
500
+ - type: cosine_ndcg@50
501
+ value: 0.6582544798801073
502
+ name: Cosine Ndcg@50
503
+ - type: cosine_ndcg@100
504
+ value: 0.7132853867809429
505
+ name: Cosine Ndcg@100
506
+ - type: cosine_ndcg@150
507
+ value: 0.7351428110305336
508
+ name: Cosine Ndcg@150
509
+ - type: cosine_ndcg@200
510
+ value: 0.7489033638336042
511
+ name: Cosine Ndcg@200
512
+ - type: cosine_mrr@1
513
+ value: 0.6601941747572816
514
+ name: Cosine Mrr@1
515
+ - type: cosine_mrr@20
516
+ value: 0.8105987055016182
517
+ name: Cosine Mrr@20
518
+ - type: cosine_mrr@50
519
+ value: 0.8105987055016182
520
+ name: Cosine Mrr@50
521
+ - type: cosine_mrr@100
522
+ value: 0.8105987055016182
523
+ name: Cosine Mrr@100
524
+ - type: cosine_mrr@150
525
+ value: 0.8105987055016182
526
+ name: Cosine Mrr@150
527
+ - type: cosine_mrr@200
528
+ value: 0.8105987055016182
529
+ name: Cosine Mrr@200
530
+ - type: cosine_map@1
531
+ value: 0.6601941747572816
532
+ name: Cosine Map@1
533
+ - type: cosine_map@20
534
+ value: 0.5020661402654003
535
+ name: Cosine Map@20
536
+ - type: cosine_map@50
537
+ value: 0.4804116383814884
538
+ name: Cosine Map@50
539
+ - type: cosine_map@100
540
+ value: 0.5096988475054017
541
+ name: Cosine Map@100
542
+ - type: cosine_map@150
543
+ value: 0.5182607426785758
544
+ name: Cosine Map@150
545
+ - type: cosine_map@200
546
+ value: 0.5226490945380862
547
+ name: Cosine Map@200
548
+ - type: cosine_map@500
549
+ value: 0.5274856682898562
550
+ name: Cosine Map@500
551
+ - task:
552
+ type: information-retrieval
553
+ name: Information Retrieval
554
+ dataset:
555
+ name: mix es
556
+ type: mix_es
557
+ metrics:
558
+ - type: cosine_accuracy@1
559
+ value: 0.7358294331773271
560
+ name: Cosine Accuracy@1
561
+ - type: cosine_accuracy@20
562
+ value: 0.9615184607384295
563
+ name: Cosine Accuracy@20
564
+ - type: cosine_accuracy@50
565
+ value: 0.983359334373375
566
+ name: Cosine Accuracy@50
567
+ - type: cosine_accuracy@100
568
+ value: 0.9921996879875195
569
+ name: Cosine Accuracy@100
570
+ - type: cosine_accuracy@150
571
+ value: 0.9942797711908476
572
+ name: Cosine Accuracy@150
573
+ - type: cosine_accuracy@200
574
+ value: 0.9947997919916797
575
+ name: Cosine Accuracy@200
576
+ - type: cosine_precision@1
577
+ value: 0.7358294331773271
578
+ name: Cosine Precision@1
579
+ - type: cosine_precision@20
580
+ value: 0.12477899115964639
581
+ name: Cosine Precision@20
582
+ - type: cosine_precision@50
583
+ value: 0.05174206968278733
584
+ name: Cosine Precision@50
585
+ - type: cosine_precision@100
586
+ value: 0.02630265210608425
587
+ name: Cosine Precision@100
588
+ - type: cosine_precision@150
589
+ value: 0.017652972785578088
590
+ name: Cosine Precision@150
591
+ - type: cosine_precision@200
592
+ value: 0.013294331773270933
593
+ name: Cosine Precision@200
594
+ - type: cosine_recall@1
595
+ value: 0.28398831191342894
596
+ name: Cosine Recall@1
597
+ - type: cosine_recall@20
598
+ value: 0.9220748829953198
599
+ name: Cosine Recall@20
600
+ - type: cosine_recall@50
601
+ value: 0.9548448604610851
602
+ name: Cosine Recall@50
603
+ - type: cosine_recall@100
604
+ value: 0.9711388455538221
605
+ name: Cosine Recall@100
606
+ - type: cosine_recall@150
607
+ value: 0.9777604437510833
608
+ name: Cosine Recall@150
609
+ - type: cosine_recall@200
610
+ value: 0.9823019587450165
611
+ name: Cosine Recall@200
612
+ - type: cosine_ndcg@1
613
+ value: 0.7358294331773271
614
+ name: Cosine Ndcg@1
615
+ - type: cosine_ndcg@20
616
+ value: 0.8104398530748719
617
+ name: Cosine Ndcg@20
618
+ - type: cosine_ndcg@50
619
+ value: 0.8194810222604678
620
+ name: Cosine Ndcg@50
621
+ - type: cosine_ndcg@100
622
+ value: 0.8230427127064399
623
+ name: Cosine Ndcg@100
624
+ - type: cosine_ndcg@150
625
+ value: 0.8243283104602539
626
+ name: Cosine Ndcg@150
627
+ - type: cosine_ndcg@200
628
+ value: 0.8251186561711241
629
+ name: Cosine Ndcg@200
630
+ - type: cosine_mrr@1
631
+ value: 0.7358294331773271
632
+ name: Cosine Mrr@1
633
+ - type: cosine_mrr@20
634
+ value: 0.8034886386855536
635
+ name: Cosine Mrr@20
636
+ - type: cosine_mrr@50
637
+ value: 0.8042294348215404
638
+ name: Cosine Mrr@50
639
+ - type: cosine_mrr@100
640
+ value: 0.8043610639446989
641
+ name: Cosine Mrr@100
642
+ - type: cosine_mrr@150
643
+ value: 0.8043778926448901
644
+ name: Cosine Mrr@150
645
+ - type: cosine_mrr@200
646
+ value: 0.8043807816493392
647
+ name: Cosine Mrr@200
648
+ - type: cosine_map@1
649
+ value: 0.7358294331773271
650
+ name: Cosine Map@1
651
+ - type: cosine_map@20
652
+ value: 0.742446597316252
653
+ name: Cosine Map@20
654
+ - type: cosine_map@50
655
+ value: 0.7448760952950458
656
+ name: Cosine Map@50
657
+ - type: cosine_map@100
658
+ value: 0.7453727938942869
659
+ name: Cosine Map@100
660
+ - type: cosine_map@150
661
+ value: 0.7454980553388746
662
+ name: Cosine Map@150
663
+ - type: cosine_map@200
664
+ value: 0.7455568923614244
665
+ name: Cosine Map@200
666
+ - type: cosine_map@500
667
+ value: 0.7456455633479137
668
+ name: Cosine Map@500
669
+ - task:
670
+ type: information-retrieval
671
+ name: Information Retrieval
672
+ dataset:
673
+ name: mix de
674
+ type: mix_de
675
+ metrics:
676
+ - type: cosine_accuracy@1
677
+ value: 0.6942277691107644
678
+ name: Cosine Accuracy@1
679
+ - type: cosine_accuracy@20
680
+ value: 0.9667186687467498
681
+ name: Cosine Accuracy@20
682
+ - type: cosine_accuracy@50
683
+ value: 0.983359334373375
684
+ name: Cosine Accuracy@50
685
+ - type: cosine_accuracy@100
686
+ value: 0.9916796671866874
687
+ name: Cosine Accuracy@100
688
+ - type: cosine_accuracy@150
689
+ value: 0.9932397295891836
690
+ name: Cosine Accuracy@150
691
+ - type: cosine_accuracy@200
692
+ value: 0.9942797711908476
693
+ name: Cosine Accuracy@200
694
+ - type: cosine_precision@1
695
+ value: 0.6942277691107644
696
+ name: Cosine Precision@1
697
+ - type: cosine_precision@20
698
+ value: 0.12784711388455536
699
+ name: Cosine Precision@20
700
+ - type: cosine_precision@50
701
+ value: 0.05319812792511702
702
+ name: Cosine Precision@50
703
+ - type: cosine_precision@100
704
+ value: 0.0270306812272491
705
+ name: Cosine Precision@100
706
+ - type: cosine_precision@150
707
+ value: 0.018110591090310275
708
+ name: Cosine Precision@150
709
+ - type: cosine_precision@200
710
+ value: 0.013616744669786796
711
+ name: Cosine Precision@200
712
+ - type: cosine_recall@1
713
+ value: 0.26120644825793027
714
+ name: Cosine Recall@1
715
+ - type: cosine_recall@20
716
+ value: 0.927873114924597
717
+ name: Cosine Recall@20
718
+ - type: cosine_recall@50
719
+ value: 0.9637285491419657
720
+ name: Cosine Recall@50
721
+ - type: cosine_recall@100
722
+ value: 0.9789391575663027
723
+ name: Cosine Recall@100
724
+ - type: cosine_recall@150
725
+ value: 0.9837926850407349
726
+ name: Cosine Recall@150
727
+ - type: cosine_recall@200
728
+ value: 0.9862194487779511
729
+ name: Cosine Recall@200
730
+ - type: cosine_ndcg@1
731
+ value: 0.6942277691107644
732
+ name: Cosine Ndcg@1
733
+ - type: cosine_ndcg@20
734
+ value: 0.7952836406043297
735
+ name: Cosine Ndcg@20
736
+ - type: cosine_ndcg@50
737
+ value: 0.8052399503452229
738
+ name: Cosine Ndcg@50
739
+ - type: cosine_ndcg@100
740
+ value: 0.8086752401344494
741
+ name: Cosine Ndcg@100
742
+ - type: cosine_ndcg@150
743
+ value: 0.8096382458419952
744
+ name: Cosine Ndcg@150
745
+ - type: cosine_ndcg@200
746
+ value: 0.810085192105751
747
+ name: Cosine Ndcg@200
748
+ - type: cosine_mrr@1
749
+ value: 0.6942277691107644
750
+ name: Cosine Mrr@1
751
+ - type: cosine_mrr@20
752
+ value: 0.7761581892584265
753
+ name: Cosine Mrr@20
754
+ - type: cosine_mrr@50
755
+ value: 0.7766868481375114
756
+ name: Cosine Mrr@50
757
+ - type: cosine_mrr@100
758
+ value: 0.7768104145556238
759
+ name: Cosine Mrr@100
760
+ - type: cosine_mrr@150
761
+ value: 0.7768244234791826
762
+ name: Cosine Mrr@150
763
+ - type: cosine_mrr@200
764
+ value: 0.7768305684544853
765
+ name: Cosine Mrr@200
766
+ - type: cosine_map@1
767
+ value: 0.6942277691107644
768
+ name: Cosine Map@1
769
+ - type: cosine_map@20
770
+ value: 0.7188197545745756
771
+ name: Cosine Map@20
772
+ - type: cosine_map@50
773
+ value: 0.7215707141808124
774
+ name: Cosine Map@50
775
+ - type: cosine_map@100
776
+ value: 0.7220898692554206
777
+ name: Cosine Map@100
778
+ - type: cosine_map@150
779
+ value: 0.7221900369972237
780
+ name: Cosine Map@150
781
+ - type: cosine_map@200
782
+ value: 0.7222223600003219
783
+ name: Cosine Map@200
784
+ - type: cosine_map@500
785
+ value: 0.7222810622423789
786
+ name: Cosine Map@500
787
+ - task:
788
+ type: information-retrieval
789
+ name: Information Retrieval
790
+ dataset:
791
+ name: mix zh
792
+ type: mix_zh
793
+ metrics:
794
+ - type: cosine_accuracy@1
795
+ value: 0.18200728029121166
796
+ name: Cosine Accuracy@1
797
+ - type: cosine_accuracy@20
798
+ value: 1.0
799
+ name: Cosine Accuracy@20
800
+ - type: cosine_accuracy@50
801
+ value: 1.0
802
+ name: Cosine Accuracy@50
803
+ - type: cosine_accuracy@100
804
+ value: 1.0
805
+ name: Cosine Accuracy@100
806
+ - type: cosine_accuracy@150
807
+ value: 1.0
808
+ name: Cosine Accuracy@150
809
+ - type: cosine_accuracy@200
810
+ value: 1.0
811
+ name: Cosine Accuracy@200
812
+ - type: cosine_precision@1
813
+ value: 0.18200728029121166
814
+ name: Cosine Precision@1
815
+ - type: cosine_precision@20
816
+ value: 0.15439417576703063
817
+ name: Cosine Precision@20
818
+ - type: cosine_precision@50
819
+ value: 0.0617576703068123
820
+ name: Cosine Precision@50
821
+ - type: cosine_precision@100
822
+ value: 0.03087883515340615
823
+ name: Cosine Precision@100
824
+ - type: cosine_precision@150
825
+ value: 0.020585890102270757
826
+ name: Cosine Precision@150
827
+ - type: cosine_precision@200
828
+ value: 0.015439417576703075
829
+ name: Cosine Precision@200
830
+ - type: cosine_recall@1
831
+ value: 0.05850234009360374
832
+ name: Cosine Recall@1
833
+ - type: cosine_recall@20
834
+ value: 1.0
835
+ name: Cosine Recall@20
836
+ - type: cosine_recall@50
837
+ value: 1.0
838
+ name: Cosine Recall@50
839
+ - type: cosine_recall@100
840
+ value: 1.0
841
+ name: Cosine Recall@100
842
+ - type: cosine_recall@150
843
+ value: 1.0
844
+ name: Cosine Recall@150
845
+ - type: cosine_recall@200
846
+ value: 1.0
847
+ name: Cosine Recall@200
848
+ - type: cosine_ndcg@1
849
+ value: 0.18200728029121166
850
+ name: Cosine Ndcg@1
851
+ - type: cosine_ndcg@20
852
+ value: 0.5450053067257837
853
+ name: Cosine Ndcg@20
854
+ - type: cosine_ndcg@50
855
+ value: 0.5450053067257837
856
+ name: Cosine Ndcg@50
857
+ - type: cosine_ndcg@100
858
+ value: 0.5450053067257837
859
+ name: Cosine Ndcg@100
860
+ - type: cosine_ndcg@150
861
+ value: 0.5450053067257837
862
+ name: Cosine Ndcg@150
863
+ - type: cosine_ndcg@200
864
+ value: 0.5450053067257837
865
+ name: Cosine Ndcg@200
866
+ - type: cosine_mrr@1
867
+ value: 0.18200728029121166
868
+ name: Cosine Mrr@1
869
+ - type: cosine_mrr@20
870
+ value: 0.40246777114951904
871
+ name: Cosine Mrr@20
872
+ - type: cosine_mrr@50
873
+ value: 0.40246777114951904
874
+ name: Cosine Mrr@50
875
+ - type: cosine_mrr@100
876
+ value: 0.40246777114951904
877
+ name: Cosine Mrr@100
878
+ - type: cosine_mrr@150
879
+ value: 0.40246777114951904
880
+ name: Cosine Mrr@150
881
+ - type: cosine_mrr@200
882
+ value: 0.40246777114951904
883
+ name: Cosine Mrr@200
884
+ - type: cosine_map@1
885
+ value: 0.18200728029121166
886
+ name: Cosine Map@1
887
+ - type: cosine_map@20
888
+ value: 0.3277096647667185
889
+ name: Cosine Map@20
890
+ - type: cosine_map@50
891
+ value: 0.3277096647667185
892
+ name: Cosine Map@50
893
+ - type: cosine_map@100
894
+ value: 0.3277096647667185
895
+ name: Cosine Map@100
896
+ - type: cosine_map@150
897
+ value: 0.3277096647667185
898
+ name: Cosine Map@150
899
+ - type: cosine_map@200
900
+ value: 0.3277096647667185
901
+ name: Cosine Map@200
902
+ - type: cosine_map@500
903
+ value: 0.3277096647667185
904
+ name: Cosine Map@500
905
+ ---
906

# SentenceTransformer based on BAAI/bge-m3

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) on the full_en, full_de, full_es, full_zh and mix datasets. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) <!-- at revision 5617a9f61b028005a4858fdac845db406aefb181 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Datasets:**
    - full_en
    - full_de
    - full_es
    - full_zh
    - mix
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    'Volksvertreter',
    'Parlamentarier',
    'Oberbürgermeister',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
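
Because the architecture ends in a `Normalize()` module, the embeddings are unit length and the cosine scores returned by `model.similarity` are plain dot products. A small self-contained sketch of that computation, using made-up 4-dimensional vectors in place of real 1024-dimensional embeddings:

```python
import math

def normalize(v):
    # Scale a vector to unit length, as the model's Normalize() module does.
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def cosine(a, b):
    # For unit vectors, cosine similarity reduces to a dot product.
    return sum(x * y for x, y in zip(a, b))

# Made-up stand-ins for the embeddings of the three sentences above.
emb = [normalize(v) for v in ([1.0, 2.0, 0.5, 0.1],
                              [1.1, 1.9, 0.4, 0.2],
                              [0.2, 0.1, 1.5, 1.0])]

# The matrix model.similarity(...) returns holds all pairwise cosines.
sims = [[cosine(a, b) for b in emb] for a in emb]
print([f"{s:.3f}" for s in sims[0]])
```

The diagonal is 1.0 (each vector against itself) and the matrix is symmetric, which is why the snippet above prints a square `[3, 3]` score matrix.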
975

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Information Retrieval

* Datasets: `full_en`, `full_es`, `full_de`, `full_zh`, `mix_es`, `mix_de` and `mix_zh`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric               | full_en    | full_es    | full_de    | full_zh    | mix_es     | mix_de     | mix_zh    |
|:---------------------|:-----------|:-----------|:-----------|:-----------|:-----------|:-----------|:----------|
| cosine_accuracy@1    | 0.6571     | 0.1081     | 0.2956     | 0.6602     | 0.7358     | 0.6942     | 0.182     |
| cosine_accuracy@20   | 0.9905     | 1.0        | 0.9852     | 0.9903     | 0.9615     | 0.9667     | 1.0       |
| cosine_accuracy@50   | 0.9905     | 1.0        | 0.9852     | 0.9903     | 0.9834     | 0.9834     | 1.0       |
| cosine_accuracy@100  | 0.9905     | 1.0        | 0.9901     | 0.9903     | 0.9922     | 0.9917     | 1.0       |
| cosine_accuracy@150  | 0.9905     | 1.0        | 0.9901     | 0.9903     | 0.9943     | 0.9932     | 1.0       |
| cosine_accuracy@200  | 0.9905     | 1.0        | 0.9901     | 0.9903     | 0.9948     | 0.9943     | 1.0       |
| cosine_precision@1   | 0.6571     | 0.1081     | 0.2956     | 0.6602     | 0.7358     | 0.6942     | 0.182     |
| cosine_precision@20  | 0.5019     | 0.5668     | 0.5438     | 0.4704     | 0.1248     | 0.1278     | 0.1544    |
| cosine_precision@50  | 0.3051     | 0.3878     | 0.3828     | 0.2794     | 0.0517     | 0.0532     | 0.0618    |
| cosine_precision@100 | 0.1848     | 0.2516     | 0.2493     | 0.1724     | 0.0263     | 0.027      | 0.0309    |
| cosine_precision@150 | 0.1324     | 0.1895     | 0.1869     | 0.1239     | 0.0177     | 0.0181     | 0.0206    |
| cosine_precision@200 | 0.1022     | 0.1507     | 0.1503     | 0.0976     | 0.0133     | 0.0136     | 0.0154    |
| cosine_recall@1      | 0.0675     | 0.0034     | 0.0111     | 0.0655     | 0.284      | 0.2612     | 0.0585    |
| cosine_recall@20     | 0.5348     | 0.379      | 0.3451     | 0.5049     | 0.9221     | 0.9279     | 1.0       |
| cosine_recall@50     | 0.7176     | 0.5587     | 0.5334     | 0.6723     | 0.9548     | 0.9637     | 1.0       |
| cosine_recall@100    | 0.8204     | 0.6707     | 0.6499     | 0.784      | 0.9711     | 0.9789     | 1.0       |
| cosine_recall@150    | 0.8712     | 0.7336     | 0.7092     | 0.8346     | 0.9778     | 0.9838     | 1.0       |
| cosine_recall@200    | 0.8993     | 0.7663     | 0.7496     | 0.868      | 0.9823     | 0.9862     | 1.0       |
| cosine_ndcg@1        | 0.6571     | 0.1081     | 0.2956     | 0.6602     | 0.7358     | 0.6942     | 0.182     |
| cosine_ndcg@20       | 0.6792     | 0.613      | 0.5671     | 0.6489     | 0.8104     | 0.7953     | 0.545     |
| cosine_ndcg@50       | 0.6958     | 0.5878     | 0.552      | 0.6583     | 0.8195     | 0.8052     | 0.545     |
| cosine_ndcg@100      | 0.7494     | 0.6149     | 0.5787     | 0.7133     | 0.823      | 0.8087     | 0.545     |
| cosine_ndcg@150      | 0.7716     | 0.6471     | 0.6099     | 0.7351     | 0.8243     | 0.8096     | 0.545     |
| **cosine_ndcg@200**  | **0.7815** | **0.6634** | **0.6301** | **0.7489** | **0.8251** | **0.8101** | **0.545** |
| cosine_mrr@1         | 0.6571     | 0.1081     | 0.2956     | 0.6602     | 0.7358     | 0.6942     | 0.182     |
| cosine_mrr@20        | 0.8027     | 0.5509     | 0.5163     | 0.8106     | 0.8035     | 0.7762     | 0.4025    |
| cosine_mrr@50        | 0.8027     | 0.5509     | 0.5163     | 0.8106     | 0.8042     | 0.7767     | 0.4025    |
| cosine_mrr@100       | 0.8027     | 0.5509     | 0.5164     | 0.8106     | 0.8044     | 0.7768     | 0.4025    |
| cosine_mrr@150       | 0.8027     | 0.5509     | 0.5164     | 0.8106     | 0.8044     | 0.7768     | 0.4025    |
| cosine_mrr@200       | 0.8027     | 0.5509     | 0.5164     | 0.8106     | 0.8044     | 0.7768     | 0.4025    |
| cosine_map@1         | 0.6571     | 0.1081     | 0.2956     | 0.6602     | 0.7358     | 0.6942     | 0.182     |
| cosine_map@20        | 0.5371     | 0.4811     | 0.4243     | 0.5021     | 0.7424     | 0.7188     | 0.3277    |
| cosine_map@50        | 0.5243     | 0.4292     | 0.3787     | 0.4804     | 0.7449     | 0.7216     | 0.3277    |
| cosine_map@100       | 0.5561     | 0.4322     | 0.3818     | 0.5097     | 0.7454     | 0.7221     | 0.3277    |
| cosine_map@150       | 0.5653     | 0.4473     | 0.3964     | 0.5183     | 0.7455     | 0.7222     | 0.3277    |
| cosine_map@200       | 0.5681     | 0.4541     | 0.4044     | 0.5226     | 0.7456     | 0.7222     | 0.3277    |
| cosine_map@500       | 0.5731     | 0.4667     | 0.4177     | 0.5275     | 0.7456     | 0.7223     | 0.3277    |

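As a reminder of what the metric names in the table mean: Recall@k is the fraction of a query's relevant documents retrieved within the top k results, and MRR@k is the reciprocal rank of the first relevant hit. A minimal self-contained sketch with a toy ranking (illustrative only, not the actual evaluation data):

```python
def recall_at_k(ranked, relevant, k):
    # Fraction of the relevant set found within the top-k results.
    return len(set(ranked[:k]) & relevant) / len(relevant)

def mrr_at_k(ranked, relevant, k):
    # Reciprocal rank of the first relevant document in the top k, else 0.
    for rank, doc in enumerate(ranked[:k], start=1):
        if doc in relevant:
            return 1.0 / rank
    return 0.0

# Toy ranking for one query: document ids ordered by descending cosine score.
ranked = ["d3", "d7", "d1", "d9", "d4"]
relevant = {"d1", "d4", "d8"}

print(recall_at_k(ranked, relevant, 5))  # 2 of 3 relevant docs in the top 5
print(mrr_at_k(ranked, relevant, 5))     # first relevant hit at rank 3
```

The evaluator reports these averaged over all queries in each dataset, which is how the single numbers per cell above are produced.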
1049
<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Datasets
<details><summary>full_en</summary>

#### full_en

* Dataset: full_en
* Size: 28,880 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor                                                                           | positive                                                                         |
  |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                           |
  | details | <ul><li>min: 3 tokens</li><li>mean: 5.68 tokens</li><li>max: 11 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 5.76 tokens</li><li>max: 12 tokens</li></ul> |
* Samples:
  | anchor                                   | positive                                 |
  |:-----------------------------------------|:-----------------------------------------|
  | <code>air commodore</code>               | <code>flight lieutenant</code>           |
  | <code>command and control officer</code> | <code>flight officer</code>              |
  | <code>air commodore</code>               | <code>command and control officer</code> |
* Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
    (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
  ```
</details>
1091
+ <details><summary>full_de</summary>
1092
+
1093
+ #### full_de
1094
+
1095
+ * Dataset: full_de
1096
+ * Size: 23,023 training samples
1097
+ * Columns: <code>anchor</code> and <code>positive</code>
1098
+ * Approximate statistics based on the first 1000 samples:
1099
+ | | anchor | positive |
1100
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1101
+ | type | string | string |
1102
+ | details | <ul><li>min: 3 tokens</li><li>mean: 7.99 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.19 tokens</li><li>max: 30 tokens</li></ul> |
1103
+ * Samples:
1104
+ | anchor | positive |
1105
+ |:----------------------------------|:-----------------------------------------------------|
1106
+ | <code>Staffelkommandantin</code> | <code>Kommodore</code> |
1107
+ | <code>Luftwaffenoffizierin</code> | <code>Luftwaffenoffizier/Luftwaffenoffizierin</code> |
1108
+ | <code>Staffelkommandantin</code> | <code>Luftwaffenoffizierin</code> |
1109
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1110
+ ```json
1111
+ {'guide': SentenceTransformer(
1112
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1113
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1114
+ (2): Normalize()
1115
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1116
+ ```
1117
+ </details>
1118
+ <details><summary>full_es</summary>
1119
+
1120
+ #### full_es
1121
+
1122
+ * Dataset: full_es
1123
+ * Size: 20,724 training samples
1124
+ * Columns: <code>anchor</code> and <code>positive</code>
1125
+ * Approximate statistics based on the first 1000 samples:
1126
+ | | anchor | positive |
1127
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1128
+ | type | string | string |
1129
+ | details | <ul><li>min: 3 tokens</li><li>mean: 9.13 tokens</li><li>max: 32 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.84 tokens</li><li>max: 32 tokens</li></ul> |
1130
+ * Samples:
1131
+ | anchor | positive |
1132
+ |:------------------------------------|:-------------------------------------------|
1133
+ | <code>jefe de escuadrón</code> | <code>instructor</code> |
1134
+ | <code>comandante de aeronave</code> | <code>instructor de simulador</code> |
1135
+ | <code>instructor</code> | <code>oficial del Ejército del Aire</code> |
1136
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1137
+ ```json
1138
+ {'guide': SentenceTransformer(
1139
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1140
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1141
+ (2): Normalize()
1142
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1143
+ ```
1144
+ </details>
1145
+ <details><summary>full_zh</summary>
1146
+
1147
+ #### full_zh
1148
+
1149
+ * Dataset: full_zh
1150
+ * Size: 30,401 training samples
1151
+ * Columns: <code>anchor</code> and <code>positive</code>
1152
+ * Approximate statistics based on the first 1000 samples:
1153
+ | | anchor | positive |
1154
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1155
+ | type | string | string |
1156
+ | details | <ul><li>min: 5 tokens</li><li>mean: 7.15 tokens</li><li>max: 14 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 7.46 tokens</li><li>max: 21 tokens</li></ul> |
1157
+ * Samples:
1158
+ | anchor | positive |
1159
+ |:------------------|:---------------------|
1160
+ | <code>技术总监</code> | <code>技术和运营总监</code> |
1161
+ | <code>技术总监</code> | <code>技术主管</code> |
1162
+ | <code>技术总监</code> | <code>技术艺术总监</code> |
1163
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1164
+ ```json
1165
+ {'guide': SentenceTransformer(
1166
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1167
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1168
+ (2): Normalize()
1169
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1170
+ ```
1171
+ </details>
1172
+ <details><summary>mix</summary>
1173
+
1174
+ #### mix
1175
+
1176
+ * Dataset: mix
1177
+ * Size: 21,760 training samples
1178
+ * Columns: <code>anchor</code> and <code>positive</code>
1179
+ * Approximate statistics based on the first 1000 samples:
1180
+ | | anchor | positive |
1181
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1182
+ | type | string | string |
1183
+ | details | <ul><li>min: 2 tokens</li><li>mean: 6.71 tokens</li><li>max: 19 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 7.69 tokens</li><li>max: 19 tokens</li></ul> |
1184
+ * Samples:
1185
+ | anchor | positive |
1186
+ |:------------------------------------------|:----------------------------------------------------------------|
1187
+ | <code>technical manager</code> | <code>Technischer Direktor für Bühne, Film und Fernsehen</code> |
1188
+ | <code>head of technical</code> | <code>directora técnica</code> |
1189
+ | <code>head of technical department</code> | <code>技术艺术总监</code> |
1190
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1191
+ ```json
1192
+ {'guide': SentenceTransformer(
1193
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1194
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1195
+ (2): Normalize()
1196
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1197
+ ```
1198
+ </details>
1199
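Each dataset above is trained with GISTEmbedLoss, an in-batch contrastive loss where the listed guide model filters likely false negatives: any in-batch candidate the guide scores at least as similar to the anchor as the true positive (minus the margin) is masked out before the InfoNCE loss is computed at `temperature=0.01`. A toy sketch of just that masking rule under the `absolute` margin strategy with `margin=0.0`, using made-up guide similarity scores rather than real embeddings:

```python
def gist_negative_mask(guide_anchor_sims, guide_positive_sim, margin=0.0):
    """Return True for candidates kept as negatives, False for masked ones.

    A candidate is masked (treated as a likely false negative) when the
    guide model scores it >= guide(anchor, positive) - margin.
    """
    threshold = guide_positive_sim - margin
    return [sim < threshold for sim in guide_anchor_sims]

# Toy guide similarities of 4 in-batch candidates to one anchor;
# the guide scores the true positive at 0.8.
sims = [0.3, 0.85, 0.6, 0.79]
mask = gist_negative_mask(sims, guide_positive_sim=0.8, margin=0.0)
print(mask)  # [True, False, True, True] -> candidate at 0.85 is masked
```

The actual loss then applies a temperature-scaled softmax over the kept candidates; this sketch only illustrates the selection step.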
+
1200
+ ### Training Hyperparameters
1201
+ #### Non-Default Hyperparameters
1202
+
1203
+ - `eval_strategy`: steps
1204
+ - `per_device_train_batch_size`: 64
1205
+ - `per_device_eval_batch_size`: 128
1206
+ - `gradient_accumulation_steps`: 2
1207
+ - `num_train_epochs`: 5
1208
+ - `warmup_ratio`: 0.05
1209
+ - `log_on_each_node`: False
1210
+ - `fp16`: True
1211
+ - `dataloader_num_workers`: 4
1212
+ - `ddp_find_unused_parameters`: True
1213
+ - `batch_sampler`: no_duplicates
1214
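With `per_device_train_batch_size=64` and `gradient_accumulation_steps=2`, each optimizer step consumes 128 samples per device, and `warmup_ratio=0.05` is resolved into a warmup step count at runtime. A back-of-the-envelope check (assuming a single device, which is not stated in the card, and taking the dataset size from the `dataset_size:124788` tag):

```python
per_device_batch = 64
grad_accum = 2
num_devices = 1          # assumption; not stated in the card
dataset_size = 124788    # from the card's dataset_size tag
epochs = 5
warmup_ratio = 0.05

effective_batch = per_device_batch * grad_accum * num_devices
steps_per_epoch = dataset_size // effective_batch  # dataloader_drop_last=True
total_steps = steps_per_epoch * epochs
warmup_steps = int(total_steps * warmup_ratio)

print(effective_batch, steps_per_epoch, total_steps, warmup_steps)
# 128 samples/step and ~974 steps/epoch are consistent with the
# training logs below (epoch 1.0267 at step 1000).
```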
+
1215
+ #### All Hyperparameters
1216
+ <details><summary>Click to expand</summary>
1217
+
1218
+ - `overwrite_output_dir`: False
1219
+ - `do_predict`: False
1220
+ - `eval_strategy`: steps
1221
+ - `prediction_loss_only`: True
1222
+ - `per_device_train_batch_size`: 64
1223
+ - `per_device_eval_batch_size`: 128
1224
+ - `per_gpu_train_batch_size`: None
1225
+ - `per_gpu_eval_batch_size`: None
1226
+ - `gradient_accumulation_steps`: 2
1227
+ - `eval_accumulation_steps`: None
1228
+ - `torch_empty_cache_steps`: None
1229
+ - `learning_rate`: 5e-05
1230
+ - `weight_decay`: 0.0
1231
+ - `adam_beta1`: 0.9
1232
+ - `adam_beta2`: 0.999
1233
+ - `adam_epsilon`: 1e-08
1234
+ - `max_grad_norm`: 1.0
1235
+ - `num_train_epochs`: 5
1236
+ - `max_steps`: -1
1237
+ - `lr_scheduler_type`: linear
1238
+ - `lr_scheduler_kwargs`: {}
1239
+ - `warmup_ratio`: 0.05
1240
+ - `warmup_steps`: 0
1241
+ - `log_level`: passive
1242
+ - `log_level_replica`: warning
1243
+ - `log_on_each_node`: False
1244
+ - `logging_nan_inf_filter`: True
1245
+ - `save_safetensors`: True
1246
+ - `save_on_each_node`: False
1247
+ - `save_only_model`: False
1248
+ - `restore_callback_states_from_checkpoint`: False
1249
+ - `no_cuda`: False
1250
+ - `use_cpu`: False
1251
+ - `use_mps_device`: False
1252
+ - `seed`: 42
1253
+ - `data_seed`: None
1254
+ - `jit_mode_eval`: False
1255
+ - `use_ipex`: False
1256
+ - `bf16`: False
1257
+ - `fp16`: True
1258
+ - `fp16_opt_level`: O1
1259
+ - `half_precision_backend`: auto
1260
+ - `bf16_full_eval`: False
1261
+ - `fp16_full_eval`: False
1262
+ - `tf32`: None
1263
+ - `local_rank`: 0
1264
+ - `ddp_backend`: None
1265
+ - `tpu_num_cores`: None
1266
+ - `tpu_metrics_debug`: False
1267
+ - `debug`: []
1268
+ - `dataloader_drop_last`: True
1269
+ - `dataloader_num_workers`: 4
1270
+ - `dataloader_prefetch_factor`: None
1271
+ - `past_index`: -1
1272
+ - `disable_tqdm`: False
1273
+ - `remove_unused_columns`: True
1274
+ - `label_names`: None
1275
+ - `load_best_model_at_end`: False
1276
+ - `ignore_data_skip`: False
1277
+ - `fsdp`: []
1278
+ - `fsdp_min_num_params`: 0
1279
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
1280
+ - `tp_size`: 0
1281
+ - `fsdp_transformer_layer_cls_to_wrap`: None
1282
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
1283
+ - `deepspeed`: None
1284
+ - `label_smoothing_factor`: 0.0
1285
+ - `optim`: adamw_torch
1286
+ - `optim_args`: None
1287
+ - `adafactor`: False
1288
+ - `group_by_length`: False
1289
+ - `length_column_name`: length
1290
+ - `ddp_find_unused_parameters`: True
1291
+ - `ddp_bucket_cap_mb`: None
1292
+ - `ddp_broadcast_buffers`: False
1293
+ - `dataloader_pin_memory`: True
1294
+ - `dataloader_persistent_workers`: False
1295
+ - `skip_memory_metrics`: True
1296
+ - `use_legacy_prediction_loop`: False
1297
+ - `push_to_hub`: False
1298
+ - `resume_from_checkpoint`: None
1299
+ - `hub_model_id`: None
1300
+ - `hub_strategy`: every_save
1301
+ - `hub_private_repo`: None
1302
+ - `hub_always_push`: False
1303
+ - `gradient_checkpointing`: False
1304
+ - `gradient_checkpointing_kwargs`: None
1305
+ - `include_inputs_for_metrics`: False
1306
+ - `include_for_metrics`: []
1307
+ - `eval_do_concat_batches`: True
1308
+ - `fp16_backend`: auto
1309
+ - `push_to_hub_model_id`: None
1310
+ - `push_to_hub_organization`: None
1311
+ - `mp_parameters`:
1312
+ - `auto_find_batch_size`: False
1313
+ - `full_determinism`: False
1314
+ - `torchdynamo`: None
1315
+ - `ray_scope`: last
1316
+ - `ddp_timeout`: 1800
1317
+ - `torch_compile`: False
1318
+ - `torch_compile_backend`: None
1319
+ - `torch_compile_mode`: None
1320
+ - `include_tokens_per_second`: False
1321
+ - `include_num_input_tokens_seen`: False
1322
+ - `neftune_noise_alpha`: None
1323
+ - `optim_target_modules`: None
1324
+ - `batch_eval_metrics`: False
1325
+ - `eval_on_start`: False
1326
+ - `use_liger_kernel`: False
1327
+ - `eval_use_gather_object`: False
1328
+ - `average_tokens_across_devices`: False
1329
+ - `prompts`: None
1330
+ - `batch_sampler`: no_duplicates
1331
+ - `multi_dataset_batch_sampler`: proportional
1332
+
1333
+ </details>
1334
+
1335
+ ### Training Logs
1336
+ | Epoch | Step | Training Loss | full_en_cosine_ndcg@200 | full_es_cosine_ndcg@200 | full_de_cosine_ndcg@200 | full_zh_cosine_ndcg@200 | mix_es_cosine_ndcg@200 | mix_de_cosine_ndcg@200 | mix_zh_cosine_ndcg@200 |
1337
+ |:------:|:----:|:-------------:|:-----------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:----------------------:|:----------------------:|:----------------------:|
1338
+ | -1 | -1 | - | 0.6856 | 0.5207 | 0.4655 | 0.6713 | 0.6224 | 0.5604 | 0.5548 |
1339
+ | 0.0010 | 1 | 5.3354 | - | - | - | - | - | - | - |
1340
+ | 0.1027 | 100 | 2.665 | - | - | - | - | - | - | - |
1341
+ | 0.2053 | 200 | 1.3375 | 0.7691 | 0.6530 | 0.6298 | 0.7517 | 0.7513 | 0.7393 | 0.5490 |
1342
+ | 0.3080 | 300 | 1.1101 | - | - | - | - | - | - | - |
1343
+ | 0.4107 | 400 | 0.9453 | 0.7802 | 0.6643 | 0.6246 | 0.7531 | 0.7610 | 0.7441 | 0.5493 |
1344
+ | 0.5133 | 500 | 0.9202 | - | - | - | - | - | - | - |
1345
+ | 0.6160 | 600 | 0.7887 | 0.7741 | 0.6549 | 0.6171 | 0.7542 | 0.7672 | 0.7540 | 0.5482 |
1346
+ | 0.7187 | 700 | 0.7604 | - | - | - | - | - | - | - |
1347
+ | 0.8214 | 800 | 0.7219 | 0.7846 | 0.6674 | 0.6244 | 0.7648 | 0.7741 | 0.7592 | 0.5497 |
1348
+ | 0.9240 | 900 | 0.6965 | - | - | - | - | - | - | - |
1349
+ | 1.0267 | 1000 | 0.6253 | 0.7646 | 0.6391 | 0.6122 | 0.7503 | 0.7825 | 0.7704 | 0.5463 |
1350
+ | 1.1294 | 1100 | 0.4737 | - | - | - | - | - | - | - |
1351
+ | 1.2320 | 1200 | 0.5055 | 0.7758 | 0.6582 | 0.6178 | 0.7514 | 0.7857 | 0.7764 | 0.5501 |
1352
+ | 1.3347 | 1300 | 0.5042 | - | - | - | - | - | - | - |
1353
+ | 1.4374 | 1400 | 0.5073 | 0.7613 | 0.6578 | 0.6178 | 0.7505 | 0.7829 | 0.7762 | 0.5452 |
1354
+ | 1.5400 | 1500 | 0.4975 | - | - | - | - | - | - | - |
1355
+ | 1.6427 | 1600 | 0.5242 | 0.7736 | 0.6673 | 0.6279 | 0.7555 | 0.7940 | 0.7859 | 0.5477 |
1356
+ | 1.7454 | 1700 | 0.4713 | - | - | - | - | - | - | - |
1357
+ | 1.8480 | 1800 | 0.4814 | 0.7845 | 0.6733 | 0.6285 | 0.7642 | 0.7992 | 0.7904 | 0.5449 |
1358
+ | 1.9507 | 1900 | 0.4526 | - | - | - | - | - | - | - |
1359
+ | 2.0544 | 2000 | 0.36 | 0.7790 | 0.6639 | 0.6252 | 0.7500 | 0.8032 | 0.7888 | 0.5499 |
1360
+ | 2.1571 | 2100 | 0.3744 | - | - | - | - | - | - | - |
1361
+ | 2.2598 | 2200 | 0.3031 | 0.7787 | 0.6614 | 0.6190 | 0.7537 | 0.7993 | 0.7811 | 0.5476 |
1362
+ | 2.3624 | 2300 | 0.3638 | - | - | - | - | - | - | - |
1363
+ | 2.4651 | 2400 | 0.358 | 0.7798 | 0.6615 | 0.6258 | 0.7497 | 0.8018 | 0.7828 | 0.5481 |
1364
+ | 2.5678 | 2500 | 0.3247 | - | - | - | - | - | - | - |
1365
+ | 2.6704 | 2600 | 0.3247 | 0.7854 | 0.6663 | 0.6248 | 0.7560 | 0.8081 | 0.7835 | 0.5452 |
1366
+ | 2.7731 | 2700 | 0.3263 | - | - | - | - | - | - | - |
1367
+ | 2.8758 | 2800 | 0.3212 | 0.7761 | 0.6681 | 0.6250 | 0.7517 | 0.8121 | 0.7927 | 0.5458 |
1368
+ | 2.9784 | 2900 | 0.3291 | - | - | - | - | - | - | - |
1369
+ | 3.0821 | 3000 | 0.2816 | 0.7727 | 0.6604 | 0.6163 | 0.7370 | 0.8163 | 0.7985 | 0.5473 |
1370
+ | 3.1848 | 3100 | 0.2698 | - | - | - | - | - | - | - |
1371
+ | 3.2875 | 3200 | 0.2657 | 0.7757 | 0.6615 | 0.6247 | 0.7417 | 0.8117 | 0.8004 | 0.5436 |
1372
+ | 3.3901 | 3300 | 0.2724 | - | - | - | - | - | - | - |
1373
+ | 3.4928 | 3400 | 0.2584 | 0.7850 | 0.6583 | 0.6320 | 0.7458 | 0.8120 | 0.7980 | 0.5454 |
1374
+ | 3.5955 | 3500 | 0.2573 | - | - | - | - | - | - | - |
1375
+ | 3.6982 | 3600 | 0.2744 | 0.7796 | 0.6552 | 0.6237 | 0.7409 | 0.8193 | 0.8018 | 0.5466 |
1376
+ | 3.8008 | 3700 | 0.3054 | - | - | - | - | - | - | - |
1377
+ | 3.9035 | 3800 | 0.2727 | 0.7825 | 0.6642 | 0.6293 | 0.7504 | 0.8213 | 0.8058 | 0.5463 |
1378
+ | 4.0062 | 3900 | 0.2353 | - | - | - | - | - | - | - |
1379
+ | 4.1088 | 4000 | 0.2353 | 0.7747 | 0.6628 | 0.6263 | 0.7384 | 0.8239 | 0.8065 | 0.5447 |
1380
+ | 4.2115 | 4100 | 0.2385 | - | - | - | - | - | - | - |
1381
+ | 4.3142 | 4200 | 0.231 | 0.7811 | 0.6608 | 0.6254 | 0.7463 | 0.8226 | 0.8051 | 0.5442 |
1382
+ | 4.4168 | 4300 | 0.2115 | - | - | - | - | - | - | - |
1383
+ | 4.5195 | 4400 | 0.2151 | 0.7815 | 0.6634 | 0.6301 | 0.7489 | 0.8251 | 0.8101 | 0.5450 |
1384
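The `*_cosine_ndcg@200` columns in the logs above use the standard binary-relevance NDCG: each relevant hit contributes a gain discounted by log2(rank + 1), normalized by the DCG of the ideal ordering. A minimal sketch with a toy ranking:

```python
import math

def ndcg_at_k(ranked_relevance, num_relevant, k):
    # DCG over the top-k with binary gains, normalized by the ideal DCG.
    dcg = sum(rel / math.log2(i + 1)
              for i, rel in enumerate(ranked_relevance[:k], start=1))
    ideal = sum(1.0 / math.log2(i + 1)
                for i in range(1, min(num_relevant, k) + 1))
    return dcg / ideal if ideal else 0.0

# Relevant docs at ranks 1 and 3, out of 2 relevant total.
print(round(ndcg_at_k([1, 0, 1, 0], 2, 4), 4))  # 0.9197
```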
+
1385
+
1386
+ ### Framework Versions
1387
+ - Python: 3.11.11
1388
+ - Sentence Transformers: 4.1.0
1389
+ - Transformers: 4.51.2
1390
+ - PyTorch: 2.6.0+cu124
1391
+ - Accelerate: 1.6.0
1392
+ - Datasets: 3.5.0
1393
+ - Tokenizers: 0.21.1
1394
+
1395
+ ## Citation
1396
+
1397
+ ### BibTeX
1398
+
1399
+ #### Sentence Transformers
1400
+ ```bibtex
1401
+ @inproceedings{reimers-2019-sentence-bert,
1402
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
1403
+ author = "Reimers, Nils and Gurevych, Iryna",
1404
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
1405
+ month = "11",
1406
+ year = "2019",
1407
+ publisher = "Association for Computational Linguistics",
1408
+ url = "https://arxiv.org/abs/1908.10084",
1409
+ }
1410
+ ```
1411
+
1412
+ #### GISTEmbedLoss
1413
+ ```bibtex
1414
+ @misc{solatorio2024gistembed,
1415
+ title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
1416
+ author={Aivin V. Solatorio},
1417
+ year={2024},
1418
+ eprint={2402.16829},
1419
+ archivePrefix={arXiv},
1420
+ primaryClass={cs.LG}
1421
+ }
1422
+ ```
1423
+
1424
+ <!--
1425
+ ## Glossary
1426
+
1427
+ *Clearly define terms in order to be accessible across audiences.*
1428
+ -->
1429
+
1430
+ <!--
1431
+ ## Model Card Authors
1432
+
1433
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
1434
+ -->
1435
+
1436
+ <!--
1437
+ ## Model Card Contact
1438
+
1439
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
1440
+ -->
checkpoint-4400/config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
1
+ {
2
+ "__version__": {
3
+ "sentence_transformers": "4.1.0",
4
+ "transformers": "4.51.2",
5
+ "pytorch": "2.6.0+cu124"
6
+ },
7
+ "prompts": {},
8
+ "default_prompt_name": null,
9
+ "similarity_fn_name": "cosine"
10
+ }
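This config pins `similarity_fn_name` to `cosine`, so scores between embeddings are plain cosine similarity: the dot product divided by the product of the vector norms. A dependency-free sketch:

```python
import math

def cosine_similarity(u, v):
    # dot(u, v) / (||u|| * ||v||); undefined for zero-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0
```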
checkpoint-4400/rng_state.pth ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:fc07beb5989f5633d0eef7e35354d8f2551911ba9a300cafdd5fceb1120bf15a
3
+ size 15958
checkpoint-4400/sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
1
+ {
2
+ "max_seq_length": 512,
3
+ "do_lower_case": false
4
+ }
checkpoint-4400/sentencepiece.bpe.model ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:cfc8146abe2a0488e9e2a0c56de7952f7c11ab059eca145a0a727afce0db2865
3
+ size 5069051
checkpoint-4400/special_tokens_map.json ADDED
@@ -0,0 +1,51 @@
1
+ {
2
+ "bos_token": {
3
+ "content": "<s>",
4
+ "lstrip": false,
5
+ "normalized": false,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "cls_token": {
10
+ "content": "<s>",
11
+ "lstrip": false,
12
+ "normalized": false,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "eos_token": {
17
+ "content": "</s>",
18
+ "lstrip": false,
19
+ "normalized": false,
20
+ "rstrip": false,
21
+ "single_word": false
22
+ },
23
+ "mask_token": {
24
+ "content": "<mask>",
25
+ "lstrip": true,
26
+ "normalized": false,
27
+ "rstrip": false,
28
+ "single_word": false
29
+ },
30
+ "pad_token": {
31
+ "content": "<pad>",
32
+ "lstrip": false,
33
+ "normalized": false,
34
+ "rstrip": false,
35
+ "single_word": false
36
+ },
37
+ "sep_token": {
38
+ "content": "</s>",
39
+ "lstrip": false,
40
+ "normalized": false,
41
+ "rstrip": false,
42
+ "single_word": false
43
+ },
44
+ "unk_token": {
45
+ "content": "<unk>",
46
+ "lstrip": false,
47
+ "normalized": false,
48
+ "rstrip": false,
49
+ "single_word": false
50
+ }
51
+ }
checkpoint-4400/tokenizer.json ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:d9a6af42442a3e3e9f05f618eae0bb2d98ca4f6a6406cb80ef7a4fa865204d61
3
+ size 17083052
checkpoint-4400/tokenizer_config.json ADDED
@@ -0,0 +1,56 @@
1
+ {
2
+ "added_tokens_decoder": {
3
+ "0": {
4
+ "content": "<s>",
5
+ "lstrip": false,
6
+ "normalized": false,
7
+ "rstrip": false,
8
+ "single_word": false,
9
+ "special": true
10
+ },
11
+ "1": {
12
+ "content": "<pad>",
13
+ "lstrip": false,
14
+ "normalized": false,
15
+ "rstrip": false,
16
+ "single_word": false,
17
+ "special": true
18
+ },
19
+ "2": {
20
+ "content": "</s>",
21
+ "lstrip": false,
22
+ "normalized": false,
23
+ "rstrip": false,
24
+ "single_word": false,
25
+ "special": true
26
+ },
27
+ "3": {
28
+ "content": "<unk>",
29
+ "lstrip": false,
30
+ "normalized": false,
31
+ "rstrip": false,
32
+ "single_word": false,
33
+ "special": true
34
+ },
35
+ "250001": {
36
+ "content": "<mask>",
37
+ "lstrip": true,
38
+ "normalized": false,
39
+ "rstrip": false,
40
+ "single_word": false,
41
+ "special": true
42
+ }
43
+ },
44
+ "bos_token": "<s>",
45
+ "clean_up_tokenization_spaces": true,
46
+ "cls_token": "<s>",
47
+ "eos_token": "</s>",
48
+ "extra_special_tokens": {},
49
+ "mask_token": "<mask>",
50
+ "model_max_length": 8192,
51
+ "pad_token": "<pad>",
52
+ "sep_token": "</s>",
53
+ "sp_model_kwargs": {},
54
+ "tokenizer_class": "XLMRobertaTokenizer",
55
+ "unk_token": "<unk>"
56
+ }
checkpoint-4600/1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
1
+ {
2
+ "word_embedding_dimension": 1024,
3
+ "pooling_mode_cls_token": true,
4
+ "pooling_mode_mean_tokens": false,
5
+ "pooling_mode_max_tokens": false,
6
+ "pooling_mode_mean_sqrt_len_tokens": false,
7
+ "pooling_mode_weightedmean_tokens": false,
8
+ "pooling_mode_lasttoken": false,
9
+ "include_prompt": true
10
+ }
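This pooling config selects CLS-token pooling over 1024-dimensional token embeddings: the sentence embedding is simply the first token's vector (unlike the 384-dim mean pooling used by the guide model in the loss configs above). A toy illustration of the two modes:

```python
def cls_pool(token_embeddings):
    # CLS pooling: take the first token's embedding as the sentence vector.
    return token_embeddings[0]

def mean_pool(token_embeddings):
    # Mean pooling: average each dimension across all tokens.
    dim = len(token_embeddings[0])
    n = len(token_embeddings)
    return [sum(tok[d] for tok in token_embeddings) / n for d in range(dim)]

tokens = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]  # 3 tokens, dim 2
print(cls_pool(tokens))   # [1.0, 2.0]
print(mean_pool(tokens))  # [3.0, 4.0]
```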
checkpoint-4600/README.md ADDED
@@ -0,0 +1,1442 @@
1
+ ---
2
+ tags:
3
+ - sentence-transformers
4
+ - sentence-similarity
5
+ - feature-extraction
6
+ - generated_from_trainer
7
+ - dataset_size:124788
8
+ - loss:GISTEmbedLoss
9
+ base_model: BAAI/bge-m3
10
+ widget:
11
+ - source_sentence: 其他机械、设备和有形货物租赁服务代表
12
+ sentences:
13
+ - 其他机械和设备租赁服务工作人员
14
+ - 电子和电信设备及零部件物流经理
15
+ - 工业主厨
16
+ - source_sentence: 公交车司机
17
+ sentences:
18
+ - 表演灯光设计师
19
+ - 乙烯基地板安装工
20
+ - 国际巴士司机
21
+ - source_sentence: online communication manager
22
+ sentences:
23
+ - trades union official
24
+ - social media manager
25
+ - budget manager
26
+ - source_sentence: Projektmanagerin
27
+ sentences:
28
+ - Projektmanager/Projektmanagerin
29
+ - Category-Manager
30
+ - Infanterist
31
+ - source_sentence: Volksvertreter
32
+ sentences:
33
+ - Parlamentarier
34
+ - Oberbürgermeister
35
+ - Konsul
36
+ pipeline_tag: sentence-similarity
37
+ library_name: sentence-transformers
38
+ metrics:
39
+ - cosine_accuracy@1
40
+ - cosine_accuracy@20
41
+ - cosine_accuracy@50
42
+ - cosine_accuracy@100
43
+ - cosine_accuracy@150
44
+ - cosine_accuracy@200
45
+ - cosine_precision@1
46
+ - cosine_precision@20
47
+ - cosine_precision@50
48
+ - cosine_precision@100
49
+ - cosine_precision@150
50
+ - cosine_precision@200
51
+ - cosine_recall@1
52
+ - cosine_recall@20
53
+ - cosine_recall@50
54
+ - cosine_recall@100
55
+ - cosine_recall@150
56
+ - cosine_recall@200
57
+ - cosine_ndcg@1
58
+ - cosine_ndcg@20
59
+ - cosine_ndcg@50
60
+ - cosine_ndcg@100
61
+ - cosine_ndcg@150
62
+ - cosine_ndcg@200
63
+ - cosine_mrr@1
64
+ - cosine_mrr@20
65
+ - cosine_mrr@50
66
+ - cosine_mrr@100
67
+ - cosine_mrr@150
68
+ - cosine_mrr@200
69
+ - cosine_map@1
70
+ - cosine_map@20
71
+ - cosine_map@50
72
+ - cosine_map@100
73
+ - cosine_map@150
74
+ - cosine_map@200
75
+ - cosine_map@500
76
+ model-index:
77
+ - name: SentenceTransformer based on BAAI/bge-m3
78
+ results:
79
+ - task:
80
+ type: information-retrieval
81
+ name: Information Retrieval
82
+ dataset:
83
+ name: full en
84
+ type: full_en
85
+ metrics:
86
+ - type: cosine_accuracy@1
87
+ value: 0.6476190476190476
88
+ name: Cosine Accuracy@1
89
+ - type: cosine_accuracy@20
90
+ value: 0.9904761904761905
91
+ name: Cosine Accuracy@20
92
+ - type: cosine_accuracy@50
93
+ value: 0.9904761904761905
94
+ name: Cosine Accuracy@50
95
+ - type: cosine_accuracy@100
96
+ value: 0.9904761904761905
97
+ name: Cosine Accuracy@100
98
+ - type: cosine_accuracy@150
99
+ value: 0.9904761904761905
100
+ name: Cosine Accuracy@150
101
+ - type: cosine_accuracy@200
102
+ value: 0.9904761904761905
103
+ name: Cosine Accuracy@200
104
+ - type: cosine_precision@1
105
+ value: 0.6476190476190476
106
+ name: Cosine Precision@1
107
+ - type: cosine_precision@20
108
+ value: 0.5033333333333333
109
+ name: Cosine Precision@20
110
+ - type: cosine_precision@50
111
+ value: 0.3051428571428572
112
+ name: Cosine Precision@50
113
+ - type: cosine_precision@100
114
+ value: 0.18504761904761904
115
+ name: Cosine Precision@100
116
+ - type: cosine_precision@150
117
+ value: 0.13263492063492063
118
+ name: Cosine Precision@150
119
+ - type: cosine_precision@200
120
+ value: 0.10238095238095238
121
+ name: Cosine Precision@200
122
+ - type: cosine_recall@1
123
+ value: 0.06690172806447445
124
+ name: Cosine Recall@1
125
+ - type: cosine_recall@20
126
+ value: 0.5361893486281004
127
+ name: Cosine Recall@20
128
+ - type: cosine_recall@50
129
+ value: 0.7178301231768206
130
+ name: Cosine Recall@50
131
+ - type: cosine_recall@100
132
+ value: 0.8209713456799689
133
+ name: Cosine Recall@100
134
+ - type: cosine_recall@150
135
+ value: 0.8719838465781551
136
+ name: Cosine Recall@150
137
+ - type: cosine_recall@200
138
+ value: 0.9002628694890553
139
+ name: Cosine Recall@200
140
+ - type: cosine_ndcg@1
141
+ value: 0.6476190476190476
142
+ name: Cosine Ndcg@1
143
+ - type: cosine_ndcg@20
144
+ value: 0.6792043770713534
145
+ name: Cosine Ndcg@20
146
+ - type: cosine_ndcg@50
147
+ value: 0.6952356840844034
      name: Cosine Ndcg@50
    - type: cosine_ndcg@100
      value: 0.7491776279498115
      name: Cosine Ndcg@100
    - type: cosine_ndcg@150
      value: 0.7714889294157944
      name: Cosine Ndcg@150
    - type: cosine_ndcg@200
      value: 0.7814307168109694
      name: Cosine Ndcg@200
    - type: cosine_mrr@1
      value: 0.6476190476190476
      name: Cosine Mrr@1
    - type: cosine_mrr@20
      value: 0.7979365079365079
      name: Cosine Mrr@20
    - type: cosine_mrr@50
      value: 0.7979365079365079
      name: Cosine Mrr@50
    - type: cosine_mrr@100
      value: 0.7979365079365079
      name: Cosine Mrr@100
    - type: cosine_mrr@150
      value: 0.7979365079365079
      name: Cosine Mrr@150
    - type: cosine_mrr@200
      value: 0.7979365079365079
      name: Cosine Mrr@200
    - type: cosine_map@1
      value: 0.6476190476190476
      name: Cosine Map@1
    - type: cosine_map@20
      value: 0.5373325378117988
      name: Cosine Map@20
    - type: cosine_map@50
      value: 0.5240005650356997
      name: Cosine Map@50
    - type: cosine_map@100
      value: 0.5562356661851569
      name: Cosine Map@100
    - type: cosine_map@150
      value: 0.5654875568184526
      name: Cosine Map@150
    - type: cosine_map@200
      value: 0.5682618444726486
      name: Cosine Map@200
    - type: cosine_map@500
      value: 0.5731371282665402
      name: Cosine Map@500
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: full es
      type: full_es
    metrics:
    - type: cosine_accuracy@1
      value: 0.11351351351351352
      name: Cosine Accuracy@1
    - type: cosine_accuracy@20
      value: 1.0
      name: Cosine Accuracy@20
    - type: cosine_accuracy@50
      value: 1.0
      name: Cosine Accuracy@50
    - type: cosine_accuracy@100
      value: 1.0
      name: Cosine Accuracy@100
    - type: cosine_accuracy@150
      value: 1.0
      name: Cosine Accuracy@150
    - type: cosine_accuracy@200
      value: 1.0
      name: Cosine Accuracy@200
    - type: cosine_precision@1
      value: 0.11351351351351352
      name: Cosine Precision@1
    - type: cosine_precision@20
      value: 0.5654054054054054
      name: Cosine Precision@20
    - type: cosine_precision@50
      value: 0.3897297297297298
      name: Cosine Precision@50
    - type: cosine_precision@100
      value: 0.25324324324324327
      name: Cosine Precision@100
    - type: cosine_precision@150
      value: 0.1901981981981982
      name: Cosine Precision@150
    - type: cosine_precision@200
      value: 0.15102702702702703
      name: Cosine Precision@200
    - type: cosine_recall@1
      value: 0.0034454146142631225
      name: Cosine Recall@1
    - type: cosine_recall@20
      value: 0.37952988347964417
      name: Cosine Recall@20
    - type: cosine_recall@50
      value: 0.5627963647085502
      name: Cosine Recall@50
    - type: cosine_recall@100
      value: 0.6735817765534955
      name: Cosine Recall@100
    - type: cosine_recall@150
      value: 0.735181694329396
      name: Cosine Recall@150
    - type: cosine_recall@200
      value: 0.7691645515769362
      name: Cosine Recall@200
    - type: cosine_ndcg@1
      value: 0.11351351351351352
      name: Cosine Ndcg@1
    - type: cosine_ndcg@20
      value: 0.6127764701851742
      name: Cosine Ndcg@20
    - type: cosine_ndcg@50
      value: 0.5903129713737418
      name: Cosine Ndcg@50
    - type: cosine_ndcg@100
      value: 0.6173381508468064
      name: Cosine Ndcg@100
    - type: cosine_ndcg@150
      value: 0.6486192970671256
      name: Cosine Ndcg@150
    - type: cosine_ndcg@200
      value: 0.6654238285942606
      name: Cosine Ndcg@200
    - type: cosine_mrr@1
      value: 0.11351351351351352
      name: Cosine Mrr@1
    - type: cosine_mrr@20
      value: 0.5531531531531532
      name: Cosine Mrr@20
    - type: cosine_mrr@50
      value: 0.5531531531531532
      name: Cosine Mrr@50
    - type: cosine_mrr@100
      value: 0.5531531531531532
      name: Cosine Mrr@100
    - type: cosine_mrr@150
      value: 0.5531531531531532
      name: Cosine Mrr@150
    - type: cosine_mrr@200
      value: 0.5531531531531532
      name: Cosine Mrr@200
    - type: cosine_map@1
      value: 0.11351351351351352
      name: Cosine Map@1
    - type: cosine_map@20
      value: 0.47962787853124583
      name: Cosine Map@20
    - type: cosine_map@50
      value: 0.43007675118643823
      name: Cosine Map@50
    - type: cosine_map@100
      value: 0.4338635407926422
      name: Cosine Map@100
    - type: cosine_map@150
      value: 0.4486007040723234
      name: Cosine Map@150
    - type: cosine_map@200
      value: 0.4552653077040697
      name: Cosine Map@200
    - type: cosine_map@500
      value: 0.46787303947870823
      name: Cosine Map@500
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: full de
      type: full_de
    metrics:
    - type: cosine_accuracy@1
      value: 0.2955665024630542
      name: Cosine Accuracy@1
    - type: cosine_accuracy@20
      value: 0.9852216748768473
      name: Cosine Accuracy@20
    - type: cosine_accuracy@50
      value: 0.9901477832512315
      name: Cosine Accuracy@50
    - type: cosine_accuracy@100
      value: 0.9901477832512315
      name: Cosine Accuracy@100
    - type: cosine_accuracy@150
      value: 0.9901477832512315
      name: Cosine Accuracy@150
    - type: cosine_accuracy@200
      value: 0.9901477832512315
      name: Cosine Accuracy@200
    - type: cosine_precision@1
      value: 0.2955665024630542
      name: Cosine Precision@1
    - type: cosine_precision@20
      value: 0.5406403940886699
      name: Cosine Precision@20
    - type: cosine_precision@50
      value: 0.38216748768472913
      name: Cosine Precision@50
    - type: cosine_precision@100
      value: 0.24970443349753693
      name: Cosine Precision@100
    - type: cosine_precision@150
      value: 0.18712643678160917
      name: Cosine Precision@150
    - type: cosine_precision@200
      value: 0.14992610837438422
      name: Cosine Precision@200
    - type: cosine_recall@1
      value: 0.01108543831680986
      name: Cosine Recall@1
    - type: cosine_recall@20
      value: 0.3424896767056911
      name: Cosine Recall@20
    - type: cosine_recall@50
      value: 0.5329881987535446
      name: Cosine Recall@50
    - type: cosine_recall@100
      value: 0.64863278001433
      name: Cosine Recall@100
    - type: cosine_recall@150
      value: 0.7085620603885778
      name: Cosine Recall@150
    - type: cosine_recall@200
      value: 0.7485450670219227
      name: Cosine Recall@200
    - type: cosine_ndcg@1
      value: 0.2955665024630542
      name: Cosine Ndcg@1
    - type: cosine_ndcg@20
      value: 0.5645228585682827
      name: Cosine Ndcg@20
    - type: cosine_ndcg@50
      value: 0.5512955891986082
      name: Cosine Ndcg@50
    - type: cosine_ndcg@100
      value: 0.5785579235074741
      name: Cosine Ndcg@100
    - type: cosine_ndcg@150
      value: 0.6098517448142098
      name: Cosine Ndcg@150
    - type: cosine_ndcg@200
      value: 0.6294320172892106
      name: Cosine Ndcg@200
    - type: cosine_mrr@1
      value: 0.2955665024630542
      name: Cosine Mrr@1
    - type: cosine_mrr@20
      value: 0.5164545268058354
      name: Cosine Mrr@20
    - type: cosine_mrr@50
      value: 0.5165841612367403
      name: Cosine Mrr@50
    - type: cosine_mrr@100
      value: 0.5165841612367403
      name: Cosine Mrr@100
    - type: cosine_mrr@150
      value: 0.5165841612367403
      name: Cosine Mrr@150
    - type: cosine_mrr@200
      value: 0.5165841612367403
      name: Cosine Mrr@200
    - type: cosine_map@1
      value: 0.2955665024630542
      name: Cosine Map@1
    - type: cosine_map@20
      value: 0.4226075638788329
      name: Cosine Map@20
    - type: cosine_map@50
      value: 0.3782003700226031
      name: Cosine Map@50
    - type: cosine_map@100
      value: 0.3819826209063663
      name: Cosine Map@100
    - type: cosine_map@150
      value: 0.39685072286452894
      name: Cosine Map@150
    - type: cosine_map@200
      value: 0.40449418055036124
      name: Cosine Map@200
    - type: cosine_map@500
      value: 0.41779880211141207
      name: Cosine Map@500
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: full zh
      type: full_zh
    metrics:
    - type: cosine_accuracy@1
      value: 0.6796116504854369
      name: Cosine Accuracy@1
    - type: cosine_accuracy@20
      value: 0.9902912621359223
      name: Cosine Accuracy@20
    - type: cosine_accuracy@50
      value: 0.9902912621359223
      name: Cosine Accuracy@50
    - type: cosine_accuracy@100
      value: 0.9902912621359223
      name: Cosine Accuracy@100
    - type: cosine_accuracy@150
      value: 0.9902912621359223
      name: Cosine Accuracy@150
    - type: cosine_accuracy@200
      value: 0.9902912621359223
      name: Cosine Accuracy@200
    - type: cosine_precision@1
      value: 0.6796116504854369
      name: Cosine Precision@1
    - type: cosine_precision@20
      value: 0.47087378640776706
      name: Cosine Precision@20
    - type: cosine_precision@50
      value: 0.27941747572815534
      name: Cosine Precision@50
    - type: cosine_precision@100
      value: 0.1731067961165048
      name: Cosine Precision@100
    - type: cosine_precision@150
      value: 0.12427184466019418
      name: Cosine Precision@150
    - type: cosine_precision@200
      value: 0.09771844660194175
      name: Cosine Precision@200
    - type: cosine_recall@1
      value: 0.06824731124903408
      name: Cosine Recall@1
    - type: cosine_recall@20
      value: 0.5113376722170427
      name: Cosine Recall@20
    - type: cosine_recall@50
      value: 0.6702499652138949
      name: Cosine Recall@50
    - type: cosine_recall@100
      value: 0.7852754872268105
      name: Cosine Recall@100
    - type: cosine_recall@150
      value: 0.8349821570732121
      name: Cosine Recall@150
    - type: cosine_recall@200
      value: 0.8695735570177355
      name: Cosine Recall@200
    - type: cosine_ndcg@1
      value: 0.6796116504854369
      name: Cosine Ndcg@1
    - type: cosine_ndcg@20
      value: 0.6530045751641177
      name: Cosine Ndcg@20
    - type: cosine_ndcg@50
      value: 0.660508346535501
      name: Cosine Ndcg@50
    - type: cosine_ndcg@100
      value: 0.7168972604426495
      name: Cosine Ndcg@100
    - type: cosine_ndcg@150
      value: 0.7384624839319754
      name: Cosine Ndcg@150
    - type: cosine_ndcg@200
      value: 0.7523092684328068
      name: Cosine Ndcg@200
    - type: cosine_mrr@1
      value: 0.6796116504854369
      name: Cosine Mrr@1
    - type: cosine_mrr@20
      value: 0.8217907227615966
      name: Cosine Mrr@20
    - type: cosine_mrr@50
      value: 0.8217907227615966
      name: Cosine Mrr@50
    - type: cosine_mrr@100
      value: 0.8217907227615966
      name: Cosine Mrr@100
    - type: cosine_mrr@150
      value: 0.8217907227615966
      name: Cosine Mrr@150
    - type: cosine_mrr@200
      value: 0.8217907227615966
      name: Cosine Mrr@200
    - type: cosine_map@1
      value: 0.6796116504854369
      name: Cosine Map@1
    - type: cosine_map@20
      value: 0.5041722603829375
      name: Cosine Map@20
    - type: cosine_map@50
      value: 0.48337509847759624
      name: Cosine Map@50
    - type: cosine_map@100
      value: 0.5132814739940859
      name: Cosine Map@100
    - type: cosine_map@150
      value: 0.5217542791488864
      name: Cosine Map@150
    - type: cosine_map@200
      value: 0.5259152810735493
      name: Cosine Map@200
    - type: cosine_map@500
      value: 0.5307415745410603
      name: Cosine Map@500
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: mix es
      type: mix_es
    metrics:
    - type: cosine_accuracy@1
      value: 0.7384295371814873
      name: Cosine Accuracy@1
    - type: cosine_accuracy@20
      value: 0.9630785231409257
      name: Cosine Accuracy@20
    - type: cosine_accuracy@50
      value: 0.983359334373375
      name: Cosine Accuracy@50
    - type: cosine_accuracy@100
      value: 0.9921996879875195
      name: Cosine Accuracy@100
    - type: cosine_accuracy@150
      value: 0.9942797711908476
      name: Cosine Accuracy@150
    - type: cosine_accuracy@200
      value: 0.9963598543941757
      name: Cosine Accuracy@200
    - type: cosine_precision@1
      value: 0.7384295371814873
      name: Cosine Precision@1
    - type: cosine_precision@20
      value: 0.12472698907956316
      name: Cosine Precision@20
    - type: cosine_precision@50
      value: 0.05177327093083725
      name: Cosine Precision@50
    - type: cosine_precision@100
      value: 0.026271450858034326
      name: Cosine Precision@100
    - type: cosine_precision@150
      value: 0.017635638758883684
      name: Cosine Precision@150
    - type: cosine_precision@200
      value: 0.013273530941237652
      name: Cosine Precision@200
    - type: cosine_recall@1
      value: 0.285115023648565
      name: Cosine Recall@1
    - type: cosine_recall@20
      value: 0.9216415323279598
      name: Cosine Recall@20
    - type: cosine_recall@50
      value: 0.9554082163286531
      name: Cosine Recall@50
    - type: cosine_recall@100
      value: 0.9698387935517421
      name: Cosine Recall@100
    - type: cosine_recall@150
      value: 0.9766337320159473
      name: Cosine Recall@150
    - type: cosine_recall@200
      value: 0.9803813485872768
      name: Cosine Recall@200
    - type: cosine_ndcg@1
      value: 0.7384295371814873
      name: Cosine Ndcg@1
    - type: cosine_ndcg@20
      value: 0.8113619374567514
      name: Cosine Ndcg@20
    - type: cosine_ndcg@50
      value: 0.8206771431445348
      name: Cosine Ndcg@50
    - type: cosine_ndcg@100
      value: 0.8238555673509024
      name: Cosine Ndcg@100
    - type: cosine_ndcg@150
      value: 0.8251818493639627
      name: Cosine Ndcg@150
    - type: cosine_ndcg@200
      value: 0.8258365367322651
      name: Cosine Ndcg@200
    - type: cosine_mrr@1
      value: 0.7384295371814873
      name: Cosine Mrr@1
    - type: cosine_mrr@20
      value: 0.8055430723700101
      name: Cosine Mrr@20
    - type: cosine_mrr@50
      value: 0.806216320765459
      name: Cosine Mrr@50
    - type: cosine_mrr@100
      value: 0.8063554621873477
      name: Cosine Mrr@100
    - type: cosine_mrr@150
      value: 0.8063737837993774
      name: Cosine Mrr@150
    - type: cosine_mrr@200
      value: 0.8063859337270688
      name: Cosine Mrr@200
    - type: cosine_map@1
      value: 0.7384295371814873
      name: Cosine Map@1
    - type: cosine_map@20
      value: 0.7435414485170373
      name: Cosine Map@20
    - type: cosine_map@50
      value: 0.7461028121794654
      name: Cosine Map@50
    - type: cosine_map@100
      value: 0.7465435818030448
      name: Cosine Map@100
    - type: cosine_map@150
      value: 0.7466754353712773
      name: Cosine Map@150
    - type: cosine_map@200
      value: 0.7467184450113576
      name: Cosine Map@200
    - type: cosine_map@500
      value: 0.746815569330807
      name: Cosine Map@500
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: mix de
      type: mix_de
    metrics:
    - type: cosine_accuracy@1
      value: 0.6895475819032761
      name: Cosine Accuracy@1
    - type: cosine_accuracy@20
      value: 0.9672386895475819
      name: Cosine Accuracy@20
    - type: cosine_accuracy@50
      value: 0.983879355174207
      name: Cosine Accuracy@50
    - type: cosine_accuracy@100
      value: 0.9921996879875195
      name: Cosine Accuracy@100
    - type: cosine_accuracy@150
      value: 0.9932397295891836
      name: Cosine Accuracy@150
    - type: cosine_accuracy@200
      value: 0.9942797711908476
      name: Cosine Accuracy@200
    - type: cosine_precision@1
      value: 0.6895475819032761
      name: Cosine Precision@1
    - type: cosine_precision@20
      value: 0.12810712428497137
      name: Cosine Precision@20
    - type: cosine_precision@50
      value: 0.05329173166926679
      name: Cosine Precision@50
    - type: cosine_precision@100
      value: 0.0270566822672907
      name: Cosine Precision@100
    - type: cosine_precision@150
      value: 0.018107124284971396
      name: Cosine Precision@150
    - type: cosine_precision@200
      value: 0.013619344773790953
      name: Cosine Precision@200
    - type: cosine_recall@1
      value: 0.2593430403882822
      name: Cosine Recall@1
    - type: cosine_recall@20
      value: 0.9296931877275091
      name: Cosine Recall@20
    - type: cosine_recall@50
      value: 0.9652886115444618
      name: Cosine Recall@50
    - type: cosine_recall@100
      value: 0.9797191887675507
      name: Cosine Recall@100
    - type: cosine_recall@150
      value: 0.9835326746403189
      name: Cosine Recall@150
    - type: cosine_recall@200
      value: 0.9862194487779511
      name: Cosine Recall@200
    - type: cosine_ndcg@1
      value: 0.6895475819032761
      name: Cosine Ndcg@1
    - type: cosine_ndcg@20
      value: 0.7960673141716103
      name: Cosine Ndcg@20
    - type: cosine_ndcg@50
      value: 0.8059435010700053
      name: Cosine Ndcg@50
    - type: cosine_ndcg@100
      value: 0.8091935286265849
      name: Cosine Ndcg@100
    - type: cosine_ndcg@150
      value: 0.8099392087586132
      name: Cosine Ndcg@150
    - type: cosine_ndcg@200
      value: 0.8104433137439041
      name: Cosine Ndcg@200
    - type: cosine_mrr@1
      value: 0.6895475819032761
      name: Cosine Mrr@1
    - type: cosine_mrr@20
      value: 0.77474131277538
      name: Cosine Mrr@20
    - type: cosine_mrr@50
      value: 0.7752786950401535
      name: Cosine Mrr@50
    - type: cosine_mrr@100
      value: 0.7753996393885253
      name: Cosine Mrr@100
    - type: cosine_mrr@150
      value: 0.7754090671651442
      name: Cosine Mrr@150
    - type: cosine_mrr@200
      value: 0.7754150110363532
      name: Cosine Mrr@200
    - type: cosine_map@1
      value: 0.6895475819032761
      name: Cosine Map@1
    - type: cosine_map@20
      value: 0.7198609917140115
      name: Cosine Map@20
    - type: cosine_map@50
      value: 0.7225763770177105
      name: Cosine Map@50
    - type: cosine_map@100
      value: 0.7230590007971497
      name: Cosine Map@100
    - type: cosine_map@150
      value: 0.7231361328506057
      name: Cosine Map@150
    - type: cosine_map@200
      value: 0.7231741651357827
      name: Cosine Map@200
    - type: cosine_map@500
      value: 0.7232269917591311
      name: Cosine Map@500
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: mix zh
      type: mix_zh
    metrics:
    - type: cosine_accuracy@1
      value: 0.17836713468538742
      name: Cosine Accuracy@1
    - type: cosine_accuracy@20
      value: 1.0
      name: Cosine Accuracy@20
    - type: cosine_accuracy@50
      value: 1.0
      name: Cosine Accuracy@50
    - type: cosine_accuracy@100
      value: 1.0
      name: Cosine Accuracy@100
    - type: cosine_accuracy@150
      value: 1.0
      name: Cosine Accuracy@150
    - type: cosine_accuracy@200
      value: 1.0
      name: Cosine Accuracy@200
    - type: cosine_precision@1
      value: 0.17836713468538742
      name: Cosine Precision@1
    - type: cosine_precision@20
      value: 0.15439417576703063
      name: Cosine Precision@20
    - type: cosine_precision@50
      value: 0.0617576703068123
      name: Cosine Precision@50
    - type: cosine_precision@100
      value: 0.03087883515340615
      name: Cosine Precision@100
    - type: cosine_precision@150
      value: 0.020585890102270757
      name: Cosine Precision@150
    - type: cosine_precision@200
      value: 0.015439417576703075
      name: Cosine Precision@200
    - type: cosine_recall@1
      value: 0.05735829433177326
      name: Cosine Recall@1
    - type: cosine_recall@20
      value: 1.0
      name: Cosine Recall@20
    - type: cosine_recall@50
      value: 1.0
      name: Cosine Recall@50
    - type: cosine_recall@100
      value: 1.0
      name: Cosine Recall@100
    - type: cosine_recall@150
      value: 1.0
      name: Cosine Recall@150
    - type: cosine_recall@200
      value: 1.0
      name: Cosine Recall@200
    - type: cosine_ndcg@1
      value: 0.17836713468538742
      name: Cosine Ndcg@1
    - type: cosine_ndcg@20
      value: 0.5435666858139967
      name: Cosine Ndcg@20
    - type: cosine_ndcg@50
      value: 0.5435666858139967
      name: Cosine Ndcg@50
    - type: cosine_ndcg@100
      value: 0.5435666858139967
      name: Cosine Ndcg@100
    - type: cosine_ndcg@150
      value: 0.5435666858139967
      name: Cosine Ndcg@150
    - type: cosine_ndcg@200
      value: 0.5435666858139967
      name: Cosine Ndcg@200
    - type: cosine_mrr@1
      value: 0.17836713468538742
      name: Cosine Mrr@1
    - type: cosine_mrr@20
      value: 0.39877387158978467
      name: Cosine Mrr@20
    - type: cosine_mrr@50
      value: 0.39877387158978467
      name: Cosine Mrr@50
    - type: cosine_mrr@100
      value: 0.39877387158978467
      name: Cosine Mrr@100
    - type: cosine_mrr@150
      value: 0.39877387158978467
      name: Cosine Mrr@150
    - type: cosine_mrr@200
      value: 0.39877387158978467
      name: Cosine Mrr@200
    - type: cosine_map@1
      value: 0.17836713468538742
      name: Cosine Map@1
    - type: cosine_map@20
      value: 0.3263343144665112
      name: Cosine Map@20
    - type: cosine_map@50
      value: 0.3263343144665112
      name: Cosine Map@50
    - type: cosine_map@100
      value: 0.3263343144665112
      name: Cosine Map@100
    - type: cosine_map@150
      value: 0.3263343144665112
      name: Cosine Map@150
    - type: cosine_map@200
      value: 0.3263343144665112
      name: Cosine Map@200
    - type: cosine_map@500
      value: 0.3263343144665112
      name: Cosine Map@500
---

# SentenceTransformer based on BAAI/bge-m3

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) on the full_en, full_de, full_es, full_zh and mix datasets. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) <!-- at revision 5617a9f61b028005a4858fdac845db406aefb181 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Datasets:**
  - full_en
  - full_de
  - full_es
  - full_zh
  - mix
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```
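The Pooling module above is configured for CLS-token pooling, and the final Normalize() module rescales the sentence embedding to unit length. As a toy illustration (plain Python lists standing in for the batched torch tensors the real modules receive), those two steps amount to:

```python
import math

def cls_pool(token_embeddings):
    # CLS pooling: the sentence embedding is the first ([CLS]) token's vector
    return token_embeddings[0]

def l2_normalize(vec):
    # Normalize(): rescale to unit L2 norm so cosine similarity reduces to a dot product
    norm = math.sqrt(sum(x * x for x in vec))
    return [x / norm for x in vec]

# Toy "token embeddings" for a 3-token sequence with dimension 4
tokens = [[3.0, 4.0, 0.0, 0.0],
          [1.0, 0.0, 0.0, 0.0],
          [0.0, 2.0, 0.0, 0.0]]
sentence_embedding = l2_normalize(cls_pool(tokens))
print(sentence_embedding)  # [0.6, 0.8, 0.0, 0.0]
```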
943
+
944
+ ## Usage
945
+
946
+ ### Direct Usage (Sentence Transformers)
947
+
948
+ First install the Sentence Transformers library:
949
+
950
+ ```bash
951
+ pip install -U sentence-transformers
952
+ ```
953
+
954
+ Then you can load this model and run inference.
955
+ ```python
956
+ from sentence_transformers import SentenceTransformer
957
+
958
+ # Download from the 🤗 Hub
959
+ model = SentenceTransformer("sentence_transformers_model_id")
960
+ # Run inference
961
+ sentences = [
962
+ 'Volksvertreter',
963
+ 'Parlamentarier',
964
+ 'Oberbürgermeister',
965
+ ]
966
+ embeddings = model.encode(sentences)
967
+ print(embeddings.shape)
968
+ # [3, 1024]
969
+
970
+ # Get the similarity scores for the embeddings
971
+ similarities = model.similarity(embeddings, embeddings)
972
+ print(similarities.shape)
973
+ # [3, 3]
974
+ ```
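Because the model ends with a Normalize() module, the embeddings it returns have unit length, so the cosine similarity that `model.similarity` computes by default coincides with a plain dot product. A self-contained sketch with made-up 2-dimensional vectors:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cosine(a, b):
    # Cosine similarity: dot product divided by the product of the two vector norms
    return dot(a, b) / (math.sqrt(dot(a, a)) * math.sqrt(dot(b, b)))

# Two unit-length toy "embeddings"
a = [0.6, 0.8]
b = [1.0, 0.0]
print(cosine(a, b))  # 0.6
print(dot(a, b))     # 0.6 -- identical because both vectors already have unit norm
```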

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Information Retrieval

* Datasets: `full_en`, `full_es`, `full_de`, `full_zh`, `mix_es`, `mix_de` and `mix_zh`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric               | full_en    | full_es    | full_de    | full_zh    | mix_es     | mix_de     | mix_zh     |
|:---------------------|:-----------|:-----------|:-----------|:-----------|:-----------|:-----------|:-----------|
| cosine_accuracy@1    | 0.6476     | 0.1135     | 0.2956     | 0.6796     | 0.7384     | 0.6895     | 0.1784     |
| cosine_accuracy@20   | 0.9905     | 1.0        | 0.9852     | 0.9903     | 0.9631     | 0.9672     | 1.0        |
| cosine_accuracy@50   | 0.9905     | 1.0        | 0.9901     | 0.9903     | 0.9834     | 0.9839     | 1.0        |
| cosine_accuracy@100  | 0.9905     | 1.0        | 0.9901     | 0.9903     | 0.9922     | 0.9922     | 1.0        |
| cosine_accuracy@150  | 0.9905     | 1.0        | 0.9901     | 0.9903     | 0.9943     | 0.9932     | 1.0        |
| cosine_accuracy@200  | 0.9905     | 1.0        | 0.9901     | 0.9903     | 0.9964     | 0.9943     | 1.0        |
| cosine_precision@1   | 0.6476     | 0.1135     | 0.2956     | 0.6796     | 0.7384     | 0.6895     | 0.1784     |
| cosine_precision@20  | 0.5033     | 0.5654     | 0.5406     | 0.4709     | 0.1247     | 0.1281     | 0.1544     |
| cosine_precision@50  | 0.3051     | 0.3897     | 0.3822     | 0.2794     | 0.0518     | 0.0533     | 0.0618     |
| cosine_precision@100 | 0.185      | 0.2532     | 0.2497     | 0.1731     | 0.0263     | 0.0271     | 0.0309     |
| cosine_precision@150 | 0.1326     | 0.1902     | 0.1871     | 0.1243     | 0.0176     | 0.0181     | 0.0206     |
| cosine_precision@200 | 0.1024     | 0.151      | 0.1499     | 0.0977     | 0.0133     | 0.0136     | 0.0154     |
| cosine_recall@1      | 0.0669     | 0.0034     | 0.0111     | 0.0682     | 0.2851     | 0.2593     | 0.0574     |
| cosine_recall@20     | 0.5362     | 0.3795     | 0.3425     | 0.5113     | 0.9216     | 0.9297     | 1.0        |
| cosine_recall@50     | 0.7178     | 0.5628     | 0.533      | 0.6702     | 0.9554     | 0.9653     | 1.0        |
| cosine_recall@100    | 0.821      | 0.6736     | 0.6486     | 0.7853     | 0.9698     | 0.9797     | 1.0        |
| cosine_recall@150    | 0.872      | 0.7352     | 0.7086     | 0.835      | 0.9766     | 0.9835     | 1.0        |
| cosine_recall@200    | 0.9003     | 0.7692     | 0.7485     | 0.8696     | 0.9804     | 0.9862     | 1.0        |
| cosine_ndcg@1        | 0.6476     | 0.1135     | 0.2956     | 0.6796     | 0.7384     | 0.6895     | 0.1784     |
| cosine_ndcg@20       | 0.6792     | 0.6128     | 0.5645     | 0.653      | 0.8114     | 0.7961     | 0.5436     |
| cosine_ndcg@50       | 0.6952     | 0.5903     | 0.5513     | 0.6605     | 0.8207     | 0.8059     | 0.5436     |
| cosine_ndcg@100      | 0.7492     | 0.6173     | 0.5786     | 0.7169     | 0.8239     | 0.8092     | 0.5436     |
| cosine_ndcg@150      | 0.7715     | 0.6486     | 0.6099     | 0.7385     | 0.8252     | 0.8099     | 0.5436     |
| **cosine_ndcg@200**  | **0.7814** | **0.6654** | **0.6294** | **0.7523** | **0.8258** | **0.8104** | **0.5436** |
| cosine_mrr@1         | 0.6476     | 0.1135     | 0.2956     | 0.6796     | 0.7384     | 0.6895     | 0.1784     |
| cosine_mrr@20        | 0.7979     | 0.5532     | 0.5165     | 0.8218     | 0.8055     | 0.7747     | 0.3988     |
| cosine_mrr@50        | 0.7979     | 0.5532     | 0.5166     | 0.8218     | 0.8062     | 0.7753     | 0.3988     |
| cosine_mrr@100       | 0.7979     | 0.5532     | 0.5166     | 0.8218     | 0.8064     | 0.7754     | 0.3988     |
| cosine_mrr@150       | 0.7979     | 0.5532     | 0.5166     | 0.8218     | 0.8064     | 0.7754     | 0.3988     |
| cosine_mrr@200       | 0.7979     | 0.5532     | 0.5166     | 0.8218     | 0.8064     | 0.7754     | 0.3988     |
| cosine_map@1         | 0.6476     | 0.1135     | 0.2956     | 0.6796     | 0.7384     | 0.6895     | 0.1784     |
| cosine_map@20        | 0.5373     | 0.4796     | 0.4226     | 0.5042     | 0.7435     | 0.7199     | 0.3263     |
| cosine_map@50        | 0.524      | 0.4301     | 0.3782     | 0.4834     | 0.7461     | 0.7226     | 0.3263     |
| cosine_map@100       | 0.5562     | 0.4339     | 0.382      | 0.5133     | 0.7465     | 0.7231     | 0.3263     |
| cosine_map@150       | 0.5655     | 0.4486     | 0.3969     | 0.5218     | 0.7467     | 0.7231     | 0.3263     |
| cosine_map@200       | 0.5683     | 0.4553     | 0.4045     | 0.5259     | 0.7467     | 0.7232     | 0.3263     |
| cosine_map@500       | 0.5731     | 0.4679     | 0.4178     | 0.5307     | 0.7468     | 0.7232     | 0.3263     |
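For reference, the cosine_ndcg@k and cosine_mrr@k columns follow the standard ranking-metric definitions. A minimal single-query sketch with binary relevance labels (the toy ranking below is illustrative, not drawn from the evaluations above):

```python
import math

def dcg_at_k(relevances, k):
    # Discounted cumulative gain over the top-k ranked results
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances[:k]))

def ndcg_at_k(relevances, k):
    # DCG normalized by the DCG of an ideally ordered ranking
    ideal = sorted(relevances, reverse=True)
    idcg = dcg_at_k(ideal, k)
    return dcg_at_k(relevances, k) / idcg if idcg > 0 else 0.0

def mrr_at_k(relevances, k):
    # Reciprocal rank of the first relevant result within the top k
    for i, rel in enumerate(relevances[:k]):
        if rel > 0:
            return 1.0 / (i + 1)
    return 0.0

# Binary relevance of the top-4 retrieved documents for one query
ranking = [0, 1, 1, 0]
print(ndcg_at_k(ranking, 4))  # ≈ 0.693
print(mrr_at_k(ranking, 4))   # 0.5
```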

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Datasets
<details><summary>full_en</summary>

#### full_en

* Dataset: full_en
* Size: 28,880 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor                                                                           | positive                                                                         |
  |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                           |
  | details | <ul><li>min: 3 tokens</li><li>mean: 5.68 tokens</li><li>max: 11 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 5.76 tokens</li><li>max: 12 tokens</li></ul> |
* Samples:
  | anchor                                   | positive                                 |
  |:-----------------------------------------|:-----------------------------------------|
  | <code>air commodore</code>               | <code>flight lieutenant</code>           |
  | <code>command and control officer</code> | <code>flight officer</code>              |
  | <code>air commodore</code>               | <code>command and control officer</code> |
* Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
    (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
  ```
</details>
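GISTEmbedLoss uses the listed guide model to vet in-batch negatives: with margin_strategy 'absolute' and margin 0.0, a candidate is discarded as a likely false negative whenever the guide scores it at least as similar to the anchor as the labelled positive. A simplified sketch of just that masking rule (toy similarity scores; the real loss operates on torch tensors and feeds the surviving negatives into a temperature-scaled InfoNCE objective):

```python
def keep_negatives(guide_sims, positive_idx, margin=0.0):
    # Drop any candidate the guide model rates >= (anchor-positive sim - margin):
    # such candidates are likely unlabelled positives, not true negatives.
    threshold = guide_sims[positive_idx] - margin
    return [i for i, s in enumerate(guide_sims)
            if i != positive_idx and s < threshold]

# Guide-model similarities between one anchor and four in-batch candidates;
# index 0 is the labelled positive.
guide_sims = [0.9, 0.95, 0.4, 0.2]
print(keep_negatives(guide_sims, positive_idx=0))  # [2, 3]
```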
<details><summary>full_de</summary>

#### full_de

* Dataset: full_de
* Size: 23,023 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor                                                                           | positive                                                                         |
  |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                           |
  | details | <ul><li>min: 3 tokens</li><li>mean: 7.99 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.19 tokens</li><li>max: 30 tokens</li></ul> |
* Samples:
  | anchor                            | positive                                             |
  |:----------------------------------|:-----------------------------------------------------|
  | <code>Staffelkommandantin</code>  | <code>Kommodore</code>                               |
  | <code>Luftwaffenoffizierin</code> | <code>Luftwaffenoffizier/Luftwaffenoffizierin</code> |
  | <code>Staffelkommandantin</code>  | <code>Luftwaffenoffizierin</code>                    |
* Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
    (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
  ```
</details>
<details><summary>full_es</summary>

#### full_es

* Dataset: full_es
* Size: 20,724 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor                                                                           | positive                                                                         |
  |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                           |
  | details | <ul><li>min: 3 tokens</li><li>mean: 9.13 tokens</li><li>max: 32 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.84 tokens</li><li>max: 32 tokens</li></ul> |
* Samples:
  | anchor                              | positive                                   |
  |:------------------------------------|:-------------------------------------------|
  | <code>jefe de escuadrón</code>      | <code>instructor</code>                    |
  | <code>comandante de aeronave</code> | <code>instructor de simulador</code>       |
  | <code>instructor</code>             | <code>oficial del Ejército del Aire</code> |
* Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
    (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
  ```
</details>
<details><summary>full_zh</summary>

#### full_zh

* Dataset: full_zh
* Size: 30,401 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
1153
+ | | anchor | positive |
1154
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1155
+ | type | string | string |
1156
+ | details | <ul><li>min: 5 tokens</li><li>mean: 7.15 tokens</li><li>max: 14 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 7.46 tokens</li><li>max: 21 tokens</li></ul> |
1157
+ * Samples:
1158
+ | anchor | positive |
1159
+ |:------------------|:---------------------|
1160
+ | <code>技术总监</code> | <code>技术和运营总监</code> |
1161
+ | <code>技术总监</code> | <code>技术主管</code> |
1162
+ | <code>技术总监</code> | <code>技术艺术总监</code> |
1163
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1164
+ ```json
1165
+ {'guide': SentenceTransformer(
1166
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1167
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1168
+ (2): Normalize()
1169
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1170
+ ```
1171
+ </details>
1172
+ <details><summary>mix</summary>
1173
+
1174
+ #### mix
1175
+
1176
+ * Dataset: mix
1177
+ * Size: 21,760 training samples
1178
+ * Columns: <code>anchor</code> and <code>positive</code>
1179
+ * Approximate statistics based on the first 1000 samples:
1180
+ | | anchor | positive |
1181
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1182
+ | type | string | string |
1183
+ | details | <ul><li>min: 2 tokens</li><li>mean: 6.71 tokens</li><li>max: 19 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 7.69 tokens</li><li>max: 19 tokens</li></ul> |
1184
+ * Samples:
1185
+ | anchor | positive |
1186
+ |:------------------------------------------|:----------------------------------------------------------------|
1187
+ | <code>technical manager</code> | <code>Technischer Direktor für Bühne, Film und Fernsehen</code> |
1188
+ | <code>head of technical</code> | <code>directora técnica</code> |
1189
+ | <code>head of technical department</code> | <code>技术艺术总监</code> |
1190
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1191
+ ```json
1192
+ {'guide': SentenceTransformer(
1193
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1194
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1195
+ (2): Normalize()
1196
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1197
+ ```
1198
+ </details>
1199
+
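The GISTEmbedLoss configurations above all use `margin_strategy: 'absolute'` with `margin: 0.0`, meaning the guide model filters in-batch negatives that it scores at least as similar to the anchor as the positive is. A minimal from-scratch sketch of that filtering rule (an illustration of the idea, not the sentence-transformers internals; `mask_false_negatives` and its toy scores are hypothetical):

```python
# Hedged sketch of GIST-style guided negative filtering: the guide model's
# similarity scores decide which in-batch negatives are kept for the
# contrastive loss and which are masked out as likely false negatives.
def mask_false_negatives(guide_ap_sim, guide_an_sims, margin=0.0):
    """With margin_strategy='absolute', a candidate negative n is kept only
    when guide_sim(anchor, n) <= guide_sim(anchor, positive) - margin."""
    return [sim <= guide_ap_sim - margin for sim in guide_an_sims]

# Toy guide similarities for one anchor whose positive scores 0.8:
# the candidate scoring 0.9 is masked as a likely false negative.
keep = mask_false_negatives(0.8, [0.9, 0.5, 0.8, 0.2])
print(keep)  # [False, True, True, True]
```

With `margin=0.0` this reduces to the original GISTEmbed rule: any negative the guide ranks above the positive is excluded from the denominator of the InfoNCE-style loss.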
1200
+ ### Training Hyperparameters
1201
+ #### Non-Default Hyperparameters
1202
+
1203
+ - `eval_strategy`: steps
1204
+ - `per_device_train_batch_size`: 64
1205
+ - `per_device_eval_batch_size`: 128
1206
+ - `gradient_accumulation_steps`: 2
1207
+ - `num_train_epochs`: 5
1208
+ - `warmup_ratio`: 0.05
1209
+ - `log_on_each_node`: False
1210
+ - `fp16`: True
1211
+ - `dataloader_num_workers`: 4
1212
+ - `ddp_find_unused_parameters`: True
1213
+ - `batch_sampler`: no_duplicates
1214
+
1215
+ #### All Hyperparameters
1216
+ <details><summary>Click to expand</summary>
1217
+
1218
+ - `overwrite_output_dir`: False
1219
+ - `do_predict`: False
1220
+ - `eval_strategy`: steps
1221
+ - `prediction_loss_only`: True
1222
+ - `per_device_train_batch_size`: 64
1223
+ - `per_device_eval_batch_size`: 128
1224
+ - `per_gpu_train_batch_size`: None
1225
+ - `per_gpu_eval_batch_size`: None
1226
+ - `gradient_accumulation_steps`: 2
1227
+ - `eval_accumulation_steps`: None
1228
+ - `torch_empty_cache_steps`: None
1229
+ - `learning_rate`: 5e-05
1230
+ - `weight_decay`: 0.0
1231
+ - `adam_beta1`: 0.9
1232
+ - `adam_beta2`: 0.999
1233
+ - `adam_epsilon`: 1e-08
1234
+ - `max_grad_norm`: 1.0
1235
+ - `num_train_epochs`: 5
1236
+ - `max_steps`: -1
1237
+ - `lr_scheduler_type`: linear
1238
+ - `lr_scheduler_kwargs`: {}
1239
+ - `warmup_ratio`: 0.05
1240
+ - `warmup_steps`: 0
1241
+ - `log_level`: passive
1242
+ - `log_level_replica`: warning
1243
+ - `log_on_each_node`: False
1244
+ - `logging_nan_inf_filter`: True
1245
+ - `save_safetensors`: True
1246
+ - `save_on_each_node`: False
1247
+ - `save_only_model`: False
1248
+ - `restore_callback_states_from_checkpoint`: False
1249
+ - `no_cuda`: False
1250
+ - `use_cpu`: False
1251
+ - `use_mps_device`: False
1252
+ - `seed`: 42
1253
+ - `data_seed`: None
1254
+ - `jit_mode_eval`: False
1255
+ - `use_ipex`: False
1256
+ - `bf16`: False
1257
+ - `fp16`: True
1258
+ - `fp16_opt_level`: O1
1259
+ - `half_precision_backend`: auto
1260
+ - `bf16_full_eval`: False
1261
+ - `fp16_full_eval`: False
1262
+ - `tf32`: None
1263
+ - `local_rank`: 0
1264
+ - `ddp_backend`: None
1265
+ - `tpu_num_cores`: None
1266
+ - `tpu_metrics_debug`: False
1267
+ - `debug`: []
1268
+ - `dataloader_drop_last`: True
1269
+ - `dataloader_num_workers`: 4
1270
+ - `dataloader_prefetch_factor`: None
1271
+ - `past_index`: -1
1272
+ - `disable_tqdm`: False
1273
+ - `remove_unused_columns`: True
1274
+ - `label_names`: None
1275
+ - `load_best_model_at_end`: False
1276
+ - `ignore_data_skip`: False
1277
+ - `fsdp`: []
1278
+ - `fsdp_min_num_params`: 0
1279
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
1280
+ - `tp_size`: 0
1281
+ - `fsdp_transformer_layer_cls_to_wrap`: None
1282
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
1283
+ - `deepspeed`: None
1284
+ - `label_smoothing_factor`: 0.0
1285
+ - `optim`: adamw_torch
1286
+ - `optim_args`: None
1287
+ - `adafactor`: False
1288
+ - `group_by_length`: False
1289
+ - `length_column_name`: length
1290
+ - `ddp_find_unused_parameters`: True
1291
+ - `ddp_bucket_cap_mb`: None
1292
+ - `ddp_broadcast_buffers`: False
1293
+ - `dataloader_pin_memory`: True
1294
+ - `dataloader_persistent_workers`: False
1295
+ - `skip_memory_metrics`: True
1296
+ - `use_legacy_prediction_loop`: False
1297
+ - `push_to_hub`: False
1298
+ - `resume_from_checkpoint`: None
1299
+ - `hub_model_id`: None
1300
+ - `hub_strategy`: every_save
1301
+ - `hub_private_repo`: None
1302
+ - `hub_always_push`: False
1303
+ - `gradient_checkpointing`: False
1304
+ - `gradient_checkpointing_kwargs`: None
1305
+ - `include_inputs_for_metrics`: False
1306
+ - `include_for_metrics`: []
1307
+ - `eval_do_concat_batches`: True
1308
+ - `fp16_backend`: auto
1309
+ - `push_to_hub_model_id`: None
1310
+ - `push_to_hub_organization`: None
1311
+ - `mp_parameters`:
1312
+ - `auto_find_batch_size`: False
1313
+ - `full_determinism`: False
1314
+ - `torchdynamo`: None
1315
+ - `ray_scope`: last
1316
+ - `ddp_timeout`: 1800
1317
+ - `torch_compile`: False
1318
+ - `torch_compile_backend`: None
1319
+ - `torch_compile_mode`: None
1320
+ - `include_tokens_per_second`: False
1321
+ - `include_num_input_tokens_seen`: False
1322
+ - `neftune_noise_alpha`: None
1323
+ - `optim_target_modules`: None
1324
+ - `batch_eval_metrics`: False
1325
+ - `eval_on_start`: False
1326
+ - `use_liger_kernel`: False
1327
+ - `eval_use_gather_object`: False
1328
+ - `average_tokens_across_devices`: False
1329
+ - `prompts`: None
1330
+ - `batch_sampler`: no_duplicates
1331
+ - `multi_dataset_batch_sampler`: proportional
1332
+
1333
+ </details>
1334
+
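The `lr_scheduler_type: linear` and `warmup_ratio: 0.05` settings above imply a learning-rate multiplier that ramps up over the first 5% of optimizer steps and then decays linearly to zero. A minimal sketch of that schedule (mirroring the shape of transformers' linear schedule with warmup, not importing it; the step count is a rough assumption based on the training logs):

```python
# Hedged sketch of a linear LR schedule with warmup: returns the factor
# applied to the base learning rate (5e-05 here) at a given optimizer step.
def lr_multiplier(step, total_steps, warmup_ratio=0.05):
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return step / max(1, warmup_steps)          # linear ramp-up
    return max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

total = 4870  # roughly the optimizer steps implied by the logs below (assumption)
print(lr_multiplier(0, total), lr_multiplier(243, total), lr_multiplier(total, total))
# 0.0 1.0 0.0
```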
1335
+ ### Training Logs
1336
+ | Epoch | Step | Training Loss | full_en_cosine_ndcg@200 | full_es_cosine_ndcg@200 | full_de_cosine_ndcg@200 | full_zh_cosine_ndcg@200 | mix_es_cosine_ndcg@200 | mix_de_cosine_ndcg@200 | mix_zh_cosine_ndcg@200 |
1337
+ |:------:|:----:|:-------------:|:-----------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:----------------------:|:----------------------:|:----------------------:|
1338
+ | -1 | -1 | - | 0.6856 | 0.5207 | 0.4655 | 0.6713 | 0.6224 | 0.5604 | 0.5548 |
1339
+ | 0.0010 | 1 | 5.3354 | - | - | - | - | - | - | - |
1340
+ | 0.1027 | 100 | 2.665 | - | - | - | - | - | - | - |
1341
+ | 0.2053 | 200 | 1.3375 | 0.7691 | 0.6530 | 0.6298 | 0.7517 | 0.7513 | 0.7393 | 0.5490 |
1342
+ | 0.3080 | 300 | 1.1101 | - | - | - | - | - | - | - |
1343
+ | 0.4107 | 400 | 0.9453 | 0.7802 | 0.6643 | 0.6246 | 0.7531 | 0.7610 | 0.7441 | 0.5493 |
1344
+ | 0.5133 | 500 | 0.9202 | - | - | - | - | - | - | - |
1345
+ | 0.6160 | 600 | 0.7887 | 0.7741 | 0.6549 | 0.6171 | 0.7542 | 0.7672 | 0.7540 | 0.5482 |
1346
+ | 0.7187 | 700 | 0.7604 | - | - | - | - | - | - | - |
1347
+ | 0.8214 | 800 | 0.7219 | 0.7846 | 0.6674 | 0.6244 | 0.7648 | 0.7741 | 0.7592 | 0.5497 |
1348
+ | 0.9240 | 900 | 0.6965 | - | - | - | - | - | - | - |
1349
+ | 1.0267 | 1000 | 0.6253 | 0.7646 | 0.6391 | 0.6122 | 0.7503 | 0.7825 | 0.7704 | 0.5463 |
1350
+ | 1.1294 | 1100 | 0.4737 | - | - | - | - | - | - | - |
1351
+ | 1.2320 | 1200 | 0.5055 | 0.7758 | 0.6582 | 0.6178 | 0.7514 | 0.7857 | 0.7764 | 0.5501 |
1352
+ | 1.3347 | 1300 | 0.5042 | - | - | - | - | - | - | - |
1353
+ | 1.4374 | 1400 | 0.5073 | 0.7613 | 0.6578 | 0.6178 | 0.7505 | 0.7829 | 0.7762 | 0.5452 |
1354
+ | 1.5400 | 1500 | 0.4975 | - | - | - | - | - | - | - |
1355
+ | 1.6427 | 1600 | 0.5242 | 0.7736 | 0.6673 | 0.6279 | 0.7555 | 0.7940 | 0.7859 | 0.5477 |
1356
+ | 1.7454 | 1700 | 0.4713 | - | - | - | - | - | - | - |
1357
+ | 1.8480 | 1800 | 0.4814 | 0.7845 | 0.6733 | 0.6285 | 0.7642 | 0.7992 | 0.7904 | 0.5449 |
1358
+ | 1.9507 | 1900 | 0.4526 | - | - | - | - | - | - | - |
1359
+ | 2.0544 | 2000 | 0.36 | 0.7790 | 0.6639 | 0.6252 | 0.7500 | 0.8032 | 0.7888 | 0.5499 |
1360
+ | 2.1571 | 2100 | 0.3744 | - | - | - | - | - | - | - |
1361
+ | 2.2598 | 2200 | 0.3031 | 0.7787 | 0.6614 | 0.6190 | 0.7537 | 0.7993 | 0.7811 | 0.5476 |
1362
+ | 2.3624 | 2300 | 0.3638 | - | - | - | - | - | - | - |
1363
+ | 2.4651 | 2400 | 0.358 | 0.7798 | 0.6615 | 0.6258 | 0.7497 | 0.8018 | 0.7828 | 0.5481 |
1364
+ | 2.5678 | 2500 | 0.3247 | - | - | - | - | - | - | - |
1365
+ | 2.6704 | 2600 | 0.3247 | 0.7854 | 0.6663 | 0.6248 | 0.7560 | 0.8081 | 0.7835 | 0.5452 |
1366
+ | 2.7731 | 2700 | 0.3263 | - | - | - | - | - | - | - |
1367
+ | 2.8758 | 2800 | 0.3212 | 0.7761 | 0.6681 | 0.6250 | 0.7517 | 0.8121 | 0.7927 | 0.5458 |
1368
+ | 2.9784 | 2900 | 0.3291 | - | - | - | - | - | - | - |
1369
+ | 3.0821 | 3000 | 0.2816 | 0.7727 | 0.6604 | 0.6163 | 0.7370 | 0.8163 | 0.7985 | 0.5473 |
1370
+ | 3.1848 | 3100 | 0.2698 | - | - | - | - | - | - | - |
1371
+ | 3.2875 | 3200 | 0.2657 | 0.7757 | 0.6615 | 0.6247 | 0.7417 | 0.8117 | 0.8004 | 0.5436 |
1372
+ | 3.3901 | 3300 | 0.2724 | - | - | - | - | - | - | - |
1373
+ | 3.4928 | 3400 | 0.2584 | 0.7850 | 0.6583 | 0.6320 | 0.7458 | 0.8120 | 0.7980 | 0.5454 |
1374
+ | 3.5955 | 3500 | 0.2573 | - | - | - | - | - | - | - |
1375
+ | 3.6982 | 3600 | 0.2744 | 0.7796 | 0.6552 | 0.6237 | 0.7409 | 0.8193 | 0.8018 | 0.5466 |
1376
+ | 3.8008 | 3700 | 0.3054 | - | - | - | - | - | - | - |
1377
+ | 3.9035 | 3800 | 0.2727 | 0.7825 | 0.6642 | 0.6293 | 0.7504 | 0.8213 | 0.8058 | 0.5463 |
1378
+ | 4.0062 | 3900 | 0.2353 | - | - | - | - | - | - | - |
1379
+ | 4.1088 | 4000 | 0.2353 | 0.7747 | 0.6628 | 0.6263 | 0.7384 | 0.8239 | 0.8065 | 0.5447 |
1380
+ | 4.2115 | 4100 | 0.2385 | - | - | - | - | - | - | - |
1381
+ | 4.3142 | 4200 | 0.231 | 0.7811 | 0.6608 | 0.6254 | 0.7463 | 0.8226 | 0.8051 | 0.5442 |
1382
+ | 4.4168 | 4300 | 0.2115 | - | - | - | - | - | - | - |
1383
+ | 4.5195 | 4400 | 0.2151 | 0.7815 | 0.6634 | 0.6301 | 0.7489 | 0.8251 | 0.8101 | 0.5450 |
1384
+ | 4.6222 | 4500 | 0.2496 | - | - | - | - | - | - | - |
1385
+ | 4.7248 | 4600 | 0.2146 | 0.7814 | 0.6654 | 0.6294 | 0.7523 | 0.8258 | 0.8104 | 0.5436 |
1386
+
1387
+
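The `cosine_ndcg@200` columns above come from an information-retrieval evaluator; the underlying NDCG@k metric can be sketched from scratch as follows (an illustrative re-implementation, not the sentence-transformers evaluator itself):

```python
import math

# Hedged sketch of NDCG@k: discounted cumulative gain of the ranked results,
# normalized by the gain of the ideal (relevance-sorted) ranking.
def ndcg_at_k(relevances, k):
    """relevances: graded relevance of results in ranked order."""
    dcg = sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances[:k]))
    ideal = sorted(relevances, reverse=True)
    idcg = sum(rel / math.log2(i + 2) for i, rel in enumerate(ideal[:k]))
    return dcg / idcg if idcg > 0 else 0.0

# Relevant hits at ranks 1 and 3, a miss at rank 2:
print(round(ndcg_at_k([1, 0, 1], k=3), 4))  # 0.9197
```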
1388
+ ### Framework Versions
1389
+ - Python: 3.11.11
1390
+ - Sentence Transformers: 4.1.0
1391
+ - Transformers: 4.51.2
1392
+ - PyTorch: 2.6.0+cu124
1393
+ - Accelerate: 1.6.0
1394
+ - Datasets: 3.5.0
1395
+ - Tokenizers: 0.21.1
1396
+
1397
+ ## Citation
1398
+
1399
+ ### BibTeX
1400
+
1401
+ #### Sentence Transformers
1402
+ ```bibtex
1403
+ @inproceedings{reimers-2019-sentence-bert,
1404
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
1405
+ author = "Reimers, Nils and Gurevych, Iryna",
1406
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
1407
+ month = "11",
1408
+ year = "2019",
1409
+ publisher = "Association for Computational Linguistics",
1410
+ url = "https://arxiv.org/abs/1908.10084",
1411
+ }
1412
+ ```
1413
+
1414
+ #### GISTEmbedLoss
1415
+ ```bibtex
1416
+ @misc{solatorio2024gistembed,
1417
+ title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
1418
+ author={Aivin V. Solatorio},
1419
+ year={2024},
1420
+ eprint={2402.16829},
1421
+ archivePrefix={arXiv},
1422
+ primaryClass={cs.LG}
1423
+ }
1424
+ ```
1425
+
1426
+ <!--
1427
+ ## Glossary
1428
+
1429
+ *Clearly define terms in order to be accessible across audiences.*
1430
+ -->
1431
+
1432
+ <!--
1433
+ ## Model Card Authors
1434
+
1435
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
1436
+ -->
1437
+
1438
+ <!--
1439
+ ## Model Card Contact
1440
+
1441
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
1442
+ -->
checkpoint-4600/config.json ADDED
@@ -0,0 +1,27 @@
1
+ {
2
+ "architectures": [
3
+ "XLMRobertaModel"
4
+ ],
5
+ "attention_probs_dropout_prob": 0.1,
6
+ "bos_token_id": 0,
7
+ "classifier_dropout": null,
8
+ "eos_token_id": 2,
9
+ "hidden_act": "gelu",
10
+ "hidden_dropout_prob": 0.1,
11
+ "hidden_size": 1024,
12
+ "initializer_range": 0.02,
13
+ "intermediate_size": 4096,
14
+ "layer_norm_eps": 1e-05,
15
+ "max_position_embeddings": 8194,
16
+ "model_type": "xlm-roberta",
17
+ "num_attention_heads": 16,
18
+ "num_hidden_layers": 24,
19
+ "output_past": true,
20
+ "pad_token_id": 1,
21
+ "position_embedding_type": "absolute",
22
+ "torch_dtype": "float32",
23
+ "transformers_version": "4.51.2",
24
+ "type_vocab_size": 1,
25
+ "use_cache": true,
26
+ "vocab_size": 250002
27
+ }
checkpoint-4600/config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
1
+ {
2
+ "__version__": {
3
+ "sentence_transformers": "4.1.0",
4
+ "transformers": "4.51.2",
5
+ "pytorch": "2.6.0+cu124"
6
+ },
7
+ "prompts": {},
8
+ "default_prompt_name": null,
9
+ "similarity_fn_name": "cosine"
10
+ }
checkpoint-4600/modules.json ADDED
@@ -0,0 +1,20 @@
1
+ [
2
+ {
3
+ "idx": 0,
4
+ "name": "0",
5
+ "path": "",
6
+ "type": "sentence_transformers.models.Transformer"
7
+ },
8
+ {
9
+ "idx": 1,
10
+ "name": "1",
11
+ "path": "1_Pooling",
12
+ "type": "sentence_transformers.models.Pooling"
13
+ },
14
+ {
15
+ "idx": 2,
16
+ "name": "2",
17
+ "path": "2_Normalize",
18
+ "type": "sentence_transformers.models.Normalize"
19
+ }
20
+ ]
checkpoint-4600/rng_state.pth ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:be1ba9989926d2e34c05f8abef9d3c2a3bfbcdea2edd80772c983050504a7aef
3
+ size 15958
checkpoint-4600/scaler.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:8975976e3c794ee1e146996c5cd27f6ccb9342013cd7b0ff6eb9c3ecb20b77fa
3
+ size 988
checkpoint-4600/scheduler.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:61ae51005e9932cc38f3324bfdb2fe2816e54cc6abdbe372036027168b08f812
3
+ size 1064
checkpoint-4600/sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
1
+ {
2
+ "max_seq_length": 512,
3
+ "do_lower_case": false
4
+ }
checkpoint-4600/tokenizer.json ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:d9a6af42442a3e3e9f05f618eae0bb2d98ca4f6a6406cb80ef7a4fa865204d61
3
+ size 17083052
checkpoint-4600/tokenizer_config.json ADDED
@@ -0,0 +1,56 @@
1
+ {
2
+ "added_tokens_decoder": {
3
+ "0": {
4
+ "content": "<s>",
5
+ "lstrip": false,
6
+ "normalized": false,
7
+ "rstrip": false,
8
+ "single_word": false,
9
+ "special": true
10
+ },
11
+ "1": {
12
+ "content": "<pad>",
13
+ "lstrip": false,
14
+ "normalized": false,
15
+ "rstrip": false,
16
+ "single_word": false,
17
+ "special": true
18
+ },
19
+ "2": {
20
+ "content": "</s>",
21
+ "lstrip": false,
22
+ "normalized": false,
23
+ "rstrip": false,
24
+ "single_word": false,
25
+ "special": true
26
+ },
27
+ "3": {
28
+ "content": "<unk>",
29
+ "lstrip": false,
30
+ "normalized": false,
31
+ "rstrip": false,
32
+ "single_word": false,
33
+ "special": true
34
+ },
35
+ "250001": {
36
+ "content": "<mask>",
37
+ "lstrip": true,
38
+ "normalized": false,
39
+ "rstrip": false,
40
+ "single_word": false,
41
+ "special": true
42
+ }
43
+ },
44
+ "bos_token": "<s>",
45
+ "clean_up_tokenization_spaces": true,
46
+ "cls_token": "<s>",
47
+ "eos_token": "</s>",
48
+ "extra_special_tokens": {},
49
+ "mask_token": "<mask>",
50
+ "model_max_length": 8192,
51
+ "pad_token": "<pad>",
52
+ "sep_token": "</s>",
53
+ "sp_model_kwargs": {},
54
+ "tokenizer_class": "XLMRobertaTokenizer",
55
+ "unk_token": "<unk>"
56
+ }
checkpoint-4600/trainer_state.json ADDED
The diff for this file is too large to render. See raw diff
 
checkpoint-4600/training_args.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:1a948c4f5667f6700da28d0d70c0c6f024b018ee933ba85d5cc9de9d626dadca
3
+ size 5624
checkpoint-4800/README.md ADDED
@@ -0,0 +1,1444 @@
1
+ ---
2
+ tags:
3
+ - sentence-transformers
4
+ - sentence-similarity
5
+ - feature-extraction
6
+ - generated_from_trainer
7
+ - dataset_size:124788
8
+ - loss:GISTEmbedLoss
9
+ base_model: BAAI/bge-m3
10
+ widget:
11
+ - source_sentence: 其他机械、设备和有形货物租赁服务代表
12
+ sentences:
13
+ - 其他机械和设备租赁服务工作人员
14
+ - 电子和电信设备及零部件物流经理
15
+ - 工业主厨
16
+ - source_sentence: 公交车司机
17
+ sentences:
18
+ - 表演灯光设计师
19
+ - 乙烯基地板安装工
20
+ - 国际巴士司机
21
+ - source_sentence: online communication manager
22
+ sentences:
23
+ - trades union official
24
+ - social media manager
25
+ - budget manager
26
+ - source_sentence: Projektmanagerin
27
+ sentences:
28
+ - Projektmanager/Projektmanagerin
29
+ - Category-Manager
30
+ - Infanterist
31
+ - source_sentence: Volksvertreter
32
+ sentences:
33
+ - Parlamentarier
34
+ - Oberbürgermeister
35
+ - Konsul
36
+ pipeline_tag: sentence-similarity
37
+ library_name: sentence-transformers
38
+ metrics:
39
+ - cosine_accuracy@1
40
+ - cosine_accuracy@20
41
+ - cosine_accuracy@50
42
+ - cosine_accuracy@100
43
+ - cosine_accuracy@150
44
+ - cosine_accuracy@200
45
+ - cosine_precision@1
46
+ - cosine_precision@20
47
+ - cosine_precision@50
48
+ - cosine_precision@100
49
+ - cosine_precision@150
50
+ - cosine_precision@200
51
+ - cosine_recall@1
52
+ - cosine_recall@20
53
+ - cosine_recall@50
54
+ - cosine_recall@100
55
+ - cosine_recall@150
56
+ - cosine_recall@200
57
+ - cosine_ndcg@1
58
+ - cosine_ndcg@20
59
+ - cosine_ndcg@50
60
+ - cosine_ndcg@100
61
+ - cosine_ndcg@150
62
+ - cosine_ndcg@200
63
+ - cosine_mrr@1
64
+ - cosine_mrr@20
65
+ - cosine_mrr@50
66
+ - cosine_mrr@100
67
+ - cosine_mrr@150
68
+ - cosine_mrr@200
69
+ - cosine_map@1
70
+ - cosine_map@20
71
+ - cosine_map@50
72
+ - cosine_map@100
73
+ - cosine_map@150
74
+ - cosine_map@200
75
+ - cosine_map@500
76
+ model-index:
77
+ - name: SentenceTransformer based on BAAI/bge-m3
78
+ results:
79
+ - task:
80
+ type: information-retrieval
81
+ name: Information Retrieval
82
+ dataset:
83
+ name: full en
84
+ type: full_en
85
+ metrics:
86
+ - type: cosine_accuracy@1
87
+ value: 0.6476190476190476
88
+ name: Cosine Accuracy@1
89
+ - type: cosine_accuracy@20
90
+ value: 0.9904761904761905
91
+ name: Cosine Accuracy@20
92
+ - type: cosine_accuracy@50
93
+ value: 0.9904761904761905
94
+ name: Cosine Accuracy@50
95
+ - type: cosine_accuracy@100
96
+ value: 0.9904761904761905
97
+ name: Cosine Accuracy@100
98
+ - type: cosine_accuracy@150
99
+ value: 0.9904761904761905
100
+ name: Cosine Accuracy@150
101
+ - type: cosine_accuracy@200
102
+ value: 0.9904761904761905
103
+ name: Cosine Accuracy@200
104
+ - type: cosine_precision@1
105
+ value: 0.6476190476190476
106
+ name: Cosine Precision@1
107
+ - type: cosine_precision@20
108
+ value: 0.5061904761904762
109
+ name: Cosine Precision@20
110
+ - type: cosine_precision@50
111
+ value: 0.30647619047619057
112
+ name: Cosine Precision@50
113
+ - type: cosine_precision@100
114
+ value: 0.1858095238095238
115
+ name: Cosine Precision@100
116
+ - type: cosine_precision@150
117
+ value: 0.13250793650793652
118
+ name: Cosine Precision@150
119
+ - type: cosine_precision@200
120
+ value: 0.10247619047619047
121
+ name: Cosine Precision@200
122
+ - type: cosine_recall@1
123
+ value: 0.06690172806447445
124
+ name: Cosine Recall@1
125
+ - type: cosine_recall@20
126
+ value: 0.5391510592522911
127
+ name: Cosine Recall@20
128
+ - type: cosine_recall@50
129
+ value: 0.7199711948587544
130
+ name: Cosine Recall@50
131
+ - type: cosine_recall@100
132
+ value: 0.8253770621157605
133
+ name: Cosine Recall@100
134
+ - type: cosine_recall@150
135
+ value: 0.8719997123512196
136
+ name: Cosine Recall@150
137
+ - type: cosine_recall@200
138
+ value: 0.9006382758109558
139
+ name: Cosine Recall@200
140
+ - type: cosine_ndcg@1
141
+ value: 0.6476190476190476
142
+ name: Cosine Ndcg@1
143
+ - type: cosine_ndcg@20
144
+ value: 0.6822066814233797
145
+ name: Cosine Ndcg@20
146
+ - type: cosine_ndcg@50
147
+ value: 0.6975329548006446
148
+ name: Cosine Ndcg@50
149
+ - type: cosine_ndcg@100
150
+ value: 0.7519637922809941
151
+ name: Cosine Ndcg@100
152
+ - type: cosine_ndcg@150
153
+ value: 0.7724946802449859
154
+ name: Cosine Ndcg@150
155
+ - type: cosine_ndcg@200
156
+ value: 0.7827357067553371
157
+ name: Cosine Ndcg@200
158
+ - type: cosine_mrr@1
159
+ value: 0.6476190476190476
160
+ name: Cosine Mrr@1
161
+ - type: cosine_mrr@20
162
+ value: 0.7999999999999998
163
+ name: Cosine Mrr@20
164
+ - type: cosine_mrr@50
165
+ value: 0.7999999999999998
166
+ name: Cosine Mrr@50
167
+ - type: cosine_mrr@100
168
+ value: 0.7999999999999998
169
+ name: Cosine Mrr@100
170
+ - type: cosine_mrr@150
171
+ value: 0.7999999999999998
172
+ name: Cosine Mrr@150
173
+ - type: cosine_mrr@200
174
+ value: 0.7999999999999998
175
+ name: Cosine Mrr@200
176
+ - type: cosine_map@1
177
+ value: 0.6476190476190476
178
+ name: Cosine Map@1
179
+ - type: cosine_map@20
180
+ value: 0.5391784054866918
181
+ name: Cosine Map@20
182
+ - type: cosine_map@50
183
+ value: 0.5258287715484311
184
+ name: Cosine Map@50
185
+ - type: cosine_map@100
186
+ value: 0.5580109313638075
187
+ name: Cosine Map@100
188
+ - type: cosine_map@150
189
+ value: 0.5665715227835532
190
+ name: Cosine Map@150
191
+ - type: cosine_map@200
192
+ value: 0.569529009182472
193
+ name: Cosine Map@200
194
+ - type: cosine_map@500
195
+ value: 0.5743595458034346
196
+ name: Cosine Map@500
197
+ - task:
198
+ type: information-retrieval
199
+ name: Information Retrieval
200
+ dataset:
201
+ name: full es
202
+ type: full_es
203
+ metrics:
204
+ - type: cosine_accuracy@1
205
+ value: 0.11351351351351352
206
+ name: Cosine Accuracy@1
207
+ - type: cosine_accuracy@20
208
+ value: 1.0
209
+ name: Cosine Accuracy@20
210
+ - type: cosine_accuracy@50
211
+ value: 1.0
212
+ name: Cosine Accuracy@50
213
+ - type: cosine_accuracy@100
214
+ value: 1.0
215
+ name: Cosine Accuracy@100
216
+ - type: cosine_accuracy@150
217
+ value: 1.0
218
+ name: Cosine Accuracy@150
219
+ - type: cosine_accuracy@200
220
+ value: 1.0
221
+ name: Cosine Accuracy@200
222
+ - type: cosine_precision@1
223
+ value: 0.11351351351351352
224
+ name: Cosine Precision@1
225
+ - type: cosine_precision@20
226
+ value: 0.5667567567567567
227
+ name: Cosine Precision@20
228
+ - type: cosine_precision@50
229
+ value: 0.3902702702702703
230
+ name: Cosine Precision@50
231
+ - type: cosine_precision@100
232
+ value: 0.25254054054054054
233
+ name: Cosine Precision@100
234
+ - type: cosine_precision@150
235
+ value: 0.19005405405405407
236
+ name: Cosine Precision@150
237
+ - type: cosine_precision@200
238
+ value: 0.1507837837837838
239
+ name: Cosine Precision@200
240
+ - type: cosine_recall@1
241
+ value: 0.0035155918996302815
242
+ name: Cosine Recall@1
243
+ - type: cosine_recall@20
244
+ value: 0.37958552840441906
245
+ name: Cosine Recall@20
246
+ - type: cosine_recall@50
247
+ value: 0.5635730197468752
248
+ name: Cosine Recall@50
249
+ - type: cosine_recall@100
250
+ value: 0.672698242387141
251
+ name: Cosine Recall@100
252
+ - type: cosine_recall@150
253
+ value: 0.7360036980055802
254
+ name: Cosine Recall@150
255
+ - type: cosine_recall@200
256
+ value: 0.7697561816436992
257
+ name: Cosine Recall@200
258
+ - type: cosine_ndcg@1
259
+ value: 0.11351351351351352
260
+ name: Cosine Ndcg@1
261
+ - type: cosine_ndcg@20
262
+ value: 0.6136401766234348
263
+ name: Cosine Ndcg@20
264
+ - type: cosine_ndcg@50
265
+ value: 0.5908459924766464
266
+ name: Cosine Ndcg@50
267
+ - type: cosine_ndcg@100
268
+ value: 0.6168063266629416
269
+ name: Cosine Ndcg@100
270
+ - type: cosine_ndcg@150
271
+ value: 0.6488575731321932
272
+ name: Cosine Ndcg@150
273
+ - type: cosine_ndcg@200
274
+ value: 0.665316090087272
275
+ name: Cosine Ndcg@200
276
+ - type: cosine_mrr@1
277
+ value: 0.11351351351351352
278
+ name: Cosine Mrr@1
279
+ - type: cosine_mrr@20
280
+ value: 0.5536036036036036
281
+ name: Cosine Mrr@20
282
+ - type: cosine_mrr@50
283
+ value: 0.5536036036036036
284
+ name: Cosine Mrr@50
285
+ - type: cosine_mrr@100
286
+ value: 0.5536036036036036
287
+ name: Cosine Mrr@100
288
+ - type: cosine_mrr@150
289
+ value: 0.5536036036036036
290
+ name: Cosine Mrr@150
291
+ - type: cosine_mrr@200
292
+ value: 0.5536036036036036
293
+ name: Cosine Mrr@200
294
+ - type: cosine_map@1
295
+ value: 0.11351351351351352
296
+ name: Cosine Map@1
297
+ - type: cosine_map@20
298
+ value: 0.48095830339282386
299
+ name: Cosine Map@20
300
+ - type: cosine_map@50
301
+ value: 0.43038606337879926
302
+ name: Cosine Map@50
303
+ - type: cosine_map@100
304
+ value: 0.4335284717646407
305
+ name: Cosine Map@100
306
+ - type: cosine_map@150
307
+ value: 0.44851036812148526
308
+ name: Cosine Map@150
309
+ - type: cosine_map@200
310
+ value: 0.4550924585301385
311
+ name: Cosine Map@200
312
+ - type: cosine_map@500
313
+ value: 0.4677023132311536
314
+ name: Cosine Map@500
315
+ - task:
316
+ type: information-retrieval
317
+ name: Information Retrieval
318
+ dataset:
319
+ name: full de
320
+ type: full_de
321
+ metrics:
322
+ - type: cosine_accuracy@1
323
+ value: 0.2955665024630542
324
+ name: Cosine Accuracy@1
325
+ - type: cosine_accuracy@20
326
+ value: 0.9852216748768473
327
+ name: Cosine Accuracy@20
328
+ - type: cosine_accuracy@50
329
+ value: 0.9901477832512315
330
+ name: Cosine Accuracy@50
331
+ - type: cosine_accuracy@100
332
+ value: 0.9901477832512315
333
+ name: Cosine Accuracy@100
334
+ - type: cosine_accuracy@150
335
+ value: 0.9901477832512315
336
+ name: Cosine Accuracy@150
337
+ - type: cosine_accuracy@200
338
+ value: 0.9901477832512315
339
+ name: Cosine Accuracy@200
340
+ - type: cosine_precision@1
341
+ value: 0.2955665024630542
342
+ name: Cosine Precision@1
343
+ - type: cosine_precision@20
344
+ value: 0.5403940886699506
345
+ name: Cosine Precision@20
346
+ - type: cosine_precision@50
347
+ value: 0.38275862068965516
348
+ name: Cosine Precision@50
349
+ - type: cosine_precision@100
350
+ value: 0.2503448275862069
351
+ name: Cosine Precision@100
352
+ - type: cosine_precision@150
353
+ value: 0.187816091954023
354
+ name: Cosine Precision@150
355
+ - type: cosine_precision@200
356
+ value: 0.15027093596059116
357
+ name: Cosine Precision@200
358
+ - type: cosine_recall@1
359
+ value: 0.01108543831680986
360
+ name: Cosine Recall@1
361
+ - type: cosine_recall@20
362
+ value: 0.3432684453555553
363
+ name: Cosine Recall@20
364
+ - type: cosine_recall@50
365
+ value: 0.5339871522541048
366
+ name: Cosine Recall@50
367
+ - type: cosine_recall@100
368
+ value: 0.6498636280219438
369
+ name: Cosine Recall@100
370
+ - type: cosine_recall@150
371
+ value: 0.7100921836539074
372
+ name: Cosine Recall@150
373
+ - type: cosine_recall@200
374
+ value: 0.7513351913056898
375
+ name: Cosine Recall@200
376
+ - type: cosine_ndcg@1
377
+ value: 0.2955665024630542
378
+ name: Cosine Ndcg@1
379
+ - type: cosine_ndcg@20
380
+ value: 0.5647628262992046
381
+ name: Cosine Ndcg@20
382
+ - type: cosine_ndcg@50
383
+ value: 0.5522057083055792
384
+ name: Cosine Ndcg@50
385
+ - type: cosine_ndcg@100
386
+ value: 0.5796033728499559
387
+ name: Cosine Ndcg@100
388
+ - type: cosine_ndcg@150
389
+ value: 0.6111851705889818
390
+ name: Cosine Ndcg@150
391
+ - type: cosine_ndcg@200
392
+ value: 0.6309313367878393
393
+ name: Cosine Ndcg@200
394
+ - type: cosine_mrr@1
395
+ value: 0.2955665024630542
396
+ name: Cosine Mrr@1
397
+ - type: cosine_mrr@20
398
+ value: 0.5164425017655958
399
+ name: Cosine Mrr@20
400
+ - type: cosine_mrr@50
401
+ value: 0.516559790060224
402
+ name: Cosine Mrr@50
403
+ - type: cosine_mrr@100
404
+ value: 0.516559790060224
405
+ name: Cosine Mrr@100
406
+ - type: cosine_mrr@150
407
+ value: 0.516559790060224
408
+ name: Cosine Mrr@150
409
+ - type: cosine_mrr@200
410
+ value: 0.516559790060224
411
+ name: Cosine Mrr@200
412
+ - type: cosine_map@1
413
+ value: 0.2955665024630542
414
+ name: Cosine Map@1
415
+ - type: cosine_map@20
416
+ value: 0.4221760589983628
417
+ name: Cosine Map@20
418
+ - type: cosine_map@50
419
+ value: 0.37913413777890953
420
+ name: Cosine Map@50
421
+ - type: cosine_map@100
422
+ value: 0.3829298798486122
423
+ name: Cosine Map@100
424
+ - type: cosine_map@150
425
+ value: 0.39811624371681004
426
+ name: Cosine Map@150
427
+ - type: cosine_map@200
428
+ value: 0.40559711033541546
429
+ name: Cosine Map@200
430
+ - type: cosine_map@500
431
+ value: 0.4188841643667456
432
+ name: Cosine Map@500
433
+ - task:
434
+ type: information-retrieval
435
+ name: Information Retrieval
436
+ dataset:
437
+ name: full zh
438
+ type: full_zh
439
+ metrics:
440
+ - type: cosine_accuracy@1
441
+ value: 0.6796116504854369
442
+ name: Cosine Accuracy@1
443
+ - type: cosine_accuracy@20
444
+ value: 0.9902912621359223
445
+ name: Cosine Accuracy@20
446
+ - type: cosine_accuracy@50
447
+ value: 0.9902912621359223
448
+ name: Cosine Accuracy@50
449
+ - type: cosine_accuracy@100
450
+ value: 0.9902912621359223
451
+ name: Cosine Accuracy@100
452
+ - type: cosine_accuracy@150
453
+ value: 0.9902912621359223
454
+ name: Cosine Accuracy@150
455
+ - type: cosine_accuracy@200
456
+ value: 0.9902912621359223
457
+ name: Cosine Accuracy@200
458
+ - type: cosine_precision@1
459
+ value: 0.6796116504854369
460
+ name: Cosine Precision@1
461
+ - type: cosine_precision@20
462
+ value: 0.470873786407767
463
+ name: Cosine Precision@20
464
+ - type: cosine_precision@50
465
+ value: 0.28038834951456315
466
+ name: Cosine Precision@50
467
+ - type: cosine_precision@100
468
+ value: 0.17320388349514557
469
+ name: Cosine Precision@100
470
+ - type: cosine_precision@150
471
+ value: 0.12394822006472495
472
+ name: Cosine Precision@150
473
+ - type: cosine_precision@200
474
+ value: 0.09766990291262137
475
+ name: Cosine Precision@200
476
+ - type: cosine_recall@1
477
+ value: 0.06427555485009323
478
+ name: Cosine Recall@1
479
+ - type: cosine_recall@20
480
+ value: 0.5119331913488326
481
+ name: Cosine Recall@20
482
+ - type: cosine_recall@50
483
+ value: 0.6726577129232287
484
+ name: Cosine Recall@50
485
+ - type: cosine_recall@100
486
+ value: 0.788021792964523
487
+ name: Cosine Recall@100
488
+ - type: cosine_recall@150
489
+ value: 0.8328962977521837
490
+ name: Cosine Recall@150
491
+ - type: cosine_recall@200
492
+ value: 0.8687397875786594
493
+ name: Cosine Recall@200
494
+ - type: cosine_ndcg@1
495
+ value: 0.6796116504854369
496
+ name: Cosine Ndcg@1
497
+ - type: cosine_ndcg@20
498
+ value: 0.6515292076635256
499
+ name: Cosine Ndcg@20
500
+ - type: cosine_ndcg@50
501
+ value: 0.6598571989751485
502
+ name: Cosine Ndcg@50
503
+ - type: cosine_ndcg@100
504
+ value: 0.7157338182976709
505
+ name: Cosine Ndcg@100
506
+ - type: cosine_ndcg@150
507
+ value: 0.7357126940189814
508
+ name: Cosine Ndcg@150
509
+ - type: cosine_ndcg@200
510
+ value: 0.7500853808896866
511
+ name: Cosine Ndcg@200
512
+ - type: cosine_mrr@1
513
+ value: 0.6796116504854369
514
+ name: Cosine Mrr@1
515
+ - type: cosine_mrr@20
516
+ value: 0.8216828478964402
517
+ name: Cosine Mrr@20
518
+ - type: cosine_mrr@50
519
+ value: 0.8216828478964402
520
+ name: Cosine Mrr@50
521
+ - type: cosine_mrr@100
522
+ value: 0.8216828478964402
523
+ name: Cosine Mrr@100
524
+ - type: cosine_mrr@150
525
+ value: 0.8216828478964402
526
+ name: Cosine Mrr@150
527
+ - type: cosine_mrr@200
528
+ value: 0.8216828478964402
529
+ name: Cosine Mrr@200
530
+ - type: cosine_map@1
531
+ value: 0.6796116504854369
532
+ name: Cosine Map@1
533
+ - type: cosine_map@20
534
+ value: 0.5012149610968577
535
+ name: Cosine Map@20
536
+ - type: cosine_map@50
537
+ value: 0.48128476255481567
538
+ name: Cosine Map@50
539
+ - type: cosine_map@100
540
+ value: 0.5105374388587102
541
+ name: Cosine Map@100
542
+ - type: cosine_map@150
543
+ value: 0.518381647971727
544
+ name: Cosine Map@150
545
+ - type: cosine_map@200
546
+ value: 0.5228375783347256
547
+ name: Cosine Map@200
548
+ - type: cosine_map@500
549
+ value: 0.52765377953199
550
+ name: Cosine Map@500
551
+ - task:
552
+ type: information-retrieval
553
+ name: Information Retrieval
554
+ dataset:
555
+ name: mix es
556
+ type: mix_es
557
+ metrics:
558
+ - type: cosine_accuracy@1
559
+ value: 0.7394695787831513
560
+ name: Cosine Accuracy@1
561
+ - type: cosine_accuracy@20
562
+ value: 0.9635985439417577
563
+ name: Cosine Accuracy@20
564
+ - type: cosine_accuracy@50
565
+ value: 0.982839313572543
566
+ name: Cosine Accuracy@50
567
+ - type: cosine_accuracy@100
568
+ value: 0.9927197087883516
569
+ name: Cosine Accuracy@100
570
+ - type: cosine_accuracy@150
571
+ value: 0.9947997919916797
572
+ name: Cosine Accuracy@150
573
+ - type: cosine_accuracy@200
574
+ value: 0.9963598543941757
575
+ name: Cosine Accuracy@200
576
+ - type: cosine_precision@1
577
+ value: 0.7394695787831513
578
+ name: Cosine Precision@1
579
+ - type: cosine_precision@20
580
+ value: 0.12488299531981278
581
+ name: Cosine Precision@20
582
+ - type: cosine_precision@50
583
+ value: 0.05174206968278733
584
+ name: Cosine Precision@50
585
+ - type: cosine_precision@100
586
+ value: 0.02629225169006761
587
+ name: Cosine Precision@100
588
+ - type: cosine_precision@150
589
+ value: 0.017635638758883684
590
+ name: Cosine Precision@150
591
+ - type: cosine_precision@200
592
+ value: 0.013281331253250133
593
+ name: Cosine Precision@200
594
+ - type: cosine_recall@1
595
+ value: 0.28537503404898107
596
+ name: Cosine Recall@1
597
+ - type: cosine_recall@20
598
+ value: 0.9225949037961519
599
+ name: Cosine Recall@20
600
+ - type: cosine_recall@50
601
+ value: 0.9548015253943491
602
+ name: Cosine Recall@50
603
+ - type: cosine_recall@100
604
+ value: 0.970532154619518
605
+ name: Cosine Recall@100
606
+ - type: cosine_recall@150
607
+ value: 0.9766337320159473
608
+ name: Cosine Recall@150
609
+ - type: cosine_recall@200
610
+ value: 0.9810747096550528
611
+ name: Cosine Recall@200
612
+ - type: cosine_ndcg@1
613
+ value: 0.7394695787831513
614
+ name: Cosine Ndcg@1
615
+ - type: cosine_ndcg@20
616
+ value: 0.8119072371250002
617
+ name: Cosine Ndcg@20
618
+ - type: cosine_ndcg@50
619
+ value: 0.8208055075822587
620
+ name: Cosine Ndcg@50
621
+ - type: cosine_ndcg@100
622
+ value: 0.8242798548838444
623
+ name: Cosine Ndcg@100
624
+ - type: cosine_ndcg@150
625
+ value: 0.8254601712767063
626
+ name: Cosine Ndcg@150
627
+ - type: cosine_ndcg@200
628
+ value: 0.826231823086538
629
+ name: Cosine Ndcg@200
630
+ - type: cosine_mrr@1
631
+ value: 0.7394695787831513
632
+ name: Cosine Mrr@1
633
+ - type: cosine_mrr@20
634
+ value: 0.8059183822863336
635
+ name: Cosine Mrr@20
636
+ - type: cosine_mrr@50
637
+ value: 0.8065662458714291
638
+ name: Cosine Mrr@50
639
+ - type: cosine_mrr@100
640
+ value: 0.8067209669800003
641
+ name: Cosine Mrr@100
642
+ - type: cosine_mrr@150
643
+ value: 0.8067371899834064
644
+ name: Cosine Mrr@150
645
+ - type: cosine_mrr@200
646
+ value: 0.8067455244059942
647
+ name: Cosine Mrr@200
648
+ - type: cosine_map@1
649
+ value: 0.7394695787831513
650
+ name: Cosine Map@1
651
+ - type: cosine_map@20
652
+ value: 0.7439811728319751
653
+ name: Cosine Map@20
654
+ - type: cosine_map@50
655
+ value: 0.7464542457655368
656
+ name: Cosine Map@50
657
+ - type: cosine_map@100
658
+ value: 0.7469341154545359
659
+ name: Cosine Map@100
660
+ - type: cosine_map@150
661
+ value: 0.7470471963812441
662
+ name: Cosine Map@150
663
+ - type: cosine_map@200
664
+ value: 0.7471010455519603
665
+ name: Cosine Map@200
666
+ - type: cosine_map@500
667
+ value: 0.7471920688836787
668
+ name: Cosine Map@500
669
+ - task:
670
+ type: information-retrieval
671
+ name: Information Retrieval
672
+ dataset:
673
+ name: mix de
674
+ type: mix_de
675
+ metrics:
676
+ - type: cosine_accuracy@1
677
+ value: 0.6926677067082684
678
+ name: Cosine Accuracy@1
679
+ - type: cosine_accuracy@20
680
+ value: 0.9641185647425897
681
+ name: Cosine Accuracy@20
682
+ - type: cosine_accuracy@50
683
+ value: 0.983879355174207
684
+ name: Cosine Accuracy@50
685
+ - type: cosine_accuracy@100
686
+ value: 0.9921996879875195
687
+ name: Cosine Accuracy@100
688
+ - type: cosine_accuracy@150
689
+ value: 0.9932397295891836
690
+ name: Cosine Accuracy@150
691
+ - type: cosine_accuracy@200
692
+ value: 0.9942797711908476
693
+ name: Cosine Accuracy@200
694
+ - type: cosine_precision@1
695
+ value: 0.6926677067082684
696
+ name: Cosine Precision@1
697
+ - type: cosine_precision@20
698
+ value: 0.12797711908476336
699
+ name: Cosine Precision@20
700
+ - type: cosine_precision@50
701
+ value: 0.053281331253250144
702
+ name: Cosine Precision@50
703
+ - type: cosine_precision@100
704
+ value: 0.027051482059282376
705
+ name: Cosine Precision@100
706
+ - type: cosine_precision@150
707
+ value: 0.018110591090310275
708
+ name: Cosine Precision@150
709
+ - type: cosine_precision@200
710
+ value: 0.013619344773790953
711
+ name: Cosine Precision@200
712
+ - type: cosine_recall@1
713
+ value: 0.2603830819899463
714
+ name: Cosine Recall@1
715
+ - type: cosine_recall@20
716
+ value: 0.928479805858901
717
+ name: Cosine Recall@20
718
+ - type: cosine_recall@50
719
+ value: 0.9650286011440458
720
+ name: Cosine Recall@50
721
+ - type: cosine_recall@100
722
+ value: 0.9796325186340786
723
+ name: Cosine Recall@100
724
+ - type: cosine_recall@150
725
+ value: 0.9837060149072628
726
+ name: Cosine Recall@150
727
+ - type: cosine_recall@200
728
+ value: 0.9862194487779511
729
+ name: Cosine Recall@200
730
+ - type: cosine_ndcg@1
731
+ value: 0.6926677067082684
732
+ name: Cosine Ndcg@1
733
+ - type: cosine_ndcg@20
734
+ value: 0.7967328692326251
735
+ name: Cosine Ndcg@20
736
+ - type: cosine_ndcg@50
737
+ value: 0.8068705787791701
738
+ name: Cosine Ndcg@50
739
+ - type: cosine_ndcg@100
740
+ value: 0.810158579950017
741
+ name: Cosine Ndcg@100
742
+ - type: cosine_ndcg@150
743
+ value: 0.8109641919896999
744
+ name: Cosine Ndcg@150
745
+ - type: cosine_ndcg@200
746
+ value: 0.8114360342473703
747
+ name: Cosine Ndcg@200
748
+ - type: cosine_mrr@1
749
+ value: 0.6926677067082684
750
+ name: Cosine Mrr@1
751
+ - type: cosine_mrr@20
752
+ value: 0.7766838069642311
753
+ name: Cosine Mrr@20
754
+ - type: cosine_mrr@50
755
+ value: 0.7773792960985305
756
+ name: Cosine Mrr@50
757
+ - type: cosine_mrr@100
758
+ value: 0.7775026273925645
759
+ name: Cosine Mrr@100
760
+ - type: cosine_mrr@150
761
+ value: 0.7775124036000293
762
+ name: Cosine Mrr@150
763
+ - type: cosine_mrr@200
764
+ value: 0.7775182983569378
765
+ name: Cosine Mrr@200
766
+ - type: cosine_map@1
767
+ value: 0.6926677067082684
768
+ name: Cosine Map@1
769
+ - type: cosine_map@20
770
+ value: 0.7210301157895639
771
+ name: Cosine Map@20
772
+ - type: cosine_map@50
773
+ value: 0.7237555751939095
774
+ name: Cosine Map@50
775
+ - type: cosine_map@100
776
+ value: 0.7242426468613273
777
+ name: Cosine Map@100
778
+ - type: cosine_map@150
779
+ value: 0.7243265313145111
780
+ name: Cosine Map@150
781
+ - type: cosine_map@200
782
+ value: 0.7243628241480395
783
+ name: Cosine Map@200
784
+ - type: cosine_map@500
785
+ value: 0.7244144669299598
786
+ name: Cosine Map@500
787
+ - task:
788
+ type: information-retrieval
789
+ name: Information Retrieval
790
+ dataset:
791
+ name: mix zh
792
+ type: mix_zh
793
+ metrics:
794
+ - type: cosine_accuracy@1
795
+ value: 0.17888715548621945
796
+ name: Cosine Accuracy@1
797
+ - type: cosine_accuracy@20
798
+ value: 1.0
799
+ name: Cosine Accuracy@20
800
+ - type: cosine_accuracy@50
801
+ value: 1.0
802
+ name: Cosine Accuracy@50
803
+ - type: cosine_accuracy@100
804
+ value: 1.0
805
+ name: Cosine Accuracy@100
806
+ - type: cosine_accuracy@150
807
+ value: 1.0
808
+ name: Cosine Accuracy@150
809
+ - type: cosine_accuracy@200
810
+ value: 1.0
811
+ name: Cosine Accuracy@200
812
+ - type: cosine_precision@1
813
+ value: 0.17888715548621945
814
+ name: Cosine Precision@1
815
+ - type: cosine_precision@20
816
+ value: 0.15439417576703063
817
+ name: Cosine Precision@20
818
+ - type: cosine_precision@50
819
+ value: 0.0617576703068123
820
+ name: Cosine Precision@50
821
+ - type: cosine_precision@100
822
+ value: 0.03087883515340615
823
+ name: Cosine Precision@100
824
+ - type: cosine_precision@150
825
+ value: 0.020585890102270757
826
+ name: Cosine Precision@150
827
+ - type: cosine_precision@200
828
+ value: 0.015439417576703075
829
+ name: Cosine Precision@200
830
+ - type: cosine_recall@1
831
+ value: 0.05768764083896689
832
+ name: Cosine Recall@1
833
+ - type: cosine_recall@20
834
+ value: 1.0
835
+ name: Cosine Recall@20
836
+ - type: cosine_recall@50
837
+ value: 1.0
838
+ name: Cosine Recall@50
839
+ - type: cosine_recall@100
840
+ value: 1.0
841
+ name: Cosine Recall@100
842
+ - type: cosine_recall@150
843
+ value: 1.0
844
+ name: Cosine Recall@150
845
+ - type: cosine_recall@200
846
+ value: 1.0
847
+ name: Cosine Recall@200
848
+ - type: cosine_ndcg@1
849
+ value: 0.17888715548621945
850
+ name: Cosine Ndcg@1
851
+ - type: cosine_ndcg@20
852
+ value: 0.5443156532634228
853
+ name: Cosine Ndcg@20
854
+ - type: cosine_ndcg@50
855
+ value: 0.5443156532634228
856
+ name: Cosine Ndcg@50
857
+ - type: cosine_ndcg@100
858
+ value: 0.5443156532634228
859
+ name: Cosine Ndcg@100
860
+ - type: cosine_ndcg@150
861
+ value: 0.5443156532634228
862
+ name: Cosine Ndcg@150
863
+ - type: cosine_ndcg@200
864
+ value: 0.5443156532634228
865
+ name: Cosine Ndcg@200
866
+ - type: cosine_mrr@1
867
+ value: 0.17888715548621945
868
+ name: Cosine Mrr@1
869
+ - type: cosine_mrr@20
870
+ value: 0.4002437442375043
871
+ name: Cosine Mrr@20
872
+ - type: cosine_mrr@50
873
+ value: 0.4002437442375043
874
+ name: Cosine Mrr@50
875
+ - type: cosine_mrr@100
876
+ value: 0.4002437442375043
877
+ name: Cosine Mrr@100
878
+ - type: cosine_mrr@150
879
+ value: 0.4002437442375043
880
+ name: Cosine Mrr@150
881
+ - type: cosine_mrr@200
882
+ value: 0.4002437442375043
883
+ name: Cosine Mrr@200
884
+ - type: cosine_map@1
885
+ value: 0.17888715548621945
886
+ name: Cosine Map@1
887
+ - type: cosine_map@20
888
+ value: 0.32718437256695937
889
+ name: Cosine Map@20
890
+ - type: cosine_map@50
891
+ value: 0.32718437256695937
892
+ name: Cosine Map@50
893
+ - type: cosine_map@100
894
+ value: 0.32718437256695937
895
+ name: Cosine Map@100
896
+ - type: cosine_map@150
897
+ value: 0.32718437256695937
898
+ name: Cosine Map@150
899
+ - type: cosine_map@200
900
+ value: 0.32718437256695937
901
+ name: Cosine Map@200
902
+ - type: cosine_map@500
903
+ value: 0.32718437256695937
904
+ name: Cosine Map@500
905
+ ---
906
+
907
+ # Job–job matching model fine-tuned from BAAI/bge-m3
+
+ The top-performing model on [TalentCLEF 2025](https://talentclef.github.io/talentclef/) Task A. Use it for multilingual job-title matching.
+
911
+ ## Model Details
912
+
913
+ ### Model Description
914
+ - **Model Type:** Sentence Transformer
915
+ - **Base model:** [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) <!-- at revision 5617a9f61b028005a4858fdac845db406aefb181 -->
916
+ - **Maximum Sequence Length:** 512 tokens
917
+ - **Output Dimensionality:** 1024 dimensions
918
+ - **Similarity Function:** Cosine Similarity
919
+ - **Training Datasets:**
920
+ - full_en
921
+ - full_de
922
+ - full_es
923
+ - full_zh
924
+ - mix
925
+ <!-- - **Language:** Unknown -->
926
+ <!-- - **License:** Unknown -->
927
+
928
+ ### Model Sources
929
+
930
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
931
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
932
+ - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
933
+
934
+ ### Full Model Architecture
935
+
936
+ ```
937
+ SentenceTransformer(
938
+ (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
939
+ (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
940
+ (2): Normalize()
941
+ )
942
+ ```
943
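The `Pooling` module above uses CLS-token pooling (`pooling_mode_cls_token: True`) followed by an L2-normalization step. A minimal numpy sketch of those two steps on toy hidden states (hypothetical values, not real transformer output):

```python
import numpy as np

# Toy transformer output: batch of 2 sequences, 3 tokens, hidden size 4.
hidden = np.arange(24, dtype=float).reshape(2, 3, 4)

cls = hidden[:, 0, :]  # (1) CLS pooling: keep only the first token's hidden state
emb = cls / np.linalg.norm(cls, axis=1, keepdims=True)  # (2) Normalize(): unit length

print(emb.shape)  # (2, 4)
```

After normalization every embedding has unit norm, which is what makes cosine similarity reduce to a dot product downstream.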
+
944
+ ## Usage
945
+
946
+ ### Direct Usage (Sentence Transformers)
947
+
948
+ First install the Sentence Transformers library:
949
+
950
+ ```bash
951
+ pip install -U sentence-transformers
952
+ ```
953
+
954
+ Then you can load this model and run inference.
955
+ ```python
+ from sentence_transformers import SentenceTransformer
+
+ # Download from the 🤗 Hub
+ model = SentenceTransformer("pj-mathematician/JobBGE-m3")
+ # Run inference
+ sentences = [
+ 'Volksvertreter',
+ 'Parlamentarier',
+ 'Oberbürgermeister',
+ ]
+ embeddings = model.encode(sentences)
+ print(embeddings.shape)
+ # [3, 1024]
+
+ # Get the similarity scores for the embeddings
+ similarities = model.similarity(embeddings, embeddings)
+ print(similarities.shape)
+ # [3, 3]
+ ```
+
976
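Because the model ends with a `Normalize()` module, its embeddings are unit-length, so the default cosine similarity used by `model.similarity` is just a dot product. A minimal numpy sketch with toy unit vectors (hypothetical values standing in for real embeddings):

```python
import numpy as np

# Three toy unit-length "embeddings" (hypothetical, not model output).
emb = np.array([[0.6, 0.8],
                [0.8, 0.6],
                [1.0, 0.0]])

# For unit vectors, cosine similarity is a plain matrix product.
sims = emb @ emb.T
print(np.round(sims, 2))
```

Each diagonal entry is 1.0 (every vector is maximally similar to itself), and off-diagonal entries are the pairwise cosine similarities.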
+ <!--
977
+ ### Direct Usage (Transformers)
978
+
979
+ <details><summary>Click to see the direct usage in Transformers</summary>
980
+
981
+ </details>
982
+ -->
983
+
984
+ <!--
985
+ ### Downstream Usage (Sentence Transformers)
986
+
987
+ You can finetune this model on your own dataset.
988
+
989
+ <details><summary>Click to expand</summary>
990
+
991
+ </details>
992
+ -->
993
+
994
+ <!--
995
+ ### Out-of-Scope Use
996
+
997
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
998
+ -->
999
+
1000
+ ## Evaluation
1001
+
1002
+ ### Metrics
1003
+
1004
+ #### Information Retrieval
1005
+
1006
+ * Datasets: `full_en`, `full_es`, `full_de`, `full_zh`, `mix_es`, `mix_de` and `mix_zh`
1007
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
1008
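The table reports standard retrieval metrics at several cutoffs k. As a quick reference for how the recall@k and MRR@k columns are defined, a minimal sketch with toy document IDs (not evaluator output):

```python
def recall_at_k(relevant, ranked, k):
    """Fraction of the relevant documents that appear in the top-k results."""
    return len(set(ranked[:k]) & set(relevant)) / len(relevant)

def mrr_at_k(relevant, ranked, k):
    """Reciprocal rank of the first relevant document within the top-k, else 0."""
    for rank, doc in enumerate(ranked[:k], start=1):
        if doc in relevant:
            return 1.0 / rank
    return 0.0

ranked = ["a", "b", "c", "d"]   # system ranking for one query
relevant = {"b", "d"}           # gold labels for that query
print(recall_at_k(relevant, ranked, 2))  # 0.5
print(mrr_at_k(relevant, ranked, 2))     # 0.5
```

The evaluator averages these per-query values over all queries, which is why MRR@k plateaus once k is large enough to contain the first relevant hit.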
+
1009
+ | Metric | full_en | full_es | full_de | full_zh | mix_es | mix_de | mix_zh |
1010
+ |:---------------------|:-----------|:-----------|:-----------|:-----------|:-----------|:-----------|:-----------|
1011
+ | cosine_accuracy@1 | 0.6476 | 0.1135 | 0.2956 | 0.6796 | 0.7395 | 0.6927 | 0.1789 |
1012
+ | cosine_accuracy@20 | 0.9905 | 1.0 | 0.9852 | 0.9903 | 0.9636 | 0.9641 | 1.0 |
1013
+ | cosine_accuracy@50 | 0.9905 | 1.0 | 0.9901 | 0.9903 | 0.9828 | 0.9839 | 1.0 |
1014
+ | cosine_accuracy@100 | 0.9905 | 1.0 | 0.9901 | 0.9903 | 0.9927 | 0.9922 | 1.0 |
1015
+ | cosine_accuracy@150 | 0.9905 | 1.0 | 0.9901 | 0.9903 | 0.9948 | 0.9932 | 1.0 |
1016
+ | cosine_accuracy@200 | 0.9905 | 1.0 | 0.9901 | 0.9903 | 0.9964 | 0.9943 | 1.0 |
1017
+ | cosine_precision@1 | 0.6476 | 0.1135 | 0.2956 | 0.6796 | 0.7395 | 0.6927 | 0.1789 |
1018
+ | cosine_precision@20 | 0.5062 | 0.5668 | 0.5404 | 0.4709 | 0.1249 | 0.128 | 0.1544 |
1019
+ | cosine_precision@50 | 0.3065 | 0.3903 | 0.3828 | 0.2804 | 0.0517 | 0.0533 | 0.0618 |
1020
+ | cosine_precision@100 | 0.1858 | 0.2525 | 0.2503 | 0.1732 | 0.0263 | 0.0271 | 0.0309 |
1021
+ | cosine_precision@150 | 0.1325 | 0.1901 | 0.1878 | 0.1239 | 0.0176 | 0.0181 | 0.0206 |
1022
+ | cosine_precision@200 | 0.1025 | 0.1508 | 0.1503 | 0.0977 | 0.0133 | 0.0136 | 0.0154 |
1023
+ | cosine_recall@1 | 0.0669 | 0.0035 | 0.0111 | 0.0643 | 0.2854 | 0.2604 | 0.0577 |
1024
+ | cosine_recall@20 | 0.5392 | 0.3796 | 0.3433 | 0.5119 | 0.9226 | 0.9285 | 1.0 |
1025
+ | cosine_recall@50 | 0.72 | 0.5636 | 0.534 | 0.6727 | 0.9548 | 0.965 | 1.0 |
1026
+ | cosine_recall@100 | 0.8254 | 0.6727 | 0.6499 | 0.788 | 0.9705 | 0.9796 | 1.0 |
1027
+ | cosine_recall@150 | 0.872 | 0.736 | 0.7101 | 0.8329 | 0.9766 | 0.9837 | 1.0 |
1028
+ | cosine_recall@200 | 0.9006 | 0.7698 | 0.7513 | 0.8687 | 0.9811 | 0.9862 | 1.0 |
1029
+ | cosine_ndcg@1 | 0.6476 | 0.1135 | 0.2956 | 0.6796 | 0.7395 | 0.6927 | 0.1789 |
1030
+ | cosine_ndcg@20 | 0.6822 | 0.6136 | 0.5648 | 0.6515 | 0.8119 | 0.7967 | 0.5443 |
1031
+ | cosine_ndcg@50 | 0.6975 | 0.5908 | 0.5522 | 0.6599 | 0.8208 | 0.8069 | 0.5443 |
1032
+ | cosine_ndcg@100 | 0.752 | 0.6168 | 0.5796 | 0.7157 | 0.8243 | 0.8102 | 0.5443 |
1033
+ | cosine_ndcg@150 | 0.7725 | 0.6489 | 0.6112 | 0.7357 | 0.8255 | 0.811 | 0.5443 |
1034
+ | **cosine_ndcg@200** | **0.7827** | **0.6653** | **0.6309** | **0.7501** | **0.8262** | **0.8114** | **0.5443** |
1035
+ | cosine_mrr@1 | 0.6476 | 0.1135 | 0.2956 | 0.6796 | 0.7395 | 0.6927 | 0.1789 |
1036
+ | cosine_mrr@20 | 0.8 | 0.5536 | 0.5164 | 0.8217 | 0.8059 | 0.7767 | 0.4002 |
1037
+ | cosine_mrr@50 | 0.8 | 0.5536 | 0.5166 | 0.8217 | 0.8066 | 0.7774 | 0.4002 |
1038
+ | cosine_mrr@100 | 0.8 | 0.5536 | 0.5166 | 0.8217 | 0.8067 | 0.7775 | 0.4002 |
1039
+ | cosine_mrr@150 | 0.8 | 0.5536 | 0.5166 | 0.8217 | 0.8067 | 0.7775 | 0.4002 |
1040
+ | cosine_mrr@200 | 0.8 | 0.5536 | 0.5166 | 0.8217 | 0.8067 | 0.7775 | 0.4002 |
1041
+ | cosine_map@1 | 0.6476 | 0.1135 | 0.2956 | 0.6796 | 0.7395 | 0.6927 | 0.1789 |
1042
+ | cosine_map@20 | 0.5392 | 0.481 | 0.4222 | 0.5012 | 0.744 | 0.721 | 0.3272 |
1043
+ | cosine_map@50 | 0.5258 | 0.4304 | 0.3791 | 0.4813 | 0.7465 | 0.7238 | 0.3272 |
1044
+ | cosine_map@100 | 0.558 | 0.4335 | 0.3829 | 0.5105 | 0.7469 | 0.7242 | 0.3272 |
1045
+ | cosine_map@150 | 0.5666 | 0.4485 | 0.3981 | 0.5184 | 0.747 | 0.7243 | 0.3272 |
1046
+ | cosine_map@200 | 0.5695 | 0.4551 | 0.4056 | 0.5228 | 0.7471 | 0.7244 | 0.3272 |
1047
+ | cosine_map@500 | 0.5744 | 0.4677 | 0.4189 | 0.5277 | 0.7472 | 0.7244 | 0.3272 |
1048
+
1049
+ <!--
1050
+ ## Bias, Risks and Limitations
1051
+
1052
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
1053
+ -->
1054
+
1055
+ <!--
1056
+ ### Recommendations
1057
+
1058
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
1059
+ -->
1060
+
1061
+ ## Training Details
1062
+
1063
+ ### Training Datasets
1064
+ <details><summary>full_en</summary>
1065
+
1066
+ #### full_en
1067
+
1068
+ * Dataset: full_en
1069
+ * Size: 28,880 training samples
1070
+ * Columns: <code>anchor</code> and <code>positive</code>
1071
+ * Approximate statistics based on the first 1000 samples:
1072
+ | | anchor | positive |
1073
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1074
+ | type | string | string |
1075
+ | details | <ul><li>min: 3 tokens</li><li>mean: 5.68 tokens</li><li>max: 11 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 5.76 tokens</li><li>max: 12 tokens</li></ul> |
1076
+ * Samples:
1077
+ | anchor | positive |
1078
+ |:-----------------------------------------|:-----------------------------------------|
1079
+ | <code>air commodore</code> | <code>flight lieutenant</code> |
1080
+ | <code>command and control officer</code> | <code>flight officer</code> |
1081
+ | <code>air commodore</code> | <code>command and control officer</code> |
1082
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1083
+ ```json
1084
+ {'guide': SentenceTransformer(
1085
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1086
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1087
+ (2): Normalize()
1088
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1089
+ ```
1090
+ </details>
1091
+ <details><summary>full_de</summary>
1092
+
1093
+ #### full_de
1094
+
1095
+ * Dataset: full_de
1096
+ * Size: 23,023 training samples
1097
+ * Columns: <code>anchor</code> and <code>positive</code>
1098
+ * Approximate statistics based on the first 1000 samples:
1099
+ | | anchor | positive |
1100
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1101
+ | type | string | string |
1102
+ | details | <ul><li>min: 3 tokens</li><li>mean: 7.99 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.19 tokens</li><li>max: 30 tokens</li></ul> |
1103
+ * Samples:
1104
+ | anchor | positive |
1105
+ |:----------------------------------|:-----------------------------------------------------|
1106
+ | <code>Staffelkommandantin</code> | <code>Kommodore</code> |
1107
+ | <code>Luftwaffenoffizierin</code> | <code>Luftwaffenoffizier/Luftwaffenoffizierin</code> |
1108
+ | <code>Staffelkommandantin</code> | <code>Luftwaffenoffizierin</code> |
1109
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1110
+ ```json
1111
+ {'guide': SentenceTransformer(
1112
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1113
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1114
+ (2): Normalize()
1115
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1116
+ ```
1117
+ </details>
1118
+ <details><summary>full_es</summary>
1119
+
1120
+ #### full_es
1121
+
1122
+ * Dataset: full_es
1123
+ * Size: 20,724 training samples
1124
+ * Columns: <code>anchor</code> and <code>positive</code>
1125
+ * Approximate statistics based on the first 1000 samples:
1126
+ | | anchor | positive |
1127
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1128
+ | type | string | string |
1129
+ | details | <ul><li>min: 3 tokens</li><li>mean: 9.13 tokens</li><li>max: 32 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.84 tokens</li><li>max: 32 tokens</li></ul> |
1130
+ * Samples:
1131
+ | anchor | positive |
1132
+ |:------------------------------------|:-------------------------------------------|
1133
+ | <code>jefe de escuadrón</code> | <code>instructor</code> |
1134
+ | <code>comandante de aeronave</code> | <code>instructor de simulador</code> |
1135
+ | <code>instructor</code> | <code>oficial del Ejército del Aire</code> |
1136
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1137
+ ```json
1138
+ {'guide': SentenceTransformer(
1139
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1140
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1141
+ (2): Normalize()
1142
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1143
+ ```
1144
+ </details>
1145
<details><summary>full_zh</summary>

#### full_zh

* Dataset: full_zh
* Size: 30,401 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor                                                                           | positive                                                                         |
  |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                           |
  | details | <ul><li>min: 5 tokens</li><li>mean: 7.15 tokens</li><li>max: 14 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 7.46 tokens</li><li>max: 21 tokens</li></ul> |
* Samples:
  | anchor            | positive             |
  |:------------------|:---------------------|
  | <code>技术总监</code> | <code>技术和运营总监</code> |
  | <code>技术总监</code> | <code>技术主管</code> |
  | <code>技术总监</code> | <code>技术艺术总监</code> |
* Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
    (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
  ```
</details>
1172
<details><summary>mix</summary>

#### mix

* Dataset: mix
* Size: 21,760 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor                                                                           | positive                                                                         |
  |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                           |
  | details | <ul><li>min: 2 tokens</li><li>mean: 6.71 tokens</li><li>max: 19 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 7.69 tokens</li><li>max: 19 tokens</li></ul> |
* Samples:
  | anchor                                    | positive                                                        |
  |:------------------------------------------|:----------------------------------------------------------------|
  | <code>technical manager</code>            | <code>Technischer Direktor für Bühne, Film und Fernsehen</code> |
  | <code>head of technical</code>            | <code>directora técnica</code> |
  | <code>head of technical department</code> | <code>技术艺术总监</code> |
* Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
    (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
  ```
</details>
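The `'margin_strategy': 'absolute'` / `'margin': 0.0` setting shown for `GISTEmbedLoss` means an in-batch candidate is discarded as a negative whenever the guide model rates it at least as similar to the anchor as the anchor's own positive. A minimal pure-Python sketch of that masking rule (the function name and similarity values are illustrative, not the sentence-transformers implementation):

```python
def mask_false_negatives(guide_ap, guide_ac, margin=0.0):
    """Guided in-batch negative filtering (absolute margin strategy).

    guide_ap[i]    -- guide-model similarity of anchor i to its own positive
    guide_ac[i][j] -- guide-model similarity of anchor i to in-batch candidate j
    Returns keep[i][j]: True if candidate j may still serve as a negative
    for anchor i, False if it is masked out as a likely false negative.
    """
    return [
        [sim < guide_ap[i] - margin for sim in row]
        for i, row in enumerate(guide_ac)
    ]

# Candidate 0 scores higher with anchor 0 than anchor 0's own positive does,
# so the guide model flags it as a probable false negative and it is masked.
keep = mask_false_negatives([0.9, 0.8], [[0.95, 0.3], [0.2, 0.85]])
```

With `margin=0.0`, ties are also masked: a candidate exactly as similar as the positive is not trusted as a negative.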
1199

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 128
- `gradient_accumulation_steps`: 2
- `num_train_epochs`: 5
- `warmup_ratio`: 0.05
- `log_on_each_node`: False
- `fp16`: True
- `dataloader_num_workers`: 4
- `ddp_find_unused_parameters`: True
- `batch_sampler`: no_duplicates

1215
#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 128
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 2
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 5
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.05
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: False
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: True
- `dataloader_num_workers`: 4
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `tp_size`: 0
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: True
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>
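A quick sanity check on what these values imply, sketched in plain Python. The per-device count and steps-per-epoch are assumptions (a single device, and roughly 974 optimizer steps per epoch read off the training log that follows), not values stated by the card:

```python
# Effective batch size seen by the loss per optimizer step.
per_device_train_batch_size = 64
gradient_accumulation_steps = 2
num_devices = 1  # assumption; the card does not state the device count

effective_batch_size = (per_device_train_batch_size
                        * gradient_accumulation_steps
                        * num_devices)

# Warmup steps implied by warmup_ratio (warmup_steps is 0, so the ratio wins).
num_train_epochs = 5
steps_per_epoch = 974  # assumption: step 1000 lands at epoch ~1.03 in the log
warmup_ratio = 0.05
warmup_steps = int(num_train_epochs * steps_per_epoch * warmup_ratio)
```

Under these assumptions the loss sees 128 pairs per step and the linear scheduler warms up for about 243 steps.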
1334

### Training Logs
| Epoch | Step | Training Loss | full_en_cosine_ndcg@200 | full_es_cosine_ndcg@200 | full_de_cosine_ndcg@200 | full_zh_cosine_ndcg@200 | mix_es_cosine_ndcg@200 | mix_de_cosine_ndcg@200 | mix_zh_cosine_ndcg@200 |
|:------:|:----:|:-------------:|:-----------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:----------------------:|:----------------------:|:----------------------:|
| -1     | -1   | -             | 0.6856                  | 0.5207                  | 0.4655                  | 0.6713                  | 0.6224                 | 0.5604                 | 0.5548                 |
| 0.0010 | 1    | 5.3354        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 0.1027 | 100  | 2.665         | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 0.2053 | 200  | 1.3375        | 0.7691                  | 0.6530                  | 0.6298                  | 0.7517                  | 0.7513                 | 0.7393                 | 0.5490                 |
| 0.3080 | 300  | 1.1101        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 0.4107 | 400  | 0.9453        | 0.7802                  | 0.6643                  | 0.6246                  | 0.7531                  | 0.7610                 | 0.7441                 | 0.5493                 |
| 0.5133 | 500  | 0.9202        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 0.6160 | 600  | 0.7887        | 0.7741                  | 0.6549                  | 0.6171                  | 0.7542                  | 0.7672                 | 0.7540                 | 0.5482                 |
| 0.7187 | 700  | 0.7604        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 0.8214 | 800  | 0.7219        | 0.7846                  | 0.6674                  | 0.6244                  | 0.7648                  | 0.7741                 | 0.7592                 | 0.5497                 |
| 0.9240 | 900  | 0.6965        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 1.0267 | 1000 | 0.6253        | 0.7646                  | 0.6391                  | 0.6122                  | 0.7503                  | 0.7825                 | 0.7704                 | 0.5463                 |
| 1.1294 | 1100 | 0.4737        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 1.2320 | 1200 | 0.5055        | 0.7758                  | 0.6582                  | 0.6178                  | 0.7514                  | 0.7857                 | 0.7764                 | 0.5501                 |
| 1.3347 | 1300 | 0.5042        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 1.4374 | 1400 | 0.5073        | 0.7613                  | 0.6578                  | 0.6178                  | 0.7505                  | 0.7829                 | 0.7762                 | 0.5452                 |
| 1.5400 | 1500 | 0.4975        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 1.6427 | 1600 | 0.5242        | 0.7736                  | 0.6673                  | 0.6279                  | 0.7555                  | 0.7940                 | 0.7859                 | 0.5477                 |
| 1.7454 | 1700 | 0.4713        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 1.8480 | 1800 | 0.4814        | 0.7845                  | 0.6733                  | 0.6285                  | 0.7642                  | 0.7992                 | 0.7904                 | 0.5449                 |
| 1.9507 | 1900 | 0.4526        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 2.0544 | 2000 | 0.36          | 0.7790                  | 0.6639                  | 0.6252                  | 0.7500                  | 0.8032                 | 0.7888                 | 0.5499                 |
| 2.1571 | 2100 | 0.3744        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 2.2598 | 2200 | 0.3031        | 0.7787                  | 0.6614                  | 0.6190                  | 0.7537                  | 0.7993                 | 0.7811                 | 0.5476                 |
| 2.3624 | 2300 | 0.3638        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 2.4651 | 2400 | 0.358         | 0.7798                  | 0.6615                  | 0.6258                  | 0.7497                  | 0.8018                 | 0.7828                 | 0.5481                 |
| 2.5678 | 2500 | 0.3247        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 2.6704 | 2600 | 0.3247        | 0.7854                  | 0.6663                  | 0.6248                  | 0.7560                  | 0.8081                 | 0.7835                 | 0.5452                 |
| 2.7731 | 2700 | 0.3263        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 2.8758 | 2800 | 0.3212        | 0.7761                  | 0.6681                  | 0.6250                  | 0.7517                  | 0.8121                 | 0.7927                 | 0.5458                 |
| 2.9784 | 2900 | 0.3291        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 3.0821 | 3000 | 0.2816        | 0.7727                  | 0.6604                  | 0.6163                  | 0.7370                  | 0.8163                 | 0.7985                 | 0.5473                 |
| 3.1848 | 3100 | 0.2698        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 3.2875 | 3200 | 0.2657        | 0.7757                  | 0.6615                  | 0.6247                  | 0.7417                  | 0.8117                 | 0.8004                 | 0.5436                 |
| 3.3901 | 3300 | 0.2724        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 3.4928 | 3400 | 0.2584        | 0.7850                  | 0.6583                  | 0.6320                  | 0.7458                  | 0.8120                 | 0.7980                 | 0.5454                 |
| 3.5955 | 3500 | 0.2573        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 3.6982 | 3600 | 0.2744        | 0.7796                  | 0.6552                  | 0.6237                  | 0.7409                  | 0.8193                 | 0.8018                 | 0.5466                 |
| 3.8008 | 3700 | 0.3054        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 3.9035 | 3800 | 0.2727        | 0.7825                  | 0.6642                  | 0.6293                  | 0.7504                  | 0.8213                 | 0.8058                 | 0.5463                 |
| 4.0062 | 3900 | 0.2353        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 4.1088 | 4000 | 0.2353        | 0.7747                  | 0.6628                  | 0.6263                  | 0.7384                  | 0.8239                 | 0.8065                 | 0.5447                 |
| 4.2115 | 4100 | 0.2385        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 4.3142 | 4200 | 0.231         | 0.7811                  | 0.6608                  | 0.6254                  | 0.7463                  | 0.8226                 | 0.8051                 | 0.5442                 |
| 4.4168 | 4300 | 0.2115        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 4.5195 | 4400 | 0.2151        | 0.7815                  | 0.6634                  | 0.6301                  | 0.7489                  | 0.8251                 | 0.8101                 | 0.5450                 |
| 4.6222 | 4500 | 0.2496        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 4.7248 | 4600 | 0.2146        | 0.7814                  | 0.6654                  | 0.6294                  | 0.7523                  | 0.8258                 | 0.8104                 | 0.5436                 |
| 4.8275 | 4700 | 0.2535        | -                       | -                       | -                       | -                       | -                      | -                      | -                      |
| 4.9302 | 4800 | 0.2058        | 0.7827                  | 0.6653                  | 0.6309                  | 0.7501                  | 0.8262                 | 0.8114                 | 0.5443                 |

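The `cosine_ndcg@200` columns above report normalized discounted cumulative gain over the top-200 retrieved items. A minimal reference implementation of NDCG@k using the standard log2 discount (a sketch of the metric itself, not the evaluator's exact code):

```python
import math

def ndcg_at_k(relevances, k):
    """NDCG@k for a ranked list of graded relevances, best-ranked first."""
    def dcg(rels):
        # Rank r (0-based) is discounted by log2(r + 2).
        return sum(rel / math.log2(rank + 2) for rank, rel in enumerate(rels))
    ideal = dcg(sorted(relevances, reverse=True)[:k])
    return dcg(relevances[:k]) / ideal if ideal > 0 else 0.0

# A perfect ranking scores 1.0; pushing the relevant items down lowers it.
perfect = ndcg_at_k([1, 1, 0, 0], k=4)
worse = ndcg_at_k([0, 0, 1, 1], k=4)
```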
1389

### Framework Versions
- Python: 3.11.11
- Sentence Transformers: 4.1.0
- Transformers: 4.51.2
- PyTorch: 2.6.0+cu124
- Accelerate: 1.6.0
- Datasets: 3.5.0
- Tokenizers: 0.21.1
1398

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### GISTEmbedLoss
```bibtex
@misc{solatorio2024gistembed,
    title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
    author={Aivin V. Solatorio},
    year={2024},
    eprint={2402.16829},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```

1428
<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
checkpoint-4800/config.json ADDED
@@ -0,0 +1,27 @@
{
  "architectures": [
    "XLMRobertaModel"
  ],
  "attention_probs_dropout_prob": 0.1,
  "bos_token_id": 0,
  "classifier_dropout": null,
  "eos_token_id": 2,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 1024,
  "initializer_range": 0.02,
  "intermediate_size": 4096,
  "layer_norm_eps": 1e-05,
  "max_position_embeddings": 8194,
  "model_type": "xlm-roberta",
  "num_attention_heads": 16,
  "num_hidden_layers": 24,
  "output_past": true,
  "pad_token_id": 1,
  "position_embedding_type": "absolute",
  "torch_dtype": "float32",
  "transformers_version": "4.51.2",
  "type_vocab_size": 1,
  "use_cache": true,
  "vocab_size": 250002
}
checkpoint-4800/config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
{
  "__version__": {
    "sentence_transformers": "4.1.0",
    "transformers": "4.51.2",
    "pytorch": "2.6.0+cu124"
  },
  "prompts": {},
  "default_prompt_name": null,
  "similarity_fn_name": "cosine"
}
checkpoint-4800/modules.json ADDED
@@ -0,0 +1,20 @@
[
  {
    "idx": 0,
    "name": "0",
    "path": "",
    "type": "sentence_transformers.models.Transformer"
  },
  {
    "idx": 1,
    "name": "1",
    "path": "1_Pooling",
    "type": "sentence_transformers.models.Pooling"
  },
  {
    "idx": 2,
    "name": "2",
    "path": "2_Normalize",
    "type": "sentence_transformers.models.Normalize"
  }
]
checkpoint-4800/scheduler.pt ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:7806c09bf53b6eaf769e4e730690af90c3e65f08bdabeae00e6b5222364bdfb3
size 1064
checkpoint-4800/sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
{
  "max_seq_length": 512,
  "do_lower_case": false
}
checkpoint-4800/special_tokens_map.json ADDED
@@ -0,0 +1,51 @@
{
  "bos_token": {
    "content": "<s>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "cls_token": {
    "content": "<s>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "eos_token": {
    "content": "</s>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "mask_token": {
    "content": "<mask>",
    "lstrip": true,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "<pad>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "sep_token": {
    "content": "</s>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "unk_token": {
    "content": "<unk>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}
checkpoint-4800/tokenizer.json ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d9a6af42442a3e3e9f05f618eae0bb2d98ca4f6a6406cb80ef7a4fa865204d61
size 17083052
checkpoint-4800/tokenizer_config.json ADDED
@@ -0,0 +1,56 @@
{
  "added_tokens_decoder": {
    "0": {
      "content": "<s>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "1": {
      "content": "<pad>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "2": {
      "content": "</s>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "3": {
      "content": "<unk>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "250001": {
      "content": "<mask>",
      "lstrip": true,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    }
  },
  "bos_token": "<s>",
  "clean_up_tokenization_spaces": true,
  "cls_token": "<s>",
  "eos_token": "</s>",
  "extra_special_tokens": {},
  "mask_token": "<mask>",
  "model_max_length": 8192,
  "pad_token": "<pad>",
  "sep_token": "</s>",
  "sp_model_kwargs": {},
  "tokenizer_class": "XLMRobertaTokenizer",
  "unk_token": "<unk>"
}
checkpoint-4800/trainer_state.json ADDED
The diff for this file is too large to render. See raw diff
 
eval/Information-Retrieval_evaluation_full_de_results.csv ADDED
@@ -0,0 +1,25 @@
+ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@20,cosine-Accuracy@50,cosine-Accuracy@100,cosine-Accuracy@150,cosine-Accuracy@200,cosine-Precision@1,cosine-Recall@1,cosine-Precision@20,cosine-Recall@20,cosine-Precision@50,cosine-Recall@50,cosine-Precision@100,cosine-Recall@100,cosine-Precision@150,cosine-Recall@150,cosine-Precision@200,cosine-Recall@200,cosine-MRR@1,cosine-MRR@20,cosine-MRR@50,cosine-MRR@100,cosine-MRR@150,cosine-MRR@200,cosine-NDCG@1,cosine-NDCG@20,cosine-NDCG@50,cosine-NDCG@100,cosine-NDCG@150,cosine-NDCG@200,cosine-MAP@1,cosine-MAP@20,cosine-MAP@50,cosine-MAP@100,cosine-MAP@150,cosine-MAP@200,cosine-MAP@500
+ 0.2053388090349076,200,0.2955665024630542,0.9704433497536946,0.9852216748768473,0.9852216748768473,0.9901477832512315,1.0,0.2955665024630542,0.01108543831680986,0.526847290640394,0.33997899163378226,0.38256157635467986,0.5329482495961547,0.2506896551724138,0.6466708714044761,0.18850574712643678,0.7170054017290216,0.14992610837438425,0.757186954897764,0.2955665024630542,0.5078908272322837,0.5084839372706711,0.5084839372706711,0.5085283166253952,0.5085904375145991,0.2955665024630542,0.5525861907831302,0.5476826717811635,0.577500254319055,0.6115084469958894,0.6298255260700456,0.2955665024630542,0.41207283022299424,0.3761644996751836,0.3846019968857037,0.4002146277234498,0.40668320859216317,0.41843360900992016
+ 0.4106776180698152,400,0.2955665024630542,0.9852216748768473,0.9901477832512315,0.9901477832512315,0.9901477832512315,0.9901477832512315,0.2955665024630542,0.01108543831680986,0.529064039408867,0.333856762150003,0.3786206896551724,0.5204724259897241,0.24310344827586208,0.6358838225521213,0.18522167487684726,0.7057204769301312,0.1489408866995074,0.750542949274788,0.2955665024630542,0.5158023826590521,0.5159169433189215,0.5159169433189215,0.5159169433189215,0.5159169433189215,0.2955665024630542,0.5533495213716284,0.541627594497738,0.5674484436014927,0.6034306524299401,0.6246250289617958,0.2955665024630542,0.41102037496160615,0.3699127700481449,0.37112788976163313,0.3880003777572498,0.3960495758977859,0.40921478589780685
+ 0.6160164271047228,600,0.2955665024630542,0.9704433497536946,0.9852216748768473,0.9852216748768473,0.9852216748768473,0.9950738916256158,0.2955665024630542,0.01108543831680986,0.5066502463054187,0.31540727953967196,0.3692610837438424,0.5131571813019469,0.2449753694581281,0.6372094849738309,0.1851888341543514,0.699423327288352,0.14862068965517242,0.7495520376882282,0.2955665024630542,0.5067862829562333,0.5073549289639824,0.5073549289639824,0.5073549289639824,0.5074080595589848,0.2955665024630542,0.5339328446611992,0.5298870989350547,0.5629797543503309,0.5961154824826048,0.6171396032615172,0.2955665024630542,0.39212560076575215,0.3575995040561464,0.36379620975336774,0.379617399924168,0.3870981755049584,0.40072258481042783
+ 0.8213552361396304,800,0.2955665024630542,0.9802955665024631,0.9852216748768473,0.9852216748768473,0.9852216748768473,0.9901477832512315,0.2955665024630542,0.01108543831680986,0.5273399014778325,0.3384633946557978,0.37635467980295567,0.5189272951837391,0.24645320197044335,0.6362060259503323,0.1858128078817734,0.7020309603230102,0.15007389162561577,0.7524618905297367,0.2955665024630542,0.5067957967435992,0.5069497376302987,0.5069497376302987,0.5069497376302987,0.5069753944447486,0.2955665024630542,0.5522945285216687,0.5398462365071318,0.5679789551112355,0.6014413656486646,0.6243817523144597,0.2955665024630542,0.4101876166001242,0.3702220759554515,0.3743298686627841,0.3897060524249026,0.3982157906508172,0.4119277869279311
+ 1.0266940451745379,1000,0.2955665024630542,0.9753694581280788,0.9852216748768473,0.9901477832512315,0.9901477832512315,0.9950738916256158,0.2955665024630542,0.01108543831680986,0.5118226600985222,0.32294794046647457,0.3719211822660099,0.5083307835638083,0.24344827586206896,0.6272571938167967,0.1819376026272578,0.6852075458183687,0.14677339901477834,0.7331160432262994,0.2955665024630542,0.5086201722068132,0.509041130558806,0.5091205839196832,0.5091205839196832,0.509145589545949,0.2955665024630542,0.5381731400700649,0.5318345986417106,0.5605064881525029,0.5905602043041761,0.6121961825152601,0.2955665024630542,0.3937947221604949,0.3597176383959752,0.3631658175560496,0.3771054336840075,0.3851190241922741,0.39813779081620493
+ 1.2320328542094456,1200,0.2955665024630542,0.9802955665024631,0.9852216748768473,0.9852216748768473,0.9950738916256158,1.0,0.2955665024630542,0.01108543831680986,0.5273399014778325,0.33042780571358715,0.36935960591133016,0.5108586472262386,0.2455665024630542,0.6331988831136598,0.18305418719211822,0.7004104381492438,0.14738916256157636,0.7437628227294042,0.2955665024630542,0.5101534344145175,0.5103233002205309,0.5103233002205309,0.5103934663018975,0.5104246442030013,0.2955665024630542,0.5527863211153075,0.5343062049616903,0.56593487166238,0.5972233212319282,0.6178404032961763,0.2955665024630542,0.4120043838349936,0.36266111505202364,0.36791509207144174,0.38199891136237907,0.3898780972927332,0.4033829063484561
+ 1.4373716632443532,1400,0.2955665024630542,0.9704433497536946,0.9901477832512315,0.9950738916256158,0.9950738916256158,0.9950738916256158,0.2955665024630542,0.01108543831680986,0.5231527093596059,0.3278801304157996,0.3711330049261084,0.5128384545714132,0.2429556650246305,0.6433518862892785,0.18190476190476193,0.7017862793136533,0.14689655172413796,0.7435163764591487,0.2955665024630542,0.5133177210828228,0.5138666303016828,0.513927446454453,0.513927446454453,0.513927446454453,0.2955665024630542,0.5499220143572477,0.5360013491896902,0.5666653784831412,0.5969538397761598,0.6177849859422981,0.2955665024630542,0.4102393571764651,0.36766608282065005,0.3694987567403448,0.38368594865677574,0.39176914325526546,0.4049358069492717
+ 1.6427104722792607,1600,0.2955665024630542,0.9704433497536946,0.9753694581280788,0.9852216748768473,0.9950738916256158,0.9950738916256158,0.2955665024630542,0.01108543831680986,0.5357142857142857,0.3428153068159694,0.37802955665024635,0.526248910479838,0.24852216748768474,0.6443471039216859,0.18604269293924466,0.7131534592979561,0.14926108374384237,0.7520677918099861,0.2955665024630542,0.5139807647196803,0.5141566971616226,0.5143232332117794,0.5144014451138929,0.5144014451138929,0.2955665024630542,0.5632026792456064,0.5472408941128026,0.576258387229448,0.608786561009545,0.6279177120268447,0.2955665024630542,0.4200660788034118,0.37501577080213316,0.3795048219766166,0.39399020595139694,0.4015800614794058,0.4154301946015271
+ 1.8480492813141685,1800,0.2955665024630542,0.9802955665024631,0.9802955665024631,0.9852216748768473,0.9852216748768473,0.9852216748768473,0.2955665024630542,0.01108543831680986,0.5433497536945813,0.34649075291973525,0.3753694581280788,0.5342683704304785,0.24738916256157634,0.64814503965833,0.18509031198686374,0.7029727001198689,0.14913793103448278,0.7447632074569691,0.2955665024630542,0.5103093403586009,0.5103093403586009,0.5103875325550197,0.5103875325550197,0.5103875325550197,0.2955665024630542,0.569219729851605,0.5492554978740849,0.5781466989890375,0.6078911170848468,0.628456854160075,0.2955665024630542,0.43126837760530895,0.38090628795046844,0.38517582900653985,0.40009553506469014,0.4082183527049491,0.4229834676341414
+ 2.0544147843942504,2000,0.2955665024630542,0.9802955665024631,0.9852216748768473,0.9852216748768473,0.9852216748768473,1.0,0.2955665024630542,0.01108543831680986,0.5344827586206896,0.3372411018948766,0.38,0.5274390541875749,0.24719211822660103,0.6429645564376526,0.18568144499178987,0.7056426852599205,0.14837438423645322,0.7451581316754955,0.2955665024630542,0.5133063570255683,0.5134326674967064,0.5134326674967064,0.5134326674967064,0.5135130133384336,0.2955665024630542,0.561266900221278,0.547929685743258,0.5746206334123237,0.6068885326987699,0.6252471370750249,0.2955665024630542,0.4189518554454275,0.37608857084062136,0.3792152927927448,0.39457302095253266,0.4018784235862324,0.4147070411298425
+ 2.259753593429158,2200,0.2955665024630542,0.9753694581280788,0.9753694581280788,0.9852216748768473,0.9901477832512315,0.9901477832512315,0.2955665024630542,0.01108543831680986,0.5366995073891626,0.33256168361734295,0.37270935960591134,0.5101126426815528,0.2435960591133005,0.6291559515160196,0.1844334975369458,0.6931068331713032,0.1484975369458128,0.7333616906118039,0.2955665024630542,0.5162682418840047,0.5162682418840047,0.5163949040220294,0.5164352819595244,0.5164352819595244,0.2955665024630542,0.5601851594116809,0.5374927615041049,0.5656287626884717,0.5988777556610647,0.6190248307223765,0.2955665024630542,0.41978841073418566,0.36876648325683553,0.3706857800366641,0.38691090125617594,0.39485014323897416,0.4084041911386392
+ 2.465092402464066,2400,0.2955665024630542,0.9753694581280788,0.9852216748768473,0.9901477832512315,0.9901477832512315,0.9901477832512315,0.2955665024630542,0.01108543831680986,0.5460591133004926,0.3422389863129073,0.37733990147783253,0.528512298439324,0.24645320197044335,0.6390909862359677,0.1850246305418719,0.700757502839122,0.1493103448275862,0.7400725337597357,0.2955665024630542,0.5108951803533078,0.5112182906875417,0.5112791068403119,0.5112791068403119,0.5112791068403119,0.2955665024630542,0.5685187633587889,0.5479572757407638,0.5734848426105378,0.6055101812205554,0.6257720554233988,0.2955665024630542,0.43020895989345487,0.3777014853592311,0.3801370263257621,0.39481485631556423,0.4028812956172389,0.417211241115348
+ 2.6704312114989732,2600,0.2955665024630542,0.9802955665024631,0.9852216748768473,0.9852216748768473,0.9852216748768473,0.9901477832512315,0.2955665024630542,0.01108543831680986,0.5426108374384236,0.3423144497848103,0.37694581280788175,0.5210138326374587,0.24522167487684726,0.63789560057548,0.1852216748768473,0.6985955136526781,0.14876847290640394,0.7388154181798423,0.2955665024630542,0.5117653487727373,0.5119477972310478,0.5119477972310478,0.5119477972310478,0.5119728028573137,0.2955665024630542,0.5667006599239481,0.5460096111442544,0.572747793812579,0.605107506995129,0.6248072314388501,0.2955665024630542,0.42804115783707986,0.37941479535255773,0.3811217511863993,0.39651851068020316,0.40392640998605456,0.41766692522018695
+ 2.875770020533881,2800,0.2955665024630542,0.9753694581280788,0.9802955665024631,0.9852216748768473,0.9852216748768473,0.9901477832512315,0.2955665024630542,0.01108543831680986,0.5465517241379311,0.34605370825268694,0.37871921182266005,0.525961661033002,0.2450738916256158,0.642011077963911,0.18489326765188832,0.7036043570567155,0.14807881773399015,0.7399840980295014,0.2955665024630542,0.512207756665884,0.5123250449605122,0.5123851194528828,0.5123851194528828,0.512412794219031,0.2955665024630542,0.5695990525754638,0.548707191800076,0.5739021948904237,0.6065561500926906,0.6249694707944361,0.2955665024630542,0.4314031520364939,0.3810503904876955,0.3810947207204988,0.39657872773731245,0.4039027412816606,0.4175738761694523
+ 3.082135523613963,3000,0.2955665024630542,0.9753694581280788,0.9753694581280788,0.9753694581280788,0.9802955665024631,0.9852216748768473,0.2955665024630542,0.01108543831680986,0.5366995073891625,0.33564739373606467,0.37733990147783253,0.5180650917363757,0.24699507389162562,0.6391099957471003,0.1851559934318555,0.6976467975910169,0.1469704433497537,0.7281818186601358,0.2955665024630542,0.510668347798889,0.510668347798889,0.510668347798889,0.5107119416783084,0.5107405818432756,0.2955665024630542,0.5588766637879935,0.5412683426801599,0.5700085303896804,0.6006983786588814,0.616250201156856,0.2955665024630542,0.41945867946449716,0.3714556630934012,0.3747299606093105,0.3895839274497047,0.3957857474501027,0.40921768100491024
+ 3.2874743326488707,3200,0.2955665024630542,0.9704433497536946,0.9802955665024631,0.9852216748768473,0.9852216748768473,0.9950738916256158,0.2955665024630542,0.01108543831680986,0.5364532019704433,0.3368027806443025,0.3810837438423646,0.5305945275217815,0.2468472906403941,0.6391119305530873,0.18620689655172412,0.7011966317650978,0.14911330049261082,0.7464024315195112,0.2955665024630542,0.5129072760846154,0.5132574161406378,0.5133248970772732,0.5133248970772732,0.5133800768894768,0.2955665024630542,0.5594152093567816,0.5481501194034158,0.571523837193819,0.6042363671821084,0.6246757876126449,0.2955665024630542,0.41658785672295073,0.3762136040779343,0.37675254586116064,0.3921469866254101,0.39960436689206214,0.4133160379838731
+ 3.4928131416837784,3400,0.2955665024630542,0.9802955665024631,0.9852216748768473,0.9852216748768473,1.0,1.0,0.2955665024630542,0.01108543831680986,0.5408866995073891,0.3430815122152388,0.3795073891625616,0.5284717254168452,0.2477832512315271,0.6414531015782087,0.18715927750410513,0.7108554057352572,0.15123152709359605,0.7608486105173586,0.2955665024630542,0.5146682545697318,0.5148090005232856,0.5148090005232856,0.5149387343823474,0.5149387343823474,0.2955665024630542,0.5653232640836018,0.5490215829809095,0.5746816548365397,0.6088752623686502,0.6319586843696184,0.2955665024630542,0.42325814511721116,0.3777572515958531,0.379739934491265,0.3952484460693154,0.4038152652001957,0.4174841255214519
+ 3.6981519507186857,3600,0.2955665024630542,0.9802955665024631,0.9852216748768473,0.9852216748768473,0.9901477832512315,0.9901477832512315,0.2955665024630542,0.01108543831680986,0.5366995073891626,0.3401346879927814,0.3791133004926109,0.5303037629501949,0.24709359605911332,0.6445982114703962,0.18627257799671593,0.7036829384339114,0.14879310344827584,0.738997347866535,0.2955665024630542,0.5137388288648785,0.513903032477358,0.513903032477358,0.5139499477952093,0.5139499477952093,0.2955665024630542,0.5596611646789449,0.5474847714789869,0.5740588226442057,0.6058874126410968,0.6236804209778385,0.2955665024630542,0.41754379956573634,0.37586975333555905,0.3785935040344039,0.39393548779646775,0.4010175758956046,0.4154350378212818
+ 3.9034907597535935,3800,0.2955665024630542,0.9852216748768473,0.9852216748768473,0.9852216748768473,0.9901477832512315,0.9901477832512315,0.2955665024630542,0.01108543831680986,0.5401477832512316,0.340926305387699,0.38325123152709356,0.5363592788435627,0.2508374384236453,0.6505329619912926,0.1873234811165846,0.7087634262948774,0.1500985221674877,0.7488065881630089,0.2955665024630542,0.5158275923799639,0.5158275923799639,0.5158275923799639,0.5158649113828002,0.5158649113828002,0.2955665024630542,0.5635472720092801,0.5522408390126089,0.579355676813206,0.6098919370239626,0.6293058474265337,0.2955665024630542,0.42161148096751283,0.37893753580848294,0.38273006736355336,0.3971050363944898,0.40481585304347145,0.41893049656927983
+ 4.108829568788501,4000,0.2955665024630542,0.9753694581280788,0.9852216748768473,0.9852216748768473,0.9901477832512315,0.9901477832512315,0.2955665024630542,0.01108543831680986,0.5399014778325123,0.33926725064737134,0.3829556650246305,0.5319613376214742,0.25098522167487686,0.6497082600959269,0.18742200328407224,0.7094703332321319,0.14911330049261085,0.7445597670438818,0.2955665024630542,0.5127296895769795,0.5130763416477695,0.5130763416477695,0.5131188080992728,0.5131188080992728,0.2955665024630542,0.5621043185251402,0.5505636839954736,0.5784375922614946,0.6091764880384499,0.6263384735475871,0.2955665024630542,0.42085554479107096,0.3779379416896035,0.38163165810143573,0.3961646378244818,0.40295816570523324,0.4167002568710484
+ 4.314168377823409,4200,0.2955665024630542,0.9802955665024631,0.9852216748768473,0.9852216748768473,0.9901477832512315,0.9901477832512315,0.2955665024630542,0.01108543831680986,0.5391625615763547,0.3399387209539555,0.3801970443349754,0.5308580040187325,0.2476847290640394,0.6430327898382845,0.18568144499178982,0.7043523082318627,0.14891625615763546,0.7435945575564449,0.2955665024630542,0.5138789918346562,0.5140086262655611,0.5140086262655611,0.5140546646615833,0.5140546646615833,0.2955665024630542,0.5620736680453444,0.5486209217219633,0.5742560822304251,0.6059775924816383,0.6254063201510274,0.2955665024630542,0.42010224651188977,0.37517744419195703,0.3784520844424068,0.3928983602214202,0.40049621656562834,0.4142041780241764
+ 4.519507186858316,4400,0.2955665024630542,0.9852216748768473,0.9852216748768473,0.9901477832512315,0.9901477832512315,0.9901477832512315,0.2955665024630542,0.01108543831680986,0.5438423645320197,0.3450860009022403,0.3827586206896551,0.5334236440941986,0.2493103448275862,0.6498536020861698,0.1868965517241379,0.7091695139240046,0.150320197044335,0.7496224791667186,0.2955665024630542,0.5163441238564384,0.5163441238564384,0.5164370692974646,0.5164370692974646,0.5164370692974646,0.2955665024630542,0.567054203369494,0.5519557348354142,0.5786968752325107,0.6099446866772629,0.6301254755200327,0.2955665024630542,0.4243293426584066,0.37874837593471367,0.3817891460614099,0.39643664920094024,0.40443608704984707,0.4176754500966089
+ 4.724845995893224,4600,0.2955665024630542,0.9852216748768473,0.9901477832512315,0.9901477832512315,0.9901477832512315,0.9901477832512315,0.2955665024630542,0.01108543831680986,0.5406403940886699,0.3424896767056911,0.38216748768472913,0.5329881987535446,0.24970443349753693,0.64863278001433,0.18712643678160917,0.7085620603885778,0.14992610837438422,0.7485450670219227,0.2955665024630542,0.5164545268058354,0.5165841612367403,0.5165841612367403,0.5165841612367403,0.5165841612367403,0.2955665024630542,0.5645228585682827,0.5512955891986082,0.5785579235074741,0.6098517448142098,0.6294320172892106,0.2955665024630542,0.4226075638788329,0.3782003700226031,0.3819826209063663,0.39685072286452894,0.40449418055036124,0.41779880211141207
+ 4.930184804928132,4800,0.2955665024630542,0.9852216748768473,0.9901477832512315,0.9901477832512315,0.9901477832512315,0.9901477832512315,0.2955665024630542,0.01108543831680986,0.5403940886699506,0.3432684453555553,0.38275862068965516,0.5339871522541048,0.2503448275862069,0.6498636280219438,0.187816091954023,0.7100921836539074,0.15027093596059116,0.7513351913056898,0.2955665024630542,0.5164425017655958,0.516559790060224,0.516559790060224,0.516559790060224,0.516559790060224,0.2955665024630542,0.5647628262992046,0.5522057083055792,0.5796033728499559,0.6111851705889818,0.6309313367878393,0.2955665024630542,0.4221760589983628,0.37913413777890953,0.3829298798486122,0.39811624371681004,0.40559711033541546,0.4188841643667456
eval/Information-Retrieval_evaluation_full_en_results.csv ADDED
@@ -0,0 +1,25 @@
+ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@20,cosine-Accuracy@50,cosine-Accuracy@100,cosine-Accuracy@150,cosine-Accuracy@200,cosine-Precision@1,cosine-Recall@1,cosine-Precision@20,cosine-Recall@20,cosine-Precision@50,cosine-Recall@50,cosine-Precision@100,cosine-Recall@100,cosine-Precision@150,cosine-Recall@150,cosine-Precision@200,cosine-Recall@200,cosine-MRR@1,cosine-MRR@20,cosine-MRR@50,cosine-MRR@100,cosine-MRR@150,cosine-MRR@200,cosine-NDCG@1,cosine-NDCG@20,cosine-NDCG@50,cosine-NDCG@100,cosine-NDCG@150,cosine-NDCG@200,cosine-MAP@1,cosine-MAP@20,cosine-MAP@50,cosine-MAP@100,cosine-MAP@150,cosine-MAP@200,cosine-MAP@500
+ 0.2053388090349076,200,0.6476190476190476,0.9809523809523809,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6476190476190476,0.06676436542711181,0.49571428571428566,0.5230697861318131,0.3020952380952381,0.7080259046952858,0.18333333333333332,0.8161311111436004,0.1306031746031746,0.8563260772569234,0.10171428571428572,0.8847564499984696,0.6476190476190476,0.7995238095238094,0.7998765432098764,0.7998765432098764,0.7998765432098764,0.7998765432098764,0.6476190476190476,0.6644717188365947,0.6838731244376804,0.7388666160500343,0.7576507160710139,0.7690741704926698,0.6476190476190476,0.5190094635649438,0.5100636599562821,0.5425074074401441,0.5505730044270032,0.5539698889983905,0.5589203827192654
+ 0.4106776180698152,400,0.6571428571428571,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6571428571428571,0.06749696615971254,0.4919047619047619,0.5275645203591144,0.3024761904761905,0.7212292133682638,0.18390476190476193,0.8283045379034879,0.13047619047619047,0.8695738991278473,0.1010952380952381,0.8925616316092626,0.6571428571428571,0.8082539682539683,0.8082539682539683,0.8082539682539683,0.8082539682539683,0.8082539682539683,0.6571428571428571,0.6744852128542342,0.6977698227605581,0.7525910022322675,0.7708410726215603,0.7802313019837117,0.6571428571428571,0.5320954455063226,0.524953258894126,0.5576471653873537,0.56547526121545,0.56846227050141,0.5737418088844222
+ 0.6160164271047228,600,0.6571428571428571,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6571428571428571,0.06749696615971254,0.49714285714285705,0.5320253566284492,0.30076190476190473,0.7115733472291327,0.18323809523809526,0.8219646215406632,0.13041269841269842,0.8617749760161685,0.10114285714285713,0.8893829886546073,0.6571428571428571,0.804021164021164,0.804021164021164,0.804021164021164,0.804021164021164,0.804021164021164,0.6571428571428571,0.6733931876121629,0.6890144347375734,0.7449674094030438,0.7634521894736199,0.7741481455055373,0.6571428571428571,0.5289650852721914,0.5158905016746862,0.5487511761373439,0.5566661649955296,0.5596172788621484,0.5650001976301273
+ 0.8213552361396304,800,0.6571428571428571,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6571428571428571,0.06749696615971254,0.5038095238095238,0.5359291078896171,0.30742857142857144,0.7231242624344728,0.1865714285714286,0.8287617472898247,0.13365079365079366,0.8786134724132268,0.10380952380952381,0.9055879512169813,0.6571428571428571,0.8044444444444445,0.8044444444444445,0.8044444444444445,0.8044444444444445,0.8044444444444445,0.6571428571428571,0.6785113493414832,0.6975630243306563,0.752028646761221,0.7737844137631992,0.7846443667926732,0.6571428571428571,0.5347106441393371,0.5241837933698414,0.5566961145365878,0.5657821503646497,0.5693382792322864,0.5737149303396668
+ 1.0266940451745379,1000,0.6285714285714286,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6285714285714286,0.06144015747132491,0.48666666666666664,0.5190783328447762,0.2984761904761905,0.7029438415578318,0.18028571428571427,0.8025196996741193,0.12996825396825396,0.8601000074558084,0.10123809523809525,0.8895776654389056,0.6285714285714286,0.7890476190476191,0.7890476190476191,0.7890476190476191,0.7890476190476191,0.7890476190476191,0.6285714285714286,0.6582983813490093,0.6774907717336247,0.7286679443011637,0.7530937110321057,0.7645734207974859,0.6285714285714286,0.5083977868647599,0.499177438310868,0.5288096388957085,0.5382215452187799,0.5417681626822481,0.5470447121881155
+ 1.2320328542094456,1200,0.6476190476190476,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6476190476190476,0.06663116529391168,0.4985714285714285,0.5274043896361065,0.30514285714285716,0.7164112471836823,0.18380952380952384,0.8182171667709184,0.1307936507936508,0.8605021089413554,0.1020952380952381,0.8861966729229761,0.6476190476190476,0.8025510204081632,0.8025510204081632,0.8025510204081632,0.8025510204081632,0.8025510204081632,0.6476190476190476,0.6738060826265202,0.69309074159277,0.7455517121617797,0.764629242110271,0.775831697875017,0.6476190476190476,0.527753461859858,0.5177922810871799,0.5488619677765896,0.5569088592059974,0.5607603829438326,0.5656807907851564
+ 1.4373716632443532,1400,0.6285714285714286,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6285714285714286,0.06578234091567553,0.4871428571428572,0.5262403047082601,0.2958095238095238,0.7041602690604255,0.18057142857142858,0.8067854191731868,0.1286349206349206,0.8502005901248406,0.10004761904761905,0.8781986904521015,0.6285714285714286,0.7821012849584278,0.7821012849584278,0.7821012849584278,0.7821012849584278,0.7821012849584278,0.6285714285714286,0.6600708782204088,0.6772614460634251,0.7310099761248214,0.7502370479349909,0.7613023671882065,0.6285714285714286,0.515099817667692,0.5048262539678839,0.5360694674393071,0.5436433732018198,0.5468719999953328,0.5522736180544501
+ 1.6427104722792607,1600,0.638095238095238,0.9809523809523809,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.638095238095238,0.06645147077459453,0.5057142857142857,0.5408501384859498,0.3011428571428571,0.7039497531282471,0.18323809523809526,0.8149169584215467,0.13149206349206352,0.86154788686459,0.10180952380952381,0.8885557123621743,0.638095238095238,0.7936507936507936,0.7939014202172097,0.7939014202172097,0.7939014202172097,0.7939014202172097,0.638095238095238,0.6798878231277623,0.6864500951469342,0.7420752428957396,0.7633975396525856,0.773568097259661,0.638095238095238,0.5377260892294139,0.5168804370300233,0.5490354997420103,0.5578819157363086,0.5608869398099069,0.5657588110818693
+ 1.8480492813141685,1800,0.6476190476190476,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6476190476190476,0.06663116529391168,0.5,0.5323766774893577,0.3066666666666667,0.7169760400508648,0.18704761904761902,0.8300921519124121,0.13282539682539682,0.8809811090248916,0.10276190476190475,0.9048445770550351,0.6476190476190476,0.7995238095238095,0.7995238095238095,0.7995238095238095,0.7995238095238095,0.7995238095238095,0.6476190476190476,0.6784472205772625,0.6973772696885832,0.7547025889878075,0.775088261543757,0.7844906301669485,0.6476190476190476,0.5385362138763904,0.5285529294613129,0.5617905444617531,0.5698035298215903,0.5727905418678222,0.5782034665907338
+ 2.0544147843942504,2000,0.6571428571428571,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6571428571428571,0.06749696615971254,0.49571428571428566,0.5248643906796471,0.3066666666666667,0.7163373132466915,0.1840952380952381,0.8162248364216093,0.13111111111111112,0.8653175620033385,0.10266666666666666,0.8986935143343762,0.6571428571428571,0.8044557823129251,0.8044557823129251,0.8044557823129251,0.8044557823129251,0.8044557823129251,0.6571428571428571,0.6707185927009709,0.6936505525493303,0.744935413705723,0.7658484281091853,0.7789999142162379,0.6571428571428571,0.5269426054597909,0.5207795539860273,0.5516406780722526,0.5597264759039408,0.5638904989917146,0.5683130781356641
+ 2.259753593429158,2200,0.6476190476190476,0.9809523809523809,0.9809523809523809,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6476190476190476,0.06792880556214018,0.4942857142857143,0.5223081727154428,0.30533333333333335,0.7071800124875403,0.18571428571428575,0.8192659010347552,0.1316190476190476,0.8609020678064689,0.10285714285714287,0.8986091880806178,0.6476190476190476,0.7987301587301588,0.7987301587301588,0.7988702147525676,0.7988702147525676,0.7988702147525676,0.6476190476190476,0.6702040498560261,0.6903324238990569,0.7467613366998026,0.7651024673749862,0.7786806786966142,0.6476190476190476,0.5297896190927607,0.5215145725259477,0.5544983772799095,0.5620768803854702,0.5658887687906468,0.5701289711049591
+ 2.465092402464066,2400,0.6476190476190476,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6476190476190476,0.06693674207007669,0.5000000000000001,0.531774643661011,0.30323809523809525,0.7177254554536393,0.18657142857142858,0.8296161805502853,0.13276190476190475,0.8732880588733515,0.10295238095238096,0.8999175531255579,0.6476190476190476,0.8000840336134455,0.8000840336134455,0.8000840336134455,0.8000840336134455,0.8000840336134455,0.6476190476190476,0.6757164269458221,0.6916378829119919,0.7496877625671391,0.7692463125021095,0.7798422840685559,0.6476190476190476,0.5334199357549213,0.5178117598205277,0.5519994505109528,0.5600923279944481,0.563274220526865,0.5679273137348514
+ 2.6704312114989732,2600,0.6571428571428571,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6571428571428571,0.06749696615971254,0.5057142857142858,0.5415398307419956,0.3057142857142857,0.7196671413604459,0.1859047619047619,0.8306491779436598,0.13295238095238096,0.8742280173088942,0.10295238095238096,0.9017365552841332,0.6571428571428571,0.8062881562881563,0.8062881562881563,0.8062881562881563,0.8062881562881563,0.8062881562881563,0.6571428571428571,0.6840906182081484,0.6982626991632435,0.7548148189484676,0.7750493713075031,0.785402064753857,0.6571428571428571,0.5413044577050904,0.5287752449157558,0.5617169949879072,0.5706202218240289,0.5737976704017311,0.5789570244235601
+ 2.875770020533881,2800,0.6476190476190476,0.9809523809523809,0.9809523809523809,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6476190476190476,0.06693674207007669,0.5052380952380953,0.5441715119455589,0.3017142857142857,0.7127783868145353,0.18504761904761907,0.8237991792231755,0.1316190476190476,0.8666664505414008,0.10180952380952382,0.8866598562166411,0.6476190476190476,0.7931746031746033,0.7931746031746033,0.7933446712018142,0.7933446712018142,0.7933446712018142,0.6476190476190476,0.6812927997182552,0.6907705627858918,0.7481581352199016,0.7674753494080262,0.7760946556972169,0.6476190476190476,0.5381098958879589,0.5223540671062366,0.5554274433066995,0.563553989418442,0.5664785392893349,0.5715937883797771
+ 3.082135523613963,3000,0.6571428571428571,0.9809523809523809,0.9809523809523809,0.9809523809523809,0.9809523809523809,0.9904761904761905,0.6571428571428571,0.06749696615971254,0.5033333333333333,0.5382446964666946,0.30552380952380953,0.7129698365726076,0.1841904761904762,0.815491595950009,0.1300952380952381,0.8518948634351995,0.10085714285714287,0.8767539696447332,0.6571428571428571,0.8011904761904762,0.8011904761904762,0.8011904761904762,0.8011904761904762,0.8012527233115468,0.6571428571428571,0.6793996042024946,0.6934734187651711,0.746086175203451,0.7626434641459138,0.7726525610685311,0.6571428571428571,0.5356659309302372,0.5224042271891631,0.5541274363252567,0.5613348566263541,0.5644152173403065,0.5696868832931363
+ 3.2874743326488707,3200,0.6476190476190476,0.9809523809523809,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6476190476190476,0.06690172806447445,0.49428571428571433,0.5247548068275921,0.3049523809523809,0.7145748926853251,0.18571428571428575,0.8259500749123918,0.13053968253968254,0.8617070221614398,0.10147619047619048,0.8867623718280766,0.6476190476190476,0.7944444444444444,0.7947420634920634,0.7947420634920634,0.7947420634920634,0.7947420634920634,0.6476190476190476,0.6712553739080126,0.6930067882080397,0.7492851647400585,0.7653067105368627,0.7757073400796275,0.6476190476190476,0.5296741628455891,0.5221420866285574,0.5556460060519156,0.5622730928172809,0.5655913186548956,0.5705569252206568
+ 3.4928131416837784,3400,0.6571428571428571,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6571428571428571,0.06749696615971254,0.5023809523809524,0.5367661170448913,0.3055238095238096,0.7187234501313271,0.18704761904761905,0.8306763764733734,0.13282539682539685,0.8734278450876739,0.10323809523809525,0.9038494279152669,0.6571428571428571,0.8119047619047619,0.8119047619047619,0.8119047619047619,0.8119047619047619,0.8119047619047619,0.6571428571428571,0.6807047163205504,0.6967388214413348,0.7543645239608835,0.773505711144443,0.7849671182499111,0.6571428571428571,0.5341199283903049,0.5221550904234853,0.5565500886925563,0.5647214093335157,0.5681730912474996,0.5728454391682805
+ 3.6981519507186857,3600,0.638095238095238,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.638095238095238,0.06603592719867359,0.4976190476190476,0.5326063008275358,0.30533333333333335,0.7183726738054269,0.18571428571428572,0.8244364002982505,0.13212698412698412,0.8687573084198223,0.10247619047619048,0.8932574206684566,0.638095238095238,0.7992063492063493,0.7992063492063493,0.7992063492063493,0.7992063492063493,0.7992063492063493,0.638095238095238,0.6749960374944168,0.6949465477197223,0.7498344486377847,0.7694784080484888,0.7795608604404137,0.638095238095238,0.5282685417143183,0.5203740591721733,0.5531520665437176,0.5614325415507148,0.5646826546106057,0.5694892550206667
+ 3.9034907597535935,3800,0.6571428571428571,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6571428571428571,0.06749696615971254,0.5061904761904761,0.5445546987827694,0.30704761904761907,0.7225037707304278,0.18542857142857141,0.8194792897094019,0.13244444444444445,0.8691531505423884,0.103,0.8956440486016682,0.6571428571428571,0.8037414965986395,0.8037414965986395,0.8037414965986395,0.8037414965986395,0.8037414965986395,0.6571428571428571,0.6832075931194561,0.6982940828341648,0.7500305615955617,0.7715917747038767,0.7824517671036126,0.6571428571428571,0.5386195545614166,0.525776872057281,0.5572584724863334,0.5662576632089249,0.5697800738419977,0.5745197549922925
+ 4.108829568788501,4000,0.6476190476190476,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6476190476190476,0.06690172806447445,0.499047619047619,0.5288155255988508,0.30266666666666664,0.7128731386766649,0.18447619047619046,0.821589853989195,0.13155555555555554,0.8669290529739844,0.10171428571428573,0.8881772271562451,0.6476190476190476,0.7969444444444443,0.7969444444444443,0.7969444444444443,0.7969444444444443,0.7969444444444443,0.6476190476190476,0.6737021289484512,0.6897381539459008,0.7455379155828873,0.7657730626526685,0.7746920852324353,0.6476190476190476,0.5299368408688423,0.5170402457535271,0.549577105065989,0.5580348324082148,0.5609705433942662,0.5664835460503455
+ 4.314168377823409,4200,0.6571428571428571,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6571428571428571,0.06749696615971254,0.5042857142857142,0.5373072040835736,0.30342857142857144,0.7066915041490871,0.18485714285714283,0.8223255763807351,0.13161904761904764,0.8681298207585033,0.1020952380952381,0.8939381871513931,0.6571428571428571,0.8050793650793651,0.8050793650793651,0.8050793650793651,0.8050793650793651,0.8050793650793651,0.6571428571428571,0.6828242233504754,0.6934957075565445,0.7508237653332346,0.7708996755918012,0.7810547976165594,0.6571428571428571,0.5403780248322398,0.5246924299662313,0.5574701928996357,0.5657362210212612,0.5689495406824301,0.5740394717933254
+ 4.519507186858316,4400,0.6571428571428571,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6571428571428571,0.06749696615971254,0.501904761904762,0.5348166179254283,0.30514285714285716,0.7176194992567407,0.18476190476190474,0.8203546241789754,0.13238095238095238,0.8712408549365904,0.10223809523809524,0.8993000584751492,0.6571428571428571,0.8026984126984127,0.8026984126984127,0.8026984126984127,0.8026984126984127,0.8026984126984127,0.6571428571428571,0.6791929962471466,0.6958143211009435,0.7493655431536407,0.7715718645271473,0.7814931000676181,0.6571428571428571,0.5371258373378305,0.5243155763407285,0.5561427452138551,0.5652920456249697,0.5681007357520309,0.5730541345190991
+ 4.724845995893224,4600,0.6476190476190476,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6476190476190476,0.06690172806447445,0.5033333333333333,0.5361893486281004,0.3051428571428572,0.7178301231768206,0.18504761904761904,0.8209713456799689,0.13263492063492063,0.8719838465781551,0.10238095238095238,0.9002628694890553,0.6476190476190476,0.7979365079365079,0.7979365079365079,0.7979365079365079,0.7979365079365079,0.7979365079365079,0.6476190476190476,0.6792043770713534,0.6952356840844034,0.7491776279498115,0.7714889294157944,0.7814307168109694,0.6476190476190476,0.5373325378117988,0.5240005650356997,0.5562356661851569,0.5654875568184526,0.5682618444726486,0.5731371282665402
+ 4.930184804928132,4800,0.6476190476190476,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6476190476190476,0.06690172806447445,0.5061904761904762,0.5391510592522911,0.30647619047619057,0.7199711948587544,0.1858095238095238,0.8253770621157605,0.13250793650793652,0.8719997123512196,0.10247619047619047,0.9006382758109558,0.6476190476190476,0.7999999999999998,0.7999999999999998,0.7999999999999998,0.7999999999999998,0.7999999999999998,0.6476190476190476,0.6822066814233797,0.6975329548006446,0.7519637922809941,0.7724946802449859,0.7827357067553371,0.6476190476190476,0.5391784054866918,0.5258287715484311,0.5580109313638075,0.5665715227835532,0.569529009182472,0.5743595458034346
eval/Information-Retrieval_evaluation_full_es_results.csv ADDED
@@ -0,0 +1,25 @@
+ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@20,cosine-Accuracy@50,cosine-Accuracy@100,cosine-Accuracy@150,cosine-Accuracy@200,cosine-Precision@1,cosine-Recall@1,cosine-Precision@20,cosine-Recall@20,cosine-Precision@50,cosine-Recall@50,cosine-Precision@100,cosine-Recall@100,cosine-Precision@150,cosine-Recall@150,cosine-Precision@200,cosine-Recall@200,cosine-MRR@1,cosine-MRR@20,cosine-MRR@50,cosine-MRR@100,cosine-MRR@150,cosine-MRR@200,cosine-NDCG@1,cosine-NDCG@20,cosine-NDCG@50,cosine-NDCG@100,cosine-NDCG@150,cosine-NDCG@200,cosine-MAP@1,cosine-MAP@20,cosine-MAP@50,cosine-MAP@100,cosine-MAP@150,cosine-MAP@200,cosine-MAP@500
+ 0.2053388090349076,200,0.12432432432432433,1.0,1.0,1.0,1.0,1.0,0.12432432432432433,0.0036027986426148927,0.5548648648648649,0.37393347961139234,0.37632432432432433,0.5448061409296977,0.24529729729729732,0.6577711963066953,0.18699099099099098,0.7238313233550039,0.14954054054054053,0.7639027591391713,0.12432432432432433,0.5563063063063063,0.5563063063063063,0.5563063063063063,0.5563063063063063,0.5563063063063063,0.12432432432432433,0.596051015946299,0.5711547957546739,0.5997522698105078,0.6339018154199011,0.6530423947558011,0.12432432432432433,0.45972243774251287,0.4076240020338008,0.4144683877194271,0.43035403315296905,0.4378112228742522,0.4501903666018089
+ 0.4106776180698152,400,0.12432432432432433,1.0,1.0,1.0,1.0,1.0,0.12432432432432433,0.0035547842888501878,0.5756756756756757,0.382834454265531,0.3876756756756757,0.5650401570341179,0.24929729729729733,0.6748166280760668,0.1876036036036036,0.7332569271668431,0.15005405405405403,0.7715626882580108,0.12432432432432433,0.5540540540540542,0.5540540540540542,0.5540540540540542,0.5540540540540542,0.5540540540540542,0.12432432432432433,0.6148924512961366,0.5898418519056956,0.6154202525561722,0.6457616366909722,0.6642589555117913,0.12432432432432433,0.4862878224538023,0.4329777882377841,0.43589976811267217,0.4503550290841326,0.45768065331543906,0.4700308817344323
+ 0.6160164271047228,600,0.11351351351351352,1.0,1.0,1.0,1.0,1.0,0.11351351351351352,0.0029430925006878115,0.5575675675675675,0.37690250933096175,0.38129729729729733,0.5549476206136158,0.24691891891891893,0.6647752045410573,0.18587387387387388,0.7250893778157846,0.14824324324324323,0.7627104953940022,0.11351351351351352,0.5509009009009008,0.5509009009009008,0.5509009009009008,0.5509009009009008,0.5509009009009008,0.11351351351351352,0.6023728216114188,0.578976868549403,0.6058097622442319,0.6368206734394307,0.654916091513035,0.11351351351351352,0.47196928985453857,0.4190229638109246,0.4238406948205412,0.43842290397754335,0.44522365472659053,0.4575807266993981
+ 0.8213552361396304,800,0.12432432432432433,1.0,1.0,1.0,1.0,1.0,0.12432432432432433,0.0036542148230633313,0.5681081081081081,0.3739268011473366,0.38735135135135135,0.552712474827668,0.25497297297297294,0.6745275771791883,0.19264864864864864,0.7433637986150478,0.15378378378378377,0.7809900218019078,0.12432432432432433,0.5567567567567567,0.5567567567567567,0.5567567567567567,0.5567567567567567,0.5567567567567567,0.12432432432432433,0.608238532915008,0.5816440615734259,0.6145250012152813,0.6486016184837509,0.667376641912895,0.12432432432432433,0.4759157265536747,0.4198620973885256,0.4267844411804261,0.4426247510389689,0.4501342227928204,0.4620481417833252
+ 1.0266940451745379,1000,0.12432432432432433,1.0,1.0,1.0,1.0,1.0,0.12432432432432433,0.0036138931714884822,0.5424324324324324,0.36915785507009463,0.36540540540540545,0.5330377203448686,0.23972972972972972,0.6494552171809718,0.17913513513513513,0.7037904487423663,0.1442972972972973,0.7429741803759865,0.12432432432432433,0.5558558558558558,0.5558558558558558,0.5558558558558558,0.5558558558558558,0.5558558558558558,0.12432432432432433,0.5884880702119168,0.5608420127224972,0.591450400755495,0.6194431599763025,0.6390949481425205,0.12432432432432433,0.4475293976487679,0.3945138762422232,0.402000090021349,0.4143868824911826,0.4219185032204542,0.43574984797470584
+ 1.2320328542094456,1200,0.11891891891891893,1.0,1.0,1.0,1.0,1.0,0.11891891891891893,0.0035436931012884127,0.5624324324324325,0.3749287656821326,0.3818378378378379,0.5499914000928026,0.24989189189189193,0.670535096286274,0.185981981981982,0.7248554774311073,0.14921621621621622,0.7676635974920542,0.11891891891891893,0.554954954954955,0.554954954954955,0.554954954954955,0.554954954954955,0.554954954954955,0.11891891891891893,0.6058464275961338,0.5784986623377951,0.6101173023146221,0.6379964576353052,0.6582237993778991,0.11891891891891893,0.4714134240726968,0.4150062826950716,0.4218574244695916,0.4351613303786019,0.4427513871929523,0.45476834946759676
+ 1.4373716632443532,1400,0.11351351351351352,1.0,1.0,1.0,1.0,1.0,0.11351351351351352,0.0034454146142631225,0.5656756756756757,0.3742168151709032,0.3827027027027028,0.555525970528024,0.2504864864864865,0.6739370924160176,0.1872792792792793,0.7323421060617122,0.1489189189189189,0.765891725172557,0.11351351351351352,0.5527027027027027,0.5527027027027027,0.5527027027027027,0.5527027027027027,0.5527027027027027,0.11351351351351352,0.6070452213140873,0.5809871810405777,0.6114591008114354,0.6412451347199153,0.6578418001807836,0.11351351351351352,0.47438405107732806,0.41829205227926136,0.42574544531206737,0.43940862571315725,0.4459089793048117,0.4577045117057749
+ 1.6427104722792607,1600,0.11891891891891893,1.0,1.0,1.0,1.0,1.0,0.11891891891891893,0.0030225837566496554,0.5764864864864865,0.37886723443485576,0.3934054054054054,0.5651151201192979,0.2538918918918919,0.6805248552916797,0.18882882882882884,0.7388426125145136,0.15143243243243243,0.7770302047645695,0.11891891891891893,0.5563063063063063,0.5563063063063063,0.5563063063063063,0.5563063063063063,0.5563063063063063,0.11891891891891893,0.6163969831748072,0.5918130760666188,0.618913882136453,0.6479860688044038,0.6672543301166167,0.11891891891891893,0.4835466233061855,0.43057057237031016,0.43514514508198443,0.4483722754502125,0.45620250646424887,0.46814889290522227
+ 1.8480492813141685,1800,0.11351351351351352,1.0,1.0,1.0,1.0,1.0,0.11351351351351352,0.0034361234285013482,0.572972972972973,0.37921640068608053,0.3898378378378379,0.571918445801414,0.25675675675675674,0.6909194666143896,0.19102702702702704,0.7527590970685212,0.1516216216216216,0.7872025933931285,0.11351351351351352,0.5527027027027027,0.5527027027027027,0.5527027027027027,0.5527027027027027,0.5527027027027027,0.11351351351351352,0.6162669484275393,0.5939669961906182,0.6265400328183519,0.6566777550930192,0.6732652519983943,0.11351351351351352,0.487532737215657,0.4328608208494426,0.44154149744473514,0.455705797416168,0.46223288702755577,0.4744677922669752
+ 2.0544147843942504,2000,0.11351351351351352,1.0,1.0,1.0,1.0,1.0,0.11351351351351352,0.0034361234285013482,0.5678378378378378,0.37185149609377927,0.3868108108108108,0.5538880192680598,0.25243243243243246,0.6692573058813261,0.18879279279279282,0.7328414715180762,0.15116216216216216,0.7762160560418868,0.11351351351351352,0.5531531531531532,0.5531531531531532,0.5531531531531532,0.5531531531531532,0.5531531531531532,0.11351351351351352,0.6103530658776057,0.5835888373735475,0.6124810389467954,0.643715197655626,0.6639184386289334,0.11351351351351352,0.47642760777088633,0.42112234095594264,0.42724796018693306,0.4409014199272696,0.4484513088852489,0.46005419810682735
+ 2.259753593429158,2200,0.11891891891891893,1.0,1.0,1.0,1.0,1.0,0.11891891891891893,0.0035747235671014874,0.5640540540540541,0.3692527069818092,0.3856216216216216,0.5486859743996353,0.2534054054054054,0.665182622715675,0.18922522522522528,0.7243768092155874,0.15148648648648647,0.7623039454108149,0.11891891891891893,0.554954954954955,0.554954954954955,0.554954954954955,0.554954954954955,0.554954954954955,0.11891891891891893,0.6085810490710928,0.5824768918703372,0.6124499133410561,0.6422929562996472,0.6613618089888628,0.11891891891891893,0.4760592399408784,0.42221925316942405,0.428186711292096,0.44203885893977524,0.44989811413329556,0.46196549575756446
+ 2.465092402464066,2400,0.11891891891891893,1.0,1.0,1.0,1.0,1.0,0.11891891891891893,0.0035156146844631925,0.5724324324324324,0.37436809187892894,0.3876756756756757,0.5607912974425261,0.25308108108108107,0.6794887692385309,0.18774774774774775,0.7337135298631903,0.14832432432432432,0.7662511890360664,0.11891891891891893,0.5554054054054054,0.5554054054054054,0.5554054054054054,0.5554054054054054,0.5554054054054054,0.11891891891891893,0.614398583810333,0.587784861271955,0.6181979077059248,0.6459355616704676,0.6615088166184556,0.11891891891891893,0.4846625158570983,0.4284460975553748,0.434750492464762,0.44775583229060273,0.4536965841404994,0.46605913889080786
+ 2.6704312114989732,2600,0.12432432432432433,1.0,1.0,1.0,1.0,1.0,0.12432432432432433,0.0036542148230633313,0.5708108108108108,0.3752213866634721,0.3917837837837838,0.5671935219645406,0.2547027027027027,0.6863661078007391,0.18904504504504502,0.7406537741279986,0.1497027027027027,0.7729920499268153,0.12432432432432433,0.5585585585585586,0.5585585585585586,0.5585585585585586,0.5585585585585586,0.5585585585585586,0.12432432432432433,0.6141190868607808,0.5924255342702686,0.6227567200955056,0.6504326010497111,0.666333989180959,0.12432432432432433,0.48333688172086503,0.43102364277829264,0.4366067786223888,0.449770669745084,0.45621710772733703,0.46895544676889367
+ 2.875770020533881,2800,0.11891891891891893,1.0,1.0,1.0,1.0,1.0,0.11891891891891893,0.0035156146844631925,0.5724324324324324,0.37783496582445003,0.3918918918918919,0.5658814625076302,0.2532432432432432,0.6783520641778591,0.1885405405405405,0.7353111064933018,0.15105405405405406,0.7757218423184007,0.11891891891891893,0.554954954954955,0.554954954954955,0.554954954954955,0.554954954954955,0.554954954954955,0.11891891891891893,0.6150653647357389,0.5923840259698727,0.61973316310359,0.6484934654942248,0.6680592351177694,0.11891891891891893,0.48397038118160446,0.43217201433344876,0.43707131980336056,0.45015269768007266,0.45787598364964693,0.47020161913512704
+ 3.082135523613963,3000,0.12432432432432433,1.0,1.0,1.0,1.0,1.0,0.12432432432432433,0.0036542148230633313,0.5645945945945945,0.37080506242115724,0.3874594594594594,0.5561474933972658,0.25243243243243246,0.6759767696731587,0.18634234234234232,0.7251868631906863,0.149,0.7643468756129986,0.12432432432432433,0.5590090090090091,0.5590090090090091,0.5590090090090091,0.5590090090090091,0.5590090090090091,0.12432432432432433,0.6089522913829681,0.5856953890482939,0.6162051340957286,0.6416632755009771,0.6604336236281284,0.12432432432432433,0.4769670282999621,0.4235650752914417,0.42968852383695133,0.4419704441008062,0.44914583795545937,0.46139208886574296
+ 3.2874743326488707,3200,0.12432432432432433,1.0,1.0,1.0,1.0,1.0,0.12432432432432433,0.0036542148230633313,0.5594594594594594,0.3674330812698951,0.3843243243243244,0.550758758139416,0.25,0.6669881337125826,0.188036036036036,0.7275998257204263,0.15097297297297296,0.7706610125733176,0.12432432432432433,0.5585585585585586,0.5585585585585586,0.5585585585585586,0.5585585585585586,0.5585585585585586,0.12432432432432433,0.6050644478934001,0.5809058589833067,0.6098925225338429,0.6410014736361206,0.6615173527461848,0.12432432432432433,0.4741499067823761,0.4197953347844919,0.42446157415371616,0.43915567960176516,0.446983283921267,0.45887291000491937
+ 3.4928131416837784,3400,0.12432432432432433,1.0,1.0,1.0,1.0,1.0,0.12432432432432433,0.0036542148230633313,0.5662162162162162,0.3741936725511336,0.38345945945945953,0.551626438985741,0.25005405405405406,0.668281602495888,0.18724324324324323,0.7255230483983959,0.1496756756756757,0.7629690702109315,0.12432432432432433,0.5576576576576577,0.5576576576576577,0.5576576576576577,0.5576576576576577,0.5576576576576577,0.12432432432432433,0.610016238780624,0.5808878958542367,0.61028319858239,0.6399007501727882,0.6583217187592327,0.12432432432432433,0.4766824977456947,0.41900574983934114,0.42385943807093396,0.437790582370841,0.4452542989816324,0.45812176180968966
+ 3.6981519507186857,3600,0.11891891891891893,1.0,1.0,1.0,1.0,1.0,0.11891891891891893,0.0035156146844631925,0.5637837837837838,0.3740205970563281,0.383027027027027,0.552763470548829,0.24702702702702703,0.661194930340044,0.1848648648648649,0.7190931400305525,0.1478918918918919,0.7552141114312797,0.11891891891891893,0.5563063063063063,0.5563063063063063,0.5563063063063063,0.5563063063063063,0.5563063063063063,0.11891891891891893,0.6092151142192643,0.5815862902131917,0.6075151607798029,0.6370202345538115,0.6552303680059667,0.11891891891891893,0.4768679050882057,0.41990189653561893,0.4234842769998353,0.4372747294973208,0.44450918997226513,0.4577896323717848
+ 3.9034907597535935,3800,0.11891891891891893,1.0,1.0,1.0,1.0,1.0,0.11891891891891893,0.0035840147528632613,0.5743243243243243,0.38329375101630103,0.3890810810810812,0.5599611558625714,0.25356756756756754,0.676934673521209,0.1891891891891892,0.7320629347542283,0.1504864864864865,0.7653343962902701,0.11891891891891893,0.555855855855856,0.555855855855856,0.555855855855856,0.555855855855856,0.555855855855856,0.11891891891891893,0.61860626881223,0.5891796950902745,0.6187842301547685,0.6474658086247029,0.6641503890510578,0.11891891891891893,0.4869387635579613,0.42988037593606077,0.4348527028748626,0.4485959807276641,0.4553843587655029,0.4682285808574658
+ 4.108829568788501,4000,0.12432432432432433,1.0,1.0,1.0,1.0,1.0,0.12432432432432433,0.0036542148230633313,0.5718918918918918,0.3813088657975513,0.38832432432432434,0.5589819018381946,0.25135135135135134,0.6712879484837694,0.1886486486486487,0.7296378671854172,0.15083783783783786,0.7646529145750729,0.12432432432432433,0.5585585585585586,0.5585585585585586,0.5585585585585586,0.5585585585585586,0.5585585585585586,0.12432432432432433,0.6162786673767947,0.5875500387824142,0.6146487956773306,0.6449661586574366,0.6628313427507618,0.12432432432432433,0.4830935685993706,0.4268637780839156,0.43032040469750343,0.4449589410699155,0.4523102942291434,0.4643631946508736
+ 4.314168377823409,4200,0.11351351351351352,1.0,1.0,1.0,1.0,1.0,0.11351351351351352,0.0035155918996302815,0.5678378378378378,0.37836142042267473,0.38616216216216215,0.5571586783455559,0.24956756756756757,0.6675392853403386,0.18836036036036036,0.7304539075934318,0.14981081081081082,0.762368065923207,0.11351351351351352,0.5536036036036036,0.5536036036036036,0.5536036036036036,0.5536036036036036,0.5536036036036036,0.11351351351351352,0.6138712223781554,0.5860105244597086,0.612222606218991,0.6445206608822607,0.6607643472995034,0.11351351351351352,0.48205571119205054,0.426066001253444,0.4286297248227863,0.44367730975701125,0.45055470203697434,0.4632014183024849
+ 4.519507186858316,4400,0.10810810810810811,1.0,1.0,1.0,1.0,1.0,0.10810810810810811,0.0033677005752683685,0.5667567567567569,0.3790230473715137,0.3877837837837838,0.5587328778405388,0.25156756756756754,0.670664457795493,0.18954954954954953,0.7335635895457856,0.15067567567567566,0.766278425246947,0.10810810810810811,0.5509009009009009,0.5509009009009009,0.5509009009009009,0.5509009009009009,0.5509009009009009,0.10810810810810811,0.613008635976177,0.5878242736285791,0.6148703843706662,0.6471060871986968,0.6634453873788777,0.10810810810810811,0.48105434805966624,0.42917908716630376,0.4322285035959748,0.4473320611795549,0.45413116686066823,0.4666628908850396
+ 4.724845995893224,4600,0.11351351351351352,1.0,1.0,1.0,1.0,1.0,0.11351351351351352,0.0034454146142631225,0.5654054054054054,0.37952988347964417,0.3897297297297298,0.5627963647085502,0.25324324324324327,0.6735817765534955,0.1901981981981982,0.735181694329396,0.15102702702702703,0.7691645515769362,0.11351351351351352,0.5531531531531532,0.5531531531531532,0.5531531531531532,0.5531531531531532,0.5531531531531532,0.11351351351351352,0.6127764701851742,0.5903129713737418,0.6173381508468064,0.6486192970671256,0.6654238285942606,0.11351351351351352,0.47962787853124583,0.43007675118643823,0.4338635407926422,0.4486007040723234,0.4552653077040697,0.46787303947870823
+ 4.930184804928132,4800,0.11351351351351352,1.0,1.0,1.0,1.0,1.0,0.11351351351351352,0.0035155918996302815,0.5667567567567567,0.37958552840441906,0.3902702702702703,0.5635730197468752,0.25254054054054054,0.672698242387141,0.19005405405405407,0.7360036980055802,0.1507837837837838,0.7697561816436992,0.11351351351351352,0.5536036036036036,0.5536036036036036,0.5536036036036036,0.5536036036036036,0.5536036036036036,0.11351351351351352,0.6136401766234348,0.5908459924766464,0.6168063266629416,0.6488575731321932,0.665316090087272,0.11351351351351352,0.48095830339282386,0.43038606337879926,0.4335284717646407,0.44851036812148526,0.4550924585301385,0.4677023132311536
eval/Information-Retrieval_evaluation_full_zh_results.csv ADDED
@@ -0,0 +1,25 @@
+ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@20,cosine-Accuracy@50,cosine-Accuracy@100,cosine-Accuracy@150,cosine-Accuracy@200,cosine-Precision@1,cosine-Recall@1,cosine-Precision@20,cosine-Recall@20,cosine-Precision@50,cosine-Recall@50,cosine-Precision@100,cosine-Recall@100,cosine-Precision@150,cosine-Recall@150,cosine-Precision@200,cosine-Recall@200,cosine-MRR@1,cosine-MRR@20,cosine-MRR@50,cosine-MRR@100,cosine-MRR@150,cosine-MRR@200,cosine-NDCG@1,cosine-NDCG@20,cosine-NDCG@50,cosine-NDCG@100,cosine-NDCG@150,cosine-NDCG@200,cosine-MAP@1,cosine-MAP@20,cosine-MAP@50,cosine-MAP@100,cosine-MAP@150,cosine-MAP@200,cosine-MAP@500
+ 0.2053388090349076,200,0.6699029126213593,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6699029126213593,0.06671174708572336,0.4742718446601942,0.5142070210631146,0.2860194174757282,0.6863598298644913,0.1754368932038835,0.7993247827308323,0.12563106796116508,0.8458435112454522,0.09844660194174758,0.881320984737063,0.6699029126213593,0.8074433656957929,0.8074433656957929,0.8074433656957929,0.8074433656957929,0.8074433656957929,0.6699029126213593,0.648079404872881,0.663139150698011,0.7175582475051152,0.7381391111139272,0.7516637589878762,0.6699029126213593,0.49492427828808666,0.48064885783518935,0.5100509113474142,0.5182886670429253,0.5223226571381689,0.5273218280377672
+ 0.4106776180698152,400,0.6504854368932039,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6504854368932039,0.06521498333167806,0.479611650485437,0.5172505716152186,0.28563106796116505,0.6863774860254501,0.1750485436893204,0.7921143014857813,0.12569579288025892,0.8412594813538471,0.09864077669902913,0.8755105061006315,0.6504854368932039,0.8002353633421596,0.8002353633421596,0.8002353633421596,0.8002353633421596,0.8002353633421596,0.6504854368932039,0.654082205796757,0.6649804893012102,0.7180956805258507,0.7397271243264008,0.7530965410267277,0.6504854368932039,0.5061559900753001,0.48897781404852814,0.5189869115446097,0.5274143437737836,0.5314938068513844,0.5366301953809629
+ 0.6160164271047228,600,0.6893203883495146,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6893203883495146,0.06960420120652665,0.4849514563106796,0.5185220682418866,0.2871844660194175,0.6820267359031588,0.17466019417475726,0.7938699037700918,0.1259546925566343,0.8422246695562658,0.09839805825242716,0.8693104263172221,0.6893203883495146,0.8145476961010943,0.8145476961010943,0.8145476961010943,0.8145476961010943,0.8145476961010943,0.6893203883495146,0.659665404664246,0.667608886060256,0.720849740491483,0.7428557294670357,0.7542074266139583,0.6893203883495146,0.5114816208613794,0.4919730518674867,0.5203652900057457,0.5295071239880426,0.5331560525760697,0.5390539624315537
+ 0.8213552361396304,800,0.6699029126213593,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6699029126213593,0.0668914656268579,0.4868932038834951,0.517797105890383,0.2899029126213593,0.6889084013960686,0.17766990291262136,0.8016408450159429,0.127831715210356,0.8495235872979315,0.09975728155339804,0.8839842970578673,0.6699029126213593,0.8195792880258899,0.8195792880258899,0.8195792880258899,0.8195792880258899,0.8195792880258899,0.6699029126213593,0.6650214094282284,0.6756802162850395,0.7304632544230837,0.7522071238106548,0.7648271824008246,0.6699029126213593,0.5193221095704731,0.5000757475814764,0.5301609751242916,0.5393944328771004,0.5431877788909137,0.5482298957592407
+ 1.0266940451745379,1000,0.6504854368932039,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6504854368932039,0.06374263327838452,0.470873786407767,0.5056869268958247,0.286990291262136,0.6867599386776999,0.17592233009708733,0.796513632701893,0.12608414239482202,0.8445138135248258,0.09864077669902911,0.8779531282385074,0.6504854368932039,0.8011368890009668,0.8011368890009668,0.8011368890009668,0.8011368890009668,0.8011368890009668,0.6504854368932039,0.6451481379204927,0.6626734556154236,0.716730883888469,0.7375095957525313,0.7503178257668185,0.6504854368932039,0.4943176710897972,0.47975263631803416,0.5089284276437599,0.5173843595469786,0.5212942519694627,0.5266903756888776
+ 1.2320328542094456,1200,0.6601941747572816,0.9805825242718447,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6601941747572816,0.06637208311657224,0.4699029126213593,0.5114473522410131,0.2856310679611651,0.6861205543352007,0.1755339805825243,0.7957104293648646,0.12627831715210358,0.8416539993715157,0.09873786407766992,0.8722319477434431,0.6601941747572816,0.8069579288025891,0.8073462783171521,0.8073462783171521,0.8073462783171521,0.8073462783171521,0.6601941747572816,0.647509569845552,0.6636256314835147,0.7180603170049379,0.7392496814142943,0.75142384485359,0.6601941747572816,0.4950904670779875,0.4804146521645792,0.5106244340822919,0.5194067842110492,0.5231799499801336,0.5286305942432927
+ 1.4373716632443532,1400,0.6699029126213593,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6699029126213593,0.06845190151053764,0.46504854368932036,0.5090781814370999,0.2788349514563107,0.6741253535510389,0.16980582524271848,0.7781267983557943,0.12265372168284792,0.8296475876251914,0.09699029126213593,0.875403335462795,0.6699029126213593,0.8131323454266736,0.8131323454266736,0.8131323454266736,0.8131323454266736,0.8131323454266736,0.6699029126213593,0.6485416287417809,0.6602604068397879,0.7110678268293067,0.7338219895333261,0.7505220515399763,0.6699029126213593,0.4991554790801001,0.4816709768736872,0.5086830461071907,0.5171809806531933,0.521654510221628,0.527305595260522
+ 1.6427104722792607,1600,0.6699029126213593,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6699029126213593,0.06750238960810089,0.47135922330097085,0.5090633306477279,0.28291262135922335,0.6773859773072358,0.17368932038834953,0.7904884245188994,0.12524271844660195,0.8394265084209854,0.09844660194174759,0.8740181776046768,0.6699029126213593,0.8153182308522116,0.8153182308522116,0.8153182308522116,0.8153182308522116,0.8153182308522116,0.6699029126213593,0.6534719681577705,0.6651874764061042,0.7197783891114586,0.7419398665127896,0.7555484961332182,0.6699029126213593,0.5082004969596765,0.48806690840439837,0.5171547641982063,0.5261477521963548,0.530412300347724,0.5360212712713517
+ 1.8480492813141685,1800,0.6796116504854369,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6796116504854369,0.06879513971785352,0.4800970873786408,0.5241454775803328,0.28873786407766994,0.694383316835914,0.17621359223300972,0.8022764348101481,0.12699029126213593,0.8587255710530268,0.09893203883495146,0.8883109835900415,0.6796116504854369,0.8219442369927806,0.8219442369927806,0.8219442369927806,0.8219442369927806,0.8219442369927806,0.6796116504854369,0.6630416418080642,0.6757325970656028,0.7290099068409519,0.7527782597412692,0.7641823675703906,0.6796116504854369,0.5157348037496887,0.49801642995704726,0.5277748138942802,0.5368859735886921,0.5404313137147164,0.5459166655079852
+ 2.0544147843942504,2000,0.6796116504854369,0.9805825242718447,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6796116504854369,0.06879513971785352,0.46650485436893213,0.5000069747494543,0.27766990291262134,0.6674092810907462,0.17145631067961165,0.7736174616385345,0.12517799352750808,0.8418748050270234,0.09766990291262137,0.8709889407310529,0.6796116504854369,0.816747572815534,0.8171071556993887,0.8171071556993887,0.8171071556993887,0.8171071556993887,0.6796116504854369,0.6467352591193695,0.6572518155079017,0.7104096432240307,0.7383498360811994,0.7499553401377297,0.6796116504854369,0.4979214480710345,0.47815811455043,0.5062347709725405,0.5167085124573587,0.5202063155939766,0.525956171993899
+ 2.259753593429158,2200,0.6796116504854369,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6796116504854369,0.06427555485009323,0.48252427184466024,0.5105098495134462,0.28291262135922335,0.6740327269698048,0.17427184466019413,0.7916158662499747,0.12627831715210355,0.8502903346942817,0.09907766990291259,0.8819321390743221,0.6796116504854369,0.8156957928802591,0.8156957928802591,0.8156957928802591,0.8156957928802591,0.8156957928802591,0.6796116504854369,0.6544573773824907,0.659656525966717,0.7161060810083814,0.7408464936274834,0.7536586007294086,0.6796116504854369,0.5067156996774812,0.4825104584106149,0.5123095179131745,0.5218353225379503,0.525919053774772,0.5308008384230012
+ 2.465092402464066,2400,0.6601941747572816,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6601941747572816,0.0656219892755845,0.47669902912621354,0.5086828676301197,0.28291262135922335,0.677501932151629,0.17233009708737865,0.7889958319778536,0.12433656957928804,0.8386075001491912,0.09825242718446603,0.8707818194134918,0.6601941747572816,0.8099441012062373,0.8099441012062373,0.8099441012062373,0.8099441012062373,0.8099441012062373,0.6601941747572816,0.651624834884448,0.6609810560494164,0.7137401918968503,0.7356861834214455,0.7496985732081209,0.6601941747572816,0.503364312848485,0.4824661562629566,0.5095601428062084,0.518338918822531,0.52293288163652,0.5280588344576235
+ 2.6704312114989732,2600,0.6601941747572816,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6601941747572816,0.06379186622901371,0.47572815533980595,0.5117982776509837,0.283495145631068,0.6788228939168225,0.17398058252427187,0.7966221055950995,0.12614886731391586,0.8514400443851018,0.09849514563106797,0.8809753383516714,0.6601941747572816,0.8069579288025891,0.8069579288025891,0.8069579288025891,0.8069579288025891,0.8069579288025891,0.6601941747572816,0.6535457150911655,0.6645220079912586,0.7201432897215181,0.7442771799546066,0.7559927318742949,0.6601941747572816,0.5082341741604441,0.4900222998545032,0.5187162145425247,0.5282918154973173,0.5319442187891804,0.5371054053033005
+ 2.875770020533881,2800,0.6699029126213593,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6699029126213593,0.06177481933964896,0.4762135922330099,0.5156692617843701,0.2821359223300971,0.6739343322534869,0.17281553398058253,0.7847359108139387,0.124789644012945,0.8408096405846606,0.09810679611650486,0.877189142919609,0.6699029126213593,0.8107335490830637,0.8107335490830637,0.8107335490830637,0.8107335490830637,0.8107335490830637,0.6699029126213593,0.6532911255449996,0.6599833534783301,0.7140915902480576,0.737466009847695,0.7516880715450132,0.6699029126213593,0.5043376858299589,0.4839297775403755,0.5126442225957247,0.521842066301839,0.5260474913184744,0.5312782096529248
+ 3.082135523613963,3000,0.6504854368932039,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6504854368932039,0.06048032095777195,0.46116504854368934,0.4984933699479645,0.2782524271844661,0.6673119340409225,0.1722330097087379,0.7893226216525688,0.12317152103559871,0.8360561655224406,0.0966019417475728,0.8650657671672598,0.6504854368932039,0.801294498381877,0.801294498381877,0.801294498381877,0.801294498381877,0.801294498381877,0.6504854368932039,0.6338507126018751,0.6468993502401953,0.7045840170763786,0.7247096420308125,0.7369553723592752,0.6504854368932039,0.48308194143446237,0.4647426182330597,0.4947205147151672,0.5025556665390721,0.50649653015356,0.5118461790308642
+ 3.2874743326488707,3200,0.6310679611650486,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6310679611650486,0.06208903273843333,0.4655339805825242,0.5012734237696118,0.27980582524271846,0.6678242686279954,0.1722330097087379,0.7916566922646293,0.12459546925566344,0.8430812690717603,0.09747572815533981,0.8690444100263183,0.6310679611650486,0.7919829361576934,0.7919829361576934,0.7919829361576934,0.7919829361576934,0.7919829361576934,0.6310679611650486,0.6376414058708825,0.6501283883746707,0.7075088983840053,0.7304832241479023,0.7417044925821275,0.6310679611650486,0.4863824590886396,0.46897794772956386,0.49844081902636395,0.5074293237920947,0.5111472070326477,0.516346162232267
+ 3.4928131416837784,3400,0.6407766990291263,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6407766990291263,0.05953636118437481,0.46893203883495155,0.5017631710801468,0.28233009708737866,0.6778470334920769,0.17213592233009703,0.7813486392107438,0.12381877022653723,0.8300033012337814,0.09762135922330097,0.8719061229259629,0.6407766990291263,0.7988673139158576,0.7988673139158576,0.7988673139158576,0.7988673139158576,0.7988673139158576,0.6407766990291263,0.643953131132988,0.6572787342898772,0.7087934468117895,0.7302106705735004,0.7458202627345809,0.6407766990291263,0.49589359848244985,0.47571155025268164,0.5034700410179739,0.5120237790030302,0.516459676501696,0.5216694248286186
+ 3.6981519507186857,3600,0.6796116504854369,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6796116504854369,0.06427555485009323,0.4679611650485436,0.5015347280469608,0.2766990291262136,0.6668441234056627,0.17009708737864077,0.7722986999037393,0.12213592233009711,0.8233244141080392,0.09592233009708741,0.8580638334327066,0.6796116504854369,0.8174988441978734,0.8174988441978734,0.8174988441978734,0.8174988441978734,0.8174988441978734,0.6796116504854369,0.6454328728754617,0.653326555155644,0.7059748890018152,0.727594687489884,0.740940522430805,0.6796116504854369,0.4969276045833893,0.47391172043684465,0.5021526210055403,0.5103517821061276,0.5143246519745102,0.519766002204848
+ 3.9034907597535935,3800,0.6699029126213593,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6699029126213593,0.06540824093423561,0.4733009708737864,0.5085472320149603,0.28019417475728153,0.6703958761573777,0.17339805825242718,0.7889616470601033,0.12453074433656959,0.839160692178304,0.09781553398058251,0.8682906828740438,0.6699029126213593,0.8127831715210356,0.8127831715210356,0.8127831715210356,0.8127831715210356,0.8127831715210356,0.6699029126213593,0.6513608069160104,0.6588660943900457,0.7162166494548273,0.7379060319726523,0.7504435328731304,0.6699029126213593,0.5069278937534705,0.48429584478711935,0.5144657628868865,0.5229907031303573,0.5271575634144027,0.5321097733743518
+ 4.108829568788501,4000,0.6407766990291263,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6407766990291263,0.05744396078263393,0.46504854368932047,0.4978573021507442,0.27611650485436895,0.6611813069264482,0.17097087378640777,0.7796553453979224,0.12291262135922332,0.8271677009796732,0.0969417475728155,0.8637730394316714,0.6407766990291263,0.7983818770226538,0.7983818770226538,0.7983818770226538,0.7983818770226538,0.7983818770226538,0.6407766990291263,0.6374339653798218,0.6458466090741598,0.7026844413104963,0.7238302410564206,0.7383757321568225,0.6407766990291263,0.4902515378001179,0.46828607843970593,0.49742002930709256,0.5055517135202557,0.5100267276205871,0.5152273086702759
+ 4.314168377823409,4200,0.6601941747572816,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6601941747572816,0.06391645269201905,0.46990291262135936,0.5028687618433456,0.2766990291262136,0.6651242597088418,0.17145631067961165,0.7783273755437382,0.12381877022653723,0.8334866166756513,0.09747572815533984,0.8666706510858552,0.6601941747572816,0.8101941747572816,0.8101941747572816,0.8101941747572816,0.8101941747572816,0.8101941747572816,0.6601941747572816,0.6467729312304265,0.6531754449097694,0.7091690247935931,0.7326072552384693,0.7462718534326636,0.6601941747572816,0.5008318658399892,0.47687535367801903,0.506399482523297,0.515344178164581,0.5196266745217748,0.5245537410408139
+ 4.519507186858316,4400,0.6601941747572816,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6601941747572816,0.06553457566936532,0.47038834951456326,0.5048889923213504,0.27941747572815534,0.6723480900580502,0.17242718446601943,0.7839824295594963,0.1239482200647249,0.8346078714936033,0.09762135922330097,0.868005364909913,0.6601941747572816,0.8105987055016182,0.8105987055016182,0.8105987055016182,0.8105987055016182,0.8105987055016182,0.6601941747572816,0.6489154249968472,0.6582544798801073,0.7132853867809429,0.7351428110305336,0.7489033638336042,0.6601941747572816,0.5020661402654003,0.4804116383814884,0.5096988475054017,0.5182607426785758,0.5226490945380862,0.5274856682898562
+ 4.724845995893224,4600,0.6796116504854369,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6796116504854369,0.06824731124903408,0.47087378640776706,0.5113376722170427,0.27941747572815534,0.6702499652138949,0.1731067961165048,0.7852754872268105,0.12427184466019418,0.8349821570732121,0.09771844660194175,0.8695735570177355,0.6796116504854369,0.8217907227615966,0.8217907227615966,0.8217907227615966,0.8217907227615966,0.8217907227615966,0.6796116504854369,0.6530045751641177,0.660508346535501,0.7168972604426495,0.7384624839319754,0.7523092684328068,0.6796116504854369,0.5041722603829375,0.48337509847759624,0.5132814739940859,0.5217542791488864,0.5259152810735493,0.5307415745410603
+ 4.930184804928132,4800,0.6796116504854369,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6796116504854369,0.06427555485009323,0.470873786407767,0.5119331913488326,0.28038834951456315,0.6726577129232287,0.17320388349514557,0.788021792964523,0.12394822006472495,0.8328962977521837,0.09766990291262137,0.8687397875786594,0.6796116504854369,0.8216828478964402,0.8216828478964402,0.8216828478964402,0.8216828478964402,0.8216828478964402,0.6796116504854369,0.6515292076635256,0.6598571989751485,0.7157338182976709,0.7357126940189814,0.7500853808896866,0.6796116504854369,0.5012149610968577,0.48128476255481567,0.5105374388587102,0.518381647971727,0.5228375783347256,0.52765377953199
eval/Information-Retrieval_evaluation_mix_de_results.csv ADDED
@@ -0,0 +1,25 @@
+ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@20,cosine-Accuracy@50,cosine-Accuracy@100,cosine-Accuracy@150,cosine-Accuracy@200,cosine-Precision@1,cosine-Recall@1,cosine-Precision@20,cosine-Recall@20,cosine-Precision@50,cosine-Recall@50,cosine-Precision@100,cosine-Recall@100,cosine-Precision@150,cosine-Recall@150,cosine-Precision@200,cosine-Recall@200,cosine-MRR@1,cosine-MRR@20,cosine-MRR@50,cosine-MRR@100,cosine-MRR@150,cosine-MRR@200,cosine-NDCG@1,cosine-NDCG@20,cosine-NDCG@50,cosine-NDCG@100,cosine-NDCG@150,cosine-NDCG@200,cosine-MAP@1,cosine-MAP@20,cosine-MAP@50,cosine-MAP@100,cosine-MAP@150,cosine-MAP@200,cosine-MAP@500
+ 0.2053388090349076,200,0.6344253770150806,0.9162766510660426,0.9469578783151326,0.9667186687467498,0.9771190847633905,0.9807592303692148,0.6344253770150806,0.23920090136938812,0.11479459178367135,0.8375368348067257,0.04939157566302653,0.8966545328479806,0.025647425897035885,0.9294591783671347,0.017403362801178716,0.9454411509793725,0.013179927197087887,0.954671520194141,0.6344253770150806,0.7154567496545969,0.7164873896797027,0.7167851059045405,0.7168681243456226,0.7168889628204629,0.6344253770150806,0.7104440920923791,0.727052248004168,0.7344754268508454,0.7376522077785336,0.7393387420118572,0.6344253770150806,0.6257814749104975,0.6304109268976704,0.6314799748263475,0.6317780435067671,0.6318998853259603,0.6320817333061355
+ 0.4106776180698152,400,0.6302652106084243,0.9193967758710349,0.9568382735309412,0.9734789391575663,0.9802392095683827,0.984399375975039,0.6302652106084243,0.237467498699948,0.11697867914716588,0.852071416189981,0.0500988039521581,0.909247703241463,0.025943837753510147,0.9403276131045242,0.017548968625411682,0.9540921987756703,0.013283931357254294,0.9625265712382881,0.6302652106084243,0.71339454101978,0.714618220993756,0.7148715019847143,0.714926945318205,0.7149496528819548,0.6302652106084243,0.7169177792467166,0.732815407577552,0.7398233973489737,0.7425152641631246,0.7440761165796961,0.6302652106084243,0.6317881781240169,0.6360012087906985,0.6370435250510715,0.6372978183386906,0.6374099972729859,0.637592220733555
+ 0.6160164271047228,600,0.6437857514300572,0.9261570462818512,0.9599583983359334,0.9739989599583984,0.982839313572543,0.9890795631825273,0.6437857514300572,0.24232102617438028,0.11822672906916276,0.8599410643092391,0.05055642225689029,0.9178713815219275,0.026110244409776395,0.9466545328479806,0.017642572369561446,0.9592217022014214,0.013367134685387418,0.9686094461322312,0.6437857514300572,0.7212451319321626,0.7224096732633243,0.7226097405775619,0.7226813963132841,0.7227176521950948,0.6437857514300572,0.7273890361910328,0.743377128471396,0.7498261458492479,0.7523048554376615,0.7540488657467284,0.6437857514300572,0.6446502504587962,0.6488585347794174,0.6498004011509979,0.6500217432824585,0.6501418202292603,0.6503067787784532
+ 0.8213552361396304,800,0.6365054602184087,0.9318772750910036,0.9609984399375975,0.9781591263650546,0.9864794591783671,0.9901196047841914,0.6365054602184087,0.2388455538221529,0.12028081123244928,0.8737996186514126,0.051024440977639106,0.9250563355867567,0.026224648985959446,0.9500780031201248,0.017746576529727855,0.9645519154099498,0.013429537181487265,0.9726396073386796,0.6365054602184087,0.7204184984710036,0.7214267618606719,0.7216699715120036,0.7217382545624995,0.7217609153800677,0.6365054602184087,0.7349531773333724,0.7492214597066293,0.7548626670122398,0.7576951821628768,0.7592195705497963,0.6365054602184087,0.6518006861926513,0.6556844260524086,0.6564911604045681,0.6567530624481217,0.6568648714252107,0.6569837408778183
+ 1.0266940451745379,1000,0.6614664586583463,0.9417576703068122,0.968278731149246,0.982839313572543,0.9859594383775351,0.9906396255850234,0.6614664586583463,0.24892529034494712,0.12210088403536141,0.8867481365921304,0.05140925637025483,0.9321632865314612,0.026406656266250658,0.9572109551048709,0.017822846247183218,0.9683914023227596,0.013447737909516384,0.97424163633212,0.6614664586583463,0.7378667349683634,0.7387347213064364,0.7389434291025708,0.7389675509835664,0.7389950054618923,0.6614664586583463,0.74879398763607,0.7614850254616436,0.7670787375213847,0.7693192347084264,0.770391520120578,0.6614664586583463,0.6661456500034447,0.669742959851898,0.6705324561641784,0.6707525102840138,0.6708227426410505,0.6709548774084962
+ 1.2320328542094456,1200,0.6640665626625065,0.9417576703068122,0.9713988559542381,0.9859594383775351,0.9911596463858554,0.9927197087883516,0.6640665626625065,0.2498353267464032,0.12246489859594382,0.8881348587276824,0.051814872594903805,0.9393135725429018,0.026573062922516905,0.9629051828739816,0.017923383602010744,0.9741723002253424,0.01352314092563703,0.9803258797018548,0.6640665626625065,0.7428284555066176,0.7438335867331516,0.7440466868161973,0.7440892009760369,0.7440981514489251,0.6640665626625065,0.7536588794532624,0.7677844465395418,0.773065427251908,0.7752830717709518,0.7763930270674854,0.6640665626625065,0.6723321385155943,0.6761597915175159,0.6769148751810758,0.6771206893130433,0.6772043778397764,0.6773218281064297
+ 1.4373716632443532,1400,0.6625065002600105,0.9427977119084764,0.9719188767550702,0.982839313572543,0.9875195007800313,0.9895995839833593,0.6625065002600105,0.24970532154619518,0.12228289131565262,0.8879615184607385,0.051690067602704115,0.9367741376321719,0.026557462298491947,0.9619691454324839,0.0178783151326053,0.9713208528341133,0.013481539261570467,0.9764083896689201,0.6625065002600105,0.7441340613877233,0.745067726819568,0.7452240350359648,0.7452597501909166,0.7452721286641601,0.6625065002600105,0.7541985517706143,0.7677594387352457,0.7733732054144106,0.7752072319852946,0.7761516373021919,0.6625065002600105,0.6725920024525417,0.6762626434895126,0.6770828454535611,0.6772510249288679,0.6773223502333154,0.677431017638506
+ 1.6427104722792607,1600,0.6760270410816432,0.9474778991159646,0.9760790431617264,0.9854394175767031,0.9890795631825273,0.9937597503900156,0.6760270410816432,0.2547495233142659,0.12360894435777431,0.897408563009187,0.052220488819552796,0.9460738429537182,0.026671866874674995,0.9660079736522794,0.017909516380655226,0.9730455884902062,0.013520540821632869,0.9796325186340786,0.6760270410816432,0.7551551616521203,0.7561259821739399,0.7562634626773264,0.7562934872519096,0.7563201808542712,0.6760270410816432,0.7653287180372449,0.7788107955171397,0.7832977993397948,0.7846828186152417,0.7858806317410185,0.6760270410816432,0.6850192938170144,0.6886134215834364,0.6892864805351727,0.6894222348913406,0.6895044968976431,0.689598657948706
+ 1.8480492813141685,1800,0.6807072282891315,0.9495579823192928,0.9765990639625585,0.9875195007800313,0.9895995839833593,0.9937597503900156,0.6807072282891315,0.256396255850234,0.12412896515860634,0.9012480499219968,0.05233489339573584,0.9489772924250303,0.026755070202808116,0.969232102617438,0.01796498526607731,0.9761657132951984,0.013541341653666149,0.981046610285464,0.6807072282891315,0.758900318948171,0.7598338243609097,0.7599956996949736,0.7600137442617205,0.7600384897417922,0.6807072282891315,0.7703615492283789,0.7835275429746791,0.7881172738084409,0.7894880201135128,0.7903753937102421,0.6807072282891315,0.6916545098842778,0.6951878095332933,0.6958637254929401,0.6959986247601696,0.6960559481172263,0.6961225657360403
+ 2.0544147843942504,2000,0.672386895475819,0.9490379615184608,0.9745189807592304,0.9906396255850234,0.9927197087883516,0.9947997919916797,0.672386895475819,0.25405616224648986,0.12436297451898075,0.9026347720575489,0.05219968798751951,0.9466805338880222,0.026812272490899642,0.9715028601144045,0.01801005373548275,0.9781518278274991,0.013590743629745194,0.9834414429208748,0.672386895475819,0.7533253719879798,0.7542129131078366,0.7544610680683066,0.7544793051434533,0.7544912688639335,0.672386895475819,0.7685948264550578,0.7807793025416258,0.7864173979749887,0.7877716046064623,0.7887795296626381,0.672386895475819,0.6900330240667011,0.6933112237170552,0.6941401155906461,0.6942763113476816,0.6943529423224158,0.6944171436906563
+ 2.259753593429158,2200,0.6677067082683308,0.9511180447217888,0.9760790431617264,0.9864794591783671,0.9895995839833593,0.9901196047841914,0.6677067082683308,0.25028601144045765,0.12420696827873115,0.9009447044548449,0.0525533021320853,0.952114751256717,0.026749869994799797,0.9686527811989672,0.017978852487432834,0.9762478218426983,0.013523140925637028,0.9788479258468583,0.6677067082683308,0.7477333529386719,0.7486077915260884,0.7487654508851975,0.7487899615001886,0.7487927275682782,0.6677067082683308,0.7611174278828916,0.775342697232971,0.7790892939556582,0.780601469332911,0.7810925737542194,0.6677067082683308,0.6783749999025972,0.6823904468318427,0.6829323799715729,0.6830793649335287,0.6831174579577803,0.6832206641797627
+ 2.465092402464066,2400,0.6614664586583463,0.9490379615184608,0.9812792511700468,0.9880395215808633,0.9916796671866874,0.9942797711908476,0.6614664586583463,0.24902929450511355,0.12462298491939676,0.9041254983532674,0.0526053042121685,0.952938117524701,0.026848673946957884,0.9726122378228462,0.01805165539954931,0.9807592303692148,0.013585543421736873,0.9842260357080949,0.6614664586583463,0.7461957526657247,0.747303388795859,0.7474051768900506,0.747439409127278,0.7474540130538289,0.6614664586583463,0.7626398759341885,0.7760966969247224,0.7805014154384526,0.7821224861121152,0.7827536359014168,0.6614664586583463,0.6796068264670015,0.6831090574375399,0.6837955449232651,0.6839570106233552,0.6840007226117791,0.6840581762549828
+ 2.6704312114989732,2600,0.6593863754550182,0.9521580863234529,0.9781591263650546,0.9885595423816953,0.9916796671866874,0.9927197087883516,0.6593863754550182,0.24888195527821114,0.12464898595943837,0.9037788178193794,0.052615704628185135,0.9536748136592129,0.026838273530941245,0.9726122378228462,0.018030854567516033,0.9798058589010227,0.013577743109724393,0.983706014907263,0.6593863754550182,0.7450118792974496,0.7459233716098844,0.7460836163226817,0.746110080911613,0.7461160577740643,0.6593863754550182,0.7632647768405295,0.7771206498354316,0.7813352527148321,0.7827797812335575,0.78349550041372,0.6593863754550182,0.6816473846685526,0.6855400594001172,0.6861365106059225,0.6862809580744986,0.6863345734580474,0.6864072135069593
+ 2.875770020533881,2800,0.6713468538741549,0.9604784191367655,0.9807592303692148,0.9911596463858554,0.9937597503900156,0.9942797711908476,0.6713468538741549,0.2525394349107298,0.12607904316172647,0.9151326053042121,0.052761310452418116,0.9560582423296932,0.026905876235049406,0.9749523314265903,0.01805165539954931,0.9808459005026867,0.013598543941757673,0.9849901013584402,0.6713468538741549,0.75623493583724,0.7568769527408898,0.7570352887431343,0.7570553317195455,0.7570580831523541,0.6713468538741549,0.775184328419084,0.7865631782623674,0.790784471766941,0.7919595308129842,0.792734216359285,0.6713468538741549,0.6943671308998962,0.6975066003178053,0.6981198142197379,0.6982320992439925,0.6982994930665031,0.6983639554236528
+ 3.082135523613963,3000,0.6822672906916276,0.9568382735309412,0.982839313572543,0.9901196047841914,0.9932397295891836,0.9942797711908476,0.6822672906916276,0.256829606517594,0.12628705148205926,0.9162159819726122,0.052854914196567876,0.9577483099323973,0.02693187727509101,0.9752990119604784,0.018093257063615874,0.9829259837060149,0.013598543941757673,0.9848327266423991,0.6822672906916276,0.7621079979725878,0.7629566962864154,0.7630752350897654,0.7631038399168613,0.7631096181040986,0.6822672906916276,0.7811729750018951,0.792644981529038,0.7966378758298511,0.7981349306609089,0.7984956990220653,0.6822672906916276,0.7035374462515644,0.7066075681736091,0.7072291845777486,0.7073722184169415,0.7073993213084007,0.7074583945732716
+ 3.2874743326488707,3200,0.6843473738949558,0.9604784191367655,0.9823192927717108,0.9911596463858554,0.9932397295891836,0.9937597503900156,0.6843473738949558,0.2576963078523141,0.1268070722828913,0.9202894782457965,0.0528861154446178,0.9586150112671172,0.02693707748309933,0.9756456924943665,0.018082856647599233,0.9823192927717108,0.013603744149765992,0.985352747443231,0.6843473738949558,0.7654486840283385,0.7662126999529462,0.7663396046000881,0.7663583738615874,0.7663615255028047,0.6843473738949558,0.7840580491445673,0.7946441512924994,0.7985033528291549,0.799829589194914,0.8003891111923614,0.6843473738949558,0.7054119453921247,0.7082500969062815,0.7088309364659366,0.7089589691771594,0.7090062630791154,0.7090663209231032
+ 3.4928131416837784,3400,0.6786271450858035,0.9578783151326054,0.9812792511700468,0.9901196047841914,0.9921996879875195,0.9942797711908476,0.6786271450858035,0.2553562142485699,0.12605304212168486,0.9155919570116138,0.05284451378055124,0.9570809499046629,0.026916276651066048,0.9747789911596464,0.018082856647599236,0.9820592823712948,0.013590743629745194,0.984052695441151,0.6786271450858035,0.7618123785704909,0.7626483312770265,0.762783296123538,0.7628022594493395,0.7628139366048121,0.6786271450858035,0.7804996298069545,0.7922411941240319,0.7962178675445386,0.7976697743080848,0.7980366726946424,0.6786271450858035,0.7020244655303899,0.7053358993662525,0.7059294889833734,0.7060755964681326,0.7060977540861221,0.7061502667559332
+ 3.6981519507186857,3600,0.6838273530941238,0.9578783151326054,0.983359334373375,0.9921996879875195,0.9932397295891836,0.9942797711908476,0.6838273530941238,0.25669960131738606,0.1267030681227249,0.91910209741723,0.05295891835673429,0.9595250476685734,0.027020280811232453,0.9784191367654707,0.018089790258276995,0.9825793031721269,0.013601144045761834,0.985006066909343,0.6838273530941238,0.7661729799331041,0.7671025816416667,0.7672384131014173,0.7672462613586101,0.7672518447713536,0.6838273530941238,0.7850330793077392,0.796268140815968,0.8005411338457662,0.801353721980022,0.8018034956410856,0.6838273530941238,0.7072059841747639,0.7101927966897146,0.7108518832571713,0.7109339172640142,0.7109687933004619,0.7110279718628123
+ 3.9034907597535935,3800,0.6911076443057722,0.9667186687467498,0.9817992719708788,0.9901196047841914,0.9932397295891836,0.9942797711908476,0.6911076443057722,0.2594297105217542,0.12737909516380655,0.9245796498526608,0.05304212168486741,0.9610417750043336,0.02693707748309933,0.9754723522274225,0.018096723868954757,0.9829259837060148,0.013603744149765994,0.985266077309759,0.6911076443057722,0.7742165554746149,0.7747482453528861,0.7748712121103124,0.7749002572614547,0.7749061846272107,0.6911076443057722,0.790403614299072,0.8006346620691371,0.803900673057157,0.8053724034204305,0.8057940253076566,0.6911076443057722,0.7113657198577088,0.7143288879146555,0.7148088766611875,0.7149540804999928,0.7149832843865522,0.7150425998464863
+ 4.108829568788501,4000,0.6947477899115965,0.967758710348414,0.984399375975039,0.9901196047841914,0.9932397295891836,0.9932397295891836,0.6947477899115965,0.26064309239036226,0.12769110764430577,0.9266163979892529,0.05316692667706709,0.9632518634078697,0.026978679147165893,0.9771190847633905,0.018082856647599233,0.982232622638239,0.013595943837753513,0.984659386375455,0.6947477899115965,0.775106319970792,0.7756762344136855,0.7757636235577245,0.7757917238264626,0.7757917238264626,0.6947477899115965,0.7916550876560119,0.8018356667177752,0.8049830038156018,0.8060041518104935,0.8064526867706615,0.6947477899115965,0.7123386461179687,0.7151736057555711,0.7156740227134941,0.7157705885677804,0.7158097678043102,0.7158747359338941
+ 4.314168377823409,4200,0.6859074362974519,0.9661986479459178,0.982839313572543,0.9927197087883516,0.9932397295891836,0.9937597503900156,0.6859074362974519,0.2577396429190501,0.12732709308372334,0.9241896342520368,0.05308372334893397,0.9614317906049575,0.027025481019240776,0.9787224822326227,0.018103657479632513,0.983359334373375,0.013606344253770154,0.9854394175767031,0.6859074362974519,0.7703397211809108,0.7708870204854694,0.7710242509181896,0.7710286578741289,0.7710319701085292,0.6859074362974519,0.7894367570955271,0.7998923204035095,0.8037683941688618,0.8046891228048068,0.8050715563658618,0.6859074362974519,0.711359959198991,0.7143436554485498,0.7149332520404413,0.7150312982701879,0.7150609466134881,0.715115635794944
+ 4.519507186858316,4400,0.6942277691107644,0.9667186687467498,0.983359334373375,0.9916796671866874,0.9932397295891836,0.9942797711908476,0.6942277691107644,0.26120644825793027,0.12784711388455536,0.927873114924597,0.05319812792511702,0.9637285491419657,0.0270306812272491,0.9789391575663027,0.018110591090310275,0.9837926850407349,0.013616744669786796,0.9862194487779511,0.6942277691107644,0.7761581892584265,0.7766868481375114,0.7768104145556238,0.7768244234791826,0.7768305684544853,0.6942277691107644,0.7952836406043297,0.8052399503452229,0.8086752401344494,0.8096382458419952,0.810085192105751,0.6942277691107644,0.7188197545745756,0.7215707141808124,0.7220898692554206,0.7221900369972237,0.7222223600003219,0.7222810622423789
+ 4.724845995893224,4600,0.6895475819032761,0.9672386895475819,0.983879355174207,0.9921996879875195,0.9932397295891836,0.9942797711908476,0.6895475819032761,0.2593430403882822,0.12810712428497137,0.9296931877275091,0.05329173166926679,0.9652886115444618,0.0270566822672907,0.9797191887675507,0.018107124284971396,0.9835326746403189,0.013619344773790953,0.9862194487779511,0.6895475819032761,0.77474131277538,0.7752786950401535,0.7753996393885253,0.7754090671651442,0.7754150110363532,0.6895475819032761,0.7960673141716103,0.8059435010700053,0.8091935286265849,0.8099392087586132,0.8104433137439041,0.6895475819032761,0.7198609917140115,0.7225763770177105,0.7230590007971497,0.7231361328506057,0.7231741651357827,0.7232269917591311
+ 4.930184804928132,4800,0.6926677067082684,0.9641185647425897,0.983879355174207,0.9921996879875195,0.9932397295891836,0.9942797711908476,0.6926677067082684,0.2603830819899463,0.12797711908476336,0.928479805858901,0.053281331253250144,0.9650286011440458,0.027051482059282376,0.9796325186340786,0.018110591090310275,0.9837060149072628,0.013619344773790953,0.9862194487779511,0.6926677067082684,0.7766838069642311,0.7773792960985305,0.7775026273925645,0.7775124036000293,0.7775182983569378,0.6926677067082684,0.7967328692326251,0.8068705787791701,0.810158579950017,0.8109641919896999,0.8114360342473703,0.6926677067082684,0.7210301157895639,0.7237555751939095,0.7242426468613273,0.7243265313145111,0.7243628241480395,0.7244144669299598
eval/Information-Retrieval_evaluation_mix_es_results.csv ADDED
@@ -0,0 +1,25 @@
+ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@20,cosine-Accuracy@50,cosine-Accuracy@100,cosine-Accuracy@150,cosine-Accuracy@200,cosine-Precision@1,cosine-Recall@1,cosine-Precision@20,cosine-Recall@20,cosine-Precision@50,cosine-Recall@50,cosine-Precision@100,cosine-Recall@100,cosine-Precision@150,cosine-Recall@150,cosine-Precision@200,cosine-Recall@200,cosine-MRR@1,cosine-MRR@20,cosine-MRR@50,cosine-MRR@100,cosine-MRR@150,cosine-MRR@200,cosine-NDCG@1,cosine-NDCG@20,cosine-NDCG@50,cosine-NDCG@100,cosine-NDCG@150,cosine-NDCG@200,cosine-MAP@1,cosine-MAP@20,cosine-MAP@50,cosine-MAP@100,cosine-MAP@150,cosine-MAP@200,cosine-MAP@500
+ 0.2053388090349076,200,0.6682267290691628,0.9136765470618825,0.9516380655226209,0.9724388975559022,0.983359334373375,0.9859594383775351,0.6682267290691628,0.25837728747245126,0.11190847633905356,0.831244583116658,0.048268330733229343,0.8938377535101404,0.02508580343213729,0.9276737736176114,0.017042815045935168,0.945597157219622,0.012917316692667708,0.9556248916623331,0.6682267290691628,0.7381483779939648,0.739357462180436,0.7396683082611806,0.7397580722635452,0.7397730546413436,0.6682267290691628,0.7213568073657702,0.7384396880681909,0.7459926094063213,0.7494691158863367,0.7512910243348265,0.6682267290691628,0.6437240783294677,0.648094326722431,0.6491760970089326,0.6494917071843388,0.6496288285234926,0.6498276413739233
+ 0.4106776180698152,400,0.6604264170566823,0.9282371294851794,0.9604784191367655,0.9771190847633905,0.9864794591783671,0.9875195007800313,0.6604264170566823,0.2557511824282495,0.11461258450338012,0.851594730455885,0.049193967758710364,0.9111197781244584,0.025434217368694754,0.9408736349453978,0.017302825446351183,0.9600346680533888,0.01307332293291732,0.9674536314785925,0.6604264170566823,0.737033626851442,0.7380887789983893,0.7383181276304548,0.7383970534972597,0.738402739501296,0.6604264170566823,0.7330213732381511,0.7493598664067755,0.7559911792230241,0.7597126022076224,0.7610407272000793,0.6604264170566823,0.6557894414908114,0.6602305201747474,0.6611963333911111,0.6615334732665976,0.661634924019516,0.6617945676009057
+ 0.6160164271047228,600,0.6703068122724909,0.9251170046801872,0.9646385855434217,0.9823192927717108,0.9864794591783671,0.9890795631825273,0.6703068122724909,0.25881930515315854,0.11632865314612584,0.8614317906049576,0.0495891835673427,0.9175940370948171,0.025631825273010923,0.9490552955451551,0.017316692667706707,0.9616571329519848,0.013107124284971402,0.9706274917663374,0.6703068122724909,0.7431267016583823,0.744504073877143,0.744772086796758,0.7448058483789589,0.7448203702902138,0.6703068122724909,0.7408715563512193,0.7562634581000692,0.7631542350614856,0.7656074684588632,0.7672304340171319,0.6703068122724909,0.6631579471812458,0.6671022534765748,0.6680657053022573,0.6683084445210381,0.6684283438543549,0.6685847473164434
+ 0.8213552361396304,800,0.6713468538741549,0.9355174206968279,0.96931877275091,0.9859594383775351,0.9911596463858554,0.9942797711908476,0.6713468538741549,0.25955600128767053,0.11739469578783152,0.8692494366441323,0.050150806032241306,0.9268937424163632,0.025798231929277177,0.9533801352054082,0.017424163633211993,0.9657046281851274,0.013151326053042124,0.9716970012133819,0.6713468538741549,0.74653622096617,0.7477123940580572,0.7479461942810823,0.7479925651121466,0.7480098526637375,0.6713468538741549,0.7487631390672219,0.7647296695042515,0.770563932615685,0.7729836170242523,0.7740761869504545,0.6713468538741549,0.6732123350006812,0.6775067092528217,0.6783446057459074,0.6785725152144115,0.6786503605131298,0.6787609601395801
+ 1.0266940451745379,1000,0.6833073322932918,0.9412376495059802,0.9724388975559022,0.982839313572543,0.9885595423816953,0.9932397295891836,0.6833073322932918,0.26488621449619887,0.11913676547061883,0.8817039348240596,0.05047321892875717,0.933662679840527,0.02581383255330214,0.9538654879528514,0.017382561969145432,0.9627145085803432,0.013133125325013003,0.9698734616051309,0.6833073322932918,0.756487637440084,0.7575907306427382,0.7577425000341975,0.7577897665080927,0.7578185236668188,0.6833073322932918,0.7606305076986049,0.7748889100226123,0.7794369401503287,0.7812055405695097,0.7825015658302185,0.6833073322932918,0.6849089805072761,0.6887250998647371,0.6894109340285521,0.6895723596998252,0.6896608739076088,0.6898032231487351
+ 1.2320328542094456,1200,0.6838273530941238,0.9438377535101404,0.9698387935517421,0.983359334373375,0.9901196047841914,0.9932397295891836,0.6838273530941238,0.26389817497461804,0.12012480499219969,0.8885075403016122,0.050598023920956844,0.9356474258970359,0.02591783671346854,0.9586063442537701,0.01747963251863408,0.9705148205928237,0.013190327613104527,0.9763390535621425,0.6838273530941238,0.7573154945327627,0.7582590904645041,0.7584569774086162,0.7585106721859824,0.7585282518582639,0.6838273530941238,0.7643896765848259,0.7773617123612077,0.7824030366568149,0.784655475735648,0.7857202455190264,0.6838273530941238,0.6888259776378871,0.6923774277626703,0.6930839123170334,0.6932881558723202,0.6933588432494899,0.6934946348665448
+ 1.4373716632443532,1400,0.6807072282891315,0.9485179407176287,0.9729589183567343,0.9823192927717108,0.9864794591783671,0.9906396255850234,0.6807072282891315,0.2627887972661764,0.1204108164326573,0.8919310105737563,0.05068122724908998,0.9366181313919223,0.025829433177327096,0.9546541861674468,0.01735482752643439,0.9613797885248743,0.013101924076963083,0.9680533888022187,0.6807072282891315,0.7548938476398593,0.7557372179374756,0.7558702884334163,0.7559044941059395,0.7559291821572394,0.6807072282891315,0.7639432802484696,0.7763357134016274,0.7803181410279327,0.7816804927525401,0.7828717158223965,0.6807072282891315,0.6869247793584264,0.690426614619918,0.6910037251248864,0.6911307673284897,0.6912093183129102,0.6913586880762755
+ 1.6427104722792607,1600,0.6978679147165887,0.9516380655226209,0.9724388975559022,0.984399375975039,0.9911596463858554,0.9937597503900156,0.6978679147165887,0.26963573781046485,0.12150286011440456,0.8992633038654879,0.050722828913156534,0.9370341480325879,0.025886635465418625,0.9562575836366789,0.017448431270584153,0.9667880048535276,0.013172126885075406,0.9729485179407177,0.6978679147165887,0.7696703094424816,0.7703453828527084,0.7705172992669082,0.7705720836308187,0.7705870876477192,0.6978679147165887,0.776144666041185,0.7866034130682366,0.7908437606823648,0.7928994948878915,0.7940159110162145,0.6978679147165887,0.701448431885515,0.7043277906309915,0.7049179622173299,0.7051045425383655,0.705185302891489,0.7052948837628819
+ 1.8480492813141685,1800,0.702548101924077,0.9547581903276131,0.9771190847633905,0.9859594383775351,0.9916796671866874,0.9937597503900156,0.702548101924077,0.271325805413169,0.12251690067602702,0.9064309239036228,0.051128445137805525,0.9447217888715549,0.025980239209568386,0.9597157219622118,0.01747963251863408,0.9678315132605304,0.013187727509100368,0.9737250823366268,0.702548101924077,0.7750169390819515,0.7757553957430369,0.7758849000311322,0.7759306176182311,0.775943198374107,0.702548101924077,0.7826716176193833,0.7931678016587755,0.7965031427540157,0.7981149726917887,0.7991800715217082,0.702548101924077,0.7078957102902047,0.7107552115447331,0.7112318899834764,0.7113798544189331,0.7114543827904634,0.711541729696531
+ 2.0544147843942504,2000,0.7103484139365575,0.9557982319292772,0.9760790431617264,0.9890795631825273,0.9937597503900156,0.9973998959958398,0.7103484139365575,0.27387390733724587,0.12293291731669267,0.9092823712948518,0.05097243889755592,0.9418096723868954,0.026027041081643273,0.9616918010053735,0.0175455018200728,0.9718218062055816,0.013242329693187734,0.9777535101404057,0.7103484139365575,0.7812909271181215,0.7819320854812777,0.7821121832732618,0.7821505989449018,0.78217172495247,0.7103484139365575,0.7868309758265937,0.795735229322949,0.8001095681572321,0.8021174349272104,0.8032046931418886,0.7103484139365575,0.7119065804063727,0.7142841892230629,0.7148851704393682,0.7150778280993605,0.7151514837506158,0.7152516872020032
+ 2.259753593429158,2200,0.6957878315132605,0.9568382735309412,0.9776391055642226,0.9880395215808633,0.9921996879875195,0.9942797711908476,0.6957878315132605,0.2686997003689672,0.12290691627665107,0.909421043508407,0.051253250130005215,0.9469058762350494,0.026084243369734795,0.9629155832899983,0.017573236262783842,0.9734026694401109,0.013257930317212691,0.979767724042295,0.6957878315132605,0.771977419764412,0.7726696525537105,0.772816863980678,0.7728512986234659,0.7728638525732718,0.6957878315132605,0.7823754495426941,0.7926550116762732,0.7962027331742831,0.7982286641162097,0.7993430882337849,0.6957878315132605,0.706839624825468,0.709594179988833,0.7101051778063159,0.7102937357355014,0.7103713351901881,0.7104777786737774
+ 2.465092402464066,2400,0.6989079563182528,0.9552782111284451,0.9781591263650546,0.9875195007800313,0.9921996879875195,0.9947997919916797,0.6989079563182528,0.2697137409305896,0.12217888715548621,0.9047321892875716,0.05127405096203849,0.946975212341827,0.026115444617784717,0.9647252556768938,0.017559369041428324,0.9730109204368175,0.013242329693187732,0.9784711388455538,0.6989079563182528,0.7752447491673757,0.7760003906786834,0.7761312184066443,0.7761694598895253,0.7761841093673262,0.6989079563182528,0.7836072837704448,0.7953142522429503,0.7992307781317647,0.800844654977285,0.8018320387474135,0.6989079563182528,0.7105074257229527,0.7137158961627179,0.7142936559201694,0.714443631868402,0.7145141566208517,0.7146101778464389
+ 2.6704312114989732,2600,0.7134685387415497,0.9578783151326054,0.9786791471658867,0.9854394175767031,0.9942797711908476,0.9958398335933437,0.7134685387415497,0.27513062427258994,0.12314092563702549,0.9113971225515688,0.05128445137805514,0.9468192061015774,0.026136245449817998,0.9652106084243369,0.017590570289478243,0.9750043335066735,0.013268330733229333,0.9809845727162421,0.7134685387415497,0.7863482128177135,0.7870096294892548,0.7871109086209715,0.7871857168259243,0.7871952694855203,0.7134685387415497,0.7913162052199003,0.8010780764764179,0.8051396136681125,0.8070095766622101,0.8080712829793945,0.7134685387415497,0.7172104329563992,0.7198617376892691,0.7204918764319661,0.7206381690081209,0.7207131596457822,0.7208007394228605
+ 2.875770020533881,2800,0.71866874674987,0.9589183567342694,0.9791991679667187,0.9890795631825273,0.9963598543941757,0.9963598543941757,0.71866874674987,0.27801673971720775,0.12366094643785752,0.915037268157393,0.05133645345813834,0.9478419136765471,0.026167446697867924,0.965990639625585,0.017614837926850403,0.975631825273011,0.013247529901196051,0.9784919396775871,0.71866874674987,0.7884016294872362,0.7890410308231185,0.7891831488842322,0.7892410903169512,0.7892410903169512,0.71866874674987,0.7966865394629457,0.8057592902842505,0.8097706851555725,0.8116268770930806,0.8121403469593211,0.71866874674987,0.7250649252001589,0.727570685922096,0.7281377980449529,0.728299081190692,0.7283386921757053,0.7284375209653465
+ 3.082135523613963,3000,0.7259490379615184,0.9589183567342694,0.982839313572543,0.9921996879875195,0.9958398335933437,0.9963598543941757,0.7259490379615184,0.2809375232152143,0.1235049401976079,0.9136418790084937,0.05143005720228811,0.9494193101057375,0.026203848153926162,0.9672733576009707,0.017597503900156006,0.9743976425723696,0.013244929797191891,0.9782111284451378,0.7259490379615184,0.7946893669911347,0.7954891365677775,0.7956216945335608,0.795650518262808,0.7956532843308975,0.7259490379615184,0.8004617848221726,0.8103222419295303,0.8142623334382676,0.8156467939791169,0.8163189962784397,0.7259490379615184,0.7306155607708827,0.7331987477300788,0.7337537325667062,0.7338830856887248,0.73393275636574,0.7340165393819637
+ 3.2874743326488707,3200,0.719188767550702,0.9604784191367655,0.9776391055642226,0.9880395215808633,0.9927197087883516,0.9947997919916797,0.719188767550702,0.2782334150508878,0.12384295371814871,0.9160253076789738,0.05138845553822154,0.9485526087710175,0.026177847113884566,0.9665973305598892,0.01758710348413936,0.97424163633212,0.013257930317212691,0.9794591783671347,0.719188767550702,0.7879463341655366,0.7885452337555502,0.7887052150488122,0.7887472532640811,0.7887587198653747,0.719188767550702,0.7962924850863434,0.8052777275687588,0.8092454705018242,0.8107254354272118,0.8116595145864065,0.719188767550702,0.7239495654466136,0.7263898722061299,0.7269454539480055,0.7270736479615637,0.7271408255225633,0.7272562625469244
+ 3.4928131416837784,3400,0.7233489339573583,0.9573582943317732,0.9812792511700468,0.9906396255850234,0.9937597503900156,0.9942797711908476,0.7233489339573583,0.2793514597726766,0.12332293291731669,0.9113364534581383,0.051419656786271466,0.9492979719188768,0.026136245449817998,0.9644652452764777,0.01757323626278384,0.972473565609291,0.013213728549141969,0.9751776737736176,0.7233489339573583,0.7899731852617836,0.7907610622253849,0.7908979304775949,0.7909242043473037,0.7909269852072012,0.7233489339573583,0.7961935945622317,0.8065361352116066,0.8099143859553151,0.8114979980109926,0.811974610921689,0.7233489339573583,0.7256388010608531,0.7283365343525039,0.7288243354765178,0.7289757842105498,0.7290124400366391,0.7291013727697001
+ 3.6981519507186857,3600,0.7259490379615184,0.9630785231409257,0.9817992719708788,0.9895995839833593,0.9921996879875195,0.9947997919916797,0.7259490379615184,0.2804348364410767,0.12420696827873115,0.9179580516553995,0.0516588663546542,0.9536314785924771,0.026219448777951126,0.9680533888022187,0.017597503900156002,0.9745536488126192,0.013273530941237652,0.9804992199687987,0.7259490379615184,0.7976710485107275,0.7982817633632742,0.7984045422003805,0.7984275106378795,0.7984428759316601,0.7259490379615184,0.8039656371357985,0.8137660849986358,0.8169571755430387,0.8182210403466826,0.8192700041493896,0.7259490379615184,0.733937670625927,0.7366410039262788,0.737114253940337,0.737228717715647,0.7373013964619535,0.7373725842148366
+ 3.9034907597535935,3800,0.7311492459698388,0.9609984399375975,0.9812792511700468,0.9927197087883516,0.9947997919916797,0.9958398335933437,0.7311492459698388,0.2821942401505584,0.12392095683827353,0.9157392962385162,0.05156526261050443,0.9516207314959265,0.026219448777951126,0.96801872074883,0.017604437510833765,0.974900329346507,0.013265730629225174,0.9796671866874674,0.7311492459698388,0.8000753042487717,0.8007542298774801,0.8009142584752065,0.8009302933559463,0.800936240312473,0.7311492459698388,0.8056306253667299,0.8154959339072615,0.8191064857581054,0.8204434071517968,0.8213033097505814,0.7311492459698388,0.7373535400361954,0.7400522303969465,0.7405629274661326,0.7406879307407565,0.7407539064098259,0.7408430400283221
+ 4.108829568788501,4000,0.7358294331773271,0.9625585023400937,0.9802392095683827,0.9927197087883516,0.9947997919916797,0.9958398335933437,0.7358294331773271,0.28403164698016486,0.12438897555902236,0.9190414283237995,0.05158606344253771,0.952244756456925,0.026224648985959446,0.9685820766163981,0.017628705148205928,0.9762870514820593,0.013268330733229333,0.9801872074882996,0.7358294331773271,0.8035232306901704,0.8041564269676074,0.8043491602665708,0.8043649132860833,0.8043707455995762,0.7358294331773271,0.8089516774866639,0.8181299102768375,0.8217009899252086,0.8232345422421572,0.8239096085290897,0.7358294331773271,0.7407296211762635,0.7433011890905112,0.7437599072934008,0.7439220951644092,0.7439677461223776,0.7440630263326289
+ 4.314168377823409,4200,0.733749349973999,0.9604784191367655,0.982839313572543,0.9916796671866874,0.9947997919916797,0.9953198127925117,0.733749349973999,0.28340762201916647,0.12433697347893914,0.9186774137632172,0.0516588663546542,0.9536314785924771,0.026229849193967765,0.968538741549662,0.017635638758883684,0.9768070722828913,0.013273530941237652,0.9806205581556595,0.733749349973999,0.8015837695391573,0.8023398853791036,0.8024787052722444,0.8025062574128484,0.8025096562416121,0.733749349973999,0.8074696494514497,0.8170488841773651,0.8203516409516334,0.8219710202163846,0.8226411885850343,0.733749349973999,0.7389285820519963,0.7414939322506505,0.7419568857454747,0.7421153780150582,0.742164620684282,0.7422579374234903
+ 4.519507186858316,4400,0.7358294331773271,0.9615184607384295,0.983359334373375,0.9921996879875195,0.9942797711908476,0.9947997919916797,0.7358294331773271,0.28398831191342894,0.12477899115964639,0.9220748829953198,0.05174206968278733,0.9548448604610851,0.02630265210608425,0.9711388455538221,0.017652972785578088,0.9777604437510833,0.013294331773270933,0.9823019587450165,0.7358294331773271,0.8034886386855536,0.8042294348215404,0.8043610639446989,0.8043778926448901,0.8043807816493392,0.7358294331773271,0.8104398530748719,0.8194810222604678,0.8230427127064399,0.8243283104602539,0.8251186561711241,0.7358294331773271,0.742446597316252,0.7448760952950458,0.7453727938942869,0.7454980553388746,0.7455568923614244,0.7456455633479137
+ 4.724845995893224,4600,0.7384295371814873,0.9630785231409257,0.983359334373375,0.9921996879875195,0.9942797711908476,0.9963598543941757,0.7384295371814873,0.285115023648565,0.12472698907956316,0.9216415323279598,0.05177327093083725,0.9554082163286531,0.026271450858034326,0.9698387935517421,0.017635638758883684,0.9766337320159473,0.013273530941237652,0.9803813485872768,0.7384295371814873,0.8055430723700101,0.806216320765459,0.8063554621873477,0.8063737837993774,0.8063859337270688,0.7384295371814873,0.8113619374567514,0.8206771431445348,0.8238555673509024,0.8251818493639627,0.8258365367322651,0.7384295371814873,0.7435414485170373,0.7461028121794654,0.7465435818030448,0.7466754353712773,0.7467184450113576,0.746815569330807
+ 4.930184804928132,4800,0.7394695787831513,0.9635985439417577,0.982839313572543,0.9927197087883516,0.9947997919916797,0.9963598543941757,0.7394695787831513,0.28537503404898107,0.12488299531981278,0.9225949037961519,0.05174206968278733,0.9548015253943491,0.02629225169006761,0.970532154619518,0.017635638758883684,0.9766337320159473,0.013281331253250133,0.9810747096550528,0.7394695787831513,0.8059183822863336,0.8065662458714291,0.8067209669800003,0.8067371899834064,0.8067455244059942,0.7394695787831513,0.8119072371250002,0.8208055075822587,0.8242798548838444,0.8254601712767063,0.826231823086538,0.7394695787831513,0.7439811728319751,0.7464542457655368,0.7469341154545359,0.7470471963812441,0.7471010455519603,0.7471920688836787
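The added CSVs above share one schema: an `epoch,steps` pair followed by cosine-similarity retrieval metrics (Accuracy@k, Precision@k, Recall@k, MRR@k, NDCG@k, MAP@k). A minimal sketch of reading such a file and picking the best checkpoint by `cosine-NDCG@20`, using only the standard library; the sample rows here are truncated copies of the first two rows of `eval/Information-Retrieval_evaluation_mix_es_results.csv`, and in practice you would pass the real file path instead of the inline string:

```python
import csv
import io

# Truncated sample matching the header layout of the evaluation CSVs above.
sample = (
    "epoch,steps,cosine-Accuracy@1,cosine-NDCG@20\n"
    "0.2053,200,0.6682,0.7214\n"
    "0.4107,400,0.6604,0.7330\n"
)

# csv.DictReader maps each row to {column name: string value}.
rows = list(csv.DictReader(io.StringIO(sample)))

# Select the training step whose cosine-NDCG@20 is highest.
best = max(rows, key=lambda r: float(r["cosine-NDCG@20"]))
print(best["steps"], best["cosine-NDCG@20"])  # → 400 0.7330
```

The same loop works across the per-checkpoint copies of this CSV (e.g. under `checkpoint-4600/eval/`) to compare checkpoints before choosing which one to promote.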