Commit 6b25db8 by jstephencorey (parent: 161cecf): Upload README.md with huggingface_hub
1
+ ---
2
+ tags:
3
+ - mteb
4
+ model-index:
5
+ - name: pythia-14m_mean
6
+ results:
7
+ - task:
8
+ type: Classification
9
+ dataset:
10
+ type: mteb/amazon_counterfactual
11
+ name: MTEB AmazonCounterfactualClassification (en)
12
+ config: en
13
+ split: test
14
+ revision: e8379541af4e31359cca9fbcf4b00f2671dba205
15
+ metrics:
16
+ - type: accuracy
17
+ value: 70.73134328358208
18
+ - type: ap
19
+ value: 32.35996836729783
20
+ - type: f1
21
+ value: 64.2137087561157
22
+ - task:
23
+ type: Classification
24
+ dataset:
25
+ type: mteb/amazon_counterfactual
26
+ name: MTEB AmazonCounterfactualClassification (de)
27
+ config: de
28
+ split: test
29
+ revision: e8379541af4e31359cca9fbcf4b00f2671dba205
30
+ metrics:
31
+ - type: accuracy
32
+ value: 62.291220556745174
33
+ - type: ap
34
+ value: 76.5427302441011
35
+ - type: f1
36
+ value: 60.37703210343267
37
+ - task:
38
+ type: Classification
39
+ dataset:
40
+ type: mteb/amazon_counterfactual
41
+ name: MTEB AmazonCounterfactualClassification (en-ext)
42
+ config: en-ext
43
+ split: test
44
+ revision: e8379541af4e31359cca9fbcf4b00f2671dba205
45
+ metrics:
46
+ - type: accuracy
47
+ value: 67.57871064467767
48
+ - type: ap
49
+ value: 17.03033311712744
50
+ - type: f1
51
+ value: 54.821750631894986
52
+ - task:
53
+ type: Classification
54
+ dataset:
55
+ type: mteb/amazon_counterfactual
56
+ name: MTEB AmazonCounterfactualClassification (ja)
57
+ config: ja
58
+ split: test
59
+ revision: e8379541af4e31359cca9fbcf4b00f2671dba205
60
+ metrics:
61
+ - type: accuracy
62
+ value: 62.51605995717344
63
+ - type: ap
64
+ value: 14.367489440317666
65
+ - type: f1
66
+ value: 50.48473578289779
67
+ - task:
68
+ type: Classification
69
+ dataset:
70
+ type: mteb/amazon_reviews_multi
71
+ name: MTEB AmazonReviewsClassification (en)
72
+ config: en
73
+ split: test
74
+ revision: 1399c76144fd37290681b995c656ef9b2e06e26d
75
+ metrics:
76
+ - type: accuracy
77
+ value: 29.172000000000004
78
+ - type: f1
79
+ value: 28.264998641170465
80
+ - task:
81
+ type: Classification
82
+ dataset:
83
+ type: mteb/amazon_reviews_multi
84
+ name: MTEB AmazonReviewsClassification (de)
85
+ config: de
86
+ split: test
87
+ revision: 1399c76144fd37290681b995c656ef9b2e06e26d
88
+ metrics:
89
+ - type: accuracy
90
+ value: 25.157999999999998
91
+ - type: f1
92
+ value: 23.033533062569987
93
+ - task:
94
+ type: Classification
95
+ dataset:
96
+ type: mteb/amazon_reviews_multi
97
+ name: MTEB AmazonReviewsClassification (es)
98
+ config: es
99
+ split: test
100
+ revision: 1399c76144fd37290681b995c656ef9b2e06e26d
101
+ metrics:
102
+ - type: accuracy
103
+ value: 26.840000000000003
104
+ - type: f1
105
+ value: 25.693413738086402
106
+ - task:
107
+ type: Classification
108
+ dataset:
109
+ type: mteb/amazon_reviews_multi
110
+ name: MTEB AmazonReviewsClassification (fr)
111
+ config: fr
112
+ split: test
113
+ revision: 1399c76144fd37290681b995c656ef9b2e06e26d
114
+ metrics:
115
+ - type: accuracy
116
+ value: 26.491999999999997
117
+ - type: f1
118
+ value: 25.6252880863665
119
+ - task:
120
+ type: Classification
121
+ dataset:
122
+ type: mteb/amazon_reviews_multi
123
+ name: MTEB AmazonReviewsClassification (ja)
124
+ config: ja
125
+ split: test
126
+ revision: 1399c76144fd37290681b995c656ef9b2e06e26d
127
+ metrics:
128
+ - type: accuracy
129
+ value: 24.448000000000004
130
+ - type: f1
131
+ value: 23.86460242225935
132
+ - task:
133
+ type: Classification
134
+ dataset:
135
+ type: mteb/amazon_reviews_multi
136
+ name: MTEB AmazonReviewsClassification (zh)
137
+ config: zh
138
+ split: test
139
+ revision: 1399c76144fd37290681b995c656ef9b2e06e26d
140
+ metrics:
141
+ - type: accuracy
142
+ value: 26.412000000000003
143
+ - type: f1
144
+ value: 25.779710231390755
145
+ - task:
146
+ type: Retrieval
147
+ dataset:
148
+ type: arguana
149
+ name: MTEB ArguAna
150
+ config: default
151
+ split: test
152
+ revision: None
153
+ metrics:
154
+ - type: map_at_1
155
+ value: 5.761
156
+ - type: map_at_10
157
+ value: 10.267
158
+ - type: map_at_100
159
+ value: 11.065999999999999
160
+ - type: map_at_1000
161
+ value: 11.16
162
+ - type: map_at_3
163
+ value: 8.642
164
+ - type: map_at_5
165
+ value: 9.474
166
+ - type: mrr_at_1
167
+ value: 6.046
168
+ - type: mrr_at_10
169
+ value: 10.365
170
+ - type: mrr_at_100
171
+ value: 11.178
172
+ - type: mrr_at_1000
173
+ value: 11.272
174
+ - type: mrr_at_3
175
+ value: 8.713
176
+ - type: mrr_at_5
177
+ value: 9.587
178
+ - type: ndcg_at_1
179
+ value: 5.761
180
+ - type: ndcg_at_10
181
+ value: 13.055
182
+ - type: ndcg_at_100
183
+ value: 17.526
184
+ - type: ndcg_at_1000
185
+ value: 20.578
186
+ - type: ndcg_at_3
187
+ value: 9.616
188
+ - type: ndcg_at_5
189
+ value: 11.128
190
+ - type: precision_at_1
191
+ value: 5.761
192
+ - type: precision_at_10
193
+ value: 2.212
194
+ - type: precision_at_100
195
+ value: 0.44400000000000006
196
+ - type: precision_at_1000
197
+ value: 0.06999999999999999
198
+ - type: precision_at_3
199
+ value: 4.149
200
+ - type: precision_at_5
201
+ value: 3.229
202
+ - type: recall_at_1
203
+ value: 5.761
204
+ - type: recall_at_10
205
+ value: 22.119
206
+ - type: recall_at_100
207
+ value: 44.381
208
+ - type: recall_at_1000
209
+ value: 69.70100000000001
210
+ - type: recall_at_3
211
+ value: 12.447
212
+ - type: recall_at_5
213
+ value: 16.145
214
+ - task:
215
+ type: Clustering
216
+ dataset:
217
+ type: mteb/arxiv-clustering-s2s
218
+ name: MTEB ArxivClusteringS2S
219
+ config: default
220
+ split: test
221
+ revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
222
+ metrics:
223
+ - type: v_measure
224
+ value: 13.902183567893395
225
+ - task:
226
+ type: Reranking
227
+ dataset:
228
+ type: mteb/askubuntudupquestions-reranking
229
+ name: MTEB AskUbuntuDupQuestions
230
+ config: default
231
+ split: test
232
+ revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
233
+ metrics:
234
+ - type: map
235
+ value: 47.93210378051478
236
+ - type: mrr
237
+ value: 60.70318339708921
238
+ - task:
239
+ type: STS
240
+ dataset:
241
+ type: mteb/biosses-sts
242
+ name: MTEB BIOSSES
243
+ config: default
244
+ split: test
245
+ revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
246
+ metrics:
247
+ - type: cos_sim_pearson
248
+ value: 49.57650220181508
249
+ - type: cos_sim_spearman
250
+ value: 51.842145113866636
251
+ - type: euclidean_pearson
252
+ value: 41.2188173176347
253
+ - type: euclidean_spearman
254
+ value: 41.16840792962046
255
+ - type: manhattan_pearson
256
+ value: 42.73893519020435
257
+ - type: manhattan_spearman
258
+ value: 44.384746276312534
259
+ - task:
260
+ type: Classification
261
+ dataset:
262
+ type: mteb/banking77
263
+ name: MTEB Banking77Classification
264
+ config: default
265
+ split: test
266
+ revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
267
+ metrics:
268
+ - type: accuracy
269
+ value: 46.03896103896104
270
+ - type: f1
271
+ value: 44.54083818845286
272
+ - task:
273
+ type: Clustering
274
+ dataset:
275
+ type: mteb/biorxiv-clustering-p2p
276
+ name: MTEB BiorxivClusteringP2P
277
+ config: default
278
+ split: test
279
+ revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
280
+ metrics:
281
+ - type: v_measure
282
+ value: 23.113393015706908
283
+ - task:
284
+ type: Clustering
285
+ dataset:
286
+ type: mteb/biorxiv-clustering-s2s
287
+ name: MTEB BiorxivClusteringS2S
288
+ config: default
289
+ split: test
290
+ revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
291
+ metrics:
292
+ - type: v_measure
293
+ value: 12.624675113307488
294
+ - task:
295
+ type: Retrieval
296
+ dataset:
297
+ type: BeIR/cqadupstack
298
+ name: MTEB CQADupstackAndroidRetrieval
299
+ config: default
300
+ split: test
301
+ revision: None
302
+ metrics:
303
+ - type: map_at_1
304
+ value: 10.105
305
+ - type: map_at_10
306
+ value: 13.364
307
+ - type: map_at_100
308
+ value: 13.987
309
+ - type: map_at_1000
310
+ value: 14.08
311
+ - type: map_at_3
312
+ value: 12.447
313
+ - type: map_at_5
314
+ value: 12.992999999999999
315
+ - type: mrr_at_1
316
+ value: 12.876000000000001
317
+ - type: mrr_at_10
318
+ value: 16.252
319
+ - type: mrr_at_100
320
+ value: 16.926
321
+ - type: mrr_at_1000
322
+ value: 17.004
323
+ - type: mrr_at_3
324
+ value: 15.235999999999999
325
+ - type: mrr_at_5
326
+ value: 15.744
327
+ - type: ndcg_at_1
328
+ value: 12.876000000000001
329
+ - type: ndcg_at_10
330
+ value: 15.634999999999998
331
+ - type: ndcg_at_100
332
+ value: 19.173000000000002
333
+ - type: ndcg_at_1000
334
+ value: 22.168
335
+ - type: ndcg_at_3
336
+ value: 14.116999999999999
337
+ - type: ndcg_at_5
338
+ value: 14.767
339
+ - type: precision_at_1
340
+ value: 12.876000000000001
341
+ - type: precision_at_10
342
+ value: 2.761
343
+ - type: precision_at_100
344
+ value: 0.5579999999999999
345
+ - type: precision_at_1000
346
+ value: 0.101
347
+ - type: precision_at_3
348
+ value: 6.676
349
+ - type: precision_at_5
350
+ value: 4.635
351
+ - type: recall_at_1
352
+ value: 10.105
353
+ - type: recall_at_10
354
+ value: 19.767000000000003
355
+ - type: recall_at_100
356
+ value: 36.448
357
+ - type: recall_at_1000
358
+ value: 58.623000000000005
359
+ - type: recall_at_3
360
+ value: 15.087
361
+ - type: recall_at_5
362
+ value: 17.076
363
+ - task:
364
+ type: Retrieval
365
+ dataset:
366
+ type: BeIR/cqadupstack
367
+ name: MTEB CQADupstackEnglishRetrieval
368
+ config: default
369
+ split: test
370
+ revision: None
371
+ metrics:
372
+ - type: map_at_1
373
+ value: 7.249999999999999
374
+ - type: map_at_10
375
+ value: 9.41
376
+ - type: map_at_100
377
+ value: 9.903
378
+ - type: map_at_1000
379
+ value: 9.993
380
+ - type: map_at_3
381
+ value: 8.693
382
+ - type: map_at_5
383
+ value: 9.052
384
+ - type: mrr_at_1
385
+ value: 9.299
386
+ - type: mrr_at_10
387
+ value: 11.907
388
+ - type: mrr_at_100
389
+ value: 12.424
390
+ - type: mrr_at_1000
391
+ value: 12.503
392
+ - type: mrr_at_3
393
+ value: 10.945
394
+ - type: mrr_at_5
395
+ value: 11.413
396
+ - type: ndcg_at_1
397
+ value: 9.299
398
+ - type: ndcg_at_10
399
+ value: 11.278
400
+ - type: ndcg_at_100
401
+ value: 13.904
402
+ - type: ndcg_at_1000
403
+ value: 16.642000000000003
404
+ - type: ndcg_at_3
405
+ value: 9.956
406
+ - type: ndcg_at_5
407
+ value: 10.488
408
+ - type: precision_at_1
409
+ value: 9.299
410
+ - type: precision_at_10
411
+ value: 2.166
412
+ - type: precision_at_100
413
+ value: 0.45399999999999996
414
+ - type: precision_at_1000
415
+ value: 0.089
416
+ - type: precision_at_3
417
+ value: 4.798
418
+ - type: precision_at_5
419
+ value: 3.427
420
+ - type: recall_at_1
421
+ value: 7.249999999999999
422
+ - type: recall_at_10
423
+ value: 14.285
424
+ - type: recall_at_100
425
+ value: 26.588
426
+ - type: recall_at_1000
427
+ value: 46.488
428
+ - type: recall_at_3
429
+ value: 10.309
430
+ - type: recall_at_5
431
+ value: 11.756
432
+ - task:
433
+ type: Retrieval
434
+ dataset:
435
+ type: BeIR/cqadupstack
436
+ name: MTEB CQADupstackGamingRetrieval
437
+ config: default
438
+ split: test
439
+ revision: None
440
+ metrics:
441
+ - type: map_at_1
442
+ value: 11.57
443
+ - type: map_at_10
444
+ value: 15.497
445
+ - type: map_at_100
446
+ value: 16.036
447
+ - type: map_at_1000
448
+ value: 16.122
449
+ - type: map_at_3
450
+ value: 14.309
451
+ - type: map_at_5
452
+ value: 14.895
453
+ - type: mrr_at_1
454
+ value: 13.354
455
+ - type: mrr_at_10
456
+ value: 17.408
457
+ - type: mrr_at_100
458
+ value: 17.936
459
+ - type: mrr_at_1000
460
+ value: 18.015
461
+ - type: mrr_at_3
462
+ value: 16.123
463
+ - type: mrr_at_5
464
+ value: 16.735
465
+ - type: ndcg_at_1
466
+ value: 13.354
467
+ - type: ndcg_at_10
468
+ value: 18.071
469
+ - type: ndcg_at_100
470
+ value: 21.017
471
+ - type: ndcg_at_1000
472
+ value: 23.669999999999998
473
+ - type: ndcg_at_3
474
+ value: 15.644
475
+ - type: ndcg_at_5
476
+ value: 16.618
477
+ - type: precision_at_1
478
+ value: 13.354
479
+ - type: precision_at_10
480
+ value: 2.94
481
+ - type: precision_at_100
482
+ value: 0.481
483
+ - type: precision_at_1000
484
+ value: 0.076
485
+ - type: precision_at_3
486
+ value: 7.001
487
+ - type: precision_at_5
488
+ value: 4.765
489
+ - type: recall_at_1
490
+ value: 11.57
491
+ - type: recall_at_10
492
+ value: 24.147
493
+ - type: recall_at_100
494
+ value: 38.045
495
+ - type: recall_at_1000
496
+ value: 58.648
497
+ - type: recall_at_3
498
+ value: 17.419999999999998
499
+ - type: recall_at_5
500
+ value: 19.875999999999998
501
+ - task:
502
+ type: Retrieval
503
+ dataset:
504
+ type: BeIR/cqadupstack
505
+ name: MTEB CQADupstackGisRetrieval
506
+ config: default
507
+ split: test
508
+ revision: None
509
+ metrics:
510
+ - type: map_at_1
511
+ value: 4.463
512
+ - type: map_at_10
513
+ value: 6.091
514
+ - type: map_at_100
515
+ value: 6.548
516
+ - type: map_at_1000
517
+ value: 6.622
518
+ - type: map_at_3
519
+ value: 5.461
520
+ - type: map_at_5
521
+ value: 5.768
522
+ - type: mrr_at_1
523
+ value: 4.746
524
+ - type: mrr_at_10
525
+ value: 6.431000000000001
526
+ - type: mrr_at_100
527
+ value: 6.941
528
+ - type: mrr_at_1000
529
+ value: 7.016
530
+ - type: mrr_at_3
531
+ value: 5.763
532
+ - type: mrr_at_5
533
+ value: 6.101999999999999
534
+ - type: ndcg_at_1
535
+ value: 4.746
536
+ - type: ndcg_at_10
537
+ value: 7.19
538
+ - type: ndcg_at_100
539
+ value: 9.604
540
+ - type: ndcg_at_1000
541
+ value: 12.086
542
+ - type: ndcg_at_3
543
+ value: 5.88
544
+ - type: ndcg_at_5
545
+ value: 6.429
546
+ - type: precision_at_1
547
+ value: 4.746
548
+ - type: precision_at_10
549
+ value: 1.141
550
+ - type: precision_at_100
551
+ value: 0.249
552
+ - type: precision_at_1000
553
+ value: 0.049
554
+ - type: precision_at_3
555
+ value: 2.448
556
+ - type: precision_at_5
557
+ value: 1.7850000000000001
558
+ - type: recall_at_1
559
+ value: 4.463
560
+ - type: recall_at_10
561
+ value: 10.33
562
+ - type: recall_at_100
563
+ value: 21.578
564
+ - type: recall_at_1000
565
+ value: 41.404
566
+ - type: recall_at_3
567
+ value: 6.816999999999999
568
+ - type: recall_at_5
569
+ value: 8.06
570
+ - task:
571
+ type: Retrieval
572
+ dataset:
573
+ type: BeIR/cqadupstack
574
+ name: MTEB CQADupstackMathematicaRetrieval
575
+ config: default
576
+ split: test
577
+ revision: None
578
+ metrics:
579
+ - type: map_at_1
580
+ value: 1.521
581
+ - type: map_at_10
582
+ value: 2.439
583
+ - type: map_at_100
584
+ value: 2.785
585
+ - type: map_at_1000
586
+ value: 2.858
587
+ - type: map_at_3
588
+ value: 2.091
589
+ - type: map_at_5
590
+ value: 2.2560000000000002
591
+ - type: mrr_at_1
592
+ value: 2.114
593
+ - type: mrr_at_10
594
+ value: 3.216
595
+ - type: mrr_at_100
596
+ value: 3.6319999999999997
597
+ - type: mrr_at_1000
598
+ value: 3.712
599
+ - type: mrr_at_3
600
+ value: 2.778
601
+ - type: mrr_at_5
602
+ value: 2.971
603
+ - type: ndcg_at_1
604
+ value: 2.114
605
+ - type: ndcg_at_10
606
+ value: 3.1910000000000003
607
+ - type: ndcg_at_100
608
+ value: 5.165
609
+ - type: ndcg_at_1000
610
+ value: 7.607
611
+ - type: ndcg_at_3
612
+ value: 2.456
613
+ - type: ndcg_at_5
614
+ value: 2.7439999999999998
615
+ - type: precision_at_1
616
+ value: 2.114
617
+ - type: precision_at_10
618
+ value: 0.634
619
+ - type: precision_at_100
620
+ value: 0.189
621
+ - type: precision_at_1000
622
+ value: 0.049
623
+ - type: precision_at_3
624
+ value: 1.202
625
+ - type: precision_at_5
626
+ value: 0.8959999999999999
627
+ - type: recall_at_1
628
+ value: 1.521
629
+ - type: recall_at_10
630
+ value: 4.8
631
+ - type: recall_at_100
632
+ value: 13.877
633
+ - type: recall_at_1000
634
+ value: 32.1
635
+ - type: recall_at_3
636
+ value: 2.806
637
+ - type: recall_at_5
638
+ value: 3.5520000000000005
639
+ - task:
640
+ type: Retrieval
641
+ dataset:
642
+ type: BeIR/cqadupstack
643
+ name: MTEB CQADupstackPhysicsRetrieval
644
+ config: default
645
+ split: test
646
+ revision: None
647
+ metrics:
648
+ - type: map_at_1
649
+ value: 7.449999999999999
650
+ - type: map_at_10
651
+ value: 10.065
652
+ - type: map_at_100
653
+ value: 10.507
654
+ - type: map_at_1000
655
+ value: 10.599
656
+ - type: map_at_3
657
+ value: 9.017
658
+ - type: map_at_5
659
+ value: 9.603
660
+ - type: mrr_at_1
661
+ value: 9.336
662
+ - type: mrr_at_10
663
+ value: 12.589
664
+ - type: mrr_at_100
665
+ value: 13.086
666
+ - type: mrr_at_1000
667
+ value: 13.161000000000001
668
+ - type: mrr_at_3
669
+ value: 11.373
670
+ - type: mrr_at_5
671
+ value: 12.084999999999999
672
+ - type: ndcg_at_1
673
+ value: 9.336
674
+ - type: ndcg_at_10
675
+ value: 12.299
676
+ - type: ndcg_at_100
677
+ value: 14.780999999999999
678
+ - type: ndcg_at_1000
679
+ value: 17.632
680
+ - type: ndcg_at_3
681
+ value: 10.302
682
+ - type: ndcg_at_5
683
+ value: 11.247
684
+ - type: precision_at_1
685
+ value: 9.336
686
+ - type: precision_at_10
687
+ value: 2.271
688
+ - type: precision_at_100
689
+ value: 0.42300000000000004
690
+ - type: precision_at_1000
691
+ value: 0.08099999999999999
692
+ - type: precision_at_3
693
+ value: 4.909
694
+ - type: precision_at_5
695
+ value: 3.5999999999999996
696
+ - type: recall_at_1
697
+ value: 7.449999999999999
698
+ - type: recall_at_10
699
+ value: 16.891000000000002
700
+ - type: recall_at_100
701
+ value: 28.050000000000004
702
+ - type: recall_at_1000
703
+ value: 49.267
704
+ - type: recall_at_3
705
+ value: 11.187999999999999
706
+ - type: recall_at_5
707
+ value: 13.587
708
+ - task:
709
+ type: Retrieval
710
+ dataset:
711
+ type: BeIR/cqadupstack
712
+ name: MTEB CQADupstackProgrammersRetrieval
713
+ config: default
714
+ split: test
715
+ revision: None
716
+ metrics:
717
+ - type: map_at_1
718
+ value: 4.734
719
+ - type: map_at_10
720
+ value: 7.045999999999999
721
+ - type: map_at_100
722
+ value: 7.564
723
+ - type: map_at_1000
724
+ value: 7.6499999999999995
725
+ - type: map_at_3
726
+ value: 6.21
727
+ - type: map_at_5
728
+ value: 6.617000000000001
729
+ - type: mrr_at_1
730
+ value: 5.936
731
+ - type: mrr_at_10
732
+ value: 8.624
733
+ - type: mrr_at_100
734
+ value: 9.193
735
+ - type: mrr_at_1000
736
+ value: 9.28
737
+ - type: mrr_at_3
738
+ value: 7.725
739
+ - type: mrr_at_5
740
+ value: 8.147
741
+ - type: ndcg_at_1
742
+ value: 5.936
743
+ - type: ndcg_at_10
744
+ value: 8.81
745
+ - type: ndcg_at_100
746
+ value: 11.694
747
+ - type: ndcg_at_1000
748
+ value: 14.526
749
+ - type: ndcg_at_3
750
+ value: 7.140000000000001
751
+ - type: ndcg_at_5
752
+ value: 7.8020000000000005
753
+ - type: precision_at_1
754
+ value: 5.936
755
+ - type: precision_at_10
756
+ value: 1.701
757
+ - type: precision_at_100
758
+ value: 0.366
759
+ - type: precision_at_1000
760
+ value: 0.07200000000000001
761
+ - type: precision_at_3
762
+ value: 3.463
763
+ - type: precision_at_5
764
+ value: 2.557
765
+ - type: recall_at_1
766
+ value: 4.734
767
+ - type: recall_at_10
768
+ value: 12.733
769
+ - type: recall_at_100
770
+ value: 25.982
771
+ - type: recall_at_1000
772
+ value: 47.233999999999995
773
+ - type: recall_at_3
774
+ value: 8.018
775
+ - type: recall_at_5
776
+ value: 9.762
777
+ - task:
778
+ type: Retrieval
779
+ dataset:
780
+ type: BeIR/cqadupstack
781
+ name: MTEB CQADupstackStatsRetrieval
782
+ config: default
783
+ split: test
784
+ revision: None
785
+ metrics:
786
+ - type: map_at_1
787
+ value: 4.293
788
+ - type: map_at_10
789
+ value: 6.146999999999999
790
+ - type: map_at_100
791
+ value: 6.487
792
+ - type: map_at_1000
793
+ value: 6.544999999999999
794
+ - type: map_at_3
795
+ value: 5.6930000000000005
796
+ - type: map_at_5
797
+ value: 5.869
798
+ - type: mrr_at_1
799
+ value: 5.061
800
+ - type: mrr_at_10
801
+ value: 7.1690000000000005
802
+ - type: mrr_at_100
803
+ value: 7.542
804
+ - type: mrr_at_1000
805
+ value: 7.5969999999999995
806
+ - type: mrr_at_3
807
+ value: 6.646000000000001
808
+ - type: mrr_at_5
809
+ value: 6.8229999999999995
810
+ - type: ndcg_at_1
811
+ value: 5.061
812
+ - type: ndcg_at_10
813
+ value: 7.396
814
+ - type: ndcg_at_100
815
+ value: 9.41
816
+ - type: ndcg_at_1000
817
+ value: 11.386000000000001
818
+ - type: ndcg_at_3
819
+ value: 6.454
820
+ - type: ndcg_at_5
821
+ value: 6.718
822
+ - type: precision_at_1
823
+ value: 5.061
824
+ - type: precision_at_10
825
+ value: 1.319
826
+ - type: precision_at_100
827
+ value: 0.262
828
+ - type: precision_at_1000
829
+ value: 0.047
830
+ - type: precision_at_3
831
+ value: 3.0669999999999997
832
+ - type: precision_at_5
833
+ value: 1.994
834
+ - type: recall_at_1
835
+ value: 4.293
836
+ - type: recall_at_10
837
+ value: 10.221
838
+ - type: recall_at_100
839
+ value: 19.744999999999997
840
+ - type: recall_at_1000
841
+ value: 35.399
842
+ - type: recall_at_3
843
+ value: 7.507999999999999
844
+ - type: recall_at_5
845
+ value: 8.275
846
+ - task:
847
+ type: Retrieval
848
+ dataset:
849
+ type: BeIR/cqadupstack
850
+ name: MTEB CQADupstackTexRetrieval
851
+ config: default
852
+ split: test
853
+ revision: None
854
+ metrics:
855
+ - type: map_at_1
856
+ value: 3.519
857
+ - type: map_at_10
858
+ value: 4.768
859
+ - type: map_at_100
860
+ value: 5.034000000000001
861
+ - type: map_at_1000
862
+ value: 5.087
863
+ - type: map_at_3
864
+ value: 4.308
865
+ - type: map_at_5
866
+ value: 4.565
867
+ - type: mrr_at_1
868
+ value: 4.474
869
+ - type: mrr_at_10
870
+ value: 6.045
871
+ - type: mrr_at_100
872
+ value: 6.361999999999999
873
+ - type: mrr_at_1000
874
+ value: 6.417000000000001
875
+ - type: mrr_at_3
876
+ value: 5.483
877
+ - type: mrr_at_5
878
+ value: 5.81
879
+ - type: ndcg_at_1
880
+ value: 4.474
881
+ - type: ndcg_at_10
882
+ value: 5.799
883
+ - type: ndcg_at_100
884
+ value: 7.344
885
+ - type: ndcg_at_1000
886
+ value: 9.141
887
+ - type: ndcg_at_3
888
+ value: 4.893
889
+ - type: ndcg_at_5
890
+ value: 5.309
891
+ - type: precision_at_1
892
+ value: 4.474
893
+ - type: precision_at_10
894
+ value: 1.06
895
+ - type: precision_at_100
896
+ value: 0.217
897
+ - type: precision_at_1000
898
+ value: 0.045
899
+ - type: precision_at_3
900
+ value: 2.306
901
+ - type: precision_at_5
902
+ value: 1.7000000000000002
903
+ - type: recall_at_1
904
+ value: 3.519
905
+ - type: recall_at_10
906
+ value: 7.75
907
+ - type: recall_at_100
908
+ value: 15.049999999999999
909
+ - type: recall_at_1000
910
+ value: 28.779
911
+ - type: recall_at_3
912
+ value: 5.18
913
+ - type: recall_at_5
914
+ value: 6.245
915
+ - task:
916
+ type: Retrieval
917
+ dataset:
918
+ type: BeIR/cqadupstack
919
+ name: MTEB CQADupstackUnixRetrieval
920
+ config: default
921
+ split: test
922
+ revision: None
923
+ metrics:
924
+ - type: map_at_1
925
+ value: 6.098
926
+ - type: map_at_10
927
+ value: 7.918
928
+ - type: map_at_100
929
+ value: 8.229000000000001
930
+ - type: map_at_1000
931
+ value: 8.293000000000001
932
+ - type: map_at_3
933
+ value: 7.138999999999999
934
+ - type: map_at_5
935
+ value: 7.646
936
+ - type: mrr_at_1
937
+ value: 7.090000000000001
938
+ - type: mrr_at_10
939
+ value: 9.293
940
+ - type: mrr_at_100
941
+ value: 9.669
942
+ - type: mrr_at_1000
943
+ value: 9.734
944
+ - type: mrr_at_3
945
+ value: 8.364
946
+ - type: mrr_at_5
947
+ value: 8.956999999999999
948
+ - type: ndcg_at_1
949
+ value: 7.090000000000001
950
+ - type: ndcg_at_10
951
+ value: 9.411999999999999
952
+ - type: ndcg_at_100
953
+ value: 11.318999999999999
954
+ - type: ndcg_at_1000
955
+ value: 13.478000000000002
956
+ - type: ndcg_at_3
957
+ value: 7.837
958
+ - type: ndcg_at_5
959
+ value: 8.73
960
+ - type: precision_at_1
961
+ value: 7.090000000000001
962
+ - type: precision_at_10
963
+ value: 1.558
964
+ - type: precision_at_100
965
+ value: 0.28400000000000003
966
+ - type: precision_at_1000
967
+ value: 0.053
968
+ - type: precision_at_3
969
+ value: 3.42
970
+ - type: precision_at_5
971
+ value: 2.5749999999999997
972
+ - type: recall_at_1
973
+ value: 6.098
974
+ - type: recall_at_10
975
+ value: 12.764000000000001
976
+ - type: recall_at_100
977
+ value: 21.747
978
+ - type: recall_at_1000
979
+ value: 38.279999999999994
980
+ - type: recall_at_3
981
+ value: 8.476
982
+ - type: recall_at_5
983
+ value: 10.707
984
+ - task:
985
+ type: Retrieval
986
+ dataset:
987
+ type: BeIR/cqadupstack
988
+ name: MTEB CQADupstackWebmastersRetrieval
989
+ config: default
990
+ split: test
991
+ revision: None
992
+ metrics:
993
+ - type: map_at_1
994
+ value: 8.607
995
+ - type: map_at_10
996
+ value: 10.835
997
+ - type: map_at_100
998
+ value: 11.285
999
+ - type: map_at_1000
1000
+ value: 11.383000000000001
1001
+ - type: map_at_3
1002
+ value: 10.111
1003
+ - type: map_at_5
1004
+ value: 10.334999999999999
1005
+ - type: mrr_at_1
1006
+ value: 10.671999999999999
1007
+ - type: mrr_at_10
1008
+ value: 13.269
1009
+ - type: mrr_at_100
1010
+ value: 13.729
1011
+ - type: mrr_at_1000
1012
+ value: 13.813
1013
+ - type: mrr_at_3
1014
+ value: 12.385
1015
+ - type: mrr_at_5
1016
+ value: 12.701
1017
+ - type: ndcg_at_1
1018
+ value: 10.671999999999999
1019
+ - type: ndcg_at_10
1020
+ value: 12.728
1021
+ - type: ndcg_at_100
1022
+ value: 15.312999999999999
1023
+ - type: ndcg_at_1000
1024
+ value: 18.160999999999998
1025
+ - type: ndcg_at_3
1026
+ value: 11.355
1027
+ - type: ndcg_at_5
1028
+ value: 11.605
1029
+ - type: precision_at_1
1030
+ value: 10.671999999999999
1031
+ - type: precision_at_10
1032
+ value: 2.154
1033
+ - type: precision_at_100
1034
+ value: 0.455
1035
+ - type: precision_at_1000
1036
+ value: 0.098
1037
+ - type: precision_at_3
1038
+ value: 4.941
1039
+ - type: precision_at_5
1040
+ value: 3.2809999999999997
1041
+ - type: recall_at_1
1042
+ value: 8.607
1043
+ - type: recall_at_10
1044
+ value: 16.398
1045
+ - type: recall_at_100
1046
+ value: 28.92
1047
+ - type: recall_at_1000
1048
+ value: 49.761
1049
+ - type: recall_at_3
1050
+ value: 11.844000000000001
1051
+ - type: recall_at_5
1052
+ value: 12.792
1053
+ - task:
+ type: Retrieval
+ dataset:
+ type: BeIR/cqadupstack
+ name: MTEB CQADupstackWordpressRetrieval
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 3.826
+ - type: map_at_10
+ value: 5.6419999999999995
+ - type: map_at_100
+ value: 5.943
+ - type: map_at_1000
+ value: 6.005
+ - type: map_at_3
+ value: 5.1049999999999995
+ - type: map_at_5
+ value: 5.437
+ - type: mrr_at_1
+ value: 4.436
+ - type: mrr_at_10
+ value: 6.413
+ - type: mrr_at_100
+ value: 6.752
+ - type: mrr_at_1000
+ value: 6.819999999999999
+ - type: mrr_at_3
+ value: 5.884
+ - type: mrr_at_5
+ value: 6.18
+ - type: ndcg_at_1
+ value: 4.436
+ - type: ndcg_at_10
+ value: 6.7989999999999995
+ - type: ndcg_at_100
+ value: 8.619
+ - type: ndcg_at_1000
+ value: 10.842
+ - type: ndcg_at_3
+ value: 5.739
+ - type: ndcg_at_5
+ value: 6.292000000000001
+ - type: precision_at_1
+ value: 4.436
+ - type: precision_at_10
+ value: 1.109
+ - type: precision_at_100
+ value: 0.214
+ - type: precision_at_1000
+ value: 0.043
+ - type: precision_at_3
+ value: 2.588
+ - type: precision_at_5
+ value: 1.848
+ - type: recall_at_1
+ value: 3.826
+ - type: recall_at_10
+ value: 9.655
+ - type: recall_at_100
+ value: 18.611
+ - type: recall_at_1000
+ value: 36.733
+ - type: recall_at_3
+ value: 6.784
+ - type: recall_at_5
+ value: 8.17
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/emotion
+ name: MTEB EmotionClassification
+ config: default
+ split: test
+ revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
+ metrics:
+ - type: accuracy
+ value: 23.279999999999998
+ - type: f1
+ value: 19.87865985032945
+ - task:
+ type: Retrieval
+ dataset:
+ type: fiqa
+ name: MTEB FiQA2018
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 1.166
+ - type: map_at_10
+ value: 2.283
+ - type: map_at_100
+ value: 2.564
+ - type: map_at_1000
+ value: 2.6519999999999997
+ - type: map_at_3
+ value: 1.867
+ - type: map_at_5
+ value: 2.0500000000000003
+ - type: mrr_at_1
+ value: 2.932
+ - type: mrr_at_10
+ value: 4.852
+ - type: mrr_at_100
+ value: 5.306
+ - type: mrr_at_1000
+ value: 5.4
+ - type: mrr_at_3
+ value: 4.141
+ - type: mrr_at_5
+ value: 4.457
+ - type: ndcg_at_1
+ value: 2.932
+ - type: ndcg_at_10
+ value: 3.5709999999999997
+ - type: ndcg_at_100
+ value: 5.489
+ - type: ndcg_at_1000
+ value: 8.309999999999999
+ - type: ndcg_at_3
+ value: 2.773
+ - type: ndcg_at_5
+ value: 2.979
+ - type: precision_at_1
+ value: 2.932
+ - type: precision_at_10
+ value: 1.049
+ - type: precision_at_100
+ value: 0.306
+ - type: precision_at_1000
+ value: 0.077
+ - type: precision_at_3
+ value: 1.8519999999999999
+ - type: precision_at_5
+ value: 1.389
+ - type: recall_at_1
+ value: 1.166
+ - type: recall_at_10
+ value: 5.178
+ - type: recall_at_100
+ value: 13.056999999999999
+ - type: recall_at_1000
+ value: 31.708
+ - type: recall_at_3
+ value: 2.714
+ - type: recall_at_5
+ value: 3.4909999999999997
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/imdb
+ name: MTEB ImdbClassification
+ config: default
+ split: test
+ revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
+ metrics:
+ - type: accuracy
+ value: 56.96359999999999
+ - type: ap
+ value: 54.16760114570921
+ - type: f1
+ value: 56.193845361069116
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/mtop_domain
+ name: MTEB MTOPDomainClassification (en)
+ config: en
+ split: test
+ revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
+ metrics:
+ - type: accuracy
+ value: 64.39808481532147
+ - type: f1
+ value: 63.468270818712625
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/mtop_domain
+ name: MTEB MTOPDomainClassification (de)
+ config: de
+ split: test
+ revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
+ metrics:
+ - type: accuracy
+ value: 53.961679346294744
+ - type: f1
+ value: 51.6707117653683
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/mtop_domain
+ name: MTEB MTOPDomainClassification (es)
+ config: es
+ split: test
+ revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
+ metrics:
+ - type: accuracy
+ value: 57.018012008005336
+ - type: f1
+ value: 54.23413458037234
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/mtop_domain
+ name: MTEB MTOPDomainClassification (fr)
+ config: fr
+ split: test
+ revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
+ metrics:
+ - type: accuracy
+ value: 48.84434700908236
+ - type: f1
+ value: 46.48494180527987
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/mtop_domain
+ name: MTEB MTOPDomainClassification (hi)
+ config: hi
+ split: test
+ revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
+ metrics:
+ - type: accuracy
+ value: 39.7669415561133
+ - type: f1
+ value: 35.50974325529877
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/mtop_domain
+ name: MTEB MTOPDomainClassification (th)
+ config: th
+ split: test
+ revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
+ metrics:
+ - type: accuracy
+ value: 42.589511754068724
+ - type: f1
+ value: 40.47244422785889
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/mtop_intent
+ name: MTEB MTOPIntentClassification (en)
+ config: en
+ split: test
+ revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
+ metrics:
+ - type: accuracy
+ value: 34.01276789785682
+ - type: f1
+ value: 21.256775922291286
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/mtop_intent
+ name: MTEB MTOPIntentClassification (de)
+ config: de
+ split: test
+ revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
+ metrics:
+ - type: accuracy
+ value: 33.285432516201745
+ - type: f1
+ value: 19.841703666811565
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/mtop_intent
+ name: MTEB MTOPIntentClassification (es)
+ config: es
+ split: test
+ revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
+ metrics:
+ - type: accuracy
+ value: 32.121414276184126
+ - type: f1
+ value: 19.34706868150749
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/mtop_intent
+ name: MTEB MTOPIntentClassification (fr)
+ config: fr
+ split: test
+ revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
+ metrics:
+ - type: accuracy
+ value: 26.088318196053866
+ - type: f1
+ value: 17.22608011891254
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/mtop_intent
+ name: MTEB MTOPIntentClassification (hi)
+ config: hi
+ split: test
+ revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
+ metrics:
+ - type: accuracy
+ value: 15.320903549659375
+ - type: f1
+ value: 9.62002916015258
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/mtop_intent
+ name: MTEB MTOPIntentClassification (th)
+ config: th
+ split: test
+ revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
+ metrics:
+ - type: accuracy
+ value: 16.426763110307412
+ - type: f1
+ value: 11.023799171137183
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/medrxiv-clustering-p2p
+ name: MTEB MedrxivClusteringP2P
+ config: default
+ split: test
+ revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
+ metrics:
+ - type: v_measure
+ value: 22.08508717069763
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/medrxiv-clustering-s2s
+ name: MTEB MedrxivClusteringS2S
+ config: default
+ split: test
+ revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
+ metrics:
+ - type: v_measure
+ value: 16.58582885790446
+ - task:
+ type: Retrieval
+ dataset:
+ type: nfcorpus
+ name: MTEB NFCorpus
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 1.2
+ - type: map_at_10
+ value: 1.6400000000000001
+ - type: map_at_100
+ value: 1.9789999999999999
+ - type: map_at_1000
+ value: 2.554
+ - type: map_at_3
+ value: 1.4449999999999998
+ - type: map_at_5
+ value: 1.533
+ - type: mrr_at_1
+ value: 6.811
+ - type: mrr_at_10
+ value: 11.068999999999999
+ - type: mrr_at_100
+ value: 12.454
+ - type: mrr_at_1000
+ value: 12.590000000000002
+ - type: mrr_at_3
+ value: 9.751999999999999
+ - type: mrr_at_5
+ value: 10.31
+ - type: ndcg_at_1
+ value: 6.3469999999999995
+ - type: ndcg_at_10
+ value: 4.941
+ - type: ndcg_at_100
+ value: 6.524000000000001
+ - type: ndcg_at_1000
+ value: 15.918
+ - type: ndcg_at_3
+ value: 5.959
+ - type: ndcg_at_5
+ value: 5.395
+ - type: precision_at_1
+ value: 6.811
+ - type: precision_at_10
+ value: 3.375
+ - type: precision_at_100
+ value: 2.0709999999999997
+ - type: precision_at_1000
+ value: 1.313
+ - type: precision_at_3
+ value: 5.47
+ - type: precision_at_5
+ value: 4.396
+ - type: recall_at_1
+ value: 1.2
+ - type: recall_at_10
+ value: 2.5909999999999997
+ - type: recall_at_100
+ value: 9.443999999999999
+ - type: recall_at_1000
+ value: 41.542
+ - type: recall_at_3
+ value: 1.702
+ - type: recall_at_5
+ value: 1.9879999999999998
+ - task:
+ type: Retrieval
+ dataset:
+ type: quora
+ name: MTEB QuoraRetrieval
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 45.708
+ - type: map_at_10
+ value: 55.131
+ - type: map_at_100
+ value: 55.935
+ - type: map_at_1000
+ value: 55.993
+ - type: map_at_3
+ value: 52.749
+ - type: map_at_5
+ value: 54.166000000000004
+ - type: mrr_at_1
+ value: 52.44
+ - type: mrr_at_10
+ value: 59.99
+ - type: mrr_at_100
+ value: 60.492999999999995
+ - type: mrr_at_1000
+ value: 60.522
+ - type: mrr_at_3
+ value: 58.285
+ - type: mrr_at_5
+ value: 59.305
+ - type: ndcg_at_1
+ value: 52.43
+ - type: ndcg_at_10
+ value: 59.873
+ - type: ndcg_at_100
+ value: 63.086
+ - type: ndcg_at_1000
+ value: 64.291
+ - type: ndcg_at_3
+ value: 56.291000000000004
+ - type: ndcg_at_5
+ value: 58.071
+ - type: precision_at_1
+ value: 52.43
+ - type: precision_at_10
+ value: 8.973
+ - type: precision_at_100
+ value: 1.161
+ - type: precision_at_1000
+ value: 0.134
+ - type: precision_at_3
+ value: 24.177
+ - type: precision_at_5
+ value: 16.073999999999998
+ - type: recall_at_1
+ value: 45.708
+ - type: recall_at_10
+ value: 69.195
+ - type: recall_at_100
+ value: 82.812
+ - type: recall_at_1000
+ value: 91.136
+ - type: recall_at_3
+ value: 58.938
+ - type: recall_at_5
+ value: 63.787000000000006
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/reddit-clustering
+ name: MTEB RedditClustering
+ config: default
+ split: test
+ revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
+ metrics:
+ - type: v_measure
+ value: 13.142048230676806
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/reddit-clustering-p2p
+ name: MTEB RedditClusteringP2P
+ config: default
+ split: test
+ revision: 282350215ef01743dc01b456c7f5241fa8937f16
+ metrics:
+ - type: v_measure
+ value: 26.06687178917052
+ - task:
+ type: Retrieval
+ dataset:
+ type: scidocs
+ name: MTEB SCIDOCS
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 0.46499999999999997
+ - type: map_at_10
+ value: 0.906
+ - type: map_at_100
+ value: 1.127
+ - type: map_at_1000
+ value: 1.203
+ - type: map_at_3
+ value: 0.72
+ - type: map_at_5
+ value: 0.814
+ - type: mrr_at_1
+ value: 2.3
+ - type: mrr_at_10
+ value: 3.733
+ - type: mrr_at_100
+ value: 4.295999999999999
+ - type: mrr_at_1000
+ value: 4.412
+ - type: mrr_at_3
+ value: 3.183
+ - type: mrr_at_5
+ value: 3.458
+ - type: ndcg_at_1
+ value: 2.3
+ - type: ndcg_at_10
+ value: 1.797
+ - type: ndcg_at_100
+ value: 3.376
+ - type: ndcg_at_1000
+ value: 6.143
+ - type: ndcg_at_3
+ value: 1.763
+ - type: ndcg_at_5
+ value: 1.5070000000000001
+ - type: precision_at_1
+ value: 2.3
+ - type: precision_at_10
+ value: 0.91
+ - type: precision_at_100
+ value: 0.32399999999999995
+ - type: precision_at_1000
+ value: 0.101
+ - type: precision_at_3
+ value: 1.633
+ - type: precision_at_5
+ value: 1.3
+ - type: recall_at_1
+ value: 0.46499999999999997
+ - type: recall_at_10
+ value: 1.8499999999999999
+ - type: recall_at_100
+ value: 6.625
+ - type: recall_at_1000
+ value: 20.587
+ - type: recall_at_3
+ value: 0.9900000000000001
+ - type: recall_at_5
+ value: 1.315
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sickr-sts
+ name: MTEB SICK-R
+ config: default
+ split: test
+ revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
+ metrics:
+ - type: cos_sim_pearson
+ value: 60.78961481918511
+ - type: cos_sim_spearman
+ value: 54.92014630234372
+ - type: euclidean_pearson
+ value: 54.91456364340953
+ - type: euclidean_spearman
+ value: 50.95537043206628
+ - type: manhattan_pearson
+ value: 55.0450005071106
+ - type: manhattan_spearman
+ value: 51.227579527791654
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts12-sts
+ name: MTEB STS12
+ config: default
+ split: test
+ revision: a0d554a64d88156834ff5ae9920b964011b16384
+ metrics:
+ - type: cos_sim_pearson
+ value: 43.73124494569395
+ - type: cos_sim_spearman
+ value: 43.07629933550637
+ - type: euclidean_pearson
+ value: 37.2529484210563
+ - type: euclidean_spearman
+ value: 36.68421330216546
+ - type: manhattan_pearson
+ value: 37.41673219009712
+ - type: manhattan_spearman
+ value: 36.92073705702668
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts13-sts
+ name: MTEB STS13
+ config: default
+ split: test
+ revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
+ metrics:
+ - type: cos_sim_pearson
+ value: 57.17534157059787
+ - type: cos_sim_spearman
+ value: 56.86679858348438
+ - type: euclidean_pearson
+ value: 54.51552371857776
+ - type: euclidean_spearman
+ value: 53.80989851917749
+ - type: manhattan_pearson
+ value: 54.44486043632584
+ - type: manhattan_spearman
+ value: 53.83487353949481
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts14-sts
+ name: MTEB STS14
+ config: default
+ split: test
+ revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
+ metrics:
+ - type: cos_sim_pearson
+ value: 52.319034960820375
+ - type: cos_sim_spearman
+ value: 50.89512224974754
+ - type: euclidean_pearson
+ value: 49.19308209408045
+ - type: euclidean_spearman
+ value: 47.45736923614355
+ - type: manhattan_pearson
+ value: 48.82127080055118
+ - type: manhattan_spearman
+ value: 47.20185686489298
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts15-sts
+ name: MTEB STS15
+ config: default
+ split: test
+ revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
+ metrics:
+ - type: cos_sim_pearson
+ value: 61.57602956458427
+ - type: cos_sim_spearman
+ value: 62.894640061838956
+ - type: euclidean_pearson
+ value: 53.86893407586029
+ - type: euclidean_spearman
+ value: 54.68528520514299
+ - type: manhattan_pearson
+ value: 53.689614981956815
+ - type: manhattan_spearman
+ value: 54.51172839699876
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts16-sts
+ name: MTEB STS16
+ config: default
+ split: test
+ revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
+ metrics:
+ - type: cos_sim_pearson
+ value: 56.2305694109318
+ - type: cos_sim_spearman
+ value: 57.885939000786045
+ - type: euclidean_pearson
+ value: 50.486043353701994
+ - type: euclidean_spearman
+ value: 50.4463227974027
+ - type: manhattan_pearson
+ value: 50.73317560427465
+ - type: manhattan_spearman
+ value: 50.81397877006027
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts17-crosslingual-sts
+ name: MTEB STS17 (ko-ko)
+ config: ko-ko
+ split: test
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
+ metrics:
+ - type: cos_sim_pearson
+ value: 55.52162058025664
+ - type: cos_sim_spearman
+ value: 59.02220327783535
+ - type: euclidean_pearson
+ value: 55.66332330866701
+ - type: euclidean_spearman
+ value: 56.829076266662206
+ - type: manhattan_pearson
+ value: 55.39181385186973
+ - type: manhattan_spearman
+ value: 56.607432176121144
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts17-crosslingual-sts
+ name: MTEB STS17 (ar-ar)
+ config: ar-ar
+ split: test
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
+ metrics:
+ - type: cos_sim_pearson
+ value: 46.312186899914906
+ - type: cos_sim_spearman
+ value: 48.07172073934163
+ - type: euclidean_pearson
+ value: 46.957276350776695
+ - type: euclidean_spearman
+ value: 43.98800593212707
+ - type: manhattan_pearson
+ value: 46.910805787619914
+ - type: manhattan_spearman
+ value: 43.96662723946553
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts17-crosslingual-sts
+ name: MTEB STS17 (en-ar)
+ config: en-ar
+ split: test
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
+ metrics:
+ - type: cos_sim_pearson
+ value: 16.222172523403835
+ - type: cos_sim_spearman
+ value: 17.230258645779042
+ - type: euclidean_pearson
+ value: -6.781460243147299
+ - type: euclidean_spearman
+ value: -6.884123336780775
+ - type: manhattan_pearson
+ value: -4.369061881907372
+ - type: manhattan_spearman
+ value: -4.235845433380353
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts17-crosslingual-sts
+ name: MTEB STS17 (en-de)
+ config: en-de
+ split: test
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
+ metrics:
+ - type: cos_sim_pearson
+ value: 7.462476431657987
+ - type: cos_sim_spearman
+ value: 5.875270645234161
+ - type: euclidean_pearson
+ value: -10.79494346180473
+ - type: euclidean_spearman
+ value: -11.704529023304776
+ - type: manhattan_pearson
+ value: -11.465867974964997
+ - type: manhattan_spearman
+ value: -12.428424608287173
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts17-crosslingual-sts
+ name: MTEB STS17 (en-en)
+ config: en-en
+ split: test
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
+ metrics:
+ - type: cos_sim_pearson
+ value: 61.46601840758559
+ - type: cos_sim_spearman
+ value: 65.69667638887147
+ - type: euclidean_pearson
+ value: 49.531065525619866
+ - type: euclidean_spearman
+ value: 53.880480167479725
+ - type: manhattan_pearson
+ value: 50.25462221374689
+ - type: manhattan_spearman
+ value: 54.22205494276401
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts17-crosslingual-sts
+ name: MTEB STS17 (en-tr)
+ config: en-tr
+ split: test
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
+ metrics:
+ - type: cos_sim_pearson
+ value: -12.769479370624031
+ - type: cos_sim_spearman
+ value: -12.161427312728382
+ - type: euclidean_pearson
+ value: -27.950593491756536
+ - type: euclidean_spearman
+ value: -24.925281959398585
+ - type: manhattan_pearson
+ value: -25.98778888167475
+ - type: manhattan_spearman
+ value: -22.861942388867234
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts17-crosslingual-sts
+ name: MTEB STS17 (es-en)
+ config: es-en
+ split: test
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
+ metrics:
+ - type: cos_sim_pearson
+ value: 2.1575763564561727
+ - type: cos_sim_spearman
+ value: 1.182204089411577
+ - type: euclidean_pearson
+ value: -10.389249806317189
+ - type: euclidean_spearman
+ value: -16.078659904264605
+ - type: manhattan_pearson
+ value: -9.674301846448607
+ - type: manhattan_spearman
+ value: -16.976576817518577
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts17-crosslingual-sts
+ name: MTEB STS17 (es-es)
+ config: es-es
+ split: test
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
+ metrics:
+ - type: cos_sim_pearson
+ value: 66.16718583059163
+ - type: cos_sim_spearman
+ value: 69.95156267898052
+ - type: euclidean_pearson
+ value: 64.93174777029739
+ - type: euclidean_spearman
+ value: 66.21292533974568
+ - type: manhattan_pearson
+ value: 65.2578109632889
+ - type: manhattan_spearman
+ value: 66.21830865759128
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts17-crosslingual-sts
+ name: MTEB STS17 (fr-en)
+ config: fr-en
+ split: test
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
+ metrics:
+ - type: cos_sim_pearson
+ value: 0.1540829683540524
+ - type: cos_sim_spearman
+ value: -2.4072834011003987
+ - type: euclidean_pearson
+ value: -18.951775877513473
+ - type: euclidean_spearman
+ value: -18.393605606817527
+ - type: manhattan_pearson
+ value: -19.609633839454542
+ - type: manhattan_spearman
+ value: -19.276064769117912
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts17-crosslingual-sts
+ name: MTEB STS17 (it-en)
+ config: it-en
+ split: test
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
+ metrics:
+ - type: cos_sim_pearson
+ value: -4.22497246932717
+ - type: cos_sim_spearman
+ value: -5.747420352346977
+ - type: euclidean_pearson
+ value: -16.86351349130112
+ - type: euclidean_spearman
+ value: -16.555536618547382
+ - type: manhattan_pearson
+ value: -17.45445643482646
+ - type: manhattan_spearman
+ value: -17.97322953856309
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts17-crosslingual-sts
+ name: MTEB STS17 (nl-en)
+ config: nl-en
+ split: test
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
+ metrics:
+ - type: cos_sim_pearson
+ value: 8.559184021676034
+ - type: cos_sim_spearman
+ value: 5.600273352595882
+ - type: euclidean_pearson
+ value: -10.76482859283058
+ - type: euclidean_spearman
+ value: -9.575202768285926
+ - type: manhattan_pearson
+ value: -9.48508597350615
+ - type: manhattan_spearman
+ value: -9.33387861352172
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (en)
+ config: en
+ split: test
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
+ metrics:
+ - type: cos_sim_pearson
+ value: 30.260087169228978
+ - type: cos_sim_spearman
+ value: 43.264174903196015
+ - type: euclidean_pearson
+ value: 35.07785877281954
+ - type: euclidean_spearman
+ value: 43.41294719372452
+ - type: manhattan_pearson
+ value: 36.74996284702431
+ - type: manhattan_spearman
+ value: 43.53522851890142
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (de)
+ config: de
+ split: test
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
+ metrics:
+ - type: cos_sim_pearson
+ value: 5.58694979115026
+ - type: cos_sim_spearman
+ value: 32.80692337371332
+ - type: euclidean_pearson
+ value: 10.53180875461474
+ - type: euclidean_spearman
+ value: 31.105269938654033
+ - type: manhattan_pearson
+ value: 10.559778015974826
+ - type: manhattan_spearman
+ value: 31.452204563072044
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (es)
+ config: es
+ split: test
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
+ metrics:
+ - type: cos_sim_pearson
+ value: 10.593783873928478
+ - type: cos_sim_spearman
+ value: 50.397542574042006
+ - type: euclidean_pearson
+ value: 28.122179063209714
+ - type: euclidean_spearman
+ value: 50.72847867996529
+ - type: manhattan_pearson
+ value: 28.730690148465005
+ - type: manhattan_spearman
+ value: 51.019761292483366
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (pl)
+ config: pl
+ split: test
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
+ metrics:
+ - type: cos_sim_pearson
+ value: -1.3049499265017876
+ - type: cos_sim_spearman
+ value: 16.347130048706084
+ - type: euclidean_pearson
+ value: 0.5710147274110128
+ - type: euclidean_spearman
+ value: 16.589843077857605
+ - type: manhattan_pearson
+ value: 1.1226404198336415
+ - type: manhattan_spearman
+ value: 16.410620108636557
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (tr)
+ config: tr
+ split: test
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
+ metrics:
+ - type: cos_sim_pearson
+ value: -10.96861909019159
+ - type: cos_sim_spearman
+ value: 24.536979219880724
+ - type: euclidean_pearson
+ value: -1.3040190807315306
+ - type: euclidean_spearman
+ value: 25.061584673761928
+ - type: manhattan_pearson
+ value: -0.06525719745037804
+ - type: manhattan_spearman
+ value: 25.979295538386893
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (ar)
+ config: ar
+ split: test
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
+ metrics:
+ - type: cos_sim_pearson
+ value: 1.0599417065503314
+ - type: cos_sim_spearman
+ value: 52.055853787103345
+ - type: euclidean_pearson
+ value: 23.666828441081776
+ - type: euclidean_spearman
+ value: 52.38656753170069
+ - type: manhattan_pearson
+ value: 23.398080463967215
+ - type: manhattan_spearman
+ value: 52.23849717509109
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (ru)
+ config: ru
+ split: test
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
+ metrics:
+ - type: cos_sim_pearson
+ value: -2.847646040977239
+ - type: cos_sim_spearman
+ value: 40.5826838357407
+ - type: euclidean_pearson
+ value: 9.242304983683113
+ - type: euclidean_spearman
+ value: 40.35906851022345
+ - type: manhattan_pearson
+ value: 9.645663412799504
+ - type: manhattan_spearman
+ value: 40.78106154950966
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (zh)
+ config: zh
+ split: test
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
+ metrics:
+ - type: cos_sim_pearson
+ value: 17.761397832130992
+ - type: cos_sim_spearman
+ value: 59.98756452345925
+ - type: euclidean_pearson
+ value: 37.03125109036693
+ - type: euclidean_spearman
+ value: 59.58469212715707
+ - type: manhattan_pearson
+ value: 36.828102137170724
+ - type: manhattan_spearman
+ value: 59.07036501478588
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (fr)
+ config: fr
+ split: test
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
+ metrics:
+ - type: cos_sim_pearson
+ value: 22.281212883400205
+ - type: cos_sim_spearman
+ value: 48.27687537627578
+ - type: euclidean_pearson
+ value: 30.531395629285324
+ - type: euclidean_spearman
+ value: 50.349143748970384
+ - type: manhattan_pearson
+ value: 30.48762081986554
+ - type: manhattan_spearman
+ value: 50.66037165529169
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (de-en)
+ config: de-en
+ split: test
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
+ metrics:
+ - type: cos_sim_pearson
+ value: 15.76679673990358
+ - type: cos_sim_spearman
+ value: 19.123349126370442
+ - type: euclidean_pearson
+ value: 19.21389203087116
+ - type: euclidean_spearman
+ value: 23.63276413160338
+ - type: manhattan_pearson
+ value: 18.789263824907053
+ - type: manhattan_spearman
+ value: 19.962703178974692
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (es-en)
+ config: es-en
+ split: test
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
+ metrics:
+ - type: cos_sim_pearson
+ value: 11.024970397289941
+ - type: cos_sim_spearman
+ value: 13.530951900755017
+ - type: euclidean_pearson
+ value: 13.473514585343645
+ - type: euclidean_spearman
+ value: 16.754702023734914
+ - type: manhattan_pearson
+ value: 13.72847275970385
+ - type: manhattan_spearman
+ value: 16.673001637012348
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (it)
+ config: it
+ split: test
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
+ metrics:
+ - type: cos_sim_pearson
+ value: 33.32761589409043
+ - type: cos_sim_spearman
+ value: 54.14305778960692
+ - type: euclidean_pearson
+ value: 45.30173241170555
+ - type: euclidean_spearman
+ value: 54.77422257007743
+ - type: manhattan_pearson
+ value: 45.41890064000217
+ - type: manhattan_spearman
+ value: 54.533788920795544
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (pl-en)
+ config: pl-en
+ split: test
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
+ metrics:
+ - type: cos_sim_pearson
+ value: 20.045210048995486
+ - type: cos_sim_spearman
+ value: 17.597101329633823
+ - type: euclidean_pearson
+ value: 32.531726142346145
+ - type: euclidean_spearman
+ value: 27.244772040848105
+ - type: manhattan_pearson
+ value: 32.74618458514601
+ - type: manhattan_spearman
+ value: 25.81220754539242
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (zh-en)
+ config: zh-en
+ split: test
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
+ metrics:
+ - type: cos_sim_pearson
+ value: -13.832846350193021
+ - type: cos_sim_spearman
+ value: -8.406778050457863
+ - type: euclidean_pearson
+ value: -6.557254855697437
+ - type: euclidean_spearman
+ value: -3.5112770921588563
+ - type: manhattan_pearson
+ value: -6.493730738275641
+ - type: manhattan_spearman
+ value: -2.5922348401468365
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (es-it)
+ config: es-it
+ split: test
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
+ metrics:
+ - type: cos_sim_pearson
+ value: 26.357929743436664
+ - type: cos_sim_spearman
+ value: 37.3417709718339
+ - type: euclidean_pearson
+ value: 30.930792572341293
+ - type: euclidean_spearman
+ value: 36.061866364725795
+ - type: manhattan_pearson
+ value: 31.56982745863155
2296
+ - type: manhattan_spearman
2297
+ value: 37.18529502311113
2298
+ - task:
2299
+ type: STS
2300
+ dataset:
2301
+ type: mteb/sts22-crosslingual-sts
2302
+ name: MTEB STS22 (de-fr)
2303
+ config: de-fr
2304
+ split: test
2305
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
2306
+ metrics:
2307
+ - type: cos_sim_pearson
2308
+ value: 9.310102041071547
2309
+ - type: cos_sim_spearman
2310
+ value: 10.907002693108673
2311
+ - type: euclidean_pearson
2312
+ value: 7.361793742296021
2313
+ - type: euclidean_spearman
2314
+ value: 9.53967881391466
2315
+ - type: manhattan_pearson
2316
+ value: 8.017048631719996
2317
+ - type: manhattan_spearman
2318
+ value: 13.537860190039725
2319
+ - task:
2320
+ type: STS
2321
+ dataset:
2322
+ type: mteb/sts22-crosslingual-sts
2323
+ name: MTEB STS22 (de-pl)
2324
+ config: de-pl
2325
+ split: test
2326
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
2327
+ metrics:
2328
+ - type: cos_sim_pearson
2329
+ value: -5.534456407419709
2330
+ - type: cos_sim_spearman
2331
+ value: 17.552638994787724
2332
+ - type: euclidean_pearson
2333
+ value: -10.136558594355556
2334
+ - type: euclidean_spearman
2335
+ value: 11.055083156366303
2336
+ - type: manhattan_pearson
2337
+ value: -11.799223055640773
2338
+ - type: manhattan_spearman
2339
+ value: 1.416528760982869
2340
+ - task:
2341
+ type: STS
2342
+ dataset:
2343
+ type: mteb/sts22-crosslingual-sts
2344
+ name: MTEB STS22 (fr-pl)
2345
+ config: fr-pl
2346
+ split: test
2347
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
2348
+ metrics:
2349
+ - type: cos_sim_pearson
2350
+ value: 48.64639760720344
2351
+ - type: cos_sim_spearman
2352
+ value: 39.440531887330785
2353
+ - type: euclidean_pearson
2354
+ value: 37.75527464173489
2355
+ - type: euclidean_spearman
2356
+ value: 39.440531887330785
2357
+ - type: manhattan_pearson
2358
+ value: 32.324715276369474
2359
+ - type: manhattan_spearman
2360
+ value: 28.17180849095055
2361
+ - task:
+ type: STS
+ dataset:
+ type: mteb/stsbenchmark-sts
+ name: MTEB STSBenchmark
+ config: default
+ split: test
+ revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
+ metrics:
+ - type: cos_sim_pearson
+ value: 44.667456983937
+ - type: cos_sim_spearman
+ value: 46.04327333618551
+ - type: euclidean_pearson
+ value: 44.583522824155104
+ - type: euclidean_spearman
+ value: 44.77184813864239
+ - type: manhattan_pearson
+ value: 44.54496373721756
+ - type: manhattan_spearman
+ value: 44.830873857115996
+ - task:
+ type: Reranking
+ dataset:
+ type: mteb/scidocs-reranking
+ name: MTEB SciDocsRR
+ config: default
+ split: test
+ revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
+ metrics:
+ - type: map
+ value: 49.756063724243
+ - type: mrr
+ value: 75.29077585450135
+ - task:
+ type: Retrieval
+ dataset:
+ type: scifact
+ name: MTEB SciFact
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 14.194
+ - type: map_at_10
+ value: 18.756999999999998
+ - type: map_at_100
+ value: 19.743
+ - type: map_at_1000
+ value: 19.865
+ - type: map_at_3
+ value: 16.986
+ - type: map_at_5
+ value: 18.024
+ - type: mrr_at_1
+ value: 15.0
+ - type: mrr_at_10
+ value: 19.961000000000002
+ - type: mrr_at_100
+ value: 20.875
+ - type: mrr_at_1000
+ value: 20.982
+ - type: mrr_at_3
+ value: 18.056
+ - type: mrr_at_5
+ value: 19.406000000000002
+ - type: ndcg_at_1
+ value: 15.0
+ - type: ndcg_at_10
+ value: 21.775
+ - type: ndcg_at_100
+ value: 26.8
+ - type: ndcg_at_1000
+ value: 30.468
+ - type: ndcg_at_3
+ value: 18.199
+ - type: ndcg_at_5
+ value: 20.111
+ - type: precision_at_1
+ value: 15.0
+ - type: precision_at_10
+ value: 3.4000000000000004
+ - type: precision_at_100
+ value: 0.607
+ - type: precision_at_1000
+ value: 0.094
+ - type: precision_at_3
+ value: 7.444000000000001
+ - type: precision_at_5
+ value: 5.6000000000000005
+ - type: recall_at_1
+ value: 14.194
+ - type: recall_at_10
+ value: 30.0
+ - type: recall_at_100
+ value: 53.911
+ - type: recall_at_1000
+ value: 83.289
+ - type: recall_at_3
+ value: 20.556
+ - type: recall_at_5
+ value: 24.972
+ - task:
+ type: PairClassification
+ dataset:
+ type: mteb/sprintduplicatequestions-pairclassification
+ name: MTEB SprintDuplicateQuestions
+ config: default
+ split: test
+ revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
+ metrics:
+ - type: cos_sim_accuracy
+ value: 99.35544554455446
+ - type: cos_sim_ap
+ value: 62.596006705300724
+ - type: cos_sim_f1
+ value: 60.80283353010627
+ - type: cos_sim_precision
+ value: 74.20749279538906
+ - type: cos_sim_recall
+ value: 51.5
+ - type: dot_accuracy
+ value: 99.13564356435643
+ - type: dot_ap
+ value: 43.87589686325114
+ - type: dot_f1
+ value: 46.99663623258049
+ - type: dot_precision
+ value: 45.235892691951896
+ - type: dot_recall
+ value: 48.9
+ - type: euclidean_accuracy
+ value: 99.2
+ - type: euclidean_ap
+ value: 43.44660755386079
+ - type: euclidean_f1
+ value: 45.9016393442623
+ - type: euclidean_precision
+ value: 52.79583875162549
+ - type: euclidean_recall
+ value: 40.6
+ - type: manhattan_accuracy
+ value: 99.2
+ - type: manhattan_ap
+ value: 43.11790011749347
+ - type: manhattan_f1
+ value: 45.11023176936122
+ - type: manhattan_precision
+ value: 51.88556566970091
+ - type: manhattan_recall
+ value: 39.900000000000006
+ - type: max_accuracy
+ value: 99.35544554455446
+ - type: max_ap
+ value: 62.596006705300724
+ - type: max_f1
+ value: 60.80283353010627
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/stackexchange-clustering
+ name: MTEB StackExchangeClustering
+ config: default
+ split: test
+ revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
+ metrics:
+ - type: v_measure
+ value: 25.71674282500873
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/stackexchange-clustering-p2p
+ name: MTEB StackExchangeClusteringP2P
+ config: default
+ split: test
+ revision: 815ca46b2622cec33ccafc3735d572c266efdb44
+ metrics:
+ - type: v_measure
+ value: 25.465780711520985
+ - task:
+ type: Reranking
+ dataset:
+ type: mteb/stackoverflowdupquestions-reranking
+ name: MTEB StackOverflowDupQuestions
+ config: default
+ split: test
+ revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
+ metrics:
+ - type: map
+ value: 35.35656209427094
+ - type: mrr
+ value: 35.10693860877685
+ - task:
+ type: Retrieval
+ dataset:
+ type: trec-covid
+ name: MTEB TRECCOVID
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 0.074
+ - type: map_at_10
+ value: 0.47400000000000003
+ - type: map_at_100
+ value: 1.825
+ - type: map_at_1000
+ value: 4.056
+ - type: map_at_3
+ value: 0.199
+ - type: map_at_5
+ value: 0.301
+ - type: mrr_at_1
+ value: 34.0
+ - type: mrr_at_10
+ value: 46.06
+ - type: mrr_at_100
+ value: 47.506
+ - type: mrr_at_1000
+ value: 47.522999999999996
+ - type: mrr_at_3
+ value: 44.0
+ - type: mrr_at_5
+ value: 44.4
+ - type: ndcg_at_1
+ value: 32.0
+ - type: ndcg_at_10
+ value: 28.633999999999997
+ - type: ndcg_at_100
+ value: 18.547
+ - type: ndcg_at_1000
+ value: 16.142
+ - type: ndcg_at_3
+ value: 32.48
+ - type: ndcg_at_5
+ value: 31.163999999999998
+ - type: precision_at_1
+ value: 34.0
+ - type: precision_at_10
+ value: 30.4
+ - type: precision_at_100
+ value: 18.54
+ - type: precision_at_1000
+ value: 7.942
+ - type: precision_at_3
+ value: 35.333
+ - type: precision_at_5
+ value: 34.0
+ - type: recall_at_1
+ value: 0.074
+ - type: recall_at_10
+ value: 0.641
+ - type: recall_at_100
+ value: 3.675
+ - type: recall_at_1000
+ value: 15.706000000000001
+ - type: recall_at_3
+ value: 0.231
+ - type: recall_at_5
+ value: 0.367
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/toxic_conversations_50k
+ name: MTEB ToxicConversationsClassification
+ config: default
+ split: test
+ revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
+ metrics:
+ - type: accuracy
+ value: 54.625600000000006
+ - type: ap
+ value: 9.425323874806459
+ - type: f1
+ value: 42.38724794017267
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/tweet_sentiment_extraction
+ name: MTEB TweetSentimentExtractionClassification
+ config: default
+ split: test
+ revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
+ metrics:
+ - type: accuracy
+ value: 42.8494623655914
+ - type: f1
+ value: 42.66062148844617
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/twentynewsgroups-clustering
+ name: MTEB TwentyNewsgroupsClustering
+ config: default
+ split: test
+ revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
+ metrics:
+ - type: v_measure
+ value: 12.464890895237952
+ - task:
+ type: PairClassification
+ dataset:
+ type: mteb/twittersemeval2015-pairclassification
+ name: MTEB TwitterSemEval2015
+ config: default
+ split: test
+ revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
+ metrics:
+ - type: cos_sim_accuracy
+ value: 79.97854205161829
+ - type: cos_sim_ap
+ value: 47.45175747605773
+ - type: cos_sim_f1
+ value: 46.55775962660444
+ - type: cos_sim_precision
+ value: 41.73640167364017
+ - type: cos_sim_recall
+ value: 52.638522427440634
+ - type: dot_accuracy
+ value: 77.76718126005842
+ - type: dot_ap
+ value: 35.97737653101504
+ - type: dot_f1
+ value: 41.1975475754439
+ - type: dot_precision
+ value: 29.50165355228646
+ - type: dot_recall
+ value: 68.25857519788919
+ - type: euclidean_accuracy
+ value: 79.34076414138403
+ - type: euclidean_ap
+ value: 45.309577778755134
+ - type: euclidean_f1
+ value: 45.09938313913639
+ - type: euclidean_precision
+ value: 39.76631748589847
+ - type: euclidean_recall
+ value: 52.0844327176781
+ - type: manhattan_accuracy
+ value: 79.31692197651546
+ - type: manhattan_ap
+ value: 45.2433373222626
+ - type: manhattan_f1
+ value: 45.04624986069319
+ - type: manhattan_precision
+ value: 38.99286127725256
+ - type: manhattan_recall
+ value: 53.324538258575195
+ - type: max_accuracy
+ value: 79.97854205161829
+ - type: max_ap
+ value: 47.45175747605773
+ - type: max_f1
+ value: 46.55775962660444
+ - task:
+ type: PairClassification
+ dataset:
+ type: mteb/twitterurlcorpus-pairclassification
+ name: MTEB TwitterURLCorpus
+ config: default
+ split: test
+ revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
+ metrics:
+ - type: cos_sim_accuracy
+ value: 81.76737687740133
+ - type: cos_sim_ap
+ value: 64.59241956109807
+ - type: cos_sim_f1
+ value: 57.83203629255339
+ - type: cos_sim_precision
+ value: 55.50442477876106
+ - type: cos_sim_recall
+ value: 60.363412380659064
+ - type: dot_accuracy
+ value: 78.96922420149805
+ - type: dot_ap
+ value: 56.11775087282065
+ - type: dot_f1
+ value: 52.92134831460675
+ - type: dot_precision
+ value: 51.524212368728115
+ - type: dot_recall
+ value: 54.39636587619341
+ - type: euclidean_accuracy
+ value: 80.8611790274382
+ - type: euclidean_ap
+ value: 61.28070098354092
+ - type: euclidean_f1
+ value: 54.58334971882497
+ - type: euclidean_precision
+ value: 55.783297162607504
+ - type: euclidean_recall
+ value: 53.43393902063443
+ - type: manhattan_accuracy
+ value: 80.72534637326814
+ - type: manhattan_ap
+ value: 61.18048430787254
+ - type: manhattan_f1
+ value: 54.50978912822061
+ - type: manhattan_precision
+ value: 53.435396790178245
+ - type: manhattan_recall
+ value: 55.6282722513089
+ - type: max_accuracy
+ value: 81.76737687740133
+ - type: max_ap
+ value: 64.59241956109807
+ - type: max_f1
+ value: 57.83203629255339
+ ---