SentenceTransformer based on intfloat/multilingual-e5-large
This is a sentence-transformers model finetuned from intfloat/multilingual-e5-large on the inhouse_devanagari dataset. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: intfloat/multilingual-e5-large
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 1024 dimensions
- Similarity Function: Cosine Similarity
- Training Dataset: inhouse_devanagari
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False, 'architecture': 'XLMRobertaModel'})
(1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
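The pipeline above is a transformer encoder followed by mean pooling and L2 normalization. A minimal numpy sketch of the pooling and normalization stages, on made-up token embeddings (only the 1024 hidden size is taken from the card):

```python
import numpy as np

def mean_pool_and_normalize(token_embeddings, attention_mask):
    """Mean-pool token embeddings over non-padding positions, then L2-normalize,
    mirroring the Pooling (mean tokens) and Normalize() modules.

    token_embeddings: (seq_len, hidden) array from the transformer module
    attention_mask:   (seq_len,) array of 1s for real tokens, 0s for padding
    """
    mask = attention_mask[:, None].astype(float)     # (seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=0)   # sum over real tokens only
    pooled = summed / mask.sum()                     # mean over real tokens
    return pooled / np.linalg.norm(pooled)           # unit length, as Normalize() does

# Toy example: 4 tokens (last one padding), hidden size 1024
rng = np.random.default_rng(0)
tokens = rng.normal(size=(4, 1024))
mask = np.array([1, 1, 1, 0])
emb = mean_pool_and_normalize(tokens, mask)
print(emb.shape, round(float(np.linalg.norm(emb)), 6))  # (1024,) 1.0
```

This is a sketch of the math, not the library's internals; in practice the library applies these modules for you inside `model.encode`.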
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("vrnP66/my-model-repo")
# Run inference
queries = [
"वातातपाध्व-यानादि-परिहार्येष्व् अ-यन्त्रणम् । प्रयोज्यं सु-कुमाराणाम् ईश्वराणाम् सुखात्मनाम् ॥ ४५ ॥",
]
documents = [
'**Ashtanga Hridayam, Chikitsa Sthana, chapter 13, sutra 45**\n\n**Sutra**:\nवातातपाध्व-यानादि-परिहार्येष्व् अ-यन्त्रणम् । प्रयोज्यं सु-कुमाराणाम् ईश्वराणाम् सुखात्मनाम् ॥ ४५ ॥\n\n**English Transliteration**:\nvātātapādhva-yānādi-parihāryeṣv a-yantraṇam | prayojyaṃ su-kumārāṇām īśvarāṇām sukhātmanām || 45 ||\n\n**English Translation**:\nWithout restrictions regarding avoidance of wind, sun, travel, etc., it can be used by delicate, wealthy, and happy individuals.',
'**Ashtanga Hridayam, Sutra Sthana, chapter 22, sutra 34**\n\n**Sutra**:\nकच-सदन-सित-त्व-पिञ्जर-त्वं परिफुटनं शिरसः समीर-रोगान् । जयति जनयतीन्द्रिय-प्रसादं स्वर-हनु-मूर्द्ध-बलं च मूर्द्ध-तैलम् ॥ ३४ ॥\n\n**English Transliteration**:\nkaca-sadana-sita-tva-piñjara-tvaṃ parisphuṭanaṃ śirasaḥ samīra-rogān । jayati janayatīndriya-prasādaṃ svara-hanu-mūrddha-balaṃ ca mūrddha-tailam ॥ 34 ॥\n\n**English Translation**:\nHair-falling-white-ness-yellowish-ness splitting of head wind-diseases overcomes generates sense-organ-pleasure voice-jaw-head-strength and head-oil.',
'**Ashtanga Hridayam, Sutra Sthana, Sutra Sthana, chapter 6, sutra 129**\n\n**Sutra**:\nगुर्व् आम्रं वात-जित् पक्वं स्वाद्व् अम्लं कफ-शुक्र-कृत् । वृक्षाम्लं ग्राहि रूक्षोष्णं वात-श्लेष्म-हरं लघु ॥ १२९ ॥\n\n**English Transliteration**:\ngurv āmraṃ vāta-jit pakvaṃ svādv amlaṃ kapha-śukra-kṛt । vṛkṣāmlaṃ grāhi rūkṣoṣṇaṃ vāta-śleṣma-haraṃ laghu ॥ 129 ॥\n\n**English Translation**:\nHeavy mango vata-conquering ripe sweet-sour kapha-semen-doing. Garcinia astringent dry-hot vata-phlegm-removing light.',
]
query_embeddings = model.encode_query(queries)
document_embeddings = model.encode_document(documents)
print(query_embeddings.shape, document_embeddings.shape)
# [1, 1024] [3, 1024]
# Get the similarity scores for the embeddings
similarities = model.similarity(query_embeddings, document_embeddings)
print(similarities)
# tensor([[0.7942, 0.0831, 0.0912]])
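Because the model ends with a Normalize() module and uses cosine similarity, `model.similarity` reduces to a matrix of dot products between unit vectors. A small numpy sketch of that computation, using made-up low-dimensional vectors in place of the real 1024-dimensional embeddings:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity matrix between rows of a (n, d) and rows of b (m, d)."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)  # normalize each row
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T                                    # dot products of unit vectors

# Toy stand-ins for query and document embeddings
q = np.array([[1.0, 0.0], [0.0, 1.0]])
d = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, -1.0]])
sims = cosine_similarity(q, d)
print(sims.shape)  # (2, 3): one row per query, one column per document
```

On already-normalized embeddings (as this model produces) the normalization step is a no-op and the similarity is just `q @ d.T`.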
Evaluation
Metrics
Triplet
- Datasets: Embedding_Dataset_Dev and all-nli-test
- Evaluated with TripletEvaluator
| Metric | Embedding_Dataset_Dev | all-nli-test |
|---|---|---|
| cosine_accuracy | 0.9998 | 0.9996 |
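TripletEvaluator's cosine_accuracy is the fraction of triplets in which the query embedding is closer (by cosine similarity) to its positive than to its negative. A small numpy sketch of that metric, on made-up vectors:

```python
import numpy as np

def triplet_cosine_accuracy(anchors, positives, negatives):
    """Fraction of triplets with cos(anchor, positive) > cos(anchor, negative)."""
    def unit(x):
        return x / np.linalg.norm(x, axis=1, keepdims=True)
    a, p, n = unit(anchors), unit(positives), unit(negatives)
    pos_sim = (a * p).sum(axis=1)  # row-wise cosine similarity to positives
    neg_sim = (a * n).sum(axis=1)  # row-wise cosine similarity to negatives
    return float((pos_sim > neg_sim).mean())

# Two toy triplets: the first ranks correctly, the second does not
a = np.array([[1.0, 0.0], [1.0, 0.0]])
p = np.array([[0.9, 0.1], [0.0, 1.0]])
n = np.array([[0.0, 1.0], [0.9, 0.1]])
print(triplet_cosine_accuracy(a, p, n))  # 0.5
```

The scores in the table above were produced by the library's `TripletEvaluator`, which implements the same comparison over the held-out triplets.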
Training Details
Training Dataset
inhouse_devanagari
- Dataset: inhouse_devanagari at 9076844
- Size: 40,374 training samples
- Columns: query, positive_pair, and negative_pair
- Approximate statistics based on the first 1000 samples:

| | query | positive_pair | negative_pair |
|---|---|---|---|
| type | string | string | string |
| min | 11 tokens | 78 tokens | 77 tokens |
| mean | 55.91 tokens | 193.14 tokens | 192.85 tokens |
| max | 512 tokens | 512 tokens | 512 tokens |
- Samples (first three, shown as query / positive_pair / negative_pair):

**Sample 1**
- query:
  नैते सृती पार्थ जानन्योगी मुह्यति कश्चन। तस्मात्सर्वेषु कालेषु योगयुक्तो भवार्जुन।।8.27।।
- positive_pair:
  Shloka: नैते सृती पार्थ जानन्योगी मुह्यति कश्चन। तस्मात्सर्वेषु कालेषु योगयुक्तो भवार्जुन।।8.27।।
  Transliteration: naite sṛtī pārtha jānanyogī muhyati kaścana| tasmātsarveṣu kāleṣu yogayukto bhavārjuna||8.27||
  English Translation by Shri Purohit Swami: O Arjuna! The saint knowing these paths is not confused. Therefore meditate perpetually.
  English Translation Of Sri Shankaracharya's Sanskrit Commentary By Swami Gambirananda: O son of Prtha, na kascana yogi, no yogi whosoever; janan, has known; ete srti, these two courses as described-that one leads to worldly life, and the other to Liberation; muhyati, becomes deluded. Tasmat, therefore; O Arjuna, bhava, be you; yoga-yuktah, steadfast in Yoga; sarvesu kalesu, at all times. Here about the greatness of that yoga:
- negative_pair:
  Shloka: यज्ञार्थात्कर्मणोऽन्यत्र लोकोऽयं कर्मबन्धनः। तदर्थं कर्म कौन्तेय मुक्तसंगः समाचर।।3.9।।
  Transliteration: yajñārthātkarmaṇo'nyatra loko'yaṃ karmabandhanaḥ| tadarthaṃ karma kaunteya muktasaṃgaḥ samācara||3.9||
  English Translation by Shri Purohit Swami: In this world people are fettered by action, unless it is performed as a sacrifice. Therefore, O Arjuna, let thy acts be done without attachment, as sacrifice only.
  English Translation Of Sri Shankaracharya's Sanskrit Commentary By Swami Gambirananda: Ayam, this; lokah, man, the one who is eligible for action; karma-bandhanah, becomes bound by actions- the person who has karma as his bondage (bandhana) is karma-bandhanah-; anyatra, other than; that karmanah, action; yajnarthat, meant for God, not by that meant for God. According to the Vedic text, 'Sacrifice is verily Visnu' (Tai. Sam. 1.7.4), yajnah means God; whatever is done for Him is yajnartham. Therefore, mukta-sangah, without being attached, being free fr...

**Sample 2**
- query:
  Specifically, in the shataponaka type, the physician should create wounds within the tracts. After these have healed, the remaining tracts should be treated.
- positive_pair:
  Susrut Samhita, Chikitsa Sthana, chapter 8, sutra 5
  Sutra: विशेषतस्तु- नाड्यन्तरे व्रणान् कुर्याद्भिषक् तु शतपोनके | ततस्तेषूपरूढेषु शेषा नाडीरुपाचरेत् ||५||
  English Transliteration: viśeṣatastu- nāḍyantare vraṇān kuryādbhīṣak tu śataponake | tatasteṣūparūḍheṣu śeṣā nāḍīrupācaret ||5||
  English Translation: Specifically, in the shataponaka type, the physician should create wounds within the tracts. After these have healed, the remaining tracts should be treated.
- negative_pair:
  Susrut Samhita, Uttara tantra, chapter 39, sutra 306
  Sutra: चूर्णितैस्त्रिफलाश्यामात्रिवृत्पिप्पलिसंयुतैः | सक्षौद्रः शर्करायुक्तो विरेकस्तु प्रशस्यते ||३०६||
  English Transliteration: cūrṇitaistriphalāśyāmātrivṛtpippalisaṃyutaiḥ | sakṣaudraḥ śarkarāyukto virekastu praśasyate ||306||
  English Translation: A purgative (vireka) is recommended when prepared with powdered Triphala, Shyama, Trivrit, and Pippali, mixed with honey and sugar.

**Sample 3**
- query:
  अथ पुण्ये ऽह्नि संपूज्य पूज्यांस् तां प्रविशेच् छुचिः । तत्र संशोधनैः शुद्धः सुखी जात-बलः पुनः ॥ ८ ॥
- positive_pair:
  Ashtanga Hridayam, Uttara Sthana, chapter 39, sutra 8
  Sutra: अथ पुण्ये ऽह्नि संपूज्य पूज्यांस् तां प्रविशेच् छुचिः । तत्र संशोधनैः शुद्धः सुखी जात-बलः पुनः ॥ ८ ॥
  English Transliteration: atha puṇye 'hni saṃpūjya pūjyāṃs tāṃ praviśec chuchiḥ | tatra saṃśodhanaiḥ śuddhaḥ sukhī jāta-balaḥ punaḥ || 8 ||
  English Translation: Then, on an auspicious day, having worshipped the worshipful, the pure one should enter it; there, purified by cleansing therapies, he becomes happy and regains strength.
- negative_pair:
  Ashtanga Hridayam, Uttara Sthana, chapter 40, sutra 82
  Sutra: दीर्घ-जीवितम् आरोग्यं धर्मम् अर्थं सुखं यशः । पाठावबोधानुष्ठानैर् अधिगच्छत्य् अतो ध्रुवम् ॥ ८२ ॥
  English Transliteration: dīrgha-jīvitam ārogyaṁ dharmam arthaṁ sukhaṁ yaśaḥ | pāṭhāvabodhānuṣṭhānair adhigacchaty ato dhruvam || 82 ||
  English Translation: Long life, health, righteousness, wealth, happiness, and fame, one attains surely through reading, understanding, and practicing this.

- Loss: MultipleNegativesRankingLoss with these parameters:
  { "scale": 20.0, "similarity_fct": "cos_sim", "gather_across_devices": false }
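MultipleNegativesRankingLoss scores each query against every candidate in the batch (its own positive, everyone else's positives, and the explicit negative_pair items), scales the cosine similarities by 20, and applies cross-entropy with the true positive as the target. A minimal numpy sketch of that computation, on made-up unit vectors:

```python
import numpy as np

def mnr_loss(query_emb, candidate_emb, scale=20.0):
    """Multiple-negatives ranking loss: candidate i is the positive for query i;
    every other candidate in the batch acts as an in-batch negative.

    query_emb:     (batch, d) unit-normalized query embeddings
    candidate_emb: (num_candidates, d) unit-normalized candidate embeddings
    """
    scores = scale * (query_emb @ candidate_emb.T)   # scaled cosine similarities
    scores -= scores.max(axis=1, keepdims=True)      # numerical stability
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    batch = query_emb.shape[0]
    # Cross-entropy: negative log-probability of the diagonal (true) candidates
    return float(-log_probs[np.arange(batch), np.arange(batch)].mean())

# Toy batch: 2 queries, 3 candidates (positives at indices 0 and 1,
# an extra hard negative at index 2); all unit vectors
q = np.array([[1.0, 0.0], [0.0, 1.0]])
c = np.array([[1.0, 0.0], [0.0, 1.0], [0.6, 0.8]])
print(round(mnr_loss(q, c), 4))  # low loss: each query matches its own positive
```

This mirrors the loss formula ("scale": 20.0, "similarity_fct": "cos_sim") rather than the library's exact implementation; the real loss also handles gathering across devices, which is disabled here.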
Evaluation Dataset
inhouse_devanagari
- Dataset: inhouse_devanagari at 9076844
- Size: 5,047 evaluation samples
- Columns: query, positive_pair, and negative_pair
- Approximate statistics based on the first 1000 samples:

| | query | positive_pair | negative_pair |
|---|---|---|---|
| type | string | string | string |
| min | 10 tokens | 72 tokens | 78 tokens |
| mean | 54.29 tokens | 192.49 tokens | 196.56 tokens |
| max | 512 tokens | 512 tokens | 512 tokens |
- Samples (first three, shown as query / positive_pair / negative_pair):

**Sample 1**
- query:
  Marma-destroyed separately not-said their flesh-etc.-depending-on. Generally with-foreign-body but agitating by action with-pain.
- positive_pair:
  Ashtanga Hridayam, Sutra Sthana, chapter 28, sutra 17
  Sutra: मर्म-नष्टं पृथङ् नोक्तं तेषां मांसादि-संश्रयात् । सामान्येन स-शल्यं तु क्षोभिण्या क्रियया स-रुक् ॥ १७ ॥
  English Transliteration: marma-naṣṭaṃ pṛthaṅ noktaṃ teṣāṃ māṃsādi-saṃśrayāt । sāmānyena sa-śalyaṃ tu kṣobhiṇyā kriyayā sa-ruk ॥ 17 ॥
  English Translation: Marma-destroyed separately not-said their flesh-etc.-depending-on. Generally with-foreign-body but agitating by action with-pain.
- negative_pair:
  Ashtanga Hridayam, Chikitsa Sthana, chapter 6, sutra 34
  Sutra: पञ्च-कोल-शठी-पथ्या-गुड-बीजाह्व-पौष्करम् । वारुणी-कल्कितं भृष्टं यमके लवणान्वितम् ॥ ३४ ॥
  English Transliteration: pañca-kola-śaṭhī-pathyā-guḍa-bījāhva-pauṣkaram । vāruṇī-kalkitaṃ bhṛṣṭaṃ yamake lavaṇānvitam ॥ 34 ॥
  English Translation: Five-kolas, shathi, pathya, jaggery, bija, and pushkara, ground with varuni, fried in clarified butter, and mixed with salt.

**Sample 2**
- query:
  प्राचीनामलकं चैव दोषघ्नं गरहारि च| ऐङ्गुदं तिक्तमधुरं स्निग्धोष्णं कफवातजित्||१४६||
- positive_pair:
  Charak-Samhita, sutra sthana, chapter 27, sutra 146
  Sutra: प्राचीनामलकं चैव दोषघ्नं गरहारि च| ऐङ्गुदं तिक्तमधुरं स्निग्धोष्णं कफवातजित्||१४६||
  English Transliteration: prācīnāmalakaṃ caiva doṣaghnaṃ garahāri ca| aiṅgudaṃ tiktamadhuraṃ snigdhoṣṇaṃ kaphavātajit||146||
  English Translation: Pracinamalaka eliminates the doshas and counteracts poison. Inguda is bitter and sweet, unctuous and hot, and conquers Kapha and Vata.
- negative_pair:
  Charak-Samhita, chikitsa sthana, chapter 15, sutra 65
  Sutra: कट्वजीर्णविदाह्यम्लक्षाराद्यैः पित्तमुल्बणम्| अग्निमाप्लावयद्धन्ति जलं तप्तमिवानलम्||६५||
  English Transliteration: kaṭvajīrṇavidāhyamlākṣārādyaiḥ pittamulbaṇam| agnimāplāvayaddhanti jalaṃ taptamivānalam||65||
  English Translation: Pitta (bile) aggravated by pungent, indigestible, burning, sour, alkaline, and other substances, overwhelms the agni (digestive fire) and destroys it, just as hot water extinguishes a fire.

**Sample 3**
- query:
  Vāta becomes aggravated by excessive consumption of dry food, overeating, exposure to easterly winds, dew, sexual intercourse, suppression of natural urges, exertion, and exercise.
- positive_pair:
  Charak-Samhita, siddhi sthana, chapter 9, sutra 74
  Sutra: रूक्षात्यध्यशनात् पूर्ववातावश्यायमैथुनैः| वेगसन्धारणायासव्यायामैः कुपितोऽनिलः||७४||
  English Transliteration: rūkṣātyadhyaśanāt pūrvavātāvaśyāyamaithunaiḥ| vegasaṃdhāraṇāyāsavyāyāmaiḥ kupito'nilaḥ||74||
  English Translation: Vāta becomes aggravated by excessive consumption of dry food, overeating, exposure to easterly winds, dew, sexual intercourse, suppression of natural urges, exertion, and exercise.
- negative_pair:
  Charak-Samhita, sharira sthana, chapter 4, sutra 4
  Sutra: मातृतः पितृत आत्मतः सात्म्यतो रसतः सत्त्वत इत्येतेभ्यो भावेभ्यः समुदितेभ्यो गर्भः सम्भवति| तस्य ये येऽवयवा यतो यतः सम्भवतः सम्भवन्ति तान् विभज्य मातृजादीनवयवान् पृथक् पृथगुक्तमग्रे||४||
  English Transliteration: mātṛtaḥ pitṛta ātmatas sāmyato rasataḥ sattvata ityetebhyo bhāvebhyaḥ samuditebhyo garbhaḥ sambhavati| tasya ye ye'vayavā yato yataḥ sambhavataḥ sambhavanti tān vibhajya mātṛjādīnavayavān pṛthak pṛthaguktamagre||4||
  English Translation: The embryo originates from the combined factors of the mother, the father, the self, suitability, nutrition, and the mind. The specific components of it that originate from each of these sources will be described separately in the following sections, distinguishing the maternal and other components.

- Loss: MultipleNegativesRankingLoss with these parameters:
  { "scale": 20.0, "similarity_fct": "cos_sim", "gather_across_devices": false }
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- num_train_epochs: 1
- warmup_ratio: 0.1
- fp16: True
- batch_sampler: no_duplicates
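With warmup_ratio 0.1, batch size 16, and one epoch over the 40,374 training samples, the linear scheduler warms the learning rate up over roughly the first 10% of the ~2,524 optimizer steps and then decays it to zero. A simplified sketch of that schedule (the Trainer's exact rounding of warmup steps may differ slightly):

```python
import math

def linear_schedule_with_warmup(step, total_steps, warmup_ratio=0.1, peak_lr=5e-05):
    """Linear warmup to peak_lr, then linear decay to 0 (lr_scheduler_type: linear)."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return peak_lr * step / warmup_steps          # ramp up
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)  # decay

total_steps = math.ceil(40374 / 16)                   # one epoch, batch size 16
print(total_steps)                                    # 2524
print(linear_schedule_with_warmup(252, total_steps))  # 5e-05 (peak, end of warmup)
```

This matches the training logs above, which end near step 2500 at epoch 0.9905.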
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 5e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- bf16: False
- fp16: True
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- parallelism_config: None
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch_fused
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- project: huggingface
- trackio_space_id: trackio
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- hub_revision: None
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: no
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- liger_kernel_config: None
- eval_use_gather_object: False
- average_tokens_across_devices: True
- prompts: None
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: proportional
- router_mapping: {}
- learning_rate_mapping: {}
Training Logs
| Epoch | Step | Training Loss | Validation Loss | Embedding_Dataset_Dev_cosine_accuracy | all-nli-test_cosine_accuracy |
|---|---|---|---|---|---|
| -1 | -1 | - | - | 0.9990 | - |
| 0.0396 | 100 | 0.4702 | 0.0037 | 0.9996 | - |
| 0.0792 | 200 | 0.0087 | 0.0041 | 0.9992 | - |
| 0.1189 | 300 | 0.004 | 0.0041 | 0.9994 | - |
| 0.1585 | 400 | 0.0037 | 0.0038 | 0.9994 | - |
| 0.1981 | 500 | 0.0041 | 0.0037 | 0.9994 | - |
| 0.2377 | 600 | 0.0011 | 0.0025 | 0.9994 | - |
| 0.2773 | 700 | 0.0046 | 0.0027 | 0.9996 | - |
| 0.3170 | 800 | 0.0014 | 0.0024 | 0.9998 | - |
| 0.3566 | 900 | 0.0008 | 0.0025 | 0.9998 | - |
| 0.3962 | 1000 | 0.0044 | 0.0027 | 1.0 | - |
| 0.4358 | 1100 | 0.0015 | 0.0027 | 1.0 | - |
| 0.4754 | 1200 | 0.0033 | 0.0031 | 0.9998 | - |
| 0.5151 | 1300 | 0.0071 | 0.0047 | 0.9996 | - |
| 0.5547 | 1400 | 0.0055 | 0.0027 | 0.9998 | - |
| 0.5943 | 1500 | 0.0025 | 0.0027 | 0.9994 | - |
| 0.6339 | 1600 | 0.003 | 0.0026 | 0.9994 | - |
| 0.6735 | 1700 | 0.0015 | 0.0024 | 0.9994 | - |
| 0.7132 | 1800 | 0.0017 | 0.0032 | 0.9996 | - |
| 0.7528 | 1900 | 0.0041 | 0.0025 | 0.9998 | - |
| 0.7924 | 2000 | 0.0041 | 0.0022 | 0.9998 | - |
| 0.8320 | 2100 | 0.0048 | 0.0022 | 0.9998 | - |
| 0.8716 | 2200 | 0.0011 | 0.0023 | 0.9998 | - |
| 0.9113 | 2300 | 0.0038 | 0.0024 | 0.9996 | - |
| 0.9509 | 2400 | 0.0039 | 0.0022 | 0.9998 | - |
| 0.9905 | 2500 | 0.0052 | 0.0020 | 0.9998 | - |
| -1 | -1 | - | - | - | 0.9996 |
Framework Versions
- Python: 3.12.11
- Sentence Transformers: 5.1.1
- Transformers: 4.57.0
- PyTorch: 2.8.0+cu128
- Accelerate: 1.10.1
- Datasets: 4.2.0
- Tokenizers: 0.22.1
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
MultipleNegativesRankingLoss
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}