---
language:
- en
license: apache-2.0
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- dense
- generated_from_trainer
- dataset_size:5822
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
base_model: nomic-ai/modernbert-embed-base
widget:
- source_sentence: >-
    court opined that the Board exercised “substantial independent authority”
    and thus was also a FOIA “agency” under Soucie’s functional test. Id. at
    584–85. This Court’s previous opinion followed Energy Research’s
    analytical steps. As with the Board, Congress made the Commission an
    “establishment in the executive branch,” one of the
  sentences:
  - How does the Court describe the CIA's work?
  - Which test was used to determine that the Board was a FOIA 'agency'?
  - What is the estimated value range of the contract in question?
- source_sentence: >-
    • The Court grants in part and denies in part summary judgment to the CIA
    on Count Three in No. 11-445. The Court denies summary judgment to the CIA
    with respect to (1) the CIA’s withholding of responsive information under
    FOIA Exemption 3 and the CIA Act, 50 U.S.C. § 403g, see supra Part III.H.;
    and (2) the CIA’s withholding of responsive 161
  sentences:
  - Under what condition can the parties file renewed motions?
  - >-
    What legislation is referenced in connection with the CIA's withholding
    of information?
  - What does the Government not dispute regarding § 340.403?
- source_sentence: >-
    for a specific procurement through separate joint ventures with different
    protégés.” Id. The SBA underscored this purpose by highlighting that in
    acquiring a second protégé, the mentor “has already assured SBA that the
    two protégés would not be competitors. If the two mentor-protégé
    relationships were approved in the same [North American Industry
    Classification System] code,
  sentences:
  - What is the title of section D?
  - What does the mentor assure the SBA about the two protégés?
  - Where can specific details about the plaintiff's opposition be found?
- source_sentence: >-
    moving party has shown a privacy interest outweighing the public’s
    interest in open judicial proceedings. Doe, 282 Ill. App. 3d at 1088. The
    standard of review for the trial court’s determination stands, absent an
    abuse of discretion. Northwestern Memorial Hospital, 2014 IL App (1st)
    140212, ¶ 36. ¶ 51
  sentences:
  - >-
    What is mentioned as the standard of review for the trial court’s
    determination?
  - When did the plaintiff file a motion?
  - What does recognizing assignments of FOIA request rights result in?
- source_sentence: >-
    Williams Decl. Exs. D–I, ECF No. 53-1. In Counts Five and Six of No.
    11-445, the plaintiff challenges the DIA’s and the ODNI’s withholding
    determinations, respectively, made under 10 FOIA Exemptions 1, 2, 3, 5,
    and 6. See 445 FAC ¶¶ 38–54; Defs.’ First 445 Mem. at 4–6; Pl.’s First 445
    Opp’n at 6, 17–22, 24.7 B. 2010 FOIA Requests 1.
  sentences:
  - >-
    What did the forum a quo determine it would do after the parties exposed
    their positions?
  - Under which FOIA exemptions are the withholding determinations made?
  - How many remaining claims does the plaintiff have?
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: ModernBERT Embed base Legal Matryoshka
  results:
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 768
      type: dim_768
    metrics:
    - type: cosine_accuracy@1
      value: 0.5440494590417311
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.58887171561051
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.6877897990726429
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.7619783616692427
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.5440494590417311
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.5151983513652756
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.3984544049459042
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.23616692426584238
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.19448737764039153
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.5047655847501289
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.6329211746522411
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.7434312210200927
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.6499814474424818
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.5917923995976541
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.6349937117655203
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 512
      type: dim_512
    metrics:
    - type: cosine_accuracy@1
      value: 0.5316846986089645
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.5826893353941267
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.6893353941267388
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.7619783616692427
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.5316846986089645
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.5100463678516228
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.3993817619783616
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.23817619783616692
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.18663060278207108
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.49613601236476046
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.6312467800103039
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.7480680061823802
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.6470109167633091
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.583873064939525
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.6280912185452766
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 256
      type: dim_256
    metrics:
    - type: cosine_accuracy@1
      value: 0.5069551777434312
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.5486862442040186
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.652241112828439
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.7357032457496137
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.5069551777434312
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.4863472436888202
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.37712519319938176
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.2282843894899536
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.17362184441009787
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.4719216898505925
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.5965996908809892
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.7174137042761463
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.6158619070528558
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.555434115944162
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.6000656985096435
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 128
      type: dim_128
    metrics:
    - type: cosine_accuracy@1
      value: 0.4327666151468315
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.47449768160741884
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.5703245749613601
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.6646058732612056
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.4327666151468315
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.41576506955177744
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.3316846986089645
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.20819165378670787
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.148634724368882
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.3999227202472952
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.5211231324059763
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.6510819165378671
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.5456391631379686
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.48163317877382794
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.5298973764645131
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 64
      type: dim_64
    metrics:
    - type: cosine_accuracy@1
      value: 0.3323029366306028
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.37094281298299847
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.44513137557959814
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.5239567233384853
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.3323029366306028
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.32096857290056674
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.25718701700154567
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.16306027820710975
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.11669242658423493
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.3104070066975786
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.4031427099433281
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.5090159711488923
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.42514271233181616
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.37330168543460646
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.4208075319076454
      name: Cosine Map@100
---
ModernBERT Embed base Legal Matryoshka
This is a sentence-transformers model finetuned from nomic-ai/modernbert-embed-base on the json dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: nomic-ai/modernbert-embed-base
- Maximum Sequence Length: 8192 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
- Training Dataset:
  - json
- Language: en
- License: apache-2.0
Model Sources
Full Model Architecture
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False, 'architecture': 'ModernBertModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```
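The architecture above mean-pools token embeddings over non-padding positions, then L2-normalizes the result. A minimal numpy sketch of those two stages (illustrative only; the function name and dummy shapes are not part of this model's API):

```python
import numpy as np

def mean_pool_and_normalize(token_embeddings, attention_mask):
    """Mean-pool token embeddings over non-padding positions, then L2-normalize,
    mirroring the Pooling(mean) + Normalize modules in the architecture above."""
    mask = attention_mask[..., None].astype(float)      # (batch, seq, 1)
    summed = (token_embeddings * mask).sum(axis=1)      # sum over real tokens
    counts = np.clip(mask.sum(axis=1), 1e-9, None)      # avoid divide-by-zero
    pooled = summed / counts                            # mean pooling
    return pooled / np.linalg.norm(pooled, axis=1, keepdims=True)

# Dummy batch: 2 sequences, 4 token slots, 8-dim token vectors.
rng = np.random.default_rng(2)
tok = rng.normal(size=(2, 4, 8))
mask = np.array([[1, 1, 0, 0], [1, 1, 1, 1]])          # first sequence is padded
out = mean_pool_and_normalize(tok, mask)
```

Padding positions are excluded from the mean, so short and long inputs pool comparably.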
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the Hugging Face Hub
model = SentenceTransformer("ao-ot1231231/modernbert-embed-base-legal-matryoshka-2")
# Run inference
sentences = [
    'Williams Decl. Exs. D–I, ECF No. 53-1. In Counts Five and Six of No. 11-445, the plaintiff \nchallenges the DIA’s and the ODNI’s withholding determinations, respectively, made under \n10 \n \nFOIA Exemptions 1, 2, 3, 5, and 6. See 445 FAC ¶¶ 38–54; Defs.’ First 445 Mem. at 4–6; Pl.’s \nFirst 445 Opp’n at 6, 17–22, 24.7 \nB. \n2010 FOIA Requests \n1.',
    'Under which FOIA exemptions are the withholding determinations made?',
    'What did the forum a quo determine it would do after the parties exposed their positions?',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# Compute pairwise similarity scores
similarities = model.similarity(embeddings, embeddings)
print(similarities)
```
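Because the model was trained with MatryoshkaLoss over dimensions 768/512/256/128/64, embeddings can be truncated to a smaller prefix and re-normalized with only a modest quality drop (compare the per-dimension tables under Evaluation). A minimal sketch of that truncation, using random unit vectors in place of real model output:

```python
import numpy as np

def truncate_and_normalize(embeddings: np.ndarray, dim: int) -> np.ndarray:
    """Keep the first `dim` components of each embedding and re-normalize to
    unit length, as Matryoshka-trained models are designed to support."""
    truncated = embeddings[..., :dim]
    norms = np.linalg.norm(truncated, axis=-1, keepdims=True)
    return truncated / norms

# Random unit vectors stand in for `model.encode(...)` output here.
rng = np.random.default_rng(0)
emb = rng.normal(size=(3, 768))
emb /= np.linalg.norm(emb, axis=-1, keepdims=True)

small = truncate_and_normalize(emb, 256)
print(small.shape)  # (3, 256)
```

Sentence Transformers can also do this at load time via the `truncate_dim` argument to `SentenceTransformer(...)`, which avoids the manual step.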
Evaluation
Metrics
Information Retrieval
Dataset: dim_768

| Metric | Value |
|:-------|------:|
| cosine_accuracy@1 | 0.544 |
| cosine_accuracy@3 | 0.5889 |
| cosine_accuracy@5 | 0.6878 |
| cosine_accuracy@10 | 0.762 |
| cosine_precision@1 | 0.544 |
| cosine_precision@3 | 0.5152 |
| cosine_precision@5 | 0.3985 |
| cosine_precision@10 | 0.2362 |
| cosine_recall@1 | 0.1945 |
| cosine_recall@3 | 0.5048 |
| cosine_recall@5 | 0.6329 |
| cosine_recall@10 | 0.7434 |
| cosine_ndcg@10 | 0.65 |
| cosine_mrr@10 | 0.5918 |
| cosine_map@100 | 0.635 |
Information Retrieval
Dataset: dim_512

| Metric | Value |
|:-------|------:|
| cosine_accuracy@1 | 0.5317 |
| cosine_accuracy@3 | 0.5827 |
| cosine_accuracy@5 | 0.6893 |
| cosine_accuracy@10 | 0.762 |
| cosine_precision@1 | 0.5317 |
| cosine_precision@3 | 0.51 |
| cosine_precision@5 | 0.3994 |
| cosine_precision@10 | 0.2382 |
| cosine_recall@1 | 0.1866 |
| cosine_recall@3 | 0.4961 |
| cosine_recall@5 | 0.6312 |
| cosine_recall@10 | 0.7481 |
| cosine_ndcg@10 | 0.647 |
| cosine_mrr@10 | 0.5839 |
| cosine_map@100 | 0.6281 |
Information Retrieval
Dataset: dim_256

| Metric | Value |
|:-------|------:|
| cosine_accuracy@1 | 0.507 |
| cosine_accuracy@3 | 0.5487 |
| cosine_accuracy@5 | 0.6522 |
| cosine_accuracy@10 | 0.7357 |
| cosine_precision@1 | 0.507 |
| cosine_precision@3 | 0.4863 |
| cosine_precision@5 | 0.3771 |
| cosine_precision@10 | 0.2283 |
| cosine_recall@1 | 0.1736 |
| cosine_recall@3 | 0.4719 |
| cosine_recall@5 | 0.5966 |
| cosine_recall@10 | 0.7174 |
| cosine_ndcg@10 | 0.6159 |
| cosine_mrr@10 | 0.5554 |
| cosine_map@100 | 0.6001 |
Information Retrieval
Dataset: dim_128

| Metric | Value |
|:-------|------:|
| cosine_accuracy@1 | 0.4328 |
| cosine_accuracy@3 | 0.4745 |
| cosine_accuracy@5 | 0.5703 |
| cosine_accuracy@10 | 0.6646 |
| cosine_precision@1 | 0.4328 |
| cosine_precision@3 | 0.4158 |
| cosine_precision@5 | 0.3317 |
| cosine_precision@10 | 0.2082 |
| cosine_recall@1 | 0.1486 |
| cosine_recall@3 | 0.3999 |
| cosine_recall@5 | 0.5211 |
| cosine_recall@10 | 0.6511 |
| cosine_ndcg@10 | 0.5456 |
| cosine_mrr@10 | 0.4816 |
| cosine_map@100 | 0.5299 |
Information Retrieval
Dataset: dim_64

| Metric | Value |
|:-------|------:|
| cosine_accuracy@1 | 0.3323 |
| cosine_accuracy@3 | 0.3709 |
| cosine_accuracy@5 | 0.4451 |
| cosine_accuracy@10 | 0.524 |
| cosine_precision@1 | 0.3323 |
| cosine_precision@3 | 0.321 |
| cosine_precision@5 | 0.2572 |
| cosine_precision@10 | 0.1631 |
| cosine_recall@1 | 0.1167 |
| cosine_recall@3 | 0.3104 |
| cosine_recall@5 | 0.4031 |
| cosine_recall@10 | 0.509 |
| cosine_ndcg@10 | 0.4251 |
| cosine_mrr@10 | 0.3733 |
| cosine_map@100 | 0.4208 |
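NDCG@10, the headline number in the tables above, weights each relevant hit by a logarithmic discount on its rank and normalizes by the ideal ordering. A self-contained sketch for a single query with binary relevance (a simplification: the ideal ordering here is computed from the top-k list only, whereas the standard definition ranks the full relevance set):

```python
import numpy as np

def ndcg_at_k(ranked_relevance, k=10):
    """NDCG@k for one query, given binary relevance labels in rank order."""
    rel = np.asarray(ranked_relevance[:k], dtype=float)
    discounts = 1.0 / np.log2(np.arange(2, rel.size + 2))   # 1/log2(rank+1)
    dcg = float((rel * discounts).sum())
    idcg = float((np.sort(rel)[::-1] * discounts).sum())    # best possible order
    return dcg / idcg if idcg > 0 else 0.0

print(ndcg_at_k([1, 0, 1, 0, 0]))  # relevant documents at ranks 1 and 3
```

Pushing the second relevant document from rank 3 up to rank 2 would raise the score, which is exactly the rank sensitivity accuracy@k lacks.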
Training Details
Training Dataset
json
- Dataset: json
- Size: 5,822 training samples
- Columns: positive and anchor
- Approximate statistics based on the first 1000 samples:

| | positive | anchor |
|:--|:---------|:-------|
| type | string | string |
| details | min: 28 tokens, mean: 97.25 tokens, max: 170 tokens | min: 7 tokens, mean: 16.57 tokens, max: 49 tokens |
- Samples:

| positive | anchor |
|:---------|:-------|
| personnel.” See id. The answer to that question remains unclear, and the Court need not decide 113 it here.52 It suffices to conclude that the names withheld by the CIA are at least arguably protected from disclosure under the interpretation of § 403g announced in Halperin, and thus withholding those names does not rise to the level of “general sloppiness” that would caution | Under which interpretation are the names at least arguably protected from disclosure? |
| last of these motions became ripe on June 11, 2013. Additionally, on November 21, 2012, the plaintiff filed a motion for leave to file a second amended complaint in No. 11-445, and on January 11, 2013, the plaintiff filed a motion for sanctions in No. 11-443. Thus, currently pending before the Court in these related actions are ten motions: eight motions or cross-motions 28 | When did the last of the motions become ripe? |
| the parties to confer, once this report is final, and submit any remaining areas of disagreement on the scope of the inspection to the Court. 33 D.I. 1, Ex. 2. 34 Id. Senetas Corporation, Ltd. v. DeepRadiology Corporation C.A. No. 2019-0170-PWG July 30, 2019 9 accurate financial records; failed to keep the Board reasonably informed about | What is the case number for Senetas Corporation, Ltd. v. DeepRadiology Corporation? |
- Loss: MatryoshkaLoss with these parameters:

```json
{
    "loss": "MultipleNegativesRankingLoss",
    "matryoshka_dims": [768, 512, 256, 128, 64],
    "matryoshka_weights": [1, 1, 1, 1, 1],
    "n_dims_per_step": -1
}
```
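These parameters mean the training objective is an equally weighted sum of MultipleNegativesRankingLoss computed on each embedding prefix (768, 512, 256, 128, 64 dimensions). A rough numpy sketch of that objective, for intuition only; the actual trainable implementation lives in `sentence_transformers.losses`:

```python
import numpy as np

def mnrl(anchors, positives, scale=20.0):
    """MultipleNegativesRankingLoss: cross-entropy over in-batch cosine scores,
    where each anchor's positive is the matching row and all other rows in the
    batch serve as negatives."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    scores = scale * (a @ p.T)                                   # (batch, batch)
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return -float(np.mean(np.diag(log_probs)))                   # diagonal = true pairs

def matryoshka_loss(anchors, positives,
                    dims=(768, 512, 256, 128, 64), weights=(1, 1, 1, 1, 1)):
    """Weighted sum of MNRL over truncated embedding prefixes."""
    return sum(w * mnrl(anchors[:, :d], positives[:, :d])
               for w, d in zip(weights, dims))
```

Because every prefix is penalized at every step, the leading dimensions are forced to carry most of the ranking signal, which is what makes truncated embeddings usable at inference time.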
Training Hyperparameters
Non-Default Hyperparameters
eval_strategy: epoch
per_device_train_batch_size: 32
per_device_eval_batch_size: 16
gradient_accumulation_steps: 16
learning_rate: 2e-05
num_train_epochs: 4
lr_scheduler_type: cosine
warmup_ratio: 0.1
bf16: True
tf32: True
load_best_model_at_end: True
batch_sampler: no_duplicates
All Hyperparameters
overwrite_output_dir: False
do_predict: False
eval_strategy: epoch
prediction_loss_only: True
per_device_train_batch_size: 32
per_device_eval_batch_size: 16
per_gpu_train_batch_size: None
per_gpu_eval_batch_size: None
gradient_accumulation_steps: 16
eval_accumulation_steps: None
torch_empty_cache_steps: None
learning_rate: 2e-05
weight_decay: 0.0
adam_beta1: 0.9
adam_beta2: 0.999
adam_epsilon: 1e-08
max_grad_norm: 1.0
num_train_epochs: 4
max_steps: -1
lr_scheduler_type: cosine
lr_scheduler_kwargs: {}
warmup_ratio: 0.1
warmup_steps: 0
log_level: passive
log_level_replica: warning
log_on_each_node: True
logging_nan_inf_filter: True
save_safetensors: True
save_on_each_node: False
save_only_model: False
restore_callback_states_from_checkpoint: False
no_cuda: False
use_cpu: False
use_mps_device: False
seed: 42
data_seed: None
jit_mode_eval: False
bf16: True
fp16: False
fp16_opt_level: O1
half_precision_backend: auto
bf16_full_eval: False
fp16_full_eval: False
tf32: True
local_rank: 0
ddp_backend: None
tpu_num_cores: None
tpu_metrics_debug: False
debug: []
dataloader_drop_last: False
dataloader_num_workers: 0
dataloader_prefetch_factor: None
past_index: -1
disable_tqdm: False
remove_unused_columns: True
label_names: None
load_best_model_at_end: True
ignore_data_skip: False
fsdp: []
fsdp_min_num_params: 0
fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
fsdp_transformer_layer_cls_to_wrap: None
accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
parallelism_config: None
deepspeed: None
label_smoothing_factor: 0.0
optim: adamw_torch_fused
optim_args: None
adafactor: False
group_by_length: False
length_column_name: length
project: huggingface
trackio_space_id: trackio
ddp_find_unused_parameters: None
ddp_bucket_cap_mb: None
ddp_broadcast_buffers: False
dataloader_pin_memory: True
dataloader_persistent_workers: False
skip_memory_metrics: True
use_legacy_prediction_loop: False
push_to_hub: False
resume_from_checkpoint: None
hub_model_id: None
hub_strategy: every_save
hub_private_repo: None
hub_always_push: False
hub_revision: None
gradient_checkpointing: False
gradient_checkpointing_kwargs: None
include_inputs_for_metrics: False
include_for_metrics: []
eval_do_concat_batches: True
fp16_backend: auto
push_to_hub_model_id: None
push_to_hub_organization: None
mp_parameters:
auto_find_batch_size: False
full_determinism: False
torchdynamo: None
ray_scope: last
ddp_timeout: 1800
torch_compile: False
torch_compile_backend: None
torch_compile_mode: None
include_tokens_per_second: False
include_num_input_tokens_seen: no
neftune_noise_alpha: None
optim_target_modules: None
batch_eval_metrics: False
eval_on_start: False
use_liger_kernel: False
liger_kernel_config: None
eval_use_gather_object: False
average_tokens_across_devices: True
prompts: None
batch_sampler: no_duplicates
multi_dataset_batch_sampler: proportional
router_mapping: {}
learning_rate_mapping: {}
Training Logs
| Epoch | Step | Training Loss | dim_768_cosine_ndcg@10 | dim_512_cosine_ndcg@10 | dim_256_cosine_ndcg@10 | dim_128_cosine_ndcg@10 | dim_64_cosine_ndcg@10 |
|:------:|:----:|:-------------:|:----------------------:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|
| 0.8791 | 10 | 5.7061 | - | - | - | - | - |
| 1.0 | 12 | - | 0.6031 | 0.5863 | 0.5621 | 0.4889 | 0.3463 |
| 1.7033 | 20 | 2.6671 | - | - | - | - | - |
| 2.0 | 24 | - | 0.6410 | 0.6341 | 0.6047 | 0.5248 | 0.4071 |
| 2.5275 | 30 | 2.0092 | - | - | - | - | - |
| 3.0 | 36 | - | 0.6489 | 0.6465 | 0.6154 | 0.5391 | 0.4261 |
| 3.3516 | 40 | 1.6698 | - | - | - | - | - |
| **4.0** | **48** | **-** | **0.65** | **0.647** | **0.6159** | **0.5456** | **0.4251** |
- The bold row denotes the saved checkpoint.
Framework Versions
- Python: 3.10.12
- Sentence Transformers: 5.1.2
- Transformers: 4.57.3
- PyTorch: 2.9.1+cu128
- Accelerate: 1.12.0
- Datasets: 4.4.1
- Tokenizers: 0.22.1
Citation
BibTeX
Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
    title = {Matryoshka Representation Learning},
    author = {Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year = {2024},
    eprint = {2205.13147},
    archivePrefix = {arXiv},
    primaryClass = {cs.LG}
}
```
MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title = {Efficient Natural Language Response Suggestion for Smart Reply},
    author = {Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year = {2017},
    eprint = {1705.00652},
    archivePrefix = {arXiv},
    primaryClass = {cs.CL}
}
```