---
language:
- en
license: apache-2.0
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- dense
- generated_from_trainer
- dataset_size:5822
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
base_model: nomic-ai/modernbert-embed-base
widget:
- source_sentence: "court opined that the Board exercised “substantial independent\
\ authority” and thus was also a \nFOIA “agency” under Soucie’s functional test.\
\ Id. at 584–85. \nThis Court’s previous opinion followed Energy Research’s analytical\
\ steps. As with the \nBoard, Congress made the Commission an “establishment\
\ in the executive branch,” one of the"
sentences:
- How does the Court describe the CIA's work?
- Which test was used to determine that the Board was a FOIA 'agency'?
- What is the estimated value range of the contract in question?
- source_sentence: "• The Court grants in part and denies in part summary judgment\
\ to the CIA on Count Three \nin No. 11-445. The Court denies summary judgment\
\ to the CIA with respect to (1) the \nCIA’s withholding of responsive information\
\ under FOIA Exemption 3 and the CIA Act, \n50 U.S.C. § 403g, see supra Part III.H.;\
\ and (2) the CIA’s withholding of responsive \n161"
sentences:
- Under what condition can the parties file renewed motions?
- What legislation is referenced in connection with the CIA's withholding of information?
- What does the Government not dispute regarding § 340.403?
- source_sentence: "for a specific procurement through separate joint ventures with\
\ different protégés.” Id. The SBA \nunderscored this purpose by highlighting\
\ that in acquiring a second protégé, the mentor “has \nalready assured SBA that\
\ the two protégés would not be competitors. If the two mentor-protégé \nrelationships\
\ were approved in the same [North American Industry Classification System] code,"
sentences:
- What is the title of section D?
- What does the mentor assure the SBA about the two protégés?
- Where can specific details about the plaintiff's opposition be found?
- source_sentence: "moving party has shown a privacy interest outweighing the public’s\
\ interest in open judicial \nproceedings. Doe, 282 Ill. App. 3d at 1088. The\
\ standard of review for the trial court’s \ndetermination stands, absent an abuse\
\ of discretion. Northwestern Memorial Hospital, 2014 IL \nApp (1st) 140212, ¶\
\ 36. \n¶ 51"
sentences:
- What is mentioned as the standard of review for the trial court’s determination?
- When did the plaintiff file a motion?
- What does recognizing assignments of FOIA request rights result in?
- source_sentence: "Williams Decl. Exs. D–I, ECF No. 53-1. In Counts Five and Six\
\ of No. 11-445, the plaintiff \nchallenges the DIA’s and the ODNI’s withholding\
\ determinations, respectively, made under \n10 \n \nFOIA Exemptions 1, 2, 3,\
\ 5, and 6. See 445 FAC ¶¶ 38–54; Defs.’ First 445 Mem. at 4–6; Pl.’s \nFirst\
\ 445 Opp’n at 6, 17–22, 24.7 \nB. \n2010 FOIA Requests \n1."
sentences:
- What did the forum a quo determine it would do after the parties exposed their
positions?
- Under which FOIA exemptions are the withholding determinations made?
- How many remaining claims does the plaintiff have?
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: ModernBERT Embed base Legal Matryoshka
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 768
type: dim_768
metrics:
- type: cosine_accuracy@1
value: 0.5440494590417311
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.58887171561051
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.6877897990726429
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.7619783616692427
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.5440494590417311
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.5151983513652756
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.3984544049459042
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.23616692426584238
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.19448737764039153
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.5047655847501289
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.6329211746522411
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.7434312210200927
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.6499814474424818
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.5917923995976541
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.6349937117655203
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 512
type: dim_512
metrics:
- type: cosine_accuracy@1
value: 0.5316846986089645
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.5826893353941267
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.6893353941267388
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.7619783616692427
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.5316846986089645
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.5100463678516228
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.3993817619783616
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.23817619783616692
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.18663060278207108
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.49613601236476046
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.6312467800103039
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.7480680061823802
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.6470109167633091
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.583873064939525
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.6280912185452766
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 256
type: dim_256
metrics:
- type: cosine_accuracy@1
value: 0.5069551777434312
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.5486862442040186
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.652241112828439
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.7357032457496137
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.5069551777434312
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.4863472436888202
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.37712519319938176
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.2282843894899536
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.17362184441009787
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.4719216898505925
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.5965996908809892
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.7174137042761463
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.6158619070528558
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.555434115944162
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.6000656985096435
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 128
type: dim_128
metrics:
- type: cosine_accuracy@1
value: 0.4327666151468315
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.47449768160741884
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.5703245749613601
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.6646058732612056
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.4327666151468315
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.41576506955177744
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.3316846986089645
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.20819165378670787
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.148634724368882
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.3999227202472952
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.5211231324059763
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.6510819165378671
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5456391631379686
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.48163317877382794
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.5298973764645131
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 64
type: dim_64
metrics:
- type: cosine_accuracy@1
value: 0.3323029366306028
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.37094281298299847
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.44513137557959814
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.5239567233384853
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.3323029366306028
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.32096857290056674
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.25718701700154567
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.16306027820710975
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.11669242658423493
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.3104070066975786
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.4031427099433281
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.5090159711488923
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.42514271233181616
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.37330168543460646
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.4208075319076454
name: Cosine Map@100
---
# ModernBERT Embed base Legal Matryoshka
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [nomic-ai/modernbert-embed-base](https://huggingface.co/nomic-ai/modernbert-embed-base) on the json dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [nomic-ai/modernbert-embed-base](https://huggingface.co/nomic-ai/modernbert-embed-base)
- **Maximum Sequence Length:** 8192 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- json
- **Language:** en
- **License:** apache-2.0
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/huggingface/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 8192, 'do_lower_case': False, 'architecture': 'ModernBertModel'})
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
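The Pooling and Normalize modules above compute a masked mean over token embeddings and then scale the result to unit length. As an illustration only (not the library's actual implementation), the operation can be sketched in numpy:

```python
import numpy as np

def mean_pool_and_normalize(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Masked mean pooling over tokens, then L2 normalization,
    mirroring the Pooling (mean_tokens) + Normalize modules."""
    mask = attention_mask[..., None].astype(token_embeddings.dtype)  # (batch, seq, 1)
    summed = (token_embeddings * mask).sum(axis=1)                   # (batch, dim)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)                   # avoid division by zero
    pooled = summed / counts
    return pooled / np.linalg.norm(pooled, axis=1, keepdims=True)    # unit-length vectors

# Toy batch: 2 sequences, 3 tokens, 4 dims; the second sequence has one padding token
emb = np.random.rand(2, 3, 4)
mask = np.array([[1, 1, 1], [1, 1, 0]])
out = mean_pool_and_normalize(emb, mask)
print(out.shape)  # (2, 4)
```

Because the outputs are unit-normalized, the dot product of two embeddings equals their cosine similarity.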
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("ao-ot1231231/modernbert-embed-base-legal-matryoshka-2")
# Run inference
sentences = [
'Williams Decl. Exs. D–I, ECF No. 53-1. In Counts Five and Six of No. 11-445, the plaintiff \nchallenges the DIA’s and the ODNI’s withholding determinations, respectively, made under \n10 \n \nFOIA Exemptions 1, 2, 3, 5, and 6. See 445 FAC ¶¶ 38–54; Defs.’ First 445 Mem. at 4–6; Pl.’s \nFirst 445 Opp’n at 6, 17–22, 24.7 \nB. \n2010 FOIA Requests \n1.',
'Under which FOIA exemptions are the withholding determinations made?',
'What did the forum a quo determine it would do after the parties exposed their positions?',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.4481, 0.1215],
# [0.4481, 1.0000, 0.1083],
# [0.1215, 0.1083, 1.0000]])
```
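Because this model was trained with MatryoshkaLoss at dimensions 768, 512, 256, 128, and 64, embeddings can be truncated to a smaller prefix for faster search at a modest quality cost (see the per-dimension metrics below). A minimal sketch of the truncation step itself, assuming you already have full 768-dimensional embeddings; recent sentence-transformers versions also accept a `truncate_dim` argument when constructing the `SentenceTransformer`:

```python
import numpy as np

def truncate_embeddings(embeddings: np.ndarray, dim: int) -> np.ndarray:
    """Keep the first `dim` components and re-normalize so cosine
    similarity remains meaningful on the truncated vectors."""
    truncated = embeddings[:, :dim]
    return truncated / np.linalg.norm(truncated, axis=1, keepdims=True)

# e.g. full 768-dim embeddings from model.encode(...), truncated to 256 dims
full = np.random.rand(3, 768)
small = truncate_embeddings(full, 256)
print(small.shape)  # (3, 256)
```

Use one of the trained dimensions (768, 512, 256, 128, or 64) for best results, since those are the prefixes the loss explicitly optimized.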
## Evaluation
### Metrics
#### Information Retrieval
* Dataset: `dim_768`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
```json
{
"truncate_dim": 768
}
```
| Metric | Value |
|:--------------------|:---------|
| cosine_accuracy@1 | 0.544 |
| cosine_accuracy@3 | 0.5889 |
| cosine_accuracy@5 | 0.6878 |
| cosine_accuracy@10 | 0.762 |
| cosine_precision@1 | 0.544 |
| cosine_precision@3 | 0.5152 |
| cosine_precision@5 | 0.3985 |
| cosine_precision@10 | 0.2362 |
| cosine_recall@1 | 0.1945 |
| cosine_recall@3 | 0.5048 |
| cosine_recall@5 | 0.6329 |
| cosine_recall@10 | 0.7434 |
| **cosine_ndcg@10** | **0.65** |
| cosine_mrr@10 | 0.5918 |
| cosine_map@100 | 0.635 |
#### Information Retrieval
* Dataset: `dim_512`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
```json
{
"truncate_dim": 512
}
```
| Metric | Value |
|:--------------------|:----------|
| cosine_accuracy@1 | 0.5317 |
| cosine_accuracy@3 | 0.5827 |
| cosine_accuracy@5 | 0.6893 |
| cosine_accuracy@10 | 0.762 |
| cosine_precision@1 | 0.5317 |
| cosine_precision@3 | 0.51 |
| cosine_precision@5 | 0.3994 |
| cosine_precision@10 | 0.2382 |
| cosine_recall@1 | 0.1866 |
| cosine_recall@3 | 0.4961 |
| cosine_recall@5 | 0.6312 |
| cosine_recall@10 | 0.7481 |
| **cosine_ndcg@10** | **0.647** |
| cosine_mrr@10 | 0.5839 |
| cosine_map@100 | 0.6281 |
#### Information Retrieval
* Dataset: `dim_256`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
```json
{
"truncate_dim": 256
}
```
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.507 |
| cosine_accuracy@3 | 0.5487 |
| cosine_accuracy@5 | 0.6522 |
| cosine_accuracy@10 | 0.7357 |
| cosine_precision@1 | 0.507 |
| cosine_precision@3 | 0.4863 |
| cosine_precision@5 | 0.3771 |
| cosine_precision@10 | 0.2283 |
| cosine_recall@1 | 0.1736 |
| cosine_recall@3 | 0.4719 |
| cosine_recall@5 | 0.5966 |
| cosine_recall@10 | 0.7174 |
| **cosine_ndcg@10** | **0.6159** |
| cosine_mrr@10 | 0.5554 |
| cosine_map@100 | 0.6001 |
#### Information Retrieval
* Dataset: `dim_128`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
```json
{
"truncate_dim": 128
}
```
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.4328 |
| cosine_accuracy@3 | 0.4745 |
| cosine_accuracy@5 | 0.5703 |
| cosine_accuracy@10 | 0.6646 |
| cosine_precision@1 | 0.4328 |
| cosine_precision@3 | 0.4158 |
| cosine_precision@5 | 0.3317 |
| cosine_precision@10 | 0.2082 |
| cosine_recall@1 | 0.1486 |
| cosine_recall@3 | 0.3999 |
| cosine_recall@5 | 0.5211 |
| cosine_recall@10 | 0.6511 |
| **cosine_ndcg@10** | **0.5456** |
| cosine_mrr@10 | 0.4816 |
| cosine_map@100 | 0.5299 |
#### Information Retrieval
* Dataset: `dim_64`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
```json
{
"truncate_dim": 64
}
```
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.3323 |
| cosine_accuracy@3 | 0.3709 |
| cosine_accuracy@5 | 0.4451 |
| cosine_accuracy@10 | 0.524 |
| cosine_precision@1 | 0.3323 |
| cosine_precision@3 | 0.321 |
| cosine_precision@5 | 0.2572 |
| cosine_precision@10 | 0.1631 |
| cosine_recall@1 | 0.1167 |
| cosine_recall@3 | 0.3104 |
| cosine_recall@5 | 0.4031 |
| cosine_recall@10 | 0.509 |
| **cosine_ndcg@10** | **0.4251** |
| cosine_mrr@10 | 0.3733 |
| cosine_map@100 | 0.4208 |
## Training Details
### Training Dataset
#### json
* Dataset: json
* Size: 5,822 training samples
* Columns: positive and anchor
* Approximate statistics based on the first 1000 samples:
| | positive | anchor |
|:--------|:------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string |
* Samples:
  | positive | anchor |
  |:---------|:-------|
  | <code>personnel.” See id. The answer to that question remains unclear, and the Court need not decide<br>113<br>it here.52 It suffices to conclude that the names withheld by the CIA are at least arguably<br>protected from disclosure under the interpretation of § 403g announced in Halperin, and thus<br>withholding those names does not rise to the level of “general sloppiness” that would caution</code> | <code>Under which interpretation are the names at least arguably protected from disclosure?</code> |
  | <code>last of these motions became ripe on June 11, 2013. Additionally, on November 21, 2012, the<br>plaintiff filed a motion for leave to file a second amended complaint in No. 11-445, and on<br>January 11, 2013, the plaintiff filed a motion for sanctions in No. 11-443. Thus, currently<br>pending before the Court in these related actions are ten motions: eight motions or cross-motions<br>28</code> | <code>When did the last of the motions become ripe?</code> |
  | <code>the parties to confer, once this report is final, and submit any remaining areas of<br>disagreement on the scope of the inspection to the Court.<br>33 D.I. 1, Ex. 2.<br>34 Id.<br>Senetas Corporation, Ltd. v. DeepRadiology Corporation<br>C.A. No. 2019-0170-PWG<br>July 30, 2019<br>9<br>accurate financial records; failed to keep the Board reasonably informed about</code> | <code>What is the case number for Senetas Corporation, Ltd. v. DeepRadiology Corporation?</code> |
* Loss: [MatryoshkaLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
768,
512,
256,
128,
64
],
"matryoshka_weights": [
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
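Conceptually, MatryoshkaLoss applies the inner loss (here MultipleNegativesRankingLoss) to each truncated prefix of the embeddings and sums the results with the given weights. A simplified numpy sketch of that idea, assuming unnormalized anchor/positive embedding matrices (illustrative only; the library operates on torch tensors inside the training loop):

```python
import numpy as np

def mnrl(anchors: np.ndarray, positives: np.ndarray, scale: float = 20.0) -> float:
    """In-batch MultipleNegativesRankingLoss: cross-entropy where each anchor's
    positive is the matching row and every other row serves as a negative."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    scores = scale * (a @ p.T)                                       # (batch, batch) scaled cosine sims
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))

def matryoshka_loss(anchors, positives,
                    dims=(768, 512, 256, 128, 64),
                    weights=(1, 1, 1, 1, 1)) -> float:
    """Weighted sum of the inner loss over truncated embedding prefixes."""
    return sum(w * mnrl(anchors[:, :d], positives[:, :d]) for d, w in zip(dims, weights))

a = np.random.rand(8, 768)
p = a + 0.01 * np.random.rand(8, 768)    # positives close to their anchors
print(matryoshka_loss(a, p) >= 0)        # True: cross-entropy is non-negative
```

Training every prefix jointly is what makes the truncated embeddings usable at inference time.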
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `gradient_accumulation_steps`: 16
- `learning_rate`: 2e-05
- `num_train_epochs`: 4
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `bf16`: True
- `tf32`: True
- `load_best_model_at_end`: True
- `batch_sampler`: no_duplicates
#### All Hyperparameters