legal-bert-base-uncased Finetuned for Semantic Search

This is a sentence-transformers model finetuned from nlpaueb/legal-bert-base-uncased. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: nlpaueb/legal-bert-base-uncased
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Language: en
  • License: apache-2.0
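
The sequence length and embedding dimensionality above can be checked directly on the loaded model; a quick sanity check (using the repo id from the model tree):

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("IoannisKat1/legal-bert-base-uncased-ft-new")
print(model.get_max_seq_length())                # 512
print(model.get_sentence_embedding_dimension())  # 768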

Model Sources

  • Hugging Face Hub: IoannisKat1/legal-bert-base-uncased-ft-new

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
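
The Pooling block mean-pools the token embeddings (ignoring padding) into a single 768-dimensional vector. A minimal sketch of the equivalent computation with the transformers library, purely to illustrate what pooling_mode_mean_tokens does; the input sentence is made up:

import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("nlpaueb/legal-bert-base-uncased")
bert = AutoModel.from_pretrained("nlpaueb/legal-bert-base-uncased")

encoded = tokenizer(
    ["The data subject shall have the right of access."],
    padding=True, truncation=True, max_length=512, return_tensors="pt",
)
with torch.no_grad():
    token_embeddings = bert(**encoded).last_hidden_state  # (batch, seq, 768)

# Zero out padding positions, then average over the remaining tokens.
mask = encoded["attention_mask"].unsqueeze(-1).float()    # (batch, seq, 1)
sentence_embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_embedding.shape)                           # torch.Size([1, 768])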

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("IoannisKat1/legal-bert-base-uncased-ft-new")
# Run inference
sentences = [
    "In what context does the operation of the democratic system in a Member State require political parties to compile personal data on people's political opinions?",
    "Where in the course of electoral activities, the operation of the democratic system in a Member State requires that political parties compile personal data on people's political opinions, the processing of such data may be permitted for reasons of public interest, provided that appropriate safeguards are established.",
    '1.The data subject shall have the right to obtain from the controller confirmation as to whether or not personal data concerning him or her are being processed, and, where that is the case, access to the personal data and the following information: (a)  the purposes of the processing; (b)  the categories of personal data concerned; (c)  the recipients or categories of recipient to whom the personal data have been or will be disclosed, in particular recipients in third countries or international organisations; (d)  where possible, the envisaged period for which the personal data will be stored, or, if not possible, the criteria used to determine that period; (e)  the existence of the right to request from the controller rectification or erasure of personal data or restriction of processing of personal data concerning the data subject or to object to such processing; (f)  the right to lodge a complaint with a supervisory authority; (g)  where the personal data are not collected from the data subject, any available information as to their source; (h)  the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.\n2.Where personal data are transferred to a third country or to an international organisation, the data subject shall have the right to be informed of the appropriate safeguards pursuant to Article 46 relating to the transfer.\n3.The controller shall provide a copy of the personal data undergoing processing. For any further copies requested by the data subject, the controller may charge a reasonable fee based on administrative costs. Where the data subject makes the request by electronic means, and unless otherwise requested by the data subject, the information shall be provided in a commonly used electronic form.\n4.The right to obtain a copy referred to in paragraph 3 shall not adversely affect the rights and freedoms of others. Section 3 Rectification and erasure',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.8307, 0.1443],
#         [0.8307, 1.0000, 0.1468],
#         [0.1443, 0.1468, 1.0000]])
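
For semantic search over a passage collection, the usual pattern is to encode the corpus once and rank passages per query. A short sketch with util.semantic_search; the query and passages here are illustrative:

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("IoannisKat1/legal-bert-base-uncased-ft-new")

corpus = [
    "The controller shall provide a copy of the personal data undergoing processing.",
    "The data subject shall have the right to withdraw his or her consent at any time.",
]
query = "Can consent be withdrawn?"

corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Rank the corpus by cosine similarity to the query.
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
for hit in hits:
    print(round(hit["score"], 4), corpus[hit["corpus_id"]])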

Evaluation

Metrics

Information Retrieval

Metrics are reported at each Matryoshka embedding dimension: the columns below correspond to embeddings truncated to 768, 512, 256, 128, and 64 dimensions, matching the dim_*_cosine_ndcg@10 columns in the training logs.

Metric               dim_768  dim_512  dim_256  dim_128  dim_64
cosine_accuracy@1    0.4079   0.4201   0.4029   0.4054   0.3833
cosine_accuracy@3    0.4521   0.4595   0.4423   0.4496   0.4251
cosine_accuracy@5    0.4988   0.5037   0.4717   0.4742   0.4521
cosine_accuracy@10   0.5627   0.5725   0.5479   0.5258   0.5135
cosine_precision@1   0.4079   0.4201   0.4029   0.4054   0.3833
cosine_precision@3   0.407    0.4185   0.4013   0.3997   0.3833
cosine_precision@5   0.398    0.4079   0.3877   0.3838   0.373
cosine_precision@10  0.3681   0.3737   0.3587   0.3511   0.343
cosine_recall@1      0.0629   0.0639   0.0619   0.0673   0.0568
cosine_recall@3      0.1767   0.178    0.1719   0.1777   0.1575
cosine_recall@5      0.2581   0.2597   0.2476   0.2456   0.2312
cosine_recall@10     0.397    0.3948   0.3811   0.3598   0.3402
cosine_ndcg@10       0.4802   0.4862   0.4679   0.4621   0.4408
cosine_mrr@10        0.4422   0.4528   0.434    0.4332   0.4121
cosine_map@100       0.5349   0.5415   0.5272   0.5189   0.5014
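
Comparable figures can be produced with the library's InformationRetrievalEvaluator. A minimal sketch; the queries, corpus, and relevance judgments below are placeholders, not the evaluation set behind the numbers above:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("IoannisKat1/legal-bert-base-uncased-ft-new")

# id -> text for queries and corpus; query id -> set of relevant corpus ids.
queries = {"q1": "How should the request for consent be presented?"}
corpus = {"d1": "The request for consent shall be presented in a manner which "
                "is clearly distinguishable from the other matters."}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="demo")
print(evaluator(model))  # accuracy@k, precision@k, recall@k, ndcg@10, mrr@10, map@100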

Training Details

Training Dataset

Unnamed Dataset

  • Size: 1,627 training samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    • anchor: string; min 7, mean 15.21, max 36 tokens
    • positive: string; min 25, mean 369.39, max 512 tokens
  • Samples:
    • anchor: What is the purpose of the publicly accessible electronic register maintained by the authority?
      positive: 1.The Board shall ensure the consistent application of this Regulation. To that end, the Board shall, on its own initiative or, where relevant, at the request of the Commission, in particular: (a) monitor and ensure the correct application of this Regulation in the cases provided for in Articles 64 and 65 without prejudice to the tasks of national supervisory authorities; 4.5.2016 L 119/76 (b) advise the Commission on any issue related to the protection of personal data in the Union, including on any proposed amendment of this Regulation; (c) advise the Commission on the format and procedures for the exchange of information between controllers, processors and supervisory authorities for binding corporate rules; (d) issue guidelines, recommendations, and best practices on procedures for erasing links, copies or replications of personal data from publicly available communication services as referred to in Article 17(2); (e) examine, on its own initiative, on request of one of its ...
    • anchor: How should the request for consent be presented?
      positive: 1.Where processing is based on consent, the controller shall be able to demonstrate that the data subject has consented to processing of his or her personal data. 2.If the data subject's consent is given in the context of a written declaration which also concerns other matters, the request for consent shall be presented in a manner which is clearly distinguishable from the other matters, in an intelligible and easily accessible form, using clear and plain language. Any part of such a declaration which constitutes an infringement of this Regulation shall not be binding. 3.The data subject shall have the right to withdraw his or her consent at any time. The withdrawal of consent shall not affect the lawfulness of processing based on consent before its withdrawal. Prior to giving consent, the data subject shall be informed thereof. It shall be as easy to withdraw as to give consent. 4.When assessing whether consent is freely given, utmost account shall be taken of whether, inter alia, the...
    • anchor: What type of clauses is the European Data Protection Committee encouraged to adopt under point (j)?
      positive: 1.Without prejudice to other tasks set out under this Regulation, each supervisory authority shall on its territory: (a) monitor and enforce the application of this Regulation; (b) promote public awareness and understanding of the risks, rules, safeguards and rights in relation to processing. Activities addressed specifically to children shall receive specific attention; (c) advise, in accordance with Member State law, the national parliament, the government, and other institutions and bodies on legislative and administrative measures relating to the protection of natural persons' rights and freedoms with regard to processing; (d) promote the awareness of controllers and processors of their obligations under this Regulation; (e) upon request, provide information to any data subject concerning the exercise of their rights under this Regulation and, if appropriate, cooperate with the supervisory authorities in other Member States to that end; (f) handle complaints lodged by a data ...
  • Loss: MatryoshkaLoss with these parameters:
    {
        "loss": "MultipleNegativesRankingLoss",
        "matryoshka_dims": [
            768,
            512,
            256,
            128,
            64
        ],
        "matryoshka_weights": [
            1,
            1,
            1,
            1,
            1
        ],
        "n_dims_per_step": -1
    }
    
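Because MatryoshkaLoss trains the leading sub-vectors at 768, 512, 256, 128, and 64 dimensions, embeddings can be truncated to any of those sizes, trading some quality (see the per-dimension metrics above) for smaller indexes. A sketch using the truncate_dim option available in recent sentence-transformers releases:

from sentence_transformers import SentenceTransformer

# encode() will return only the first 256 dimensions of each embedding.
model = SentenceTransformer("IoannisKat1/legal-bert-base-uncased-ft-new",
                            truncate_dim=256)
embeddings = model.encode(["processing of personal data"])
print(embeddings.shape)  # (1, 256)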

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: epoch
  • gradient_accumulation_steps: 2
  • learning_rate: 2e-05
  • num_train_epochs: 10
  • lr_scheduler_type: cosine
  • warmup_ratio: 0.1
  • bf16: True
  • tf32: True
  • load_best_model_at_end: True
  • optim: adamw_torch_fused
  • batch_sampler: no_duplicates
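
A minimal training sketch consistent with the loss configuration and the non-default hyperparameters above; the one-pair dataset stands in for the actual 1,627 anchor/positive samples, and output_dir is a placeholder:

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("nlpaueb/legal-bert-base-uncased")

# Illustrative stand-in for the anchor/positive training pairs.
train_dataset = Dataset.from_dict({
    "anchor": ["How should the request for consent be presented?"],
    "positive": ["The request for consent shall be presented in a manner which "
                 "is clearly distinguishable from the other matters."],
})

loss = MatryoshkaLoss(
    model,
    MultipleNegativesRankingLoss(model),
    matryoshka_dims=[768, 512, 256, 128, 64],
)

args = SentenceTransformerTrainingArguments(
    output_dir="legal-bert-base-uncased-ft",  # placeholder
    num_train_epochs=10,
    per_device_train_batch_size=8,
    gradient_accumulation_steps=2,
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

trainer = SentenceTransformerTrainer(
    model=model, args=args, train_dataset=train_dataset, loss=loss
)
trainer.train()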

All Hyperparameters

Click to expand
  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: epoch
  • prediction_loss_only: True
  • per_device_train_batch_size: 8
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 2
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 10
  • max_steps: -1
  • lr_scheduler_type: cosine
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: True
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • tp_size: 0
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Click to expand
Epoch Step Training Loss dim_768_cosine_ndcg@10 dim_512_cosine_ndcg@10 dim_256_cosine_ndcg@10 dim_128_cosine_ndcg@10 dim_64_cosine_ndcg@10
0.0098 1 16.142 - - - - -
0.0196 2 13.6979 - - - - -
0.0294 3 17.5356 - - - - -
0.0392 4 14.932 - - - - -
0.0490 5 19.2064 - - - - -
0.0588 6 16.981 - - - - -
0.0686 7 17.1026 - - - - -
0.0784 8 15.6905 - - - - -
0.0882 9 18.223 - - - - -
0.0980 10 15.6634 - - - - -
0.1078 11 21.4638 - - - - -
0.1176 12 18.1468 - - - - -
0.1275 13 17.1178 - - - - -
0.1373 14 15.877 - - - - -
0.1471 15 15.1227 - - - - -
0.1569 16 14.0964 - - - - -
0.1667 17 17.4772 - - - - -
0.1765 18 19.1766 - - - - -
0.1863 19 14.6675 - - - - -
0.1961 20 13.2606 - - - - -
0.2059 21 18.2742 - - - - -
0.2157 22 18.0096 - - - - -
0.2255 23 12.3101 - - - - -
0.2353 24 14.0233 - - - - -
0.2451 25 12.9841 - - - - -
0.2549 26 13.7307 - - - - -
0.2647 27 16.5663 - - - - -
0.2745 28 13.9413 - - - - -
0.2843 29 15.4937 - - - - -
0.2941 30 12.2452 - - - - -
0.3039 31 11.8533 - - - - -
0.3137 32 12.9059 - - - - -
0.3235 33 13.6171 - - - - -
0.3333 34 15.7267 - - - - -
0.3431 35 11.0184 - - - - -
0.3529 36 13.0709 - - - - -
0.3627 37 13.0111 - - - - -
0.3725 38 12.1237 - - - - -
0.3824 39 12.1466 - - - - -
0.3922 40 13.1446 - - - - -
0.4020 41 14.607 - - - - -
0.4118 42 13.0565 - - - - -
0.4216 43 11.0872 - - - - -
0.4314 44 13.9679 - - - - -
0.4412 45 11.4318 - - - - -
0.4510 46 13.5983 - - - - -
0.4608 47 13.3464 - - - - -
0.4706 48 10.2001 - - - - -
0.4804 49 12.6368 - - - - -
0.4902 50 14.6145 - - - - -
0.5 51 10.4465 - - - - -
0.5098 52 11.4625 - - - - -
0.5196 53 13.0207 - - - - -
0.5294 54 14.5234 - - - - -
0.5392 55 13.5372 - - - - -
0.5490 56 9.3016 - - - - -
0.5588 57 10.8238 - - - - -
0.5686 58 13.8616 - - - - -
0.5784 59 11.3745 - - - - -
0.5882 60 9.5075 - - - - -
0.5980 61 10.5469 - - - - -
0.6078 62 10.9505 - - - - -
0.6176 63 9.7053 - - - - -
0.6275 64 13.7708 - - - - -
0.6373 65 8.9201 - - - - -
0.6471 66 7.1201 - - - - -
0.6569 67 10.5854 - - - - -
0.6667 68 10.8858 - - - - -
0.6765 69 5.099 - - - - -
0.6863 70 7.2317 - - - - -
0.6961 71 12.0098 - - - - -
0.7059 72 11.1372 - - - - -
0.7157 73 13.2915 - - - - -
0.7255 74 7.666 - - - - -
0.7353 75 8.9583 - - - - -
0.7451 76 14.4897 - - - - -
0.7549 77 5.2462 - - - - -
0.7647 78 8.6225 - - - - -
0.7745 79 11.8394 - - - - -
0.7843 80 13.6184 - - - - -
0.7941 81 8.3545 - - - - -
0.8039 82 10.7644 - - - - -
0.8137 83 11.0633 - - - - -
0.8235 84 12.6646 - - - - -
0.8333 85 9.8459 - - - - -
0.8431 86 9.6683 - - - - -
0.8529 87 8.6824 - - - - -
0.8627 88 11.2598 - - - - -
0.8725 89 7.7519 - - - - -
0.8824 90 6.2339 - - - - -
0.8922 91 12.3876 - - - - -
0.9020 92 9.0571 - - - - -
0.9118 93 11.2924 - - - - -
0.9216 94 9.8345 - - - - -
0.9314 95 11.6696 - - - - -
0.9412 96 4.8431 - - - - -
0.9510 97 6.9415 - - - - -
0.9608 98 6.5089 - - - - -
0.9706 99 7.2339 - - - - -
0.9804 100 6.9042 - - - - -
0.9902 101 7.3693 - - - - -
1.0 102 7.3341 0.3214 0.3273 0.3465 0.3125 0.2972
1.0098 103 2.4598 - - - - -
1.0196 104 6.2536 - - - - -
1.0294 105 3.4257 - - - - -
1.0392 106 3.8891 - - - - -
1.0490 107 4.9762 - - - - -
1.0588 108 7.4206 - - - - -
1.0686 109 7.4156 - - - - -
1.0784 110 7.7509 - - - - -
1.0882 111 7.8635 - - - - -
1.0980 112 5.5645 - - - - -
1.1078 113 4.4094 - - - - -
1.1176 114 7.3678 - - - - -
1.1275 115 5.9872 - - - - -
1.1373 116 10.9788 - - - - -
1.1471 117 10.184 - - - - -
1.1569 118 9.0405 - - - - -
1.1667 119 4.9606 - - - - -
1.1765 120 5.9992 - - - - -
1.1863 121 4.846 - - - - -
1.1961 122 9.0015 - - - - -
1.2059 123 3.7348 - - - - -
1.2157 124 5.6321 - - - - -
1.2255 125 5.4282 - - - - -
1.2353 126 4.7605 - - - - -
1.2451 127 6.4746 - - - - -
1.2549 128 5.2789 - - - - -
1.2647 129 8.5016 - - - - -
1.2745 130 5.6314 - - - - -
1.2843 131 5.0657 - - - - -
1.2941 132 5.4924 - - - - -
1.3039 133 4.5245 - - - - -
1.3137 134 3.4378 - - - - -
1.3235 135 9.0247 - - - - -
1.3333 136 8.8494 - - - - -
1.3431 137 5.8472 - - - - -
1.3529 138 9.1909 - - - - -
1.3627 139 7.2911 - - - - -
1.3725 140 4.8004 - - - - -
1.3824 141 5.212 - - - - -
1.3922 142 6.5621 - - - - -
1.4020 143 4.5448 - - - - -
1.4118 144 6.5068 - - - - -
1.4216 145 3.3916 - - - - -
1.4314 146 4.8277 - - - - -
1.4412 147 7.9691 - - - - -
1.4510 148 5.8588 - - - - -
1.4608 149 4.7892 - - - - -
1.4706 150 6.8099 - - - - -
1.4804 151 8.754 - - - - -
1.4902 152 7.0441 - - - - -
1.5 153 9.6257 - - - - -
1.5098 154 2.521 - - - - -
1.5196 155 5.5603 - - - - -
1.5294 156 1.9541 - - - - -
1.5392 157 5.8526 - - - - -
1.5490 158 7.6458 - - - - -
1.5588 159 6.384 - - - - -
1.5686 160 5.6423 - - - - -
1.5784 161 5.5204 - - - - -
1.5882 162 7.6642 - - - - -
1.5980 163 3.7946 - - - - -
1.6078 164 4.2481 - - - - -
1.6176 165 4.4644 - - - - -
1.6275 166 4.674 - - - - -
1.6373 167 3.3481 - - - - -
1.6471 168 10.6918 - - - - -
1.6569 169 5.1483 - - - - -
1.6667 170 5.0832 - - - - -
1.6765 171 4.7421 - - - - -
1.6863 172 4.2187 - - - - -
1.6961 173 6.6807 - - - - -
1.7059 174 2.2695 - - - - -
1.7157 175 2.53 - - - - -
1.7255 176 6.0218 - - - - -
1.7353 177 8.7176 - - - - -
1.7451 178 6.9877 - - - - -
1.7549 179 7.2235 - - - - -
1.7647 180 2.7433 - - - - -
1.7745 181 4.1857 - - - - -
1.7843 182 8.8419 - - - - -
1.7941 183 4.2731 - - - - -
1.8039 184 5.9068 - - - - -
1.8137 185 3.8098 - - - - -
1.8235 186 3.6064 - - - - -
1.8333 187 7.5581 - - - - -
1.8431 188 5.1475 - - - - -
1.8529 189 0.9157 - - - - -
1.8627 190 5.8079 - - - - -
1.8725 191 3.7 - - - - -
1.8824 192 5.0955 - - - - -
1.8922 193 4.1939 - - - - -
1.9020 194 9.5075 - - - - -
1.9118 195 8.3701 - - - - -
1.9216 196 4.8127 - - - - -
1.9314 197 10.9187 - - - - -
1.9412 198 3.5791 - - - - -
1.9510 199 11.3306 - - - - -
1.9608 200 6.3793 - - - - -
1.9706 201 4.1617 - - - - -
1.9804 202 13.2734 - - - - -
1.9902 203 7.1173 - - - - -
2.0 204 2.266 0.3906 0.3760 0.3709 0.3645 0.3147
2.0098 205 3.1302 - - - - -
2.0196 206 1.3766 - - - - -
2.0294 207 3.3127 - - - - -
2.0392 208 2.5981 - - - - -
2.0490 209 3.4364 - - - - -
2.0588 210 2.6902 - - - - -
2.0686 211 2.2741 - - - - -
2.0784 212 3.3271 - - - - -
2.0882 213 2.7175 - - - - -
2.0980 214 6.3425 - - - - -
2.1078 215 6.2902 - - - - -
2.1176 216 3.145 - - - - -
2.1275 217 2.8716 - - - - -
2.1373 218 2.3294 - - - - -
2.1471 219 2.1109 - - - - -
2.1569 220 0.5804 - - - - -
2.1667 221 3.2296 - - - - -
2.1765 222 2.5376 - - - - -
2.1863 223 2.3902 - - - - -
2.1961 224 3.6877 - - - - -
2.2059 225 5.9844 - - - - -
2.2157 226 1.8207 - - - - -
2.2255 227 4.5828 - - - - -
2.2353 228 2.2631 - - - - -
2.2451 229 2.9204 - - - - -
2.2549 230 2.8955 - - - - -
2.2647 231 2.2151 - - - - -
2.2745 232 2.4886 - - - - -
2.2843 233 4.7441 - - - - -
2.2941 234 2.8377 - - - - -
2.3039 235 5.3201 - - - - -
2.3137 236 1.1322 - - - - -
2.3235 237 3.7221 - - - - -
2.3333 238 2.9845 - - - - -
2.3431 239 0.8198 - - - - -
2.3529 240 3.2936 - - - - -
2.3627 241 1.4042 - - - - -
2.3725 242 4.6838 - - - - -
2.3824 243 6.6713 - - - - -
2.3922 244 4.7596 - - - - -
2.4020 245 5.5073 - - - - -
2.4118 246 5.8471 - - - - -
2.4216 247 2.5645 - - - - -
2.4314 248 4.8164 - - - - -
2.4412 249 3.479 - - - - -
2.4510 250 4.7469 - - - - -
2.4608 251 3.8473 - - - - -
2.4706 252 1.9741 - - - - -
2.4804 253 4.2415 - - - - -
2.4902 254 5.7806 - - - - -
2.5 255 4.1767 - - - - -
2.5098 256 1.4454 - - - - -
2.5196 257 6.4449 - - - - -
2.5294 258 2.5377 - - - - -
2.5392 259 5.2794 - - - - -
2.5490 260 6.4417 - - - - -
2.5588 261 2.2805 - - - - -
2.5686 262 1.5983 - - - - -
2.5784 263 4.0917 - - - - -
2.5882 264 5.2794 - - - - -
2.5980 265 5.2811 - - - - -
2.6078 266 2.5164 - - - - -
2.6176 267 4.1126 - - - - -
2.6275 268 3.4238 - - - - -
2.6373 269 3.0646 - - - - -
2.6471 270 5.9979 - - - - -
2.6569 271 2.6063 - - - - -
2.6667 272 2.7587 - - - - -
2.6765 273 4.4089 - - - - -
2.6863 274 4.606 - - - - -
2.6961 275 2.1391 - - - - -
2.7059 276 2.9717 - - - - -
2.7157 277 2.9827 - - - - -
2.7255 278 4.3785 - - - - -
2.7353 279 1.3594 - - - - -
2.7451 280 3.4222 - - - - -
2.7549 281 4.1147 - - - - -
2.7647 282 1.4041 - - - - -
2.7745 283 3.4664 - - - - -
2.7843 284 4.6955 - - - - -
2.7941 285 7.5589 - - - - -
2.8039 286 2.2981 - - - - -
2.8137 287 1.9319 - - - - -
2.8235 288 3.8581 - - - - -
2.8333 289 1.486 - - - - -
2.8431 290 1.9626 - - - - -
2.8529 291 3.8278 - - - - -
2.8627 292 4.7401 - - - - -
2.8725 293 1.4546 - - - - -
2.8824 294 1.695 - - - - -
2.8922 295 1.5778 - - - - -
2.9020 296 0.7612 - - - - -
2.9118 297 3.9774 - - - - -
2.9216 298 1.601 - - - - -
2.9314 299 1.0608 - - - - -
2.9412 300 0.9563 - - - - -
2.9510 301 3.8269 - - - - -
2.9608 302 2.8383 - - - - -
2.9706 303 0.8521 - - - - -
2.9804 304 3.6911 - - - - -
2.9902 305 2.9831 - - - - -
3.0 306 2.86 0.4383 0.4420 0.4243 0.4179 0.3780
3.0098 307 2.1505 - - - - -
3.0196 308 1.951 - - - - -
3.0294 309 1.0424 - - - - -
3.0392 310 1.0313 - - - - -
3.0490 311 2.3466 - - - - -
3.0588 312 2.8182 - - - - -
3.0686 313 2.7665 - - - - -
3.0784 314 3.3207 - - - - -
3.0882 315 0.5337 - - - - -
3.0980 316 5.9269 - - - - -
3.1078 317 1.4661 - - - - -
3.1176 318 2.2047 - - - - -
3.1275 319 0.4942 - - - - -
3.1373 320 0.3199 - - - - -
3.1471 321 2.6207 - - - - -
3.1569 322 1.66 - - - - -
3.1667 323 1.6875 - - - - -
3.1765 324 0.9947 - - - - -
3.1863 325 2.8068 - - - - -
3.1961 326 0.7081 - - - - -
3.2059 327 2.8691 - - - - -
3.2157 328 2.2257 - - - - -
3.2255 329 2.6505 - - - - -
3.2353 330 0.5006 - - - - -
3.2451 331 0.8226 - - - - -
3.2549 332 0.7512 - - - - -
3.2647 333 3.9588 - - - - -
3.2745 334 6.0112 - - - - -
3.2843 335 4.641 - - - - -
3.2941 336 5.446 - - - - -
3.3039 337 2.1972 - - - - -
3.3137 338 0.7079 - - - - -
3.3235 339 1.2027 - - - - -
3.3333 340 3.0418 - - - - -
3.3431 341 4.8786 - - - - -
3.3529 342 1.3315 - - - - -
3.3627 343 4.3283 - - - - -
3.3725 344 1.1935 - - - - -
3.3824 345 2.4532 - - - - -
3.3922 346 1.9306 - - - - -
3.4020 347 2.1302 - - - - -
3.4118 348 0.8763 - - - - -
3.4216 349 2.3134 - - - - -
3.4314 350 1.1645 - - - - -
3.4412 351 4.0291 - - - - -
3.4510 352 2.413 - - - - -
3.4608 353 2.2474 - - - - -
3.4706 354 0.4758 - - - - -
3.4804 355 3.3463 - - - - -
3.4902 356 2.6824 - - - - -
3.5 357 2.0532 - - - - -
3.5098 358 4.2147 - - - - -
3.5196 359 0.6906 - - - - -
3.5294 360 4.5964 - - - - -
3.5392 361 3.9468 - - - - -
3.5490 362 0.2993 - - - - -
3.5588 363 0.5449 - - - - -
3.5686 364 6.1468 - - - - -
3.5784 365 3.3679 - - - - -
3.5882 366 0.2183 - - - - -
3.5980 367 3.1249 - - - - -
3.6078 368 0.2242 - - - - -
3.6176 369 2.8286 - - - - -
3.6275 370 4.7574 - - - - -
3.6373 371 1.8525 - - - - -
3.6471 372 0.3502 - - - - -
3.6569 373 0.4283 - - - - -
3.6667 374 1.4765 - - - - -
3.6765 375 1.4383 - - - - -
3.6863 376 3.0731 - - - - -
3.6961 377 7.8687 - - - - -
3.7059 378 3.4209 - - - - -
3.7157 379 3.5559 - - - - -
3.7255 380 5.7134 - - - - -
3.7353 381 0.3762 - - - - -
3.7451 382 4.7601 - - - - -
3.7549 383 2.5504 - - - - -
3.7647 384 2.0682 - - - - -
3.7745 385 2.3374 - - - - -
3.7843 386 0.8238 - - - - -
3.7941 387 1.3386 - - - - -
3.8039 388 3.9282 - - - - -
3.8137 389 3.3598 - - - - -
3.8235 390 1.2969 - - - - -
3.8333 391 2.7564 - - - - -
3.8431 392 4.6597 - - - - -
3.8529 393 3.5919 - - - - -
3.8627 394 1.4537 - - - - -
3.8725 395 2.0461 - - - - -
3.8824 396 1.5874 - - - - -
3.8922 397 1.3049 - - - - -
3.9020 398 2.3358 - - - - -
3.9118 399 3.2675 - - - - -
3.9216 400 5.0215 - - - - -
3.9314 401 2.3915 - - - - -
3.9412 402 4.3487 - - - - -
3.9510 403 1.9832 - - - - -
3.9608 404 3.0072 - - - - -
3.9706 405 0.9291 - - - - -
3.9804 406 1.8248 - - - - -
3.9902 407 2.9961 - - - - -
4.0 408 1.1937 0.4626 0.4678 0.4523 0.4299 0.3836
4.0098 409 0.9064 - - - - -
4.0196 410 2.1557 - - - - -
4.0294 411 2.1057 - - - - -
4.0392 412 0.6088 - - - - -
4.0490 413 0.2606 - - - - -
4.0588 414 1.1907 - - - - -
4.0686 415 0.6167 - - - - -
4.0784 416 1.0502 - - - - -
4.0882 417 3.8215 - - - - -
4.0980 418 2.3614 - - - - -
4.1078 419 2.3688 - - - - -
4.1176 420 1.1455 - - - - -
4.1275 421 1.1536 - - - - -
4.1373 422 2.409 - - - - -
4.1471 423 0.3791 - - - - -
4.1569 424 1.0055 - - - - -
4.1667 425 0.5284 - - - - -
4.1765 426 1.223 - - - - -
4.1863 427 0.7022 - - - - -
4.1961 428 2.4283 - - - - -
4.2059 429 1.8243 - - - - -
4.2157 430 0.6676 - - - - -
4.2255 431 1.79 - - - - -
4.2353 432 1.0209 - - - - -
4.2451 433 1.2226 - - - - -
4.2549 434 6.2845 - - - - -
4.2647 435 1.84 - - - - -
4.2745 436 2.1295 - - - - -
4.2843 437 6.0545 - - - - -
4.2941 438 2.5843 - - - - -
4.3039 439 2.7702 - - - - -
4.3137 440 0.3411 - - - - -
4.3235 441 3.9539 - - - - -
4.3333 442 0.8735 - - - - -
4.3431 443 1.1179 - - - - -
4.3529 444 0.8603 - - - - -
4.3627 445 1.0603 - - - - -
4.3725 446 2.5723 - - - - -
4.3824 447 0.7826 - - - - -
4.3922 448 2.1126 - - - - -
4.4020 449 0.8591 - - - - -
4.4118 450 0.5417 - - - - -
4.4216 451 0.3693 - - - - -
4.4314 452 2.6074 - - - - -
4.4412 453 0.5343 - - - - -
4.4510 454 2.1009 - - - - -
4.4608 455 0.1991 - - - - -
4.4706 456 4.0662 - - - - -
4.4804 457 0.2067 - - - - -
4.4902 458 0.1804 - - - - -
4.5 459 0.7831 - - - - -
4.5098 460 0.2989 - - - - -
4.5196 461 3.5961 - - - - -
4.5294 462 1.0125 - - - - -
4.5392 463 6.6978 - - - - -
4.5490 464 0.916 - - - - -
4.5588 465 1.3722 - - - - -
4.5686 466 0.8413 - - - - -
4.5784 467 0.8699 - - - - -
4.5882 468 1.158 - - - - -
4.5980 469 1.9805 - - - - -
4.6078 470 1.5379 - - - - -
4.6176 471 2.1312 - - - - -
4.6275 472 1.2694 - - - - -
4.6373 473 1.1249 - - - - -
4.6471 474 0.2574 - - - - -
4.6569 475 0.298 - - - - -
4.6667 476 0.993 - - - - -
4.6765 477 1.1082 - - - - -
4.6863 478 0.5371 - - - - -
4.6961 479 1.3218 - - - - -
4.7059 480 0.8123 - - - - -
4.7157 481 1.2833 - - - - -
4.7255 482 1.8921 - - - - -
4.7353 483 2.6904 - - - - -
4.7451 484 1.039 - - - - -
4.7549 485 0.6109 - - - - -
4.7647 486 0.1796 - - - - -
4.7745 487 1.8133 - - - - -
4.7843 488 2.3428 - - - - -
4.7941 489 0.5583 - - - - -
4.8039 490 1.7511 - - - - -
4.8137 491 4.4125 - - - - -
4.8235 492 0.7261 - - - - -
4.8333 493 3.266 - - - - -
4.8431 494 1.4778 - - - - -
4.8529 495 0.4726 - - - - -
4.8627 496 2.017 - - - - -
4.8725 497 4.986 - - - - -
4.8824 498 1.3917 - - - - -
4.8922 499 3.241 - - - - -
4.9020 500 0.8305 - - - - -
4.9118 501 2.1252 - - - - -
4.9216 502 0.5237 - - - - -
4.9314 503 3.5636 - - - - -
4.9412 504 1.1303 - - - - -
4.9510 505 0.4488 - - - - -
4.9608 506 1.2867 - - - - -
4.9706 507 2.4835 - - - - -
4.9804 508 1.8102 - - - - -
4.9902 509 0.4084 - - - - -
5.0 510 0.4287 0.4877 0.4856 0.4594 0.4635 0.4196
5.0098 511 1.4083 - - - - -
5.0196 512 0.1373 - - - - -
5.0294 513 0.2411 - - - - -
5.0392 514 0.9886 - - - - -
5.0490 515 1.0 - - - - -
5.0588 516 0.1746 - - - - -
5.0686 517 0.3335 - - - - -
5.0784 518 0.7792 - - - - -
5.0882 519 0.9675 - - - - -
5.0980 520 1.2497 - - - - -
5.1078 521 0.3268 - - - - -
5.1176 522 0.0865 - - - - -
5.1275 523 0.8019 - - - - -
5.1373 524 1.9829 - - - - -
5.1471 525 0.715 - - - - -
5.1569 526 0.8971 - - - - -
5.1667 527 0.8267 - - - - -
5.1765 528 1.4991 - - - - -
5.1863 529 1.8707 - - - - -
5.1961 530 1.0055 - - - - -
5.2059 531 1.6003 - - - - -
5.2157 532 0.3225 - - - - -
5.2255 533 1.1942 - - - - -
5.2353 534 0.0983 - - - - -
5.2451 535 0.2066 - - - - -
5.2549 536 0.473 - - - - -
5.2647 537 0.6633 - - - - -
5.2745 538 0.3503 - - - - -
5.2843 539 2.3147 - - - - -
5.2941 540 1.2655 - - - - -
5.3039 541 0.3521 - - - - -
5.3137 542 1.2058 - - - - -
5.3235 543 3.4029 - - - - -
5.3333 544 0.1811 - - - - -
5.3431 545 2.561 - - - - -
5.3529 546 0.5999 - - - - -
5.3627 547 0.7599 - - - - -
5.3725 548 0.6124 - - - - -
5.3824 549 0.1712 - - - - -
5.3922 550 1.5036 - - - - -
5.4020 551 2.1593 - - - - -
5.4118 552 1.3076 - - - - -
5.4216 553 1.4391 - - - - -
5.4314 554 1.3278 - - - - -
5.4412 555 0.3238 - - - - -
5.4510 556 2.1983 - - - - -
5.4608 557 1.3862 - - - - -
5.4706 558 0.3954 - - - - -
5.4804 559 0.2868 - - - - -
5.4902 560 0.2495 - - - - -
5.5 561 2.0504 - - - - -
5.5098 562 0.2046 - - - - -
5.5196 563 2.2235 - - - - -
5.5294 564 1.3584 - - - - -
5.5392 565 0.2429 - - - - -
5.5490 566 0.3024 - - - - -
5.5588 567 0.5984 - - - - -
5.5686 568 0.3191 - - - - -
5.5784 569 0.4595 - - - - -
5.5882 570 2.1861 - - - - -
5.5980 571 2.0487 - - - - -
5.6078 572 1.063 - - - - -
5.6176 573 0.4859 - - - - -
5.6275 574 0.7916 - - - - -
5.6373 575 1.8277 - - - - -
5.6471 576 1.966 - - - - -
5.6569 577 0.7466 - - - - -
5.6667 578 0.9443 - - - - -
5.6765 579 0.9522 - - - - -
5.6863 580 0.791 - - - - -
5.6961 581 0.9675 - - - - -
5.7059 582 0.253 - - - - -
5.7157 583 1.7913 - - - - -
5.7255 584 0.4794 - - - - -
5.7353 585 0.7508 - - - - -
5.7451 586 1.6652 - - - - -
5.7549 587 0.4571 - - - - -
5.7647 588 1.5655 - - - - -
5.7745 589 0.3066 - - - - -
5.7843 590 1.9775 - - - - -
5.7941 591 0.5368 - - - - -
5.8039 592 3.5144 - - - - -
5.8137 593 0.8008 - - - - -
5.8235 594 0.5214 - - - - -
5.8333 595 1.5262 - - - - -
5.8431 596 0.5599 - - - - -
5.8529 597 2.5003 - - - - -
5.8627 598 2.299 - - - - -
5.8725 599 0.4357 - - - - -
5.8824 600 1.3485 - - - - -
5.8922 601 0.9481 - - - - -
5.9020 602 1.9528 - - - - -
5.9118 603 1.2791 - - - - -
5.9216 604 1.2034 - - - - -
5.9314 605 0.5493 - - - - -
5.9412 606 0.3016 - - - - -
5.9510 607 2.149 - - - - -
5.9608 608 2.4052 - - - - -
5.9706 609 1.0008 - - - - -
5.9804 610 0.6253 - - - - -
5.9902 611 0.3332 - - - - -
6.0 612 0.0307 0.4805 0.4826 0.4801 0.4541 0.4263
6.0098 613 1.0663 - - - - -
6.0196 614 1.3709 - - - - -
6.0294 615 0.4196 - - - - -
6.0392 616 2.2074 - - - - -
6.0490 617 0.8465 - - - - -
6.0588 618 0.0855 - - - - -
6.0686 619 1.0775 - - - - -
6.0784 620 0.4405 - - - - -
6.0882 621 0.1843 - - - - -
6.0980 622 1.2837 - - - - -
6.1078 623 1.9711 - - - - -
6.1176 624 0.8662 - - - - -
6.1275 625 0.3973 - - - - -
6.1373 626 0.5848 - - - - -
6.1471 627 0.6703 - - - - -
6.1569 628 0.7336 - - - - -
6.1667 629 0.4416 - - - - -
6.1765 630 3.4996 - - - - -
6.1863 631 1.5145 - - - - -
6.1961 632 1.1684 - - - - -
6.2059 633 4.2434 - - - - -
6.2157 634 0.4169 - - - - -
6.2255 635 0.4279 - - - - -
6.2353 636 0.2271 - - - - -
6.2451 637 0.1208 - - - - -
6.2549 638 2.5412 - - - - -
6.2647 639 0.8021 - - - - -
6.2745 640 0.4896 - - - - -
6.2843 641 0.5744 - - - - -
6.2941 642 0.1416 - - - - -
6.3039 643 0.5951 - - - - -
6.3137 644 0.6001 - - - - -
6.3235 645 0.3476 - - - - -
6.3333 646 1.5666 - - - - -
6.3431 647 0.4627 - - - - -
6.3529 648 0.4656 - - - - -
6.3627 649 0.5176 - - - - -
6.3725 650 1.2141 - - - - -
6.3824 651 0.1015 - - - - -
6.3922 652 1.0091 - - - - -
6.4020 653 1.0072 - - - - -
6.4118 654 0.4348 - - - - -
6.4216 655 1.4236 - - - - -
6.4314 656 2.7582 - - - - -
6.4412 657 0.1325 - - - - -
6.4510 658 1.1764 - - - - -
6.4608 659 0.1993 - - - - -
6.4706 660 0.7112 - - - - -
6.4804 661 0.2277 - - - - -
6.4902 662 0.6941 - - - - -
6.5 663 0.0961 - - - - -
6.5098 664 0.1357 - - - - -
6.5196 665 0.9405 - - - - -
6.5294 666 2.4092 - - - - -
6.5392 667 0.5314 - - - - -
6.5490 668 1.6019 - - - - -
6.5588 669 0.2396 - - - - -
6.5686 670 0.7991 - - - - -
6.5784 671 0.1627 - - - - -
6.5882 672 1.1582 - - - - -
6.5980 673 0.1338 - - - - -
6.6078 674 0.7125 - - - - -
6.6176 675 2.1821 - - - - -
6.6275 676 1.3282 - - - - -
6.6373 677 0.4241 - - - - -
6.6471 678 0.6067 - - - - -
6.6569 679 0.6277 - - - - -
6.6667 680 0.6314 - - - - -
6.6765 681 0.1953 - - - - -
6.6863 682 0.8784 - - - - -
6.6961 683 1.0776 - - - - -
6.7059 684 0.3236 - - - - -
6.7157 685 0.2603 - - - - -
6.7255 686 0.2072 - - - - -
6.7353 687 0.2286 - - - - -
6.7451 688 0.2785 - - - - -
6.7549 689 1.0233 - - - - -
6.7647 690 1.1958 - - - - -
6.7745 691 0.4351 - - - - -
6.7843 692 0.7968 - - - - -
6.7941 693 1.4431 - - - - -
6.8039 694 2.7822 - - - - -
6.8137 695 0.1462 - - - - -
6.8235 696 0.7891 - - - - -
6.8333 697 2.3079 - - - - -
6.8431 698 0.8345 - - - - -
6.8529 699 0.9415 - - - - -
6.8627 700 1.5442 - - - - -
6.8725 701 0.5693 - - - - -
6.8824 702 0.6353 - - - - -
6.8922 703 0.5235 - - - - -
6.9020 704 1.9037 - - - - -
6.9118 705 0.2953 - - - - -
6.9216 706 0.3391 - - - - -
6.9314 707 0.0905 - - - - -
6.9412 708 2.1041 - - - - -
6.9510 709 1.8204 - - - - -
6.9608 710 1.7676 - - - - -
6.9706 711 1.4628 - - - - -
6.9804 712 0.6696 - - - - -
6.9902 713 1.4171 - - - - -
7.0 714 0.0389 0.4817 0.4788 0.4802 0.4604 0.4326
7.0098 715 0.1509 - - - - -
7.0196 716 0.1037 - - - - -
7.0294 717 0.4192 - - - - -
7.0392 718 0.1334 - - - - -
7.0490 719 0.4343 - - - - -
7.0588 720 3.0699 - - - - -
7.0686 721 0.9501 - - - - -
7.0784 722 0.1778 - - - - -
7.0882 723 0.4164 - - - - -
7.0980 724 0.2963 - - - - -
7.1078 725 1.3947 - - - - -
7.1176 726 0.505 - - - - -
7.1275 727 0.5072 - - - - -
7.1373 728 0.1854 - - - - -
7.1471 729 0.1622 - - - - -
7.1569 730 0.6847 - - - - -
7.1667 731 1.3038 - - - - -
7.1765 732 0.9911 - - - - -
7.1863 733 2.1003 - - - - -
7.1961 734 0.3802 - - - - -
7.2059 735 1.134 - - - - -
7.2157 736 0.3427 - - - - -
7.2255 737 0.8341 - - - - -
7.2353 738 1.1072 - - - - -
7.2451 739 0.3203 - - - - -
7.2549 740 0.3252 - - - - -
7.2647 741 1.0124 - - - - -
7.2745 742 0.1467 - - - - -
7.2843 743 0.1729 - - - - -
7.2941 744 0.167 - - - - -
7.3039 745 0.1038 - - - - -
7.3137 746 2.6918 - - - - -
7.3235 747 0.3386 - - - - -
7.3333 748 0.1204 - - - - -
7.3431 749 1.5078 - - - - -
7.3529 750 0.0813 - - - - -
7.3627 751 1.3693 - - - - -
7.3725 752 0.7416 - - - - -
7.3824 753 0.5061 - - - - -
7.3922 754 1.0782 - - - - -
7.4020 755 0.1061 - - - - -
7.4118 756 0.4726 - - - - -
7.4216 757 0.3794 - - - - -
7.4314 758 1.0286 - - - - -
7.4412 759 0.3516 - - - - -
7.4510 760 0.2665 - - - - -
7.4608 761 0.4261 - - - - -
7.4706 762 0.7815 - - - - -
7.4804 763 0.0516 - - - - -
7.4902 764 0.4681 - - - - -
7.5 765 0.1083 - - - - -
7.5098 766 0.1157 - - - - -
7.5196 767 0.0383 - - - - -
7.5294 768 0.1496 - - - - -
7.5392 769 1.5602 - - - - -
7.5490 770 1.2548 - - - - -
7.5588 771 0.5062 - - - - -
7.5686 772 1.3416 - - - - -
7.5784 773 0.0605 - - - - -
7.5882 774 0.4715 - - - - -
7.5980 775 0.0467 - - - - -
7.6078 776 0.9769 - - - - -
7.6176 777 0.0936 - - - - -
7.6275 778 0.3977 - - - - -
7.6373 779 0.0266 - - - - -
7.6471 780 0.2169 - - - - -
7.6569 781 0.4521 - - - - -
7.6667 782 0.5292 - - - - -
7.6765 783 0.9527 - - - - -
7.6863 784 0.2699 - - - - -
7.6961 785 0.0181 - - - - -
7.7059 786 1.7187 - - - - -
7.7157 787 1.8481 - - - - -
7.7255 788 0.1036 - - - - -
7.7353 789 0.2311 - - - - -
7.7451 790 0.221 - - - - -
7.7549 791 0.1773 - - - - -
7.7647 792 0.2201 - - - - -
7.7745 793 0.3505 - - - - -
7.7843 794 0.6739 - - - - -
7.7941 795 0.3774 - - - - -
7.8039 796 0.609 - - - - -
7.8137 797 0.106 - - - - -
7.8235 798 0.4456 - - - - -
7.8333 799 1.3334 - - - - -
7.8431 800 0.0451 - - - - -
7.8529 801 1.024 - - - - -
7.8627 802 0.1337 - - - - -
7.8725 803 1.0237 - - - - -
7.8824 804 0.7726 - - - - -
7.8922 805 0.5858 - - - - -
7.9020 806 0.0826 - - - - -
7.9118 807 1.5779 - - - - -
7.9216 808 0.5999 - - - - -
7.9314 809 0.1622 - - - - -
7.9412 810 0.8974 - - - - -
7.9510 811 0.1727 - - - - -
7.9608 812 0.5008 - - - - -
7.9706 813 0.1356 - - - - -
7.9804 814 1.4574 - - - - -
7.9902 815 0.7763 - - - - -
8.0 816 0.0359 0.4802 0.4862 0.4679 0.4621 0.4408
  • The bold row denotes the saved checkpoint; here that is the epoch 8.0 row (step 816), whose per-dimension cosine_ndcg@10 values match the Information Retrieval table above.

Framework Versions

  • Python: 3.12.12
  • Sentence Transformers: 5.1.2
  • Transformers: 4.51.3
  • PyTorch: 2.8.0+cu126
  • Accelerate: 1.11.0
  • Datasets: 4.0.0
  • Tokenizers: 0.21.4

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MatryoshkaLoss

@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning},
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}