all-MiniLM-L6-v2 Fine-tuned on Custom Data

This is a sentence-transformers model finetuned from sentence-transformers/all-MiniLM-L6-v2. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Language: en
  • License: apache-2.0

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
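
Since the module stack above is a standard BERT encoder followed by attention-masked mean pooling and L2 normalization, the same embeddings can also be computed with the transformers library directly. A minimal sketch of that pipeline (the model id is the Hub repository this card belongs to):

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

def mean_pooling(token_embeddings, attention_mask):
    # Average token embeddings, ignoring padding positions
    mask = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

tokenizer = AutoTokenizer.from_pretrained("IoannisKat1/all-MiniLM-L6-v2-ft-new")
model = AutoModel.from_pretrained("IoannisKat1/all-MiniLM-L6-v2-ft-new")

inputs = tokenizer(["An example sentence"], padding=True, truncation=True,
                   max_length=256, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# (0) Transformer -> (1) mean pooling -> (2) L2 normalization, as in the stack above
embeddings = mean_pooling(outputs.last_hidden_state, inputs["attention_mask"])
embeddings = F.normalize(embeddings, p=2, dim=1)
print(embeddings.shape)  # torch.Size([1, 384])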

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("IoannisKat1/all-MiniLM-L6-v2-ft-new")
# Run inference
sentences = [
    "What is one action that may be included under 'mutual assistance' according to the text?",
    '1.Supervisory authorities shall provide each other with relevant information and mutual assistance in order to implement and apply this Regulation in a consistent manner, and shall put in place measures for effective cooperation with one another. Mutual assistance shall cover, in particular, information requests and supervisory measures, such as requests to carry out prior authorisations and consultations, inspections and investigations.\n2.Each supervisory authority shall take all appropriate measures required to reply to a request of another supervisory authority without undue delay and no later than one month after receiving the request. Such measures may include, in particular, the transmission of relevant information on the conduct of an investigation.\n3.Requests for assistance shall contain all the necessary information, including the purpose of and reasons for the request. Information exchanged shall be used only for the purpose for which it was requested.\n4.The requested supervisory authority shall not refuse to comply with the request unless: (a)  it is not competent for the subject-matter of the request or for the measures it is requested to execute; or (b)  compliance with the request would infringe this Regulation or Union or Member State law to which the supervisory authority receiving the request is subject.\n5.The requested supervisory authority shall inform the requesting supervisory authority of the results or, as the case may be, of the progress of the measures taken in order to respond to the request. The requested supervisory authority shall provide reasons for any refusal to comply with a request pursuant to paragraph 4\n6.Requested supervisory authorities shall, as a rule, supply the information requested by other supervisory authorities by electronic means, using a standardised format.\n7.Requested supervisory authorities shall not charge a fee for any action taken by them pursuant to a request for mutual assistance. Supervisory authorities may agree on rules to indemnify each other for specific expenditure arising from the provision of mutual assistance in exceptional circumstances.\n8.Where a supervisory authority does not provide the information referred to in paragraph 5 of this Article within one month of receiving the request of another supervisory authority, the requesting supervisory authority may adopt a provisional measure on the territory of its Member State in accordance with Article 55(1). In that case, the urgent need to act under Article 66(1) shall be presumed to be met and require an urgent binding decision from the Board pursuant to Article 66(2).\n9.The Commission may, by means of implementing acts, specify the format and procedures for mutual assistance referred to in this Article and the arrangements for the exchange of information by electronic means between supervisory authorities, and between supervisory authorities and the Board, in particular the standardised format referred to in paragraph 6 of this Article. Those implementing acts shall be adopted in accordance with the examination procedure referred to in Article 93(2).',
    "Any natural or legal person has the right to bring an action for annulment of decisions of the Board before the Court of Justice under the conditions provided for in Article 263 TFEU. As addressees of such decisions, the supervisory authorities concerned which wish to challenge them have to bring action within two months of being notified of them, in accordance with Article 263 TFEU. Where decisions of the Board are of direct and individual concern to a controller, processor or complainant, the latter may bring an action for annulment against those decisions within two months of their publication on the website of the Board, in accordance with Article 263 TFEU. Without prejudice to this right under Article 263 TFEU, each natural or legal person should have an effective judicial remedy before the competent national court against a decision of a supervisory authority which produces legal effects concerning that person. Such a decision concerns in particular the exercise of investigative, corrective and authorisation powers by the supervisory authority or the dismissal or rejection of complaints. However, the right to an effective judicial remedy does not encompass measures taken by supervisory authorities which are not legally binding, such as opinions issued by or advice provided by the supervisory authority. Proceedings against a supervisory authority should be brought before the courts of the Member State where the supervisory authority is established and should be conducted in accordance with that Member State's procedural law. Those courts should exercise full jurisdiction, which should include jurisdiction to examine all questions of fact and law relevant to the dispute before them. Where a complaint has been rejected or dismissed by a supervisory authority, the complainant may bring proceedings before the courts in the same Member State. In the context of judicial remedies relating to the application of this Regulation, national courts which consider a decision on the question necessary to enable them to give judgment, may, or in the case provided for in Article 267 TFEU, must, request the Court of Justice to give a preliminary ruling on the interpretation of Union law, including this Regulation. Furthermore, where a decision of a supervisory authority implementing a decision of the Board is challenged before a national court and the validity of the decision of the Board is at issue, that national court does not have the power to declare the Board's decision invalid but must refer the question of validity to the Court of Justice in accordance with Article 267 TFEU as interpreted by the Court of Justice, where it considers the decision invalid. However, a national court may not refer a question on the validity of the decision of the Board at the request of a natural or legal person which had the opportunity to bring an action for annulment of that decision, in particular if it was directly and individually concerned by that decision, but had not done so within the period laid down in Article 263 TFEU.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 384)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.4105, 0.0135],
#         [0.4105, 1.0000, 0.2998],
#         [0.0135, 0.2998, 1.0000]])
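
For semantic search over a document collection, the same model can embed a query and a corpus and rank by cosine similarity. A minimal sketch; the query and corpus strings here are illustrative only:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("IoannisKat1/all-MiniLM-L6-v2-ft-new")

corpus = [
    "Supervisory authorities shall provide each other with mutual assistance.",
    "Any natural or legal person may bring an action for annulment before the Court of Justice.",
]
query = "What does mutual assistance between supervisory authorities cover?"

# Embeddings are already L2-normalized, so cosine similarity reduces to a dot product
query_emb = model.encode([query])    # shape (1, 384)
corpus_emb = model.encode(corpus)    # shape (2, 384)
scores = model.similarity(query_emb, corpus_emb)  # tensor of shape [1, 2]
best = int(scores[0].argmax())
print(corpus[best], float(scores[0][best]))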

Evaluation

Metrics

The four tables below report retrieval quality at each Matryoshka embedding size, in descending order: 384, 256, 128, and 64 dimensions. The values correspond to the dim_384 through dim_64 cosine_ndcg@10 columns in the training logs.

Information Retrieval (384 dimensions)

Metric Value
cosine_accuracy@1 0.4447
cosine_accuracy@3 0.4668
cosine_accuracy@5 0.5012
cosine_accuracy@10 0.5528
cosine_precision@1 0.4447
cosine_precision@3 0.421
cosine_precision@5 0.3843
cosine_precision@10 0.327
cosine_recall@1 0.0961
cosine_recall@3 0.2403
cosine_recall@5 0.3143
cosine_recall@10 0.4251
cosine_ndcg@10 0.4912
cosine_mrr@10 0.4675
cosine_map@100 0.5401

Information Retrieval (256 dimensions)

Metric Value
cosine_accuracy@1 0.4349
cosine_accuracy@3 0.457
cosine_accuracy@5 0.4791
cosine_accuracy@10 0.5233
cosine_precision@1 0.4349
cosine_precision@3 0.412
cosine_precision@5 0.3744
cosine_precision@10 0.314
cosine_recall@1 0.0917
cosine_recall@3 0.2336
cosine_recall@5 0.3043
cosine_recall@10 0.4083
cosine_ndcg@10 0.473
cosine_mrr@10 0.4538
cosine_map@100 0.5253

Information Retrieval (128 dimensions)

Metric Value
cosine_accuracy@1 0.4079
cosine_accuracy@3 0.4324
cosine_accuracy@5 0.4668
cosine_accuracy@10 0.5086
cosine_precision@1 0.4079
cosine_precision@3 0.3866
cosine_precision@5 0.3543
cosine_precision@10 0.3012
cosine_recall@1 0.0881
cosine_recall@3 0.2217
cosine_recall@5 0.2895
cosine_recall@10 0.3905
cosine_ndcg@10 0.4526
cosine_mrr@10 0.4293
cosine_map@100 0.5005

Information Retrieval (64 dimensions)

Metric Value
cosine_accuracy@1 0.3538
cosine_accuracy@3 0.3808
cosine_accuracy@5 0.4128
cosine_accuracy@10 0.457
cosine_precision@1 0.3538
cosine_precision@3 0.3407
cosine_precision@5 0.3135
cosine_precision@10 0.27
cosine_recall@1 0.0747
cosine_recall@3 0.192
cosine_recall@5 0.2533
cosine_recall@10 0.3556
cosine_ndcg@10 0.4019
cosine_mrr@10 0.3766
cosine_map@100 0.4618
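
The metric names match the output of sentence-transformers' InformationRetrievalEvaluator run at truncated embedding sizes. A minimal sketch of reproducing such an evaluation, with hypothetical queries, corpus, and relevance judgments, since the evaluation split is not included in this card:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("IoannisKat1/all-MiniLM-L6-v2-ft-new")

# Hypothetical evaluation data: ids mapped to texts, plus relevance judgments
queries = {"q1": "What does mutual assistance cover?"}
corpus = {"d1": "Mutual assistance shall cover information requests and supervisory measures."}
relevant_docs = {"q1": {"d1"}}

for dim in [384, 256, 128, 64]:
    evaluator = InformationRetrievalEvaluator(
        queries=queries,
        corpus=corpus,
        relevant_docs=relevant_docs,
        truncate_dim=dim,          # evaluate on embeddings truncated to `dim`
        name=f"dim_{dim}",
    )
    results = evaluator(model)
    print(dim, results[f"dim_{dim}_cosine_ndcg@10"])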

Training Details

Training Dataset

Unnamed Dataset

  • Size: 1,627 training samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    • anchor: string; min: 7 tokens, mean: 15.35 tokens, max: 34 tokens
    • positive: string; min: 25 tokens, mean: 221.19 tokens, max: 256 tokens
  • Samples:
    Sample 1
    anchor: What was the plaintiff asked to input first on the clone website?
    positive: Court (Civil/Criminal): Civil
    Provisions:
    Time of commission of the act:
    Outcome (not guilty, guilty):
    Reasoning: Partially accepts the lawsuit.
    Facts: The plaintiff, who works as a lawyer, maintains a savings account with the defendant banking corporation under account number GR.............. Pursuant to a contract dated June 11, 2010, established in Thessaloniki between the defendant and the plaintiff, the plaintiff was granted access to the electronic banking system (e-banking) to conduct banking transactions remotely. On October 10, 2020, the plaintiff fell victim to electronic fraud through the "phishing" method, whereby an unknown perpetrator managed to extract and transfer €3,000.00 from the plaintiff’s account to another account of the same bank. Specifically, on that day at 6:51 a.m., the plaintiff received an email from the sender ".........", with the address ..........., informing him that his debit card had been suspended and that online p...

    Sample 2
    anchor: At what time did the defendant send the first message to the plaintiff's mobile phone?
    positive: Court (Civil/Criminal): Civil
    Provisions:
    Time of commission of the act:
    Outcome (not guilty, guilty): Rejects the lawsuit.
    Reasoning:
    Facts: The plaintiff holds account number ....................... at the defendant bank. Following the application/contract dated January 9, 2019, the plaintiff became a subscriber to the alternative service networks provided by the defendant bank through online banking (.........................). In the aforementioned application, the plaintiff stated that her mobile phone (number .................) would be used by the defendant to send additional security codes for the approval of her transactions via .......................... It is noted that the plaintiff received subscriber code .................., thus enabling her to conduct transactions without her physical presence at the branches of the defendant bank, from fixed or mobile devices (computers, smartphones, tablets), by entering her username and password to access her personal acc...

    Sample 3
    anchor: How should the lead supervisory authority and other supervisory authorities exchange information under this Article?
    positive: 1.The lead supervisory authority shall cooperate with the other supervisory authorities concerned in accordance with this Article in an endeavour to reach consensus. The lead supervisory authority and the supervisory authorities concerned shall exchange all relevant information with each other.
    2.The lead supervisory authority may request at any time other supervisory authorities concerned to provide mutual assistance pursuant to Article 61 and may conduct joint operations pursuant to Article 62, in particular for carrying out investigations or for monitoring the implementation of a measure concerning a controller or processor established in another Member State.
    3.The lead supervisory authority shall, without delay, communicate the relevant information on the matter to the other supervisory authorities concerned. It shall without delay submit a draft decision to the other supervisory authorities concerned for their opinion and take due account of their views.
    4.Where any of the other ...
  • Loss: MatryoshkaLoss with these parameters:
    {
        "loss": "MultipleNegativesRankingLoss",
        "matryoshka_dims": [
            384,
            256,
            128,
            64
        ],
        "matryoshka_weights": [
            1,
            1,
            1,
            1
        ],
        "n_dims_per_step": -1
    }
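
Because the loss was applied at 384, 256, 128, and 64 dimensions, embeddings can be truncated to any of these sizes at inference time, trading accuracy for speed and memory as quantified in the evaluation tables above. A minimal sketch using the truncate_dim argument:

from sentence_transformers import SentenceTransformer

# Load the model so that encode() returns 64-dimensional embeddings
model = SentenceTransformer("IoannisKat1/all-MiniLM-L6-v2-ft-new", truncate_dim=64)
embeddings = model.encode(["An example sentence"])
print(embeddings.shape)  # (1, 64)
# Note: truncated vectors are generally no longer unit-length; renormalize them
# if you compute cosine similarity as a plain dot product.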
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: epoch
  • gradient_accumulation_steps: 2
  • learning_rate: 2e-05
  • num_train_epochs: 10
  • lr_scheduler_type: cosine
  • warmup_ratio: 0.1
  • bf16: True
  • tf32: True
  • load_best_model_at_end: True
  • optim: adamw_torch_fused
  • batch_sampler: no_duplicates
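
A minimal training sketch consistent with these hyperparameters and the loss configuration above; the (anchor, positive) rows shown are placeholders, since the 1,627-pair training set is not published with this card:

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# Placeholder (anchor, positive) pairs standing in for the unpublished dataset
train_dataset = Dataset.from_dict({
    "anchor": ["What does mutual assistance cover?"],
    "positive": ["Mutual assistance shall cover information requests and supervisory measures."],
})

inner_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(model, inner_loss, matryoshka_dims=[384, 256, 128, 64])

args = SentenceTransformerTrainingArguments(
    output_dir="all-MiniLM-L6-v2-ft",
    num_train_epochs=10,
    per_device_train_batch_size=8,
    gradient_accumulation_steps=2,
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,
    tf32=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoid duplicate in-batch negatives
)
# The original run additionally set eval_strategy="epoch" and
# load_best_model_at_end=True, which require evaluation data to be supplied.

trainer = SentenceTransformerTrainer(
    model=model, args=args, train_dataset=train_dataset, loss=loss
)
trainer.train()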

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: epoch
  • prediction_loss_only: True
  • per_device_train_batch_size: 8
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 2
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 10
  • max_steps: -1
  • lr_scheduler_type: cosine
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: True
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • tp_size: 0
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss dim_384_cosine_ndcg@10 dim_256_cosine_ndcg@10 dim_128_cosine_ndcg@10 dim_64_cosine_ndcg@10
0.0098 1 9.314 - - - -
0.0196 2 13.4156 - - - -
0.0294 3 6.5901 - - - -
0.0392 4 7.5108 - - - -
0.0490 5 8.4592 - - - -
0.0588 6 15.2418 - - - -
0.0686 7 8.6375 - - - -
0.0784 8 14.5092 - - - -
0.0882 9 10.0459 - - - -
0.0980 10 8.3529 - - - -
0.1078 11 14.6619 - - - -
0.1176 12 5.3518 - - - -
0.1275 13 9.4316 - - - -
0.1373 14 9.1804 - - - -
0.1471 15 9.9546 - - - -
0.1569 16 12.0305 - - - -
0.1667 17 7.8328 - - - -
0.1765 18 9.0889 - - - -
0.1863 19 8.8536 - - - -
0.1961 20 10.5517 - - - -
0.2059 21 11.9371 - - - -
0.2157 22 8.6637 - - - -
0.2255 23 4.3503 - - - -
0.2353 24 7.4951 - - - -
0.2451 25 7.9368 - - - -
0.2549 26 7.8018 - - - -
0.2647 27 14.6783 - - - -
0.2745 28 10.7873 - - - -
0.2843 29 10.0755 - - - -
0.2941 30 6.2568 - - - -
0.3039 31 5.3363 - - - -
0.3137 32 7.8691 - - - -
0.3235 33 11.2728 - - - -
0.3333 34 6.695 - - - -
0.3431 35 12.1315 - - - -
0.3529 36 7.2586 - - - -
0.3627 37 8.7879 - - - -
0.3725 38 7.639 - - - -
0.3824 39 11.3883 - - - -
0.3922 40 8.0624 - - - -
0.4020 41 10.6244 - - - -
0.4118 42 6.7805 - - - -
0.4216 43 10.4424 - - - -
0.4314 44 9.6474 - - - -
0.4412 45 10.9693 - - - -
0.4510 46 7.6602 - - - -
0.4608 47 3.99 - - - -
0.4706 48 5.6034 - - - -
0.4804 49 7.1135 - - - -
0.4902 50 5.6046 - - - -
0.5 51 10.5781 - - - -
0.5098 52 3.7839 - - - -
0.5196 53 5.3129 - - - -
0.5294 54 8.1674 - - - -
0.5392 55 7.3051 - - - -
0.5490 56 5.154 - - - -
0.5588 57 5.4361 - - - -
0.5686 58 6.0079 - - - -
0.5784 59 7.9415 - - - -
0.5882 60 9.1808 - - - -
0.5980 61 8.0456 - - - -
0.6078 62 12.8651 - - - -
0.6176 63 6.3757 - - - -
0.6275 64 7.2359 - - - -
0.6373 65 7.1767 - - - -
0.6471 66 9.7435 - - - -
0.6569 67 8.1696 - - - -
0.6667 68 4.6557 - - - -
0.6765 69 7.3619 - - - -
0.6863 70 6.937 - - - -
0.6961 71 4.8277 - - - -
0.7059 72 4.7581 - - - -
0.7157 73 5.1204 - - - -
0.7255 74 5.8539 - - - -
0.7353 75 6.0988 - - - -
0.7451 76 8.3001 - - - -
0.7549 77 8.4049 - - - -
0.7647 78 6.928 - - - -
0.7745 79 5.3 - - - -
0.7843 80 8.463 - - - -
0.7941 81 5.7306 - - - -
0.8039 82 7.8955 - - - -
0.8137 83 9.6203 - - - -
0.8235 84 8.6489 - - - -
0.8333 85 3.8349 - - - -
0.8431 86 6.7351 - - - -
0.8529 87 9.5078 - - - -
0.8627 88 5.1711 - - - -
0.8725 89 7.4387 - - - -
0.8824 90 4.5647 - - - -
0.8922 91 4.8621 - - - -
0.9020 92 3.7257 - - - -
0.9118 93 4.6232 - - - -
0.9216 94 7.8424 - - - -
0.9314 95 11.4285 - - - -
0.9412 96 8.2627 - - - -
0.9510 97 8.5677 - - - -
0.9608 98 4.8404 - - - -
0.9706 99 5.5854 - - - -
0.9804 100 3.6296 - - - -
0.9902 101 5.6926 - - - -
1.0 102 4.168 0.4201 0.3982 0.3559 0.3200
1.0098 103 7.2629 - - - -
1.0196 104 2.1611 - - - -
1.0294 105 5.4286 - - - -
1.0392 106 6.2532 - - - -
1.0490 107 5.3801 - - - -
1.0588 108 5.9141 - - - -
1.0686 109 6.121 - - - -
1.0784 110 3.5102 - - - -
1.0882 111 2.6456 - - - -
1.0980 112 6.0685 - - - -
1.1078 113 5.1762 - - - -
1.1176 114 3.3374 - - - -
1.1275 115 3.3591 - - - -
1.1373 116 3.4516 - - - -
1.1471 117 9.5469 - - - -
1.1569 118 5.3393 - - - -
1.1667 119 6.5951 - - - -
1.1765 120 2.3823 - - - -
1.1863 121 7.0711 - - - -
1.1961 122 2.1941 - - - -
1.2059 123 4.1914 - - - -
1.2157 124 3.3929 - - - -
1.2255 125 4.0328 - - - -
1.2353 126 6.383 - - - -
1.2451 127 4.9079 - - - -
1.2549 128 4.7432 - - - -
1.2647 129 3.0618 - - - -
1.2745 130 4.468 - - - -
1.2843 131 6.5799 - - - -
1.2941 132 6.2514 - - - -
1.3039 133 4.0947 - - - -
1.3137 134 5.9042 - - - -
1.3235 135 3.1853 - - - -
1.3333 136 4.7338 - - - -
1.3431 137 3.8172 - - - -
1.3529 138 3.6581 - - - -
1.3627 139 4.923 - - - -
1.3725 140 4.5343 - - - -
1.3824 141 3.7658 - - - -
1.3922 142 11.1072 - - - -
1.4020 143 3.7444 - - - -
1.4118 144 8.2919 - - - -
1.4216 145 6.0605 - - - -
1.4314 146 3.6421 - - - -
1.4412 147 4.8018 - - - -
1.4510 148 6.0681 - - - -
1.4608 149 3.7449 - - - -
1.4706 150 5.372 - - - -
1.4804 151 4.1343 - - - -
1.4902 152 3.57 - - - -
1.5 153 2.355 - - - -
1.5098 154 5.0179 - - - -
1.5196 155 4.5454 - - - -
1.5294 156 4.3362 - - - -
1.5392 157 4.7675 - - - -
1.5490 158 3.4136 - - - -
1.5588 159 3.5347 - - - -
1.5686 160 4.7166 - - - -
1.5784 161 5.3206 - - - -
1.5882 162 5.2678 - - - -
1.5980 163 5.0315 - - - -
1.6078 164 7.7094 - - - -
1.6176 165 2.2483 - - - -
1.6275 166 3.6534 - - - -
1.6373 167 4.6594 - - - -
1.6471 168 4.5964 - - - -
1.6569 169 9.873 - - - -
1.6667 170 2.6889 - - - -
1.6765 171 2.6431 - - - -
1.6863 172 3.6038 - - - -
1.6961 173 3.869 - - - -
1.7059 174 2.8357 - - - -
1.7157 175 4.0195 - - - -
1.7255 176 5.8369 - - - -
1.7353 177 1.4096 - - - -
1.7451 178 8.4592 - - - -
1.7549 179 5.934 - - - -
1.7647 180 2.3132 - - - -
1.7745 181 3.7131 - - - -
1.7843 182 6.267 - - - -
1.7941 183 7.9196 - - - -
1.8039 184 7.2907 - - - -
1.8137 185 3.8729 - - - -
1.8235 186 4.265 - - - -
1.8333 187 2.5177 - - - -
1.8431 188 6.9559 - - - -
1.8529 189 3.2725 - - - -
1.8627 190 4.2916 - - - -
1.8725 191 8.5079 - - - -
1.8824 192 4.4141 - - - -
1.8922 193 9.6196 - - - -
1.9020 194 5.3813 - - - -
1.9118 195 4.6901 - - - -
1.9216 196 5.9262 - - - -
1.9314 197 4.8516 - - - -
1.9412 198 3.2107 - - - -
1.9510 199 3.7887 - - - -
1.9608 200 10.6244 - - - -
1.9706 201 5.9944 - - - -
1.9804 202 5.4088 - - - -
1.9902 203 6.0135 - - - -
2.0 204 9.4758 0.4590 0.4304 0.4099 0.3591
2.0098 205 5.3354 - - - -
2.0196 206 5.217 - - - -
2.0294 207 5.9712 - - - -
2.0392 208 6.6628 - - - -
2.0490 209 6.1048 - - - -
2.0588 210 3.5396 - - - -
2.0686 211 1.2557 - - - -
2.0784 212 2.1268 - - - -
2.0882 213 4.1976 - - - -
2.0980 214 2.0743 - - - -
2.1078 215 2.5679 - - - -
2.1176 216 2.1892 - - - -
2.1275 217 5.1347 - - - -
2.1373 218 2.5574 - - - -
2.1471 219 4.6373 - - - -
2.1569 220 3.4945 - - - -
2.1667 221 4.9608 - - - -
2.1765 222 3.2493 - - - -
2.1863 223 3.2011 - - - -
2.1961 224 5.5386 - - - -
2.2059 225 4.166 - - - -
2.2157 226 2.7747 - - - -
2.2255 227 2.4677 - - - -
2.2353 228 1.9435 - - - -
2.2451 229 3.3285 - - - -
2.2549 230 4.082 - - - -
2.2647 231 2.6793 - - - -
2.2745 232 2.7412 - - - -
2.2843 233 5.4343 - - - -
2.2941 234 5.3182 - - - -
2.3039 235 5.4431 - - - -
2.3137 236 3.825 - - - -
2.3235 237 5.9555 - - - -
2.3333 238 4.0367 - - - -
2.3431 239 0.874 - - - -
2.3529 240 3.7652 - - - -
2.3627 241 3.1397 - - - -
2.3725 242 2.2095 - - - -
2.3824 243 3.6528 - - - -
2.3922 244 3.9126 - - - -
2.4020 245 2.2959 - - - -
2.4118 246 2.1179 - - - -
2.4216 247 3.366 - - - -
2.4314 248 3.7169 - - - -
2.4412 249 2.2856 - - - -
2.4510 250 2.9071 - - - -
2.4608 251 2.7711 - - - -
2.4706 252 3.3077 - - - -
2.4804 253 3.5824 - - - -
2.4902 254 2.2269 - - - -
2.5 255 5.7617 - - - -
2.5098 256 1.7614 - - - -
2.5196 257 4.6898 - - - -
2.5294 258 1.9208 - - - -
2.5392 259 3.2963 - - - -
2.5490 260 6.7511 - - - -
2.5588 261 2.9878 - - - -
2.5686 262 1.3882 - - - -
2.5784 263 2.6977 - - - -
2.5882 264 4.0032 - - - -
2.5980 265 3.5722 - - - -
2.6078 266 3.1433 - - - -
2.6176 267 2.2979 - - - -
2.6275 268 4.1767 - - - -
2.6373 269 2.0087 - - - -
2.6471 270 2.1034 - - - -
2.6569 271 3.381 - - - -
2.6667 272 4.51 - - - -
2.6765 273 2.9127 - - - -
2.6863 274 4.2389 - - - -
2.6961 275 3.2239 - - - -
2.7059 276 6.8586 - - - -
2.7157 277 4.9489 - - - -
2.7255 278 7.1983 - - - -
2.7353 279 2.9171 - - - -
2.7451 280 5.4655 - - - -
2.7549 281 3.5527 - - - -
2.7647 282 2.4092 - - - -
2.7745 283 3.1819 - - - -
2.7843 284 3.2839 - - - -
2.7941 285 2.8816 - - - -
2.8039 286 3.4632 - - - -
2.8137 287 6.4261 - - - -
2.8235 288 5.2636 - - - -
2.8333 289 3.7697 - - - -
2.8431 290 2.7751 - - - -
2.8529 291 2.7938 - - - -
2.8627 292 3.2083 - - - -
2.8725 293 3.2028 - - - -
2.8824 294 4.1932 - - - -
2.8922 295 3.3445 - - - -
2.9020 296 3.4374 - - - -
2.9118 297 4.893 - - - -
2.9216 298 5.9708 - - - -
2.9314 299 1.6077 - - - -
2.9412 300 2.5645 - - - -
2.9510 301 1.5729 - - - -
2.9608 302 1.3197 - - - -
2.9706 303 5.934 - - - -
2.9804 304 1.8081 - - - -
2.9902 305 3.7014 - - - -
3.0 306 1.3901 0.4526 0.4435 0.4243 0.3765
3.0098 307 1.2279 - - - -
3.0196 308 3.0015 - - - -
3.0294 309 2.0001 - - - -
3.0392 310 2.3177 - - - -
3.0490 311 2.0772 - - - -
3.0588 312 1.9896 - - - -
3.0686 313 1.5398 - - - -
3.0784 314 2.7205 - - - -
3.0882 315 4.2203 - - - -
3.0980 316 1.7044 - - - -
3.1078 317 0.4191 - - - -
3.1176 318 5.7572 - - - -
3.1275 319 2.9936 - - - -
3.1373 320 1.4732 - - - -
3.1471 321 0.7598 - - - -
3.1569 322 1.8146 - - - -
3.1667 323 2.211 - - - -
3.1765 324 3.1902 - - - -
3.1863 325 2.1343 - - - -
3.1961 326 3.5396 - - - -
3.2059 327 1.5265 - - - -
3.2157 328 2.4405 - - - -
3.2255 329 2.5522 - - - -
3.2353 330 2.2863 - - - -
3.2451 331 3.5981 - - - -
3.2549 332 4.4424 - - - -
3.2647 333 4.5853 - - - -
3.2745 334 3.5618 - - - -
3.2843 335 2.0108 - - - -
3.2941 336 3.1284 - - - -
3.3039 337 1.3107 - - - -
3.3137 338 6.0269 - - - -
3.3235 339 0.4078 - - - -
3.3333 340 1.4205 - - - -
3.3431 341 2.9532 - - - -
3.3529 342 5.1152 - - - -
3.3627 343 1.4727 - - - -
3.3725 344 1.1699 - - - -
3.3824 345 1.1651 - - - -
3.3922 346 2.7933 - - - -
3.4020 347 3.1528 - - - -
3.4118 348 1.5391 - - - -
3.4216 349 3.0983 - - - -
3.4314 350 2.1153 - - - -
3.4412 351 5.2886 - - - -
3.4510 352 3.3398 - - - -
3.4608 353 2.7488 - - - -
3.4706 354 1.1624 - - - -
3.4804 355 1.502 - - - -
3.4902 356 3.0659 - - - -
3.5 357 2.2818 - - - -
3.5098 358 2.4482 - - - -
3.5196 359 2.1802 - - - -
3.5294 360 5.9712 - - - -
3.5392 361 2.0519 - - - -
3.5490 362 4.983 - - - -
3.5588 363 6.4049 - - - -
3.5686 364 2.3832 - - - -
3.5784 365 4.6559 - - - -
3.5882 366 1.836 - - - -
3.5980 367 1.7374 - - - -
3.6078 368 2.701 - - - -
3.6176 369 4.4824 - - - -
3.6275 370 5.4828 - - - -
3.6373 371 2.5578 - - - -
3.6471 372 1.2774 - - - -
3.6569 373 0.9595 - - - -
3.6667 374 3.0434 - - - -
3.6765 375 6.3731 - - - -
3.6863 376 6.8533 - - - -
3.6961 377 2.7222 - - - -
3.7059 378 1.0102 - - - -
3.7157 379 7.4713 - - - -
3.7255 380 0.3445 - - - -
3.7353 381 3.6325 - - - -
3.7451 382 3.025 - - - -
3.7549 383 1.1259 - - - -
3.7647 384 3.0086 - - - -
3.7745 385 3.1928 - - - -
3.7843 386 6.2071 - - - -
3.7941 387 1.7223 - - - -
3.8039 388 4.6953 - - - -
3.8137 389 2.097 - - - -
3.8235 390 1.2704 - - - -
3.8333 391 1.1491 - - - -
3.8431 392 2.2606 - - - -
3.8529 393 1.4562 - - - -
3.8627 394 2.6823 - - - -
3.8725 395 0.7801 - - - -
3.8824 396 3.1714 - - - -
3.8922 397 2.6189 - - - -
3.9020 398 1.168 - - - -
3.9118 399 2.3271 - - - -
3.9216 400 4.8405 - - - -
3.9314 401 3.9107 - - - -
3.9412 402 2.8518 - - - -
3.9510 403 4.1657 - - - -
3.9608 404 2.2372 - - - -
3.9706 405 1.0614 - - - -
3.9804 406 3.8297 - - - -
3.9902 407 1.9999 - - - -
4.0 408 1.3233 0.4710 0.4510 0.4281 0.4148
4.0098 409 1.0468 - - - -
4.0196 410 2.1942 - - - -
4.0294 411 2.0797 - - - -
4.0392 412 2.8829 - - - -
4.0490 413 0.7442 - - - -
4.0588 414 2.4341 - - - -
4.0686 415 3.232 - - - -
4.0784 416 1.4055 - - - -
4.0882 417 0.724 - - - -
4.0980 418 3.1341 - - - -
4.1078 419 1.786 - - - -
4.1176 420 1.4026 - - - -
4.1275 421 4.7106 - - - -
4.1373 422 5.0459 - - - -
4.1471 423 3.257 - - - -
4.1569 424 2.2771 - - - -
4.1667 425 1.6407 - - - -
4.1765 426 1.953 - - - -
4.1863 427 2.7483 - - - -
4.1961 428 2.284 - - - -
4.2059 429 1.3397 - - - -
4.2157 430 3.0712 - - - -
4.2255 431 4.0505 - - - -
4.2353 432 0.5806 - - - -
4.2451 433 0.8523 - - - -
4.2549 434 5.1672 - - - -
4.2647 435 1.0583 - - - -
4.2745 436 3.5743 - - - -
4.2843 437 2.1655 - - - -
4.2941 438 1.017 - - - -
4.3039 439 1.4406 - - - -
4.3137 440 2.8878 - - - -
4.3235 441 2.9334 - - - -
4.3333 442 2.8944 - - - -
4.3431 443 2.9257 - - - -
4.3529 444 1.2912 - - - -
4.3627 445 4.3522 - - - -
4.3725 446 1.7422 - - - -
4.3824 447 3.6445 - - - -
4.3922 448 4.282 - - - -
4.4020 449 3.7589 - - - -
4.4118 450 0.6328 - - - -
4.4216 451 2.7343 - - - -
4.4314 452 2.7081 - - - -
4.4412 453 2.1833 - - - -
4.4510 454 2.4024 - - - -
4.4608 455 1.8104 - - - -
4.4706 456 4.498 - - - -
4.4804 457 3.3512 - - - -
4.4902 458 2.6532 - - - -
4.5 459 1.2743 - - - -
4.5098 460 4.3366 - - - -
4.5196 461 0.9041 - - - -
4.5294 462 1.7273 - - - -
4.5392 463 2.228 - - - -
4.5490 464 1.9863 - - - -
4.5588 465 1.3791 - - - -
4.5686 466 0.6 - - - -
4.5784 467 0.9613 - - - -
4.5882 468 0.8215 - - - -
4.5980 469 2.0666 - - - -
4.6078 470 3.4636 - - - -
4.6176 471 3.3794 - - - -
4.6275 472 1.5988 - - - -
4.6373 473 0.9649 - - - -
4.6471 474 1.7106 - - - -
4.6569 475 2.9365 - - - -
4.6667 476 1.261 - - - -
4.6765 477 2.7977 - - - -
4.6863 478 3.0166 - - - -
4.6961 479 1.871 - - - -
4.7059 480 3.1311 - - - -
4.7157 481 1.4456 - - - -
4.7255 482 1.7153 - - - -
4.7353 483 2.283 - - - -
4.7451 484 3.1011 - - - -
4.7549 485 1.0986 - - - -
4.7647 486 2.2503 - - - -
4.7745 487 0.3953 - - - -
4.7843 488 4.141 - - - -
4.7941 489 3.4797 - - - -
4.8039 490 4.1368 - - - -
4.8137 491 0.565 - - - -
4.8235 492 2.1161 - - - -
4.8333 493 3.2724 - - - -
4.8431 494 1.0777 - - - -
4.8529 495 2.3217 - - - -
4.8627 496 2.4613 - - - -
4.8725 497 2.6271 - - - -
4.8824 498 3.5758 - - - -
4.8922 499 0.7174 - - - -
4.9020 500 1.3738 - - - -
4.9118 501 4.1496 - - - -
4.9216 502 0.4534 - - - -
4.9314 503 2.1279 - - - -
4.9412 504 3.3264 - - - -
4.9510 505 1.4833 - - - -
4.9608 506 2.6725 - - - -
4.9706 507 2.2114 - - - -
4.9804 508 0.6644 - - - -
4.9902 509 3.18 - - - -
5.0 510 0.6032 0.4756 0.4734 0.4388 0.3863
5.0098 511 2.8545 - - - -
5.0196 512 3.287 - - - -
5.0294 513 2.972 - - - -
5.0392 514 3.6356 - - - -
5.0490 515 0.8827 - - - -
5.0588 516 1.8925 - - - -
5.0686 517 5.0565 - - - -
5.0784 518 2.1866 - - - -
5.0882 519 1.2702 - - - -
5.0980 520 2.3826 - - - -
5.1078 521 2.024 - - - -
5.1176 522 1.606 - - - -
5.1275 523 1.766 - - - -
5.1373 524 2.4618 - - - -
5.1471 525 1.1541 - - - -
5.1569 526 0.607 - - - -
5.1667 527 1.1553 - - - -
5.1765 528 1.5372 - - - -
5.1863 529 4.0782 - - - -
5.1961 530 1.7296 - - - -
5.2059 531 2.2709 - - - -
5.2157 532 1.953 - - - -
5.2255 533 3.8818 - - - -
5.2353 534 2.0291 - - - -
5.2451 535 0.9016 - - - -
5.2549 536 0.8501 - - - -
5.2647 537 1.84 - - - -
5.2745 538 1.3482 - - - -
5.2843 539 1.5901 - - - -
5.2941 540 1.2646 - - - -
5.3039 541 0.9188 - - - -
5.3137 542 0.6407 - - - -
5.3235 543 2.075 - - - -
5.3333 544 1.3842 - - - -
5.3431 545 2.6463 - - - -
5.3529 546 2.0425 - - - -
5.3627 547 2.0199 - - - -
5.3725 548 1.3228 - - - -
5.3824 549 1.4143 - - - -
5.3922 550 0.5431 - - - -
5.4020 551 0.9359 - - - -
5.4118 552 2.7431 - - - -
5.4216 553 6.8094 - - - -
5.4314 554 1.3632 - - - -
5.4412 555 3.6833 - - - -
5.4510 556 1.6772 - - - -
5.4608 557 1.7166 - - - -
5.4706 558 4.0858 - - - -
5.4804 559 1.5822 - - - -
5.4902 560 2.3036 - - - -
5.5 561 1.445 - - - -
5.5098 562 2.8063 - - - -
5.5196 563 2.8822 - - - -
5.5294 564 6.7016 - - - -
5.5392 565 1.8359 - - - -
5.5490 566 3.0261 - - - -
5.5588 567 1.4881 - - - -
5.5686 568 1.0264 - - - -
5.5784 569 2.7369 - - - -
5.5882 570 1.596 - - - -
5.5980 571 0.4797 - - - -
5.6078 572 0.8245 - - - -
5.6176 573 1.6987 - - - -
5.6275 574 0.349 - - - -
5.6373 575 0.4556 - - - -
5.6471 576 1.0309 - - - -
5.6569 577 1.1379 - - - -
5.6667 578 1.0166 - - - -
5.6765 579 2.273 - - - -
5.6863 580 3.2326 - - - -
5.6961 581 1.4613 - - - -
5.7059 582 1.7815 - - - -
5.7157 583 1.087 - - - -
5.7255 584 1.216 - - - -
5.7353 585 2.7214 - - - -
5.7451 586 3.2741 - - - -
5.7549 587 0.5625 - - - -
5.7647 588 1.0939 - - - -
5.7745 589 2.3217 - - - -
5.7843 590 1.0632 - - - -
5.7941 591 2.0565 - - - -
5.8039 592 0.9582 - - - -
5.8137 593 1.8169 - - - -
5.8235 594 2.1545 - - - -
5.8333 595 0.8096 - - - -
5.8431 596 3.3732 - - - -
5.8529 597 1.962 - - - -
5.8627 598 1.892 - - - -
5.8725 599 2.0222 - - - -
5.8824 600 1.2117 - - - -
5.8922 601 0.9108 - - - -
5.9020 602 0.6279 - - - -
5.9118 603 1.6545 - - - -
5.9216 604 1.3964 - - - -
5.9314 605 2.3043 - - - -
5.9412 606 1.7315 - - - -
5.9510 607 0.8683 - - - -
5.9608 608 0.551 - - - -
5.9706 609 1.3417 - - - -
5.9804 610 3.2151 - - - -
5.9902 611 1.3203 - - - -
6.0 612 2.0452 0.4682 0.4634 0.4284 0.4043
6.0098 613 2.1156 - - - -
6.0196 614 1.1371 - - - -
6.0294 615 2.8463 - - - -
6.0392 616 1.4252 - - - -
6.0490 617 0.7853 - - - -
6.0588 618 2.1618 - - - -
6.0686 619 1.9695 - - - -
6.0784 620 2.7691 - - - -
6.0882 621 2.6361 - - - -
6.0980 622 2.4046 - - - -
6.1078 623 3.2089 - - - -
6.1176 624 1.9515 - - - -
6.1275 625 3.0264 - - - -
6.1373 626 1.2899 - - - -
6.1471 627 1.0579 - - - -
6.1569 628 0.7195 - - - -
6.1667 629 1.2911 - - - -
6.1765 630 0.8499 - - - -
6.1863 631 2.0346 - - - -
6.1961 632 2.1705 - - - -
6.2059 633 1.1316 - - - -
6.2157 634 1.7937 - - - -
6.2255 635 1.0471 - - - -
6.2353 636 1.3242 - - - -
6.2451 637 2.0447 - - - -
6.2549 638 1.7709 - - - -
6.2647 639 0.6088 - - - -
6.2745 640 0.6764 - - - -
6.2843 641 0.5081 - - - -
6.2941 642 0.8441 - - - -
6.3039 643 2.0049 - - - -
6.3137 644 1.9641 - - - -
6.3235 645 1.3781 - - - -
6.3333 646 3.5393 - - - -
6.3431 647 1.2912 - - - -
6.3529 648 0.7667 - - - -
6.3627 649 1.6777 - - - -
6.3725 650 0.6319 - - - -
6.3824 651 0.4547 - - - -
6.3922 652 1.851 - - - -
6.4020 653 2.3882 - - - -
6.4118 654 1.6366 - - - -
6.4216 655 1.6582 - - - -
6.4314 656 2.6996 - - - -
6.4412 657 0.9238 - - - -
6.4510 658 5.2133 - - - -
6.4608 659 2.4244 - - - -
6.4706 660 0.9866 - - - -
6.4804 661 0.6605 - - - -
6.4902 662 1.1516 - - - -
6.5 663 3.408 - - - -
6.5098 664 0.1852 - - - -
6.5196 665 3.2213 - - - -
6.5294 666 2.4864 - - - -
6.5392 667 1.4941 - - - -
6.5490 668 1.4647 - - - -
6.5588 669 4.6957 - - - -
6.5686 670 2.9545 - - - -
6.5784 671 1.0226 - - - -
6.5882 672 0.7774 - - - -
6.5980 673 0.9871 - - - -
6.6078 674 0.9773 - - - -
6.6176 675 1.4215 - - - -
6.6275 676 5.0171 - - - -
6.6373 677 1.5318 - - - -
6.6471 678 2.668 - - - -
6.6569 679 3.0617 - - - -
6.6667 680 2.7726 - - - -
6.6765 681 0.7753 - - - -
6.6863 682 0.2535 - - - -
6.6961 683 1.5837 - - - -
6.7059 684 1.9242 - - - -
6.7157 685 0.888 - - - -
6.7255 686 3.4582 - - - -
6.7353 687 2.4686 - - - -
6.7451 688 1.1398 - - - -
6.7549 689 1.6506 - - - -
6.7647 690 2.4229 - - - -
6.7745 691 0.6329 - - - -
6.7843 692 0.6814 - - - -
6.7941 693 2.4256 - - - -
6.8039 694 1.0822 - - - -
6.8137 695 4.7177 - - - -
6.8235 696 1.8025 - - - -
6.8333 697 0.5768 - - - -
6.8431 698 1.725 - - - -
6.8529 699 2.2255 - - - -
6.8627 700 2.4724 - - - -
6.8725 701 1.9854 - - - -
6.8824 702 0.9316 - - - -
6.8922 703 1.5515 - - - -
6.9020 704 2.4141 - - - -
6.9118 705 2.7799 - - - -
6.9216 706 1.9804 - - - -
6.9314 707 1.2819 - - - -
6.9412 708 1.869 - - - -
6.9510 709 1.0925 - - - -
6.9608 710 1.6789 - - - -
6.9706 711 1.7431 - - - -
6.9804 712 0.4036 - - - -
6.9902 713 3.427 - - - -
7.0 714 2.2931 0.4822 0.4740 0.4423 0.3975
7.0098 715 2.3939 - - - -
7.0196 716 1.5321 - - - -
7.0294 717 2.577 - - - -
7.0392 718 1.0859 - - - -
7.0490 719 0.7068 - - - -
7.0588 720 3.4612 - - - -
7.0686 721 1.7073 - - - -
7.0784 722 1.2549 - - - -
7.0882 723 1.5404 - - - -
7.0980 724 5.1588 - - - -
7.1078 725 1.751 - - - -
7.1176 726 2.0296 - - - -
7.1275 727 2.5703 - - - -
7.1373 728 0.9409 - - - -
7.1471 729 3.0675 - - - -
7.1569 730 0.8184 - - - -
7.1667 731 1.2241 - - - -
7.1765 732 1.8152 - - - -
7.1863 733 0.6763 - - - -
7.1961 734 2.4549 - - - -
7.2059 735 0.7423 - - - -
7.2157 736 0.7124 - - - -
7.2255 737 0.8499 - - - -
7.2353 738 0.488 - - - -
7.2451 739 0.4591 - - - -
7.2549 740 1.1732 - - - -
7.2647 741 0.5776 - - - -
7.2745 742 1.2033 - - - -
7.2843 743 2.5784 - - - -
7.2941 744 1.6815 - - - -
7.3039 745 4.3149 - - - -
7.3137 746 1.8871 - - - -
7.3235 747 1.4083 - - - -
7.3333 748 0.0862 - - - -
7.3431 749 0.6226 - - - -
7.3529 750 1.831 - - - -
7.3627 751 0.6017 - - - -
7.3725 752 0.9593 - - - -
7.3824 753 2.1721 - - - -
7.3922 754 3.5582 - - - -
7.4020 755 1.2731 - - - -
7.4118 756 0.5528 - - - -
7.4216 757 0.8623 - - - -
7.4314 758 1.036 - - - -
7.4412 759 1.6895 - - - -
7.4510 760 1.5175 - - - -
7.4608 761 1.0343 - - - -
7.4706 762 1.9994 - - - -
7.4804 763 0.9924 - - - -
7.4902 764 0.8559 - - - -
7.5 765 0.2891 - - - -
7.5098 766 2.1621 - - - -
7.5196 767 0.9101 - - - -
7.5294 768 1.977 - - - -
7.5392 769 1.852 - - - -
7.5490 770 1.5613 - - - -
7.5588 771 0.9339 - - - -
7.5686 772 2.5251 - - - -
7.5784 773 2.1131 - - - -
7.5882 774 1.2763 - - - -
7.5980 775 1.5074 - - - -
7.6078 776 0.5789 - - - -
7.6176 777 1.418 - - - -
7.6275 778 0.7083 - - - -
7.6373 779 1.1283 - - - -
7.6471 780 0.9997 - - - -
7.6569 781 3.2343 - - - -
7.6667 782 0.5454 - - - -
7.6765 783 0.5323 - - - -
7.6863 784 3.0509 - - - -
7.6961 785 0.5691 - - - -
7.7059 786 2.4681 - - - -
7.7157 787 0.8033 - - - -
7.7255 788 0.8835 - - - -
7.7353 789 1.5139 - - - -
7.7451 790 0.9881 - - - -
7.7549 791 1.081 - - - -
7.7647 792 1.803 - - - -
7.7745 793 2.7415 - - - -
7.7843 794 0.9567 - - - -
7.7941 795 1.5433 - - - -
7.8039 796 2.4771 - - - -
7.8137 797 2.4648 - - - -
7.8235 798 1.2313 - - - -
7.8333 799 1.6428 - - - -
7.8431 800 1.863 - - - -
7.8529 801 2.4543 - - - -
7.8627 802 1.7924 - - - -
7.8725 803 0.3716 - - - -
7.8824 804 0.8461 - - - -
7.8922 805 1.2871 - - - -
7.9020 806 1.5693 - - - -
7.9118 807 0.8038 - - - -
7.9216 808 1.0625 - - - -
7.9314 809 0.6098 - - - -
7.9412 810 1.4102 - - - -
7.9510 811 1.5963 - - - -
7.9608 812 0.7807 - - - -
7.9706 813 1.7183 - - - -
7.9804 814 1.5457 - - - -
7.9902 815 0.1146 - - - -
8.0 816 0.5686 0.4841 0.4732 0.4474 0.3997
8.0098 817 0.2666 - - - -
8.0196 818 2.6931 - - - -
8.0294 819 1.6724 - - - -
8.0392 820 1.9833 - - - -
8.0490 821 1.0162 - - - -
8.0588 822 0.9381 - - - -
8.0686 823 3.3628 - - - -
8.0784 824 2.9051 - - - -
8.0882 825 1.2757 - - - -
8.0980 826 0.6009 - - - -
8.1078 827 0.8206 - - - -
8.1176 828 1.056 - - - -
8.1275 829 0.6114 - - - -
8.1373 830 3.1316 - - - -
8.1471 831 1.0243 - - - -
8.1569 832 0.6706 - - - -
8.1667 833 2.8132 - - - -
8.1765 834 1.2525 - - - -
8.1863 835 5.7306 - - - -
8.1961 836 0.5077 - - - -
8.2059 837 1.082 - - - -
8.2157 838 1.329 - - - -
8.2255 839 0.897 - - - -
8.2353 840 1.5678 - - - -
8.2451 841 1.5634 - - - -
8.2549 842 0.5244 - - - -
8.2647 843 1.9586 - - - -
8.2745 844 1.1766 - - - -
8.2843 845 1.7682 - - - -
8.2941 846 1.7959 - - - -
8.3039 847 2.3565 - - - -
8.3137 848 0.8934 - - - -
8.3235 849 1.0446 - - - -
8.3333 850 1.3397 - - - -
8.3431 851 1.0805 - - - -
8.3529 852 1.4928 - - - -
8.3627 853 0.6493 - - - -
8.3725 854 0.9378 - - - -
8.3824 855 1.0568 - - - -
8.3922 856 1.1886 - - - -
8.4020 857 3.6991 - - - -
8.4118 858 1.2898 - - - -
8.4216 859 2.2705 - - - -
8.4314 860 1.4883 - - - -
8.4412 861 0.4358 - - - -
8.4510 862 0.6003 - - - -
8.4608 863 0.7857 - - - -
8.4706 864 3.0006 - - - -
8.4804 865 1.9955 - - - -
8.4902 866 0.1742 - - - -
8.5 867 0.7204 - - - -
8.5098 868 2.0722 - - - -
8.5196 869 0.9973 - - - -
8.5294 870 0.6607 - - - -
8.5392 871 0.9087 - - - -
8.5490 872 1.0388 - - - -
8.5588 873 1.5166 - - - -
8.5686 874 2.1062 - - - -
8.5784 875 1.357 - - - -
8.5882 876 0.8239 - - - -
8.5980 877 1.0221 - - - -
8.6078 878 1.5762 - - - -
8.6176 879 0.7868 - - - -
8.6275 880 2.1688 - - - -
8.6373 881 1.5953 - - - -
8.6471 882 0.9909 - - - -
8.6569 883 0.7163 - - - -
8.6667 884 2.9778 - - - -
8.6765 885 2.6271 - - - -
8.6863 886 1.1875 - - - -
8.6961 887 1.9843 - - - -
8.7059 888 0.578 - - - -
8.7157 889 1.7284 - - - -
8.7255 890 1.5817 - - - -
8.7353 891 1.0349 - - - -
8.7451 892 1.2771 - - - -
8.7549 893 0.7346 - - - -
8.7647 894 0.896 - - - -
8.7745 895 0.3089 - - - -
8.7843 896 1.0116 - - - -
8.7941 897 0.782 - - - -
8.8039 898 1.3643 - - - -
8.8137 899 0.8717 - - - -
8.8235 900 1.0883 - - - -
8.8333 901 2.4553 - - - -
8.8431 902 0.8967 - - - -
8.8529 903 1.4815 - - - -
8.8627 904 2.0851 - - - -
8.8725 905 0.8294 - - - -
8.8824 906 1.4176 - - - -
8.8922 907 0.9584 - - - -
8.9020 908 0.8526 - - - -
8.9118 909 2.1568 - - - -
8.9216 910 1.6507 - - - -
8.9314 911 0.8236 - - - -
8.9412 912 2.4097 - - - -
8.9510 913 0.1605 - - - -
8.9608 914 2.1934 - - - -
8.9706 915 0.8835 - - - -
8.9804 916 2.8156 - - - -
8.9902 917 1.193 - - - -
9.0 918 0.4156 0.4914 0.4730 0.4484 0.3994
9.0098 919 0.6944 - - - -
9.0196 920 1.3762 - - - -
9.0294 921 1.4685 - - - -
9.0392 922 1.985 - - - -
9.0490 923 1.532 - - - -
9.0588 924 0.9062 - - - -
9.0686 925 0.5014 - - - -
9.0784 926 2.4734 - - - -
9.0882 927 1.235 - - - -
9.0980 928 2.0355 - - - -
9.1078 929 1.8362 - - - -
9.1176 930 0.2716 - - - -
9.1275 931 2.4027 - - - -
9.1373 932 1.987 - - - -
9.1471 933 0.866 - - - -
9.1569 934 0.5206 - - - -
9.1667 935 1.1732 - - - -
9.1765 936 0.5978 - - - -
9.1863 937 1.4445 - - - -
9.1961 938 1.563 - - - -
9.2059 939 3.7164 - - - -
9.2157 940 0.6754 - - - -
9.2255 941 0.9696 - - - -
9.2353 942 1.9793 - - - -
9.2451 943 0.5348 - - - -
9.2549 944 1.8555 - - - -
9.2647 945 3.1624 - - - -
9.2745 946 1.3702 - - - -
9.2843 947 1.0587 - - - -
9.2941 948 2.3264 - - - -
9.3039 949 0.3401 - - - -
9.3137 950 3.5269 - - - -
9.3235 951 0.8789 - - - -
9.3333 952 0.1871 - - - -
9.3431 953 0.9429 - - - -
9.3529 954 1.2789 - - - -
9.3627 955 1.3322 - - - -
9.3725 956 0.7246 - - - -
9.3824 957 1.335 - - - -
9.3922 958 0.4446 - - - -
9.4020 959 0.7804 - - - -
9.4118 960 3.2016 - - - -
9.4216 961 2.9954 - - - -
9.4314 962 0.4036 - - - -
9.4412 963 1.893 - - - -
9.4510 964 2.0831 - - - -
9.4608 965 0.4715 - - - -
9.4706 966 1.2693 - - - -
9.4804 967 1.2715 - - - -
9.4902 968 2.7172 - - - -
9.5 969 0.6332 - - - -
9.5098 970 1.9209 - - - -
9.5196 971 1.7342 - - - -
9.5294 972 0.6174 - - - -
9.5392 973 1.0254 - - - -
9.5490 974 1.6523 - - - -
9.5588 975 1.0256 - - - -
9.5686 976 1.7213 - - - -
9.5784 977 0.6547 - - - -
9.5882 978 1.0245 - - - -
9.5980 979 1.0071 - - - -
9.6078 980 1.8322 - - - -
9.6176 981 1.1578 - - - -
9.6275 982 0.3978 - - - -
9.6373 983 1.6247 - - - -
9.6471 984 1.4199 - - - -
9.6569 985 1.0348 - - - -
9.6667 986 1.7685 - - - -
9.6765 987 1.9384 - - - -
9.6863 988 1.0472 - - - -
9.6961 989 1.5427 - - - -
9.7059 990 2.572 - - - -
9.7157 991 2.0624 - - - -
9.7255 992 2.6882 - - - -
9.7353 993 1.4118 - - - -
9.7451 994 3.1801 - - - -
9.7549 995 0.2857 - - - -
9.7647 996 0.5164 - - - -
9.7745 997 1.3223 - - - -
9.7843 998 1.1611 - - - -
9.7941 999 2.4415 - - - -
9.8039 1000 1.4446 - - - -
9.8137 1001 1.094 - - - -
9.8235 1002 1.8717 - - - -
9.8333 1003 1.6525 - - - -
9.8431 1004 0.5387 - - - -
9.8529 1005 0.2567 - - - -
9.8627 1006 1.4519 - - - -
9.8725 1007 0.567 - - - -
9.8824 1008 0.6905 - - - -
9.8922 1009 0.8775 - - - -
9.9020 1010 1.9415 - - - -
9.9118 1011 1.538 - - - -
9.9216 1012 1.0921 - - - -
9.9314 1013 2.3688 - - - -
9.9412 1014 1.3336 - - - -
9.9510 1015 2.078 - - - -
9.9608 1016 0.9699 - - - -
9.9706 1017 0.338 - - - -
9.9804 1018 0.4844 - - - -
9.9902 1019 0.8789 - - - -
10.0 1020 0.6449 0.4912 0.473 0.4526 0.4019
  • The saved checkpoint is the epoch 10.0 row (step 1020); its ndcg@10 values match the evaluation tables above.

Framework Versions

  • Python: 3.12.12
  • Sentence Transformers: 5.1.2
  • Transformers: 4.51.3
  • PyTorch: 2.8.0+cu126
  • Accelerate: 1.11.0
  • Datasets: 4.0.0
  • Tokenizers: 0.21.4

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MatryoshkaLoss

@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning},
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}