modernbert-embed-base Fine-tuned on Legal-Domain Data

This is a sentence-transformers model finetuned from nomic-ai/modernbert-embed-base. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: nomic-ai/modernbert-embed-base
  • Maximum Sequence Length: 8192 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Language: en
  • License: apache-2.0

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False, 'architecture': 'ModernBertModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
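The three modules above can be traced by hand: the Transformer produces per-token embeddings, the Pooling module averages them over non-padding tokens (`pooling_mode_mean_tokens=True`), and Normalize projects the result onto the unit sphere. A minimal numpy sketch of the last two stages, using dummy data with the real hidden size (shapes are illustrative, not model output):

```python
import numpy as np

# Dummy token embeddings: (batch, seq_len, hidden) with hidden = 768
rng = np.random.default_rng(0)
token_embeddings = rng.normal(size=(1, 5, 768))
attention_mask = np.array([[1, 1, 1, 0, 0]])  # last two tokens are padding

# (1) Pooling: mean over non-padding tokens only
mask = attention_mask[..., None]              # (batch, seq_len, 1)
summed = (token_embeddings * mask).sum(axis=1)
counts = mask.sum(axis=1)                     # number of real tokens
sentence_embedding = summed / counts          # (batch, 768)

# (2) Normalize: L2-normalise so dot product equals cosine similarity
norm = np.linalg.norm(sentence_embedding, axis=1, keepdims=True)
sentence_embedding = sentence_embedding / norm

print(sentence_embedding.shape)  # (1, 768)
```

Because of the final Normalize step, downstream code can use a plain dot product in place of cosine similarity.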

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    'What action was the plaintiff urged to complete within the next 24 hours?',
    '**Court (Civil/Criminal): Civil**\n\n**Provisions:**\n\n**Time of commission of the act:**\n\n**Outcome (not guilty, guilty):**\n\n**Rationale:**\n\n**Facts:**\nThe plaintiff holds credit card number ............ with the defendant banking corporation. Based on the application for alternative networks dated 19/7/2015 with number ......... submitted at a branch of the defendant, he was granted access to the electronic banking service (e-banking) to conduct banking transactions (debit, credit, updates, payments) remotely. On 30/11/2020, the plaintiff fell victim to electronic fraud through the "phishing" method, whereby an unknown perpetrator managed to withdraw a total amount of €3,121.75 from the aforementioned credit card. Specifically, the plaintiff received an email at 1:35 PM on 29/11/2020 from sender ...... with address ........, informing him that due to an impending system change, he needed to verify the mobile phone number linked to the credit card, urging him to complete the verification process within the next 24 hours by following a link titled ........; otherwise, his account would be locked for security reasons. The plaintiff read this email on the afternoon of 30 November 2020 and, believing it was from the defendant, followed the instructions and proceeded via the provided link to a website that was identical (a clone) to that of the defendant. On this page, he was asked to enter the six-digit security code (.........) that had just been sent to his mobile phone by the defendant at 3:41 PM, with the note that it was an activation code for his ........ card at ........., which he entered.\n\nSubsequently, the plaintiff received, according to his statements, a new email (not submitted), which requested him to enter the details of the aforementioned credit card, specifically the name of the cardholder and the card number, not the PIN, which he also entered, convinced that he was within the online environment of the defendant. 
Then, at 3:47 PM, he received a message on his mobile phone from the defendant containing the exact same content as the one he received at 3:41 PM, while at 3:50 PM he received a message stating that the activation of his ......... card at ....... had been completed. Once the plaintiff read this, he became concerned that something was not right, and immediately called (at 4:41 PM) the defendant\'s call center to inform them. There, the employees, with whom he finally connected at 5:04 PM due to high call center volume, advised him to delete the relevant emails, cancel his credit card, change his access passwords for the service, and submit a dispute request regarding the conducted transactions. The plaintiff electronically sent this request to the defendant, disputing the detailed transactions amounting to €3,121.75, which were conducted on 30/11/2020 during the time frame of 16:37:45-16:43:34 PM, arguing that he had neither performed them himself nor authorized anyone else to do so. The plaintiff specifically disputed the following transactions, as evidenced by the account activity of the disputed credit card during the aforementioned timeframe: a) transaction number ......... amounting to €150.62 conducted on 30/11/2020 at 4:43:34 PM, b) transaction number ........ amounting to €293.20 conducted on 30/11/2020 at 4:42:40 PM, c) transaction number ............ amounting to €295.21 conducted on 30/11/2020 at 4:42:10 PM, d) transaction number .......... amounting to €299.22 conducted on 30/11/2020 at 4:41:31 PM, e) transaction number ........ amounting to €297.21 conducted on 30/11/2020 at 4:41:01 PM, f) transaction number ........ amounting to €299.22 conducted on 30/11/2020 at 4:40:27 PM, g) transaction number ....... amounting to €299.22 conducted on 30/11/2020 at 4:39:55 PM, h) transaction number ...... amounting to €299.22 conducted on 30/11/2020 at 4:39:22 PM, i) transaction number ......... 
amounting to €297.22 conducted on 30/11/2020 at 4:38:52 PM, j) transaction number ......... amounting to €295.21 conducted on 30/11/2020 at 4:38:17 PM, and k) transaction number ......... amounting to €296.21 conducted on 30/11/2020 at 4:37:45 PM. In its response letter dated 21/12/2020, the defendant denied responsibility for the costs of the aforementioned transactions, placing the entire blame on the plaintiff for the leak of his card details and security code to the fraudulent page. The plaintiff, completely denying any fault for the conducted transactions, repeatedly contacted the defendant, both by phone and via email (see emails dated 15/1/2021 and 11/2/2021), while on 2/3/2021, he electronically sent a report dated 1/03/2021 to the Consumer Advocate’s email address, recounting the events and requesting that the aforementioned Independent Authority intervene to have the disputed debt canceled. In its letter with reference number ...../27.04.2021, the aforementioned Independent Authority informed the plaintiff that the case was outside its mediating role and was therefore archived. Subsequently, the plaintiff sent the defendant on 5/3/2021 his extrajudicial statement dated 4/3/2021, calling upon it to fully cancel the debt of €3,121.75 that had been unjustly incurred against him within two days and to immediately instruct the representatives of the collection agency working with it to cease contacting him regarding the disputed case. 
The defendant sent the plaintiff a message on his mobile phone on 20/04/2021 informing him that his case was still being processed due to lengthy operational requirements, while on 23/04/2021, via email, it informed him that considering their good cooperation and his efforts to keep them updated, it had reviewed his case and decided to refund him the amounts of the transactions that were conducted after his contact with their representatives on 30/11/2020 at 4:41 PM, totaling €1,038.25, specifically the following: a) transaction of €150.62 conducted on 30/11/2020 at 4:43 PM, b) transaction of €295.21 conducted on 30/11/2020 at 4:42 PM, c) transaction of €293.20 conducted on 30/11/2020 at 4:42 PM, and d) transaction of €299.22 conducted on 30/11/2020 at 4:41 PM. Beyond this, the defendant refused to refund the plaintiff the amount of the remaining transactions conducted on 30/11/2020, totaling €2,376.08 (and not €2,376.48 as incorrectly stated by the plaintiff in his lawsuit), which the plaintiff ultimately fully paid, transferring €2,342.77 to the defendant on 7/06/2021 and €33.31 on 15/06/2021 (see related deposit receipts).',
    "1.Without prejudice to other tasks set out under this Regulation, each supervisory authority shall on its territory: (a)  monitor and enforce the application of this Regulation; (b)  promote public awareness and understanding of the risks, rules, safeguards and rights in relation to processing. Activities addressed specifically to children shall receive specific attention; (c)  advise, in accordance with Member State law, the national parliament, the government, and other institutions and bodies on legislative and administrative measures relating to the protection of natural persons' rights and freedoms with regard to processing; (d)  promote the awareness of controllers and processors of their obligations under this Regulation; (e)  upon request, provide information to any data subject concerning the exercise of their rights under this Regulation and, if appropriate, cooperate with the supervisory authorities in other Member States to that end; (f)  handle complaints lodged by a data subject, or by a body, organisation or association in accordance with Article 80, and investigate, to the extent appropriate, the subject matter of the complaint and inform the complainant of the progress and the outcome of the investigation within a reasonable period, in particular if further investigation or coordination with another supervisory authority is necessary; (g)  cooperate with, including sharing information and provide mutual assistance to, other supervisory authorities with a view to ensuring the consistency of application and enforcement of this Regulation; (h)  conduct investigations on the application of this Regulation, including on the basis of information received from another supervisory authority or other public authority; (i)  monitor relevant developments, insofar as they have an impact on the protection of personal data, in particular the development of information and communication technologies and commercial practices; (j)  adopt standard contractual 
clauses referred to in Article 28(8) and in point (d) of Article 46(2); (k)  establish and maintain a list in relation to the requirement for data protection impact assessment pursuant to Article 35(4); (l)  give advice on the processing operations referred to in Article 36(2); (m)  encourage the drawing up of codes of conduct pursuant to Article 40(1) and provide an opinion and approve such codes of conduct which provide sufficient safeguards, pursuant to Article 40(5); (n)  encourage the establishment of data protection certification mechanisms and of data protection seals and marks pursuant to Article 42(1), and approve the criteria of certification pursuant to Article 42(5); (o)  where applicable, carry out a periodic review of certifications issued in accordance with Article 42(7); 4.5.2016 L 119/68   (p)  draft and publish the criteria for accreditation of a body for monitoring codes of conduct pursuant to Article 41 and of a certification body pursuant to Article 43; (q)  conduct the accreditation of a body for monitoring codes of conduct pursuant to Article 41 and of a certification body pursuant to Article 43; (r)  authorise contractual clauses and provisions referred to in Article 46(3); (s)  approve binding corporate rules pursuant to Article 47; (t)  contribute to the activities of the Board; (u)  keep internal records of infringements of this Regulation and of measures taken in accordance with Article 58(2); and (v)  fulfil any other tasks related to the protection of personal data.\n2.Each supervisory authority shall facilitate the submission of complaints referred to in point (f) of paragraph 1 by measures such as a complaint submission form which can also be completed electronically, without excluding other means of communication.\n3.The performance of the tasks of each supervisory authority shall be free of charge for the data subject and, where applicable, for the data protection officer.\n4.Where requests are manifestly unfounded or excessive, in 
particular because of their repetitive character, the supervisory authority may charge a reasonable fee based on administrative costs, or refuse to act on the request. The supervisory authority shall bear the burden of demonstrating the manifestly unfounded or excessive character of the request.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[ 1.0000,  0.3932, -0.0156],
#         [ 0.3932,  1.0000, -0.0606],
#         [-0.0156, -0.0606,  1.0000]])
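Because the model was trained with MatryoshkaLoss (see Training Details), its embeddings can be truncated to a leading prefix of 512, 256, 128 or 64 dimensions and re-normalised, trading a modest quality drop for smaller indexes; sentence-transformers exposes this via the `truncate_dim` constructor argument. A numpy sketch of the truncation itself, on dummy unit vectors standing in for real model output:

```python
import numpy as np

# Dummy "full" embeddings; model output is already unit-norm
rng = np.random.default_rng(0)
full = rng.normal(size=(2, 768))
full /= np.linalg.norm(full, axis=1, keepdims=True)

# Keep the leading prefix, then re-normalise (this is what truncate_dim does)
dim = 256
truncated = full[:, :dim]
truncated /= np.linalg.norm(truncated, axis=1, keepdims=True)

# Cosine similarity still works on the truncated vectors
sim = truncated @ truncated.T
print(sim.shape)  # (2, 2)
```

With the real model this would be `SentenceTransformer("sentence_transformers_model_id", truncate_dim=256)`.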

Evaluation

Metrics

Information Retrieval (dim_768)

Metric Value
cosine_accuracy@1 0.516
cosine_accuracy@3 0.5676
cosine_accuracy@5 0.5946
cosine_accuracy@10 0.6314
cosine_precision@1 0.516
cosine_precision@3 0.507
cosine_precision@5 0.4811
cosine_precision@10 0.4199
cosine_recall@1 0.0922
cosine_recall@3 0.2413
cosine_recall@5 0.3284
cosine_recall@10 0.4384
cosine_ndcg@10 0.5731
cosine_mrr@10 0.5448
cosine_map@100 0.6213

Information Retrieval (dim_512)

Metric Value
cosine_accuracy@1 0.5061
cosine_accuracy@3 0.5455
cosine_accuracy@5 0.5848
cosine_accuracy@10 0.6241
cosine_precision@1 0.5061
cosine_precision@3 0.4922
cosine_precision@5 0.4678
cosine_precision@10 0.4101
cosine_recall@1 0.0925
cosine_recall@3 0.2351
cosine_recall@5 0.3212
cosine_recall@10 0.4313
cosine_ndcg@10 0.5621
cosine_mrr@10 0.5338
cosine_map@100 0.6107

Information Retrieval (dim_256)

Metric Value
cosine_accuracy@1 0.4988
cosine_accuracy@3 0.5504
cosine_accuracy@5 0.5676
cosine_accuracy@10 0.5897
cosine_precision@1 0.4988
cosine_precision@3 0.4889
cosine_precision@5 0.4614
cosine_precision@10 0.3983
cosine_recall@1 0.0915
cosine_recall@3 0.2331
cosine_recall@5 0.3163
cosine_recall@10 0.4234
cosine_ndcg@10 0.5488
cosine_mrr@10 0.5232
cosine_map@100 0.6028

Information Retrieval (dim_128)

Metric Value
cosine_accuracy@1 0.4619
cosine_accuracy@3 0.5135
cosine_accuracy@5 0.5332
cosine_accuracy@10 0.57
cosine_precision@1 0.4619
cosine_precision@3 0.4562
cosine_precision@5 0.43
cosine_precision@10 0.3764
cosine_recall@1 0.0832
cosine_recall@3 0.2142
cosine_recall@5 0.2904
cosine_recall@10 0.3925
cosine_ndcg@10 0.5153
cosine_mrr@10 0.4887
cosine_map@100 0.5673

Information Retrieval (dim_64)

Metric Value
cosine_accuracy@1 0.4201
cosine_accuracy@3 0.4545
cosine_accuracy@5 0.4767
cosine_accuracy@10 0.5061
cosine_precision@1 0.4201
cosine_precision@3 0.4103
cosine_precision@5 0.3857
cosine_precision@10 0.3383
cosine_recall@1 0.0752
cosine_recall@3 0.1898
cosine_recall@5 0.2534
cosine_recall@10 0.3454
cosine_ndcg@10 0.4614
cosine_mrr@10 0.4411
cosine_map@100 0.5119
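For readers unfamiliar with these metrics, a toy example (a hypothetical ranking, not drawn from the actual evaluation) showing how accuracy@k, precision@k, recall@k and NDCG@10 are computed for a single query with three relevant documents:

```python
import numpy as np

# One query: 3 relevant documents, a ranked list of the top 10 retrieved ids
relevant = {"d1", "d4", "d7"}
ranked = ["d1", "d9", "d4", "d2", "d3", "d5", "d6", "d8", "d0", "d7"]

hits = [1 if doc in relevant else 0 for doc in ranked]

accuracy_at_1 = hits[0]                    # is the top result relevant?
precision_at_10 = sum(hits) / 10           # fraction of top-10 that is relevant
recall_at_10 = sum(hits) / len(relevant)   # fraction of relevant docs retrieved

# NDCG@10: discounted gain of the actual ranking vs. the ideal ranking
dcg = sum(h / np.log2(rank + 2) for rank, h in enumerate(hits))
idcg = sum(1 / np.log2(rank + 2) for rank in range(min(len(relevant), 10)))
ndcg_at_10 = dcg / idcg
```

The reported figures average these per-query values over the whole evaluation set.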

Training Details

Training Dataset

Unnamed Dataset

  • Size: 1,627 training samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    anchor: string; min 7, mean 15.44, max 35 tokens
    positive: string; min 25, mean 627.36, max 2429 tokens
  • Samples:
    anchor: What type of data processing triggers the obligations for entities with fewer than 250 employees?
    positive: 1.Each controller and, where applicable, the controller's representative, shall maintain a record of processing activities under its responsibility. That record shall contain all of the following information: (a) the name and contact details of the controller and, where applicable, the joint controller, the controller's representative and the data protection officer; (b) the purposes of the processing; (c) a description of the categories of data subjects and of the categories of personal data; 4.5.2016 L 119/50 (d) the categories of recipients to whom the personal data have been or will be disclosed including recipients in third countries or international organisations; (e) where applicable, transfers of personal data to a third country or an international organisation, including the identification of that third country or international organisation and, in the case of transfers referred to in the second subparagraph of Article 49(1), the documentation of suitable safeguards; (f) ...
    anchor: For what purposes can the data subject object to data processing based on personal grounds?
    positive: 1.The data subject shall have the right to object, on grounds relating to his or her particular situation, at any time to processing of personal data concerning him or her which is based on point (e) or (f) of Article 6(1), including profiling based on those provisions. The controller shall no longer process the personal data unless the controller demonstrates compelling legitimate grounds for the processing which override the interests, rights and freedoms of the data subject or for the establishment, exercise or defence of legal claims.
    2.Where personal data are processed for direct marketing purposes, the data subject shall have the right to object at any time to processing of personal data concerning him or her for such marketing, which includes profiling to the extent that it is related to such direct marketing.
    3.Where the data subject objects to processing for direct marketing purposes, the personal data shall no longer be processed for such purposes. 4.5.2016 L 119/45
    4.At th...
    anchor: What should legally binding measures refer to?
    positive: In order to ensure consistent monitoring and enforcement of this Regulation throughout the Union, the supervisory authorities should have in each Member State the same tasks and effective powers, including powers of investigation, corrective powers and sanctions, and authorisation and advisory powers, in particular in cases of complaints from natural persons, and without prejudice to the powers of prosecutorial authorities under Member State law, to bring infringements of this Regulation to the attention of the judicial authorities and engage in legal proceedings. Such powers should also include the power to impose a temporary or definitive limitation, including a ban, on processing. Member States may specify other tasks related to the protection of personal data under this Regulation. The powers of supervisory authorities should be exercised in accordance with appropriate procedural safeguards set out in Union and Member State law, impartially, fairly and within a reasonable time. In ...
  • Loss: MatryoshkaLoss with these parameters:
    {
        "loss": "MultipleNegativesRankingLoss",
        "matryoshka_dims": [
            768,
            512,
            256,
            128,
            64
        ],
        "matryoshka_weights": [
            1,
            1,
            1,
            1,
            1
        ],
        "n_dims_per_step": -1
    }
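A numpy sketch of this objective, assuming the standard formulation: MultipleNegativesRankingLoss is in-batch cross-entropy over scaled cosine similarities, with the diagonal anchor-positive pairs as the targets, and MatryoshkaLoss applies that loss to each prefix dimension, combined with the weights above. Dummy vectors stand in for model output:

```python
import numpy as np

def mnr_loss(anchors, positives, scale=20.0):
    """In-batch cross-entropy over scaled cosine similarities (sketch)."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    scores = scale * (a @ p.T)  # (batch, batch); diagonal = true pairs
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
anchors = rng.normal(size=(8, 768))    # dummy anchor embeddings
positives = rng.normal(size=(8, 768))  # dummy positive embeddings

# MatryoshkaLoss: apply the inner loss at each prefix dimension
dims = [768, 512, 256, 128, 64]
weights = [1, 1, 1, 1, 1]
total = sum(w * mnr_loss(anchors[:, :d], positives[:, :d])
            for d, w in zip(dims, weights))
print(total)
```

Training on every prefix simultaneously is what makes the truncated embeddings shown in the evaluation tables usable on their own.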
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: epoch
  • gradient_accumulation_steps: 2
  • learning_rate: 2e-05
  • num_train_epochs: 10
  • lr_scheduler_type: cosine
  • warmup_ratio: 0.1
  • bf16: True
  • tf32: True
  • load_best_model_at_end: True
  • optim: adamw_torch_fused
  • batch_sampler: no_duplicates
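The non-default values above map onto the sentence-transformers v3 training API roughly as follows (a sketch, not the exact training script: `output_dir` is a placeholder, and argument names follow the library's documented `SentenceTransformerTrainingArguments`):

```python
from sentence_transformers.training_args import (
    SentenceTransformerTrainingArguments,
    BatchSamplers,
)

args = SentenceTransformerTrainingArguments(
    output_dir="output",                 # placeholder path
    num_train_epochs=10,
    per_device_train_batch_size=8,
    gradient_accumulation_steps=2,       # effective batch size 16
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,                           # assumes an Ampere-class (or newer) GPU
    tf32=True,
    eval_strategy="epoch",
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoid duplicate texts per batch
)
```

The `no_duplicates` sampler matters for MultipleNegativesRankingLoss, which treats every other in-batch example as a negative.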

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: epoch
  • prediction_loss_only: True
  • per_device_train_batch_size: 8
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 2
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 10
  • max_steps: -1
  • lr_scheduler_type: cosine
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: True
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • tp_size: 0
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss dim_768_cosine_ndcg@10 dim_512_cosine_ndcg@10 dim_256_cosine_ndcg@10 dim_128_cosine_ndcg@10 dim_64_cosine_ndcg@10
0.0098 1 17.4853 - - - - -
0.0196 2 15.5822 - - - - -
0.0294 3 8.6936 - - - - -
0.0392 4 12.4725 - - - - -
0.0490 5 8.4815 - - - - -
0.0588 6 10.6456 - - - - -
0.0686 7 14.3675 - - - - -
0.0784 8 10.4737 - - - - -
0.0882 9 10.21 - - - - -
0.0980 10 8.6732 - - - - -
0.1078 11 6.8418 - - - - -
0.1176 12 8.9665 - - - - -
0.1275 13 7.3845 - - - - -
0.1373 14 7.4307 - - - - -
0.1471 15 8.574 - - - - -
0.1569 16 13.5512 - - - - -
0.1667 17 9.0797 - - - - -
0.1765 18 10.7027 - - - - -
0.1863 19 8.1399 - - - - -
0.1961 20 5.2519 - - - - -
0.2059 21 5.7496 - - - - -
0.2157 22 6.6065 - - - - -
0.2255 23 7.2297 - - - - -
0.2353 24 9.7108 - - - - -
0.2451 25 6.0649 - - - - -
0.2549 26 12.5883 - - - - -
0.2647 27 3.3575 - - - - -
0.2745 28 8.5852 - - - - -
0.2843 29 11.2424 - - - - -
0.2941 30 3.9034 - - - - -
0.3039 31 4.0761 - - - - -
0.3137 32 6.7761 - - - - -
0.3235 33 11.8273 - - - - -
0.3333 34 9.8458 - - - - -
0.3431 35 10.1052 - - - - -
0.3529 36 12.6555 - - - - -
0.3627 37 7.6164 - - - - -
0.3725 38 6.2306 - - - - -
0.3824 39 9.7449 - - - - -
0.3922 40 9.503 - - - - -
0.4020 41 3.9503 - - - - -
0.4118 42 9.0781 - - - - -
0.4216 43 7.6947 - - - - -
0.4314 44 7.4759 - - - - -
0.4412 45 8.9653 - - - - -
0.4510 46 5.9928 - - - - -
0.4608 47 7.7841 - - - - -
0.4706 48 4.9347 - - - - -
0.4804 49 8.5892 - - - - -
0.4902 50 7.7227 - - - - -
0.5 51 8.8423 - - - - -
0.5098 52 7.9743 - - - - -
0.5196 53 6.0536 - - - - -
0.5294 54 6.2513 - - - - -
0.5392 55 3.7778 - - - - -
0.5490 56 9.3877 - - - - -
0.5588 57 9.3963 - - - - -
0.5686 58 5.6104 - - - - -
0.5784 59 5.8724 - - - - -
0.5882 60 3.8029 - - - - -
0.5980 61 8.6739 - - - - -
0.6078 62 4.371 - - - - -
0.6176 63 10.0285 - - - - -
0.6275 64 9.1923 - - - - -
0.6373 65 5.1715 - - - - -
0.6471 66 6.2528 - - - - -
0.6569 67 5.3587 - - - - -
0.6667 68 8.1243 - - - - -
0.6765 69 5.5494 - - - - -
0.6863 70 4.4475 - - - - -
0.6961 71 4.8783 - - - - -
0.7059 72 5.4655 - - - - -
0.7157 73 1.4754 - - - - -
0.7255 74 6.2656 - - - - -
0.7353 75 8.3554 - - - - -
0.7451 76 6.1232 - - - - -
0.7549 77 2.2596 - - - - -
0.7647 78 4.9636 - - - - -
0.7745 79 5.6401 - - - - -
0.7843 80 5.5852 - - - - -
0.7941 81 8.55 - - - - -
0.8039 82 5.2085 - - - - -
0.8137 83 5.7077 - - - - -
0.8235 84 3.9988 - - - - -
0.8333 85 8.3305 - - - - -
0.8431 86 7.063 - - - - -
0.8529 87 6.9146 - - - - -
0.8627 88 7.1729 - - - - -
0.8725 89 5.6916 - - - - -
0.8824 90 4.689 - - - - -
0.8922 91 10.2449 - - - - -
0.9020 92 4.4491 - - - - -
0.9118 93 7.1342 - - - - -
0.9216 94 6.8294 - - - - -
0.9314 95 6.429 - - - - -
0.9412 96 2.6789 - - - - -
0.9510 97 5.7232 - - - - -
0.9608 98 4.0619 - - - - -
0.9706 99 4.7323 - - - - -
0.9804 100 5.403 - - - - -
0.9902 101 7.4416 - - - - -
1.0 102 2.2006 0.5075 0.4838 0.4900 0.4167 0.3557
1.0098 103 2.523 - - - - -
1.0196 104 1.4913 - - - - -
1.0294 105 9.9224 - - - - -
1.0392 106 7.2427 - - - - -
1.0490 107 3.8076 - - - - -
1.0588 108 3.5538 - - - - -
1.0686 109 2.9958 - - - - -
1.0784 110 5.0938 - - - - -
1.0882 111 1.6151 - - - - -
1.0980 112 2.4825 - - - - -
1.1078 113 2.6052 - - - - -
1.1176 114 3.2484 - - - - -
1.1275 115 1.7134 - - - - -
1.1373 116 4.5488 - - - - -
1.1471 117 2.9845 - - - - -
1.1569 118 3.6352 - - - - -
1.1667 119 5.681 - - - - -
1.1765 120 2.7269 - - - - -
1.1863 121 3.1317 - - - - -
1.1961 122 8.6548 - - - - -
1.2059 123 1.2277 - - - - -
1.2157 124 4.8203 - - - - -
1.2255 125 5.0602 - - - - -
1.2353 126 5.9304 - - - - -
1.2451 127 3.8992 - - - - -
1.2549 128 4.6071 - - - - -
1.2647 129 7.071 - - - - -
1.2745 130 2.796 - - - - -
1.2843 131 4.1005 - - - - -
1.2941 132 2.4508 - - - - -
1.3039 133 3.0313 - - - - -
1.3137 134 1.6569 - - - - -
1.3235 135 5.6474 - - - - -
1.3333 136 5.0485 - - - - -
1.3431 137 5.342 - - - - -
1.3529 138 2.1806 - - - - -
1.3627 139 2.3089 - - - - -
1.3725 140 2.0881 - - - - -
1.3824 141 1.2435 - - - - -
1.3922 142 2.3912 - - - - -
1.4020 143 1.7524 - - - - -
1.4118 144 5.1758 - - - - -
1.4216 145 1.9937 - - - - -
1.4314 146 3.3948 - - - - -
1.4412 147 4.8789 - - - - -
1.4510 148 1.9967 - - - - -
1.4608 149 1.9438 - - - - -
1.4706 150 5.8335 - - - - -
1.4804 151 3.2073 - - - - -
1.4902 152 8.3916 - - - - -
1.5 153 1.6447 - - - - -
1.5098 154 2.7262 - - - - -
1.5196 155 4.0002 - - - - -
1.5294 156 2.0588 - - - - -
1.5392 157 1.9514 - - - - -
1.5490 158 2.0048 - - - - -
1.5588 159 4.8991 - - - - -
1.5686 160 5.2414 - - - - -
1.5784 161 2.193 - - - - -
1.5882 162 4.6859 - - - - -
1.5980 163 3.1137 - - - - -
1.6078 164 2.8398 - - - - -
1.6176 165 4.6547 - - - - -
1.6275 166 4.1404 - - - - -
1.6373 167 5.2769 - - - - -
1.6471 168 3.6466 - - - - -
1.6569 169 1.2928 - - - - -
1.6667 170 7.6842 - - - - -
1.6765 171 3.6167 - - - - -
1.6863 172 1.5441 - - - - -
1.6961 173 4.6245 - - - - -
1.7059 174 3.4359 - - - - -
1.7157 175 5.561 - - - - -
1.7255 176 9.2408 - - - - -
1.7353 177 3.4619 - - - - -
1.7451 178 0.7945 - - - - -
1.7549 179 1.4854 - - - - -
1.7647 180 4.4899 - - - - -
1.7745 181 2.9133 - - - - -
1.7843 182 2.2408 - - - - -
1.7941 183 3.7768 - - - - -
1.8039 184 3.2455 - - - - -
1.8137 185 3.9414 - - - - -
1.8235 186 2.1961 - - - - -
1.8333 187 2.4825 - - - - -
1.8431 188 3.2995 - - - - -
1.8529 189 2.8202 - - - - -
1.8627 190 6.1953 - - - - -
1.8725 191 3.3925 - - - - -
1.8824 192 3.3051 - - - - -
1.8922 193 4.141 - - - - -
1.9020 194 8.7842 - - - - -
1.9118 195 2.0724 - - - - -
1.9216 196 5.1611 - - - - -
1.9314 197 5.0744 - - - - -
1.9412 198 1.7611 - - - - -
1.9510 199 1.9447 - - - - -
1.9608 200 1.0533 - - - - -
1.9706 201 6.2447 - - - - -
1.9804 202 1.6885 - - - - -
1.9902 203 2.0872 - - - - -
2.0 204 4.7202 0.5224 0.5188 0.5182 0.4677 0.3829
2.0098 205 1.8304 - - - - -
2.0196 206 0.9245 - - - - -
2.0294 207 2.1126 - - - - -
2.0392 208 5.1247 - - - - -
2.0490 209 1.9362 - - - - -
2.0588 210 2.6958 - - - - -
2.0686 211 2.4759 - - - - -
2.0784 212 2.092 - - - - -
2.0882 213 4.3632 - - - - -
2.0980 214 2.8144 - - - - -
2.1078 215 0.6525 - - - - -
2.1176 216 0.7783 - - - - -
2.1275 217 3.2555 - - - - -
2.1373 218 2.5865 - - - - -
2.1471 219 3.927 - - - - -
2.1569 220 0.5981 - - - - -
2.1667 221 5.5659 - - - - -
2.1765 222 2.2788 - - - - -
2.1863 223 1.8267 - - - - -
2.1961 224 2.0744 - - - - -
2.2059 225 3.8103 - - - - -
2.2157 226 1.1361 - - - - -
2.2255 227 3.3677 - - - - -
2.2353 228 3.0295 - - - - -
2.2451 229 1.5912 - - - - -
2.2549 230 4.2332 - - - - -
2.2647 231 3.0785 - - - - -
2.2745 232 2.137 - - - - -
2.2843 233 3.521 - - - - -
2.2941 234 5.2255 - - - - -
2.3039 235 5.3743 - - - - -
2.3137 236 2.6036 - - - - -
2.3235 237 0.571 - - - - -
2.3333 238 0.5066 - - - - -
2.3431 239 2.6968 - - - - -
2.3529 240 1.0818 - - - - -
2.3627 241 0.9833 - - - - -
2.3725 242 0.8127 - - - - -
2.3824 243 0.9684 - - - - -
2.3922 244 4.3469 - - - - -
2.4020 245 3.7872 - - - - -
2.4118 246 0.6947 - - - - -
2.4216 247 1.0844 - - - - -
2.4314 248 0.4574 - - - - -
2.4412 249 2.5933 - - - - -
2.4510 250 1.6238 - - - - -
2.4608 251 1.5579 - - - - -
2.4706 252 3.1798 - - - - -
2.4804 253 1.3299 - - - - -
2.4902 254 1.431 - - - - -
2.5 255 1.0556 - - - - -
2.5098 256 2.3683 - - - - -
2.5196 257 3.6157 - - - - -
2.5294 258 1.5859 - - - - -
2.5392 259 1.2728 - - - - -
2.5490 260 2.0595 - - - - -
2.5588 261 2.7455 - - - - -
2.5686 262 1.3221 - - - - -
2.5784 263 1.7831 - - - - -
Validation metrics (one column per Matryoshka embedding dimension) were computed at the end of each epoch; between these checkpoints, the per-step training loss fluctuated between roughly 0.008 and 7.7.

| Epoch | Step | Training Loss | dim_768 | dim_512 | dim_256 | dim_128 | dim_64 |
|:-----:|:----:|:-------------:|:-------:|:-------:|:-------:|:-------:|:------:|
| 3.0 | 306 | 0.2322 | 0.5170 | 0.5126 | 0.5129 | 0.4741 | 0.4248 |
| 4.0 | 408 | 0.604 | 0.5387 | 0.5246 | 0.5152 | 0.4760 | 0.4248 |
| 5.0 | 510 | 0.2192 | 0.5473 | 0.5499 | 0.5326 | 0.5063 | 0.4490 |
| 6.0 | 612 | 0.2262 | 0.5626 | 0.5517 | 0.5370 | 0.4973 | 0.4439 |
| 7.0 | 714 | 0.8007 | 0.5614 | 0.5626 | 0.5485 | 0.5108 | 0.4514 |
| 8.0 | 816 | 0.1073 | 0.5691 | 0.5611 | 0.5543 | 0.5114 | 0.4612 |
| 9.0 | 918 | 0.1319 | 0.5718 | 0.5614 | 0.5493 | 0.5161 | 0.4632 |
| **10.0** | **1020** | **0.0307** | **0.5731** | **0.5621** | **0.5488** | **0.5153** | **0.4614** |
  • The bold row denotes the saved checkpoint.
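Because the model was trained with MatryoshkaLoss, the validation columns track retrieval quality at progressively smaller embedding sizes (768, 512, 256, 128, 64 — inferred from the loss configuration; the exact metric names are not shown in this excerpt). A minimal sketch of how full-size embeddings can be truncated to a smaller Matryoshka dimension and L2-renormalized before cosine similarity (the function name is illustrative, not part of any library):

```python
import numpy as np

def truncate_and_renormalize(embeddings: np.ndarray, dim: int) -> np.ndarray:
    """Keep the first `dim` components of each embedding and L2-renormalize,
    so cosine similarity remains a simple dot product."""
    truncated = embeddings[:, :dim]
    norms = np.linalg.norm(truncated, axis=1, keepdims=True)
    return truncated / norms

# Example with random stand-in embeddings (a real model would produce 768-d vectors)
rng = np.random.default_rng(0)
full = rng.normal(size=(4, 768))
small = truncate_and_renormalize(full, 256)
print(small.shape)  # (4, 256)
```

With Sentence Transformers, the same effect can typically be achieved by passing `truncate_dim` to the `SentenceTransformer` constructor, so the smaller embeddings come straight from `model.encode`.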

Framework Versions

  • Python: 3.12.12
  • Sentence Transformers: 5.1.2
  • Transformers: 4.51.3
  • PyTorch: 2.8.0+cu126
  • Accelerate: 1.11.0
  • Datasets: 4.0.0
  • Tokenizers: 0.21.4

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MatryoshkaLoss

@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning},
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
Model tree for IoannisKat1/modernbert-embed-base-ft-new
