bge-m3 Fine-tuned on GDPR Data

This is a sentence-transformers model finetuned from BAAI/bge-m3. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: BAAI/bge-m3
  • Maximum Sequence Length: 8192 tokens
  • Output Dimensionality: 1024 dimensions
  • Similarity Function: Cosine Similarity
  • Language: en
  • License: apache-2.0

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False, 'architecture': 'XLMRobertaModel'})
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
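
Because the pipeline ends with a Normalize() module, every embedding is unit-length, so cosine similarity coincides with a plain dot product. A minimal sketch verifying this (the sentences are placeholders):

import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("IoannisKat1/bge-m3-ft-new")
embeddings = model.encode(["a sample sentence", "another sample sentence"])

# Rows are unit-norm thanks to the final Normalize() module
print(np.linalg.norm(embeddings, axis=1))  # approximately [1.0, 1.0]

# Hence the dot-product matrix equals the cosine-similarity matrix
print(embeddings @ embeddings.T)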

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("IoannisKat1/bge-m3-ft-new")
# Run inference
sentences = [
    'What may impede authorities in the discharge of their responsibilities under Union law?',
    'The objectives and principles of Directive 95/46/EC remain sound, but it has not prevented fragmentation in the implementation of data protection across the Union, legal uncertainty or a widespread public perception that there are significant risks to the protection of natural persons, in particular with regard to online activity. Differences in the level of protection of the rights and freedoms of natural persons, in particular the right to the protection of personal data, with regard to the processing of personal data in the Member States may prevent the free flow of personal data throughout the Union. Those differences may therefore constitute an obstacle to the pursuit of economic activities at the level of the Union, distort competition and impede authorities in the discharge of their responsibilities under Union law. Such a difference in levels of protection is due to the existence of differences in the implementation and application of Directive 95/46/EC.',
    'The processing of personal data for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes should be subject to appropriate safeguards for the rights and freedoms of the data subject pursuant to this Regulation. Those safeguards should ensure that technical and organisational measures are in place in order to ensure, in particular, the principle of data minimisation. The further processing of personal data for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes is to be carried out when the controller has assessed the feasibility to fulfil those purposes by processing data which do not permit or no longer permit the identification of data subjects, provided that appropriate safeguards exist (such as, for instance, pseudonymisation of the data). Member States should provide for appropriate safeguards for the processing of personal data for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes. Member States should be authorised to provide, under specific conditions and subject to appropriate safeguards for data subjects, specifications and derogations with regard to the information requirements and rights to rectification, to erasure, to be forgotten, to restriction of processing, to data portability, and to object when processing personal data for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes. The conditions and safeguards in question may entail specific procedures for data subjects to exercise those rights if this is appropriate in the light of the purposes sought by the specific processing along with technical and organisational measures aimed at minimising the processing of personal data in pursuance of the proportionality and necessity principles. The processing of personal data for scientific purposes should also comply with other relevant legislation such as on clinical trials.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 1024)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.6716, 0.2994],
#         [0.6716, 1.0000, 0.3108],
#         [0.2994, 0.3108, 1.0000]])
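
The same embeddings support semantic search over a corpus. A minimal sketch using the library's util.semantic_search helper; the corpus and query strings below are illustrative placeholders:

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("IoannisKat1/bge-m3-ft-new")

corpus = [
    "Differences in the level of protection may prevent the free flow of personal data.",
    "Processing for archiving purposes should be subject to appropriate safeguards.",
]
query = "What may impede the free flow of personal data?"

# Encode as tensors so semantic_search can score the query against the corpus directly
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Rank the corpus by cosine similarity and keep the top 2 hits
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
for hit in hits:
    print(f"{hit['score']:.4f}  {corpus[hit['corpus_id']]}")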

Evaluation

Metrics

Information Retrieval (dim_768)

Metric Value
cosine_accuracy@1 0.4373
cosine_accuracy@3 0.4717
cosine_accuracy@5 0.4939
cosine_accuracy@10 0.5233
cosine_precision@1 0.4373
cosine_precision@3 0.4259
cosine_precision@5 0.3985
cosine_precision@10 0.3337
cosine_recall@1 0.0863
cosine_recall@3 0.2209
cosine_recall@5 0.2991
cosine_recall@10 0.3932
cosine_ndcg@10 0.4785
cosine_mrr@10 0.4594
cosine_map@100 0.5295

Information Retrieval (dim_512)

Metric Value
cosine_accuracy@1 0.4324
cosine_accuracy@3 0.457
cosine_accuracy@5 0.4865
cosine_accuracy@10 0.5184
cosine_precision@1 0.4324
cosine_precision@3 0.4169
cosine_precision@5 0.3907
cosine_precision@10 0.3312
cosine_recall@1 0.0843
cosine_recall@3 0.213
cosine_recall@5 0.2881
cosine_recall@10 0.3851
cosine_ndcg@10 0.4711
cosine_mrr@10 0.452
cosine_map@100 0.5216

Information Retrieval (dim_256)

Metric Value
cosine_accuracy@1 0.4152
cosine_accuracy@3 0.4496
cosine_accuracy@5 0.4644
cosine_accuracy@10 0.5037
cosine_precision@1 0.4152
cosine_precision@3 0.4054
cosine_precision@5 0.3799
cosine_precision@10 0.3219
cosine_recall@1 0.0803
cosine_recall@3 0.2055
cosine_recall@5 0.2793
cosine_recall@10 0.3745
cosine_ndcg@10 0.4555
cosine_mrr@10 0.4365
cosine_map@100 0.5042

Information Retrieval (dim_128)

Metric Value
cosine_accuracy@1 0.3907
cosine_accuracy@3 0.4349
cosine_accuracy@5 0.4521
cosine_accuracy@10 0.484
cosine_precision@1 0.3907
cosine_precision@3 0.3849
cosine_precision@5 0.3661
cosine_precision@10 0.3128
cosine_recall@1 0.0746
cosine_recall@3 0.1916
cosine_recall@5 0.2642
cosine_recall@10 0.358
cosine_ndcg@10 0.4362
cosine_mrr@10 0.4142
cosine_map@100 0.4853

Information Retrieval (dim_64)

Metric Value
cosine_accuracy@1 0.3735
cosine_accuracy@3 0.4103
cosine_accuracy@5 0.4275
cosine_accuracy@10 0.484
cosine_precision@1 0.3735
cosine_precision@3 0.3653
cosine_precision@5 0.3455
cosine_precision@10 0.3037
cosine_recall@1 0.0716
cosine_recall@3 0.1824
cosine_recall@5 0.2494
cosine_recall@10 0.3544
cosine_ndcg@10 0.4222
cosine_mrr@10 0.3975
cosine_map@100 0.4701
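
The five tables above report the same retrieval evaluation at the five Matryoshka dimensions, in the order 768, 512, 256, 128 and 64; their ndcg@10 values match the dim_*_cosine_ndcg@10 columns in the final row of the training logs below. To retrieve with a truncated embedding at inference time, Sentence Transformers accepts a truncate_dim argument; a minimal sketch:

from sentence_transformers import SentenceTransformer

# encode() now returns only the first 256 dimensions of each embedding,
# trading some ndcg@10 for 4x smaller vectors. Truncated vectors are no
# longer exactly unit-length, so compare them with cosine similarity.
model_256 = SentenceTransformer("IoannisKat1/bge-m3-ft-new", truncate_dim=256)
embeddings = model_256.encode(["a sample sentence"])
print(embeddings.shape)  # (1, 256)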

Training Details

Training Dataset

Unnamed Dataset

  • Size: 1,627 training samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    anchor:   string; min 7 tokens, mean 17.42 tokens, max 43 tokens
    positive: string; min 36 tokens, mean 729.41 tokens, max 2824 tokens
  • Samples:
    anchor: Who is responsible for monitoring the application of this Regulation?
    positive: 1. The Board shall ensure the consistent application of this Regulation. To that end, the Board shall, on its own initiative or, where relevant, at the request of the Commission, in particular: (a) monitor and ensure the correct application of this Regulation in the cases provided for in Articles 64 and 65 without prejudice to the tasks of national supervisory authorities; (b) advise the Commission on any issue related to the protection of personal data in the Union, including on any proposed amendment of this Regulation; (c) advise the Commission on the format and procedures for the exchange of information between controllers, processors and supervisory authorities for binding corporate rules; (d) issue guidelines, recommendations, and best practices on procedures for erasing links, copies or replications of personal data from publicly available communication services as referred to in Article 17(2); (e) examine, on its own initiative, on request of one of its ...

    anchor: What should European statistics be developed, produced, and disseminated in accordance with?
    positive: The confidential information which the Union and national statistical authorities collect for the production of official European and official national statistics should be protected. European statistics should be developed, produced and disseminated in accordance with the statistical principles as set out in Article 338(2) TFEU, while national statistics should also comply with Member State law. Regulation (EC) No 223/2009 of the European Parliament and of the Council (2) provides further specifications on statistical confidentiality for European statistics.

    anchor: What is evaluated in terms of adherence to approved codes of conduct?
    positive: 1. Each supervisory authority shall ensure that the imposition of administrative fines pursuant to this Article in respect of infringements of this Regulation referred to in paragraphs 4, 5 and 6 shall in each individual case be effective, proportionate and dissuasive. 2. Administrative fines shall, depending on the circumstances of each individual case, be imposed in addition to, or instead of, measures referred to in points (a) to (h) and (j) of Article 58(2). When deciding whether to impose an administrative fine and deciding on the amount of the administrative fine in each individual case due regard shall be given to the following: (a) the nature, gravity and duration of the infringement taking into account the nature scope or purpose of the processing concerned as well as the number of data subjects affected and the level of damage suffered by them; (b) the intentional or negligent character of the infringement; (c) any action taken by the controller or processor to mitigate the ...
  • Loss: MatryoshkaLoss with these parameters (see the construction sketch after this list):
    {
        "loss": "MultipleNegativesRankingLoss",
        "matryoshka_dims": [
            768,
            512,
            256,
            128,
            64
        ],
        "matryoshka_weights": [
            1,
            1,
            1,
            1,
            1
        ],
        "n_dims_per_step": -1
    }
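
A minimal sketch of wiring anchor/positive pairs like the samples above into this loss configuration, assuming training starts from the base BAAI/bge-m3 checkpoint:

from datasets import Dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-m3")

# Two-column (anchor, positive) dataset in the layout described above;
# the single pair here is just a placeholder
train_dataset = Dataset.from_dict({
    "anchor": ["Who is responsible for monitoring the application of this Regulation?"],
    "positive": ["1. The Board shall ensure the consistent application of this Regulation..."],
})

# In-batch negatives ranking loss, wrapped so that the first 768/512/256/128/64
# dimensions of each embedding are trained to work as standalone embeddings
inner_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(model, inner_loss, matryoshka_dims=[768, 512, 256, 128, 64])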
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: epoch
  • per_device_train_batch_size: 2
  • per_device_eval_batch_size: 4
  • gradient_accumulation_steps: 4
  • learning_rate: 2e-05
  • num_train_epochs: 10
  • lr_scheduler_type: cosine
  • warmup_ratio: 0.1
  • bf16: True
  • tf32: True
  • load_best_model_at_end: True
  • optim: adamw_torch_fused
  • batch_sampler: no_duplicates
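
A minimal sketch of expressing these non-default values with the library's trainer API; the output directory is a placeholder, and save_strategy="epoch" is an assumption added so that load_best_model_at_end lines up with the epoch-level evaluation:

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="bge-m3-ft",                     # placeholder path
    eval_strategy="epoch",
    save_strategy="epoch",                      # assumed; needed by load_best_model_at_end
    per_device_train_batch_size=2,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=4,              # effective train batch size of 8
    learning_rate=2e-5,
    num_train_epochs=10,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,
    tf32=True,
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # no repeated texts within a batch
)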

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: epoch
  • prediction_loss_only: True
  • per_device_train_batch_size: 2
  • per_device_eval_batch_size: 4
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 4
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 10
  • max_steps: -1
  • lr_scheduler_type: cosine
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: True
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • tp_size: 0
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss dim_768_cosine_ndcg@10 dim_512_cosine_ndcg@10 dim_256_cosine_ndcg@10 dim_128_cosine_ndcg@10 dim_64_cosine_ndcg@10
0.0049 1 2.1126 - - - - -
0.0098 2 8.6334 - - - - -
0.0147 3 8.2284 - - - - -
0.0197 4 8.0839 - - - - -
0.0246 5 9.525 - - - - -
0.0295 6 1.5607 - - - - -
0.0344 7 4.5191 - - - - -
0.0393 8 4.0471 - - - - -
0.0442 9 1.2372 - - - - -
0.0491 10 7.5335 - - - - -
0.0541 11 2.9568 - - - - -
0.0590 12 9.9398 - - - - -
0.0639 13 3.0405 - - - - -
0.0688 14 3.304 - - - - -
0.0737 15 1.7575 - - - - -
0.0786 16 5.9459 - - - - -
0.0835 17 6.5538 - - - - -
0.0885 18 1.1162 - - - - -
0.0934 19 5.7308 - - - - -
0.0983 20 2.9328 - - - - -
0.1032 21 1.4039 - - - - -
0.1081 22 1.3671 - - - - -
0.1130 23 1.0179 - - - - -
0.1179 24 3.9188 - - - - -
0.1229 25 2.4424 - - - - -
0.1278 26 2.6898 - - - - -
0.1327 27 1.3133 - - - - -
0.1376 28 2.9771 - - - - -
0.1425 29 3.1579 - - - - -
0.1474 30 0.9947 - - - - -
0.1523 31 3.7173 - - - - -
0.1572 32 8.2647 - - - - -
0.1622 33 2.6619 - - - - -
0.1671 34 1.4925 - - - - -
0.1720 35 3.4065 - - - - -
0.1769 36 1.7833 - - - - -
0.1818 37 3.2862 - - - - -
0.1867 38 8.176 - - - - -
0.1916 39 2.8571 - - - - -
0.1966 40 4.3703 - - - - -
0.2015 41 1.3122 - - - - -
0.2064 42 6.6555 - - - - -
0.2113 43 9.1111 - - - - -
0.2162 44 6.1262 - - - - -
0.2211 45 0.7347 - - - - -
0.2260 46 1.0265 - - - - -
0.2310 47 6.0457 - - - - -
0.2359 48 9.6469 - - - - -
0.2408 49 1.8979 - - - - -
0.2457 50 3.5441 - - - - -
0.2506 51 2.7489 - - - - -
0.2555 52 2.1327 - - - - -
0.2604 53 1.642 - - - - -
0.2654 54 2.2627 - - - - -
0.2703 55 0.3302 - - - - -
0.2752 56 4.9844 - - - - -
0.2801 57 7.8189 - - - - -
0.2850 58 1.6654 - - - - -
0.2899 59 9.5835 - - - - -
0.2948 60 15.5611 - - - - -
0.2998 61 6.318 - - - - -
0.3047 62 0.7203 - - - - -
0.3096 63 7.7936 - - - - -
0.3145 64 0.2338 - - - - -
0.3194 65 3.5607 - - - - -
0.3243 66 3.2031 - - - - -
0.3292 67 2.0295 - - - - -
0.3342 68 1.2046 - - - - -
0.3391 69 1.9642 - - - - -
0.3440 70 3.3386 - - - - -
0.3489 71 0.2486 - - - - -
0.3538 72 3.8262 - - - - -
0.3587 73 5.0852 - - - - -
0.3636 74 6.9813 - - - - -
0.3686 75 1.6531 - - - - -
0.3735 76 3.2036 - - - - -
0.3784 77 1.6394 - - - - -
0.3833 78 0.5295 - - - - -
0.3882 79 4.7569 - - - - -
0.3931 80 0.4613 - - - - -
0.3980 81 3.2973 - - - - -
0.4029 82 4.2086 - - - - -
0.4079 83 1.8324 - - - - -
0.4128 84 1.4644 - - - - -
0.4177 85 8.7691 - - - - -
0.4226 86 2.0521 - - - - -
0.4275 87 1.1568 - - - - -
0.4324 88 4.8268 - - - - -
0.4373 89 3.0676 - - - - -
0.4423 90 1.4163 - - - - -
0.4472 91 1.809 - - - - -
0.4521 92 5.6695 - - - - -
0.4570 93 8.6196 - - - - -
0.4619 94 3.991 - - - - -
0.4668 95 1.4334 - - - - -
0.4717 96 0.843 - - - - -
0.4767 97 2.0174 - - - - -
0.4816 98 1.41 - - - - -
0.4865 99 1.474 - - - - -
0.4914 100 1.0071 - - - - -
0.4963 101 9.759 - - - - -
0.5012 102 1.3183 - - - - -
0.5061 103 6.5495 - - - - -
0.5111 104 6.7574 - - - - -
0.5160 105 5.2306 - - - - -
0.5209 106 9.8882 - - - - -
0.5258 107 5.6908 - - - - -
0.5307 108 0.204 - - - - -
0.5356 109 0.8722 - - - - -
0.5405 110 7.1935 - - - - -
0.5455 111 1.0109 - - - - -
0.5504 112 10.1742 - - - - -
0.5553 113 2.2728 - - - - -
0.5602 114 5.1402 - - - - -
0.5651 115 5.2589 - - - - -
0.5700 116 2.4842 - - - - -
0.5749 117 1.3957 - - - - -
0.5799 118 7.7296 - - - - -
0.5848 119 4.5689 - - - - -
0.5897 120 2.8115 - - - - -
0.5946 121 2.498 - - - - -
0.5995 122 0.4326 - - - - -
0.6044 123 1.403 - - - - -
0.6093 124 2.4042 - - - - -
0.6143 125 1.7056 - - - - -
0.6192 126 0.4784 - - - - -
0.6241 127 14.7394 - - - - -
0.6290 128 4.8974 - - - - -
0.6339 129 9.9614 - - - - -
0.6388 130 6.626 - - - - -
0.6437 131 2.1262 - - - - -
0.6486 132 5.0936 - - - - -
0.6536 133 4.5314 - - - - -
0.6585 134 0.6033 - - - - -
0.6634 135 4.8067 - - - - -
0.6683 136 0.3481 - - - - -
0.6732 137 1.2218 - - - - -
0.6781 138 1.0191 - - - - -
0.6830 139 0.366 - - - - -
0.6880 140 3.331 - - - - -
0.6929 141 0.5188 - - - - -
0.6978 142 5.6721 - - - - -
0.7027 143 1.5885 - - - - -
0.7076 144 2.7388 - - - - -
0.7125 145 11.0374 - - - - -
0.7174 146 3.0557 - - - - -
0.7224 147 0.0759 - - - - -
0.7273 148 0.8796 - - - - -
0.7322 149 4.3239 - - - - -
0.7371 150 6.7367 - - - - -
0.7420 151 0.5102 - - - - -
0.7469 152 2.8752 - - - - -
0.7518 153 3.2807 - - - - -
0.7568 154 0.7898 - - - - -
0.7617 155 2.3934 - - - - -
0.7666 156 6.4737 - - - - -
0.7715 157 10.7761 - - - - -
0.7764 158 1.5748 - - - - -
0.7813 159 2.9404 - - - - -
0.7862 160 3.3701 - - - - -
0.7912 161 1.4911 - - - - -
0.7961 162 1.059 - - - - -
0.8010 163 0.2009 - - - - -
0.8059 164 4.3053 - - - - -
0.8108 165 8.1984 - - - - -
0.8157 166 1.4482 - - - - -
0.8206 167 6.9033 - - - - -
0.8256 168 0.1755 - - - - -
0.8305 169 3.1569 - - - - -
0.8354 170 10.1853 - - - - -
0.8403 171 4.021 - - - - -
0.8452 172 1.2394 - - - - -
0.8501 173 2.2133 - - - - -
0.8550 174 2.8847 - - - - -
0.8600 175 3.6311 - - - - -
0.8649 176 0.3839 - - - - -
0.8698 177 4.1139 - - - - -
0.8747 178 1.5477 - - - - -
0.8796 179 14.3119 - - - - -
0.8845 180 2.9921 - - - - -
0.8894 181 2.6819 - - - - -
0.8943 182 0.4497 - - - - -
0.8993 183 0.9949 - - - - -
0.9042 184 0.1105 - - - - -
0.9091 185 1.6702 - - - - -
0.9140 186 3.0496 - - - - -
0.9189 187 13.805 - - - - -
0.9238 188 4.3785 - - - - -
0.9287 189 3.3977 - - - - -
0.9337 190 9.0982 - - - - -
0.9386 191 3.6429 - - - - -
0.9435 192 0.0417 - - - - -
0.9484 193 6.3926 - - - - -
0.9533 194 1.2612 - - - - -
0.9582 195 4.2172 - - - - -
0.9631 196 1.7142 - - - - -
0.9681 197 3.8045 - - - - -
0.9730 198 5.5675 - - - - -
0.9779 199 2.0159 - - - - -
0.9828 200 0.3509 - - - - -
0.9877 201 3.0548 - - - - -
0.9926 202 6.3828 - - - - -
0.9975 203 1.7689 - - - - -
1.0 204 0.0099 0.4091 0.4105 0.3859 0.3417 0.2651
1.0049 205 3.7857 - - - - -
1.0098 206 3.7477 - - - - -
1.0147 207 0.1039 - - - - -
1.0197 208 1.9289 - - - - -
1.0246 209 4.0245 - - - - -
1.0295 210 0.0435 - - - - -
1.0344 211 0.1228 - - - - -
1.0393 212 4.9706 - - - - -
1.0442 213 0.0138 - - - - -
1.0491 214 0.2759 - - - - -
1.0541 215 2.7098 - - - - -
1.0590 216 2.1365 - - - - -
1.0639 217 1.4398 - - - - -
1.0688 218 0.9605 - - - - -
1.0737 219 0.4798 - - - - -
1.0786 220 0.8345 - - - - -
1.0835 221 0.1061 - - - - -
1.0885 222 1.7815 - - - - -
1.0934 223 2.3127 - - - - -
1.0983 224 1.9179 - - - - -
1.1032 225 1.4213 - - - - -
1.1081 226 1.0013 - - - - -
1.1130 227 0.3178 - - - - -
1.1179 228 0.365 - - - - -
1.1229 229 1.2156 - - - - -
1.1278 230 5.9569 - - - - -
1.1327 231 12.2159 - - - - -
1.1376 232 0.4907 - - - - -
1.1425 233 4.4903 - - - - -
1.1474 234 0.0977 - - - - -
1.1523 235 3.0879 - - - - -
1.1572 236 3.836 - - - - -
1.1622 237 1.3545 - - - - -
1.1671 238 2.8771 - - - - -
1.1720 239 2.591 - - - - -
1.1769 240 0.3288 - - - - -
1.1818 241 3.2974 - - - - -
1.1867 242 0.2526 - - - - -
1.1916 243 0.65 - - - - -
1.1966 244 0.3477 - - - - -
1.2015 245 0.6088 - - - - -
1.2064 246 0.5337 - - - - -
1.2113 247 0.1657 - - - - -
1.2162 248 0.1702 - - - - -
1.2211 249 0.724 - - - - -
1.2260 250 1.6809 - - - - -
1.2310 251 2.1363 - - - - -
1.2359 252 0.5129 - - - - -
1.2408 253 2.8669 - - - - -
1.2457 254 5.3834 - - - - -
1.2506 255 0.2463 - - - - -
1.2555 256 1.6799 - - - - -
1.2604 257 3.9336 - - - - -
1.2654 258 1.3616 - - - - -
1.2703 259 8.8923 - - - - -
1.2752 260 0.1433 - - - - -
1.2801 261 5.4949 - - - - -
1.2850 262 0.0395 - - - - -
1.2899 263 0.828 - - - - -
1.2948 264 4.5095 - - - - -
1.2998 265 0.3675 - - - - -
1.3047 266 0.8749 - - - - -
1.3096 267 0.9569 - - - - -
1.3145 268 3.8587 - - - - -
1.3194 269 0.2796 - - - - -
1.3243 270 0.0195 - - - - -
1.3292 271 3.5645 - - - - -
1.3342 272 0.0594 - - - - -
1.3391 273 5.8937 - - - - -
1.3440 274 0.3646 - - - - -
1.3489 275 7.9112 - - - - -
1.3538 276 1.943 - - - - -
1.3587 277 0.8289 - - - - -
1.3636 278 0.2369 - - - - -
1.3686 279 2.7562 - - - - -
1.3735 280 0.0293 - - - - -
1.3784 281 4.2874 - - - - -
1.3833 282 0.5958 - - - - -
1.3882 283 12.9377 - - - - -
1.3931 284 4.748 - - - - -
1.3980 285 1.9095 - - - - -
1.4029 286 2.2069 - - - - -
1.4079 287 2.627 - - - - -
1.4128 288 0.0352 - - - - -
1.4177 289 0.8522 - - - - -
1.4226 290 0.2362 - - - - -
1.4275 291 0.5066 - - - - -
1.4324 292 0.1919 - - - - -
1.4373 293 2.9645 - - - - -
1.4423 294 1.2498 - - - - -
1.4472 295 0.2644 - - - - -
1.4521 296 3.2783 - - - - -
1.4570 297 0.8226 - - - - -
1.4619 298 0.2221 - - - - -
1.4668 299 1.2149 - - - - -
1.4717 300 0.5072 - - - - -
1.4767 301 0.5181 - - - - -
1.4816 302 0.937 - - - - -
1.4865 303 2.1448 - - - - -
1.4914 304 0.1876 - - - - -
1.4963 305 6.4399 - - - - -
1.5012 306 0.0395 - - - - -
1.5061 307 1.9102 - - - - -
1.5111 308 0.0729 - - - - -
1.5160 309 0.2241 - - - - -
1.5209 310 6.0667 - - - - -
1.5258 311 0.0848 - - - - -
1.5307 312 0.8551 - - - - -
1.5356 313 0.0216 - - - - -
1.5405 314 0.025 - - - - -
1.5455 315 17.4796 - - - - -
1.5504 316 0.2316 - - - - -
1.5553 317 2.9799 - - - - -
1.5602 318 6.4386 - - - - -
1.5651 319 1.1931 - - - - -
1.5700 320 1.9184 - - - - -
1.5749 321 12.6412 - - - - -
1.5799 322 4.9206 - - - - -
1.5848 323 0.1121 - - - - -
1.5897 324 10.8083 - - - - -
1.5946 325 3.1304 - - - - -
1.5995 326 4.435 - - - - -
1.6044 327 17.1776 - - - - -
1.6093 328 0.0288 - - - - -
1.6143 329 5.829 - - - - -
1.6192 330 6.4497 - - - - -
1.6241 331 0.518 - - - - -
1.6290 332 0.27 - - - - -
1.6339 333 0.0485 - - - - -
1.6388 334 0.1782 - - - - -
1.6437 335 0.0779 - - - - -
1.6486 336 1.8905 - - - - -
1.6536 337 11.1906 - - - - -
1.6585 338 0.0944 - - - - -
1.6634 339 9.5066 - - - - -
1.6683 340 0.0802 - - - - -
1.6732 341 5.3741 - - - - -
1.6781 342 0.3266 - - - - -
1.6830 343 9.735 - - - - -
1.6880 344 8.7255 - - - - -
1.6929 345 4.9033 - - - - -
1.6978 346 0.6812 - - - - -
1.7027 347 0.4711 - - - - -
1.7076 348 0.6801 - - - - -
1.7125 349 9.1073 - - - - -
1.7174 350 1.8518 - - - - -
1.7224 351 3.0041 - - - - -
1.7273 352 0.074 - - - - -
1.7322 353 0.3182 - - - - -
1.7371 354 1.0006 - - - - -
1.7420 355 6.0453 - - - - -
1.7469 356 10.938 - - - - -
1.7518 357 0.3152 - - - - -
1.7568 358 3.1026 - - - - -
1.7617 359 1.0964 - - - - -
1.7666 360 2.2737 - - - - -
1.7715 361 0.8961 - - - - -
1.7764 362 5.6047 - - - - -
1.7813 363 10.9563 - - - - -
1.7862 364 5.2848 - - - - -
1.7912 365 0.0042 - - - - -
1.7961 366 1.3142 - - - - -
1.8010 367 4.5506 - - - - -
1.8059 368 0.7704 - - - - -
1.8108 369 5.4817 - - - - -
1.8157 370 1.8137 - - - - -
1.8206 371 1.4185 - - - - -
1.8256 372 11.2898 - - - - -
1.8305 373 2.7519 - - - - -
1.8354 374 0.5657 - - - - -
1.8403 375 1.7001 - - - - -
1.8452 376 0.6031 - - - - -
1.8501 377 0.1446 - - - - -
1.8550 378 3.34 - - - - -
1.8600 379 3.4178 - - - - -
1.8649 380 0.3415 - - - - -
1.8698 381 0.6753 - - - - -
1.8747 382 0.0068 - - - - -
1.8796 383 14.0264 - - - - -
1.8845 384 0.2598 - - - - -
1.8894 385 0.0627 - - - - -
1.8943 386 0.1462 - - - - -
1.8993 387 0.0358 - - - - -
1.9042 388 2.2702 - - - - -
1.9091 389 17.183 - - - - -
1.9140 390 0.0035 - - - - -
1.9189 391 7.0306 - - - - -
1.9238 392 3.5389 - - - - -
1.9287 393 3.0237 - - - - -
1.9337 394 1.9053 - - - - -
1.9386 395 0.3276 - - - - -
1.9435 396 1.8516 - - - - -
1.9484 397 0.1166 - - - - -
1.9533 398 3.4987 - - - - -
1.9582 399 1.5579 - - - - -
1.9631 400 0.0375 - - - - -
1.9681 401 1.8356 - - - - -
1.9730 402 1.6129 - - - - -
1.9779 403 0.0597 - - - - -
1.9828 404 2.4842 - - - - -
1.9877 405 0.9398 - - - - -
1.9926 406 4.4135 - - - - -
1.9975 407 0.8529 - - - - -
2.0 408 0.0555 0.4050 0.4000 0.3685 0.3534 0.2996
2.0049 409 6.4779 - - - - -
2.0098 410 0.1078 - - - - -
2.0147 411 0.9369 - - - - -
2.0197 412 1.3398 - - - - -
2.0246 413 3.2537 - - - - -
2.0295 414 0.7063 - - - - -
2.0344 415 1.4912 - - - - -
2.0393 416 0.5246 - - - - -
2.0442 417 2.5067 - - - - -
2.0491 418 0.1324 - - - - -
2.0541 419 2.7333 - - - - -
2.0590 420 0.1247 - - - - -
2.0639 421 0.1794 - - - - -
2.0688 422 5.055 - - - - -
2.0737 423 0.1422 - - - - -
2.0786 424 1.511 - - - - -
2.0835 425 0.5098 - - - - -
2.0885 426 9.1543 - - - - -
2.0934 427 0.0017 - - - - -
2.0983 428 0.8768 - - - - -
2.1032 429 0.6347 - - - - -
2.1081 430 0.062 - - - - -
2.1130 431 1.6098 - - - - -
2.1179 432 2.6084 - - - - -
2.1229 433 4.9172 - - - - -
2.1278 434 0.02 - - - - -
2.1327 435 1.5805 - - - - -
2.1376 436 3.6176 - - - - -
2.1425 437 2.8269 - - - - -
2.1474 438 1.4134 - - - - -
2.1523 439 0.8755 - - - - -
2.1572 440 2.6434 - - - - -
2.1622 441 10.6355 - - - - -
2.1671 442 0.0964 - - - - -
2.1720 443 0.0223 - - - - -
2.1769 444 5.9773 - - - - -
2.1818 445 2.8012 - - - - -
2.1867 446 0.0054 - - - - -
2.1916 447 0.0303 - - - - -
2.1966 448 0.8663 - - - - -
2.2015 449 0.0423 - - - - -
2.2064 450 8.415 - - - - -
2.2113 451 8.1867 - - - - -
2.2162 452 1.0138 - - - - -
2.2211 453 6.004 - - - - -
2.2260 454 1.2642 - - - - -
2.2310 455 1.0424 - - - - -
2.2359 456 2.9422 - - - - -
2.2408 457 0.2013 - - - - -
2.2457 458 1.1042 - - - - -
2.2506 459 1.7342 - - - - -
2.2555 460 0.0039 - - - - -
2.2604 461 0.5644 - - - - -
2.2654 462 6.9326 - - - - -
2.2703 463 0.1054 - - - - -
2.2752 464 0.1134 - - - - -
2.2801 465 0.281 - - - - -
2.2850 466 0.0216 - - - - -
2.2899 467 1.5359 - - - - -
2.2948 468 0.066 - - - - -
2.2998 469 0.0401 - - - - -
2.3047 470 0.9936 - - - - -
2.3096 471 0.0539 - - - - -
2.3145 472 0.5133 - - - - -
2.3194 473 6.1059 - - - - -
2.3243 474 0.038 - - - - -
2.3292 475 0.0854 - - - - -
2.3342 476 0.0276 - - - - -
2.3391 477 0.039 - - - - -
2.3440 478 0.0163 - - - - -
2.3489 479 0.0204 - - - - -
2.3538 480 0.1532 - - - - -
2.3587 481 1.7098 - - - - -
2.3636 482 0.2531 - - - - -
2.3686 483 0.3448 - - - - -
2.3735 484 0.0023 - - - - -
2.3784 485 0.0148 - - - - -
2.3833 486 25.1183 - - - - -
2.3882 487 0.1733 - - - - -
2.3931 488 0.0948 - - - - -
2.3980 489 0.0019 - - - - -
2.4029 490 5.0696 - - - - -
2.4079 491 0.0803 - - - - -
2.4128 492 2.7643 - - - - -
2.4177 493 0.0711 - - - - -
2.4226 494 0.3009 - - - - -
2.4275 495 1.9427 - - - - -
2.4324 496 0.2456 - - - - -
2.4373 497 2.7995 - - - - -
2.4423 498 0.0666 - - - - -
2.4472 499 0.5429 - - - - -
2.4521 500 0.7347 - - - - -
2.4570 501 1.507 - - - - -
2.4619 502 0.1055 - - - - -
2.4668 503 1.0503 - - - - -
2.4717 504 0.0046 - - - - -
2.4767 505 0.0089 - - - - -
2.4816 506 3.0963 - - - - -
2.4865 507 0.0141 - - - - -
2.4914 508 0.0219 - - - - -
2.4963 509 1.7336 - - - - -
2.5012 510 0.0162 - - - - -
2.5061 511 0.0275 - - - - -
2.5111 512 0.5769 - - - - -
2.5160 513 0.1312 - - - - -
2.5209 514 0.4667 - - - - -
2.5258 515 24.2047 - - - - -
2.5307 516 0.0749 - - - - -
2.5356 517 0.0788 - - - - -
2.5405 518 0.8379 - - - - -
2.5455 519 3.0144 - - - - -
2.5504 520 4.8283 - - - - -
2.5553 521 0.4814 - - - - -
2.5602 522 2.7476 - - - - -
2.5651 523 0.2397 - - - - -
2.5700 524 0.0043 - - - - -
2.5749 525 6.0072 - - - - -
2.5799 526 0.3147 - - - - -
2.5848 527 1.1259 - - - - -
2.5897 528 0.3375 - - - - -
2.5946 529 4.9853 - - - - -
2.5995 530 0.1985 - - - - -
2.6044 531 0.143 - - - - -
2.6093 532 0.031 - - - - -
2.6143 533 1.2722 - - - - -
2.6192 534 0.0658 - - - - -
2.6241 535 2.1504 - - - - -
2.6290 536 0.0042 - - - - -
2.6339 537 0.1536 - - - - -
2.6388 538 4.659 - - - - -
2.6437 539 4.3203 - - - - -
2.6486 540 8.4674 - - - - -
2.6536 541 0.1089 - - - - -
2.6585 542 0.0138 - - - - -
2.6634 543 2.3114 - - - - -
2.6683 544 0.0151 - - - - -
2.6732 545 5.4184 - - - - -
2.6781 546 3.0697 - - - - -
2.6830 547 8.4918 - - - - -
2.6880 548 0.09 - - - - -
2.6929 549 0.7371 - - - - -
2.6978 550 0.0025 - - - - -
2.7027 551 0.0232 - - - - -
2.7076 552 1.4486 - - - - -
2.7125 553 0.7637 - - - - -
2.7174 554 0.5324 - - - - -
2.7224 555 15.0489 - - - - -
2.7273 556 0.0422 - - - - -
2.7322 557 5.4301 - - - - -
2.7371 558 0.3509 - - - - -
2.7420 559 0.1936 - - - - -
2.7469 560 0.1582 - - - - -
2.7518 561 3.1946 - - - - -
2.7568 562 0.0097 - - - - -
2.7617 563 0.0353 - - - - -
2.7666 564 3.6379 - - - - -
2.7715 565 0.0123 - - - - -
2.7764 566 0.0222 - - - - -
2.7813 567 0.0079 - - - - -
2.7862 568 0.1142 - - - - -
2.7912 569 3.3224 - - - - -
2.7961 570 1.5419 - - - - -
2.8010 571 0.2215 - - - - -
2.8059 572 5.9916 - - - - -
2.8108 573 1.9149 - - - - -
2.8157 574 12.2268 - - - - -
2.8206 575 0.1859 - - - - -
2.8256 576 8.719 - - - - -
2.8305 577 0.1879 - - - - -
2.8354 578 0.0797 - - - - -
2.8403 579 0.3781 - - - - -
2.8452 580 0.047 - - - - -
2.8501 581 0.6769 - - - - -
2.8550 582 0.11 - - - - -
2.8600 583 8.1354 - - - - -
2.8649 584 1.125 - - - - -
2.8698 585 3.3 - - - - -
2.8747 586 3.2444 - - - - -
2.8796 587 0.0642 - - - - -
2.8845 588 0.017 - - - - -
2.8894 589 10.6164 - - - - -
2.8943 590 0.885 - - - - -
2.8993 591 5.437 - - - - -
2.9042 592 12.8624 - - - - -
2.9091 593 0.0331 - - - - -
2.9140 594 1.9041 - - - - -
2.9189 595 0.0406 - - - - -
2.9238 596 0.0947 - - - - -
2.9287 597 0.2682 - - - - -
2.9337 598 4.0826 - - - - -
2.9386 599 0.0041 - - - - -
2.9435 600 0.0244 - - - - -
2.9484 601 0.2647 - - - - -
2.9533 602 4.1754 - - - - -
2.9582 603 0.9577 - - - - -
2.9631 604 2.1507 - - - - -
2.9681 605 11.0173 - - - - -
2.9730 606 0.0023 - - - - -
2.9779 607 1.4786 - - - - -
2.9828 608 0.9823 - - - - -
2.9877 609 1.4711 - - - - -
2.9926 610 0.6055 - - - - -
2.9975 611 0.4423 - - - - -
3.0 612 4.3458 0.4805 0.4756 0.4405 0.4046 0.3865
3.0049 613 0.0673 - - - - -
3.0098 614 0.4593 - - - - -
3.0147 615 0.045 - - - - -
3.0197 616 1.473 - - - - -
3.0246 617 5.4872 - - - - -
3.0295 618 0.0125 - - - - -
3.0344 619 4.1117 - - - - -
3.0393 620 0.5648 - - - - -
3.0442 621 0.2546 - - - - -
3.0491 622 1.8912 - - - - -
3.0541 623 0.0366 - - - - -
3.0590 624 0.0343 - - - - -
3.0639 625 2.3127 - - - - -
3.0688 626 0.0717 - - - - -
3.0737 627 0.6625 - - - - -
3.0786 628 0.016 - - - - -
3.0835 629 23.2323 - - - - -
3.0885 630 0.1324 - - - - -
3.0934 631 0.1937 - - - - -
3.0983 632 6.0085 - - - - -
3.1032 633 0.0487 - - - - -
3.1081 634 15.3035 - - - - -
3.1130 635 0.1268 - - - - -
3.1179 636 0.3022 - - - - -
3.1229 637 2.0766 - - - - -
3.1278 638 2.0618 - - - - -
3.1327 639 5.9734 - - - - -
3.1376 640 8.3862 - - - - -
3.1425 641 0.0284 - - - - -
3.1474 642 0.1016 - - - - -
3.1523 643 0.3523 - - - - -
3.1572 644 0.2117 - - - - -
3.1622 645 2.2211 - - - - -
3.1671 646 0.5002 - - - - -
3.1720 647 0.0016 - - - - -
3.1769 648 0.0077 - - - - -
3.1818 649 0.4822 - - - - -
3.1867 650 0.0004 - - - - -
3.1916 651 0.0275 - - - - -
3.1966 652 0.0907 - - - - -
3.2015 653 0.0751 - - - - -
3.2064 654 0.0147 - - - - -
3.2113 655 0.4161 - - - - -
3.2162 656 3.217 - - - - -
3.2211 657 0.0028 - - - - -
3.2260 658 0.0028 - - - - -
3.2310 659 0.0079 - - - - -
3.2359 660 0.0083 - - - - -
3.2408 661 0.0379 - - - - -
3.2457 662 0.0294 - - - - -
3.2506 663 0.1602 - - - - -
3.2555 664 0.0018 - - - - -
3.2604 665 0.0035 - - - - -
3.2654 666 0.387 - - - - -
3.2703 667 0.0984 - - - - -
3.2752 668 4.7913 - - - - -
3.2801 669 0.1947 - - - - -
3.2850 670 3.7142 - - - - -
3.2899 671 0.096 - - - - -
3.2948 672 0.0027 - - - - -
3.2998 673 0.0359 - - - - -
3.3047 674 0.6312 - - - - -
3.3096 675 8.0573 - - - - -
3.3145 676 2.9999 - - - - -
3.3194 677 0.1601 - - - - -
3.3243 678 3.4294 - - - - -
3.3292 679 0.0017 - - - - -
3.3342 680 0.0068 - - - - -
3.3391 681 1.7632 - - - - -
3.3440 682 0.0085 - - - - -
3.3489 683 0.7914 - - - - -
3.3538 684 0.0857 - - - - -
3.3587 685 0.0445 - - - - -
3.3636 686 0.0049 - - - - -
3.3686 687 0.0697 - - - - -
3.3735 688 0.1631 - - - - -
3.3784 689 0.0157 - - - - -
3.3833 690 0.0021 - - - - -
3.3882 691 0.0015 - - - - -
3.3931 692 0.3901 - - - - -
3.3980 693 1.7331 - - - - -
3.4029 694 6.0614 - - - - -
3.4079 695 0.0287 - - - - -
3.4128 696 2.9112 - - - - -
3.4177 697 0.0373 - - - - -
3.4226 698 0.0298 - - - - -
3.4275 699 3.69 - - - - -
3.4324 700 0.0247 - - - - -
3.4373 701 0.0097 - - - - -
3.4423 702 7.6551 - - - - -
3.4472 703 1.6144 - - - - -
3.4521 704 0.0964 - - - - -
3.4570 705 0.1756 - - - - -
3.4619 706 0.0834 - - - - -
3.4668 707 0.0264 - - - - -
3.4717 708 0.7804 - - - - -
3.4767 709 0.0252 - - - - -
3.4816 710 9.4975 - - - - -
3.4865 711 1.8196 - - - - -
3.4914 712 0.0874 - - - - -
3.4963 713 0.0515 - - - - -
3.5012 714 2.1845 - - - - -
3.5061 715 0.0063 - - - - -
3.5111 716 0.0005 - - - - -
3.5160 717 0.0246 - - - - -
3.5209 718 0.0042 - - - - -
3.5258 719 0.2764 - - - - -
3.5307 720 0.0604 - - - - -
3.5356 721 0.5046 - - - - -
3.5405 722 0.1577 - - - - -
3.5455 723 0.3033 - - - - -
3.5504 724 0.0348 - - - - -
3.5553 725 0.1488 - - - - -
3.5602 726 0.0189 - - - - -
3.5651 727 0.0688 - - - - -
3.5700 728 0.001 - - - - -
3.5749 729 4.4539 - - - - -
3.5799 730 1.9252 - - - - -
3.5848 731 0.1565 - - - - -
3.5897 732 0.0041 - - - - -
3.5946 733 0.0026 - - - - -
3.5995 734 0.0044 - - - - -
3.6044 735 0.0666 - - - - -
3.6093 736 0.0383 - - - - -
3.6143 737 11.7611 - - - - -
3.6192 738 12.9361 - - - - -
3.6241 739 0.001 - - - - -
3.6290 740 3.1032 - - - - -
3.6339 741 0.0002 - - - - -
3.6388 742 0.1582 - - - - -
3.6437 743 0.4429 - - - - -
3.6486 744 2.7306 - - - - -
3.6536 745 0.1293 - - - - -
3.6585 746 1.9508 - - - - -
3.6634 747 0.0719 - - - - -
3.6683 748 0.0586 - - - - -
3.6732 749 3.3595 - - - - -
3.6781 750 0.8918 - - - - -
3.6830 751 0.0548 - - - - -
3.6880 752 0.2089 - - - - -
3.6929 753 4.3746 - - - - -
3.6978 754 1.923 - - - - -
3.7027 755 0.1344 - - - - -
3.7076 756 0.002 - - - - -
3.7125 757 6.0042 - - - - -
3.7174 758 0.0317 - - - - -
3.7224 759 21.2955 - - - - -
3.7273 760 0.5189 - - - - -
3.7322 761 2.1175 - - - - -
3.7371 762 7.9324 - - - - -
3.7420 763 0.0259 - - - - -
3.7469 764 0.1636 - - - - -
3.7518 765 0.037 - - - - -
3.7568 766 0.4433 - - - - -
3.7617 767 0.3925 - - - - -
3.7666 768 0.0118 - - - - -
3.7715 769 0.0039 - - - - -
3.7764 770 12.2304 - - - - -
3.7813 771 0.0159 - - - - -
3.7862 772 0.0369 - - - - -
3.7912 773 0.0076 - - - - -
3.7961 774 0.0014 - - - - -
3.8010 775 0.0441 - - - - -
3.8059 776 0.0469 - - - - -
3.8108 777 0.1142 - - - - -
3.8157 778 0.4633 - - - - -
3.8206 779 0.4934 - - - - -
3.8256 780 0.2579 - - - - -
3.8305 781 0.0166 - - - - -
3.8354 782 0.05 - - - - -
3.8403 783 3.154 - - - - -
3.8452 784 0.5013 - - - - -
3.8501 785 3.0774 - - - - -
3.8550 786 13.6522 - - - - -
3.8600 787 0.0302 - - - - -
3.8649 788 10.857 - - - - -
3.8698 789 0.0159 - - - - -
3.8747 790 2.6406 - - - - -
3.8796 791 0.0053 - - - - -
3.8845 792 0.0013 - - - - -
3.8894 793 0.0158 - - - - -
3.8943 794 0.1813 - - - - -
3.8993 795 0.0177 - - - - -
3.9042 796 7.1747 - - - - -
3.9091 797 8.2372 - - - - -
3.9140 798 0.1498 - - - - -
3.9189 799 2.8086 - - - - -
3.9238 800 0.0003 - - - - -
3.9287 801 2.1916 - - - - -
3.9337 802 0.0079 - - - - -
3.9386 803 0.0199 - - - - -
3.9435 804 0.0023 - - - - -
3.9484 805 0.017 - - - - -
3.9533 806 0.0047 - - - - -
3.9582 807 0.119 - - - - -
3.9631 808 3.4038 - - - - -
3.9681 809 0.0024 - - - - -
3.9730 810 0.0113 - - - - -
3.9779 811 3.7138 - - - - -
3.9828 812 1.1807 - - - - -
3.9877 813 7.1167 - - - - -
3.9926 814 0.6866 - - - - -
3.9975 815 0.0302 - - - - -
4.0 816 0.0 0.4828 0.4783 0.4511 0.4436 0.4257
4.0049 817 0.2861 - - - - -
4.0098 818 0.2564 - - - - -
4.0147 819 3.7804 - - - - -
4.0197 820 0.002 - - - - -
4.0246 821 0.064 - - - - -
4.0295 822 0.0054 - - - - -
4.0344 823 6.2346 - - - - -
4.0393 824 0.0931 - - - - -
4.0442 825 0.234 - - - - -
4.0491 826 0.0019 - - - - -
4.0541 827 17.2578 - - - - -
4.0590 828 0.6095 - - - - -
4.0639 829 0.1835 - - - - -
4.0688 830 1.2404 - - - - -
4.0737 831 0.0181 - - - - -
4.0786 832 0.0269 - - - - -
4.0835 833 0.0089 - - - - -
4.0885 834 5.2607 - - - - -
4.0934 835 0.0021 - - - - -
4.0983 836 0.0073 - - - - -
4.1032 837 0.0131 - - - - -
4.1081 838 0.2196 - - - - -
4.1130 839 0.0002 - - - - -
4.1179 840 1.7468 - - - - -
4.1229 841 0.0695 - - - - -
4.1278 842 0.0069 - - - - -
4.1327 843 0.0283 - - - - -
4.1376 844 0.0886 - - - - -
4.1425 845 0.0936 - - - - -
4.1474 846 0.5142 - - - - -
4.1523 847 5.2388 - - - - -
4.1572 848 0.0043 - - - - -
4.1622 849 3.6164 - - - - -
4.1671 850 0.7693 - - - - -
4.1720 851 0.0729 - - - - -
4.1769 852 8.5971 - - - - -
4.1818 853 2.8356 - - - - -
4.1867 854 0.0057 - - - - -
4.1916 855 0.0037 - - - - -
4.1966 856 0.0564 - - - - -
4.2015 857 0.0603 - - - - -
4.2064 858 3.4425 - - - - -
4.2113 859 0.0108 - - - - -
4.2162 860 0.0418 - - - - -
4.2211 861 0.0125 - - - - -
4.2260 862 0.0053 - - - - -
4.2310 863 0.0561 - - - - -
4.2359 864 0.1855 - - - - -
4.2408 865 0.0101 - - - - -
4.2457 866 4.9659 - - - - -
4.2506 867 0.0055 - - - - -
4.2555 868 8.4273 - - - - -
4.2604 869 0.0005 - - - - -
4.2654 870 0.737 - - - - -
4.2703 871 0.0038 - - - - -
4.2752 872 0.0763 - - - - -
4.2801 873 0.2328 - - - - -
4.2850 874 1.5894 - - - - -
4.2899 875 0.0121 - - - - -
4.2948 876 0.008 - - - - -
4.2998 877 1.2244 - - - - -
4.3047 878 4.2282 - - - - -
4.3096 879 0.0064 - - - - -
4.3145 880 0.9835 - - - - -
4.3194 881 0.0671 - - - - -
4.3243 882 0.003 - - - - -
4.3292 883 0.0287 - - - - -
4.3342 884 0.0061 - - - - -
4.3391 885 0.0044 - - - - -
4.3440 886 0.0705 - - - - -
4.3489 887 2.2214 - - - - -
4.3538 888 0.0004 - - - - -
4.3587 889 0.0082 - - - - -
4.3636 890 0.1958 - - - - -
4.3686 891 1.7235 - - - - -
4.3735 892 0.0167 - - - - -
4.3784 893 0.6845 - - - - -
4.3833 894 0.0001 - - - - -
4.3882 895 1.0436 - - - - -
4.3931 896 0.0112 - - - - -
4.3980 897 0.0404 - - - - -
4.4029 898 0.0005 - - - - -
4.4079 899 0.1872 - - - - -
4.4128 900 0.0002 - - - - -
4.4177 901 1.8915 - - - - -
4.4226 902 1.7094 - - - - -
4.4275 903 0.1402 - - - - -
4.4324 904 0.0643 - - - - -
4.4373 905 0.0776 - - - - -
4.4423 906 0.0005 - - - - -
4.4472 907 0.8611 - - - - -
4.4521 908 0.0003 - - - - -
4.4570 909 0.1705 - - - - -
4.4619 910 0.1037 - - - - -
4.4668 911 0.0134 - - - - -
4.4717 912 0.173 - - - - -
4.4767 913 0.0914 - - - - -
4.4816 914 0.5567 - - - - -
4.4865 915 0.0187 - - - - -
4.4914 916 0.0121 - - - - -
4.4963 917 1.0578 - - - - -
4.5012 918 0.0656 - - - - -
4.5061 919 0.0447 - - - - -
4.5111 920 0.0164 - - - - -
4.5160 921 0.0009 - - - - -
4.5209 922 1.1752 - - - - -
4.5258 923 0.0125 - - - - -
4.5307 924 0.3298 - - - - -
4.5356 925 0.0022 - - - - -
4.5405 926 0.9016 - - - - -
4.5455 927 0.0154 - - - - -
4.5504 928 0.0028 - - - - -
4.5553 929 0.0229 - - - - -
4.5602 930 0.8454 - - - - -
4.5651 931 5.024 - - - - -
4.5700 932 0.0031 - - - - -
4.5749 933 1.1479 - - - - -
4.5799 934 0.0018 - - - - -
4.5848 935 0.0457 - - - - -
4.5897 936 11.4855 - - - - -
4.5946 937 0.1458 - - - - -
4.5995 938 0.0159 - - - - -
4.6044 939 1.3953 - - - - -
4.6093 940 0.0497 - - - - -
4.6143 941 0.0192 - - - - -
4.6192 942 1.2701 - - - - -
4.6241 943 1.5943 - - - - -
4.6290 944 0.2584 - - - - -
4.6339 945 0.0015 - - - - -
4.6388 946 0.0079 - - - - -
4.6437 947 0.0221 - - - - -
4.6486 948 0.0049 - - - - -
4.6536 949 0.1198 - - - - -
4.6585 950 1.2655 - - - - -
4.6634 951 0.2401 - - - - -
4.6683 952 0.0001 - - - - -
4.6732 953 0.0 - - - - -
4.6781 954 0.203 - - - - -
4.6830 955 0.0066 - - - - -
4.6880 956 0.002 - - - - -
4.6929 957 0.0057 - - - - -
4.6978 958 0.0004 - - - - -
4.7027 959 3.0077 - - - - -
4.7076 960 0.0558 - - - - -
4.7125 961 0.0018 - - - - -
4.7174 962 0.0001 - - - - -
4.7224 963 0.818 - - - - -
4.7273 964 0.0587 - - - - -
4.7322 965 0.0785 - - - - -
4.7371 966 0.3031 - - - - -
4.7420 967 0.0018 - - - - -
4.7469 968 0.15 - - - - -
4.7518 969 1.1299 - - - - -
4.7568 970 4.6018 - - - - -
4.7617 971 0.0022 - - - - -
4.7666 972 0.0006 - - - - -
4.7715 973 0.0295 - - - - -
4.7764 974 0.4112 - - - - -
4.7813 975 3.3922 - - - - -
4.7862 976 0.0108 - - - - -
4.7912 977 0.0003 - - - - -
4.7961 978 0.003 - - - - -
4.8010 979 4.5611 - - - - -
4.8059 980 1.8257 - - - - -
4.8108 981 0.0129 - - - - -
4.8157 982 0.6645 - - - - -
4.8206 983 0.0207 - - - - -
4.8256 984 0.0032 - - - - -
4.8305 985 0.0077 - - - - -
4.8354 986 7.022 - - - - -
4.8403 987 17.4472 - - - - -
4.8452 988 0.0 - - - - -
4.8501 989 0.0042 - - - - -
4.8550 990 6.4386 - - - - -
4.8600 991 0.0182 - - - - -
4.8649 992 0.0191 - - - - -
4.8698 993 0.2168 - - - - -
4.8747 994 0.0071 - - - - -
4.8796 995 0.004 - - - - -
4.8845 996 0.0026 - - - - -
4.8894 997 0.6601 - - - - -
4.8943 998 0.0045 - - - - -
4.8993 999 0.0239 - - - - -
4.9042 1000 0.0069 - - - - -
4.9091 1001 0.0215 - - - - -
4.9140 1002 0.1398 - - - - -
4.9189 1003 0.2671 - - - - -
4.9238 1004 0.0045 - - - - -
4.9287 1005 0.0487 - - - - -
4.9337 1006 0.0024 - - - - -
4.9386 1007 0.0464 - - - - -
4.9435 1008 0.1883 - - - - -
4.9484 1009 0.0332 - - - - -
4.9533 1010 0.0235 - - - - -
4.9582 1011 0.0074 - - - - -
4.9631 1012 2.0417 - - - - -
4.9681 1013 0.0087 - - - - -
4.9730 1014 0.0375 - - - - -
4.9779 1015 0.0096 - - - - -
4.9828 1016 1.0317 - - - - -
4.9877 1017 9.0656 - - - - -
4.9926 1018 0.4875 - - - - -
4.9975 1019 0.2363 - - - - -
5.0 1020 6.9161 0.4730 0.4752 0.4446 0.4208 0.4295
5.0049 1021 0.0076 - - - - -
5.0098 1022 0.0059 - - - - -
5.0147 1023 1.8155 - - - - -
5.0197 1024 0.0035 - - - - -
5.0246 1025 0.0095 - - - - -
5.0295 1026 0.265 - - - - -
5.0344 1027 0.0682 - - - - -
5.0393 1028 0.0011 - - - - -
5.0442 1029 0.0073 - - - - -
5.0491 1030 0.002 - - - - -
5.0541 1031 0.0749 - - - - -
5.0590 1032 0.0143 - - - - -
5.0639 1033 0.1008 - - - - -
5.0688 1034 0.0003 - - - - -
5.0737 1035 2.8253 - - - - -
5.0786 1036 4.2712 - - - - -
5.0835 1037 0.0008 - - - - -
5.0885 1038 0.0056 - - - - -
5.0934 1039 1.2385 - - - - -
5.0983 1040 0.0372 - - - - -
5.1032 1041 0.4262 - - - - -
5.1081 1042 0.0012 - - - - -
5.1130 1043 0.0461 - - - - -
5.1179 1044 0.0335 - - - - -
5.1229 1045 0.0008 - - - - -
5.1278 1046 0.8225 - - - - -
5.1327 1047 0.0027 - - - - -
5.1376 1048 0.0278 - - - - -
5.1425 1049 0.0106 - - - - -
5.1474 1050 0.0086 - - - - -
5.1523 1051 0.4186 - - - - -
5.1572 1052 3.0836 - - - - -
5.1622 1053 0.1715 - - - - -
5.1671 1054 0.0007 - - - - -
5.1720 1055 0.0002 - - - - -
5.1769 1056 0.6005 - - - - -
5.1818 1057 0.0003 - - - - -
5.1867 1058 0.9117 - - - - -
5.1916 1059 0.0078 - - - - -
5.1966 1060 0.1735 - - - - -
5.2015 1061 0.013 - - - - -
5.2064 1062 0.0005 - - - - -
5.2113 1063 0.0115 - - - - -
5.2162 1064 2.2331 - - - - -
5.2211 1065 0.7744 - - - - -
5.2260 1066 0.0006 - - - - -
5.2310 1067 0.2042 - - - - -
5.2359 1068 0.0503 - - - - -
5.2408 1069 0.0079 - - - - -
5.2457 1070 0.1597 - - - - -
5.2506 1071 0.026 - - - - -
5.2555 1072 0.2249 - - - - -
5.2604 1073 0.0014 - - - - -
5.2654 1074 0.6355 - - - - -
5.2703 1075 2.9834 - - - - -
5.2752 1076 0.0379 - - - - -
5.2801 1077 0.0002 - - - - -
5.2850 1078 0.8298 - - - - -
5.2899 1079 0.0611 - - - - -
5.2948 1080 0.0058 - - - - -
5.2998 1081 0.0023 - - - - -
5.3047 1082 0.0152 - - - - -
5.3096 1083 0.0938 - - - - -
5.3145 1084 5.463 - - - - -
5.3194 1085 0.0048 - - - - -
5.3243 1086 0.005 - - - - -
5.3292 1087 0.0029 - - - - -
5.3342 1088 0.0302 - - - - -
5.3391 1089 0.0001 - - - - -
5.3440 1090 0.4089 - - - - -
5.3489 1091 1.078 - - - - -
5.3538 1092 0.2965 - - - - -
5.3587 1093 0.0014 - - - - -
5.3636 1094 0.0001 - - - - -
5.3686 1095 0.0187 - - - - -
5.3735 1096 9.4833 - - - - -
5.3784 1097 0.6337 - - - - -
5.3833 1098 0.0061 - - - - -
5.3882 1099 0.0008 - - - - -
5.3931 1100 0.0197 - - - - -
5.3980 1101 0.0034 - - - - -
5.4029 1102 8.6963 - - - - -
5.4079 1103 0.227 - - - - -
5.4128 1104 2.3266 - - - - -
5.4177 1105 0.8878 - - - - -
5.4226 1106 0.0199 - - - - -
5.4275 1107 0.4296 - - - - -
5.4324 1108 0.1091 - - - - -
5.4373 1109 0.0005 - - - - -
5.4423 1110 2.1527 - - - - -
5.4472 1111 0.204 - - - - -
5.4521 1112 0.0012 - - - - -
5.4570 1113 0.2691 - - - - -
5.4619 1114 1.5596 - - - - -
5.4668 1115 0.0004 - - - - -
5.4717 1116 0.0687 - - - - -
5.4767 1117 1.7525 - - - - -
5.4816 1118 2.2527 - - - - -
5.4865 1119 0.0545 - - - - -
5.4914 1120 0.0004 - - - - -
5.4963 1121 0.2455 - - - - -
5.5012 1122 3.9042 - - - - -
5.5061 1123 0.0332 - - - - -
5.5111 1124 0.7148 - - - - -
5.5160 1125 0.0329 - - - - -
5.5209 1126 0.4336 - - - - -
5.5258 1127 0.0007 - - - - -
5.5307 1128 0.0351 - - - - -
5.5356 1129 0.0071 - - - - -
5.5405 1130 0.0021 - - - - -
5.5455 1131 7.8513 - - - - -
5.5504 1132 9.1421 - - - - -
5.5553 1133 0.0222 - - - - -
5.5602 1134 0.009 - - - - -
5.5651 1135 0.3206 - - - - -
5.5700 1136 2.1536 - - - - -
5.5749 1137 0.7952 - - - - -
5.5799 1138 4.8399 - - - - -
5.5848 1139 0.0121 - - - - -
5.5897 1140 1.8385 - - - - -
5.5946 1141 0.0033 - - - - -
5.5995 1142 5.1626 - - - - -
5.6044 1143 0.0003 - - - - -
5.6093 1144 1.3345 - - - - -
5.6143 1145 0.0044 - - - - -
5.6192 1146 15.9512 - - - - -
5.6241 1147 0.0001 - - - - -
5.6290 1148 1.8027 - - - - -
5.6339 1149 0.0027 - - - - -
5.6388 1150 3.0103 - - - - -
5.6437 1151 0.2128 - - - - -
5.6486 1152 0.0291 - - - - -
5.6536 1153 0.0032 - - - - -
5.6585 1154 0.3131 - - - - -
5.6634 1155 0.0403 - - - - -
5.6683 1156 0.0001 - - - - -
5.6732 1157 0.3199 - - - - -
5.6781 1158 2.783 - - - - -
5.6830 1159 0.0048 - - - - -
5.6880 1160 0.0171 - - - - -
5.6929 1161 0.0001 - - - - -
5.6978 1162 0.0003 - - - - -
5.7027 1163 0.0836 - - - - -
5.7076 1164 4.8424 - - - - -
5.7125 1165 0.1689 - - - - -
5.7174 1166 0.0332 - - - - -
5.7224 1167 0.0071 - - - - -
5.7273 1168 0.9302 - - - - -
5.7322 1169 0.0009 - - - - -
5.7371 1170 0.131 - - - - -
5.7420 1171 0.0011 - - - - -
5.7469 1172 0.3116 - - - - -
5.7518 1173 3.2493 - - - - -
5.7568 1174 0.0012 - - - - -
5.7617 1175 0.9874 - - - - -
5.7666 1176 12.0563 - - - - -
5.7715 1177 0.0221 - - - - -
5.7764 1178 0.0045 - - - - -
5.7813 1179 0.0017 - - - - -
5.7862 1180 0.1269 - - - - -
5.7912 1181 0.0629 - - - - -
5.7961 1182 0.0015 - - - - -
5.8010 1183 2.6041 - - - - -
5.8059 1184 3.2558 - - - - -
5.8108 1185 0.0874 - - - - -
5.8157 1186 0.0006 - - - - -
5.8206 1187 0.0013 - - - - -
5.8256 1188 0.2076 - - - - -
5.8305 1189 0.0002 - - - - -
5.8354 1190 0.0013 - - - - -
5.8403 1191 0.087 - - - - -
5.8452 1192 0.0856 - - - - -
5.8501 1193 0.0009 - - - - -
5.8550 1194 0.0002 - - - - -
5.8600 1195 0.0718 - - - - -
5.8649 1196 0.0359 - - - - -
5.8698 1197 0.0155 - - - - -
5.8747 1198 2.4011 - - - - -
5.8796 1199 1.7882 - - - - -
5.8845 1200 0.003 - - - - -
5.8894 1201 0.0003 - - - - -
5.8943 1202 0.0301 - - - - -
5.8993 1203 0.0016 - - - - -
5.9042 1204 0.0026 - - - - -
5.9091 1205 0.0567 - - - - -
5.9140 1206 1.6303 - - - - -
5.9189 1207 0.0001 - - - - -
5.9238 1208 17.0385 - - - - -
5.9287 1209 1.1137 - - - - -
5.9337 1210 0.0022 - - - - -
5.9386 1211 0.0436 - - - - -
5.9435 1212 11.5906 - - - - -
5.9484 1213 0.0012 - - - - -
5.9533 1214 0.0027 - - - - -
5.9582 1215 2.7394 - - - - -
5.9631 1216 0.0005 - - - - -
5.9681 1217 0.0033 - - - - -
5.9730 1218 0.0949 - - - - -
5.9779 1219 0.0002 - - - - -
5.9828 1220 0.1714 - - - - -
5.9877 1221 12.5216 - - - - -
5.9926 1222 0.0057 - - - - -
5.9975 1223 0.0776 - - - - -
6.0 1224 0.0 0.4675 0.4648 0.4481 0.4329 0.4236
6.0049 1225 6.2431 - - - - -
6.0098 1226 0.0034 - - - - -
6.0147 1227 1.607 - - - - -
6.0197 1228 0.0011 - - - - -
6.0246 1229 5.6846 - - - - -
6.0295 1230 0.9689 - - - - -
6.0344 1231 0.0115 - - - - -
6.0393 1232 0.0332 - - - - -
6.0442 1233 0.0716 - - - - -
6.0491 1234 0.0059 - - - - -
6.0541 1235 0.0111 - - - - -
6.0590 1236 1.3194 - - - - -
6.0639 1237 0.0113 - - - - -
6.0688 1238 0.1392 - - - - -
6.0737 1239 0.0007 - - - - -
6.0786 1240 0.0016 - - - - -
6.0835 1241 0.0001 - - - - -
6.0885 1242 0.5488 - - - - -
6.0934 1243 0.0016 - - - - -
6.0983 1244 0.0145 - - - - -
6.1032 1245 4.2422 - - - - -
6.1081 1246 6.2805 - - - - -
6.1130 1247 0.1299 - - - - -
6.1179 1248 0.068 - - - - -
6.1229 1249 0.146 - - - - -
6.1278 1250 8.7093 - - - - -
6.1327 1251 0.001 - - - - -
6.1376 1252 0.0002 - - - - -
6.1425 1253 0.0298 - - - - -
6.1474 1254 0.0006 - - - - -
6.1523 1255 0.7816 - - - - -
6.1572 1256 0.001 - - - - -
6.1622 1257 0.1088 - - - - -
6.1671 1258 5.3505 - - - - -
6.1720 1259 0.0071 - - - - -
6.1769 1260 0.0185 - - - - -
6.1818 1261 2.3218 - - - - -
6.1867 1262 0.0776 - - - - -
6.1916 1263 0.0021 - - - - -
6.1966 1264 2.0907 - - - - -
6.2015 1265 1.4855 - - - - -
6.2064 1266 0.0069 - - - - -
6.2113 1267 0.0003 - - - - -
6.2162 1268 0.001 - - - - -
6.2211 1269 0.107 - - - - -
6.2260 1270 4.1798 - - - - -
6.2310 1271 0.0176 - - - - -
6.2359 1272 2.6422 - - - - -
6.2408 1273 0.0045 - - - - -
6.2457 1274 0.0018 - - - - -
6.2506 1275 0.2778 - - - - -
6.2555 1276 0.5032 - - - - -
6.2604 1277 0.1316 - - - - -
6.2654 1278 0.854 - - - - -
6.2703 1279 0.0043 - - - - -
6.2752 1280 0.094 - - - - -
6.2801 1281 0.0005 - - - - -
6.2850 1282 0.0003 - - - - -
6.2899 1283 1.8024 - - - - -
6.2948 1284 0.0791 - - - - -
6.2998 1285 0.0052 - - - - -
6.3047 1286 0.0861 - - - - -
6.3096 1287 0.1681 - - - - -
6.3145 1288 0.0005 - - - - -
6.3194 1289 0.0107 - - - - -
6.3243 1290 0.0074 - - - - -
6.3292 1291 0.0073 - - - - -
6.3342 1292 0.0395 - - - - -
6.3391 1293 0.0093 - - - - -
6.3440 1294 8.5639 - - - - -
6.3489 1295 0.0007 - - - - -
6.3538 1296 0.0001 - - - - -
6.3587 1297 1.1961 - - - - -
6.3636 1298 0.0187 - - - - -
6.3686 1299 9.2708 - - - - -
6.3735 1300 0.2093 - - - - -
6.3784 1301 8.735 - - - - -
6.3833 1302 0.0024 - - - - -
6.3882 1303 0.0029 - - - - -
6.3931 1304 0.0005 - - - - -
6.3980 1305 0.0764 - - - - -
6.4029 1306 1.1146 - - - - -
6.4079 1307 0.0007 - - - - -
6.4128 1308 0.3375 - - - - -
6.4177 1309 4.8914 - - - - -
6.4226 1310 0.0013 - - - - -
6.4275 1311 0.0684 - - - - -
6.4324 1312 2.272 - - - - -
6.4373 1313 0.9252 - - - - -
6.4423 1314 0.0099 - - - - -
6.4472 1315 0.01 - - - - -
6.4521 1316 0.1832 - - - - -
6.4570 1317 1.0361 - - - - -
6.4619 1318 0.0175 - - - - -
6.4668 1319 0.0443 - - - - -
6.4717 1320 0.0266 - - - - -
6.4767 1321 0.0004 - - - - -
6.4816 1322 2.6529 - - - - -
6.4865 1323 0.8222 - - - - -
6.4914 1324 0.0001 - - - - -
6.4963 1325 0.0055 - - - - -
6.5012 1326 0.0002 - - - - -
6.5061 1327 0.0003 - - - - -
6.5111 1328 0.009 - - - - -
6.5160 1329 3.8228 - - - - -
6.5209 1330 0.4528 - - - - -
6.5258 1331 0.4198 - - - - -
6.5307 1332 0.0007 - - - - -
6.5356 1333 0.1281 - - - - -
6.5405 1334 0.0131 - - - - -
6.5455 1335 3.3781 - - - - -
6.5504 1336 0.1504 - - - - -
6.5553 1337 0.0128 - - - - -
6.5602 1338 0.0427 - - - - -
6.5651 1339 0.1236 - - - - -
6.5700 1340 0.0005 - - - - -
6.5749 1341 8.4445 - - - - -
6.5799 1342 3.8744 - - - - -
6.5848 1343 0.0001 - - - - -
6.5897 1344 0.1478 - - - - -
6.5946 1345 0.0463 - - - - -
6.5995 1346 0.0005 - - - - -
6.6044 1347 0.4847 - - - - -
6.6093 1348 0.0008 - - - - -
6.6143 1349 1.2222 - - - - -
6.6192 1350 0.8561 - - - - -
6.6241 1351 0.5016 - - - - -
6.6290 1352 0.8134 - - - - -
6.6339 1353 0.663 - - - - -
6.6388 1354 0.0373 - - - - -
6.6437 1355 4.6129 - - - - -
6.6486 1356 0.0003 - - - - -
6.6536 1357 0.036 - - - - -
6.6585 1358 0.0591 - - - - -
6.6634 1359 0.0724 - - - - -
6.6683 1360 2.0866 - - - - -
6.6732 1361 0.0015 - - - - -
6.6781 1362 0.0446 - - - - -
6.6830 1363 0.0003 - - - - -
6.6880 1364 3.0299 - - - - -
6.6929 1365 0.0068 - - - - -
6.6978 1366 0.0257 - - - - -
6.7027 1367 0.2567 - - - - -
6.7076 1368 0.6689 - - - - -
6.7125 1369 3.2128 - - - - -
6.7174 1370 0.0011 - - - - -
6.7224 1371 0.0004 - - - - -
6.7273 1372 0.0383 - - - - -
6.7322 1373 0.0005 - - - - -
6.7371 1374 1.4068 - - - - -
6.7420 1375 0.1166 - - - - -
6.7469 1376 0.8068 - - - - -
6.7518 1377 0.8235 - - - - -
6.7568 1378 0.7885 - - - - -
6.7617 1379 0.0022 - - - - -
6.7666 1380 0.0 - - - - -
6.7715 1381 0.0026 - - - - -
6.7764 1382 0.0015 - - - - -
6.7813 1383 0.0848 - - - - -
6.7862 1384 0.1958 - - - - -
6.7912 1385 0.0173 - - - - -
6.7961 1386 0.0008 - - - - -
6.8010 1387 7.3964 - - - - -
6.8059 1388 0.0005 - - - - -
6.8108 1389 0.3121 - - - - -
6.8157 1390 0.0226 - - - - -
6.8206 1391 0.0007 - - - - -
6.8256 1392 0.0072 - - - - -
6.8305 1393 0.0001 - - - - -
6.8354 1394 0.459 - - - - -
6.8403 1395 0.0009 - - - - -
6.8452 1396 4.7422 - - - - -
6.8501 1397 0.0004 - - - - -
6.8550 1398 0.0028 - - - - -
6.8600 1399 0.0011 - - - - -
6.8649 1400 0.0 - - - - -
6.8698 1401 0.0005 - - - - -
6.8747 1402 0.0033 - - - - -
6.8796 1403 0.0 - - - - -
6.8845 1404 4.9801 - - - - -
6.8894 1405 3.454 - - - - -
6.8943 1406 5.912 - - - - -
6.8993 1407 0.0053 - - - - -
6.9042 1408 0.0001 - - - - -
6.9091 1409 0.0064 - - - - -
6.9140 1410 0.0747 - - - - -
6.9189 1411 0.1264 - - - - -
6.9238 1412 5.4115 - - - - -
6.9287 1413 0.0373 - - - - -
6.9337 1414 0.5234 - - - - -
6.9386 1415 0.743 - - - - -
6.9435 1416 0.0004 - - - - -
6.9484 1417 0.0039 - - - - -
6.9533 1418 0.008 - - - - -
6.9582 1419 0.005 - - - - -
6.9631 1420 0.2574 - - - - -
6.9681 1421 0.0001 - - - - -
6.9730 1422 0.0001 - - - - -
6.9779 1423 0.0018 - - - - -
6.9828 1424 0.0186 - - - - -
6.9877 1425 0.0113 - - - - -
6.9926 1426 0.0048 - - - - -
6.9975 1427 1.5498 - - - - -
7.0 1428 0.0001 0.4785 0.4711 0.4555 0.4362 0.4222
  • The saved checkpoint corresponds to the final logged row (epoch 7.0, step 1428); its ndcg@10 values match the metric tables above.

Framework Versions

  • Python: 3.12.12
  • Sentence Transformers: 5.1.2
  • Transformers: 4.51.3
  • PyTorch: 2.8.0+cu126
  • Accelerate: 1.11.0
  • Datasets: 4.0.0
  • Tokenizers: 0.21.4

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MatryoshkaLoss

@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning},
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}