SentenceTransformer
This is a sentence-transformers model trained on 99,840 text pairs (see Training Details below). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Maximum Sequence Length: 8192 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
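Cosine similarity scores two embeddings by the angle between them, so the score lies in $[-1, 1]$ and is independent of vector magnitude. For embeddings $u, v \in \mathbb{R}^{768}$:

$$\operatorname{cos\_sim}(u, v) = \frac{u \cdot v}{\lVert u \rVert \, \lVert v \rVert}$$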
Model Sources
- Documentation: [Sentence Transformers Documentation](https://sbert.net)
- Repository: [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- Hugging Face: [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
Full Model Architecture
SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False, 'architecture': 'ModernBertModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
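The Pooling module above has pooling_mode_mean_tokens: True, i.e. the sentence embedding is the mean of the encoder's token embeddings over non-padding positions. Below is a minimal sketch of that step in PyTorch, assuming encoder outputs of shape (batch, seq_len, 768); the function name mean_pool is illustrative, not part of the library API:

import torch

def mean_pool(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # token_embeddings: (batch, seq_len, 768) from the ModernBERT encoder
    # attention_mask:   (batch, seq_len), 1 for real tokens, 0 for padding
    mask = attention_mask.unsqueeze(-1).float()    # (batch, seq_len, 1)
    summed = (token_embeddings * mask).sum(dim=1)  # sum embeddings of real tokens only
    counts = mask.sum(dim=1).clamp(min=1e-9)       # number of real tokens per sentence
    return summed / counts                         # (batch, 768) sentence embeddings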
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
'cta test̾i sur: u̾bi qd̾ madauit in mil egones. Q uod disposunt ad abrahã. mũti sui ad p̃saaci Et statuit il acob ĩ p̾ceptũ: ⁊ isrł mn testiñ etꝰ Dices tibi dabo t̾ram chanaan: fu ncdũ heditatis ur̃e. Dũ e̾e̾nt nũo ocui. paucissimi ⁊ ĩcole ouis. Et ꝑtͣni eẽt de gnͣte ĩ gentẽ: ⁊ de regno ad ulũ alterũ. Non reliquit hoĩem',
'cta test̾i sur: u̾bi qd̾ madauit in mil egones. Q uod disposunt ad abrahã. mũti sui ad p̃saaci Et statuit il acob ĩ p̾ceptũ: ⁊ isrł mn testiñ etꝰ Dices tibi dabo t̾ram chanaan: fu ncdũ heditatis ur̃e. Dũ e̾e̾nt nũo ocui. paucissimi ⁊ ĩcole ouis. Et ꝑtͣni eẽt de gnͣte ĩ gentẽ: ⁊ de regno ad ulũ alterũ. Non reliquit hoĩem',
'p̾mioꝵ. p̃s. b̾ildixit finis tuus inte. Et ĩminitas apee cato. Qua xp̃c donatus e̾ ps. p̾ucinsti eũ i bñ. dicidis ¶Infernans quo\uf1ac dupiex .s. adinacio. ps. laudat᷑ͣ ptc̃ce indesidus aĩe sue ⁊ ñquis bñdi. et cũ quis sibi tribuit bona que ht̃ atco. Iob. timebat enĩ ne forte peccau̾int fuii eius. ⁊ bñdix̾int deo incordib\uf1ac suis. Corꝑans ẽ ad carnis delecta tr̃em us. or̃s caro feñ. ⁊ oĩs gła euis qiͣ d̾r ꝑ ysaiam. ue qͥ niungitis domũ addom̃. ⁊ agr̃ ago copłatis us\uf1ac ad t̾minũ ioci. Nñquid ħ̾itabitis uos so',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 1.0000, 0.2812],
# [1.0000, 1.0000, 0.2812],
# [0.2812, 0.2812, 1.0000]])
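Beyond pairwise scores, the same embeddings support semantic search over a larger corpus. A minimal sketch using sentence_transformers.util.semantic_search; the corpus and query strings are placeholders, and the model id is the unresolved placeholder from above:

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("sentence_transformers_model_id")

corpus = ["first document text", "second document text"]  # placeholder corpus
queries = ["example query text"]                          # placeholder query

corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embeddings = model.encode(queries, convert_to_tensor=True)

# For each query, return the top_k corpus entries ranked by cosine similarity
hits = util.semantic_search(query_embeddings, corpus_embeddings, top_k=2)
for hit in hits[0]:
    print(corpus[hit["corpus_id"]], hit["score"])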
Training Details
Training Dataset
Unnamed Dataset
- Size: 99,840 training samples
- Columns: sentence_0 and sentence_1
- Approximate statistics based on the first 1000 samples:

| | sentence_0 | sentence_1 |
|---|---|---|
| type | string | string |
| details | min: 6 tokens, mean: 85.65 tokens, max: 473 tokens | min: 6 tokens, mean: 85.65 tokens, max: 473 tokens |
- Samples:

| sentence_0 | sentence_1 |
|---|---|
| Per totum namque mundum est mundus; et mundum persequitur mundus, coinquinatus mundum, perditus redemptum, damnatus salvatum. | Per totum namque mundum est mundus; et mundum persequitur mundus, coinquinatus mundum, perditus redemptum, damnatus salvatum. |
| motꝰ siait supͣ sepe dixmꝰ gꝰ anteon nem generanonem est motus ge eti am aute generaitionem primi mobilis est mo tus go etiam motus est. inte p̾mum mo tum᷑ ꝙ est impossibile go fint hec caisa ꝙ motus non eet̾ sꝑ momĩ p̾tito iprẽ ꝙ primum mobile oportet᷑ prius generari mẽe et postea moneri qr absq dubio se queret᷑ ꝙ quedam mutatio eet̃ anteil | motꝰ siait supͣ sepe dixmꝰ gꝰ anteon nem generanonem est motus ge eti am aute generaitionem primi mobilis est mo tus go etiam motus est. inte p̾mum mo tum᷑ ꝙ est impossibile go fint hec caisa ꝙ motus non eet̾ sꝑ momĩ p̾tito iprẽ ꝙ primum mobile oportet᷑ prius generari mẽe et postea moneri qr absq dubio se queret᷑ ꝙ quedam mutatio eet̃ anteil |
| Dictum est, id quod in nomine confuse significaretur, in definitione quae fit enumeratione partium, aperiri atque explicari. Quod fieri non potest, nisi per quarumdam partium nuncupationem; nihil enim dum explicatur oratione, totum simul dici potest. Quae cum ita sint, cumque omnis hujusmodi definitio quaedam sit partium distributio, quatuor his modis fieri potest. Aut enim substantiales partes explicantur, aut proprietatis partes dicuntur, aut quasi totius membra enumerantur, aut tanquam species dividuntur. Substantiales partes explicantur, cum ex genere ac differentiis definitio constituitur. Genus enim quod singulariter praedicatur, speciei totum est. Id genus sumptum in definitione, pars quaedam fit. Non enim solum speciem complet, nisi adjiciantur etiam differentiae, in quibus eadem ratio quae in genere est. Nam cum ipsae singulariter dictae totam speciem claudant, in definitione sumptae, partes speciei fiunt, quia non solum speciem quidem esse designant, sed etiam genus. | Dictum est, id quod in nomine confuse significaretur, in definitione quae fit enumeratione partium, aperiri atque explicari. Quod fieri non potest, nisi per quarumdam partium nuncupationem; nihil enim dum explicatur oratione, totum simul dici potest. Quae cum ita sint, cumque omnis hujusmodi definitio quaedam sit partium distributio, quatuor his modis fieri potest. Aut enim substantiales partes explicantur, aut proprietatis partes dicuntur, aut quasi totius membra enumerantur, aut tanquam species dividuntur. Substantiales partes explicantur, cum ex genere ac differentiis definitio constituitur. Genus enim quod singulariter praedicatur, speciei totum est. Id genus sumptum in definitione, pars quaedam fit. Non enim solum speciem complet, nisi adjiciantur etiam differentiae, in quibus eadem ratio quae in genere est. Nam cum ipsae singulariter dictae totam speciem claudant, in definitione sumptae, partes speciei fiunt, quia non solum speciem quidem esse designant, sed etiam genus. |
- Loss: MultipleNegativesRankingLoss with these parameters:
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim",
      "gather_across_devices": false
  }
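MultipleNegativesRankingLoss treats each (sentence_0, sentence_1) pair as a positive and uses every other in-batch pair as a negative, which is one reason the large batch size of 128 below matters. A minimal sketch of how such a run could be set up; the base checkpoint id and the two-row dataset are placeholder assumptions, not the card's actual data:

from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MultipleNegativesRankingLoss

# "base_model_id" is a placeholder; the card does not name the base checkpoint.
model = SentenceTransformer("base_model_id")

# Pairs of related texts; the other rows in each batch serve as negatives.
train_dataset = Dataset.from_dict({
    "sentence_0": ["first anchor text", "second anchor text"],
    "sentence_1": ["matching positive text", "another positive text"],
})

loss = MultipleNegativesRankingLoss(model, scale=20.0)  # cos_sim is the default similarity_fct

trainer = SentenceTransformerTrainer(
    model=model,
    train_dataset=train_dataset,
    loss=loss,
)
trainer.train()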
Training Hyperparameters
Non-Default Hyperparameters
- per_device_train_batch_size: 128
- per_device_eval_batch_size: 128
- num_train_epochs: 1
- fp16: True
- multi_dataset_batch_sampler: round_robin
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: no
- prediction_loss_only: True
- per_device_train_batch_size: 128
- per_device_eval_batch_size: 128
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 5e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: True
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- parallelism_config: None
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch_fused
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- hub_revision: None
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- liger_kernel_config: None
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: round_robin
- router_mapping: {}
- learning_rate_mapping: {}
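For reference, the non-default values above correspond roughly to the following SentenceTransformerTrainingArguments setup; output_dir is a placeholder path:

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import MultiDatasetBatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # placeholder path
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    num_train_epochs=1,
    fp16=True,
    multi_dataset_batch_sampler=MultiDatasetBatchSamplers.ROUND_ROBIN,
)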
Training Logs
| Epoch | Step | Training Loss |
|---|---|---|
| 0.6410 | 500 | 0.1311 |
Framework Versions
- Python: 3.12.11
- Sentence Transformers: 5.1.0
- Transformers: 4.56.0
- PyTorch: 2.8.0+cu128
- Accelerate: 1.10.1
- Datasets: 4.0.0
- Tokenizers: 0.22.0
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
MultipleNegativesRankingLoss
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}