|
--- |
|
tags: |
|
- sentence-transformers |
|
- sentence-similarity |
|
- feature-extraction |
|
- generated_from_trainer |
|
- dataset_size:21123868 |
|
- loss:CachedMultipleNegativesRankingLoss |
|
base_model: sentence-transformers/paraphrase-multilingual-mpnet-base-v2 |
|
widget: |
|
- source_sentence: 系统管理员技术员——TS/SCI级别并拥有多项式验证 |
|
sentences: |
|
- >- |
|
support development of annual budget, create a financial report, report |
|
analysis results, Microsoft Access, accounting, use presentation software, |
|
interpret financial statements, synthesise financial information, develop |
|
vaccines, handle financial overviews of the store, produce statistical |
|
financial records, develop financial statistics reports, explain accounting |
|
records, financial analysis, SAP R3, represent the company, examine budgets, |
|
prepare presentation material, use spreadsheets software, forecast account |
|
metrics, meet deadlines, prepare financial projections, manage budgets, |
|
exercise self-control, financial statements |
|
- >- |
|
ensure cross-department cooperation, establish customer rapport, improve |
|
business processes, manage technical security systems, handle incidents, |
|
maintain ICT system, explain characteristics of computer peripheral |
|
equipment, gather technical information, collaborate in company's daily |
|
operations , apply change management, maintain technical equipment, |
|
communicate with customers, solve technical problems, perform ICT |
|
troubleshooting, use ICT equipment in maintenance activities, manage major |
|
incidents, build business relationships, computer engineering, perform |
|
software recovery testing, identify process improvements, maintain |
|
relationship with customers, carry out project activities, collaborate in |
|
the development of marketing strategies, computer technology, technical |
|
terminology |
|
- >- |
|
utilise machine learning, cloud technologies, develop predictive models, |
|
assess sportive performance, formulate findings , principles of artificial |
|
intelligence, perform business research, communicate with stakeholders, |
|
computer engineering, build predictive models, computer science, develop |
|
automated software tests, analyse business objectives, Agile development, |
|
cloud monitoring and reporting, provide written content, obtain relevant |
|
licenses, design prototypes, machine learning, e-learning software |
|
infrastructure, analyse education system, disseminate results to the |
|
scientific community, learning technologies, ML (computer programming), task |
|
algorithmisation |
|
- source_sentence: 安全运营官 |
|
sentences: |
|
- >- |
|
deliver outstanding service, manage carriers, direct customers to |
|
merchandise, improve customer interaction, manage time, support managers, |
|
assist customers, process customer orders, manage customer service, satisfy |
|
customers, guarantee customer satisfaction, respond to customers' inquiries |
|
- >- |
|
manage several projects, implement operational business plans, identify |
|
improvement actions, develop strategy to solve problems, manage website, |
|
carry out project activities, follow reporting procedures, supervise site |
|
maintenance, adjust priorities, schedule shifts, conduct public |
|
presentations, motivate others, manage operational budgets, report to the |
|
team leader, encourage teams for continuous improvement, lead the |
|
sustainability reporting process, implement sustainable procurement, show an |
|
exemplary leading role in an organisation, manage manufacturing facilities, |
|
develop training programmes, develop production line, supply chain |
|
management, leadership principles, lead a team, coaching techniques |
|
- >- |
|
provide emergency supplies, provide first aid, liaise with security |
|
authorities, apply medical first aid in case of emergency, regulate traffic, |
|
train security officers, maintain physical fitness, provide protective |
|
escort, ensure public safety and security, ensure inspections of facilities, |
|
work in inclement conditions, follow procedures in the event of an alarm, |
|
set safety and security standards, comply with the principles of |
|
self-defence, present reports, maintain facility security systems, conduct |
|
security screenings, types of evaluation , monitor security measures, office |
|
equipment, escort pedestrians across streets, advise on security staff |
|
selection, wear appropriate protective gear, work in outdoor conditions, |
|
assist emergency services |
|
- source_sentence: Empleado de control de COVID |
|
sentences: |
|
- >- |
|
maintain records of clients' prescriptions, assist people in contaminated |
|
areas, label samples, maintain museum records, apply social distancing |
|
protocols, collect biological samples from patients, infection control, |
|
label medical laboratory samples, disinfect surfaces, maintain customer |
|
records, ensure health and safety of staff, personal protective equipment, |
|
remove contaminated materials, store contaminated materials, prepare |
|
prescription labels, use personal protection equipment |
|
- >- |
|
promote organisational communication, provide legal advice, human resource |
|
management, company policies, perform customer management, business |
|
processes, ensure compliance with legal requirements, develop communications |
|
strategies, enforce company values, develop outreach training plans, use |
|
consulting techniques, develop employment policies, human resources |
|
department processes, personnel management, identify training needs, |
|
participate in health personnel training, health and safety in the |
|
workplace, lead police investigations, ensure compliance with policies, |
|
prepare compliance documents, perform internal investigations, develop |
|
employee retention programs, develop corporate training programmes, customer |
|
relationship management, manage localisation |
|
- >- |
|
perform escalation procedure, imprint visionary aspirations into the |
|
business management, observe confidentiality, impart business plans to |
|
collaborators, lead a team, human resources department processes, respect |
|
confidentiality obligations, hire human resources, manage commercial risks, |
|
develop business plans, communicate with stakeholders, maintain relationship |
|
with customers, manage several projects, provide improvement strategies, |
|
manage technical security systems, knowledge management, risk management, |
|
develop program ideas, perform project management, project management, cope |
|
with uncertainty, address identified risks, provide performance feedback, |
|
information confidentiality, track key performance indicators |
|
- source_sentence: Aerie - Brand Ambassador (Sales Associate) - US |
|
sentences: |
|
- >- |
|
lay bricks, provide first aid, enforce park rules, conflict management, give |
|
swimming lessons, assist in performing physical exercises, perform park |
|
safety inspections, assist in the movement of heavy loads, lead a team, |
|
first aid, supervise pool activities, swim, coach staff for running the |
|
performance, show an exemplary leading role in an organisation, teach public |
|
speaking principles, collaborate with coaching team, supervise work, |
|
calculate stairs rise and run, calculate compensation payments, manage a |
|
team, information confidentiality |
|
- >- |
|
react to events in time-critical environments, operate in a specific field |
|
of nursing care, clinical science, promote healthy fitness environment, lead |
|
others, comply with legislation related to health care, maintain a safe, |
|
hygienic and secure working environment, provide healthcare services to |
|
patients in specialised medicine, write English, conduct physical |
|
examinations, leadership principles, use clinical assessment techniques, |
|
apply context specific clinical competences, conduct health related |
|
research, conceptualise healthcare user’s needs, assessment processes, |
|
communicate in healthcare, provide professional care in nursing, nursing |
|
science, promote health and safety, implement policy in healthcare |
|
practices, engage with stakeholders, identify problems, respond to changing |
|
situations in health care, perform resource planning |
|
- >- |
|
ensure the privacy of guests, provide customised products, company policies, |
|
exude enthusiasm during the action sessions, provide customer guidance on |
|
product selection, collect briefing regarding products, perform multiple |
|
tasks at the same time, create solutions to problems, respond to visitor |
|
complaints |
|
- source_sentence: 医师——危重症护理——重症监护专家——项目医务总监 |
|
sentences: |
|
- >- |
|
handle incidents, provide technical documentation, coordinate operational |
|
activities, ensure information security, work in teams, manage manufacturing |
|
documentation, project configuration management, operate call distribution |
|
system, maintain computer hardware, apply change management, manage aircraft |
|
support systems, perform escalation procedure, manage production |
|
changeovers, maintenance operations, call-centre technologies, manage |
|
service contracts in the drilling industry, encourage teambuilding, manage |
|
major incidents, resolve equipment malfunctions, work independently, think |
|
analytically, manage maintenance operations, maintain plan for continuity of |
|
operations |
|
- >- |
|
develop recycling programs, receive actors' resumes, work in cold |
|
environments, perform cleaning duties, operate floor cleaning equipment, |
|
operate forklift |
|
- >- |
|
perform technical tasks with great care, supervise medical residents, manage |
|
a multidisciplinary team involved in patient care, administrative tasks in a |
|
medical environment, demonstrate technical skills during neurological |
|
surgery, apply problem solving in social service, intensive care medicine, |
|
provide comprehensive care for patients with surgical conditions, work in |
|
teams, solve problems |
|
pipeline_tag: sentence-similarity |
|
library_name: sentence-transformers |
|
co2_eq_emissions: |
|
emissions: 717.3535184611766 |
|
energy_consumed: 1.9440474755045436 |
|
source: codecarbon |
|
training_type: fine-tuning |
|
on_cloud: true |
|
cpu_model: Intel(R) Xeon(R) CPU @ 2.20GHz |
|
ram_total_size: 83.47684860229492 |
|
hours_used: 5.34 |
|
hardware_used: 1 x NVIDIA A100-SXM4-40GB |
|
license: mit |
|
language: |
|
- en |
|
- es |
|
- de |
|
- zh |
|
- mul |
|
- multilingual |
|
--- |
|
|
|
# SentenceTransformer based on sentence-transformers/paraphrase-multilingual-mpnet-base-v2 |
|
|
|
This is a [sentence-transformers](https://www.SBERT.net) model trained specifically for job title matching and similarity. It is fine-tuned from [sentence-transformers/paraphrase-multilingual-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-mpnet-base-v2) on a large dataset of job titles paired with their associated skills/requirements across multiple languages. The model maps English, Spanish, German, and Chinese job titles and descriptions to a 1024-dimensional dense vector space and can be used for semantic job title matching, job similarity search, and related HR/recruitment tasks.
|
|
|
## Model Details |
|
|
|
### Model Description |
|
- **Model Type:** Sentence Transformer |
|
- **Base model:** [sentence-transformers/paraphrase-multilingual-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-mpnet-base-v2) <!-- at revision 84fccfe766bcfd679e39efefe4ebf45af190ad2d --> |
|
- **Maximum Sequence Length:** 64 tokens |
|
- **Output Dimensionality:** 1024 dimensions |
|
- **Similarity Function:** Cosine Similarity |
|
- **Training Dataset:** 4 × 5.2M high-quality job title–skill pairs in English, Spanish, German, and Chinese
|
|
|
### Model Sources |
|
|
|
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net) |
|
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) |
|
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) |
|
|
|
### Full Model Architecture |
|
|
|
``` |
|
SentenceTransformer( |
|
(0): Transformer({'max_seq_length': 64, 'do_lower_case': False}) with Transformer model: XLMRobertaModel |
|
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) |
|
(2): Asym( |
|
(anchor-0): Dense({'in_features': 768, 'out_features': 1024, 'bias': True, 'activation_function': 'torch.nn.modules.activation.Tanh'}) |
|
(positive-0): Dense({'in_features': 768, 'out_features': 1024, 'bias': True, 'activation_function': 'torch.nn.modules.activation.Tanh'}) |
|
) |
|
) |
|
``` |
|
|
|
## Usage |
|
|
|
### Direct Usage (Sentence Transformers) |
|
|
|
First install the Sentence Transformers library: |
|
|
|
```bash |
|
pip install -U sentence-transformers |
|
``` |
|
|
|
Then you can load and use the model with the following code: |
|
```python |
|
import torch |
|
import numpy as np |
|
from tqdm.auto import tqdm |
|
from sentence_transformers import SentenceTransformer |
|
from sentence_transformers.util import batch_to_device, cos_sim |
|
|
|
# Load the model |
|
model = SentenceTransformer("TechWolf/JobBERT-v3") |
|
|
|
def encode_batch(jobbert_model, texts): |
|
features = jobbert_model.tokenize(texts) |
|
features = batch_to_device(features, jobbert_model.device) |
|
features["text_keys"] = ["anchor"] |
|
with torch.no_grad(): |
|
out_features = jobbert_model.forward(features) |
|
return out_features["sentence_embedding"].cpu().numpy() |
|
|
|
def encode(jobbert_model, texts, batch_size: int = 8): |
|
# Sort texts by length and keep track of original indices |
|
sorted_indices = np.argsort([len(text) for text in texts]) |
|
sorted_texts = [texts[i] for i in sorted_indices] |
|
|
|
embeddings = [] |
|
|
|
# Encode in batches |
|
for i in tqdm(range(0, len(sorted_texts), batch_size)): |
|
batch = sorted_texts[i:i+batch_size] |
|
embeddings.append(encode_batch(jobbert_model, batch)) |
|
|
|
# Concatenate embeddings and reorder to original indices |
|
sorted_embeddings = np.concatenate(embeddings) |
|
original_order = np.argsort(sorted_indices) |
|
return sorted_embeddings[original_order] |
|
|
|
# Example usage |
|
job_titles = [ |
|
'Software Engineer', |
|
'高级软件开发人员', # senior software developer |
|
'Produktmanager', # product manager |
|
'Científica de datos' # data scientist |
|
] |
|
|
|
# Get embeddings |
|
embeddings = encode(model, job_titles) |
|
|
|
# Calculate cosine similarity matrix |
|
similarities = cos_sim(embeddings, embeddings) |
|
print(similarities) |
|
``` |
|
|
|
The output will be a similarity matrix where each value represents the cosine similarity between two job titles: |
|
|
|
``` |
|
tensor([[1.0000, 0.8087, 0.4673, 0.5669], |
|
[0.8087, 1.0000, 0.4428, 0.4968], |
|
[0.4673, 0.4428, 1.0000, 0.4292], |
|
[0.5669, 0.4968, 0.4292, 1.0000]]) |
|
``` |
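
For retrieval-style tasks such as matching a query title against a pool of vacancy titles, the same embeddings can be passed to `semantic_search` from `sentence_transformers.util`. The snippet below is a minimal sketch that continues from the example above; the candidate titles are made-up placeholders.

```python
from sentence_transformers.util import semantic_search

# Hypothetical pool of candidate job titles to search over
candidate_titles = [
    'Backend Developer',
    'Machine Learning Engineer',
    'Registered Nurse',
    'Produktmanager',
]
query_titles = ['Software Engineer']

# Reuse the encode() helper defined above (it routes inputs through the anchor branch)
candidate_embeddings = encode(model, candidate_titles)
query_embeddings = encode(model, query_titles)

# Retrieve the top-2 most similar candidate titles per query by cosine similarity
hits = semantic_search(
    torch.tensor(query_embeddings),
    torch.tensor(candidate_embeddings),
    top_k=2,
)
for hit in hits[0]:
    print(candidate_titles[hit['corpus_id']], round(hit['score'], 3))
```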
|
|
|
|
|
<!-- |
|
### Direct Usage (Transformers) |
|
|
|
<details><summary>Click to see the direct usage in Transformers</summary> |
|
|
|
</details> |
|
--> |
|
|
|
<!-- |
|
### Downstream Usage (Sentence Transformers) |
|
|
|
You can finetune this model on your own dataset. |
|
|
|
<details><summary>Click to expand</summary> |
|
|
|
</details> |
|
--> |
|
|
|
<!-- |
|
### Out-of-Scope Use |
|
|
|
*List how the model may foreseeably be misused and address what users ought not to do with the model.* |
|
--> |
|
|
|
<!-- |
|
## Bias, Risks and Limitations |
|
|
|
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* |
|
--> |
|
|
|
<!-- |
|
### Recommendations |
|
|
|
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* |
|
--> |
|
|
|
## Training Details |
|
|
|
### Training Dataset |
|
|
|
#### Unnamed Dataset |
|
|
|
* Size: 21,123,868 training samples |
|
* Columns: <code>anchor</code> and <code>positive</code> |
|
* Approximate statistics based on the first 1000 samples: |
|
| | anchor | positive | |
|
|:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| |
|
| type | string | string | |
|
| details | <ul><li>min: 4 tokens</li><li>mean: 10.56 tokens</li><li>max: 38 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 61.08 tokens</li><li>max: 64 tokens</li></ul> | |
|
* Samples: |
|
| anchor | positive | |
|
|:-----------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| |
|
| <code>通信与培训专员</code> | <code>deliver online training, liaise with educational support staff, interact with an audience, construct individual learning plans, lead a team, develop corporate training programmes, learning technologies, communication, identify with the company's goals, address an audience, learning management systems, use presentation software, motivate others, provide learning support, engage with stakeholders, identify skills gaps, meet expectations of target audience, develop training programmes</code> | |
|
| <code>Associate Infrastructure Engineer</code> | <code>create solutions to problems, design user interface, cloud technologies, use databases, automate cloud tasks, keep up-to-date to computer trends, work in teams, use object-oriented programming, keep updated on innovations in various business fields, design principles, Angular, adapt to changing situations, JavaScript, Agile development, manage stable, Swift (computer programming), keep up-to-date to design industry trends, monitor technology trends, web programming, provide mentorship, advise on efficiency improvements, adapt to change, JavaScript Framework, database management systems, stimulate creative processes</code> | |
|
| <code>客户顾问/出纳</code> | <code>customer service, handle financial transactions, adapt to changing situations, have computer literacy, manage cash desk, attend to detail, provide customer guidance on product selection, perform multiple tasks at the same time, carry out financial transactions, provide membership service, manage accounts, adapt to change, identify customer's needs, solve problems</code> | |
|
* Loss: [<code>CachedMultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedmultiplenegativesrankingloss) with these parameters: |
|
```json |
|
{ |
|
"scale": 20.0, |
|
"similarity_fct": "cos_sim", |
|
"mini_batch_size": 512 |
|
} |
|
``` |
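
  As a rough sketch (not the exact training script), these parameters correspond to instantiating the loss as follows:

```python
from sentence_transformers import SentenceTransformer, losses

# Sketch only: the trained model additionally wraps this backbone with the pooling
# and asymmetric Dense projection shown under "Full Model Architecture"
model = SentenceTransformer("sentence-transformers/paraphrase-multilingual-mpnet-base-v2")

loss = losses.CachedMultipleNegativesRankingLoss(
    model,
    scale=20.0,           # scaling applied to the cosine similarities before the softmax
    mini_batch_size=512,  # size of the gradient-cached sub-batches within each training batch
)
```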
|
|
|
### Training Hyperparameters |
|
#### Non-Default Hyperparameters |
|
|
|
- `overwrite_output_dir`: True |
|
- `per_device_train_batch_size`: 2048 |
|
- `per_device_eval_batch_size`: 2048 |
|
- `num_train_epochs`: 1 |
|
- `fp16`: True |
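
A minimal sketch of how these non-default values map onto `SentenceTransformerTrainingArguments`; the output directory is a placeholder, and the data loading, loss, and trainer setup are omitted:

```python
from sentence_transformers import SentenceTransformerTrainingArguments

# Sketch of the non-default hyperparameters above; "jobbert-v3" is a placeholder output path
args = SentenceTransformerTrainingArguments(
    output_dir="jobbert-v3",
    overwrite_output_dir=True,
    per_device_train_batch_size=2048,
    per_device_eval_batch_size=2048,
    num_train_epochs=1,
    fp16=True,
)
```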
|
|
|
#### All Hyperparameters |
|
<details><summary>Click to expand</summary> |
|
|
|
- `overwrite_output_dir`: True |
|
- `do_predict`: False |
|
- `eval_strategy`: no |
|
- `prediction_loss_only`: True |
|
- `per_device_train_batch_size`: 2048 |
|
- `per_device_eval_batch_size`: 2048 |
|
- `per_gpu_train_batch_size`: None |
|
- `per_gpu_eval_batch_size`: None |
|
- `gradient_accumulation_steps`: 1 |
|
- `eval_accumulation_steps`: None |
|
- `torch_empty_cache_steps`: None |
|
- `learning_rate`: 5e-05 |
|
- `weight_decay`: 0.0 |
|
- `adam_beta1`: 0.9 |
|
- `adam_beta2`: 0.999 |
|
- `adam_epsilon`: 1e-08 |
|
- `max_grad_norm`: 1.0 |
|
- `num_train_epochs`: 1 |
|
- `max_steps`: -1 |
|
- `lr_scheduler_type`: linear |
|
- `lr_scheduler_kwargs`: {} |
|
- `warmup_ratio`: 0.0 |
|
- `warmup_steps`: 0 |
|
- `log_level`: passive |
|
- `log_level_replica`: warning |
|
- `log_on_each_node`: True |
|
- `logging_nan_inf_filter`: True |
|
- `save_safetensors`: True |
|
- `save_on_each_node`: False |
|
- `save_only_model`: False |
|
- `restore_callback_states_from_checkpoint`: False |
|
- `no_cuda`: False |
|
- `use_cpu`: False |
|
- `use_mps_device`: False |
|
- `seed`: 42 |
|
- `data_seed`: None |
|
- `jit_mode_eval`: False |
|
- `use_ipex`: False |
|
- `bf16`: False |
|
- `fp16`: True |
|
- `fp16_opt_level`: O1 |
|
- `half_precision_backend`: auto |
|
- `bf16_full_eval`: False |
|
- `fp16_full_eval`: False |
|
- `tf32`: None |
|
- `local_rank`: 0 |
|
- `ddp_backend`: None |
|
- `tpu_num_cores`: None |
|
- `tpu_metrics_debug`: False |
|
- `debug`: [] |
|
- `dataloader_drop_last`: False |
|
- `dataloader_num_workers`: 0 |
|
- `dataloader_prefetch_factor`: None |
|
- `past_index`: -1 |
|
- `disable_tqdm`: False |
|
- `remove_unused_columns`: True |
|
- `label_names`: None |
|
- `load_best_model_at_end`: False |
|
- `ignore_data_skip`: False |
|
- `fsdp`: [] |
|
- `fsdp_min_num_params`: 0 |
|
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} |
|
- `fsdp_transformer_layer_cls_to_wrap`: None |
|
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} |
|
- `deepspeed`: None |
|
- `label_smoothing_factor`: 0.0 |
|
- `optim`: adamw_torch |
|
- `optim_args`: None |
|
- `adafactor`: False |
|
- `group_by_length`: False |
|
- `length_column_name`: length |
|
- `ddp_find_unused_parameters`: None |
|
- `ddp_bucket_cap_mb`: None |
|
- `ddp_broadcast_buffers`: False |
|
- `dataloader_pin_memory`: True |
|
- `dataloader_persistent_workers`: False |
|
- `skip_memory_metrics`: True |
|
- `use_legacy_prediction_loop`: False |
|
- `push_to_hub`: False |
|
- `resume_from_checkpoint`: None |
|
- `hub_model_id`: None |
|
- `hub_strategy`: every_save |
|
- `hub_private_repo`: None |
|
- `hub_always_push`: False |
|
- `gradient_checkpointing`: False |
|
- `gradient_checkpointing_kwargs`: None |
|
- `include_inputs_for_metrics`: False |
|
- `include_for_metrics`: [] |
|
- `eval_do_concat_batches`: True |
|
- `fp16_backend`: auto |
|
- `push_to_hub_model_id`: None |
|
- `push_to_hub_organization`: None |
|
- `mp_parameters`: |
|
- `auto_find_batch_size`: False |
|
- `full_determinism`: False |
|
- `torchdynamo`: None |
|
- `ray_scope`: last |
|
- `ddp_timeout`: 1800 |
|
- `torch_compile`: False |
|
- `torch_compile_backend`: None |
|
- `torch_compile_mode`: None |
|
- `dispatch_batches`: None |
|
- `split_batches`: None |
|
- `include_tokens_per_second`: False |
|
- `include_num_input_tokens_seen`: False |
|
- `neftune_noise_alpha`: None |
|
- `optim_target_modules`: None |
|
- `batch_eval_metrics`: False |
|
- `eval_on_start`: False |
|
- `use_liger_kernel`: False |
|
- `eval_use_gather_object`: False |
|
- `average_tokens_across_devices`: False |
|
- `prompts`: None |
|
- `batch_sampler`: batch_sampler |
|
- `multi_dataset_batch_sampler`: proportional |
|
|
|
</details> |
|
|
|
### Training Logs |
|
| Epoch | Step | Training Loss | |
|
|:------:|:-----:|:-------------:| |
|
| 0.0485 | 500 | 3.89 | |
|
| 0.0969 | 1000 | 3.373 | |
|
| 0.1454 | 1500 | 3.1715 | |
|
| 0.1939 | 2000 | 3.0414 | |
|
| 0.2424 | 2500 | 2.9462 | |
|
| 0.2908 | 3000 | 2.8691 | |
|
| 0.3393 | 3500 | 2.8048 | |
|
| 0.3878 | 4000 | 2.7501 | |
|
| 0.4363 | 4500 | 2.7026 | |
|
| 0.4847 | 5000 | 2.6601 | |
|
| 0.5332 | 5500 | 2.6247 | |
|
| 0.5817 | 6000 | 2.5951 | |
|
| 0.6302 | 6500 | 2.5692 | |
|
| 0.6786 | 7000 | 2.5447 | |
|
| 0.7271 | 7500 | 2.5221 | |
|
| 0.7756 | 8000 | 2.5026 | |
|
| 0.8240 | 8500 | 2.4912 | |
|
| 0.8725 | 9000 | 2.4732 | |
|
| 0.9210 | 9500 | 2.4608 | |
|
| 0.9695 | 10000 | 2.4548 | |
|
|
|
|
|
### Environmental Impact |
|
Carbon emissions were measured using [CodeCarbon](https://github.com/mlco2/codecarbon). |
|
- **Energy Consumed**: 1.944 kWh |
|
- **Carbon Emitted**: 0.717 kg of CO2 |
|
- **Hours Used**: 5.34 hours |
|
|
|
### Training Hardware |
|
- **On Cloud**: Yes |
|
- **GPU Model**: 1 x NVIDIA A100-SXM4-40GB |
|
- **CPU Model**: Intel(R) Xeon(R) CPU @ 2.20GHz |
|
- **RAM Size**: 83.48 GB |
|
|
|
### Framework Versions |
|
- Python: 3.10.16 |
|
- Sentence Transformers: 4.1.0 |
|
- Transformers: 4.48.3 |
|
- PyTorch: 2.6.0+cu126 |
|
- Accelerate: 1.3.0 |
|
- Datasets: 3.5.1 |
|
- Tokenizers: 0.21.0 |
|
|
|
## Citation |
|
|
|
### BibTeX |
|
|
|
#### Sentence Transformers |
|
```bibtex |
|
@inproceedings{reimers-2019-sentence-bert, |
|
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", |
|
author = "Reimers, Nils and Gurevych, Iryna", |
|
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", |
|
month = "11", |
|
year = "2019", |
|
publisher = "Association for Computational Linguistics", |
|
url = "https://arxiv.org/abs/1908.10084", |
|
} |
|
``` |
|
|
|
#### CachedMultipleNegativesRankingLoss |
|
```bibtex |
|
@misc{gao2021scaling, |
|
title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup}, |
|
author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan}, |
|
year={2021}, |
|
eprint={2101.06983}, |
|
archivePrefix={arXiv}, |
|
primaryClass={cs.LG} |
|
} |
|
``` |
|
|
|
<!-- |
|
## Glossary |
|
|
|
*Clearly define terms in order to be accessible across audiences.* |
|
--> |
|
|
|
<!-- |
|
## Model Card Authors |
|
|
|
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* |
|
--> |
|
|
|
<!-- |
|
## Model Card Contact |
|
|
|
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* |
|
--> |