Built with Axolotl

See axolotl config

axolotl version: 0.12.2

```yaml
adapter: qlora
base_model: Qwen/Qwen2.5-7B-Instruct
bf16: true
chat_template: qwen_25
datasets:
- ds_type: json
  field_messages: messages
  message_property_mappings:
    content: content
    role: role
  path: iot_train_chat.json
  split: train
  type: chat_template
embeddings_skip_upcast: true
flash_attention: true
fp16: false
gradient_accumulation_steps: 1
gradient_checkpointing: true
learning_rate: 0.0001
load_in_4bit: true
logging_steps: 1
lora_alpha: 64
lora_dropout: 0.05
lora_r: 32
lora_target_modules:
- q_proj
- k_proj
- v_proj
- o_proj
- gate_proj
- up_proj
- down_proj
micro_batch_size: 8
num_epochs: 1
optimizer: paged_adamw_8bit
output_dir: ./outputs/qwen-iot-lora
pad_to_sequence_len: true
sample_packing: true
save_steps: 50
save_strategy: steps
sequence_len: 4096
special_tokens:
  pad_token: <|endoftext|>
tokenizer_type: AutoTokenizer
trust_remote_code: true
warmup_steps: 10
xformers_attention: false
```
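
The `datasets` entry above feeds `iot_train_chat.json` through axolotl's `chat_template` loader, mapping the `messages` field (with `role`/`content` keys) straight onto the Qwen2.5 chat template. A minimal sketch of what one record is assumed to look like; the IoT contents are invented placeholders, not drawn from the actual dataset:

```python
import json

# One hypothetical record in iot_train_chat.json, matching the layout implied
# by field_messages: messages and message_property_mappings above.
record = {
    "messages": [
        {"role": "user", "content": "Turn off the living room lights."},  # placeholder
        {"role": "assistant", "content": "Done. The living room lights are now off."},  # placeholder
    ]
}

# Emitted as one line of a JSON-lines file (or one element of a JSON array),
# either of which the ds_type: json loader accepts.
print(json.dumps(record, ensure_ascii=False))
```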

outputs/qwen-iot-lora

This model is a fine-tuned version of Qwen/Qwen2.5-7B-Instruct on the iot_train_chat.json dataset.
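
A minimal inference sketch with Transformers and PEFT, assuming the hub id USato/qwen-iot-lora-instruct from the model tree below (the prompt is a placeholder):

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "Qwen/Qwen2.5-7B-Instruct"
adapter_id = "USato/qwen-iot-lora-instruct"

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(model, adapter_id)  # attach the LoRA adapter

messages = [{"role": "user", "content": "Turn off the living room lights."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Loading the base model in 4-bit (as during QLoRA training) also works and saves memory; bf16 is used here only to keep the sketch short.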

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: paged_adamw_8bit with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
  • lr_scheduler_type: cosine (see the sketch after this list)
  • lr_scheduler_warmup_steps: 10
  • training_steps: 53
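
With micro_batch_size 8 and no gradient accumulation, the effective batch size equals the listed train_batch_size of 8. The cosine-with-warmup curve over the 53 steps can be reproduced with the stock Transformers scheduler helper; a minimal sketch, using a throwaway optimizer purely to drive the schedule:

```python
import torch
from transformers import get_cosine_schedule_with_warmup

param = torch.nn.Parameter(torch.zeros(1))     # dummy parameter
optimizer = torch.optim.SGD([param], lr=1e-4)  # learning_rate: 0.0001

scheduler = get_cosine_schedule_with_warmup(
    optimizer, num_warmup_steps=10, num_training_steps=53
)

for step in range(1, 54):
    optimizer.step()
    scheduler.step()
    if step in (1, 10, 27, 53):  # during warmup, warmup end, midpoint, final step
        print(f"step {step:2d}: lr = {scheduler.get_last_lr()[0]:.2e}")
```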

Training results

Framework versions

  • PEFT 0.17.0
  • Transformers 4.55.2
  • Pytorch 2.5.1+cu124
  • Datasets 4.0.0
  • Tokenizers 0.21.4
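
To serve the model without a runtime PEFT dependency, the adapter can be folded into the base weights. A minimal sketch, assuming the same ids as above; the base is loaded in bf16 rather than 4-bit, since merging into quantized weights is lossy, and the output path is hypothetical:

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen2.5-7B-Instruct", torch_dtype=torch.bfloat16
)
merged = PeftModel.from_pretrained(base, "USato/qwen-iot-lora-instruct").merge_and_unload()

merged.save_pretrained("./qwen-iot-lora-merged")  # hypothetical output directory
AutoTokenizer.from_pretrained("Qwen/Qwen2.5-7B-Instruct").save_pretrained("./qwen-iot-lora-merged")
```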
Model tree for USato/qwen-iot-lora-instruct

  • Base model: Qwen/Qwen2.5-7B
  • This model: LoRA adapter