---
base_model: zai-org/GLM-4.6
tags:
- rust
- Hyperswitch
- LoRA
- CPT
- Fine-Tuned
- Causal-LM
pipeline_tag: text-generation
language:
- en
datasets:
- AdityaNarayan/HyperSwitch-Repo-CPT-Dataset
---
# GLM-4.6-CPT-LoRA-HyperSwitch-v1
A LoRA fine-tuned model based on **zai-org/GLM-4.6**, specialized for the [Hyperswitch](https://github.com/juspay/hyperswitch) Rust codebase. The adapter is tuned to understand payment-processing patterns, Hyperswitch architecture, and idiomatic Rust development practices.
## Model Description
This LoRA adapter was trained on **16,731 samples** extracted from the Hyperswitch codebase to enhance code understanding, explanation, and generation within the payment processing domain.
- **Base Model**: zai-org/GLM-4.6
- **Training Type**: Causal Language Modeling (CLM) with LoRA
- **Domain**: Payment Processing, Rust Development
- **Specialization**: Hyperswitch codebase patterns and architecture
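To try the adapter, a minimal inference sketch using `transformers` and `peft` is shown below. This is an illustrative assumption rather than the original training code; loading flags such as `device_map` and `trust_remote_code` may need adjusting for your environment and hardware.

```python
# Hedged sketch: load the base model and apply this LoRA adapter.
# The prompt and generation settings are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE_ID = "zai-org/GLM-4.6"
ADAPTER_ID = "AdityaNarayan/GLM-4.6-CPT-LoRA-HyperSwitch-v1"

tokenizer = AutoTokenizer.from_pretrained(BASE_ID)
model = AutoModelForCausalLM.from_pretrained(
    BASE_ID,
    torch_dtype=torch.bfloat16,  # matches the training precision listed below
    device_map="auto",           # assumption: shard across available GPUs
    trust_remote_code=True,      # may be unnecessary on recent transformers versions
)
model = PeftModel.from_pretrained(model, ADAPTER_ID)

prompt = "Explain how payment connectors are abstracted in Hyperswitch:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```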
## Training Details
### LoRA Configuration
```yaml
r: 8 # LoRA rank
alpha: 16 # LoRA alpha (2*r)
dropout: 0.05 # LoRA dropout
target_modules:
- "q_proj"
- "k_proj"
- "v_proj"
- "o_proj"
exclude_modules:
- "block_sparse_moe"
- "w1"
- "w2"
- "w3"
- "gate"
```
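For reference, a sketch of the equivalent `peft.LoraConfig` (assumed, not taken from the training repo; the `exclude_modules` argument requires a recent `peft` release):

```python
from peft import LoraConfig

# Hypothetical PEFT equivalent of the YAML configuration above.
lora_config = LoraConfig(
    r=8,                      # LoRA rank
    lora_alpha=16,            # alpha = 2 * r
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    exclude_modules=["block_sparse_moe", "w1", "w2", "w3", "gate"],
    task_type="CAUSAL_LM",
)
```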
### Training Hyperparameters
- **Epochs**: 3
- **Learning Rate**: 2e-4 (cosine schedule)
- **Hardware**: 8 x NVIDIA H200
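A hypothetical `transformers.TrainingArguments` sketch matching the values above; the output path, batch size, and gradient accumulation are illustrative assumptions not stated on this card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="glm-4.6-cpt-lora-hyperswitch",  # assumed output path
    num_train_epochs=3,
    learning_rate=2e-4,
    lr_scheduler_type="cosine",
    bf16=True,                       # bfloat16, per the specs below
    per_device_train_batch_size=1,   # assumption: not stated on this card
    gradient_accumulation_steps=8,   # assumption: not stated on this card
    logging_steps=10,
)
```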
## Technical Specifications
- **Precision**: bfloat16
- **Attention**: Flash Attention 2 for faster inference
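Flash Attention 2 is requested at model load time; a minimal sketch (assumes the `flash-attn` package is installed and the GPU supports it):

```python
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "zai-org/GLM-4.6",
    torch_dtype=torch.bfloat16,
    attn_implementation="flash_attention_2",  # errors if flash-attn is missing
    device_map="auto",
)
```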
## Acknowledgments
- **Zai Team** for the excellent GLM-4.6 base model
- **Hyperswitch Team** for the open-source payment processing platform
- **Hugging Face** for the transformers and PEFT libraries
## Citation
```bibtex
@misc{GLM-4.6-CPT-LoRA-HyperSwitch-v1,
  title={AdityaNarayan/GLM-4.6-CPT-LoRA-HyperSwitch-v1},
  author={Aditya Narayan},
  year={2024},
  publisher={Hugging Face},
  url={https://huggingface.co/AdityaNarayan/GLM-4.6-CPT-LoRA-HyperSwitch-v1}
}
```