Description
This repo contains quantized files of Mistral-7B with the LoRA lemonilia/LimaRP-Mistral-7B-v0.1 applied at a weight of 0.75.
All credit goes to lemonilia.
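For reference, a LoRA can be applied to a base model at a reduced weight with PEFT before quantizing. The snippet below is a minimal sketch, not the exact procedure used for this repo; the base model ID (mistralai/Mistral-7B-v0.1), the adapter name, and the output path are assumptions.

```python
# Minimal sketch of applying a LoRA to Mistral-7B at 0.75 weight with PEFT.
# Assumptions: base model is mistralai/Mistral-7B-v0.1; adapter and output
# names are placeholders. This is not necessarily how this repo was produced.
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")

# Load the LimaRP LoRA as a named adapter on top of the base model.
model = PeftModel.from_pretrained(
    base, "lemonilia/LimaRP-Mistral-7B-v0.1", adapter_name="limarp"
)

# Create a copy of the adapter scaled to 0.75 of its original strength.
model.add_weighted_adapter(
    adapters=["limarp"],
    weights=[0.75],
    adapter_name="limarp_075",
    combination_type="linear",
)
model.set_adapter("limarp_075")

# Fold the scaled adapter into the base weights and save the merged model,
# which can then be converted and quantized (e.g. with llama.cpp).
merged = model.merge_and_unload()
merged.save_pretrained("mistral-7b-limarp-0.75")
```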
Available quantizations: 5-bit and 8-bit.
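If the quantized files are in GGUF format, they can be loaded with llama-cpp-python roughly as follows; the filename is a placeholder and the actual files in the repo may use a different naming scheme or format.

```python
# Hypothetical usage with llama-cpp-python, assuming GGUF files and a
# placeholder filename; check the repo for the actual file names.
from llama_cpp import Llama

llm = Llama(
    model_path="mistral-7b-limarp-0.75.Q5_K_M.gguf",  # placeholder name
    n_ctx=4096,  # context window
)

output = llm(
    "Write a short greeting.",
    max_tokens=64,
    temperature=0.8,
)
print(output["choices"][0]["text"])
```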