Base model: Mistral-7B, pruned down to 4B parameters.
Full fine-tune (FFT) on 2K somewhat 'clean' PIPPA examples.
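
Below is a minimal loading sketch using Hugging Face `transformers`. The repo id `your-username/mistral-4b-pippa` is a placeholder for illustration only, not the actual model id.

```python
# Minimal sketch: load the pruned/fine-tuned model and run a short generation.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/mistral-4b-pippa"  # hypothetical repo id, replace with the real one

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision keeps the 4B model light in memory
    device_map="auto",
)

prompt = "Hello! How are you today?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```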