Update README.md
README.md CHANGED
@@ -30,6 +30,14 @@ hf download huihui-ai/Huihui-gpt-oss-20b-BF16-abliterated-v2 --local-dir ./huihu
python convert_oai_mxfp4_weight_only.py --model_path huihui-ai/Huihui-gpt-oss-20b-BF16-abliterated-v2/ --output_path huihui-ai/Huihui-gpt-oss-20b-mxfp4-abliterated-v2/
```
+## ollama
+Ollama requires the latest version: [v0.11.8](https://github.com/ollama/ollama/releases/tag/v0.11.8)
+
+You can use [huihui_ai/gpt-oss-abliterated:20b-mxfp4](https://ollama.com/huihui_ai/gpt-oss-abliterated:20b-mxfp4) directly:
+
+```
+ollama run huihui_ai/gpt-oss-abliterated:20b-mxfp4
+```
+
## Usage
You can use this model in your applications by loading it with Hugging Face's `transformers` library:
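The hunk's trailing context stops at the colon above, before the README's own `transformers` snippet. A minimal sketch of what loading the model typically looks like with that library — the repo id is taken from the commands earlier in the diff, while the dtype, device placement, and generation settings are assumptions and may need adjusting:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id from the download/conversion commands above (BF16 weights).
model_id = "huihui-ai/Huihui-gpt-oss-20b-BF16-abliterated-v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # assumption: let transformers pick the checkpoint dtype
    device_map="auto",    # assumption: place layers on available GPUs automatically
)

# Build a chat prompt and generate a short reply.
messages = [{"role": "user", "content": "Hello, who are you?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```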