Eviation committed
Commit 29f903a · verified · 1 Parent(s): 69ca68a

Update README.md

Files changed (1)
  1. README.md +1 -0
README.md CHANGED
@@ -47,6 +47,7 @@ Using [llama.cpp quantize cae9fb4](https://github.com/ggerganov/llama.cpp/commit
  | [flux1-dev-IQ4_NL.gguf](https://huggingface.co/Eviation/flux-imatrix/blob/main/experimental-from-f16-combined/flux1-dev-IQ4_NL.gguf) | IQ4_NL | 6.79GB | TBC | [Example](https://huggingface.co/Eviation/flux-imatrix/blob/main/experimental-from-f16-combined/images/output_test_comb_IQ4_NL_512_25_woman.png) |
  | [flux1-dev-Q4_K_S.gguf](https://huggingface.co/Eviation/flux-imatrix/blob/main/experimental-from-f16-combined/flux1-dev-Q4_K_S.gguf) | Q4_K_S | 6.79GB | TBC | [Example](https://huggingface.co/Eviation/flux-imatrix/blob/main/experimental-from-f16-combined/images/output_test_comb_Q4_K_S_512_25_woman.png) |
  | [flux1-dev-Q4_1.gguf](https://huggingface.co/Eviation/flux-imatrix/blob/main/experimental-from-f16-combined/flux1-dev-Q4_1.gguf) | Q4_1 | 7.53GB | TBC | - |
+ | [flux1-dev-Q5_0.gguf](https://huggingface.co/Eviation/flux-imatrix/blob/main/experimental-from-f16-combined/flux1-dev-Q5_0.gguf) | Q5_0 | 8.27GB | TBC | - |
  | [flux1-dev-Q5_K_S.gguf](https://huggingface.co/Eviation/flux-imatrix/blob/main/experimental-from-f16-combined/flux1-dev-Q5_K_S.gguf) | Q5_K_S | 8.27GB | TBC | [Example](https://huggingface.co/Eviation/flux-imatrix/blob/main/experimental-from-f16-combined/images/output_test_comb_Q5_K_S_512_25_woman.png) |

  ## Observations
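
For a quick sanity check of any of the quants in the table above, the sketch below (a minimal illustration, assuming the `huggingface_hub` and `gguf` Python packages are installed) downloads the newly added Q5_0 file and prints each tensor's quantization type; the repo id and filename are taken from the table, everything else is illustrative.

```python
# Minimal sketch: download the Q5_0 quant listed above and inspect its tensors.
# Assumes the `huggingface_hub` and `gguf` packages; repo_id and filename come
# from the table above, the rest is illustrative.
from huggingface_hub import hf_hub_download
from gguf import GGUFReader

path = hf_hub_download(
    repo_id="Eviation/flux-imatrix",
    filename="experimental-from-f16-combined/flux1-dev-Q5_0.gguf",
)

reader = GGUFReader(path)
for tensor in reader.tensors:
    # tensor_type is a GGMLQuantizationType enum, e.g. Q5_0 for quantized weights.
    print(tensor.name, tensor.tensor_type.name, tensor.shape)
```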