Foxtr0t/Ministral-3-14B-Reasoning-2512-mlx-mxfp8
This model was converted to MLX format from mistralai/Ministral-3-14B-Reasoning-2512 using mlx-vlm version 0.3.11.
Refer to the original model card for more details on the model.
Parameter fix
LM Studio currently cannot load this mxfp8 model properly. A PR to fix this will be submitted soon.
Use with mlx
```bash
pip install -U mlx-vlm
python -m mlx_vlm.generate --model Foxtr0t/Ministral-3-14B-Reasoning-2512-mlx-mxfp8 --max-tokens 100 --temperature 0.0 --prompt "Describe this image." --image <path_to_image>
```
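For scripted use, the CLI invocation above can be driven from Python via `subprocess`. This is a hypothetical convenience sketch, not part of mlx-vlm itself: the helper name `build_generate_cmd` and the example image path are assumptions, while the flags mirror the command shown above.

```python
import subprocess
import sys

def build_generate_cmd(model, prompt, image, max_tokens=100, temperature=0.0):
    """Assemble the mlx_vlm.generate CLI invocation as an argument list.

    Hypothetical helper; flag names follow the mlx-vlm CLI example above.
    """
    return [
        sys.executable, "-m", "mlx_vlm.generate",
        "--model", model,
        "--max-tokens", str(max_tokens),
        "--temperature", str(temperature),
        "--prompt", prompt,
        "--image", image,
    ]

cmd = build_generate_cmd(
    "Foxtr0t/Ministral-3-14B-Reasoning-2512-mlx-mxfp8",
    "Describe this image.",
    "photo.jpg",  # assumed example path; replace with a real image
)
print(cmd)
# To actually run generation (requires mlx-vlm installed and the model downloaded):
# subprocess.run(cmd, check=True)
```

Building the argument list separately keeps the prompt and image path safely quoted without shell escaping.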
Model size: 4B params
Tensor types: U8, U32, BF16
Model tree for Foxtr0t/Ministral-3-14B-Reasoning-2512-mlx-mxfp8
Base model: mistralai/Ministral-3-14B-Base-2512