MiniMax-M2-4bit not supported
When running the example command on my Mac Studio M3 Ultra with 512 GB of memory, I get:
```
(MiniMax-M2) gbobafett@bobafetts-Mac-Studio MiniMax-M2 % uv pip show mlx-lm
Name: mlx-lm
Version: 0.28.3
Location: /Volumes/AI_MODELS_2/MoreModels/MiniMax-M2/.venv/lib/python3.12/site-packages
Requires: jinja2, mlx, numpy, protobuf, pyyaml, transformers
Required-by:
(MiniMax-M2) bobafett@bobafetts-Mac-Studio MiniMax-M2 % python example.py
Fetching 37 files: 100%|███████████████████████████████████████████████| 37/37 [00:00<00:00, 6432.45it/s]
ERROR:root:Model type minimax not supported.
Traceback (most recent call last):
  File "/Volumes/AI_MODELS_2/MoreModels/MiniMax-M2/.venv/lib/python3.12/site-packages/mlx_lm/utils.py", line 68, in _get_classes
    arch = importlib.import_module(f"mlx_lm.models.{model_type}")
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/garryosborne/.local/share/uv/python/cpython-3.12.9-macos-aarch64-none/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1324, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'mlx_lm.models.minimax'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Volumes/AI_MODELS_2/MoreModels/MiniMax-M2/example.py", line 4, in <module>
    model, tokenizer = load("mlx-community/MiniMax-M2-4bit")
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Volumes/AI_MODELS_2/MoreModels/MiniMax-M2/.venv/lib/python3.12/site-packages/mlx_lm/utils.py", line 266, in load
    model, config = load_model(model_path, lazy, model_config=model_config)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Volumes/AI_MODELS_2/MoreModels/MiniMax-M2/.venv/lib/python3.12/site-packages/mlx_lm/utils.py", line 178, in load_model
    model_class, model_args_class = get_model_classes(config=config)
                                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Volumes/AI_MODELS_2/MoreModels/MiniMax-M2/.venv/lib/python3.12/site-packages/mlx_lm/utils.py", line 72, in _get_classes
    raise ValueError(msg)
ValueError: Model type minimax not supported.
```
I got the same issue. I'm assuming it's because the `minimax` model support is coming in the 0.28.4 release, whereas the latest released version available is 0.28.3.
Correct, support is merged, but 0.28.4 has not been released yet.
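Until a release that bundles the `minimax` architecture is installed, a small preflight check can turn the deep traceback into a clear message before calling `load()`. This is just a sketch, not part of mlx-lm's API: `module_available` and `has_model_support` are hypothetical helpers that mirror the `importlib.import_module(f"mlx_lm.models.{model_type}")` lookup shown in the traceback.

```python
from importlib import util


def module_available(name: str) -> bool:
    """True if `name` is importable in the current environment."""
    try:
        # find_spec returns None when the submodule does not exist.
        return util.find_spec(name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package is missing or is not a package.
        return False


def has_model_support(model_type: str) -> bool:
    """Check whether the installed mlx-lm ships an architecture module
    for `model_type` (e.g. 'minimax' -> mlx_lm.models.minimax)."""
    return module_available(f"mlx_lm.models.{model_type}")
```

With mlx-lm 0.28.3, `has_model_support("minimax")` would return False, so a script could print "upgrade mlx-lm" and exit instead of crashing inside `load()`.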