Models
To view all of the available models, check out mlx-lm on Hugging Face. Our MLX server supports all of the models published by the mlx-community.
Popular Models
Here's a list of some of the most popular models and their usage:
Nous Hermes 2 Mistral 7B DPO (Nous Research)
from mlxserver import MLXServer
server = MLXServer(model="mlx-community/Nous-Hermes-2-Mistral-7B-DPO-4bit-MLX")
Mixtral 8x7B Instruct
from mlxserver import MLXServer
server = MLXServer(model="mlx-community/Mixtral-8x7B-Instruct-v0.1")
Gemma 7B Instruct (quantized)
from mlxserver import MLXServer
server = MLXServer(model="mlx-community/quantized-gemma-7b-it")
CodeLlama 7B
from mlxserver import MLXServer
server = MLXServer(model="mlx-community/CodeLlama-7b-mlx")
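Once a server is running, you can send prompts to it over HTTP. The snippet below is a minimal sketch that assumes the server listens on localhost port 5000 and exposes a /generate endpoint accepting a prompt query parameter with streamed text output; adjust the host, port, and parameters to match your setup.
import requests

# Assumed defaults: host localhost, port 5000, a /generate endpoint with a
# `prompt` query parameter, and streamed text output. Adjust as needed.
response = requests.get(
    "http://localhost:5000/generate",
    params={"prompt": "Write a short poem about the ocean"},
    stream=True,
)
response.raise_for_status()

# Print the generated text as it streams back from the server.
for chunk in response.iter_content(chunk_size=None, decode_unicode=True):
    print(chunk, end="", flush=True)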