---
language:
  - code
license: other
tags:
  - code
  - mlx
inference: false
license_name: mnpl
license_link: https://mistral.ai/licences/MNPL-0.1.md
---

# mlx-community/Codestral-22B-v0.1-4bit

The model `mlx-community/Codestral-22B-v0.1-4bit` was converted to MLX format from `bullerwins/Codestral-22B-v0.1-hf` using mlx-lm version 0.14.0.

## Use with mlx

```bash
pip install mlx-lm
```
```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Codestral-22B-v0.1-4bit")
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
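Codestral-22B-v0.1 is an instruction-tuned Mistral model, so chat-style prompts are typically wrapped in `[INST] … [/INST]` tags rather than passed as raw text. A minimal sketch of that formatting (the `format_instruct` helper is hypothetical and shown for illustration only; it is not part of mlx-lm, and the tokenizer's chat template can handle this for you):

```python
# Sketch: wrap a raw prompt in Mistral-style instruct tags, which the
# instruction-tuned Codestral checkpoints expect. `format_instruct` is a
# hypothetical helper for illustration, not an mlx-lm API.
def format_instruct(prompt: str) -> str:
    return f"[INST] {prompt} [/INST]"

prompt = format_instruct("Write a Python function that reverses a string.")
# pass `prompt` to generate(...) as in the example above
```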