Part of the "Runs On My 64GB Mac" collection: The world is your oyster—for small pearls.
Gee, people like this model. I'll make a larger quant.

Keep in mind, this is the brainstormed old ERNIE, not the new ERNIE. MLX support for the new ERNIE would be nice, but there isn't any yet. So, this is it.

-G
This model, `ERNIE-4.5-36B-A3B-Thinking-Brainstorm20x-qx64-hi-mlx`, was converted to MLX format from `DavidAU/ERNIE-4.5-36B-A3B-Thinking-Brainstorm20x` using mlx-lm version 0.27.1.
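For reference, conversions like this are typically produced with the `mlx_lm.convert` tool. The exact recipe behind the qx64-hi mixed quant is not documented here, so the flags below are illustrative assumptions, not the command actually used:

```shell
# Illustrative sketch only. mlx_lm.convert downloads the source HF model and
# writes an MLX-format copy; the plain -q flag below is an assumption, since
# qx64-hi denotes a custom mixed-precision quant whose recipe isn't published.
pip install mlx-lm
mlx_lm.convert \
    --hf-path DavidAU/ERNIE-4.5-36B-A3B-Thinking-Brainstorm20x \
    --mlx-path ERNIE-4.5-36B-A3B-Thinking-Brainstorm20x-qx64-hi-mlx \
    -q
```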
```shell
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("ERNIE-4.5-36B-A3B-Thinking-Brainstorm20x-qx64-hi-mlx")

prompt = "hello"

# Wrap the raw prompt in the model's chat format, if one is defined.
if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
Base model: `baidu/ERNIE-4.5-21B-A3B-Thinking`