OpenHermes2 13B 32K

It's a bit language-drunk at the moment; I'm going to sober it up soon. As a dyslexic, I know kin when I see it 🤡 It certainly won't help you with your spelling! I'll be fine-tuning it back up again soon, so its ickle AI flamingo legs won't bow and wobble like it has to pee anymore. Or, in nerdy corporate speak: context window extended to 32K via linear RoPE scaling. Requires a post-RoPE stabilization finetune for coherence.
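The linear RoPE scaling mentioned above can be sketched as follows: position indices are divided by a scaling factor before the rotary angles are computed, so a 32K-token sequence lands inside the angle range the model saw during its original 4K-context pretraining. This is an illustrative NumPy sketch of the standard technique, not this model's actual training code; the factor of 8 assumes a 4096 → 32768 extension.

```python
import numpy as np

def rope_freqs(dim, base=10000.0):
    # standard RoPE inverse frequencies for an even head dimension
    return 1.0 / (base ** (np.arange(0, dim, 2) / dim))

def rope_angles(positions, dim, factor=1.0):
    # linear RoPE scaling: compress position ids by `factor`, so positions up
    # to 32768 map into the original 4096-position range when factor = 8
    pos = np.asarray(positions, dtype=np.float64) / factor
    return np.outer(pos, rope_freqs(dim))

# position 32767 with factor 8 gets the same angles as position 4095.875 unscaled
a = rope_angles([32767], 64, factor=8.0)
b = rope_angles([4095.875], 64, factor=1.0)
assert np.allclose(a, b)
```

Because the compressed positions are fractional values the base model never saw, outputs degrade until a short finetune re-anchors the model, which is the "post-RoPE stabilization finetune" referred to above.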

Safetensors · Model size: 13B params · Tensor type: BF16
Model tree for Babsie/OpenHermes2-13B-32K
Finetuned (12): this model
Quantizations: 2 models