Instructions to use joy2000/mistral_instruct_generation with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- PEFT
How to use joy2000/mistral_instruct_generation with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model first, then apply the adapter weights on top of it.
base_model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1")
model = PeftModel.from_pretrained(base_model, "joy2000/mistral_instruct_generation")
```
- Notebooks
- Google Colab
- Kaggle