# Muaalem Model: TorchScript versions
This repository contains the exported TorchScript versions of the obadx/muaalem-model-v3_2 model.
## Files
- `model_fp32.pt`: full precision (float32)
- `model_fp16.pt`: half precision (float16)
- `model_bf16.pt`: bfloat16 precision
## Model Output
The model returns a tuple whose first element is a dictionary of logits for each CTC head. The dictionary keys correspond to the heads defined in the configuration: ['ghonna', 'hams_or_jahr', 'istitala', 'itbaq', 'phonemes', 'qalqla', 'safeer', 'shidda_or_rakhawa', 'tafashie', 'tafkheem_or_taqeeq', 'tikraar']
For example, to get phoneme logits: `output[0]['phonemes']`.
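Since each head is a CTC head, its logits are typically decoded greedily: take the per-frame argmax, collapse consecutive repeats, and drop blanks. The sketch below illustrates this on toy logits; the blank id of 0 and the `(time, vocab)` layout are assumptions for illustration, not details confirmed by this repo.

```python
# Hedged sketch: greedy CTC decoding of one head's logits.
# Assumes frame_logits is a (time, vocab) nested list and blank id is 0.

def ctc_greedy_decode(frame_logits, blank_id=0):
    """Per-frame argmax, then collapse repeats and remove blanks."""
    ids = [max(range(len(frame)), key=frame.__getitem__) for frame in frame_logits]
    decoded, prev = [], None
    for i in ids:
        if i != prev and i != blank_id:
            decoded.append(i)
        prev = i
    return decoded

# Toy logits for 5 frames over a 3-symbol vocab (id 0 = blank).
toy = [
    [0.9, 0.05, 0.05],  # blank
    [0.1, 0.8, 0.1],    # symbol 1
    [0.1, 0.8, 0.1],    # symbol 1 again (collapsed as a repeat)
    [0.9, 0.05, 0.05],  # blank
    [0.1, 0.1, 0.8],    # symbol 2
]
print(ctc_greedy_decode(toy))  # -> [1, 2]
```

In practice you would run the same collapse over `output[0]['phonemes'].argmax(-1)` for each item in the batch, then map ids back to symbols with the processor's vocabulary.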
## Loading the model in your code

### Automatic precision selection based on GPU
```python
import torch
from huggingface_hub import hf_hub_download
from transformers import AutoFeatureExtractor

REPO_ID = "obadx/muaalem-v3_2-torchscript"

def load_optimized_model(repo_id=REPO_ID):
    if torch.cuda.is_available():
        major, minor = torch.cuda.get_device_capability()
        if major >= 8:  # Ampere or newer: prefer bfloat16
            try:
                model_file = hf_hub_download(repo_id=repo_id, filename="model_bf16.pt")
                dtype = torch.bfloat16
            except Exception:  # fall back to fp16 if the bf16 file is unavailable
                model_file = hf_hub_download(repo_id=repo_id, filename="model_fp16.pt")
                dtype = torch.float16
        else:  # older GPUs: half precision
            model_file = hf_hub_download(repo_id=repo_id, filename="model_fp16.pt")
            dtype = torch.float16
    else:  # CPU: full precision
        model_file = hf_hub_download(repo_id=repo_id, filename="model_fp32.pt")
        dtype = torch.float32

    model = torch.jit.load(model_file)
    model = model.to(dtype)
    model.eval()
    return model, dtype

# Load the processor from the same repo
processor = AutoFeatureExtractor.from_pretrained(REPO_ID, subfolder="processor")

model, dtype = load_optimized_model()
print(f"Loaded model with dtype: {dtype}")
```
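The precision-selection logic above can be isolated as a small pure function, which makes it easy to test without a GPU. This is a sketch: the `(major, minor)` tuples mirror what `torch.cuda.get_device_capability()` returns, and the filenames match the Files list above.

```python
# Hedged sketch: the file/dtype choice from load_optimized_model as a pure function.

def pick_dtype(cuda_available, capability=None):
    """Return (filename, dtype name): bf16 on Ampere+ (capability >= (8, 0)),
    fp16 on older GPUs, fp32 on CPU."""
    if not cuda_available:
        return "model_fp32.pt", "float32"
    major, _minor = capability
    if major >= 8:  # Ampere or newer
        return "model_bf16.pt", "bfloat16"
    return "model_fp16.pt", "float16"

print(pick_dtype(True, (8, 6)))  # -> ('model_bf16.pt', 'bfloat16')
print(pick_dtype(True, (7, 5)))  # -> ('model_fp16.pt', 'float16')
print(pick_dtype(False))         # -> ('model_fp32.pt', 'float32')
```

Note that the full loader also falls back from bf16 to fp16 when the bf16 download fails; this helper only captures the happy path.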