This ports the SSL (self-supervised learning) version of Meta's Omnilingual ASR wav2vec 2.0 release to transformers, using the 1B checkpoint. More details are available on the official repo.

Usage is almost the same as indicated here.
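Since this is an SSL (pre-trained, not fine-tuned) checkpoint, it is intended for extracting frame-level representations rather than transcription. A minimal sketch, assuming the standard transformers Auto classes load this checkpoint; the model id is the one named on this card, and the silent placeholder waveform is only for illustration:

```python
import torch
from transformers import AutoFeatureExtractor, AutoModel

model_id = "ylacombe/omniASR_W2V_1B_SSL"  # checkpoint named on this card

# Assumption: the checkpoint follows the usual wav2vec 2.0 layout in transformers
feature_extractor = AutoFeatureExtractor.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# 16 kHz mono audio; one second of silence as a placeholder input
waveform = torch.zeros(16000)
inputs = feature_extractor(waveform, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state holds the frame-level SSL representations
print(outputs.last_hidden_state.shape)
```

The printed shape is `(batch, frames, hidden_size)`; these hidden states can then be fed to a downstream head (e.g. a CTC layer) for fine-tuning.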

Model size: 1.0B params · Tensor type: F32 · Format: Safetensors
