Add model card for text encoder
#1
by nielsr (HF Staff)
This PR adds a comprehensive model card for the text encoder model.
Specifically, it:
- Links the model to its corresponding paper, *Should We Still Pretrain Encoders with Masked Language Modeling?*
- Includes the paper's abstract to provide immediate context.
- Adds links to the project page (https://hf.co/MLMvsCLM) and the associated codebase on GitHub (https://github.com/Nicolas-BZRD/EuroBERT).
- Adds essential metadata: `pipeline_tag: feature-extraction`, `library_name: transformers`, and `license: apache-2.0`, making the model discoverable and properly categorized on the Hugging Face Hub.
- Provides a clear Python usage example for feature extraction with the `transformers` library, including considerations for custom architectures (`trust_remote_code`) and the `bfloat16` dtype.
- Includes the correct BibTeX citation for the paper.
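The usage example in the card follows the standard `transformers` feature-extraction pattern. A minimal sketch of that pattern is below; the repo id is a placeholder (this PR does not state the actual model repository), and the mean-pooling step is one common choice for turning token embeddings into sentence vectors, not necessarily the one the card uses:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Placeholder repo id -- substitute the actual model repository on the Hub.
model_id = "your-org/your-text-encoder"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(
    model_id,
    trust_remote_code=True,      # needed when the architecture is custom code
    torch_dtype=torch.bfloat16,  # load weights in bfloat16
)

sentences = ["Feature extraction turns text into dense vectors."]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Mask-aware mean pooling over token embeddings: one vector per sentence.
mask = inputs["attention_mask"].unsqueeze(-1).to(outputs.last_hidden_state.dtype)
embeddings = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
print(embeddings.shape)  # (num_sentences, hidden_size)
```

`trust_remote_code=True` only matters when the model ships its own modeling code on the Hub; for architectures already in `transformers` it can be dropped.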