[Tech Report](https://www.llm360.ai/reports/K2_V2_report.pdf) - [Training Code](https://github.com/llm360/k2v2_train) - [Evaluation Code](https://github.com/llm360/eval360)
[Pretraining Data: TxT360](https://huggingface.co/datasets/LLM360/TxT360) - [Midtraining Data: TxT360-Midas](https://huggingface.co/datasets/LLM360/TxT360-Midas) - [SFT Data: TxT360-3efforts](https://huggingface.co/datasets/LLM360/TxT360-3efforts)
K2-V2 is our most capable fully open model to date, and one of the strongest open-weight models in its class. It uses a 70B-parameter dense transformer architecture and represents the latest advancement in the LLM360 model family.