Update README.md
README.md
CHANGED
@@ -12,7 +12,7 @@ base_model:
 
 [Tech Report](https://www.llm360.ai/reports/K2_V2_report.pdf) - [Training Code](https://github.com/llm360/k2v2_train) - [Evaluation Code](https://github.com/llm360/eval360)
 
-[Pretraining Data: TxT360](https://huggingface.co/datasets/LLM360/TxT360) - [Midtraining Data: TxT360-Midas](https://huggingface.co/datasets/LLM360/TxT360-Midas) - [SFT Data: TxT360-3efforts](https://huggingface.co/datasets/LLM360/TxT360-
+[Pretraining Data: TxT360](https://huggingface.co/datasets/LLM360/TxT360) - [Midtraining Data: TxT360-Midas](https://huggingface.co/datasets/LLM360/TxT360-Midas) - [SFT Data: TxT360-3efforts](https://huggingface.co/datasets/LLM360/TxT360-3efforts)
 
 K2-V2 is our most capable fully open model to date, and one of the strongest open-weight models in its class. It uses a 70B-parameter dense transformer architecture and represents the latest advancement in the LLM360 model family.
 