# finbert-ft-icar-a-v0.11

This model is a fine-tuned version of [ProsusAI/finbert](https://huggingface.co/ProsusAI/finbert) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.8809
- Accuracy: 0.8847
- Precision: 0.8742
- Recall: 0.8438
- F1: 0.8566
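
The checkpoint can be loaded like any Hugging Face sequence-classification model. The sketch below is a minimal inference example; it assumes the Hub repository ID `abdiharyadi/finbert-ft-icar-a-v0.11` and that the label names for this fine-tune are stored in the repository config (they are not documented on this card).

```python
from transformers import pipeline

# Minimal inference sketch (assumed repository ID; the label set of this
# fine-tune comes from the model config and is not documented on the card).
classifier = pipeline(
    "text-classification",
    model="abdiharyadi/finbert-ft-icar-a-v0.11",
)

print(classifier("Quarterly revenue rose 12% year over year, beating estimates."))
```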
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hypothetical `TrainingArguments` sketch reproducing them follows the list):
- learning_rate: 3e-06
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 3
- total_train_batch_size: 3
- optimizer: AdamW (torch implementation, `adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
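
The following is a hypothetical reconstruction of `TrainingArguments` matching the values above; the actual training script is not included with this card.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="finbert-ft-icar-a-v0.11",
    learning_rate=3e-6,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=3,  # effective train batch size: 1 x 3 = 3
    num_train_epochs=50,
    lr_scheduler_type="linear",
    seed=42,
    optim="adamw_torch",            # AdamW with betas=(0.9, 0.999), eps=1e-08
    eval_strategy="epoch",          # evaluation is reported per epoch below
    logging_strategy="epoch",
)
```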
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|---|---|---|---|---|---|---|---|
| 2.2028 | 1.0 | 704 | 0.7380 | 0.7902 | 0.8014 | 0.7093 | 0.7293 |
| 1.6961 | 2.0 | 1408 | 0.8784 | 0.8280 | 0.8359 | 0.7532 | 0.7712 |
| 1.5125 | 3.0 | 2112 | 0.9310 | 0.8318 | 0.8383 | 0.7574 | 0.7740 |
| 1.1895 | 4.0 | 2816 | 0.9300 | 0.8374 | 0.8490 | 0.7531 | 0.7737 |
| 0.9869 | 5.0 | 3520 | 0.8736 | 0.8582 | 0.8511 | 0.8102 | 0.8255 |
| 0.719 | 6.0 | 4224 | 0.8385 | 0.8620 | 0.8524 | 0.8140 | 0.8292 |
| 0.5451 | 7.0 | 4928 | 0.8900 | 0.8752 | 0.8793 | 0.8253 | 0.8451 |
| 0.4815 | 8.0 | 5632 | 0.9242 | 0.8715 | 0.8577 | 0.8359 | 0.8452 |
| 0.3935 | 9.0 | 6336 | 0.8809 | 0.8847 | 0.8742 | 0.8438 | 0.8566 |
| 0.2789 | 10.0 | 7040 | 0.9693 | 0.8696 | 0.8542 | 0.8227 | 0.8355 |
| 0.1994 | 11.0 | 7744 | 1.0208 | 0.8733 | 0.8620 | 0.8396 | 0.8492 |
| 0.2142 | 12.0 | 8448 | 1.0157 | 0.8771 | 0.8714 | 0.8307 | 0.8463 |
| 0.1583 | 13.0 | 9152 | 1.0452 | 0.8733 | 0.8649 | 0.8276 | 0.8420 |
| 0.1566 | 14.0 | 9856 | 1.0171 | 0.8790 | 0.8766 | 0.8333 | 0.8500 |
| 0.1489 | 15.0 | 10560 | 1.0512 | 0.8809 | 0.8843 | 0.8271 | 0.8465 |
| 0.1127 | 16.0 | 11264 | 1.0802 | 0.8677 | 0.8488 | 0.8337 | 0.8404 |
| 0.0828 | 17.0 | 11968 | 1.0955 | 0.8752 | 0.8622 | 0.8338 | 0.8459 |
| 0.0833 | 18.0 | 12672 | 1.1333 | 0.8752 | 0.8531 | 0.8415 | 0.8467 |
| 0.0655 | 19.0 | 13376 | 1.1919 | 0.8771 | 0.8811 | 0.8283 | 0.8468 |
| 0.0799 | 20.0 | 14080 | 1.1709 | 0.8828 | 0.8799 | 0.8377 | 0.8543 |
| 0.0778 | 21.0 | 14784 | 1.2235 | 0.8733 | 0.8633 | 0.8283 | 0.8424 |
| 0.0431 | 22.0 | 15488 | 1.3049 | 0.8677 | 0.8607 | 0.8203 | 0.8364 |
| 0.038 | 23.0 | 16192 | 1.2969 | 0.8790 | 0.8760 | 0.8288 | 0.8454 |
| 0.081 | 24.0 | 16896 | 1.3389 | 0.8790 | 0.8738 | 0.8301 | 0.8462 |
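
Although 50 epochs were configured, the log stops at epoch 24, and the headline metrics above match the epoch-9 row; this is consistent with early stopping and restoring the best checkpoint, though the card does not state this explicitly. Per-epoch metrics of this shape are typically produced by a `compute_metrics` callback passed to the `Trainer`. The sketch below is a hypothetical reconstruction that assumes macro averaging for precision, recall, and F1, which the card does not confirm.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Return metrics of the same shape as the table above.

    The averaging mode is an assumption (macro); the card does not state
    how precision, recall, and F1 were aggregated across classes.
    """
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```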
### Framework versions
- Transformers 4.52.4
- Pytorch 2.6.0+cu124
- Datasets 4.4.1
- Tokenizers 0.21.2