siglip-fashion-5ep

This model is a fine-tuned version of google/siglip-base-patch16-224 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0405

Model description

More information needed

Intended uses & limitations

More information needed
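No usage details are provided, but since the base model is SigLIP, a zero-shot image-classification sketch along the following lines should apply. The image path and the candidate labels are illustrative assumptions, not part of this card; note that SigLIP scores each (image, text) pair with an independent sigmoid rather than a softmax over labels.

```python
# Hedged usage sketch for a SigLIP checkpoint; "product.jpg" and the
# candidate labels below are illustrative assumptions.
import torch
from PIL import Image
from transformers import AutoProcessor, AutoModel

model_id = "turing552/siglip-fashion-5ep"
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

image = Image.open("product.jpg")  # hypothetical local image
labels = ["a photo of a dress", "a photo of a sneaker", "a photo of a handbag"]

# SigLIP was trained with padded text, so padding="max_length" is recommended
inputs = processor(text=labels, images=image, padding="max_length", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Each pair gets an independent sigmoid score in [0, 1]
probs = torch.sigmoid(outputs.logits_per_image)
print({label: round(p.item(), 3) for label, p in zip(labels, probs[0])})
```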

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-06
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 15

Training results

| Training Loss | Epoch   | Step  | Validation Loss |
|:-------------:|:-------:|:-----:|:---------------:|
| 0.0388        | 0.4643  | 500   | 0.0368          |
| 0.0259        | 0.9285  | 1000  | 0.0268          |
| 0.0221        | 1.3928  | 1500  | 0.0243          |
| 0.0209        | 1.8570  | 2000  | 0.0216          |
| 0.0162        | 2.3213  | 2500  | 0.0217          |
| 0.0156        | 2.7855  | 3000  | 0.0209          |
| 0.0139        | 3.2498  | 3500  | 0.0206          |
| 0.0128        | 3.7140  | 4000  | 0.0206          |
| 0.0129        | 4.1783  | 4500  | 0.0222          |
| 0.0119        | 4.6425  | 5000  | 0.0205          |
| 0.0096        | 5.1068  | 5500  | 0.0211          |
| 0.0099        | 5.5710  | 6000  | 0.0223          |
| 0.0101        | 6.0353  | 6500  | 0.0244          |
| 0.0083        | 6.4995  | 7000  | 0.0241          |
| 0.0092        | 6.9638  | 7500  | 0.0228          |
| 0.0090        | 7.4280  | 8000  | 0.0256          |
| 0.0087        | 7.8923  | 8500  | 0.0266          |
| 0.0076        | 8.3565  | 9000  | 0.0283          |
| 0.0071        | 8.8208  | 9500  | 0.0259          |
| 0.0072        | 9.2851  | 10000 | 0.0289          |
| 0.0068        | 9.7493  | 10500 | 0.0297          |
| 0.0064        | 10.2136 | 11000 | 0.0340          |
| 0.0066        | 10.6778 | 11500 | 0.0327          |
| 0.0061        | 11.1421 | 12000 | 0.0367          |
| 0.0055        | 11.6063 | 12500 | 0.0338          |
| 0.0064        | 12.0706 | 13000 | 0.0358          |
| 0.0050        | 12.5348 | 13500 | 0.0369          |
| 0.0053        | 12.9991 | 14000 | 0.0367          |
| 0.0049        | 13.4633 | 14500 | 0.0381          |
| 0.0042        | 13.9276 | 15000 | 0.0400          |
| 0.0043        | 14.3918 | 15500 | 0.0400          |
| 0.0039        | 14.8561 | 16000 | 0.0405          |

Framework versions

  • Transformers 4.57.1
  • Pytorch 2.6.0+cu124
  • Datasets 4.4.1
  • Tokenizers 0.22.1