What metrics / eval framework do you use?

#20
by wamreyaz - opened

Hey,

Congrats on the amazing release.

I had a question about the evals:

[Screenshot: benchmark results table from the release]

What metrics do you actually report in this table? In particular, for Referring Expressions, is it accuracy as implemented in lmms-eval? If so, what threshold?

If not, are you using mAP, which is the standard metric for object detection?

Looking forward to your reply!

moondream org
edited Oct 6

Thanks!

For the referring expression benchmarks, we use the standard accuracy @ 50% IoU. For the rest, we follow the implementations in VLMEvalKit (https://github.com/open-compass/VLMEvalKit).
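
For concreteness, here is a minimal sketch of what accuracy @ 50% IoU means for a grounding benchmark. The box format, function names, and example values are illustrative assumptions, not the actual moondream or VLMEvalKit evaluation code:

```python
# Minimal sketch of accuracy @ 0.5 IoU for referring expression grounding.
# Assumes predicted and ground-truth boxes are (x1, y1, x2, y2) in the same
# coordinate space. Illustrative only; not the actual evaluation code.

def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def accuracy_at_iou(preds, gts, threshold=0.5):
    """Fraction of predictions whose IoU with the ground-truth box >= threshold."""
    hits = sum(iou(p, g) >= threshold for p, g in zip(preds, gts))
    return hits / len(gts)

# Example: one prediction overlaps its ground truth well, the other misses -> 0.5
preds = [(10, 10, 50, 50), (0, 0, 20, 20)]
gts = [(12, 11, 52, 49), (100, 100, 150, 150)]
print(accuracy_at_iou(preds, gts))  # 0.5
```

So a prediction counts as correct only if its box overlaps the ground truth with IoU of at least 0.5, and the reported number is the fraction of correct predictions, not mAP.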
