Is your feature request related to a problem? Please describe.
The `torchmetrics.MeanAveragePrecision` module now supports the `faster-coco-eval` backend, which is significantly faster than the default `pycocotools` backend.
Performance summary for 5000 images:

| Metric | faster-coco-eval (s) | pycocotools (s) | Speedup (×) |
|---|---|---|---|
| bbox | 5.812 | 22.72 | 3.91× |
| segm | 7.413 | 24.434 | 3.30× |
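For reference, recent torchmetrics releases already let you select the backend on the metric itself via the `backend` argument; a minimal usage sketch (requires `faster-coco-eval` to be installed):

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# Opt into the faster-coco-eval backend instead of the default pycocotools.
metric = MeanAveragePrecision(iou_type="bbox", backend="faster_coco_eval")

preds = [
    {
        "boxes": torch.tensor([[10.0, 10.0, 50.0, 50.0]]),
        "scores": torch.tensor([0.9]),
        "labels": torch.tensor([0]),
    }
]
target = [
    {
        "boxes": torch.tensor([[12.0, 12.0, 48.0, 48.0]]),
        "labels": torch.tensor([0]),
    }
]

metric.update(preds, target)
print(metric.compute()["map"])
```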
Describe the solution you'd like to propose
Support for configurable backends should be added to the relevant recipes, allowing users to choose between `faster-coco-eval` and `pycocotools`, as sketched below.
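One possible shape for this (the config field and helper names below are illustrative, not the actual recipe API): expose a backend option in the recipe's evaluation config and pass it through when the metric is constructed.

```python
from dataclasses import dataclass
from torchmetrics.detection import MeanAveragePrecision


@dataclass
class EvalConfig:
    # Hypothetical recipe option: "pycocotools" keeps today's behaviour,
    # "faster_coco_eval" opts into the faster backend.
    map_backend: str = "pycocotools"
    iou_type: str = "bbox"


def build_map_metric(cfg: EvalConfig) -> MeanAveragePrecision:
    """Construct the mAP metric with the backend chosen in the recipe config."""
    return MeanAveragePrecision(iou_type=cfg.iou_type, backend=cfg.map_backend)


# Example: a recipe opting into the faster backend via its config.
metric = build_map_metric(EvalConfig(map_backend="faster_coco_eval"))
```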
Describe alternatives you've considered
- Continuing with the default `pycocotools` backend despite its slower performance.
- Using a custom integration for `faster-coco-eval` outside of recipes.
Additional Context