I've been training on a relatively small dataset starting from COCO-pretrained weights (dataset specs: about 5k images, 3 classes with approx. 4000, 3000, and 24000 bounding boxes per class). I focused on object detection and used lr = 0.0003125 for 140 epochs, with learning-rate drops at epochs 80, 100, and 125. The results for no-augmentation (N.A.), flip, and multi-scale testing are wildly different: AP50 is about 0.30-0.40 for N.A., 0.60-0.70 with flip, and around 0.8-0.9 with multi-scale testing. Any idea where this could stem from? The same setup on a comparable dataset behaves differently: there is an improvement going from N.A. to flip and multi-scale, but nothing nearly as drastic.
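For reference, this is a minimal sketch of how I understand flip + multi-scale test-time augmentation to be merged (run the detector on rescaled and horizontally flipped copies, map boxes back to the original image, then NMS the union). It is not the repo's actual test code; it assumes a torchvision-style detector returning dicts with "boxes", "scores", "labels", and the helper name `tta_detect` and the scale list are my own illustrative choices.

```python
# Hedged sketch of flip + multi-scale TTA merging, NOT this repo's implementation.
# Assumes a torchvision-style detection model in eval mode that takes a list of
# 3D image tensors and returns [{"boxes", "scores", "labels"}, ...].
import torch
import torch.nn.functional as F
from torchvision.ops import batched_nms


def tta_detect(model, image, scales=(0.75, 1.0, 1.25), flip=True, iou_thr=0.5):
    """Run the detector at several scales (optionally with horizontal flips),
    map all boxes back to original-image coordinates, and merge with class-aware NMS."""
    _, h, w = image.shape
    all_boxes, all_scores, all_labels = [], [], []
    for s in scales:
        scaled = F.interpolate(image[None], scale_factor=s,
                               mode="bilinear", align_corners=False)
        sh, sw = scaled.shape[2], scaled.shape[3]
        variants = [(scaled, False)]
        if flip:
            variants.append((torch.flip(scaled, dims=[3]), True))  # horizontal flip
        for img, flipped in variants:
            out = model([img[0]])[0]
            b = out["boxes"].clone()
            if flipped:
                # undo the horizontal flip in scaled-image coordinates
                b[:, [0, 2]] = sw - b[:, [2, 0]]
            # map back to original-image coordinates
            b[:, [0, 2]] *= w / sw
            b[:, [1, 3]] *= h / sh
            all_boxes.append(b)
            all_scores.append(out["scores"])
            all_labels.append(out["labels"])
    boxes = torch.cat(all_boxes)
    scores = torch.cat(all_scores)
    labels = torch.cat(all_labels)
    keep = batched_nms(boxes, scores, labels, iou_thr)  # per-class NMS over the merged set
    return boxes[keep], scores[keep], labels[keep]
```

If the merging step differs from this (e.g. score averaging or box voting instead of plain NMS), that difference alone could change how much flip/multi-scale testing helps, so please correct me if the actual TTA pipeline works differently.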