Abstract— Adapting detectors to new datasets is needed in scenarios where a user has a specific dataset that contains novel classes or is recorded in a setting where a pretrained detector fails. While detectors based on Convolutional Neural Networks (CNNs) are state-of-the-art and nowadays publicly available, they generalize poorly when applied to datasets that differ notably from the one they were trained on. Finetuning the detector is only possible if the dataset is large enough not to destroy the underlying feature representation. We propose a method in which only a few prototypes are labeled for training in a semi-supervised manner. In particular, we separate the detection step from the classification step to avoid impairing the bounding box proposal generation. Our trained prototype classification network provides labels to automatically source a large dataset containing 20 to 30 times more samples without further supervision, which we then use to train a more powerful network. We evaluate our method on a private vehicle dataset with six classes and show that, on a previously unseen recording site, we gain a 9% accuracy increase at the same precision and recall levels. We further show that finetuning with as few as 25 labeled samples per class doubles the accuracy compared to directly using pretrained features for nearest neighbor classification.
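The core idea of the abstract — building class prototypes from a handful of labeled samples and using nearest-prototype assignment to pseudo-label a much larger unlabeled pool — can be sketched as follows. This is a minimal illustration, not the authors' implementation; the feature vectors stand in for embeddings extracted from detector proposals, and all names here are hypothetical.

```python
import numpy as np

def class_prototypes(feats, labels):
    # Mean embedding per class; feats: (N, D), labels: (N,)
    classes = np.unique(labels)
    protos = np.stack([feats[labels == c].mean(axis=0) for c in classes])
    return classes, protos

def pseudo_label(feats, classes, protos):
    # Assign each unlabeled sample the class of its nearest prototype
    # (Euclidean distance in feature space).
    dists = np.linalg.norm(feats[:, None, :] - protos[None, :, :], axis=2)
    return classes[dists.argmin(axis=1)]

# Toy example: two well-separated classes in a 2-D feature space,
# two labeled samples per class (the "few prototypes").
labeled = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0], [5.1, 4.9]])
lab = np.array([0, 0, 1, 1])
classes, protos = class_prototypes(labeled, lab)

# Unlabeled pool gets pseudo-labels, which would then be used to
# train a stronger classifier on the automatically sourced dataset.
unlabeled = np.array([[0.1, 0.2], [4.8, 5.2]])
print(pseudo_label(unlabeled, classes, protos))  # -> [0 1]
```

In the paper's setting, the features would come from a frozen pretrained backbone, so only the lightweight prototype classifier needs the few labeled samples.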
|Title||2019 IEEE Intelligent Transportation Systems Conference, ITSC 2019|
|Publisher||Institute of Electrical and Electronics Engineers|
|Publication status||Published - Nov 2019|
|Event||22nd IEEE International Conference on Intelligent Transportation Systems - Auckland, New Zealand|
Duration: 27 Oct 2019 → 30 Oct 2019
|Conference||22nd IEEE International Conference on Intelligent Transportation Systems|
|Period||27/10/19 → 30/10/19|