[DTC18a] SyMIL: MinMax Latent SVM for Weakly Labeled Data
International peer-reviewed journal:
IEEE Transactions on Neural Networks and Learning Systems,
pp. 1-14, 2018 (doi: 10.1109/TNNLS.2018.2820055)
Keywords: Weakly Supervised Learning, Multiple Instance
Learning, Latent SVM, Image Categorization and Pattern Recognition
Abstract:
Designing powerful models able to handle weakly labeled data is a crucial problem in machine learning. In this paper,
we propose a new Multiple Instance Learning (MIL) framework.
Examples are represented as bags of instances, but we depart
from standard MIL assumptions by introducing a symmetric
strategy (SyMIL) that seeks discriminative instances in positive
and negative bags. The idea is to use the instance most distant
from the hyperplane to classify the bag. We provide a theoretical
analysis featuring the generalization properties of our model. We
derive a large margin formulation of our problem, which is cast as
a difference of convex functions, and optimized using CCCP. We
provide a primal version optimized with stochastic sub-gradient
descent and a dual version optimized with one-slack cutting-
plane. Successful experimental results are reported on standard
MIL and weakly-supervised object detection datasets: SyMIL
significantly outperforms competitive methods (mi/MI/Latent-
SVM), and gives very competitive performance compared to
state-of-the-art works. We also analyze the selected instances
of symmetric and asymmetric approaches on weakly-supervised
object detection and text classification tasks. Finally, we show
the complementarity of SyMIL with recent work on learning with
label proportions on standard MIL datasets.
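
The symmetric decision rule described in the abstract (classify a bag by its instance farthest from the separating hyperplane, whether on the positive or negative side) can be sketched as follows. This is a minimal illustration, not the paper's reference implementation: the function and variable names are hypothetical, and the linear parameters (W, b) are assumed to come from a model trained elsewhere.

```python
import numpy as np

def symil_bag_score(W, b, bag):
    """Score a bag by its instance farthest from the hyperplane (sketch).

    W, b : parameters of a linear model assumed to be already trained
    bag  : (n_instances, n_features) array of instance features
    Returns the signed score of the selected instance; the predicted
    bag label is the sign of this value.
    """
    scores = bag @ W + b                  # signed instance scores (proportional to distances)
    selected = np.argmax(np.abs(scores))  # symmetric selection: most distant instance, either side
    return scores[selected]

# Usage sketch on random data (illustration only)
rng = np.random.default_rng(0)
W, b = rng.normal(size=5), 0.0
bag = rng.normal(size=(10, 5))
print(np.sign(symil_bag_score(W, b, bag)))  # predicted bag label in {-1, +1}
```

Unlike the standard asymmetric MIL rule, which scores a bag by its maximum instance score, this symmetric selection lets a strongly negative instance decide a negative bag, which is the departure from standard MIL assumptions that the abstract highlights.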