The selection of training samples for triplet-network-based deep metric learning is referred to as triplet mining. It has recently been found that the selection of these triplets has a crucial effect on the performance of the model. Based on the definition of the triplet loss, different sampling techniques are distinguished: random selection is referred to as easy mining due to its low computational cost, while selecting samples whose resulting loss is non-zero is referred to as hard mining. The maximal loss comes from selecting the hardest negative, which is also the computationally most expensive technique. In this paper, a combined method for negative sampling is presented and evaluated. For each training step, an anchor is selected randomly and easy positive mining is applied, followed by choosing among a hardest, a hard, or a semi-hard negative sampling policy. Results show that the method performs well when no initial pretraining of the model parameters is done, and its discriminative ability is similar to that of models where multi-class pretraining was applied.
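The negative selection policies mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Euclidean distance, the `margin` value, and the random fallback for empty candidate sets are all assumptions made for the example.

```python
import numpy as np

def select_negative(anchor, positive, negatives, margin=0.2, policy="semi-hard"):
    """Pick a negative embedding for an (anchor, positive) pair.

    Hypothetical sketch: the metric and margin are assumptions,
    not parameters taken from the paper.
    """
    d_ap = np.linalg.norm(anchor - positive)           # anchor-positive distance
    d_an = np.linalg.norm(negatives - anchor, axis=1)  # anchor-negative distances
    losses = d_ap - d_an + margin                      # per-negative triplet loss

    if policy == "hardest":
        # the negative closest to the anchor yields the maximal loss
        return negatives[np.argmin(d_an)]
    if policy == "hard":
        # any negative producing a non-zero loss
        idx = np.where(losses > 0)[0]
    elif policy == "semi-hard":
        # farther from the anchor than the positive, but still within the margin
        idx = np.where((d_an > d_ap) & (losses > 0))[0]
    else:
        raise ValueError(f"unknown policy: {policy}")
    if idx.size == 0:
        # no candidate under this policy: fall back to a random (easy) negative
        return negatives[np.random.randint(len(negatives))]
    return negatives[np.random.choice(idx)]
```

A combined strategy in the spirit of the paper would then pick one of the three policies per training step and call `select_negative` with it.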
- Combining Negative Selection Techniques for Triplet Mining in Deep Metric Learning