Kullback-Leibler Divergence Metric Learning

Published in IEEE Transactions on Cybernetics, 2020

Shuyi Ji, Zizhao Zhang, Shihui Ying, Xibin Zhao, Yue Gao. "Kullback-Leibler Divergence Metric Learning". IEEE Transactions on Cybernetics, 2020.

Abstract

The Kullback-Leibler divergence (KLD), which is widely used to measure the similarity between two distributions, plays an important role in many applications. In this article, we address the KLD metric-learning task, which aims to learn the best KLD-type metric from the distributions of datasets. Concretely, we first extend the conventional KLD by introducing a linear mapping and obtain the best KLD for expressing the similarity of data distributions by optimizing this linear mapping. This improves the expressivity of the data distributions, in the sense that distributions from the same class are drawn close together while those from different classes are pushed far apart. The KLD metric learning is then formulated as a minimization problem on the manifold of all positive-definite matrices. To solve this optimization task, we develop an intrinsic steepest-descent method that preserves the manifold structure of the metric at each iteration. Finally, we apply the proposed method, along with ten popular metric-learning approaches, to the tasks of 3-D object classification and document classification. The experimental results show that our proposed method outperforms all compared methods.
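
As a rough illustration of the ideas in the abstract (not code from the paper), the sketch below computes a KLD between two Gaussians after a learnable linear map parameterized by an SPD matrix M, and takes one intrinsic gradient step that stays on the SPD manifold via the affine-invariant exponential map. The Gaussian assumption, the choice L = M^{1/2}, and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.linalg import sqrtm, expm

def gaussian_kld(mu1, cov1, mu2, cov2):
    """Closed-form KL divergence KL(N(mu1, cov1) || N(mu2, cov2))."""
    d = mu1.shape[0]
    inv2 = np.linalg.inv(cov2)
    diff = mu2 - mu1
    return 0.5 * (np.trace(inv2 @ cov1) + diff @ inv2 @ diff - d
                  + np.log(np.linalg.det(cov2) / np.linalg.det(cov1)))

def mapped_kld(M, mu1, cov1, mu2, cov2):
    """KLD after applying a linear map L to both Gaussians, where M = L^T L
    is the SPD parameter being learned; N(mu, cov) maps to N(L mu, L cov L^T)."""
    L = np.real(sqrtm(M))  # the symmetric square root is one valid choice of L
    return gaussian_kld(L @ mu1, L @ cov1 @ L.T, L @ mu2, L @ cov2 @ L.T)

def spd_descent_step(M, euclid_grad, step=1e-2):
    """One intrinsic steepest-descent step that keeps M on the SPD manifold,
    using the exponential map of the affine-invariant metric:
        M_new = M^{1/2} expm(-step * M^{1/2} G M^{1/2}) M^{1/2},
    where G is the symmetrized Euclidean gradient of the loss at M."""
    G = 0.5 * (euclid_grad + euclid_grad.T)
    R = np.real(sqrtm(M))
    return R @ expm(-step * R @ G @ R) @ R
```

Updating through the exponential map, rather than adding the raw gradient to M, is what keeps every iterate symmetric positive-definite without any projection step.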

Download paper here