Why Is Cross-Entropy Equal to KL-Divergence?
Exploring the Concepts of Entropy, Cross-Entropy and KL-Divergence
It is common practice to use cross-entropy in the loss function when constructing a Generative Adversarial Network [1], even though the original formulation suggests the use of KL-divergence. This often creates confusion for people new to the field. In this article we go through the concepts of entropy, cross-entropy, and Kullback-Leibler divergence [2] and see why, for the purposes of optimization, the two can be treated as equal.
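As a quick numerical preview of the relationship this article builds up to, here is a minimal sketch (using NumPy, with made-up example distributions p and q) that computes all three quantities and checks the identity H(p, q) = H(p) + D_KL(p ∥ q). Since H(p) does not depend on the model distribution q, minimizing cross-entropy with respect to q minimizes KL-divergence as well.

```python
import numpy as np

# Two example discrete distributions over the same support
# (hypothetical values, chosen purely for illustration).
p = np.array([0.10, 0.40, 0.50])  # "true" distribution
q = np.array([0.80, 0.15, 0.05])  # model's approximation

entropy = -np.sum(p * np.log(p))           # H(p)
cross_entropy = -np.sum(p * np.log(q))     # H(p, q)
kl_divergence = np.sum(p * np.log(p / q))  # D_KL(p || q)

# Identity: H(p, q) = H(p) + D_KL(p || q). Because H(p) is fixed,
# minimizing cross-entropy in q also minimizes the KL-divergence.
print(f"H(p)               = {entropy:.4f}")
print(f"H(p, q)            = {cross_entropy:.4f}")
print(f"D_KL(p || q)       = {kl_divergence:.4f}")
print(f"H(p) + D_KL(p||q)  = {entropy + kl_divergence:.4f}")
```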