Azad Academy

Why Is Cross Entropy Equal to KL-Divergence?

Exploring the Concepts of Entropy, Cross-Entropy and KL-Divergence

JRS
Jul 15, 2022

Figure 1: Two probability distributions sampled from a normal distribution (Image by author)

It is common practice to use cross-entropy in the loss function when constructing a Generative Adversarial Network [1], even though the original formulation suggests the use of KL-divergence. This often creates confusion for people new to the field. In this article, we go through the concepts of entropy, cross-entropy, and Kullback-Leibler divergence [2] and see in what sense they can be treated as equal.
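
As a quick numerical preview of that relationship, the sketch below verifies the standard identity H(p, q) = H(p) + D_KL(p ∥ q) for a pair of small, hand-made discrete distributions (an illustrative assumption; the article's Figure 1 instead uses samples drawn from normal distributions):

```python
import numpy as np

# Two hand-made discrete probability distributions for illustration.
p = np.array([0.10, 0.40, 0.50])   # "true" distribution
q = np.array([0.80, 0.15, 0.05])   # "approximating" distribution

entropy = -np.sum(p * np.log(p))           # H(p)
cross_entropy = -np.sum(p * np.log(q))     # H(p, q)
kl_divergence = np.sum(p * np.log(p / q))  # D_KL(p || q)

# Cross-entropy decomposes as entropy plus KL-divergence:
# H(p, q) = H(p) + D_KL(p || q)
print(f"H(p)             = {entropy:.4f}")
print(f"H(p, q)          = {cross_entropy:.4f}")
print(f"H(p) + D_KL(p‖q) = {entropy + kl_divergence:.4f}")
```

Since H(p) does not depend on q, minimizing the cross-entropy with respect to q also minimizes the KL-divergence, which is the sense in which the two losses are interchangeable in practice.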
