Fisher metric vs KL-divergence
16 Oct 2016

Let the density $p(x\mid\theta)$ be parametrized by a vector of parameters $\theta$. Consider the Kullback-Leibler divergence between $p(x\mid\theta)$ and the density at a nearby point $\theta+\delta\theta$:

$$D_{\mathrm{KL}}\bigl(p(x\mid\theta)\,\|\,p(x\mid\theta+\delta\theta)\bigr)=\int p(x\mid\theta)\,\log\frac{p(x\mid\theta)}{p(x\mid\theta+\delta\theta)}\,dx.$$

Expanding $\log p(x\mid\theta+\delta\theta)$ to second order in $\delta\theta$ and noting that the first-order term vanishes, since $\mathbb{E}\!\left[\frac{\partial\log p(x\mid\theta)}{\partial\theta_i}\right]=\int\frac{\partial p(x\mid\theta)}{\partial\theta_i}\,dx=\frac{\partial}{\partial\theta_i}\int p(x\mid\theta)\,dx=0$, we obtain

$$D_{\mathrm{KL}}\bigl(p(x\mid\theta)\,\|\,p(x\mid\theta+\delta\theta)\bigr)\approx\frac{1}{2}\sum_{i,j} g_{ij}(\theta)\,\delta\theta_i\,\delta\theta_j,$$
where we recognize the Fisher information metric

$$g_{ij}(\theta)=-\,\mathbb{E}\!\left[\frac{\partial^2\log p(x\mid\theta)}{\partial\theta_i\,\partial\theta_j}\right]=\mathbb{E}\!\left[\frac{\partial\log p(x\mid\theta)}{\partial\theta_i}\,\frac{\partial\log p(x\mid\theta)}{\partial\theta_j}\right]$$

(the second form follows from the equality proved in the bonus section below).
Thus, the Fisher information metric is the second derivative of the Kullback-Leibler divergence,

$$g_{ij}(\theta)=\left.\frac{\partial^2}{\partial\,\delta\theta_i\,\partial\,\delta\theta_j}\,D_{\mathrm{KL}}\bigl(p(x\mid\theta)\,\|\,p(x\mid\theta+\delta\theta)\bigr)\right|_{\delta\theta=0}.$$
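As a quick numerical illustration (my own addition, not part of the derivation above), here is a minimal Python sketch using a Bernoulli family, chosen only because its Fisher information has the simple closed form $1/(\theta(1-\theta))$: a central finite difference of the KL divergence in the perturbation recovers the Fisher information.

```python
import numpy as np

def kl_bernoulli(theta, theta_prime):
    """KL divergence D( Bern(theta) || Bern(theta_prime) )."""
    return (theta * np.log(theta / theta_prime)
            + (1 - theta) * np.log((1 - theta) / (1 - theta_prime)))

theta = 0.3
eps = 1e-4

# Second derivative of the KL divergence with respect to the perturbation,
# evaluated at zero perturbation, via a central finite difference.
d2_kl = (kl_bernoulli(theta, theta + eps)
         - 2 * kl_bernoulli(theta, theta)
         + kl_bernoulli(theta, theta - eps)) / eps**2

fisher = 1 / (theta * (1 - theta))  # closed-form Fisher information of Bernoulli(theta)
print(d2_kl, fisher)                # both are approximately 4.7619
```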
Bonus: one prominent equality for the Fisher information
Let’s prove the following useful equality:

$$\mathbb{E}\!\left[\frac{\partial\log p(x\mid\theta)}{\partial\theta_i}\,\frac{\partial\log p(x\mid\theta)}{\partial\theta_j}\right]=-\,\mathbb{E}\!\left[\frac{\partial^2\log p(x\mid\theta)}{\partial\theta_i\,\partial\theta_j}\right].$$
Consider the argument on the right-hand side:

$$\frac{\partial^2\log p(x\mid\theta)}{\partial\theta_i\,\partial\theta_j}=\frac{\partial}{\partial\theta_i}\!\left(\frac{1}{p(x\mid\theta)}\,\frac{\partial p(x\mid\theta)}{\partial\theta_j}\right)=\frac{1}{p(x\mid\theta)}\,\frac{\partial^2 p(x\mid\theta)}{\partial\theta_i\,\partial\theta_j}-\frac{\partial\log p(x\mid\theta)}{\partial\theta_i}\,\frac{\partial\log p(x\mid\theta)}{\partial\theta_j}.$$
Compute its expectation:

$$\mathbb{E}\!\left[\frac{\partial^2\log p(x\mid\theta)}{\partial\theta_i\,\partial\theta_j}\right]=-\,\mathbb{E}\!\left[\frac{\partial\log p(x\mid\theta)}{\partial\theta_i}\,\frac{\partial\log p(x\mid\theta)}{\partial\theta_j}\right]+\int\frac{\partial^2 p(x\mid\theta)}{\partial\theta_i\,\partial\theta_j}\,dx.$$
The second term on the right equals zero, since (under the usual regularity conditions) $\int\frac{\partial^2 p(x\mid\theta)}{\partial\theta_i\,\partial\theta_j}\,dx=\frac{\partial^2}{\partial\theta_i\,\partial\theta_j}\int p(x\mid\theta)\,dx=0$, which concludes the proof. Derivations in this post closely follow the book by Kullback.
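The equality is also easy to check by Monte Carlo. Below is a minimal Python sketch (again my own example, not from Kullback) using an exponential distribution with rate $\lambda$, for which $\partial_\lambda\log p=1/\lambda-x$ and $\partial_\lambda^2\log p=-1/\lambda^2$: the sample mean of the squared score matches the negated second derivative of the log-density, both close to $1/\lambda^2$.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0
x = rng.exponential(scale=1 / lam, size=1_000_000)  # samples from p(x | lam) = lam * exp(-lam * x)

score = 1 / lam - x    # d/d(lam) log p(x | lam)
d2_logp = -1 / lam**2  # d^2/d(lam)^2 log p(x | lam), constant in x here

print(np.mean(score**2), -d2_logp)  # both are approximately 1 / lam^2 = 0.25
```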