Jerry Han // Blog
// notes on math, ml, and things i'm figuring out
[ index ]
--------------------------------------------------------------------------------
[001]
Information Theory

A post on information theory — entropy, KL divergence, and why these ideas show up everywhere in machine learning. Written for The Gradient.

Read the full post →

--------------------------------------------------------------------------------