Information Theory ENSC 808 (3)
Information measures: entropy, relative entropy, mutual information, entropy rate, differential entropy. Asymptotic Equipartition Property. Lossless data compression: Kraft inequality, Huffman code, Shannon code, arithmetic coding. Channel capacity: binary symmetric channel, binary erasure channel, Shannon's channel coding theorem, Gaussian channel, feedback. Prerequisite: STAT 270 or equivalent.
| Section | Instructor | Day/Time | Location |
|---|---|---|---|
| | Jie Liang | Jan 6 – Apr 13, 2015: Mon, Wed, 4:30–5:50 p.m. | Burnaby |