Budapest University of Technology and Economics, Budapest
Department of Computer Science and Information Theory


Data Compression 2006 Fall


Lecturers:
András Antos
Classes per week:
4
Credits:
5
Exam:
written
December 13, 2006, 10:15 a.m. T.64 and
January 11, 2007, 3 p.m. SZTAKI L410
Schedule, place:
Tuesday 8:30-10:00, T.63
Wednesday 10:15-11:45, T.64
Remarks:

Topics

Entropy:
entropy, joint entropy, conditional entropy, mutual information, conditional mutual information, relative entropy, conditional relative entropy, chain rules, Jensen's inequality, the log-sum inequality, the data processing inequality, Fano's inequality.
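As a small illustration of the first topic (not part of the official material), a minimal sketch of computing Shannon entropy in Python:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits.

    Terms with p(x) = 0 contribute nothing (0 log 0 := 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit.
print(entropy([0.5, 0.5]))   # → 1.0
# A biased coin is less uncertain, so its entropy is lower.
print(entropy([0.9, 0.1]))   # ≈ 0.469
```

Joint and conditional entropies can be computed the same way from the joint distribution, since by the chain rule H(X, Y) = H(X) + H(Y|X).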
Data Compression:
source codes, Kraft inequality, McMillan's theorem, connection with entropy (lower bound on the expected codeword length L), Shannon codes (upper bound on L), Huffman codes and their optimality, competitive optimality of the Shannon code.
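To illustrate the Huffman construction named above, a sketch (illustrative only): the two least probable nodes are repeatedly merged, and codewords are read off the resulting tree.

```python
import heapq

def huffman_code(freqs):
    """Build a Huffman code for a dict {symbol: probability}.

    Greedily merges the two least probable subtrees; the symbols in the
    lower-probability subtree get a leading '0', the other a leading '1'.
    """
    # Heap entries: (probability, tie-breaker, {symbol: codeword-so-far}).
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.25})
# 'a' gets a length-1 codeword, 'b' and 'c' length-2 codewords,
# so the Kraft inequality holds with equality: 1/2 + 1/4 + 1/4 = 1.
```

The expected length here is 1.5 bits, matching the entropy of the source, as the lower-bound theorem predicts for dyadic probabilities.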
Entropy Rates of a Stochastic Process:
Markov chains, entropy rate.
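A worked sketch of the entropy-rate formula for a stationary Markov chain, H = Σᵢ μᵢ H(Pᵢ·), where μ is the stationary distribution and Pᵢ· the i-th row of the transition matrix (the numbers below are an arbitrary illustrative chain):

```python
import math

def entropy(probs):
    """Shannon entropy of a distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# An arbitrary two-state Markov chain; rows of P sum to 1.
P = [[0.9, 0.1],
     [0.4, 0.6]]

# The stationary distribution mu solves mu P = mu.
# For two states this reduces to mu0 = P[1][0] / (P[0][1] + P[1][0]).
mu0 = P[1][0] / (P[0][1] + P[1][0])
mu = [mu0, 1 - mu0]

# Entropy rate: average of the per-state row entropies under mu.
rate = sum(mu[i] * entropy(P[i]) for i in range(2))
print(rate)   # ≈ 0.569 bits per step
```

Note the rate (≈ 0.569) is below the entropy of the stationary distribution H(μ) ≈ 0.722, reflecting the dependence between consecutive states.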
Channel Capacity:
noiseless binary channel, noisy channel with no overlap, noisy typewriter, binary symmetric channel, binary erasure channel, the channel coding model, channel coding theorem.
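The capacities of the two standard channels in this topic have closed forms: C = 1 − H(p) for the binary symmetric channel with crossover probability p, and C = 1 − ε for the binary erasure channel with erasure probability ε. A quick sketch:

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel: C = 1 - H(p)."""
    return 1 - h2(p)

def bec_capacity(eps):
    """Capacity of the binary erasure channel: C = 1 - eps."""
    return 1 - eps

print(bsc_capacity(0.0))   # → 1.0 (noiseless)
print(bsc_capacity(0.5))   # → 0.0 (output independent of input)
print(bec_capacity(0.25))  # → 0.75
```

At p = 0.5 the BSC output is independent of the input, so no information gets through, while a completely noisy BSC with p = 1 is again perfect (just flip every bit).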
Lempel-Ziv'78 algorithms.
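For illustration, a sketch of the LZ78 parsing step: the input is split into phrases, each being the longest already-seen phrase extended by one new symbol, and is emitted as (dictionary index of the prefix, new symbol).

```python
def lz78_parse(s):
    """LZ78 parsing: return a list of (prefix index, extension symbol) pairs."""
    dictionary = {"": 0}   # phrase -> index; index 0 is the empty phrase
    output = []
    phrase = ""
    for ch in s:
        if phrase + ch in dictionary:
            # Keep extending the current phrase while it is still known.
            phrase += ch
        else:
            # New phrase: emit (index of known prefix, new symbol) and store it.
            output.append((dictionary[phrase], ch))
            dictionary[phrase + ch] = len(dictionary)
            phrase = ""
    if phrase:
        # Flush a trailing phrase that is already in the dictionary.
        output.append((dictionary[phrase[:-1]], phrase[-1]))
    return output

print(lz78_parse("abababab"))
# → [(0, 'a'), (0, 'b'), (1, 'b'), (3, 'a'), (0, 'b')]
```

The dictionary grows with the number of distinct phrases, which is what makes the scheme universal: it needs no prior model of the source.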

Related links

Cavendish Laboratory, Cambridge, Great Britain, David J.C. MacKay: A Short Course in Information Theory in 1995.



Updated: Jul 24, 2010
aantos NOSPAM(at)NOSPAM gmail (dot) com