Information Theory (E4.40/SO20)

Mike Brookes
20 lectures in the Spring Term

Syllabus

This course sets out the fundamental concepts of information theory and their application in present-day communication systems. The axiomatic approach to the development of Shannon's measure of information will be given. Expressions for the information generated by discrete memoryless sources and by sources with memory will be established; the noiseless coding theorem will be proved and the asymptotic equipartition theorem presented. The practical significance of the noiseless coding theorem will be examined and examples of source coding given. The concept of channel capacity will be introduced, and the calculation of the capacity of important communication channels and systems will be covered. The capacity theorem will be proved for various cases, its practical significance examined, and simple examples of coding aimed at achieving the results promised by the capacity theorem outlined. Finally, the concept of source coding subject to a fidelity criterion (rate-distortion theory) will be introduced.
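As a small numerical illustration of two quantities central to the syllabus, Shannon entropy and channel capacity, the following sketch computes the entropy of a discrete memoryless source and the capacity of a binary symmetric channel. The function names are illustrative and not taken from the course notes:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution (zero terms skipped)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(eps):
    """Capacity in bits per channel use of a binary symmetric channel
    with crossover probability eps: C = 1 - H(eps)."""
    return 1 - entropy([eps, 1 - eps])

# A fair binary source carries 1 bit per symbol; a biased one carries less.
print(entropy([0.5, 0.5]))            # 1.0
print(round(entropy([0.9, 0.1]), 3))  # 0.469
print(round(bsc_capacity(0.1), 3))    # 0.531
```

Note that a channel with crossover probability 0.5 has capacity zero, since the output is then independent of the input.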

Textbooks

Lecture Slides

One per page, 2 per page, 4 per page, 6 per page

Problem Sheets

Problems 1 - Entropy and Mutual Information
Problems 2 - Coding

Past Exam Papers

2003, 2004, 2005, 2006, 2007

Frequently Asked Questions

  1. Is topic "xyz" examinable?
    All topics included in the above lectures are examinable, with the exception of the Kuhn-Tucker conditions for constrained optimization. A previous version of the notes included a final lecture on "Multiple Access Channels"; this topic is no longer covered and is not examinable.
  2. Do I need to remember long complicated proofs?
    You will not be asked to reproduce long proofs from the notes. However, you will be expected to understand the proofs and to justify the steps they take.