===== Information Theory & Coding =====

|Instructor|**Emre Telatar**|
|Office|[[http://plan.epfl.ch/?room=INR117|INR 117]]|
|Phone|**+41 21 69 37693**|
|Email|**emre.telatar@epfl.ch**|
|Office Hours|**By appointment**|

|Teaching Assistant|**Rajai Nasser**|
|Phone|**+41 21 69 37554**|
|Office|[[http://plan.epfl.ch/?room=INR036|INR 036]]|
|Email|**rajai.nasser@epfl.ch**|
|Office Hours| |

|Teaching Assistant|**Serj Haddad**|
|Phone|**+41 21 69 31357**|
|Office|[[http://plan.epfl.ch/?room=INR038|INR 038]]|
|Email|**serj.haddad@epfl.ch**|
|Office Hours| |

|Lectures|**Monday**|**13:00 - 15:00** (Room: [[http://plan.epfl.ch/?reset_session&alias=ela1|ELA01]])|
| |**Tuesday**|**13:00 - 15:00** (Room: [[http://plan.epfl.ch/?reset_session&alias=ela2|ELA02]])|
|Exercises|**Tuesday**|**15:00 - 17:00** (Room: [[http://plan.epfl.ch/?reset_session&room=ela2|ELA02]])|

|**Language**:| |English|
|**Credits**:| |7 ECTS|

See the course {{:en:courses:2014-2015:itc:info.pdf|information}}.

/* [[http://isa.epfl.ch/imoniteur_ISAP/!itffichecours.htm?ww_i_matiere=1774097&ww_x_anneeAcad=2011-2012&ww_i_section=2139068&ww_i_niveau=&ww_c_langue=en|Link]] to official course description. */

==== Special Announcements ====

**Final Exam Information:**
  * **The final exam is scheduled for Thursday, January 22nd, from 8:15 to 11:15. It will take place in INM200 and INM202.**
  * **The exam covers the whole course.**
  * **You are allowed 4 sheets of paper (8 sides) of (your own) notes. No electronic devices.**

==== Detailed Schedule ====

^ Date ^ Topics Covered ^ Assignment ^ Solutions ^ Remarks ^
| Sep 15 | Intro to source coding; non-singular, uniquely decodable, and prefix-free codes; Kraft's inequality for prefix-free codes. | | | |
| Sep 16 | Kraft's inequality for uniquely decodable codes; reverse Kraft; entropy as a lower bound on optimal codes; entropy + 1 as an upper bound on optimal codes. | {{:en:courses:2014-2015:itc:hw01.pdf|Homework 1}} | {{:en:courses:2014-2015:itc:sol01.pdf|Solutions 1}} | |
| Sep 22 | | | | Holiday :-) |
| Sep 23 | Entropy is asymptotically achievable for memoryless sources; properties of optimal codes; intro to Huffman codes. | {{:en:courses:2014-2015:itc:hw02.pdf|Homework 2}} | {{:en:courses:2014-2015:itc:sol02.pdf|Solutions 2}} | |
| Sep 29 | Optimality of Huffman codes; entropy; joint entropy; conditional entropy. | | | |
| Sep 30 | Mutual information; chain rules; positivity of mutual information; conditioning reduces entropy; data-processing inequality. | {{:en:courses:2014-2015:itc:hw031.pdf|Homework 3}} | {{:en:courses:2014-2015:itc:sol03.pdf|Solutions 3}} | |
| Oct 06 | Entropy rate; stationary processes; typicality. | | | |
| Oct 07 | Typicality; size of typical sets; source coding via typicality; variable-to-fixed coding; valid prefix-free dictionaries. | {{:en:courses:2014-2015:itc:hw041.pdf|Homework 4}} | {{:en:courses:2014-2015:itc:sol04.pdf|Solutions 4}} | |
| Oct 13 | H(W) = H(U) E[length(W)]; number of bits per source symbol used by a variable-to-fixed code. | | | See the worked example at the bottom of this page. |
| Oct 14 | Tunstall procedure and its optimality; analysis of Tunstall codes; description of Lempel-Ziv codes. | {{:en:courses:2014-2015:itc:hw05.pdf|Homework 5}} | {{:en:courses:2014-2015:itc:sol05.pdf|Solutions 5}} | |
| Oct 20 | FSM compressors. | | | |
| Oct 21 | Compressibility of FSM compressors; compressibility of Lempel-Ziv codes; LZ compressibility is less than FSM compressibility for every sequence. | {{:en:courses:2014-2015:itc:hw06.pdf|Homework 6}} | {{:en:courses:2014-2015:itc:sol06.pdf|Solutions 6}} | |
| Oct 27 | Communication problem; discrete memoryless channels without feedback; code rates; probability of error; converse to the channel coding theorem. | | | |
| Oct 28 | | {{:en:courses:2014-2015:itc:midterm.pdf|Midterm}} | {{:en:courses:2014-2015:itc:midsol2.pdf|Solutions}} | |
| Nov 03 | Converse to the channel coding theorem (cont.); capacity examples; how to maximize concave functions. | | | |
| Nov 04 | Convexity; convex and concave functions; convexity properties of entropy and mutual information. | {{:en:courses:2014-2015:itc:hw071.pdf|Homework 7}} | {{:en:courses:2014-2015:itc:sol07.pdf|Solutions 7}} | |
| Nov 10 | Kuhn-Tucker conditions; intro to random coding. | | | |
| Nov 11 | Proof of the coding theorem. | {{:en:courses:2014-2015:itc:hw08.pdf|Homework 8}} | {{:en:courses:2014-2015:itc:sol08.pdf|Solutions 8}} | |
| Nov 17 | Differential entropy; mutual information; chain rules for them. | | | |
| Nov 18 | Capacity with cost; additive Gaussian noise channel; converse to the coding theorem with cost. Relationship between discrete and differential entropy; differential mutual information as a limit of discrete mutual information. | {{:en:courses:2014-2015:itc:hw09.pdf|Homework 9}} | {{:en:courses:2014-2015:itc:sol09.pdf|Solutions 9}} | |
| Nov 24 | Coding theorem with cost. Waterfilling solution to parallel Gaussian channels with an overall power constraint. | | | |
| Nov 25 | Introduction to practical codes; linearity; Hamming distance; Hamming codes. | {{:en:courses:2014-2015:itc:hw102.pdf|Homework 10}} | {{:en:courses:2014-2015:itc:sol10.pdf|Solutions 10}} | |
| Dec 01 | Sphere-packing, Singleton, and Gilbert-Varshamov bounds. | | | |
| Dec 02 | Reed-Solomon codes. | {{:en:courses:2014-2015:itc:hw11.pdf|Homework 11}} | {{:en:courses:2014-2015:itc:sol11.pdf|Solutions 11}} | |
| Dec 08 | Polar codes. | | | |
| Dec 09 | Polar codes (cont.). | {{:en:courses:2014-2015:itc:hw12.pdf|Homework 12}} | {{:en:courses:2014-2015:itc:sol12.pdf|Solutions 12}} | |
| Dec 15 | Lossy data compression (rate-distortion theory). | | | |
| Dec 16 | What information theory says about other communication problems not covered in class. | | | |
| Jan 22 | | {{:en:courses:2014-2015:itc:final.pdf|Final}} | {{:en:courses:2014-2015:itc:finsol.pdf|Solutions}} | |

==== Textbook ====

[[http://opac.nebis.ch/F?local_base=nebis&func=find-b&find_code=020&request=0-471-24195-4&con_lng=ENG|Elements of Information Theory]], Thomas M. Cover and Joy A. Thomas, 2006. ISBN: 0-471-24195-4

==== Additional Reading Material ====

Claude Shannon's {{:en:courses:2013-2014:shannon1948.pdf|A Mathematical Theory of Communication}}, published in the Bell System Technical Journal, 1948 (part 1 in July, part 2 in October).
\\
**Notes on Tunstall codes**: {{:en:courses:2014-2015:itc:tunstall.pdf|Tunstall}}\\
**Notes on the Lempel-Ziv algorithm**: {{:en:courses:2014-2015:itc:lz.pdf|Lempel-Ziv}}\\
**Notes on coding theory**: {{:en:courses:2014-2015:itc:coding.pdf|Coding}}\\
**Notes on polar codes**: {{:en:courses:2014-2015:itc:polar.pdf|Polar}}. These notes were taken by Elie Najm during the polar-coding lectures of the Fall 2013 version of the course and are posted with his permission. Those lectures follow the same logic as this year's, with minor differences; the notes also include a proof of Mrs. Gerber's lemma for completeness.
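
==== Worked Example: the Variable-to-Fixed Coding Identity ====

As a quick numerical illustration of the identity H(W) = H(U) E[length(W)] from the Oct 13 lecture, the sketch below checks it for a memoryless source and a complete prefix-free parsing dictionary. The source distribution and dictionary are made up for this example (they are not taken from the lectures or homework); Python is used only for the arithmetic.

<code python>
# Numerical check of H(W) = H(U) * E[length(W)] for a valid (complete)
# prefix-free parsing dictionary of a memoryless source.
# Illustrative sketch only: the distribution and dictionary below are
# hypothetical choices for the example.
from math import log2

# Memoryless source U over {a, b} (hypothetical distribution)
p = {'a': 0.75, 'b': 0.25}

# A complete prefix-free dictionary: every source sequence can be parsed
# into these words, and no word is a prefix of another.
dictionary = ['aa', 'ab', 'b']

def word_prob(w):
    """Probability that the parser emits word w (product of letter probabilities)."""
    prob = 1.0
    for letter in w:
        prob *= p[letter]
    return prob

def entropy(probs):
    """Entropy in bits of a probability vector."""
    return -sum(q * log2(q) for q in probs if q > 0)

word_probs = [word_prob(w) for w in dictionary]
H_U = entropy(p.values())                               # source entropy, bits per letter
H_W = entropy(word_probs)                               # dictionary-word entropy, bits per word
E_len = sum(word_prob(w) * len(w) for w in dictionary)  # expected parsed-word length

print(f"H(W)                = {H_W:.6f} bits/word")
print(f"H(U) * E[length(W)] = {H_U * E_len:.6f} bits/word")
</code>

Both printed quantities come out to about 1.4197 bits per dictionary word, as the identity predicts; this is the relation used to compare the log2(dictionary size) / E[length(W)] bits per source letter spent by a variable-to-fixed code against the source entropy H(U).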