Lecture 2: Entropy and Data Compression (I): Introduction to Compression, Inf.Theory and Entropy
Lecture 2 of the Course on Information Theory, Pattern Recognition, and Neural Networks.
Produced by: David MacKay (University of Cambridge)
Author: David MacKay, University of Cambridge
A series of sixteen lectures covering the core of the book "Information Theory, Inference, and Learning Algorithms" (Cambridge University Press, 2003, http://www.inference.eng.cam.ac.uk/mackay/itila/) which can be bought at Amazon (http://www.amazon.co.uk/exec/obidos/ASIN/0521642981/davidmackay0f-21), and is available free online (http://www.inference.eng.cam.ac.uk/mackay/itila/).
A subset of these lectures used to constitute a Part III Physics course at the University of Cambridge. The high-resolution videos and all other course material can be downloaded from the Cambridge course website (http://www.inference.eng.cam.ac.uk/mackay/itprnn/).
Snapshots of the lecture can be found here:
http://www.inference.eng.cam.ac.uk/itprnn_lectures/
These lectures are also available at
http://videolectures.net/course_information_theory_pattern_recognition/
(synchronized with snapshots and slides)
Video: "Lecture 2: Entropy and Data Compression (I): Introduction to Compression, Inf.Theory and Entropy", from the Jakob Foerster channel.