Lecture 13: Approximating Probability Distributions (III): Monte Carlo Methods (II): Slice Sampling
Lecture 13 of the Course on Information Theory, Pattern Recognition, and Neural Networks.
Produced by: David MacKay (University of Cambridge)
Author: David MacKay, University of Cambridge
A series of sixteen lectures covering the core of the book "Information Theory, Inference, and Learning Algorithms" (Cambridge University Press, 2003), which can be bought at Amazon (http://www.amazon.co.uk/exec/obidos/ASIN/0521642981/davidmackay0f-21) and is available free online (http://www.inference.eng.cam.ac.uk/mackay/itila/).
A subset of these lectures used to constitute a Part III Physics course at the University of Cambridge. The high-resolution videos and all other course material can be downloaded from the Cambridge course website (http://www.inference.eng.cam.ac.uk/mackay/itprnn/).
Snapshots of the lecture can be found here:
http://www.inference.eng.cam.ac.uk/itprnn_lectures/
These lectures are also available at
http://videolectures.net/course_information_theory_pattern_recognition/
(synchronized with snapshots and slides)
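The lecture's subject, slice sampling, can be illustrated with a minimal one-dimensional sketch (this is not MacKay's own code; the function names, the stepping-out width `w`, and the step limit are illustrative choices):

```python
import math
import random

def slice_sample(log_p, x0, n_samples, w=1.0, max_steps=50):
    """Draw samples from an unnormalised density exp(log_p) by slice sampling.

    Each iteration: draw an auxiliary height uniformly under the density at
    the current point, "step out" to bracket the slice {x : p(x) > u}, then
    sample uniformly within the bracket, shrinking it on rejection.
    """
    x = x0
    samples = []
    for _ in range(n_samples):
        # Auxiliary variable: a uniform height under the curve, in log space.
        log_u = log_p(x) + math.log(random.random())
        # Step out: place an interval of width w around x and expand each end
        # until it lies outside the slice (or the step limit is hit).
        left = x - w * random.random()
        right = left + w
        steps = max_steps
        while steps > 0 and log_p(left) > log_u:
            left -= w
            steps -= 1
        steps = max_steps
        while steps > 0 and log_p(right) > log_u:
            right += w
            steps -= 1
        # Shrink: propose uniformly in [left, right]; on rejection, pull the
        # violated end in to the rejected point and try again.
        while True:
            x_new = left + (right - left) * random.random()
            if log_p(x_new) > log_u:
                x = x_new
                break
            if x_new < x:
                left = x_new
            else:
                right = x_new
        samples.append(x)
    return samples
```

For example, `slice_sample(lambda x: -0.5 * x * x, 0.0, 5000)` targets a standard Gaussian; unlike Metropolis sampling (Lecture 12), no step-size tuning is needed because the bracket adapts to the local width of the distribution.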
Video: Lecture 13: Approximating Probability Distributions (III): Monte Carlo Methods (II): Slice Sampling, from the Jakob Foerster channel.