29:51
David Tse, The Unreasonable Effectiveness of Information Theory
Nokia Bell Labs
1:01:51
Lecture 1: Introduction to Information Theory
Jakob Foerster
56:15
Entropy, Mahler Measure and Bernoulli Convolutions - Emmanuel Breuillard
Institute for Advanced Study
37:22
Michelle Effros | Shannon's Channel and Capacity
Michigan Engineering
42:05
The Shannon Total Variation - Abergel - Workshop 1 - CEB T1 2019
Institut Henri Poincaré
2:27:35
Computational Information Geometry with Frank Nielsen
MLSS Sydney 2015
1:10:57
Information Geometry
Microsoft Research
1:54:41
Sergio Verdu - Information Theory Today
Institut des Hautes Etudes Scientifiques (IHES)
3:06:20
NASIT 2019 Padovani Lecture - Kannan Ramchandran - On duality, encryption, sampling and learning
IEEE Information Theory Society
28:00
Sample Complexity of Estimating Entropy
Simons Institute
1:10:37
[Colloquium]: Causal inference and Kolmogorov complexity
ФКН ВШЭ (Faculty of Computer Science, HSE University)
18:57
Decoding Convolutional Codes: The Viterbi Algorithm Explained
Iain Explains Signals, Systems, and Digital Comms
1:24:44
Stanford Seminar - Information Theory of Deep Learning, Naftali Tishby
Stanford Online
1:04:23
Daniel Costello | Spatially Coupled LDPC Codes: Is This What Shannon Had In Mind?
54:10
Abbas El Gamal | Randomness Generation
1:20:31
Juan Pablo Vigneaux: "Cohomological aspects of information"
Topos Institute
1:09:26
Tom Leinster: "Entropy and diversity: the axiomatic approach"
1:00:39
Undecidable Problems in Information Theory - Cheuk Ting Li
Stanford Research Talks
1:34:32
Claude Shannon and Data Compression | 11/05/2017 | Gabriel Peyre
CultureMath Des mathématiques vivantes (DMA-ENS Ulm)
1:35:53
The Birth of Information Theory | 11/05/2017 | Alain Chenciner
1:18:06
Daniel Bennequin - Topos and Information
49:02
Jean-Claude Belfiore - Toposes for Wireless Networks: An idea whose time has come
59:04
ARRC Seminar Series - Jean-Claude Belfiore - Part 1
ATRC
1:12:13
ARRC Seminar Series - Jean-Claude Belfiore - Part 2
55:03
Laure Saint Raymond, What does entropy measure?
Clay Mathematics Institute
1:34:34
Introduction to Information Theory - Edward Witten
56:30
Erdős and Shannon: A Story of Probability, Communication, and Combinatorics
57:45
ISIT 2017 | David Tse | The Spirit of Information Theory | 2017-06-28
ISIT 2017
49:45
Prof. Robert G. Gallager「From Information Theory to the Information Age」
JapanPrize
44:48
Information Theoretic Concepts of 5G - Maric
The Qualcomm Institute
1:02:47
Andrea Goldsmith - To Infinity and Beyond: New Frontiers in Wireless Information Theory
1:24:58
Ep 32. Information-Theoretic Foundations of 6G (With Giuseppe Caire) [Wireless Future Podcast]
Wireless Future
1:00:25
[WOST III] Introduction to information geometry by Masafumi Oizumi
Workshop on Stochastic Thermodynamics III
20:05
But what are Hamming codes? The origin of error correction
3Blue1Brown
16:50
Hamming codes part 2: The one-line implementation
16:53
What are Reed-Solomon Codes? How computers recover lost data
vcubingx
14:01
Hamming & low density parity check codes
Art of the Problem
11:31
How space-time codes work (5G networks)
7:42
How internet communication works: Network Coding
11:23
The Beauty of Lempel-Ziv Compression
34:03
Error correcting codes, group theory, and invariant theory Part 1
Experimental mathematics
22:48
Error correcting codes, group theory, and invariant theory Part 2
26:15
iMessage, World War II, and a Mathematical Theory of Communication
David Imel
12:07
Understanding Shannon entropy: (1) variability within a distribution
Gabriele Carcassi
10:01
A Theory, A Paper, A Turning Point: Claude Shannon's 1948 'Mathematical Theory of Communication'
Fizix Rules
1:01:48
Biology as Information Dynamics - John Baez
Stanford Complexity Group
26:24
The Key Equation Behind Probability
Artem Kirsanov
7:19
Information Theory Tutorial Part 1: What is Information?
James V Stone
9:51
Information Theory Tutorial Part 2: What is Shannon Entropy?