WII? (2a) Information Theory, Claude Shannon, Entropy, Redundancy, Data Compression & Bits

48,327 views

Published on Aug 20, 2013

What is Information? - Part 2a - Introduction to Information Theory:

Script: http://crackingthenutshell.org/what-i...

** Please support my channel by becoming a patron: http://www.patreon.com/crackingthenut...

** Or... how about a Paypal Donation? http://crackingthenutshell.org/donate

Thanks so much for your support! :-)

- Claude Shannon - Bell Labs - Father of Information Theory

- A Mathematical Theory of Communication - 1948

- Book, co-written with Warren Weaver

- How to transmit information efficiently, reliably & securely through a given channel (e.g. tackling eavesdropping)

- Applications. Lossless data compression (ZIP files). Lossy data compression (MP3, JPG). Cryptography, thermal physics, quantum computing, neurobiology

- Shannon's definition not related to meaningfulness, value or other qualitative properties - theory tackles practical issues

- Shannon's information, a purely quantitative measure of communication exchanges

- Shannon's Entropy. John von Neumann. Shannon's information, information entropy - avoiding confusion with thermodynamic entropy

- Shannon's Entropy formula. H = -Σ p(x) log2 p(x), the negative sum, over all possible outcomes, of each probability times its base-2 logarithm

- Examples: fair coin & two-headed coin
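
As a minimal sketch of the formula at work (plain Python, not taken from the video), here is the entropy of those two coins: a fair coin carries 1 bit of uncertainty per toss, a two-headed coin carries none.

from math import log2

def entropy(probabilities):
    # By convention 0 * log2(0) = 0, so zero-probability outcomes are skipped.
    return -sum(p * log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit per toss
print(entropy([1.0, 0.0]))  # two-headed coin: 0.0 bits - the outcome is already certain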

- Information gain = uncertainty reduction in the receiver's knowledge

- Shannon's entropy as missing information, lack of information

- Estimating the entropy per character of the written English language

- Constraints such as "I before E except after C" reduce H per symbol

- Taking into account redundancy & contextuality

- Redundancy, predictability, entropy per character, compressibility
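
To make the last few bullets concrete, here is a rough sketch (my own illustration with a made-up sample string, not the video's calculation): estimate the per-character entropy from single-letter frequencies and compare it with the log2(26) ≈ 4.7 bits a uniform alphabet would need; the gap is redundancy that a compressor can remove. Real English is far more predictable than single-letter counts suggest, roughly 1-1.5 bits per character once context is taken into account.

from collections import Counter
from math import log2

text = "the quick brown fox jumps over the lazy dog and then it naps".replace(" ", "")
counts = Counter(text)
total = sum(counts.values())

H = -sum((n / total) * log2(n / total) for n in counts.values())  # single-letter entropy
H_max = log2(26)                                                  # uniform, independent letters
print(f"entropy per character: {H:.2f} bits (maximum {H_max:.2f})")
print(f"redundancy: {1 - H / H_max:.0%}")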

- What is data compression? - Extracting redundancy

- Source Coding Theorem. Entropy as a lower limit for lossless data compression.
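
As a quick empirical illustration of the theorem (a sketch under my own assumptions, not from the video): an i.i.d. binary source with P(1) = 0.9 has an entropy of about 0.47 bits per symbol, so no lossless code can average fewer bits than that; a general-purpose compressor such as Python's zlib lands somewhere above the bound.

import random, zlib
from math import log2

p, n = 0.9, 100_000
random.seed(0)
data = bytes(random.random() < p for _ in range(n))   # one 0/1 byte per symbol

H = -(p * log2(p) + (1 - p) * log2(1 - p))            # ~0.469 bits/symbol
compressed_bits = 8 * len(zlib.compress(data, 9))

print(f"entropy lower bound: {H:.3f} bits/symbol")
print(f"zlib achieves:       {compressed_bits / n:.3f} bits/symbol")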

- ASCII codes

- Example using Huffman code. David Huffman. Variable length coding
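
A minimal, self-contained Huffman coder in plain Python (a sketch built with heapq, not the video's own example): frequent symbols get short codewords, so the average length falls below the fixed 8 bits per character that ASCII storage would spend.

import heapq
from collections import Counter

def huffman_code(text):
    freq = Counter(text)
    # Heap entries are (weight, tie-breaker, tree); a tree is a symbol or a (left, right) pair.
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tick = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)                  # take the two lightest subtrees...
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, tick, (t1, t2)))  # ...and merge them into one node
        tick += 1
    code = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):            # internal node: 0 goes left, 1 goes right
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:                                  # leaf: record the accumulated codeword
            code[tree] = prefix or "0"
    walk(heap[0][2], "")
    return code

text = "this is an example of a huffman tree"
code = huffman_code(text)
huffman_bits = sum(len(code[c]) for c in text)
print(code)
print(f"{huffman_bits} bits with Huffman vs {8 * len(text)} bits at 8 bits/char")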

- Other compression techniques: arithmetic coding

- Quality vs Quantity of information

- John Tukey's bit vs Shannon's bit

- Difference between storage bit & information content. Encoded data vs Shannon's information
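
A small sketch of that distinction (my own example, not the video's): a reading that is '1' 99% of the time still occupies one storage bit, but carries only about 0.08 bits of Shannon information.

from math import log2

p = 0.99                                           # P(reading = 1)
H = -(p * log2(p) + (1 - p) * log2(1 - p))
print("storage per reading:     1 bit")
print(f"information per reading: {H:.3f} bits")    # ~0.081 bits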

- Coming in the next video: error detection and correction, Noisy-channel coding theorem, error-correcting codes, Hamming codes, James Gates' discovery, the laws of physics, how nature stores information, biology, DNA, cosmological & biological evolution
