What is Information Theory? (Information Entropy)





Published on Sep 14, 2012

What is the essence of information? We explore the history of communication technology leading to the modern field of information theory. We'll build up towards Claude Shannon's measure of information entropy, one step at a time.

  • License: Standard YouTube License
Comments • 63

The alphabet is the most powerful invention in human history? The Chinese beg to differ.
Typical academic approach. I do it the opposite way: I start with the nuts and bolts. If I write "Anna banana golf god woods" and then write "I am a dirty boy", the first sentence contains more information (higher entropy) and the second contains less. And how do you decide? The sentence that raises the most new questions contains the most information. "I am a dirty boy" may mean being a pervert or something. "Anna banana golf god woods" raises a lot more new questions, therefore it carries more information.
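As an aside, Shannon's formula gives one concrete way to score the two example sentences: the character-level entropy H = -Σ p·log2(p) over the symbol frequencies. A minimal Python sketch (my own illustration, not from the video; it measures symbol statistics only, not meaning):

```python
from collections import Counter
from math import log2

def shannon_entropy(text):
    """Shannon entropy in bits per character: H = -sum(p * log2(p))."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# The two sentences from the comment above:
print(shannon_entropy("Anna banana golf god woods"))
print(shannon_entropy("i am a dirty boy"))
```

Note that this scores the statistical unpredictability of the characters, not how many "new questions" a sentence raises; Shannon's measure deliberately ignores semantics.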
Winston Smith
Lost me at, "measure of surprise". I don't know why he said that. It just pops up in the middle of things. I have no idea what it's connected to. It just comes out of nowhere and then disappears. This is why I can't learn from lectures or videos. I need to stop the person and ask questions. Learning has to be student led. Otherwise it's just infotainment and something to pad a resume with for the producers. 
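For what it's worth, "measure of surprise" does have a precise meaning in information theory: the surprise (self-information) of an outcome with probability p is -log2(p) bits, and entropy is the average surprise over all outcomes. A minimal Python sketch of the idea (my own illustration, not the video's notation):

```python
from math import log2

def surprise(p):
    """Self-information of an outcome with probability p, in bits: -log2(p)."""
    return -log2(p)

# A fair coin flip (p = 1/2) carries 1 bit of surprise.
print(surprise(0.5))    # 1.0
# A rarer event (p = 1/8) is more surprising: 3 bits.
print(surprise(0.125))  # 3.0
# A certain event (p = 1) carries no surprise at all.
print(surprise(1.0))    # 0.0
```

So rare outcomes carry more information than common ones, which is the intuition the lecture was gesturing at.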
Rick might be insane
Thanks. By the way it felt like I was watching howtobasic for a couple of seconds 😅.
Paul TheSkeptic
Interesting video but if this is a series, don't you think that numbers giving us the order in which these videos should be played would be good information to have?
Afrid Gowun
This explains little to nothing about entropy. It's only clear to those who have already been puzzling over it for years, not to someone like me who never thought about it before and is just starting to grasp its meaning. Sorry, but the explanation moves so fast it couldn't hold my attention, and it isn't an artistic piece either, so it just left me confused.
Entropy is the meaning of the universe... of all that we know as the universe... all existence comes from one entropy point.
"But first, we need to talk about parallel universes."
Martin Smith
Entropy = Information | Objections

Ever since 1948, with the publication of Shannon's paper, there has been growth both in the assumed equivalence of heat-engine entropy and the entropy of a message, and in the objections to this point of view. In 1999, to cite one example, American chemistry professor Frank Lambert, who for many years taught a course for non-science majors called "Enfolding Entropy" at Occidental College in Los Angeles, stated that another major source of confusion about entropy change as the result of simply rearranging macro objects comes from the information-theory "entropy" of Claude Shannon. [12]

In Shannon's 1948 paper, as discussed, the word "entropy" was adopted at the suggestion of von Neumann. This step, according to Lambert, was "wryly funny for that moment," but "Shannon's unwise acquiescence has produced enormous scientific confusion due to the increasingly widespread usefulness of his equation and its fertile mathematical variations in many fields other than communications". [13]

According to Lambert, "certainly most non-experts hearing of the widely touted information entropy would assume its overlap with thermodynamic entropy. However, the great success of information 'entropy' has been in areas totally divorced from experimental chemistry, whose objective macro results are dependent on the behavior of energetic microparticles. Nevertheless, many instructors in chemistry have the impression that information 'entropy' is not only relevant to the calculations and conclusions of thermodynamic entropy but may change them." This logic, according to Lambert, is not true.
[12] In sum, according to Lambert, information "entropy" in all of its myriad non-physicochemical forms, as a measure of information or abstract communication, has no relevance to the evaluation of thermodynamic entropy change in the movement of macro objects, because such information "entropy" does not deal with microparticles whose perturbations are related to temperature. Even very competent chemists and physicists have become confused when they have melded or mixed information "entropy" into their consideration of physical thermodynamic entropy, as shown by the results in textbooks and by the lectures of professors found on the Internet.

In the 2007 book A History of Thermodynamics, for instance, German physicist Ingo Müller summarizes his opinion on the matter of von Neumann's naming suggestion: "No doubt Shannon and von Neumann thought that this was a funny joke, but it is not; it merely exposes Shannon and von Neumann as intellectual snobs. Indeed, it may sound philistine, but a scientist must be clear, as clear as he can be, and avoid wanton obfuscation at all cost. And if von Neumann had a problem with entropy, he had no right to compound that problem for others, students and teachers alike, by suggesting that entropy had anything to do with information."

Müller clarifies the matter by stating that "for level-headed physicists, entropy (or order and disorder) is nothing by itself. It has to be seen and discussed in conjunction with temperature and heat, and energy and work. And, if there is to be an extrapolation of entropy to a foreign field, it must be accompanied by the appropriate extrapolations of temperature, heat, and work." [11]

Exactly. So it is an untenable extrapolation of ideas which mean something different in another field.
And from that come even wilder extrapolations by others to far broader ideas of man and his concerns. The root untenability is compounded exponentially. Wikipedia does not reveal much, but a searching, sceptical article critical of IIT is the one by Scott Aaronson at http://www.scottaaronson.com/blog/?p=1799, and a fuller treatment is at http://www.scientificamerican.com/article/a-theory-of-consciousness/?page=1, in both cases with some very searching and intelligent critiques by a variety of well-informed and deep-thinking people.
Kirk Hodges
What a beautiful and entirely inaccurate description