Published on May 7, 2018

Short text and image presentation explaining the term entropy.
Produced by: http://complexitylabs.io
Twitter: https://goo.gl/Nu6Qap
Facebook: https://goo.gl/P7EadV
LinkedIn: https://goo.gl/3v1vwF

Entropy is a measure of the number of degrees of freedom a system has. As entropy increases, more information is needed to describe the state of the system, and work would have to be done to reconfigure the system into its original ordered state. As such, entropy is a key measure in information theory, where it quantifies the uncertainty involved in predicting the value of a random variable. The Second Law of Thermodynamics is an observation of the fact that, over time, differences in temperature, pressure, and chemical potential tend to even out in a physical system that is isolated from the outside world. Entropy is a measure of how far this process has progressed. The entropy of an isolated system that is not in equilibrium tends to increase over time, approaching a maximum value at equilibrium. The second law is thus one of the few physical laws, if not the only one, that differentiates between the directions of time.
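To make the information-theory reading of entropy concrete, here is a minimal Python sketch (not from the video) of Shannon entropy, which quantifies the uncertainty involved in predicting the value of a random variable. The function name and the example probability distributions are illustrative assumptions.

import math

def shannon_entropy(probabilities):
    """Return H(X) = -sum(p * log2(p)) over outcomes with p > 0, in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain: one full bit per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin is easier to predict, so it carries less information.
print(shannon_entropy([0.9, 0.1]))   # ~0.469

The more evenly spread the probabilities, the higher the entropy, mirroring the thermodynamic picture in which a more disordered system needs more information to describe its state.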
