Podcast: Entropy

On the podcast this week, we’re delving into entropy, the thermodynamic property behind heat flowing into cooler areas, buildings decaying, and ultimately the heat death of the universe. In essence, entropy is a measure of how much disorder or randomness there is in a system. This year is an important one for entropy, too: it’s been exactly 150 years since physicist Rudolf Clausius published the paper that coined the term “entropy” and formalized what it was and how it behaved.

The idea of entropy had been around since the early 1800s, when scientists like Lazare Carnot first started to realize that in any mechanical process, some amount of usable energy was being lost. Perpetual motion machines are impossible because some of their energy is inevitably lost to friction and heat.

Perpetual motion machines are as old as science itself, and they all have one thing in common.
They don’t work.

Clausius started working on the principles of thermodynamics in the early 1850s, and postulated that what people observed as heat was actually the motion of atoms and molecules in an object, not some inherent particle or substance as scientists had previously believed. This explained exactly why energy was always lost to friction in a mechanical action: some of the mechanical energy put into a system inevitably ended up agitating molecules, causing them to vibrate, instead of performing the intended task.

The intense stare of Rudolf

In 1865 he published the paper that coined the term “entropy” and designated S as its official symbol. He derived the term from the Greek word “tropē,” meaning “transformation,” and intentionally made it resemble the word “energy,” to which it’s intimately related.

In the paper he also laid out some of the most important implications of the theory, specifically that the total energy in the universe is constant and that its net entropy is always increasing. The more entropy there is in a system, the more randomized it is and the more difficult it is to get any meaningful work out of that system.
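To make that concrete, here’s a toy numerical sketch (with made-up reservoir temperatures and heat values, not figures from Clausius’s paper) of his definition of entropy change, ΔS = Q/T. When heat flows from a hot body to a cold one, the hot side loses entropy Q/T_hot, but the cold side gains the larger amount Q/T_cold, so the net entropy of the combined system always goes up:

```python
# Toy illustration of Clausius's entropy change, dS = Q/T.
# All numbers here are hypothetical, chosen just to show the sign of the result.

Q = 1000.0      # joules of heat flowing from hot to cold
T_hot = 500.0   # temperature of the hot reservoir, in kelvin
T_cold = 300.0  # temperature of the cold reservoir, in kelvin

dS_hot = -Q / T_hot    # entropy lost by the hot reservoir
dS_cold = Q / T_cold   # entropy gained by the cold reservoir
dS_net = dS_hot + dS_cold

print(f"Hot reservoir:  {dS_hot:+.3f} J/K")
print(f"Cold reservoir: {dS_cold:+.3f} J/K")
print(f"Net change:     {dS_net:+.3f} J/K")  # positive: total entropy increased
```

As long as T_hot is greater than T_cold, the net change comes out positive, which is exactly why heat spontaneously flows only in that direction.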

Ultimately, rising entropy erodes the ability of any system to do work, and that applies to our universe as a whole. We are inside a system that is gradually working its way toward equilibrium, a point where no energy difference will be great enough to accomplish meaningful work. However, this Heat Death of the Universe is quite a ways off, roughly a googol years (the number, not the search engine).

Written out, it’s 10,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 years away. That’s a 1, followed by 100 zeroes, so don’t wait up.

Also, for more insight into how to calculate the entropy of a system, check out this video from CrashCourse.
