Image by congerdesign

A Short and Simple Introduction to Entropy

Learn this mental model in just a few minutes

Julia Clavien
3 min read · Dec 3, 2019


If you harbor the desire to have some idea what others mean when they mention entropy (or maybe even use it in a sentence yourself) — read on! Entropy is a mental model well worth learning.

What is entropy?

Entropy is a concept from physics that originally stemmed from ideas a guy called Rudolf Clausius had about thermodynamic processes.

The purists may hate me for this, but I’m going to really simplify the concept so that we can cover it in this short post. The goal here isn’t to understand entropy at the level of a physicist or chemist, but just to wrap our heads around it well enough to add it to our latticework of mental models!

Here’s a starting definition:

Entropy is a measure of the disorder in a system.

The greater the level of disorder, the higher the entropy.
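For the mathematically inclined, the formal version of this idea is Boltzmann’s famous formula, S = k · ln W, where W counts the number of ways the system can be arranged and k is Boltzmann’s constant. You don’t need the formula for this post, but it is exactly the “counting arrangements” intuition in the example below.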

This example from Study makes it clearer:

Let us say you have a bag of balls. You grab one ball from the bag and put it on the table. How many ways can you arrange that ball? The answer: one way. What if we grab two balls and ask the same question? Now there are two ways to arrange them, and with three balls there are six. Every ball we add multiplies the number of possible arrangements, and more possible arrangements means more disorder, which means higher entropy.
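If you’d like to see that counting in action, here is a tiny Python sketch. It’s my own illustration rather than anything from the original example: it treats the balls as distinguishable, lines them up in a row, and sets Boltzmann’s constant to 1 so the entropy is just the natural log of the number of arrangements.

```python
from math import factorial, log

def arrangements(n: int) -> int:
    """Number of ways to line up n distinct balls in a row (the 'microstates')."""
    return factorial(n)

def entropy(n: int) -> float:
    """Boltzmann-style entropy S = k * ln(W), with k set to 1 for simplicity."""
    return log(arrangements(n))

for n in range(1, 6):
    print(f"{n} ball(s): {arrangements(n):>3} arrangement(s), entropy = {entropy(n):.2f}")
```

One ball gives exactly one arrangement and an entropy of zero, which is perfect order. Every extra ball multiplies the number of arrangements, so the entropy keeps climbing.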

