Why Information Grows, by César Hidalgo

Honestly can't tell you what this book was about.

One thing I remember at this point: take an expensive car worth (say) $2m, and crash it into a wall. You still have all the same weight, all the same material, but what's left can only be sold for (say) $10k as scrap metal. What you've lost in the crash is information. "The value was stored not in the car's atoms, but their arrangement."

[after going through my saved clips on Audible – I wish this were easier to do; it would greatly increase my retention from reading!]

A good explanation of entropy: consider a half-full stadium with rows of equal length, each row farther from the field than the last. You could half-fill the stadium by having everyone sit as close to the field as possible: then the bottom half will be full and the top half empty. Or you could half-fill it by having everyone sit as far away as possible: then the top half will be full and the bottom half empty.

Both of these are lowest-entropy states, because there's only one way to have everyone as close to the bottom as possible, and only one way to have everyone as close to the top as possible.

By contrast, the highest-entropy state is when "the average row being occupied is the center row, since there are many ways in which we could arrange people in seats so that the average row being occupied is the one in the middle." Entropy is "the multiplicity of equivalent states". While entropy is "commonly associated with disorder, it's not exactly a measure of disorder" – rather, it's a measure of the multiplicity of states. But since highly disordered states tend to have high multiplicity, high-entropy states are usually highly disordered, so it's not the worst simplification.
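You can check this claim by brute force. Here's a minimal sketch (mine, not the book's) in Python: for a toy stadium with made-up parameters – 4 rows of 2 seats, half-filled by 4 people – it enumerates every possible seating, groups the seatings by their average occupied row, and counts how many seatings produce each average.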
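```python
from itertools import combinations
from collections import Counter

ROWS, SEATS_PER_ROW, PEOPLE = 4, 2, 4  # toy half-full "stadium"

# Label each seat by its row number (row 1 is closest to the field).
seats = [row for row in range(1, ROWS + 1) for _ in range(SEATS_PER_ROW)]

# For each macrostate (average occupied row), count the microstates
# (distinct seatings) that produce it.
multiplicity = Counter()
for occupied in combinations(range(len(seats)), PEOPLE):
    avg_row = sum(seats[i] for i in occupied) / PEOPLE
    multiplicity[avg_row] += 1

for avg_row, count in sorted(multiplicity.items()):
    print(f"average row {avg_row:.2f}: {count} equivalent seatings")
```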
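Of the 70 possible seatings, the two extreme averages (1.5 and 3.5) are each produced by exactly one, while the middle average (2.5) is produced by 18 – the most of any macrostate. That's the "multiplicity of equivalent states" in miniature, and it's exactly the W in Boltzmann's S = k ln W.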