This is part of my series of commentary on the physics book The Theoretical Minimum.
The job of classical mechanics is to predict the future. (p. 1)
I love this.
The rule that dynamical laws must be deterministic and reversible is so central to classical physics that we sometimes forget to mention it when teaching the subject. In fact, it doesn’t even have a name. We could call it the first law, but unfortunately there are already two first laws — Newton’s and the first law of thermodynamics. There is even a zeroth law of thermodynamics. So we have to go back to a minus-first law to gain priority for what is undoubtedly the most fundamental of all physical laws . . . (p. 9)
As a number-lover, this sort of thing just makes me all kinds of amused.
But there is another element that [Laplace] may have underestimated [when he said the laws of physics could theoretically predict the whole future]: the ability to know the initial conditions with almost perfect precision. […] The ability to distinguish the neighboring values of these numbers is called “resolving power” of any experiment, and for any real observer it is limited. In principle we cannot know the initial conditions with infinite precision. […] Perfect predictability is not achievable, simply because we are limited in our resolving power. (p. 14)
This concept, I am keenly aware, is what makes my Russell’s Attic books science fiction. My main character is only able to do the calculations she can on the world around her because I permit her to have indefinitely good resolving power. It’s kind of a required secondary power for what she does. And reading this section, it completely tickled me that it has a name!
I’ve said before that I reduce all physics to doing math. I felt like I was cheating a bit in this section, because saying a system is deterministic and reversible is the same as saying you can model it with a one-to-one function. So I bopped along just thinking of the functional invertibility of the rules, most of which I knew off the top of my head.
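That equivalence is easy to play with in code. Here's a minimal sketch of a deterministic, reversible law as a one-to-one function on a toy state space, in the spirit of the book's cycling examples (the specific six-state cycle is my own invention, not one from the book):

```python
# A deterministic, reversible dynamical law is a one-to-one (invertible)
# function on the space of states. Here, a made-up law on six states.
forward = {1: 2, 2: 5, 3: 1, 4: 6, 5: 3, 6: 4}

# Because the law is one-to-one, it has an inverse, which runs the
# dynamics backward in time.
backward = {nxt: prev for prev, nxt in forward.items()}

def evolve(state, steps, law):
    """Apply the dynamical law `steps` times."""
    for _ in range(steps):
        state = law[state]
    return state

# Running forward 10 steps and then backward 10 steps returns us to
# where we started -- the signature of a reversible law.
start = 1
later = evolve(start, 10, forward)
again = evolve(later, 10, backward)
assert again == start
```

A non-reversible law would be one where two states map to the same successor; then `backward` couldn't be built, because knowing the present wouldn't pin down the past.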
Sigh. You can take the mathematician out of mathematics . . .