The Man Behind the Curtain
Physics is not always the seamless subject that it pretends to be
Perhaps our complacency is due to the fact that we have written down a plausible equation. Physicists have long believed that mathematics is the Rosetta Stone for unlocking the secrets of Nature, and since a famous 1960 essay by Eugene Wigner entitled “The Unreasonable Effectiveness of Mathematics in the Natural Sciences,” the conviction has become an article of faith. It seems to me, though, that the “God is a mathematician” viewpoint is one of selective perception. The great swindle of introductory physics is that every problem has an exact answer. Not only that, students are expected to find it. Such an approach inculcates our charges with an expectation that is, in fact, exactly contrary to the true state of the world. Vanishingly few problems in physics have exact solutions, and a physicist’s career is one of finding approximations and hoping not to be too embarrassed by them.
In a freshman course we introduce the simple pendulum—nothing more than a mass on the end of a string that oscillates back and forth. Initially, Newton’s laws lead to an equation that is too hard to solve, and so we admonish students to simplify it by assuming that the pendulum is executing small oscillations. Then the exercise becomes easy. Well, not only is the original problem too difficult for freshmen, it has no exact solution, at least not in terms of “elementary functions” like sines and cosines. Advanced texts tell you that an exact solution does exist, but the use of the term exact for such animals is debatable. In any case, replace the string by a spring and the problem can easily be made impossible to everyone’s satisfaction. One must distinguish the world from the description afforded by mathematics. As Einstein famously put it, contra Wigner, “As far as the laws of mathematics refer to reality, they are not certain; as far as they are certain, they do not refer to reality.”
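To see how the small-oscillation swindle plays out numerically, here is a minimal sketch (the function names are mine, not from any textbook). The small-angle approximation gives the familiar period T = 2π√(L/g), independent of amplitude; the “exact” period can be written in terms of the complete elliptic integral K(k) with k = sin(θ₀/2), which is readily computed via the arithmetic–geometric mean:

```python
import math

def small_angle_period(length, g=9.81):
    """Small-oscillation approximation: T = 2*pi*sqrt(L/g), amplitude-independent."""
    return 2 * math.pi * math.sqrt(length / g)

def exact_period(length, theta0, g=9.81):
    """'Exact' period T = 4*sqrt(L/g)*K(k), k = sin(theta0/2), where the complete
    elliptic integral is K(k) = pi / (2 * agm(1, sqrt(1 - k^2)))."""
    k = math.sin(theta0 / 2)
    a, b = 1.0, math.sqrt(1 - k * k)
    while abs(a - b) > 1e-15:          # arithmetic-geometric mean iteration
        a, b = (a + b) / 2, math.sqrt(a * b)
    K = math.pi / (2 * a)
    return 4 * math.sqrt(length / g) * K

# Compare the two for a 1-meter pendulum at increasing amplitudes.
for theta0_deg in (5, 30, 90):
    theta0 = math.radians(theta0_deg)
    t_small = small_angle_period(1.0)
    t_exact = exact_period(1.0, theta0)
    print(f"{theta0_deg:3d} deg: small-angle {t_small:.4f} s, "
          f"exact {t_exact:.4f} s, error {100*(t_exact - t_small)/t_exact:.2f}%")
```

At 5 degrees the approximation is good to a twentieth of a percent; by 90 degrees it is off by well over ten percent, and the discrepancy only grows from there.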
The maxim might be taken to heart a few weeks later in a freshman course when instructors introduce their students to Newton’s law of gravity. The famous law works exquisitely well, of course, but a singular strangeness goes unremarked. According to the equation, if two objects become infinitely close to one another, the force of attraction between them becomes infinite. Infinite forces don’t appear in Nature—at least we hope they don’t—and we dismiss this pathology with the observation that real objects have a finite size and their centers never get so close to each other that we need to worry. But the first equation in any freshman electricity and magnetism course is “Coulomb’s law,” which governs the attraction or repulsion of electrical charges and is identical in form to Newton’s law. Now, in modern physics we often tell students that electrons and protons are point particles. In that case, you really do need to worry about infinite forces, and it is exactly this difficulty that led to modern field theories, such as quantum electrodynamics. Well, Newton himself said, “Hypotheses non fingo”: “Look guys, the equation works, usually.”
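The pathology is easy to exhibit with a few lines of arithmetic. A minimal sketch, using the standard values of the Coulomb constant and the elementary charge (the function name is hypothetical): because the force scales as 1/r², halving the separation quadruples the force, and nothing in the equation stops it from growing without bound as r shrinks toward zero.

```python
# Coulomb's law for two point charges: F = k * q1 * q2 / r^2,
# identical in form to Newton's law of gravity.
K_COULOMB = 8.99e9     # Coulomb constant, N*m^2/C^2
E_CHARGE = 1.602e-19   # elementary charge, C

def coulomb_force(r):
    """Magnitude of the force between two elementary charges a distance r apart."""
    return K_COULOMB * E_CHARGE**2 / r**2

# As the separation shrinks from atomic to sub-nuclear scales,
# the predicted force grows without limit.
for r in (1e-10, 1e-12, 1e-15):
    print(f"r = {r:.0e} m: F = {coulomb_force(r):.3e} N")
```

For true point particles there is no finite size to save us, which is exactly the difficulty the essay describes.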
Electricity and magnetism courses certainly hold their share of mysteries. The highlight of any beginning course on this subject, at least for the instructor, is Maxwell’s equations, the equations that unified electricity and magnetism into electromagnetism. Soon after postulating them, we demonstrate to our students that light consists of traveling electric and magnetic fields, oscillating at right angles to one another. We next assert that light exerts a pressure on matter; it is this radiation pressure that provides for the detonation of hydrogen bombs and the possibility of solar sails. A common explanation of light pressure in undergraduate texts is that the electric field of the light wave causes electrons to accelerate in one direction, then the magnetic field pushes them forward. Not only is this explanation completely wrong, despite its appearance in fifth editions of books, but correcting it requires introducing a famous ad hoc suggestion known as the Abraham-Lorentz model, which does reproduce the phenomenon. Putting the Abraham-Lorentz model on a firm footing, on the other hand, led to the development of quantum electrodynamics. Quantum electrodynamics itself is, however, famously riddled with infinities, and to abolish them requires the further ad hoc procedure of renormalization, which was so distasteful to Paul Dirac that he ceased doing physics altogether. Although the theory of renormalization has advanced since those days, many physicists would echo Richard Feynman, one of the technique’s inventors, who called it “hocus-pocus.” Thus it is not entirely clear whether physics has ever provided an adequate underpinning to the wisdom so blithely dispensed in first-year texts.