This is the first post in my series about Quantum Field Theory. What a let-down: I will just discuss classical mechanics.
There is quantum mechanics, and then there is good old classical, Newtonian mechanics. The latter is a limiting case of the former. So there is some correspondence between the two, and there are rules that let you formulate the quantum laws from the classical laws.
But what are those classical laws?
Chances are high that classical mechanics reminds you of pulleys and levers, calculating torques of screws and Newton’s law F = ma: Force is equal to mass times acceleration.
I argue that classical dynamics is the most underrated branch of physics in terms of geek factor and philosophical appeal.
This might have been ingrained in your brain: A force is tugging at a physical object, such as earth’s gravity attracting a little ball travelling in space. Now the ball moves – it falls. Actually, the moon also falls in a sense when it orbits the earth.
When bodies move, their positions change. The strength of the gravitational force depends on the distance from the mass causing it, thus the force felt by the moving ball changes as well. This is why the three-body problem is hard: You need a computer to calculate the forces three or more planets exert on each other at every point in time.
So this is the traditional mental picture associated with classical mechanics. It follows these incremental calculations:
Force acts – things move – configuration changes – force depends on configuration – force changes.
In order to get this going you need to know the configuration at the beginning – the positions and the velocities of all planets involved.
So in summary we need:
- the dependence of the force on the position of the masses.
- the initial conditions – positions and velocities.
- Newton’s law.
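The incremental loop above can be sketched in a few lines of code – a minimal toy of my own (not from any textbook) that applies the force–move–update cycle to a ball falling under constant gravity; real orbital problems would plug in the position-dependent gravitational force instead:

```python
# Toy sketch of the incremental picture: force acts, things move,
# the configuration changes. Here the force per mass is constant
# (a = -g); for planets it would depend on the positions.

def simulate_fall(x0, v0, t_end, n=1000, g=9.81):
    """Euler-step the position and velocity of a falling ball."""
    dt = t_end / n
    x, v = x0, v0
    for _ in range(n):
        a = -g        # force / mass at the current configuration
        x += v * dt   # the configuration changes ...
        v += a * dt   # ... and the force updates the velocity
    return x, v
```

Starting from rest, after one second the ball has fallen roughly g/2 ≈ 4.9 meters. For three or more mutually attracting planets, this kind of step-by-step bookkeeping is essentially what the computer does.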
But there is an alternative description of classical dynamics, offering an alternative philosophy of mechanics so to speak. The description is mathematically equivalent, yet it feels unfamiliar.
In this case we trade the knowledge of positions and velocities for fixing the positions at a start time and an end time. Consider it a sort of game: You know where the planets are at time t1 and at time t2. Now figure out how they have moved / will move between t1 and t2. Instead of the force we consider another, probably more mysterious property:
It is called the action. The action has the dimension of [energy × time], and – like the force – it contains all the information about the system.
The action is calculated by integrating… I am reluctant to describe how the action is calculated. The action (or its field-y counterparts) will be considered the basic description of a system – something that is given, in the way forces had been considered given in the traditional picture. The important thing is: You attach a number to each imaginable trajectory, to each possible history.
The trajectory a particle traverses in the time slot t1–t2 is determined by the Principle of Least Action (which ‘replaces’ Newton’s law): The action of the system is minimal for the actual trajectory. Any deviation – such as a planet travelling in strange loops – would increase the action. It is as if a particle tested all possible paths and calculated their associated actions. Near the optimum path the action hardly varies.
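The ‘testing all paths’ idea can be made concrete numerically. Here is a sketch (my own toy example, with made-up numbers): a unit mass thrown straight up, leaving y = 0 at t = 0 and returning to y = 0 at t = T. For this system the action is the integral of kinetic minus potential energy, and the true parabolic path should yield a smaller action than any deformed path with the same endpoints:

```python
# Numerical least-action sketch (my own toy example): a unit mass
# thrown straight up, with fixed endpoints y(0) = y(T) = 0.

def action(path, T, m=1.0, g=9.81, n=2000):
    """Approximate S = integral of (kinetic - potential) along y = path(t)."""
    dt = T / n
    s = 0.0
    for i in range(n):
        t = (i + 0.5) * dt                              # midpoint rule
        v = (path(t + dt / 2) - path(t - dt / 2)) / dt  # central difference
        s += (0.5 * m * v**2 - m * g * path(t)) * dt
    return s

T = 2.0
g = 9.81
true_path = lambda t: 0.5 * g * t * (T - t)          # the actual parabola
wiggly = lambda t: true_path(t) + 0.3 * t * (T - t)  # same endpoints, deformed
```

Indeed `action(wiggly, T)` comes out larger than `action(true_path, T)` – and the smaller the deviation, the less the action changes, which is the ‘hardly varies near the optimum’ property.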
This probably sounds awkward – why would you describe nature like this?
(Of course one answer is: this description will turn out useful in the long run – considering fields in 4D space-time. But this answer is not very helpful right now).
That type of logic is useful in other fields of physics as well: A related principle lets you calculate the trajectory of a beam of light. Given the start point and the end point of a beam, light will pick the path that is traversed in minimum time (this rule is called Fermat’s principle).
This is obvious for a straight laser beam in empty space. But Fermat’s principle allows for picking the correct path in less intuitive scenarios, such as: What happens at the interface between two different materials, say air and glass? Light is faster in air than in glass, thus it makes sense to add a kink to the path and utilize the air as much as possible.
Richard Feynman used the following example: Suppose you are walking on the beach and hear a swimmer crying for help. Since this is a 1960s textbook, the swimmer is a beautiful girl. In order to reach her you have to 1) run some meters on the sandy beach and 2) swim some meters in the sea. You do an intuitive calculation about the ideal point at which to enter the water: You can run faster than you can swim, so ‘by using a little more intelligence we would realize that it would be advantageous to travel a little greater distance on land in order to decrease the distance in the water, because we go so much slower in the water’. (Source: Feynman’s Lectures Vol. 1 – available online as of a few days ago!)
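Feynman’s rescue problem is easy to check numerically. In this sketch (the distances and speeds are my own made-up numbers) the rescuer stands 30 m from the shoreline, the swimmer is 10 m out into the water and 40 m down the beach, and running is much faster than swimming; we simply scan entry points along the shoreline for the one that minimizes total time:

```python
# Feynman's lifeguard problem, numerically (my own made-up numbers):
# rescuer at (0, 30) on the sand, swimmer at (40, -10) in the water,
# shoreline along y = 0.
from math import hypot

def travel_time(x, v_run=7.0, v_swim=2.0):
    """Total time if we enter the water at shoreline point (x, 0)."""
    run = hypot(x, 30.0) / v_run           # leg on the sand
    swim = hypot(40.0 - x, 10.0) / v_swim  # leg in the water
    return run + swim

# scan candidate entry points along the shoreline in 1 cm steps
best_x = min((i * 0.01 for i in range(4001)), key=travel_time)

# a straight rescuer-to-swimmer line would cross the shore at x = 30
straight_x = 40.0 * 30.0 / (30.0 + 10.0)
```

The scan lands near x ≈ 37.7 m: the fastest route enters the water well past the straight-line crossing point, trading extra meters on the sand for fewer meters in the water – exactly the kink Fermat’s principle predicts for light entering glass.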
Those laws are called variational principles: You consider all possible paths, and the path taken is indicated by an extremum, in these cases: a minimum.
Near a minimum stuff does not vary much – the first-order derivative is zero at a minimum. Thus on varying the paths a bit you can actually feel when you are close to the minimum – in the way you, as a car driver, would feel the bottom of a valley (it can only go up from here).
Doesn’t this description add a touch of spooky multiverses to classical mechanics already? It seems as if nature has a plan, or as if we view anything that has ever happened or will ever happen from a vantage point outside of space-time.
Things get interesting when masses or charges become smeared out in space – when there is some small ‘infinitesimal’ mass at every point in space. Or generally: When something happens at every point in space. Instead of a point particle that can move in three different directions – three degrees of freedom in physics lingo – we need to deal with an infinite number of degrees of freedom.
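To get a feel for that jump in complexity, compare the bookkeeping (a toy illustration with numbers of my own choosing):

```python
# Counting degrees of freedom (toy numbers of my own):
particle = (1.0, 2.0, 3.0)  # a point particle: 3 numbers, 3 degrees of freedom

n = 100                     # grid points per axis in a crude 'space'
field = [0.0] * n**3        # a field value at every grid point: n**3 numbers
```

Refine the grid and the count grows without bound – the continuum field has infinitely many degrees of freedom.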
Then we are entering the world of fields that I will cover in the next post.