The trouble with Quantum Theory

Quantum theory, QED and in particular its direct descendant, the Standard Model, are often described as the most successful scientific theories ever devised.  They yield answers that are correct, in some cases, to ten or more significant figures.  Quantum theory has held sway for almost a century, but this longevity is no reason for complacency: there have been many theories which, for a time at any rate, could claim similar status as the most successful or longest lived to date.  Newton’s theory of gravity, for example, lasted for over three hundred years.  For all of that time it could quite legitimately lay claim to being the most successful theory of gravity ever devised, until it was superseded by Einstein’s theory of General Relativity.  Empedocles’ theory of the four elements, that everything is composed of some combination of earth, air, fire and water, lasted even longer: it was the most successful model for over two thousand years, until it was overturned in the 17th century.  That we now ridicule the theory does not alter the fact that the best scientists of the day believed it to be true for all of that period.  The longevity of a theory does not mean that it is correct, merely that a better theory has yet to be found.

Most of today’s physicists, if asked what is wrong with the Standard Model, would reply that it is incomplete; that it is missing some small but vital piece which, if we knew what it was, would allow us to complete it into a single, unified theory of everything.  The problem with this approach is that it fails to acknowledge that the Standard Model is fundamentally flawed, because the theory that underpins it, quantum theory, is itself fundamentally flawed.

It is important to differentiate here between quantum mechanics and quantum theory.  Quantum mechanics is what allows us to make all of the above-mentioned calculations, while quantum theory is an attempt to put this into context with an explanation of what is going on.  Quantum mechanics is based largely on ideas associated with probability, and this comes about mainly because of the inevitability of the uncertainty principle, which says that we cannot measure certain pairs of related variables to arbitrary accuracy.  In essence, quantum mechanics says that we have to deal with averages, standard deviations and the like when making calculations about things on an atomic and subatomic scale.  Quantum theory attempts to explain why this is necessary, and the idea that has evolved is that the uncertainty of measurement comes about because uncertainty is somehow intrinsic on the scale of the atom.

While it is certainly the case that any new theory must at least match the accuracy of the Standard Model, this does not mean that the Standard Model is itself correct.  Despite its evident successes there are many shortcomings in the currently held theory.  In this blog some of these shortcomings are examined with a view to discovering just where in the past things went astray, since it is only by determining where the current theory went wrong that a new and better theory can be developed.

Quantum mechanics’ success in describing and predicting physical phenomena is self-evident. Its theoretical underpinnings are not.  There are many aspects of quantum theory and its descendant theories which leave much to be desired.  In talking about the fine structure constant, Feynman remarked that physicists knew exactly how to measure its value, but did not understand its physical significance. The same can be said of many aspects of quantum theory: it delivers the right answers but fails to provide an adequate explanation of the physical phenomena which underlie these results.  In these instances quantum theory can best be characterised as providing a description of the phenomena rather than an explanation, and leaves the explanation largely as an article of faith.

If we trace back through the history of quantum theory to its very origins: it was Planck who first discovered that a black body could only emit energy in discrete packets or quanta, and Einstein then went on to show that light itself comes in discrete packets or quanta, later called photons. In a sense both of these were observations, not theories.  The first theoretical treatment came in the form of Bohr’s model of the hydrogen atom.  In order to get this model to work Bohr had to make a number of assumptions.  The first was that synchrotron radiation does not happen on the scale of the atom, since if it did the electron’s orbit would decay and the atom would be unstable.  The second assumption was that angular momentum could only occur in discrete steps or quanta.  It is this assumption that has been carried forward into all subsequent models and forms the foundation on which all of the rest of quantum theory rests. And yet this assumption is highly suspect: Bohr never put forward any suggestion as to the mechanism that underlay this behaviour, and if we take into account the effects of special relativity we find that the assumption does not bear scrutiny[1]. Finally, Bohr assumed that relativity does not apply on the scale of the atom and chose to ignore it completely.
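
To see how much work that single assumption does, here is a minimal numerical sketch (standard textbook algebra in SI units, nothing specific to this blog) showing that the rule that angular momentum comes in units of h/2π, combined with ordinary Coulomb attraction, is enough to reproduce the observed hydrogen energy levels:

```python
# A minimal sketch of the Bohr model. The single quantum ingredient is
# the assumption under discussion: angular momentum L = n * hbar.
# Everything else is classical circular-orbit mechanics plus Coulomb's law.
hbar = 1.054571817e-34   # reduced Planck constant (J s)
m_e = 9.1093837015e-31   # electron mass (kg)
e = 1.602176634e-19      # elementary charge (C)
k_e = 8.9875517923e9     # Coulomb constant (N m^2 C^-2)

for n in range(1, 4):
    r_n = n**2 * hbar**2 / (m_e * k_e * e**2)          # allowed orbit radius
    E_n = -m_e * k_e**2 * e**4 / (2 * hbar**2 * n**2)  # total orbital energy
    print(f"n={n}:  r = {r_n:.3e} m,  E = {E_n / e:.2f} eV")

# n=1 gives r = 5.29e-11 m (the Bohr radius) and E = -13.60 eV,
# matching the measured hydrogen ground state.
```

The model’s entire predictive success therefore hangs on the quantisation rule itself, which is precisely the assumption in question.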

The response of the quantum theorists was to obfuscate by suggesting that the particle isn’t really a particle but a wave, and that these waves can only exist as standing waves.  The underlying theory was put forward by the French physicist Louis de Broglie.  Closer examination of his ideas, however, reveals that they amount to a restatement of Bohr’s assumption, but this time in the frequency domain.  De Broglie defined the wavelength of his waves as Planck’s constant divided by the linear momentum of the orbiting electron, while at the same time knowing full well that Bohr had assumed that the angular momentum of the electron was an integer multiple of Planck’s constant divided by 2π.  On any other scale we identify the wavelength of an orbiting object with its orbital circumference, which is 2π times its angular momentum divided by its linear momentum; not, as de Broglie did, with an integer fraction of that quantity.  Little wonder then that de Broglie’s standing waves formed a harmonic sequence: it was built in to the Bohr assumption.
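
To make the circularity explicit, the standard algebra is worth setting out. Fitting a whole number of de Broglie wavelengths around a circular orbit of radius $r$ gives

$$ n\lambda = 2\pi r \quad\text{and}\quad \lambda = \frac{h}{p} \;\;\Longrightarrow\;\; L = rp = n\,\frac{h}{2\pi} = n\hbar, $$

which is exactly Bohr’s quantisation rule: each condition is a restatement of the other.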

De Broglie fails to address the question of what it is that is waving in his wave model.  In the base energy state we can identify the wavelike properties of the particle with the circular orbital motion of the particle itself, which seems a very reasonable proposition, since this is how we observe the wavelike characteristics of any orbiting body.  At higher energy levels this direct physical model breaks down, and we are left with the unanswered question of just what it is that is waving.

Schrödinger then went on to develop his wave equation.  However, it was based on de Broglie’s idea of the particle as a wave, which in turn rests on the Bohr assumption.  And so we find that this assumption runs like a thread throughout the entire theory.

From the 17th century onwards, advocates of the wave theory of light had suggested the existence of a mysterious substance, the ether, as the medium through which light waves were carried.  The idea of the ether was thoroughly discredited by Michelson and Morley in the late 19th century.  Yet despite this, physicists continually try to reinvent the ether as the medium through which the wavelike nature of particles comes about.  Latterly they have changed the name, so that now we hear about the “fabric of space-time”, as if there were some substantive material which oscillated to form the mysterious particle waves. The trouble with this idea is that if there were such a thing as a fabric of space-time, it would have to be made out of particles.

Next on the scene was Werner Heisenberg.  He demonstrated that it is not possible to know both the momentum and the position of an object to arbitrary accuracy; there is always some degree of uncertainty.  Heisenberg originally attributed this to a practical limitation of measurement where the object being measured is of the same order of magnitude as the tool used to measure it: the so-called Observer Effect.  Eventually Bohr persuaded him that this was not the mechanism that underlay uncertainty, but that on the scale of the atom, in de Broglie’s wave-particles, uncertainty was somehow intrinsic to the particle.  The particle does not exist as such until it is observed in some way, whereupon it reveals itself to have whatever property is being sought. The problem with this is that nobody has ever been able to describe how intrinsic uncertainty works.  The equations work, and they make predictions which are remarkably accurate, but the theoretical model is non-existent.  The equations, however, work equally well if we adopt Heisenberg’s original hypothesis, based on the Observer Effect.
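
For what it is worth, the uncertainty relation itself is indifferent to which interpretation is adopted.  Here is a minimal numerical sketch (illustrative only, with units chosen so that hbar = 1) showing a Gaussian wave packet sitting at the lower bound of the relation, with no appeal to any interpretation at all:

```python
import numpy as np

# Numerical check of the uncertainty relation for a Gaussian wave packet.
# Units are chosen so that hbar = 1; the packet width sigma is arbitrary.
hbar = 1.0
N, L = 4096, 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]

sigma = 1.3
psi = np.exp(-x**2 / (4 * sigma**2))            # Gaussian wave packet
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)     # normalise

# Position spread from |psi|^2 (mean position is zero by symmetry)
prob_x = np.abs(psi)**2
spread_x = np.sqrt(np.sum(x**2 * prob_x) * dx)

# Momentum spread from the Fourier transform; p = hbar * k
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
dk = k[1] - k[0]
prob_k = np.abs(np.fft.fft(psi))**2
prob_k /= np.sum(prob_k) * dk                   # normalise in k-space
spread_p = hbar * np.sqrt(np.sum(k**2 * prob_k) * dk)

print(f"dx * dp = {spread_x * spread_p:.4f}   (hbar / 2 = {hbar / 2})")
# prints dx * dp = 0.5000: the Heisenberg lower bound emerges from the
# wave mathematics alone, with nothing said about what the wave "is"
```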

If all of this seems far-fetched, it is; and yet physicists believe that this is an accurate model of the way the universe works. Throughout the development of the model we see a common theme.  The idea is put forward that some observed phenomenon can be explained as if some underlying theoretical idea were true.  From there it is a simple semantic transformation to make it actually true.  So, to explain the energy levels of the hydrogen atom, it is as if the angular momentum is quantised.  To explain the wavelike nature of the particle, it is as if the particle has a wavelength of Planck’s constant divided by its linear momentum.  It is as if uncertainty, which has another perfectly rational explanation, does not follow that explanation, but instead is somehow, in ways that are never explained, intrinsic to the particle.  This transformation of a proposition into a fact is another common thread that runs throughout quantum theory. What we have, in fact, is a theory that describes how things behave but fails utterly to describe what they are and how they work.

At the heart of it all is Bohr’s assumption that angular momentum is quantised.  There is of course no proof that this is the case, either experimental or mathematical.  The one experiment which is often cited as showing angular momentum to be quantised does no such thing.  In fact the Stern-Gerlach experiment shows that there are spinning objects whose angular momentum is around one million times smaller than Planck’s constant, which means that the idea that angular momentum can only exist in discrete quanta which are multiples of Planck’s constant simply cannot be true.

Mathematically, too, there is no evidence for the quantisation of angular momentum.  A number of so-called proofs have been cited, but every single one of them begins with the proposition that angular momentum is quantised and goes on to show that therefore it must be quantised: the very definition of a circular argument.  The only valid proof would be one that begins in the classical domain and shows a causal link between changes in orbital velocity and orbital radius, while at the same time taking into account the effects of relativity on the mass of the orbiting electron.  Were such a genuine proof to exist, its author would certainly have been awarded the Nobel Prize, since it would in effect provide the missing link between the domains of quantum and classical physics.

When the Bohr model threw up the quirky idea of the quantum leap, that objects can move from one point in space to another without ever occupying any point in between, physicists in general and Bohr in particular should have called into question the assumptions that underlay his theoretical model.  They chose not to; instead they have spent the best part of the last one hundred years trying to define a language with which to describe the quantum leap, without ever calling it such.

To be fair to Bohr and his cohort, the branch of mathematics which can offer an explanation as to why the energy levels of the atom should be discrete did not emerge until some 30 years later.  Sampling theory is the theory that bridges the gap between continuous time-dependent variables and discrete solutions.  Unfortunately for physics it emerged in the late 1940s and, worse still, it was invented by engineers to solve practical problems.
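
The key result here is the Shannon sampling theorem, which states that a band-limited continuous signal is completely determined by a discrete set of samples and can be rebuilt from them.  A minimal sketch (the test signal and rates are arbitrary illustrative choices):

```python
import numpy as np

# A minimal sketch of the Shannon sampling theorem: a continuous signal
# containing no frequencies above f_max is fully determined by discrete
# samples taken faster than the Nyquist rate 2 * f_max.
f_max = 3.0   # highest frequency present in the signal (Hz)
fs = 8.0      # sampling rate, comfortably above 2 * f_max

def signal(t):
    # a band-limited test signal: two tones, at 1 Hz and 3 Hz
    return np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.cos(2 * np.pi * 3.0 * t)

n = np.arange(-2000, 2000)        # discrete sample indices
samples = signal(n / fs)          # the only information retained

t = np.linspace(0.0, 4.0, 1000)   # a "continuous" time axis for comparison
# Whittaker-Shannon interpolation: x(t) = sum_n x(n/fs) * sinc(fs*t - n)
recon = np.sinc(fs * t[:, None] - n[None, :]) @ samples

print("max reconstruction error:", np.max(np.abs(recon - signal(t))))
# the error is tiny and shrinks as more samples are kept: the discrete
# samples and the continuous signal carry the same information
```

The theorem itself is standard engineering mathematics; whether it is the right bridge to the atom’s discrete energy levels is the argument of this blog.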

The trouble with quantum theory lies in its very origins: it failed to follow the conventional scientific wisdom of rejecting a theory which throws up a paradox, and instead bends the facts to fit underlying assumptions which should long since have been discredited.
