(Guest Post) – Hidden Variables: Just a Little Shy?

Time for a lengthy and somewhat provocative guest post on the subject of the interpretation of quantum mechanics!

–o–

Galileo advocated the heliocentric system in a Socratic dialogue. Now that the Copenhagen injunction against interpreting quantum mechanics has lifted, here is a dialogue about a way of looking at it that promotes progress and matches Einstein’s scepticism that God plays dice. It is embarrassing that we can predict properties of the electron to one part in a billion, yet we cannot predict its motion in an inhomogeneous magnetic field in apparatus nearly 100 years old. It is tragic that nobody is trying to predict it, because the successes of quantum theory, in combination with its strangeness and 20th century metaphysics, have led us to excuse its shortcomings. The speakers are Neo, a modern physicist who works in a different area, and Nino, a 19th century physicist who went to sleep in 1900 and recently awoke. – Anton Garrett

Nino: The ultra-violet catastrophe – what about that? We were stuck with infinity when we integrated the amount of radiation emitted by an object over all wavelengths.

Neo: The radiation curve fell off at shorter wavelengths. We explained it with something called quantum theory.
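The catastrophe, and its quantum resolution, can be seen in a quick numerical sketch. The formulas are the standard Rayleigh-Jeans and Planck expressions for the spectral energy density of cavity radiation; the temperature and wavelength range are arbitrary choices for illustration:

```python
import numpy as np

# Standard blackbody formulas; temperature and wavelength grid chosen for illustration.
h, c, k = 6.626e-34, 2.998e8, 1.381e-23   # SI values
T = 5000.0                                 # kelvin

def rayleigh_jeans(lam):
    # Classical result: diverges as lam -> 0 (the ultraviolet catastrophe).
    return 8 * np.pi * k * T / lam**4

def planck(lam):
    # Quantum result: the exponential factor cuts off short wavelengths.
    return (8 * np.pi * h * c / lam**5) / np.expm1(h * c / (lam * k * T))

lam = np.logspace(-8, -4, 4000)            # 10 nm .. 100 um
dlam = np.diff(lam)
u_rj = np.sum(rayleigh_jeans(lam[:-1]) * dlam)      # Riemann sum of each curve
u_planck = np.sum(planck(lam[:-1]) * dlam)
# Pushing the lower wavelength limit toward zero sends u_rj to infinity,
# while u_planck converges to a finite total energy density.
```

Integrating the Rayleigh-Jeans curve grows without bound as the lower wavelength limit shrinks; the Planck curve integrates to a finite total, which is what rescued Nino's integral.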

Nino: That’s wonderful. Tell me about it.

Neo: I will, but there are some situations in which quantum theory doesn’t predict what will happen deterministically – it predicts only the probabilities of the various outcomes that are possible. For example, here is what we call a Stern-Gerlach apparatus, which generates a spatially inhomogeneous magnetic field.i It is placed in a vacuum and atoms of a certain type are shot through it. The outermost electron in each atom will set off either this detector, labelled ‘A’, or that detector, labelled ‘B.’ All the electrons coming out of detector B (say) have identical quantum description, but if we put them through another Stern-Gerlach apparatus oriented differently then some will set off one of the two detectors associated with it, and some will set off the other.
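Neo's two-apparatus experiment can be sketched numerically. The sketch assumes the standard spin-1/2 result that, after selection by the first apparatus, a second apparatus tilted by an angle theta fires its 'up' detector with probability cos²(theta/2); quantum theory predicts only this frequency, never the individual click:

```python
import numpy as np

# Standard quantum prediction for consecutive Stern-Gerlach measurements on a
# spin-1/2 particle; the angle and sample size are illustrative choices.
def p_up(theta):
    """Probability that the tilted apparatus' 'up' detector fires,
    given the particle was selected 'up' by the first apparatus."""
    return np.cos(theta / 2) ** 2

rng = np.random.default_rng(0)
theta = np.pi / 3                        # 60 degrees between the two apparatuses
clicks = rng.random(100_000) < p_up(theta)
# Identical quantum descriptions, differing outcomes: only the frequency
# of clicks is predicted, close to cos^2(30 deg) = 0.75 here.
print(clicks.mean())
```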

Nino: Probabilistic prediction is an improvement on my 19th century physics, which couldn’t predict anything at all about the outcome. I presume that physicists in your era are now looking for a theory that predicts what happens each time you put a particle through successive Stern-Gerlach apparatuses.

Neo: Actually we are not. Physicists generally think that quantum theory is the end of the line.

Nino: In that case they’ve been hypnotised by it! If quantum mechanics can’t answer where the next electron will go then we should look beyond it and seek a better theory that can. It would give the probabilities generated by quantum theory as averages, conditioned on not controlling the variables of the new theory more finely than quantum mechanics specifies.

Neo: They are talked of as ‘hidden variables’ today, often hypothetically. But quantum theory is so strange that you can’t actually talk about which detector the atom will set off.

Nino: Nevertheless only one of the detectors goes off. If quantum theory cannot answer which then we should look for a better theory that can. Its variables are manifestly not hidden, for I see their effect very clearly when two systems with identical quantum description behave differently. ‘Hidden variables’ is a loaded name. What you’ve not learned to do is control them. I suggest you call them shy variables.

Neo: Those who say quantum theory is the end of the line argue that the universe is not deterministic – genuinely random.

Nino: It is our theories which are deterministic or not. ‘Random’ is a word that makes our uncertainty about what a system will do sound like the system itself is uncertain. But how could you ever know that?

Neo: Certainly it is problematic to define randomness mathematically. Probability theory is the way to make inference about outcomes when we aren’t certain, and ‘probability’ should mean the same thing in quantum theory as anywhere else. But if you take the hidden variable path then be warned of what we found in the second half of the 20th century. Any hidden variables must be nonlocal.

Nino: How is that?

Neo: Suppose that the result of a measurement of a variable for a particle is determined by the value of a variable that is internal to the particle – a hidden variable. I am being careful not to say that the particle ‘had’ the value of the variable that was measured, which is a stronger statement. The result of the measurement tells us something about the value of its internal variable. Suppose that this particle is correlated with another – if, for example, the pair had zero combined angular momentum when previously they were in contact, and neither has subsequently interacted with anything else. The correlation now tells you something about the internal variable of the second particle. For situations like this a man called Bell derived an inequality; one cannot be more precise because of the generality about how the internal variables govern the outcome of measurements.ii But Bell’s inequality is violated by observations on many pairs of particles (as correctly predicted by quantum mechanics). The only physical assumption was that the result of a measurement on a particle is determined by the value of a variable internal to it – a locality assumption. So a measurement made on one particle alters what would have been seen if a measurement had been made on one of the other particles, which is the definition of nonlocality. Bell put it differently, but that’s the content of it.iii
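The arithmetic behind the violation can be checked directly. The correlation E(a,b) = -cos(a-b) is the standard quantum prediction for spin-singlet pairs, and the angles below are the conventional choices that maximise the violation (an illustration added here, not part of the dialogue):

```python
import numpy as np

# CHSH form of Bell's inequality: any local hidden-variable account obeys
# |S| <= 2, where S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
def E(a, b):
    """Quantum correlation between spin measurements along directions a, b
    on a singlet pair (standard result)."""
    return -np.cos(a - b)

# Conventional angle choices that maximise the quantum value of |S|.
a, ap = 0.0, np.pi / 2
b, bp = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
# |S| = 2*sqrt(2) ~ 2.83, exceeding the local bound of 2.
print(abs(S))
```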

Nino: Nonlocality is nothing new. It was known as “action at a distance” in Newton’s theory of gravity, several centuries ago.

Neo: But gravitational force falls off as the inverse square of distance. Nonlocal influences in Bell-type experiments are heedless of distance, and this has been confirmed experimentally.iv

Nino: In that case you’ll need a theory in which influence doesn’t decay with distance.

Neo: But if influence doesn’t decay with distance then everything influences everything else. So you can’t consider how a system works in isolation any more – an assumption which physicists depend on.

Nino: We should view the fact that it often is possible to make predictions by treating a system as isolated as a constraint on any nonlocal hidden variable theory. It is a very strong constraint, in fact.

Neo: An important further detail is that, in deriving Bell’s inequality, there has to be a choice of how to set up each apparatus, so that you can choose what question to ask each particle. For example, you can choose the orientation of each apparatus so as to measure any one component of the angular momentum of each particle.

Nino: Then Bell’s analysis can be adapted to verify that two people, who are being interrogated in adjacent rooms from a given list of questions, are in clandestine contact in coordinating their responses, beyond having merely pre-agreed their answers to questions on the list. In that case you have a different channel – if they have sharper ears than their interrogators and can hear through the wall – but the nonlocality follows simply from the data analysis, not the physics of the channel.

Neo: In that situation, although the people being interrogated can communicate between the rooms in a way that is hidden from their interrogators, the interrogators in the two rooms cannot exploit this channel to communicate with each other, because the only way they can infer that communication is going on is by getting together to compare their sets of answers. Correspondingly, you cannot use pre-prepared particle pairs to infer the orientation of one detector by varying the orientation of the second detector and looking at the results of particle measurements at that second detector alone. In fact there are general no-signalling theorems associated with the quantum level of description.v There are also more striking verifications of nonlocality using correlated particle pairs,vi and with trios of correlated particles.vii

Nino: Again you can apply the analysis to test for clandestine contact between persons interrogated in separate rooms. Let me explain why I would always search for the physics of the communication channel between the particles, the hidden variables. In my century we saw that tiny particles suspended in water, just visible under our microscopes, jiggle around. We were spurred to find the reason – the particles were being jostled by smaller ones still, which proved to be the smallest unit you can reach by subdividing matter using chemistry: atoms. Upon the resulting atomic theory you have built quantum mechanics. In the nearly 100 years since, you haven’t found hidden variables underneath quantum mechanics. You suggest they aren’t there to be found but essentially nobody is looking, so that would be a self-fulfilling prophecy. If the non-determinists had been heeded about Brownian motion – and there were some in my time, influenced by philosophers – then the 21st century would still be stuck in the pre-atomic era. If one widget of a production line fails under test but the next widget passes, you wouldn’t say there was no reason; you’d revise your view that the production process was uniform and look for variability in it, so that if you learn how to deal with it you can make consistently good widgets.

Neo: But production lines aren’t based on quantum processes!

Nino: But I’m not wedded to quantum mechanics! I am making a point of logic, not physics. Quantum mechanics has answered some questions that my generation couldn’t and therefore superseded the theories of my time, so why shouldn’t a later generation than yours supersede quantum mechanics and answer questions that you couldn’t? It is scientific suicide for physicists to refuse to ask a question about the physical world, such as what happens next time I put a particle through successive Stern-Gerlach apparatuses. You say you are a physicist but the vocation of physicists is to seek to improve testable prediction. If you censor or anaesthetise yourself, you’ll be stuck up a dead end.

Neo: Not so fast! Nonlocality isn’t the only radical thing. The order of measurements in a Bell setup is not Lorentz-invariant, so hidden variables would also have to be acausal – effect preceding cause.

Nino: What does ‘Lorentz-invariant’ mean, please?

Neo: This term came out of the theory that resolved your problems about aether. Electromagnetic radiation has wave properties but does not need a medium (‘aether’) to ‘do the waving’ – it propagates through a vacuum. And its speed relative to an observer is always the same. That matches intuition, because there is no preferred frame that is defined by a medium. But it has a counter-intuitive consequence, that velocities do not add linearly. If a light wave overtakes me at c (lightspeed) then a wave-chasing observer passing me at v still experiences the wave overtaking him at c, although our familiar linear rule for adding velocities predicts (c – v). That rule is actually an approximation, accurate at speeds much less than c, which turns out to be a universal speed limit. For the speed of light to be constant for relatively moving observers then, because speed is distance divided by time, space and time must look different to these observers. In fact even the order of events can look different to two relatively moving observers! The transformation rule for space and time is named after a man called Lorentz. That not just the speed of light but all physical laws should look the same for observers related by the Lorentz transformation is called the relativity principle. Its consequences were worked out by a man called Einstein. One of them is that mass is an extremely concentrated form of energy. That’s what fuels the sun.
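Neo's claim that velocities do not add linearly follows from the standard relativistic velocity-transformation rule, sketched here (the function name and sample numbers are illustrative):

```python
# Standard relativistic rule for transforming a velocity between observers.
c = 299_792_458.0   # speed of light, m/s

def transform(u, v):
    """Speed u measured in the lab, as seen by an observer moving at v.
    Reduces to the familiar u - v when both speeds are much less than c."""
    return (u - v) / (1 - u * v / c**2)

# The wave-chaser at 0.9c still sees light overtake at c, not 0.1c:
print(transform(c, 0.9 * c))
# Everyday speeds recover the linear rule to fantastic accuracy:
print(transform(20.0, -10.0))   # very nearly 30 m/s
```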

Nino: He was obviously a brilliant physicist!

Neo: Yes, although he would have been shocked by Bell’s theorem.viii He asserted that God did not play diceix – determinism – but he also spoke negatively of nonlocality, as “spooky actions at a distance.”x Acausality would have shocked him even more. The order of measurements on the two particles in a Bell setup can be different for two relatively moving observers. So an observer dashing through the laboratory might see the measurements done in the reverse of the order the experimenter logs. So at the hidden-variable level we cannot say which particle signals to which as a result of the measurements being made, and the hidden variables must be acausal. Acausality is also implied in ‘delayed choice’ experiments, as follows.xi Light – and, remarkably, all matter – has both particle properties (it can pass through a vacuum) and wave properties (diffraction), but only displays one property at a time. Suppose we decide, after a unit of light has passed a pair of Young’s slits, whether to measure the interference pattern – due to its diffractive properties as a wave propagating through both slits – or its position, which would tell us which single slit it traversed. According to quantum mechanics our choice seems to determine whether it traverses one slit or both, even though we made that choice after it had passed through! Acausality means that you would have to know the future in order to predict it, so this is a limit on prediction – confirming the intuition of quantum theorists that you can’t do better.

Nino: That will be so in acausal experimental situations, I accept. I believe the theory of the hidden variables will explain why time, unlike space, passes, and also entail a no-time-paradox condition.

Neo: Today we say that a theory must not admit closed time-like trajectories in space-time.

Nino: But a working hidden-variable theory would still give a reason why the system behaves as it did, even if we can’t access the information needed for prediction in situations inferred to be acausal. You can learn a lot from evolution equations even if you don’t know the initial conditions. And often the predictions of quantum theory are compatible with locality and causality, and in those situations the hidden variables might predict the outcome of a measurement exactly, outdoing quantum theory.

Neo: It also turned out that some elements of the quantum formalism do not correspond to things that can be measured experimentally. That was new in physics and forced physicists to think about interpretation. If differing interpretations give the same testable predictions, how do we choose between them?

Nino: Metaphysics then enters and it may differ among physicists, leading to differing schools of interpretation. But non-physical quantities have entered the equations of a theory before. A potential appears in the equations of Newtonian gravity and electromagnetism, but only differences in potential correspond to something physical.

Neo: That invariance, greatly generalised, lies behind the ‘gauge’ theories of my era. These are descriptions of further fundamental forces, conforming to the relativity principle that physics must look the same to relatively moving observers related by the Lorentz transformation. That includes quantum descriptions, of course.xii It turned out that atoms have their positive charge in a nucleus contributing most of the mass of an atom, which is orbited by much lighter negatively charged particles called electrons – different numbers of electrons for different chemical elements. Further forces must exist to hold the positively charged particles in the nucleus together against their mutual electrical repulsion. These further forces are not familiar in everyday life, so they must fall off with distance much faster than the inverse square law of electromagnetism and gravity. Mass ‘feels’ gravity and charge feels electromagnetic forces, and there are analogues of these properties for the intranuclear forces, which are also felt by other more exotic particles not involved in chemistry. We have a unified quantum description of the intranuclear forces combined with electromagnetism that transforms according to the relativity principle, which we call the standard model, but we have not managed to incorporate gravity yet.

Nino: But this is still a quantum theory, still non-deterministic?

Neo: Ultimately, yes. But it gives a fit to experiment that is better than one part in a thousand million for certain properties of the electron – which it does predict deterministically.xiii That is the limit of experimental accuracy in my era, and nobody has found an error anywhere else.

Nino: That’s magnificent, and it says a huge amount for the progress of experimental physics too. But I still see no reason to move from can-do to can’t-do in aiming to outdo quantum theory.

Neo: Let me explain some quantum mechanics.xiv The variables we observe in regular mechanics, such as momentum, correspond to operators in quantum theory. The operators evolve deterministically according to the Hamiltonian of the system; waves are just free-space solutions. When you measure a variable you get one of its eigenvalues, which are real-valued because the operators are Hermitian. Quantum mechanics gives a probability distribution over the eigenspectrum. After the measurement, the system’s quantum description is given by the corresponding eigenfunction. Unless the system was already described by that eigenfunction before the measurement, its quantum description changes. That is a key difference from classical mechanics, in which you can in principle observe a system without disturbing it. Such a change (‘collapse’) makes it impossible to determine simultaneously the values of variables whose operators do not have coincident eigenfunctions – in other words, non-commuting operators. It has even been shown, using commuting subsets of the operators of a system in which members of differing sets do not commute, that simultaneous values of differing variables of a system cannot exist.xv
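A minimal sketch of the formalism Neo describes, using a spin measurement along a tilted axis as the example (the Pauli-matrix representation and the angle are choices made here for illustration): the operator is Hermitian, its eigenvalues are real, and the probability distribution over the eigenspectrum comes from projecting the state onto the eigenfunctions.

```python
import numpy as np

# Spin operator along an axis tilted by theta from z, in the standard
# Pauli-matrix representation (a Hermitian 2x2 matrix).
theta = np.pi / 3
S = np.array([[np.cos(theta),  np.sin(theta)],
              [np.sin(theta), -np.cos(theta)]])

evals, evecs = np.linalg.eigh(S)          # real eigenvalues, since S is Hermitian
psi = np.array([1.0, 0.0])                # state selected 'up along z'

# Born rule: probability of each eigenvalue is the squared overlap of the
# state with the corresponding eigenfunction.
probs = np.abs(evecs.conj().T @ psi) ** 2
print(evals)    # the two eigenvalues, -1 and +1
print(probs)    # sums to 1; the +1 outcome has probability cos^2(theta/2)
```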

Nino: Does that result rest on any aspect of quantum theory?

Neo: Yes. Unlike Bell setups, which compare experiment with a locality criterion, neither of which has anything to do with quantum mechanics (it simply predicts what is seen), this further result is founded in quantum mechanics.

Nino: But I’m not committed to quantum mechanics! This result means that the hidden variables aren’t just the values of all the system variables, but comprise something deeper that somehow yields the system variables and is not merely equivalent to the set of them.

Neo: Some people suggest that reality is operator-valued and our perplexities arise because of our obstinate insistence on thinking in – and therefore trying to measure – scalars.

Nino: An operator is fully specified by its eigenvalues and eigenfunctions; it can be assembled as a sum over them, so if an operator is a real thing then they would be real things too. If a building is real, the bricks it is constructed from are real. But I still insist that, like any other physical theory, quantum theory should be regarded as provisional.

Neo: Quantum theory answered questions that earlier physics couldn’t, such as why electrons do not fall into the nucleus of an atom although opposite charges attract. They populate the eigenspectrum of the Hamiltonian for the Coulomb potential, starting at the lowest energy eigenfunction, with not more than two electrons per eigenfunction. When the electrons are disturbed they jump between eigenvalues, so that they cannot fall continuously. This jumping is responsible for atomic spectral lines, whose vacuum wavelength is inversely proportional to the difference in energy of the eigenvalues. That is why quantum mechanics was accepted. But the difficulty of understanding it led scientists to take a view, championed by a senior physicist in Copenhagen, that quantum mechanics was merely a way of predicting measurements, rather than telling us how things really are.
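Neo's proportionality can be worked through for hydrogen, using the textbook energy eigenvalues E_n = -13.6 eV / n². A jump from n = 3 to n = 2 should give the red Balmer line (standard numbers, added here as illustration):

```python
# Hydrogen spectral line from the difference of two energy eigenvalues.
h = 6.62607015e-34     # Planck constant, J s
c = 2.99792458e8       # speed of light, m/s
eV = 1.602176634e-19   # joules per electronvolt

def E(n):
    """Hydrogen energy eigenvalue for principal quantum number n."""
    return -13.6057 * eV / n**2

# Vacuum wavelength is inversely proportional to the energy difference.
lam = h * c / (E(3) - E(2))
print(lam * 1e9)   # nanometres; the red Balmer line, ~656 nm
```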

Nino: That distinction is untestable even in classical mechanics. This is really about motivation. If you don’t believe that things ‘out there’ are real then you’ll have no motivation to think about them. The metaphysics beneath physics supposes that there is order in the world and that humans can comprehend it. Those assumptions were general in Europe when modern physics began. They came from the belief that the physical universe had an intelligent creator who put order in it, and that humans could comprehend this order because they had things in common with the creator (‘in his image’). You don’t need a religious faith to enter physics once it has got going and the patterns are made visible for all to see; but if ever the underlying metaphysics again becomes relevant, as it does when elements of the formalism do not correspond to things ‘out there,’ then such views will count. If you believe there is comprehensible and interesting order in the material universe then you will be more motivated to study it than others who suppose that differentiation is illusion and that all is one, i.e. the monist view held by some other cultures. So, in puzzling why people aren’t looking for those not-so-hidden variables, let me ask: Did the view that nature was underpinned by a divine creator get weaker where quantum theory emerged, in Europe, in the era before the Copenhagen view?

Neo: Religion was weakening during your time, as you surely noticed. That trend did continue.

Nino: I suggest the shift from optimism to defeatism about improving testable prediction is a reflection of that change in metaphysics reaching a tipping point. Culture also affects attitudes; did anything happen that induced pessimism between my era and the birth of quantum mechanics?

Neo: The most terrible war in history to that date took place in Europe. But we have moved on from the Copenhagen ‘interpretation’ which was a refusal of all questions about the formalism. That stance is acceptable provided it is seen as provisional, perhaps while the formalism is developed; but not as the last word. Physicists eventually worked out the standard model for the intranuclear forces in combination with electromagnetism. Bell’s theorem also catalysed further exploration of the weirdness of quantum mechanics, from the 1960s; let me explain. Before a measurement of one of its variables, a system is generally not in an eigenstate of the corresponding operator. This means that its quantum description must be written as a superposition of eigenstates. Although measurement discloses a single eigenvalue, remarkable things can be done by exploiting the superposition. We can gain information about whether the triggers of light-activated bombs are good or duds in an experiment in which light might be shone at each, according to a quantum mechanism.xvi (This does involve ‘sacrificing’ some of the working bombs, but without the quantum trick you would be completely stuck, because the bomb is booby-trapped to go off if you try to dismantle the trigger.) Even though we have electronic computers today that do billions of operations per second, many calculations are still too large to be done in feasible lengths of time, such as integer factorisation. We can now conceive of quantum computers that exploit the superposition to do many calculations in one, and drastically speed things up.xvii Communications can be made secure in the sense that eavesdropping cannot be concealed, as it would unavoidably change the quantum state of the communication system.
The apparent reality of the superposition in quantum mechanics, together with the non-existence of definite values of variables in some circumstances, mean that it is unclear what in the quantum formalism is physical, and what is our knowledge about the physical – in philosophical language, what is ontological and what is epistemological. Some people even suggest that, ultimately, numbers – or at least information quantified as numbers – are physics.
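The bomb-testing scheme Neo mentions is the Elitzur-Vaidman thought experiment, and its probabilities drop out of a little matrix arithmetic. The beam-splitter matrix and port labelling below are conventional choices for a balanced Mach-Zehnder interferometer, added here as illustration:

```python
import numpy as np

# 50/50 beam splitter: reflection picks up a factor of i.
BS = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)

psi_in = np.array([1, 0], dtype=complex)      # photon enters port 0

# No bomb: two beam splitters in sequence send every photon to the
# 'bright' port (port 1); the 'dark' port (port 0) never fires.
no_bomb = BS @ BS @ psi_in
p_dark_no_bomb = abs(no_bomb[0]) ** 2

# Live bomb in arm 1: it acts as a which-arm measurement. Half the time
# the photon takes arm 1 and the bomb explodes; otherwise the surviving
# arm-0 amplitude reaches the second beam splitter alone.
after_first = BS @ psi_in
survive_amp = np.array([after_first[0], 0])   # arm-1 amplitude absorbed
out = BS @ survive_amp
p_dark_bomb = abs(out[0]) ** 2                # 1/4: a dark-port click reveals
                                              # a live bomb it never touched
print(p_dark_no_bomb, p_dark_bomb)
```

A click at the dark port is impossible for a dud, so it certifies a working trigger without a photon ever reaching it.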

Nino: That’s a woeful confusion – information about what? As for deeper explanation, when things get weird you either give up on going further – which no scientist should ever do – or you take the weirdness as a clue. Any no-hidden-variables claim must involve assumptions or axioms, because you can’t prove something is impossible without starting from assumptions. So you should expose and question those assumptions (such as locality and causality). Don’t accept any axioms that are intrinsic to quantum theory, because your aim is to go beyond quantum theory.

Neo: Some people, particularly in quantum computing, suggest that when a variable is measured in a situation in which quantum mechanics predicts the result probabilistically, the universe actually splits into many copies, with each of the possible values realised in one copy.xviii We exist in the copy in which the results were as we observed them, but in other universes copies of us saw other results.

Nino: We couldn’t observe the other universes, so this is metaphysics, and more fantastic than Jules Verne! What if the spectrum of possible outcomes includes a continuum of eigenvalues? Furthermore a measurement involves an interaction between the measuring apparatus and the system, so the apparatus and system could be considered as a joint system quantum-mechanically. There would be splitting into many worlds if you treat the system as quantum and the apparatus as classical, but no splitting if you treat them jointly as quantum. Nothing privileges a measuring apparatus, so physicists are free to analyse the situation in these two differing ways – but then they disagree about whether splitting has taken place. That’s inconsistent.

Neo: The two descriptions must be reconciled. As I said, a system left to itself evolves according to the Hamiltonian of the system. When one of its variables is measured, it undergoes an interaction with an apparatus that makes the measurement. The system finishes in an eigenstate of the operator corresponding to the variable measured, while the apparatus flags the corresponding eigenvalue. This scenario has to be reconciled with a joint quantum description of the system and apparatus, evolving according to their joint Hamiltonian including an interaction term. Reconciliation is needed in order to make contact with scalar values and prevent a regress problem, since the apparatus interacts quantum-mechanically with its immediate surroundings, and so on. Some people propose that the regress is truncated at the consciousness of the observer.

Nino: I thought vitalism was discredited once the soul was found to be massless, upon weighing dying creatures! The proposal you mention actually makes the regress problem worse, because if the result of a measurement is communicated to the experimenter via intermediaries who are conscious – who are aware that they pass on the result – then does it count only when it reaches the consciousness of the experimenter (an instant of time that is anyway problematic to define)? If so, why?

Neo: That’s a regress problem on the classical side of the chain, whereas I was talking about a regress problem on the quantum side. This suggests that the regress is terminated where the system is declared to have a classical description.xix I fully share your scepticism about the role of consciousness and free will. Human subjects have tried to mentally influence the outcomes of quantum measurements and it is not accepted that they can alter the distribution from the quantum prediction. Some people even propose that consciousness exists because matter itself is conscious and the brain is so complex that this property is manifest. But they never clarify what it means to say that atoms may have consciousness, even of a primitive sort.

Nino: Please explain how our regress terminates where we declare something classical.

Neo: For any measured eigenvalue of the system there are generally many degrees of freedom in the Hamiltonian of the apparatus, so that the density of states of the apparatus is high. (This is true even if the quantum states are physically large, as in low temperature quantum phenomena such as superconductivity.) Consider the apparatus variable that flags the result of the measurement. In the sum over states giving the expectation value of this variable, cross terms between quantum states of the apparatus corresponding to different eigenvalues of the system are very numerous. These cross terms are not generally correlated in amplitude or phase, so that they average out in the expectation value in accordance with the law of large numbers.xx Even if that is not the case they are usually washed out by interactions with the environment, because you cannot in practice isolate a system perfectly. This is called decoherence,xxi and nonlocality and those striking quantum-computer effects can only be seen when you prevent it.
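Neo's averaging argument can be seen in a toy model (a construction added here for illustration): N cross terms with uncorrelated random phases average out like 1/sqrt(N), so for the enormous N of a macroscopic apparatus the interference terms are negligible.

```python
import numpy as np

# Average of N unit-magnitude cross terms with uncorrelated random phases.
rng = np.random.default_rng(1)

avgs = []
for N in (100, 10_000, 1_000_000):
    phases = rng.uniform(0, 2 * np.pi, N)
    avgs.append(abs(np.exp(1j * phases).sum() / N))
    # The magnitude of the average shrinks roughly like 1/sqrt(N),
    # in accordance with the law of large numbers.

print(avgs)
```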

Nino: Remarkable! But you still have only statistical prediction of which eigenvalue is observed.

Neo: Your deterministic viewpoint has been disparaged by some as an outmoded, clockwork view of the universe.

Nino: Just because I insist on asking where the next particle will go in a Stern-Gerlach apparatus? Determinism is a metaphysical assumption; no more or less. It inspires progress in physics, which any physicist should support. Let me return to nonlocality and acausality (which is a kind of directional nonlocality with respect to time, rather than space). They imply that the physical universe is an indivisible whole at the fundamental level of the hidden variables. That is monist, but is distinct from religious monism because genuine structure exists in the hidden – or rather shy – variables.

Neo: Certainly space and time seem to be irrelevant to the hidden interactions between particles that follow from Bell tests. As I said, we have a successful quantum description of electromagnetic interactions and have combined it with the forces that hold the atomic nucleus together. In this description we regard the electromagnetic field itself as an operator-valued variable, according to the prescription of quantum theory. The next step would be to incorporate gravity. That would not be Newtonian gravity, which cannot be right because, unlike Maxwell’s equations, it only looks the same to relatively moving observers who are related by the Galilean transform of space and time – itself only a low-speed approximation to the correct Lorentz transform. Einstein found a theory of gravity that transforms correctly, known as general relativity, and to which Newton’s theory is an approximation. Einstein’s view was that space and time were related to gravity differently than to the other forces, but a theory that is almost equivalent to his (predicting identically in all tests to date) has since emerged that is similar to electromagnetism – a gauge theory in which the field is coupled naturally to matter which is described quantum-mechanically.xxii Unlike electromagnetism, however, the gravitational field itself has not yet been successfully quantised, hindering the marrying of it to other forces so as to unify them all. Of course we demand a theory that takes account of both quantum effects and relativistic gravity, for any theory that neglects quantum effects becomes increasingly inaccurate where these are significant – typically on the small scale inside atoms – while any theory that neglects relativistic gravitational effects becomes increasingly inaccurate where they are significant – typically on large scales where matter is gathered into dense massive astronomical bodies.
Not even light can escape from some of these bodies – and, because the speed of light is a universal speed limit, nor can anything else. Quantum and gravitational effects are both large if you look at the universe far enough back in time, because we have learned that the universe was once very small and very dense. So a complete theory is indispensable for cosmologists who seek to study origins. The preferred program for quantum gravity today is known as string theory. But it has a deeply complicated structure and is infeasible to test experimentally, rendering progress slow.

Nino: But it’s still not a complete theory if it’s a quantum theory. Please say more about that very small dense stage of the universe which presumably expanded to give what we see today.

Neo: We believe the early part of the expansion underwent a great boost, known as inflation, which explains how the universe is unexpectedly smooth on the largest scale today and is also not dominated by gravity. Everything in the observed universe was, in effect, enormously diluted. Issues of causality also arise. But the mechanism for inflation is conjectural, and inflation raises other questions.

Nino: Unexpected large-scale smoothness sounds to me like a manifestation of nonlocality. Furthermore the hidden variables are acausal. Perhaps you cannot do without them at such extreme densities and temperatures. Then you wouldn’t need to invoke inflation.

Neo: We believe that inflation took place after the ‘Planck’ era, in which a full quantum theory of gravity is indispensable for accuracy. In that case our present understanding is adequate to describe the inflationary epoch.

Nino: You are considering the entire universe, yet you cannot predict which detector goes off next when consecutive particles having identical quantum description are fired through a Stern-Gerlach apparatus. Perhaps you should walk before you run. Then your problems in unifying the fundamental forces and applying the resulting theory to the entire universe might vanish.

Neo: That’s ironic – the older generation exhorting the younger to revolution! To finish, what would you say to my generation of physicists?

Nino: It is magnificent that you can predict properties of the electron to nine decimal places, but that makes it more embarrassing that you cannot tell something as basic as which way a silver atom will pass through an inhomogeneous magnetic field, according to its outermost electron. That incapability should be an itch inside your oyster shell. Seek a theory which predicts the outcome when systems having identical quantum specification behave differently. Regard all strange outworkings of quantum mechanics as information about the hidden variables. Purported no-hidden-variables theorems that are consistent with quantum mechanics must contain extra assumptions or axioms, so put such theorems to work for you by ensuring that your research violates those assumptions. Ponder how to reconcile the success of much prediction upon treating systems as isolated with the nonlocality and acausality visible in Bell tests. Don’t let anything put you off because, barring a lucky experimental anomaly, only seekers find. By doing that you become part of a great project.

Anthony Garrett has a PhD in physics (Cambridge University, 1984) and has held postdoctoral research contracts in the physics departments of Cambridge, Sydney and Glasgow Universities. He is Managing Editor of Scitext Cambridge (www.scitext.com), an editing service for scientific documents.

i Gerlach, W. & Stern, O., “Das magnetische Moment des Silberatoms”, Zeitschrift für Physik 9, 353-355 (1922).

ii Bell, J.S., “On the Einstein Podolsky Rosen paradox”, Physics 1, 195-200 (1964).

iii Garrett, A.J.M., “Bell’s theorem and Bayes’ theorem”, Foundations of Physics 20, 1475-1512 (1990).

iv The most rigorous test of Bell’s theorem to date is: Giustina, M., Mech, A., Ramelow, S., Wittmann, B., Kofler, J., Beyer, J., Lita, A., Calkins, B., Gerrits, T., Nam, S.-W., Ursin, R. & Zeilinger, A., “Bell violation using entangled photons without the fair-sampling assumption”, Nature 497, 227-230 (2013). For a test of the 3-particle case, see: Bouwmeester, D., Pan, J.-W., Daniell, M., Weinfurter, H. & Zeilinger, A., “Observation of three-photon Greenberger-Horne-Zeilinger entanglement”, Physical Review Letters 82, 1345-1349 (1999).

v Bussey, P.J., “Communication and non-communication in Einstein-Rosen experiments”, Physics Letters A123, 1-3 (1987).

vi Mermin, N.D., “Quantum mysteries refined”, American Journal of Physics 62, 880-887 (1994). This is a very clear tutorial recasting of: Hardy, L., “Nonlocality for two particles without inequalities for almost all entangled states”, Physical Review Letters 71, 1665-1668 (1993).

vii Mermin, N.D., “Quantum mysteries revisited”, American Journal of Physics 58, 731-734 (1990). This is a tutorial recasting of the ‘GHZ’ analysis: Greenberger, D.M., Horne, M.A. & Zeilinger, A., 1989, “Going beyond Bell’s theorem”, in Bell’s Theorem, Quantum Theory and Conceptions of the Universe, ed. M. Kafatos (Kluwer Academic, Dordrecht, Netherlands), p.69-72.

viii Einstein, A., Podolsky, B. & Rosen, N., “Can quantum-mechanical description of physical reality be considered complete?”, Physical Review 47, 777-780 (1935).

ix Einstein, A., Letter to Max Born, 4th December 1926. English translation in: The Born-Einstein Letters 1916-1955 (Macmillan Press, Basingstoke, UK), 2005, p.88.

x Einstein, A., Letter to Max Born, 3rd March 1947. Ibid., p.155.

xi Wheeler, J.A., 1978, “The ‘past’ and the ‘delayed-choice’ double-slit experiment”, in Mathematical Foundations of Quantum Theory, ed. A.R. Marlow (Academic Press, New York, USA), p.9-48. Experimental verification: Jacques, V., Wu, E., Grosshans, F., Treussart, F., Grangier, P., Aspect, A. & Roch, J.-F., “Experimental Realization of Wheeler’s Delayed-Choice Gedanken Experiment”, Science 315, 966-968 (2007).

xii Weinberg, S., 2005, The Quantum Theory of Fields, vols. 1-3 (Cambridge University Press, UK).
Hanneke, D., Fogwell, S. & Gabrielse, G., “New measurement of the electron magnetic moment and the fine-structure constant”, Physical Review Letters 100, 120801 (2008); 4pp.

xiii Dirac, P.A.M., 1958, The Principles of Quantum Mechanics (4th ed., Oxford University Press, UK).

xiv Mermin, N.D., “Simple unified form for the major no-hidden-variables theorems”, Physical Review Letters 65, 3373-3376 (1990); Mermin, N.D., “Hidden variables and the two theorems of John Bell”, Reviews of Modern Physics 65, 803-815 (1993). This is a simpler version of the ‘Kochen-Specker’ analysis:

xv Kochen, S. & Specker, E.P., “The problem of hidden variables in quantum mechanics”, Journal of Mathematics and Mechanics 17, 59-87 (1967).

xvi Elitzur, A.C. & Vaidman, L., “Quantum mechanical interaction-free measurements”, Foundations of Physics 23, 987-997 (1993).

xvii Mermin, N.D., 2007, Quantum Computer Science (Cambridge University Press, UK).

xviii DeWitt, B.S. & Graham, N. (eds.), 1973, The Many-Worlds Interpretation of Quantum Mechanics (Princeton University Press, New Jersey, USA). The idea is due to Hugh Everett III, whose work is reproduced in this book.

xix This immediately resolves the well-known Schrödinger’s cat paradox.

xx Van Kampen, N.G., “Ten theorems about quantum mechanical measurements”, Physica A153, 97-113 (1988).

xxi Zurek, W.H., “Decoherence and the transition from quantum to classical”, Physics Today 44, 36-44 (1991).

xxii Lasenby, A., Doran, C. & Gull, S., “Gravity, gauge theories and geometric algebra”, Philosophical Transactions of the Royal Society A356, 487-582 (1998). This paper derives and studies gravity as a gauge theory using the mathematical language of Clifford algebra, which is the extension of complex analysis to higher dimensions than 2. Just as complex analysis is more efficient than vector analysis in 2 dimensions, Clifford algebra is superior to conventional vector/tensor analysis in higher dimensions. (Quaternions are the 3-dimensional version, a generalisation that Nino would doubtless appreciate.) Nobody uses the Roman numeral system any more for long division! This theory of gravity involves two fields that obey first-order differential equations with respect to space and time, in contrast to general relativity in which the metric tensor obeys second-order equations. These gauge fields derive from translational and rotational invariance and can be expressed with reference to a flat background spacetime (i.e., whenever coordinates are needed they can be Cartesian or some convenient transformation thereof). Presumably it is these two gauge fields, rather than the metric tensor, that should satisfy quantum (non-)commutation relations in a quantum theory of gravity.

35 Responses to “(Guest Post) – Hidden Variables: Just a Little Shy?”

  1. […] Hidden Variables: Just a Little Shy? […]

  2. “If you believe there is comprehensible and interesting order in the material universe then you will be more motivated to study it than others who suppose that differentiation is illusion and that all is one, i.e. the monist view held by some other cultures. So, in puzzling why people aren’t looking for those not-so-hidden variables, let me ask: Did the view that nature was underpinned by a divine creator get weaker where quantum theory emerged, in Europe, in the era before the Copenhagen view?”

    Thank you very much for this clearly enunciated summary on some of the mysteries of the quantum realm; I found it somewhat helpful. However, you clearly misunderstand the so-called monist view held by some other cultures and seem more than a wee bit disparaging towards it. Perhaps I could help clarify the subject.

    The independent postulate set of Madhyamika Buddhism:

    1) There is nothing in existence with intrinsic existence. This is called sunyata or emptiness and is basically equivalent to strong emergence in science. They don’t say that things are illusory, rather, things are emergent, hence, they have no “self” nature; they do not exist unto themselves.

    That’s really it! And this is really backed up by Quantum Physics. I think the problem that most people have with the Quantum is that they insist on matter being constructed from “solid” and “concrete” and “elementary” particles and if Quantum Physics demonstrates anything it is that there is no such thing! The “Democritean atoms” of existence are patterns, and ephemeral patterns to boot . . .

    And Buddhists do not accept the existence of a divine creator; this runs counter to their postulate set! A divine, hence absolute, creator would, by definition, exist unto itself! Buddhist cosmology is explored from the perspective of enlightened beings, hence there are a number of subtleties involved, but it in no way runs counter to modern day science; in fact, it conforms rather well to ideas expressed by John Wheeler and the formalism of Geoffrey Chew amongst others.

    To get a better introduction to sunyata, I would recommend the commentary to Shantideva’s “Guide to the Bodhisattva Way of Life” by His Holiness the Dalai Lama, “Transcendent Wisdom” (http://www.shambhala.com/transcendent-wisdom.html). This little book was translated by B. Alan Wallace, a Buddhist monk with a BA in physics and philosophy of science, so the notes can be rather illuminating. I would also highly recommend the excellent book by high energy particle physicist, Fritjof Capra, called “The Tao of Physics” (http://www.shambhala.com/the-tao-of-physics-3110.html) for elucidation on the parallels between physics and the “monist view.” They are really quite complementary as Dr. Capra demonstrates; you have to consider, both approaches study the same reality and both, though different, are equally legitimate . . .

    With regards,
    Wes Hansen

    • Anton Garrett Says:

      Hi Wes

      I am open to correction about buddhism (although I’ve read a couple of nontrivial books about it), but am mystified that you think I reckon buddhists accept the existence of a divine creator. I am aware that they don’t and if my essay suggests it then please tell me where as I must correct it.

      I disagree, however, that buddhism accords well with quantum theory. I take the view that quantum theory is part of physics which emerged (noncoincidentally) out of Western civilisation, not out of eastern. I regard the Tao of Physics as nonsense.

      • “If you believe there is comprehensible and interesting order in the material universe then you will be more motivated to study it than others who suppose that differentiation is illusion and that all is one, i.e. the monist view held by some other cultures. So, in puzzling why people aren’t looking for those not-so-hidden variables, let me ask: Did the view that nature was underpinned by a divine creator get weaker where quantum theory emerged, in Europe, in the era before the Copenhagen view?”

        It was in the piece I quoted in the previous comment and re-quote here; it’s just an implication in the way you connect the monist view with the divine creator. Of course the Buddhists and the Samkhya are the only two non-theistic “religions” I am aware of so . . .

        I don’t find the Tao of Physics to be ridiculous; consider the following:

        These two excellent experiments:

        Click to access ABIOMAC_5%20-%20pesquisa%20-%20evidencias%20eletrofisiologicas%20da%20intuicao%20intuition-part1.pdf

        Click to access intuition-part2.pdf

        which are both part of this meta-analysis:

        http://journal.frontiersin.org/article/10.3389/fpsyg.2012.00390/full

        demonstrate that the human heart becomes aware of an emotionally stimulating event almost 5 seconds before that event manifests in conscious awareness. Now I know there’s a lag between receptors receiving signals and an integrated construct forming in conscious awareness, but this lag is milliseconds. So what this suggests to me is that the heart becomes aware of an event before that event even enters the light-cone, which clearly necessitates superluminal information transfer.

        Is there anything within the current theoretical framework which allows such a thing? In fact there is: de Broglie’s pilot wave construct! De Broglie demonstrated that the pilot wave travels at the same velocity as the “particle”, but what most physicists sweep under the rug is the fact that the wave packet forming the pilot wave must be superluminal! The exact relationship is v_wave = (c^2)/v_particle, where v_wave is the velocity of the wave packet and v_particle the velocity of the pilot-wave/“particle” pair. Stanford physicist William Tiller provides a straightforward analysis of the relevant wave mechanics here: http://www.tillerinstitute.com/pdf/White%20Paper%20V.pdf
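        [For concreteness, here is the textbook de Broglie relation being invoked. Note that in standard treatments it is the phase velocity of the matter wave that is superluminal, while the wave packet – the group velocity – moves at the particle speed itself. A minimal numerical check of the relation:]

```python
# Textbook de Broglie matter-wave relations (a standard result, independent
# of any pilot-wave claim): a particle of speed v < c has matter-wave phase
# velocity v_p = c**2 / v, which exceeds c; the wave packet (group velocity)
# travels at the particle speed v itself.
c = 299_792_458.0                  # speed of light in vacuum, m/s

def phase_velocity(v):
    """Phase velocity c**2 / v of the matter wave of a particle at speed v."""
    return c ** 2 / v

v = 0.5 * c                        # a particle at half the speed of light
v_p = phase_velocity(v)
assert abs(v_p - 2.0 * c) < 1.0   # v_p * v = c**2, so v_p > c whenever v < c
```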

        So now take two identical and synchronized atomic clocks and leave one stationary while you accelerate the other to near the speed of light. What happens? The one travelling near the speed of light slows down relative to the stationary one. Why is this so? Does it not seem plausible that the answer lies in that velocity relationship? As the particle velocity approaches c, the wave-packet velocity also approaches c, but from above – it is slowing down! Is it not possible that the atomic clock is somehow correlated with the wave-packet velocity? This would certainly be in agreement with the “Zitter Model” of David Hestenes: http://fqxi.org/community/forum/topic/339

        If you read Dr. Hestenes’ short FQXi essay pay special attention to the speculative remarks he makes at the very end; these remarks are almost identical to the speculation made by the Russian astrophysicist Kozyrev at the conclusion of his torsion experiments as disclosed in this paper by William Tiller: http://www.tillerinstitute.com/pdf/White%20Paper%20IX.pdf

        Of course Dr. Tiller provides a rather illuminating counter-argument but . . . Dr. Tiller, in addition to being a highly regarded Stanford physicist, is also a Qi Gong master (he and his wife have been practicing since the 50’s). Tiller and one of his former grad students, Walter Dibble, have conducted a number of very interesting experiments, some peer replicated, demonstrating that highly developed spiritual persons can change the state of matter simply by holding a conscious intent to do so: http://tillerinstitute.com/pdf/White%20Paper%20I.pdf

        Now, if you consider that the Buddhist Kalachakra (world wheel), the Hindu Yugas (world cycles), the Mayan Long Count, and etc. all track a galactic limit cycle where each phase transition is accompanied by a corresponding limit cycle in the human community, i.e. the flood hero, then the current hero is the aether hero and that’s what Hestenes is describing with his electron clock and what Tiller is describing with his R-space!

        Now, I can’t help but wonder why Philip Helbig has left numerous comments on your post; he is one of Sabine Hossenfelder’s trolls and is well aware that I recently left the following two comments on a recent post on Hossenfelder’s backreaction blog (Sabine is also aware of much if not all of the afore-mentioned information; http://backreaction.blogspot.com/2015/07/the-number-crunchers-how-we-learned-to.html):

        1) I thought this issue was settled long ago!?! Numerical methods have really been indispensable to science since the 70’s with the parallel advent of more efficient computation and dynamical chaos. How did Feigenbaum discover Universality in phase transitions? And in my opinion it was Feigenbaum’s discovery which put the renormalization group on a solid foundation, at least philosophically. And even when Lanford, a pure mathematician, developed a proof, which the community deemed rigorous, the proof depended to a large degree on numerical methods.
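        [Feigenbaum’s discovery is itself a textbook case of numerical methods developing intuition: his universal constant δ ≈ 4.669 can be recovered in a few lines by locating the superstable period-doubling parameters of the logistic map. A sketch – the function and variable names are mine:]

```python
import math

def superstable_r(period, r_guess, newton_steps=40):
    """Newton-solve f_r^period(1/2) = 1/2 in r, for the logistic map
    f_r(x) = r*x*(1-x); the solution is a superstable parameter value."""
    r = r_guess
    for _ in range(newton_steps):
        x, dx = 0.5, 0.0                       # dx tracks d x / d r
        for _ in range(period):
            dx = x * (1 - x) + r * (1 - 2 * x) * dx
            x = r * x * (1 - x)
        step = (x - 0.5) / dx                  # Newton step on x(r) - 1/2 = 0
        r -= step
        if abs(step) < 1e-14:
            break
    return r

rs = [2.0, 1 + math.sqrt(5)]                   # superstable period-1 and period-2
for n in range(2, 7):                          # periods 4, 8, 16, 32, 64
    guess = rs[-1] + (rs[-1] - rs[-2]) / 4.67  # extrapolate next parameter
    rs.append(superstable_r(2 ** n, guess))

# Ratios of successive parameter gaps converge to Feigenbaum's delta
deltas = [(rs[i] - rs[i - 1]) / (rs[i + 1] - rs[i]) for i in range(1, len(rs) - 1)]
print(deltas[-1])                              # approaches 4.669...
```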

        I think people who express disdain for numerical methods are just inherently dishonest, with themselves and others. All knowledge is provisional in that it rests on a foundation of induction and numerical methods bring this to the forefront. And that, to me, is a good thing; it dispels dogma! As the early chaos pioneers liked to say, numerical methods develop intuition. Dogmatists quite often see causation where only correlation exists anyway!

        This is really what made me wonder if perhaps James Gates hadn’t discovered the reason why Universality appears with his adinkras. If you’re not familiar with Gates, he works with SUSY and his adinkras are Feynman diagram analogs which represent oftentimes complicated systems of super-differential equations. To evolve the system you fold the adinkra but this folding process can be quite complex and if you’re not careful you can lose SUSY. So what Gates did was assign each node in the adinkra a binary word and he discovered, quite by “accident,” that the folding process which maintains SUSY conforms to one of Hamming’s error-correction codes! So perhaps one sees Universality in phase transitions due to some error-correction process?

        Click to access 0408004v1.pdf

        Click to access PWJun10gates.pdf

        The last link is to an article which appeared in Physics World, 2014.
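        [For readers unfamiliar with Hamming’s error-correction codes, a minimal sketch of the simplest one, Hamming(7,4), may help: three parity bits protect four data bits, and the parity-check ‘syndrome’ gives the 1-based position of any single flipped bit. Function names here are mine; the doubly-even codes in Gates’s adinkras are larger relatives of this construction.]

```python
# Hamming(7,4): encode 4 data bits with 3 parity bits placed at positions
# 1, 2 and 4; any single bit flip is located exactly by the syndrome.
def encode(d):
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4              # parity over positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4              # parity over positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4              # parity over positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def correct(c):
    c = list(c)
    s = 0
    # syndrome bit k is the parity of all positions whose index has bit k set
    for k in (0, 1, 2):
        parity = 0
        for pos in range(1, 8):
            if pos & (1 << k):
                parity ^= c[pos - 1]
        s |= parity << k
    if s:                          # nonzero syndrome = position of flipped bit
        c[s - 1] ^= 1
    return c

code = encode([1, 0, 1, 1])
corrupted = list(code)
corrupted[4] ^= 1                  # flip one bit "in transit"
assert correct(corrupted) == code  # single error located and repaired
```

        [The same syndrome idea scales to longer block codes.]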

        2) You know it was this whole train of thought which led me to the idea of bisimulation on non-well-founded sets – which I have unsuccessfully tried to convey to you and, through you, to Renate Loll. So, I’ll express it here and then drop it forever!

        As so eloquently expressed in Smolin’s Three Roads to Quantum Gravity, the high degree of fine-tuning we witness defies probabilities to the contrary; this, to me, strongly suggests retro-causation or, in other words, a distinct final condition. So why couldn’t you use the adinkras of James Gates to develop a bisimilar model? I don’t see why you couldn’t because essentially adinkras are analogs to graphs and the folding process establishes relations between nodes:

        Click to access bisimulation.pdf

        So, you develop one adinkra which evolves from a distinct initial condition “forward” in time and another bisimilar adinkra which evolves from a distinct final condition “backward” in time. These adinkras are not symmetrical; rather, they simulate one another, hence, bisimulation. At forward time step t = a the adinkra folding “forward” in time may have gone through y folds while the adinkra folding “backward” in time may have gone through xy folds, but both processes result in the same system state at forward time step t = a. They simulate one another. So there’s an equivalence relation between the nodes from each adinkra present at t = a. Would this not put an interesting constraint on the initial and final conditions? And what if what we think of as initial and final conditions are in actuality phase transitions? Could such a model perchance be illuminating?

        Some scientists say non-well-founded sets are incompatible with quantum theory but Ben Goertzel, an expert on non-well-founded sets, dispenses with that myth in chapter seven of his book Chaotic Logic.

        In case you missed the subtleties in my last comment, I’m not suggesting that the error-correction takes place in the world we perceive, what you call Minkowski space and Will Tiller calls D-space, rather, the error-correction occurs in Will Tiller’s R-space. It occurs in the “electron-clock” described by the Zitter Model of David Hestenes and it’s super-luminal. This is why the world we perceive appears coherent and consistent.

        Now, if you consider that Will Tiller’s work gets consistently ignored and if you consider that the Finnish dissident physicist, Matti Pitkanen, has a consistent theory of Quantum Gravity which also gets consistently ignored, one begins to wonder. Matti, by the way, is considered a dissident because he long ago stopped thinking of Topological GeoMetroDynamics as a physical theory and instead thinks of it as a theory of quantum consciousness! There’s nothing wrong with his theory; it’s certainly a subset of the Langlands Program, and hadronic string theory and General Relativity appear as special cases:

        https://play.google.com/store/books/details?id=A5-IW1M3YrEC&source=productsearch&utm_source=HA_Desktop_US&utm_medium=SEM&utm_campaign=PLA&pcampaignid=MKTAD0930BO1&gl=US

        One thing I like about numerical methods, unlike humans they tend not to try to obstruct the truth. If you would like more information on the limit cycle in the human community which corresponds to that on the galactic scale you can send me an email:

        PonderSeekDiscover@gmail.com

        With regards,

        Wes Hansen

      • Anton Garrett Says:

        Wes,

        I know Dave Hestenes personally and quite well and have very high respect for his work, which I have followed closely. His Zitterbewegung conjecture might be a heroic failure, though. I was initially interested in adinkras until I saw that they don’t make a Clifford algebra.

        I’ve made some changes to my essay to clarify that paragraph about religious belief; thank you for the feedback.

        Pilot wave theory is designed to reproduce the predictions of quantum theory, so it is ultimately an interpretation rather than a physical theory. I don’t see that you can use it to get anything different. Regarding superluminality, I prefer to opt for acausality rather than superluminality where you are forced to a choice. That is because acausality is demonstrated anyway (see my essay), and the Lorentz transformation goes haywire at velocities above c.

        I don’t believe that heart tissue can be acausal and/or a superluminal receptor and I dispute that “highly developed spiritual persons” can change the state of matter by willing it. Regarding the first of these, meta-analysis is crap and I’ve published a paper in a probability conference proceedings explaining why; don’t trust its conclusions. Regarding the latter, I spent a considerable part of the 1980s looking at such claims and found that they always run into the sand. Any “highly developed spiritual person” who could do that consistently would be, willingly or not, world famous. Yet none is.

  3. Anton Garrett Says:

    I consider that I skirted it rather than giving short shrift to it. Are you reading into what I wrote something that isn’t there? Please quote the relevant bit and I’ll gladly respond.

  4. Adrian Burd Says:

    Being playfully provocative here (and with tongue firmly in cheek) and taking a leaf from the playbook of the merchants of doubt, we would now have to be harshly skeptical of EVERY prediction made by quantum theory (it’s just a theory after all). If it cannot even make such an elementary prediction as the one described above, then every “prediction” made by so-called quantum scientists should be disregarded, even those that, by some fluke, turn out to agree with observation. Obviously the science of the quantum is not well understood, and there are large gaps in what meagre knowledge there is, so we must therefore cast doubt on every prediction made by quantum theory.

  5. Allen Schiano Says:

    Thank you for that thoughtful discussion. The general distaste for the ‘shy variables’ interpretation follows from the very strange physical world one would have to envision to create a fully non-local and acausal physical model of the ‘clandestine conversations’ that would be going on behind our backs, as it were. But isn’t that just a problem for the non-gravitational forces? Doesn’t the non-merging of gravity with the standard model hint that there is a radical problem with quantum mechanics?

    • Anton Garrett Says:

      I think that if you use a gauge theory of gravity rather than privileging it via the GR formulation then incorporating it into the standard model is no different in principle from incorporating the strong nuclear force – although the technical details are more formidable and have not been mastered yet. But I agree with your last sentence, that there is a radical problem with quantum mechanics, and I reckon I know what: it can’t predict properly. It can’t tell me where the next particle will go in my coupled Stern-Gerlach apparatuses. Not good enough!

      One or two people opt for renunciation of the relativity principle rather than the quantum principle. Not me; if the Lorentz transformation turned out to break down in certain (local) situations then it would be the most profound shock in physics that I could imagine – whereas if hidden variables are found then I’d open the champagne.

      • Allen Schiano Says:

        With non-local, acausal ‘shy variables’ are we just talking about very minimally interacting (with ‘localizable’ stuff) tachyonic material? I would even postulate that the Planck constant is some sort of coupling constant between the two realms. Why is a nonlocal/acausal QM okay but its child – tachyons – not?

        Also your statement about cosmic inflation is another example of people suggesting something drastic – a nearly untestable ‘creation’ of energy from a false vacuum – to account for a phenomenon (cosmic flatness) that could be addressed by saying that our quantum-mechanics-based calculation of the cosmological constant is off by a factor of 10**120. Saying QM is incomplete and there are nonlocal aspects to the Universe seems much more pleasing than shrugging off a 10**120. Or a ‘we don’t know where the particle went’.

      • Nice post, Anton.

        But I do wonder: why all this fuss about demanding quantum mechanics makes predictions of classical variables? Wouldn’t it be cleaner to look for an interpretation of classical variables in terms of quantum mechanics?

        Certainly I agree we should keep looking. But I’m rather convinced at this point that our insistence on classical ideas like ‘position’ – or the trajectory of a particle as you put it – being somehow fundamental is ultimately wrong. Those hidden variables would have to be really weird, as you point out.

      • Anton Garrett Says:

        Allen,

        I don’t go with tachyons because they send the Lorentz transformation haywire. If you can get round that then I’d certainly consider them as an alternative explanation to acausality in the delayed-choice and Bell experiments.

        Inflation is not my field, but it seems to me to be like democracy – lousy until you consider the alternatives. Perhaps hidden variables will shed light on it; let’s find them first, before we apply them in difficult problems.

      • Anton Garrett Says:

        Brendan,

        I tried and failed to respond to the question in your first paragraph and decided that I don’t understand it. Do say more!

        Space and time appear to mean nothing to the hidden variables and that has to be some kind of clue. They don’t seem to mean much to photons, either; a pure photon is a wave of indefinite extent, while time is infinitely slowed for a photon as you let v approach c in the Lorentz transformation. Furthermore, light is electromagnetic radiation and it is only by convention that we select the retarded rather than the advanced solutions of Maxwell’s equations. I’d focus on light, but I didn’t want to put my own guesses in that essay.

        I’ve read Belinfante’s fairly old book on hidden variable theories and find it useful in closing off dead ends; some HV theories are local and/or causal and are therefore doomed; others are incapable of predicting differently from QM and are therefore interpretations in disguise.

      • Anton,

        What I mean is that much of the discussion surrounds the inability to predict classical observables, and the interpretation inevitably turns to interpreting the framework of quantum theories in terms of those classical observables. I genuinely think this is the wrong way about things, and we’d be better served trying (however difficult) to understand how classical observables arise in terms of what appears to be the much more fundamental quantum description of the world. Many of the objections (e.g. incompleteness) then seem moot.

        Is that any clearer?

      • Anton Garrett Says:

        Brendan,

        I’m not sure if you mean how does the classical limit come about, but – after 20 years of searching – I was satisfied by Nico van Kampen’s explanation, and I reproduce it – perhaps in too compressed a form – in the main post above.

  6. Allen Schiano Says:

    There were a whole series of preprints that came out just after the ‘superluminal neutrino’ debacle of the OPERA project that were trying to augment the Lorentz transformation in some way to deal with these now non-existent superluminal neutrinos. The most interesting ones came from theorists who have been grappling with how to change Lorentz invariance in the Planck regime, where everyone assumes it must change to allow for quantization. The whole Standard Model Extension program – albeit with nothing but upper bounds – hints that it may be possible to extend our understanding of Lorentz invariance.

    I also agree with your statement that space and time seem to mean vastly different things to the hidden variables. Or that we haven’t measured a scale where they do, i.e., the ‘speed’ of the communications is so fast that it appears instantaneous to us. Would the Stern-Gerlach apparatus act the same if its parts were separated by light years?

    Lastly, I think another clue that will help in this mystery is the non-zero nature of the cosmological constant and/or the presence of ‘dark energy’. This observation flies in the face of QM since QFT calculations of its expected value are again off by a factor of 10**120! The ‘negative pressure’ nature of this ‘energy’ is nothing like any other ‘denizen’ of our realm. That seems to me to have been brushed off. Dark matter can be partially explained with a whole lot of objects and particles that we know of or could extrapolate, but it takes a lot of quantum field theory hand-waving using a ‘field’ that we have no experimental data upon to conjure up this energy. And why is it just the right size to only now, cosmologically speaking, start making major effects on spacetime?

    There’s a lot more going on here that QM, or QFT as its offspring, cannot explain. Coming up with a theory of ‘shy variables’ seems like something physicists should be all over but they aren’t.

    • Anton Garrett Says:

      I couldn’t agree more with your last sentence! I think it should be elementary things like Stern-Gerlach that motivate the search for HVs but if it proves to be cosmology then I’ll still be happy. I’ll bet the Lorentz transformation makes it through unscathed though. Quantisation must allow for that, not vice-versa. (Dirac put a lot of thought into canonical quantisation, but it is superseded by the from-symmetry approach in Weinberg’s recent book on basic QM, as Weinberg says in it.)

    • Allen V. Schiano Says:

      Thanks for the pointer to this paper – do you know if it was ever published? The ’10**120 stuff’ is basically a statement that QFT doesn’t really explain the cosmological constant (or ‘dark energy’, a term they disdain) observation, and in effect says QFT and GR are in mortal combat to explain it. The fact that the constant exists and has the value it does is the ‘mystery’. After all, the fact that Einstein ‘chose’ a value to negate any expansion/contraction shows that, at least mathematically, we are free to choose a value for it. Einstein then set it to zero as a ‘mistake’. Then QFT tried to calculate it and came up with a ridiculously large value. Then observations showed that its value is neither zero nor the QFT estimate. The ’10**120′ just says that QM/QFT is not compatible with GR, despite a lot of work by many people. The cosmological constant/dark energy observation is an ‘in your face’ reminder of that.

  7. Andrew Wells Says:

    Sweet post, thanks for this.

  8. Neo: We believe the early part of the expansion underwent a great boost, known as inflation, which explains how the universe is unexpectedly smooth on the largest scale today and is also not dominated by gravity. Everything in the observed universe was, in effect, enormously diluted. Issues of causality also arise. But the mechanism for inflation is conjectural, and inflation raises other questions.

    PostNeo: The nonlocal arguments, presented as a conversation between two time travellers, are what make this a great dialogue. I hope you don’t mind if I interrupt your conversation. The CMB is cold and isotropic, but have you considered that the CMB could be the thermal reservoir needed to create particles in a steady-state theory? Without the CMB we would have to violate the sacrosanct law of conservation of energy. When the CMB was discovered it was incorrectly interpreted as the echo of the big bang rather than an energy reservoir at maximum entropy. If the universe is older than 12 billion years then this allows enough time for the CMB to reach thermal equilibrium without the need for inflation. You might have some questions about the apparent age of galaxies, but remember that galaxies at the edge of our telescopes are travelling at relativistic velocities, and therefore the farthest galaxies, those moving fastest relative to us, appear younger to us, and we appear younger to them.

    Neo: What about dark matter, dark energy, singularities and the rest of modern physics? What about the redshift of the Lyman-alpha forest?

    PostNeo: Dark matter is iron from stellar nucleosynthesis and the decay of heavier elements. Iron is the most stable element and a large amount is created due to the iron limit in fusion. The amount of dark matter in a galaxy helps us to determine the galaxy’s age. Dark energy is an electrostatic repulsion, due to a charge imbalance from plasma in galactic magnetic fields, which repels galaxies from each other. With respect to singularities, you knew deep down that you could not divide by zero. The Lyman-alpha forest by itself cannot allow us to determine whether space itself is expanding or whether two galaxies have been accelerating away from each other due to electrostatic repulsion. It is almost embarrassing that the only known force with an infinite range that can push objects apart and is stronger than the gravitational force binding clusters of galaxies together was completely overlooked as an explanation for dark energy.

    • telescoper Says:

      Dark Matter can’t possibly be iron, for a number of reasons the most important of which is that it must be non-baryonic. Also dark energy can’t be electrostatic repulsion because that would require an enormous charge separation which would have other drastic observational consequences which aren’t observed.

  9. Anton Garrett Says:

    Lubos Motl has commented negatively on this post on his blog at

    http://motls.blogspot.co.uk/2015/08/zombie-nino-is-deluded-about-hidden.html

    He also deleted my reply there, which was as follows:

    *********************************

    Dear Lubos

    I have just read your critique; thank you. Please be assured that I scan all no-hidden-variable arguments, and whenever one has nontrivial content I extract its axioms, to compare hidden-variable theories against. Hopefully your comments will provide further constraints on hidden variables and thereby narrow the search. In that way your critique would have been a valuable step forward.

    One question: Please would you define what you mean by “random”?

    Yours sincerely
    Anton [Anthony] Garrett

    PS To those who malign socratic dialogues: the great Galileo used one to propose heliocentrism against his blinkered critics. That is good enough precedent for me.

    ********************************

    I support Lubos Motl’s freedom to do whatever he likes on his own blog, but the deletion of reasonably toned responses comes close to being an act of moral cowardice – especially when he lets stand comments that label Sean Carroll (who was open-minded enough to reblog my essay even though I argue against Carroll’s preferred “many-worlds” view) as a “regressive shit lord”. (In case that insult vanishes, it was made by “Charlie” 7 days ago at the time I write this.)

    Is Lubos Motl’s deletion of my response because he is unable to respond scientifically? Certainly he resorted to personal insult in his comments, which is usually used to cover up lack of content. If he wishes to respond here then he will find, courtesy of Peter Coles (Telescoper), the level playing field upon which scientific discourse should be conducted.

    • telescoper Says:

      It’s sadly typical of Lubos Motl to delete any contrary opinions, however politely expressed. Readers can make their own mind up about what to infer from that.

      • Anton Garrett Says:

        Lubos Motl emailed me stating that he hadn’t deleted me. I replied on his blog and my first paragraph to him was as follows:

        *************************************

        At last I have some time! First, thank you for your reasonableness about the misunderstanding. I wrote a comment for this thread and pressed the upload button and I then saw the comment plus a message saying something like “waiting to be okayed by blog owner”. I didn’t know whether that message and my upload was visible just to you and me or to the world, so I hit Refresh and still saw it, so I presumed the world. I then hit Refresh an hour or so later and my comment had vanished, so I presumed you had deleted it without comment. I have no understanding of the software that permitted this to happen without your intervention, but I gladly accept your email saying that you didn’t delete it, and I unreservedly withdraw my negative comments about that presumed action.

        *************************************

        For the rest of my reply, responding to Motl’s critique of my essay, please go to his blog.

  10. I recommend watching all of the following video. The part having to do with ‘exposed variable theories’ is at the 2:10 mark.

    ‘The pilot-wave dynamics of walking droplets’

    Due to conservation of momentum the downconverted pair are propagating with opposite angular momenta. Each of the pair can determine the position and momentum of the other based upon its own position and momentum.

    Entanglement is each of the pair being able to determine the state of the other.

    Pilot-wave theories are exposed variable theories.

    ‘Empty’ space has mass which is displaced by the particles of matter which exist in it and move through it.

    In a double slit experiment it is the mass which fills ’empty’ space that waves.

    The wave of wave-particle duality is a wave in the mass that fills ’empty’ space.

  11. […] very busy day lies in store so I only have time for a quick morning post. If you enjoyed the recent guest post on the “hidden variables” interpretation of Quantum Mechanics, then you will probably […]

  12. Chris Mannering Says:

    I strongly agree with the centrepiece points (if I read them right).

    I’m not personally familiar with it, but the hidden-variables interpretation(s) appear motivated by the restoration of realism and locality. On this view the action at a distance is not physical but something else, a motivation or a metaphor; in any case it is not literally physically true.

    This matter can surely be answered by experiment. Here’s an example.

    1. Entangle 256 pairs of particles, send 1 of each pair to the Moon, along with 256 nuclear bombs and a technology to observe the 256 particles sequentially, one every 5 seconds, in a specified sequence immediately on arrival.

    2. With the space caravan on its way, begin observation of the 256 particles earth-side in exactly the same sequence as instructed to the Moon side (on arrival).

    3. For each observation, record the resulting 256 outcomes as a binary number (i.e. write ‘0’ for ‘spin-up’ and ‘1’ for ‘spin-down’)

    4. The instructions given previously for the moon-side on arrival of the caravan are:
    – line up the 256 bombs 5 miles apart
    – begin observing the 256 particles in correct sequence one per 5 seconds
    – if the next particle is ‘spin-up’ do not explode the bomb.
    – if the particle is ‘spin-down’ immediately explode the bomb.

    Surely that works. We read a 256-bit code in the sky, and if it is perfect we reckon action at a distance happens for real, physically

    • Anton Garrett Says:

      Locality is ruled out by Bell. But I’m a determinist realist, although these are philosophical assumptions on which physics rests, rather than observable axioms. Junk those and I question whether someone is a committed physicist, although I didn’t put it so bluntly in the post.
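      Incidentally, a toy simulation makes plain why the bomb scheme above cannot send a chosen message. This is a hypothetical sketch (not from either correspondent), assuming perfect singlet anti-correlation: the two sides’ outcomes are perfectly anti-correlated, but each individual outcome is a fair coin flip that neither side can choose, so the Moon side receives a random bit string regardless of what Earth does. This is the content of the no-signalling theorem.

      ```python
      import random

      # Toy model of 256 singlet pairs measured along the same axis.
      # The outcomes are perfectly anti-correlated, but each individual
      # outcome is random and cannot be chosen by either observer.
      random.seed(0)

      n_pairs = 256
      earth_bits = [random.randint(0, 1) for _ in range(n_pairs)]  # not chosen
      moon_bits = [1 - e for e in earth_bits]                      # anti-correlated

      # The correlation is perfect...
      assert all(e != m for e, m in zip(earth_bits, moon_bits))

      # ...but Earth could not encode a chosen message (say, all zeros)
      # into the Moon-side string, so the bomb pattern carries no
      # information from Earth.
      print("Moon-side bits:", "".join(map(str, moon_bits[:32])), "...")
      ```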

  13. Adam Freese Says:

    Neo: Some people suggest that reality is operator-valued and our perplexities arise because of our obstinate insistence on thinking in – and therefore trying to measure – scalars.

    I’m intrigued by this idea, but I feel like it’s a bit vague as it is. Can you point me towards any sources which make this suggestion, and perhaps expound on it further?

  14. Anton Garrett Says:

    Ah, not the one I meant; David Mermin wrote a Rev Mod Phys article about “the two theorems of John Bell”.

Leave a comment