Modified Newtonian Dynamics versus Elusive Dark Matter

“A Philosophical Approach to MOND: Assessing the Milgromian Research Program in Cosmology”, by David Merritt, ISBN 978-1-108-49269-0, 270 pages (2020).

Publishers warn authors that each successive equation & graph in a book cuts the size of the remaining audience in half. And putting “Philosophical” in the title is a sure way to scare off most of the rest. However, Dr. Merritt’s approach requires his careful definitions of terms and a solid mathematical basis. Consequently, very few people will ever read this book – which is unfortunate, because massive computer simulation models which assume the existence of never-directly-observed “Dark Matter” seem to be having the same kind of stultifying effect on cosmology that massive computer Global Circulation Models have had on rational discussion about alleged Catastrophic Anthropogenic Global Warming.

The “philosophy” part relates to Dr. Merritt’s review of how science really aims to work. Nothing in science is ever “settled”, contrary to the ravings of an Algore. E.g., there was a time when the best science postulated that atoms were indivisible. The primary test of the adequacy of a scientific hypothesis is its ability to make testable predictions; for example, the inability of String Theory to provide any testable predictions is what has relegated that hypothesis to the sidelines after several decades.

But – Dr. Merritt reminds us – the ability to make accurate predictions does not prove that a hypothesis is correct, only that it is useful. Before Kepler, astronomers subscribed to the Aristotelian theory that heavenly bodies move in perfectly circular orbits. However, that hypothesis was obviously inconsistent with the observed retrograde movements of the planets. Ancient astronomers doggedly held on to the assumption of perfect circular motion and invented the concept of epicycles – heavenly bodies move in small perfect circles centered on large perfect circles. One can make accurate predictions about planetary motions based on that epicyclic model – but that does not mean it describes reality.

“Dark Matter” was originally proposed by Zwicky in 1937 – the same Zwicky who proposed the “Tired Light” hypothesis – to explain certain observed divergences between the movements of stars and the predictions of Newtonian mechanics. The Dark Matter hypothesis assumes that the universe contains rather a lot of invisible stuff with unmeasurable properties which interacts with observable matter only through gravitational effects.

Correcting predictions to match observation through invoking the invisible presence of non-observable Dark Matter sounds like what an engineer would call a “fudge factor”. Not that an engineer would say there is anything necessarily wrong with fudge factors when properly used. For example, the stupefying mathematical complexity of turbulent fluid flow in pipes means that most fluid flow calculations are based on fudging a much simpler physical model with an empirically-derived Fanning friction factor.
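As a concrete illustration of a legitimate fudge factor, here is a minimal sketch (not from the book) of the kind of calculation meant, assuming turbulent flow in a smooth pipe and a common Blasius-type correlation for the Fanning friction factor; the numbers are illustrative, not a design calculation:

```python
# Toy pressure-drop calculation for turbulent flow in a smooth pipe.
# The Fanning friction factor f is an empirical "fudge factor" standing in
# for the intractable details of turbulence. Correlation and numbers below
# are illustrative assumptions only.

def fanning_friction_factor(reynolds):
    """Blasius-type correlation for smooth pipes, roughly valid for 4e3 < Re < 1e5."""
    return 0.079 * reynolds ** -0.25

def pressure_drop(length_m, diameter_m, velocity_m_s, density_kg_m3, viscosity_pa_s):
    """Pressure drop written with the Fanning factor: dP = 4 f (L/D) (rho v^2 / 2)."""
    reynolds = density_kg_m3 * velocity_m_s * diameter_m / viscosity_pa_s
    f = fanning_friction_factor(reynolds)
    return 4.0 * f * (length_m / diameter_m) * 0.5 * density_kg_m3 * velocity_m_s ** 2

# Water-like fluid through 100 m of 0.05 m pipe at 2 m/s (illustrative values).
print(pressure_drop(100.0, 0.05, 2.0, 1000.0, 1e-3), "Pa")
```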

The issue with assuming that certain apparently-anomalous cosmological observations are due to Dark Matter is reminiscent of the story about the lawyer who was asked the value of 2 + 2: his response was “What do you need it to be?” In the absence of actual measurements of Dark Matter, astronomers are free to assume whatever amount & distribution of hypothesized Dark Matter is required to “explain” the anomaly.

In 1983, Milgrom proposed an alternative to this “hidden mass hypothesis”: when gravitational accelerations are very low, such as in the wide-open cosmological spaces on the outskirts of galaxies and between galaxies, real behavior deviates from standard Newtonian gravity. This alternative hypothesis is also capable of explaining apparently-anomalous cosmological observations. (However, a cynic – not Dr. Merritt – might point out that MOND does not provide a rationale for academics to spend literally $Billions of mostly taxpayer money on career-enhancing satellites, telescopes, & particle colliders searching for Dark Matter).
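For readers who want the quantitative flavor: in the deep-MOND regime (accelerations well below Milgrom's constant a_0, about 1.2e-10 m/s^2), the effective acceleration is roughly the geometric mean of a_0 and the Newtonian value, which produces flat rotation curves far from a galaxy's visible mass. The sketch below is my own minimal illustration of that limit for an idealized point mass, not anything taken from the book; the "simple" interpolation function used is one common choice among several.

```python
# Circular-orbit speed around a point mass: Newtonian gravity versus MOND with
# the "simple" interpolation function, a * (a / (a + A0)) = GM/r^2.
# Illustrative sketch only; SI units, galaxy idealized as a point mass.
import numpy as np

G = 6.674e-11          # m^3 kg^-1 s^-2
A0 = 1.2e-10           # m s^-2, Milgrom's acceleration constant
M = 1e41               # kg, roughly a mid-sized galaxy's baryonic mass (assumed)

def newtonian_speed(r):
    return np.sqrt(G * M / r)

def mond_speed(r):
    """Solve a^2 - a_N a - a_N A0 = 0 for the physical acceleration a, then v = sqrt(a r)."""
    a_n = G * M / r**2
    a = 0.5 * (a_n + np.sqrt(a_n**2 + 4.0 * a_n * A0))
    return np.sqrt(a * r)

radii = np.logspace(19, 21, 5)   # roughly 0.3 kpc to 30 kpc, in metres
for r in radii:
    print(f"r = {r:.1e} m  v_Newton = {newtonian_speed(r):.0f} m/s  v_MOND = {mond_speed(r):.0f} m/s")
```

At large radii the Newtonian speed keeps falling while the MOND speed levels off, which is the behavior the hypothesis was built to reproduce.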

Although Dr. Merritt does not make this point, there is a relevant historical analogy. Newton’s laws of motion and gravitation, from the 17th Century, were wildly successful in predicting the movements of the planets. When 19th Century astronomers became aware that Newtonian dynamics did not properly predict the movement of the planet Mercury, they assumed that Newtonian dynamics were correct and postulated the existence of a never-observed planet Vulcan between Mercury and the Sun. Later, Einstein demonstrated that the divergence between Newtonian theory and observation lay in the fact that Newtonian dynamics become a poor approximation to reality for a planet orbiting as deep in the Sun’s gravitational field, and moving as fast, as Mercury does. The never-observed planet Vulcan does not exist.

Today, astronomers observe a divergence between Newtonian predictions about the motions of stars around distant galaxies and hypothesize the existence of a huge amount of non-observable matter in the Universe to account for the divergence. Perhaps the issue is just another limitation to the range of applicability of Newtonian dynamics?

Dr. Merritt clearly tries to be scientifically very even-handed in his assessment of the competing theories. Neither hypothesis explains all observations, and both theories are still appropriately evolving. However, he concludes that MOND has made some predictions which were subsequently validated by observations, whereas the Dark Matter hypothesis has yet to make a prediction (versus making retroactive matches to observations).

Is MOND correct? In this reader’s view, MOND suffers from the same issue as Newton’s original theory of gravity – there is no physical explanation for the mathematical relationship. But the difficulties with both MOND and Dark Matter suggest that neither hypothesis is likely to be the whole story. From the perspective of human progress, the critical issue is that we all should retain a scientific open-mindedness based on a humble recognition of the limits to our current understanding.

2 Likes

I think it’s great that there’s an effort to challenge Einstein, just because science should be a back and forth, not a declaration from above.

And while MOND does attempt to challenge the need for dark matter to explain galactic rotation rates, it doesn’t attempt to explain extra gravitational lensing, etc. Recent studies of wide binaries seem to definitively retire MOND as well.

2 Likes

One of Dr. Merritt’s themes is that neither MOND nor invisible Dark Matter currently gives completely satisfactory explanations, and both theories are still evolving. It is probably too early to say that the wide binaries studies have “definitively” retired MOND.

What has disturbed me personally about the convenient invisible Dark Matter hypothesis is the astonishingly large amounts of money which have been committed to pursuing it, and the many careers that are wrapped up in continuing the search. It has the aroma of Catastrophic Anthropogenic Global Warming about it. But if definitive evidence for the existence of invisible Dark Matter ever appears, that will resolve the issue.

1 Like

This is confused and confounds the question of the geocentric and heliocentric models with whether orbits are circular or elliptical. Epicycles were introduced to attempt to reconcile the observation of retrograde motion with geocentric orbits, but after the heliocentric model was accepted, they continued to be used because Copernicus continued to believe orbits must be circular (what else could be consistent with the “heavenly spheres”?). In fact, the Copernican model had as many epicycles as the Ptolemaic system. It was only after Kepler discovered, by patient reduction of observations (particularly of the orbit of Mars), that planetary orbits were elliptical that the motion of the planets could be described without epicycles. The reason orbits were elliptical remained unknown until Newton described universal gravitation, which explains elliptical motion as a consequence of the inverse square law of attraction.

No. Zwicky’s hypothesis of dark matter (in 1933, not 1937) was based upon the motion of galaxies within the Coma cluster of galaxies, not stars. He observed that the velocities of galaxies in the cluster were much higher than was consistent with the galaxies being gravitationally bound to the cluster, and that if they were not bound, the cluster would have dispersed long ago. This led him to suggest there must be unseen mass, much greater than that for which the visible galaxies could account, in order for the cluster to remain gravitationally bound. All subsequent observations have confirmed this supposition.

It wasn’t until the 1970s that precision measurements of the rotation curves of spiral galaxies confirmed earlier indications that the motion of stars in galaxies did not behave as expected from Newtonian gravitation acting on the visible objects in those galaxies, suggesting the presence of an invisible halo surrounding each galaxy which is far more massive than its visible stars and gas.

Modified Newtonian Dynamics (MOND) was developed in the 1980s to explain the anomalous rotation curves of galaxies without resort to dark matter. It assumes that Newtonian gravity departs from the inverse square law at very low accelerations which are never encountered in the solar system or in laboratory experiments. The parameter at which the departure occurs is a “fudge factor” chosen to make the model agree with observations, without any physical mechanism being suggested for the departure or for the value of the parameter. MOND was not developed to explain the other rationales for supposing the existence of dark matter, such as galaxy motion in clusters, gravitational lensing, the observed process of structure formation in the early universe, and other conflicts with observation, some of which MOND makes worse.

No, it isn’t. It does not explain either the anisotropies in the cosmic background radiation or the observed hierarchy of structure formation in the universe, both of which are completely consistent with the Lambda-CDM (dark matter and energy) model.

If he concludes this, he is wrong. MOND makes predictions, such as for the behaviour of galaxies in clusters, gravitational lensing, the dynamics of stellar motion in ultra-diffuse galaxies, and widely separated binary star systems, which have been falsified by observations. You can work around some of these by piling more and more complexity onto the theory, but then that’s like adding epicycles to a flawed model of the solar system. The dark matter model, conversely, has not only made predictions, but made one of the most successful predictions in the long history of observational astronomy. Before the first precise measurements of the anisotropies in the cosmic background radiation, the dark matter and dark energy model made a precise prediction of their power spectrum. Here is the spectrum predicted by the Lambda-CDM model (solid line) and observations by four independent experiments, on Earth and in space, made years after the prediction.

[Figure: CMB angular power spectrum predicted by the Lambda-CDM model (solid line) with data points from four independent experiments]
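The predicted curve in plots like this is a routine computation today. Below is a minimal sketch using the publicly available camb Python package; the cosmological parameter values are illustrative, roughly Planck-like assumptions, not the ones used for the original prediction.

```python
# Compute the Lambda-CDM CMB temperature power spectrum (the solid curve in
# plots like the one above). Parameter values are illustrative assumptions;
# install the package with `pip install camb`.
import camb

pars = camb.CAMBparams()
pars.set_cosmology(H0=67.5, ombh2=0.022, omch2=0.122)   # baryon and cold dark matter densities
pars.InitPower.set_params(As=2e-9, ns=0.965)            # primordial power spectrum
pars.set_for_lmax(2500, lens_potential_accuracy=0)

results = camb.get_results(pars)
powers = results.get_cmb_power_spectra(pars, CMB_unit='muK')
tt = powers['total'][:, 0]    # D_ell^TT in muK^2, indexed by multipole ell

for ell in (10, 100, 220, 500, 1000):
    print(f"ell = {ell:4d}   D_ell^TT = {tt[ell]:.1f} muK^2")
```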

No. A “fudge factor” is an arbitrary correction added to a theoretical prediction to make it agree with observation, which is precisely what the a_0 parameter in MOND does to Newtonian gravity. Positing something unobserved as an alternative to discarding a theory which has passed every experimental test for decades or centuries has a long and often successful history. The neutrino was originally postulated by Wolfgang Pauli in 1930 to explain observations which seemed to indicate the process of nuclear beta decay violated the laws of conservation of energy, momentum, and angular momentum. It was believed at the time that it might never be possible to detect the neutrino, but the neutrino fit so well into the rest of particle physics while preserving conservation laws which had never been seen to be violated in any other circumstance that it became widely accepted. It was not until 1956, a quarter century later, that the neutrino was directly detected in the emissions from a nuclear reactor, a technology which was not imagined when the neutrino was proposed. The neutrino, which does not interact with other matter except through the weak interaction (and gravity, which is negligible for a particle with so little mass as the neutrino), is similar to dark matter, which is assumed to interact with other particles only via gravity. This makes it difficult to detect directly, but not impossible, which is why experiments to detect it have been and are being conducted.

Such an observer would not only be cynical, but ignorant. Direct funding of experiments for direct detection of dark matter is minuscule by the standards of Big Science. The satellites, telescopes, and particle physics experiments which have produced results considered relevant to dark matter were almost entirely built for other purposes, and the observations they conduct would be almost identical if nobody had ever heard of dark matter.

5 Likes

[Citation needed.] The Wikipedia page, “Direct detection of dark matter”, lists all of the experiments dedicated to detecting various dark matter candidates. There are about five or six of them, all modestly funded, and mostly piggybacking onto other projects such as underground laboratories, particle physics experiments built for other purposes, or astronomical observations made as general surveys. There are probably fewer than 50 people involved in this work, and only a minority of them have careers “wrapped up” in the quest for dark matter.

2 Likes

The reference provided in Dr. Merritt’s book is:
Zwicky, F. 1937. On the Masses of Nebulae and of Clusters of Nebulae. The Astrophysical Journal 86, 217–245.

But I have not gone to the library to check that 1937 volume.

Rather than getting hot under the collar about my observations, Mr. Walker, it would be really great if you would track down a copy of Dr. Merritt’s book and read it yourself. He is clearly trying to get to the truth about a confusing (and confused) issue, and is even-handed and non-polemical (qualities which I unfortunately do not share). Your views would be interesting & informative.

One of the intriguing points Dr. Merritt makes about cosmology is that many of our “measurements” actually require considerable assumptions in order to extract the quoted parameter from what is actually measured. That does not mean our derived measurements are wrong – only that we should be aware of the limitations, and avoid giving science another case of Global Warming.

We will simply have to agree to disagree about cosmologists’ use of invisible “Dark Matter” as a funding/career device. I have listened to too many talks where astronomers have emphasized how critical their latest telescope/satellite/etc would be for the search for invisible “Dark Matter”. Maybe the writers of Wikipedia have missed that?

A “fudge factor” is an arbitrary correction added to a theoretical prediction to make it agree with observation, which is precisely what the a_0 parameter in MOND does to Newtonian gravity.

Exactly! We are in complete agreement about that. Now explain how that is different from assuming the existence of arbitrarily large amounts of unobservable material in order to make the results of calculations with conventional Newtonian dynamics fit reality?

Impressive fit.

A procedural nit:

In generating that graph they did admit to leaving out some data. While I trust them to have been intellectually honest in doing so, this illustrates the importance of the distinction between “data selection”, “model generation” and “model selection”.

What “data selection” in fact does is select not only the data that will go into “model generation” (as distinct from “model selection”); it also selects scientific colleagues – those scientists who agree with the criteria by which that data was selected.

In the present instance, you have a number of more-or-less “raw” observations of the CMB that look something like:

[Image: WMAP CMB fluctuations map]

They are put through a variety of processes to produce something that looks more like what they think the CMB looks like:

Then these “multipole spectra” get abstracted from the CMB – this one from 25 years ago with much bigger error bars on the data points than at present:

[Figure: Primary CMB Anisotropy at Arcminute Scales]

What elimination of these data points in a recent paper says to the scientific community is:

“If you don’t agree with our data selection criteria, then you aren’t a colleague worth talking to in the generation of models let alone the selection of which model best fits the data we’ve selected.”

The Algorithmic Information Criterion for model selection explicates this by, in essence, saying:

“If you can’t losslessly compress “the data” and do so better than any of your colleagues who are worth talking to, then your model isn’t the best basis for generating hypotheses.”
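A toy illustration of that compression criterion (my own hedged sketch, not from any paper under discussion): score competing models by the bytes needed to state their parameters plus the losslessly compressed residuals they leave unexplained, and prefer the smallest total.

```python
# Toy stand-in for selection by algorithmic information content: the "cost" of
# a model is the bytes to state its parameters plus the zlib-compressed size of
# its quantized residuals. Data and candidate models are synthetic.
import zlib
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)
data = 3.0 * x + 2.0 + rng.normal(0.0, 0.5, x.size)      # truth: a straight line plus noise

def description_length(params, residuals, quantum=0.01):
    """Crude two-part code: 8 bytes per parameter + compressed quantized residuals."""
    quantized = np.round(residuals / quantum).astype(np.int32)
    return 8 * len(params) + len(zlib.compress(quantized.tobytes()))

scores = {}
for degree in (1, 2, 5, 15):                              # candidate polynomial models
    coeffs = np.polyfit(x, data, degree)
    residuals = data - np.polyval(coeffs, x)
    scores[degree] = description_length(coeffs, residuals)

best = min(scores, key=scores.get)
print(scores)
print("preferred polynomial degree:", best)
```

The over-fitted high-degree polynomials pay for their extra parameters without shrinking the compressed residuals enough to justify them, which is the essence of the criterion quoted above.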

2 Likes