Is dark matter the caloric of our time?
I think it’s great that Wolfram’s approach has no free parameters and not only reproduces GR but is contributing to the algorithms physicists use to model gravity. The point about “free parameters” really amounts to saying “This isn’t a class of theories; it is a particular theory.” Whenever you introduce a parameter, you are really stating a constraint on the class of theories implied by that parameter.
Regarding the “caloric” aphorism, I was rather disappointed that what Wolfram was actually saying amounts to something more like:
“The ruliad will turn out to be the theory of everything including dark matter.”
He provided no specific argument for this explanation of dark matter in the video – it is pure conjecture. For instance, MOND has been dispatched by gravitational-lensing observations of dark matter clouds, and he didn’t address this. This “aphorism”, it seems to me, is on the same order of rhetorical posturing as his repetitive resort to “computational irreducibility”, which likewise strikes me as something as trivial as “the vast majority of dynamical systems have no closed-form solution”. Haven’t we known that to be the case ever since Newton and Leibniz, Stephen?
Having said that, it seems pretty obvious to me that whatever dark matter turns out to be, it had damn well better play the greatest part in alpha_G_proton, the proton’s dimensionless gravitational coupling G·m_p²/(ħc). This is something everyone seems to be missing for some strange reason. They think they can get away with not talking about the proton when talking about dark matter. What’s up with that? I mean, it isn’t as though alpha_G_proton isn’t a well-known parameter, and it isn’t as though its relationship to 2^127 (alpha_G_proton ≈ 2^-127) isn’t known to be even more remarkable than Dirac’s or Eddington’s large-number conjectures.
What’s wrong with these physicists?
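For concreteness, the coincidence is easy to check numerically. A quick sketch using CODATA values for the constants (the variable names here are my own):

```python
# Dimensionless gravitational coupling constant for the proton,
# alpha_G = G * m_p^2 / (hbar * c), compared against 2^-127.
G    = 6.67430e-11      # gravitational constant, m^3 kg^-1 s^-2
m_p  = 1.67262192e-27   # proton mass, kg
hbar = 1.054571817e-34  # reduced Planck constant, J s
c    = 2.99792458e8     # speed of light, m/s

alpha_G = G * m_p**2 / (hbar * c)

print(alpha_G)               # ~5.9e-39
print(2.0**-127)             # ~5.9e-39
print(alpha_G / 2.0**-127)   # ratio is within about half a percent of 1
```

The two numbers agree to roughly 0.5%, which is the “remarkable” relationship referred to above.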
My understanding of Wolfram’s conjecture about dark matter is that it is the gravitational effect of the underlying computation (of his multi-way graph of “atoms of space”) which causes what we observe as continuous spacetime to emerge. This computation (occurring at an unknown scale, but necessarily much smaller than the Planck length) is what creates the persistent structures we observe as particles and the evolution of those structures that we interpret as time. Energy, in his scheme, is just the density of the computation in a given region of the graph (or, coarse-grained, volume of space).
So, one might imagine that in regions which have concentrations of particles, there might be more intense computation going on to create the structures that compose them, and that this computation gives rise to the additional gravitational effects we attribute to dark matter.
As far as I know, none of the details of this have been worked out. At the moment it is all arm-waving, but then much the same could have been said of explaining heat by the motion of unobservable tiny molecules before we were able to directly detect them.
Maybe I should have been more specific in my critique. What really disappointed me was Wolfram’s failure to mention observations that decouple visible matter from dark matter – the most difficult of which to explain away is the Bullet Cluster.
Such observations eliminate entire classes of dark matter theories including any that rely on proximity to light matter.
Jack Sarfatti (@jacksarfatti) just brought a 2011 paper by Gerard 't Hooft (Nobel Prize in Physics, 1999) to my attention, “How a wave function can collapse without violating Schroedinger’s equation, and how to understand Born’s rule” (full text at link). Here is the abstract.
It is often claimed that the collapse of the wave function and Born’s rule to interpret the square of the norm as a probability, have to be introduced as separate axioms in quantum mechanics besides the Schroedinger equation. Here we show that this is not true in certain models where quantum behavior can be attributed to underlying deterministic equations. It is argued that indeed the apparent spontaneous collapse of wave functions and Born’s rule are features that strongly point towards determinism underlying quantum mechanics.
The key idea is that what we observe as probabilistic behaviour in quantum mechanics and the apparent mystery of the “collapse of the wave function” by a measurement may actually be coarse-grained behaviour of deterministic (classical) “sub-microscopic states” which underlie the microscopic (particles and fields) states we measure.
The degrees of freedom in terms of which we usually describe atoms, molecules, subatomic particles and their fields will be referred to as microscopic degrees of freedom. It is these that have to be described as superpositions of the sub-microscopic states, and in turn, the macroscopic states are superpositions of microscopically defined states. Perhaps the most accurate way to describe the situation is to say that the states we use to describe atoms, quantum fields, etc., serve as templates. A particle in the state |x⟩ or in the state |p⟩ or whatever, will nearly always be a superposition of many of the sub-microscopic states; as such, they evolve exactly according to Schrödinger equations. In contrast, the sub-microscopic states evolve classically. The macroscopic states also evolve classically, but the details of their evolution laws are far too complicated to follow, which is what we need the microscopic template states for. The macroscopic states, such as people and the indicators of measuring devices, are probabilistic distributions of the sub-microscopic “hidden variables”. We actually only observe macro-states.
When you think about how this might actually work, it is remarkably similar to the picture that’s emerging from the Wolfram Physics Project: a structure below that of the fields and particles we observe which behaves deterministically, but when aggregated into our observables appears to be probabilistic, just as a gas made up of many molecules undergoing deterministic, reversible collisions manifests statistical and irreversible behavior in the aggregate.
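The gas analogy can be made concrete with a toy model: a perfectly deterministic, time-reversible system whose coarse-grained description nonetheless looks irreversible. This sketch (entirely my own illustration, not anything from Wolfram or ’t Hooft) bounces point particles elastically in a unit box, starting them all in one corner, and watches the entropy of the binned position distribution rise:

```python
import math
import random

# Deterministic, reversible dynamics: free flight with elastic
# reflection off the walls at 0 and 1. Reversing every velocity
# retraces the trajectory exactly, yet the coarse-grained
# (binned) distribution relaxes irreversibly toward uniform.
random.seed(1)
N = 2000
pos = [random.uniform(0.0, 0.1) for _ in range(N)]   # all in the left 10%
vel = [random.uniform(-1.0, 1.0) for _ in range(N)]

def step(dt=0.01):
    for i in range(N):
        x = pos[i] + vel[i] * dt
        if x < 0.0:                      # reflect off the left wall
            x, vel[i] = -x, -vel[i]
        elif x > 1.0:                    # reflect off the right wall
            x, vel[i] = 2.0 - x, -vel[i]
        pos[i] = x

def coarse_entropy(bins=10):
    # Shannon entropy of the particle counts in equal-width bins.
    counts = [0] * bins
    for x in pos:
        counts[min(int(x * bins), bins - 1)] += 1
    return -sum((c / N) * math.log(c / N) for c in counts if c)

s0 = coarse_entropy()                    # 0: everything in one bin
for _ in range(500):
    step()
s1 = coarse_entropy()                    # rises toward log(10)
print(s0, s1)
```

Nothing probabilistic happens at the “sub-microscopic” level here; the apparent irreversibility lives entirely in the coarse-grained description, which is the point of the analogy.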
But what about entanglement and Bell’s theorem? Didn’t that rule out any such local, realistic hidden-variable theory? Well, yes, but ’t Hooft notes that the underlying (sub-microscopic) structure may well be correlated over spacelike intervals.
Spacelike correlations can of course be understood if systems have a common distant past; therefore, by itself, the finding that there must be strong spacelike entanglement constitutes no contradiction, but it seems quite amazing. What must be kept in mind is that, in our cellular automaton models, the state commonly referred to as the vacuum is by no means featureless; there are vacuum fluctuations, and this means that the vacuum is a highly complex solution of local field equations. Thus, the vacuum may show all sorts of correlations with matter, even over large (spacelike) distances.
This is essentially the same as Wolfram’s distant connections in the graph being observed as entanglement.