I’ve been admonished regarding my use of the acronym “AIC” to mean “Algorithmic Information Criterion”.
In defense of “AIC”, and with acute awareness that the Akaike Information Criterion has not only occupied that acronym but is also the most widely cited information criterion, with the possible exception of the Bayesian Information Criterion:
The gold standard information criterion requires Turing-complete codes, and none of Wikipedia’s twenty listed information criteria qualifies. Even “GIC”, for Gold standard Information Criterion, is already taken: though it appears nowhere on that list, it resolves to “Generalized Information Criterion”.
It is no hyperbole to characterize this as a disastrous state of affairs. The veritable explosion of information criteria is symptomatic of an underlying intellectual malaise, qualitatively on a par with physics having remained stuck in the ancient Greek kinematic mode of description. Imagine Newton’s dynamics relegated to a status so esoteric that the “fringe” physicists who insist on it are met with puzzlement.
Statistics, in accord with its name, rarely ventures into the recurrent (let alone recursive) mode of description required by dynamics, hence by reality, hence by the very notion of truth.
Choosing to violate Akaike’s memetic “turf” is not in spite of his criterion being the most widely cited, but precisely because it is the most widely cited. The time is long past due to put the entire field of statistical “information” criteria into perspective.
“Parameter” counts? Good grief. Give me one “parameter” and I can, with arithmetic coding, encompass the universe.
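Since I invoked arithmetic coding, here is a minimal sketch (purely illustrative, mine and not anyone’s library) of the degenerate limit of that trick: any finite data set, serialized to bits, fits into the binary expansion of a single real-valued “parameter”, so counting parameters bounds nothing.

```python
from fractions import Fraction

def encode(bits: str) -> Fraction:
    """Pack an arbitrary bit string into one number in [0, 1)."""
    # Sentinel '1' bits preserve leading zeros and keep the numerator odd,
    # so the Fraction never reduces the information away.
    n = int('1' + bits + '1', 2)
    return Fraction(n, 2 ** n.bit_length())

def decode(theta: Fraction) -> str:
    """Recover the bit string from the lone 'parameter'."""
    return bin(theta.numerator)[2:][1:-1]  # strip '0b' and both sentinels

data = '0110100110010110'      # any data set, serialized to bits
theta = encode(data)           # the single real-valued "parameter"
assert decode(theta) == data   # the entire data set comes back out of it
```

A real arithmetic coder would first squeeze the bits through a predictive model; the point here is only that one continuous parameter has unbounded capacity, which is exactly why parameter counts measure nothing about model complexity.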
Every single bit of algorithmic information is a parameter in the Gold Standard.
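For concreteness, the gold standard can be stated in one line (this is essentially Kolmogorov complexity relative to a universal machine; the formalization is my paraphrase, not a canonical definition):

$$\mathrm{IC}_{\text{gold}}(x) \;=\; \min\{\, |p| \;:\; U(p) = x \,\}$$

where $U$ is a universal (Turing-complete) machine, $p$ ranges over programs for $U$, $|p|$ is the length of $p$ in bits, and $x$ is the complete data set. The preferred model is whatever shortens the winning $p$; every one of its $|p|$ bits is a “parameter”, and nothing else is.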