Will Machine Learning Spark the Return of Analogue Computers?

I believe Betteridge’s law applies here. We’ve heard it twice before: first with perceptrons in the 1960s, then with the flurry of analogue connectionism in the 1980s. Will the third time be the charm? Don’t bet on it: every time somebody proposes analogue computation as a way to surpass the speed of digital computing, you only have to wait a couple of years for the exponentially growing performance of digital to catch up to and blow past the painfully achieved accomplishments of analogue.

Analogue computation is hard to scale because of its inherent sloppiness: you don’t just have to distinguish a one from a zero; you need to track the behaviour of the function you’re evaluating over its entire domain, and that requires tight tolerances which are difficult to maintain as you make features smaller and smaller and pack more and more onto a chip. Digital can just beat the problem into submission with massive parallelism, surpassing whatever advantage individual analogue devices may have.
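To make the tolerance point concrete, here’s a rough back-of-the-envelope sketch (my own illustration, not a model of any real analogue chip, and the 1% per-operation error is just an assumed figure): a long dot product, the workhorse of neural nets, computed exactly versus with a small analogue-style perturbation on every multiply and every accumulate. The perturbations compound, and the longer the chain, the worse the drift.

```python
import numpy as np

# Sketch only: compare an exact "digital" dot product against an "analogue"
# one where every multiply and accumulate picks up a small relative error,
# standing in for component tolerances, noise, and drift. The 1% figure and
# the fan-in of 1024 are assumptions for illustration.

rng = np.random.default_rng(0)
n = 1024            # length of the dot product (e.g. one neuron's fan-in)
tolerance = 0.01    # assumed 1% per-operation analogue error

x = rng.standard_normal(n)
w = rng.standard_normal(n)

exact = np.dot(w, x)    # digital result: bit-exact and reproducible

# Analogue-style evaluation: each product and each partial sum is perturbed,
# so errors introduced early are re-scaled by every later stage.
acc = 0.0
for wi, xi in zip(w, x):
    product = wi * xi * (1 + tolerance * rng.standard_normal())
    acc = (acc + product) * (1 + tolerance * rng.standard_normal())

print(f"exact = {exact:.4f}, analogue-ish = {acc:.4f}, "
      f"relative error = {abs(acc - exact) / abs(exact):.2%}")
```

Run it a few times with different seeds and the relative error swings by tens of percent, while the digital answer never moves. That is the gap that tighter analogue tolerances have to close, and it gets harder as devices shrink.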
