You don’t need to apologise to disagree! Disagreements are the stepping stones of scientific progress. :)
The WWII-era analogue tech is truly fascinating to me. But when I talk about analogue computers making a comeback, I’m not talking about the old ones, but about the next generation of analogue computing.

It is true that if you try to simulate the exact phenomenon one-to-one, the number of parameters grows enormously. But think ‘analogy’: when a child plays with a toy car, the toy does not have the complex parts of the real thing, yet it does the job of making the child understand that a car has four wheels, runs on the ground, cannot float on water, and so on. For the child’s purposes, that suffices. What I mean is that useful analogies reduce the dimensionality of the problem being studied; otherwise, in order to simulate the universe, we would have to build another one. If you are sceptical about this point, consider the cases where research scientists built analogue models exploiting the fact that sound cannot travel back upstream past a waterfall, just as light cannot escape a black hole. I cannot go into detail, but I personally use similar techniques to gain edges in financial markets.

I could go on with examples, but the point is: the next generation of analogue computers need not resemble the past ones. Think of computers that enable rapid architecture switching based on new analogies the user develops on the fly; this cannot, of course, replace general-purpose computers.
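To make the ‘analogy’ point a little more concrete, here is a minimal toy sketch (my own illustration, not something from the article): the classic mechanical–electrical analogy that older analogue computers relied on. A damped spring–mass system and a series RLC circuit obey the same differential equation, so building and measuring the circuit effectively ‘computes’ the mechanics. All parameter values below are arbitrary.

```python
# Toy illustration of computing by analogy (assumed, illustrative values).
# Damped spring-mass system:  m*x'' + c*x' + k*x = 0
# Series RLC circuit:         L*q'' + R*q' + (1/C)*q = 0
# Same equation, different hardware -- so the circuit can "compute" the mass.
import numpy as np

def simulate(a, b, c, y0, v0, dt=1e-3, steps=5000):
    """Integrate a*y'' + b*y' + c*y = 0 with simple semi-implicit Euler steps."""
    y, v, ys = y0, v0, []
    for _ in range(steps):
        acc = -(b * v + c * y) / a
        v += acc * dt
        y += v * dt
        ys.append(y)
    return np.array(ys)

# Mechanical parameters (arbitrary)
m, c_damp, k = 2.0, 0.5, 8.0
# Electrical analogue: L <-> m, R <-> c, 1/C <-> k
L, R, C = m, c_damp, 1.0 / k

x = simulate(m, c_damp, k, y0=1.0, v0=0.0)     # mass position over time
q = simulate(L, R, 1.0 / C, y0=1.0, v0=0.0)    # capacitor charge over time

print(np.allclose(x, q))  # True: identical trajectories
```

The circuit does not reproduce the mechanical system part-for-part; it only shares the governing equation, and that is exactly the kind of dimensionality reduction I mean.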
Current digital computers are very good at parallel computing, but they are running into several limitations. Two of the prominent ones are: a) the transistor-density issue I mentioned in the article, and b) power draw. There are analogue-computer developers today who are trying to model a non-deterministic, parallel-processing analogue computer on the neuron network of the human brain. Their inspiration is the fact that the human brain is a reasonable (super?)computer that runs on 20–30 W, whereas the best digital supercomputers we have run at megawatt scales.
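Just to put that power gap in perspective, here is a rough back-of-envelope comparison. The throughput numbers are order-of-magnitude assumptions of mine (brain ‘operation’ counts in particular are notoriously uncertain), not figures from the article:

```python
# Back-of-envelope energy-per-operation comparison.
# All figures are assumed, order-of-magnitude estimates for illustration only.
brain_power_w   = 20.0    # ~20-30 W, as mentioned above
brain_ops_per_s = 1e15    # very rough estimate of synaptic events per second

hpc_power_w     = 2e7     # ~20 MW class exascale machine
hpc_ops_per_s   = 1e18    # ~1 exaFLOP/s

brain_j_per_op = brain_power_w / brain_ops_per_s
hpc_j_per_op   = hpc_power_w / hpc_ops_per_s

print(f"brain : ~{brain_j_per_op:.0e} J/op")              # ~2e-14 J/op
print(f"exaHPC: ~{hpc_j_per_op:.0e} J/op")                # ~2e-11 J/op
print(f"ratio : ~{hpc_j_per_op / brain_j_per_op:.0f}x")   # ~1000x
```

Even with generous error bars, the energy-per-operation gap spans a few orders of magnitude, which is the margin the neuromorphic/analogue developers are chasing.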
Currently, digital (general-purpose) computation is arguably abused for single-purpose tasks such as crypto mining, population modelling, epidemic/pandemic drug development, autonomous driving, AI, and so on. These are fields that could benefit directly from next-gen analogue computing (though I’m not entirely sure about crypto mining, as I haven’t studied the maths behind it yet). Any product that interfaces with human beings is likely to benefit from this development as well. Currently, more than half the components of a smartphone (by density) are analogue. Next-gen tech like Neuralink would only demand more analogue computing. Power draw becomes critical once computers start entering the human body, and current digital architectures are bottlenecked precisely on power draw.
All that said, I am not trying to convince you for or against analogue computing. I just find the whole computation space fascinating, and very clever people are turning their attention to analogue and hybrid computing (this is an observation). I don’t know what the future holds. So, while I have my opinions, I would not bet my money against digital computing or for analogue. We need the best solutions for humanity, whichever way it goes. :)