To illustrate the most basic advantages of analog computing, consider processing analog signals that are described by a set of differential equations. Because continuous time doesn't exist in the digital computing paradigm, a digital computer must sample the input at discrete intervals to build a sampled approximation of the signal, then numerically update the equations at every time step. The result is a large number of computations per second of signal, which has the cascading effect of higher latency, increased power consumption, and so on.
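To make this concrete, here is a minimal sketch of what the digital side of that comparison looks like: numerically integrating a first-order differential equation, dv/dt = (v_in - v)/(RC), with the forward Euler method. The component values, time step, and variable names are all illustrative assumptions, not taken from the text; the point is only the step count. An analog RC circuit settles to the same answer continuously, with no per-sample arithmetic at all.

```python
# Illustrative sketch (all values assumed): digitally simulating an
# RC low-pass filter, dv/dt = (v_in - v) / (R * C), via forward Euler.

R = 1e3          # resistance in ohms (assumed)
C = 1e-6         # capacitance in farads (assumed), so RC = 1 ms
dt = 1e-6        # time step: one sample per microsecond
t_end = 0.01     # simulate 10 ms of signal

v = 0.0          # capacitor voltage, initially discharged
v_in = 1.0       # step input of 1 V
steps = int(t_end / dt)

for _ in range(steps):
    # One update per sample, every time step, just to track a
    # signal the analog circuit would follow continuously.
    v += dt * (v_in - v) / (R * C)

print(f"{steps} update steps for {t_end * 1e3:.0f} ms of signal; v = {v:.4f} V")
```

Even this toy example takes 10,000 update steps to cover 10 ms of signal, and the count grows with bandwidth and simulation length; that per-sample work is exactly the latency and power cost the paragraph above describes.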