[Physics] Does a capacitor ever get fully charged?

capacitance, electricity

The time $t$ taken by a capacitor of capacitance $C$, charging through a series resistance $R$, to accumulate a charge $q$ is given by the equation

$$t = \tau \ln\left(\frac{Q}{Q-q}\right), $$

where $\tau$ is the time constant given by $\tau = RC$ and $Q$ is the maximum charge the capacitor can have when fully charged in that circuit.
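
For completeness, this expression follows from the usual exponential charging law; a short derivation, assuming the capacitor starts uncharged at $t = 0$, is

$$q(t) = Q\left(1 - e^{-t/\tau}\right) \;\Longrightarrow\; e^{-t/\tau} = \frac{Q-q}{Q} \;\Longrightarrow\; t = \tau \ln\left(\frac{Q}{Q-q}\right).$$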

To find the time taken by the capacitor to become fully charged, we put $q = Q$ on the right-hand side of the above equation, which gives

\begin{align}t &= \tau \ln\left(\frac{Q}{0}\right) \\
\implies t &= \tau \ln \infty \\
\implies \lim_{q\to Q} t &= \infty.\end{align}
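
Putting numbers into the same formula, any fraction of the full charge short of $100\%$ is still reached in finite time; for example

$$t_{99\%} = \tau \ln\left(\frac{Q}{0.01\,Q}\right) \approx 4.6\,\tau, \qquad t_{99.9\%} = \tau \ln\left(\frac{Q}{0.001\,Q}\right) \approx 6.9\,\tau.$$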

This gives me a feeling that a capacitor never gets charged fully.
Am I right?
Why not?

Best Answer

This gives me a feeling that a capacitor never gets charged fully. Am I right? Why not?

In the context of ideal circuit theory, it is true that the current through the capacitor asymptotically approaches zero and thus, the capacitor asymptotically approaches full charge.

But this is of no practical interest, since it is just an elementary mathematical model that cannot be applied outside the context in which its assumptions hold.

For example, this model assumes that the charge $q(t)$ is continuous and that there is no noise. However,

  • electron current is not continuous; the charge changes in discrete steps of one electron charge (see the rough sketch after this list).

  • there is ever-present, random noise and, after some number of time constants, the charging current predicted by the simple model falls below the noise floor.
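
To put a rough number on the first point, here is a minimal sketch in Python; the component values `R`, `C`, and `V` are illustrative assumptions, not taken from the question. It asks after how many time constants the ideal model's remaining (still undelivered) charge drops below a single electron charge:

```python
import math

# Illustrative, assumed component values (not from the question)
R = 1e3          # series resistance, ohms
C = 1e-6         # capacitance, farads
V = 5.0          # supply voltage, volts
e = 1.602e-19    # elementary charge, coulombs

tau = R * C      # time constant, seconds
Q = C * V        # full charge predicted by the ideal model, coulombs

# The ideal model says the charge still missing at time t is Q * exp(-t / tau).
# Solve Q * exp(-t / tau) = e for the time at which that shortfall equals one electron charge.
t_one_electron = tau * math.log(Q / e)

print(f"tau = {tau:.1e} s")
print(f"shortfall < one electron after {t_one_electron / tau:.1f} time constants "
      f"({t_one_electron * 1e3:.1f} ms)")
```

With these values the shortfall is below one electron after roughly $31\tau$ (about 31 ms); beyond that point the continuous model no longer describes anything physical.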

Since the capacitor goes from zero charge to better than 99% charged in $5\tau$, we typically use this as the time required to 'fully' charge the capacitor.
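
A minimal numerical check of that rule of thumb (same ideal model, no component values needed):

```python
import math

# Fraction of the full charge Q reached after n time constants, per q(t) = Q * (1 - exp(-t/tau))
for n in range(1, 6):
    fraction = 1 - math.exp(-n)
    print(f"after {n} tau: {100 * fraction:6.3f} % charged")
```

This prints roughly 63.2 %, 86.5 %, 95.0 %, 98.2 %, and 99.3 %, confirming that five time constants take the capacitor past the 99 % mark.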
