[Physics] How does a resistor affect the voltage on a capacitor

capacitance, electric-circuits, electrical-resistance, homework-and-exercises

I am working on a problem involving a capacitor in series with a resistor. The circuit has a capacitor of capacitance $200\,\mu\text{F}$ connected in series with a resistor of resistance $470\,\text{k}\Omega$, all powered by a $1.5\,\text{V}$ cell. There is also a switch to turn the current on and off.

If I wanted to calculate the maximum energy stored on the capacitor, I would assume I can use $E = \frac{1}{2}CV^2$, but I am not sure whether I can use $V = 1.5\,\text{V}$.

I know the voltage divides between components in a series circuit, but I'm not sure how it divides here, because a capacitor does not have resistance in the usual sense and the current is constantly changing as it charges. Am I also correct in saying that the resistor only affects the rate of charging, and not the total charge/energy stored?

Essentially, I'm asking how this circuit would behave differently if the resistor weren't there.

Diagram: [circuit diagram showing the cell, switch, resistor, and capacitor in series]

Best Answer

The voltage drop, $V_R$, across a resistor $R$ is simply given by:

$$ V_R = IR $$

When the capacitor is fully charged the current through the resistor is zero, so the voltage drop is $V_R = 0$. By Kirchhoff's voltage law the cell voltage is shared as $1.5\,\text{V} = V_R + V_C$, so the full $1.5\,\text{V}$ ends up across the capacitor. The resistor only sets how quickly the capacitor gets there (through the time constant $\tau = RC$); it does not change the final voltage, charge, or stored energy.
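As a quick numerical check, here is a minimal Python sketch based on the standard exponential charging law $V_C(t) = V_0\left(1 - e^{-t/RC}\right)$ (assuming the switch is closed at $t = 0$ with the capacitor initially uncharged), using the values from the question:

```python
import math

V0 = 1.5      # cell voltage (V)
R = 470e3     # resistance (ohms)
C = 200e-6    # capacitance (F)

tau = R * C                 # time constant RC = 94 s
E_max = 0.5 * C * V0**2     # maximum stored energy = 2.25e-4 J

for t in (0.0, tau, 3 * tau, 5 * tau):
    v_c = V0 * (1 - math.exp(-t / tau))   # capacitor voltage while charging
    v_r = V0 - v_c                        # the rest of the cell voltage is across R
    print(f"t = {t:6.1f} s   V_C = {v_c:.3f} V   V_R = {v_r:.3f} V")

print(f"tau = {tau:.0f} s, maximum energy = {E_max * 1e6:.0f} microjoules")
```

With these values the time constant is $RC = 94\,\text{s}$, so it takes several minutes for the capacitor voltage to approach $1.5\,\text{V}$, but the maximum stored energy $\frac{1}{2}CV^2 = 225\,\mu\text{J}$ is exactly what it would be if the resistor were absent.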