Resistance of a material is $$R=\rho\frac{l}{a}$$ where $\rho$ is the resistivity of the material, $l$ is the length of the conductor, and $a$ is the cross-sectional area.
For a given conductor, all of these are constant, so the resistance of a given conductor does not depend on the current or voltage.
It does, however, depend on temperature: $$R=R_0(1+\alpha t)$$ where $R$ is the new resistance, $R_0$ is the initial resistance, $t$ is the rise in temperature, and $\alpha$ is the temperature coefficient of resistance, i.e. the fractional increase in resistance per unit rise in temperature.
For copper, resistivity is very low, so resistance is also low.
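To make the two formulas above concrete, here is a minimal sketch that computes the resistance of a copper wire from $R=\rho\,l/a$ and then applies the temperature correction $R=R_0(1+\alpha t)$. The resistivity and temperature coefficient below are typical textbook values for copper, not figures from this answer.

```python
# Assumed textbook constants for copper (not from the answer above):
RHO_CU = 1.68e-8    # resistivity, ohm·m
ALPHA_CU = 3.93e-3  # temperature coefficient of resistance, per °C

def resistance(rho, length, area):
    """R = rho * l / a for a uniform conductor."""
    return rho * length / area

def resistance_at(r0, alpha, delta_t):
    """R = R0 * (1 + alpha * t) after a temperature rise of delta_t."""
    return r0 * (1 + alpha * delta_t)

# 100 m of copper wire with a 1 mm^2 cross-section:
r0 = resistance(RHO_CU, 100.0, 1e-6)
print(f"R at reference temperature: {r0:.3f} ohm")  # → 1.680 ohm

# The same wire after a 50 °C rise:
r_hot = resistance_at(r0, ALPHA_CU, 50.0)
print(f"R after a 50 C rise: {r_hot:.3f} ohm")
```

Note how low the result is: even 100 m of thin copper wire is under 2 ohms, which is the point the sentence above is making.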
For a given current, the heat loss in the transmission line is indeed proportional to the transmission line's resistance:
$$P_T=I^2 R_T\ \ ,$$
where $P_T$ is the power lost in the transmission line as heat, $I$ is the current, and $R_T$ is the resistance of the transmission line.
The transmission line's resistance actually isn't very high; the question arrives at that incorrect conclusion by using the wrong voltage in Ohm's law. What would be valid is
$$R_T=\frac{V_T}{I}\ \ ,$$
where $R_T$ is the resistance of the transmission line, $I$ is the current, and crucially, $V_T$ is the voltage across the transmission line, not the 400 kV source voltage. The 400 kV source voltage is the sum of $V_T$ and the voltage across the load,
$$V_S=V_T + V_L\ \ ,$$
and $V_T \ll V_S$, i.e. $V_L \approx V_S$.
According to Watt's law, the current through the load is
$$I=\frac{P_L}{V_L}\ \ ,$$
where $P_L$ is the power needed by the load. (For simplicity, I'm treating the load as being purely resistive, with no capacitive or inductive reactance.) But the current through the load is the same as the current through the transmission line, so you can plug $I$ into the first equation above to give
$$P_T=\left(\frac{P_L}{V_L}\right)^2 R_T\ \ .$$
I.e., for a given $P_L$ and $R_T$, the way to make $P_T$ be small is to make $V_L$ be large, which means you need to make $V_S$ be large.
They’re describing the situation where the wires are carrying power to a load. It’s the load that (mostly) determines the current in the wires leading to it.
A $1200$ W oven on $120$ V needs $10$ A.
Once the load has determined the current, the heat in the wires is given by their resistance via $I^2 R_{wire}$.
A $0.02$ ohm wire to the oven will dissipate $2$ W of heat; a $0.01$ ohm wire will dissipate less: $1$ W.
That difference in wire resistance doesn’t change the current much because the current is really controlled by the ~$10$ ohm heater resistance. But it changes the wire heat a lot.
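The oven example can be checked in a few lines. The numbers match the answer's example (a 1200 W oven on 120 V); the two wire resistances are the ones quoted above.

```python
V_SUPPLY = 120.0        # V, from the example above
P_OVEN = 1200.0         # W, from the example above
I = P_OVEN / V_SUPPLY   # 10 A — the load fixes the current

# Wire heat I^2 * R_wire for the two wire resistances in the answer:
for r_wire in (0.02, 0.01):
    p_wire = I**2 * r_wire
    print(f"R_wire = {r_wire} ohm -> wire heat {p_wire:.1f} W")
```

Halving the wire resistance barely moves the current (the ~10-ohm heater dominates the circuit) but halves the wire heat, which is the answer's point.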