I reached the same conclusion as Henry (+1) in a more roundabout manner. What follows is the gist of my thoughts:
Recall that a geometric random variable models the number of independent Bernoulli trials until a success occurs. Hence to every realization of a sequence of geometric random variables we can canonically associate a realization of a string of Bernoulli random variables. For example, let $X_1,\dots, X_{20}$ be a sequence of geometric random variables and $\omega$ be such that $(X_1(\omega),\dots,X_{20}(\omega))$ is equal to $(6, 3, 2, 12, 1, 17, 1, 1, 1, 1, 2, 2, 3, 6, 1, 2, 1, 4, 5, 3)$. To this realization we can associate the following realization of a string of Bernoulli random variables:
$00000100101000000000001100000000000000001111101010010000011011000100001001$
where $1$ denotes a success. Note that the number of $1$'s in this string equals the number of geometric random variables. We are interested in the minimal number of geometric random variables (equivalently, the minimal number of $1$'s) such that the string up to and including the last $1$ has length $\geq 5000$. Consider the first $5000$ entries of the string. The number of $1$'s among these first $5000$ entries (or trials) is a binomial random variable.
Let $Y$ denote the number of $1$'s in the first $5000$ entries of the random string obtained from the geometric random variables as described above, so that $Y\sim \text{Bin}(5000,p)$. Denote by $Y_{5000}$ the outcome of the $5000$th Bernoulli trial.
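Here is a small Python sketch of this correspondence; the success probability $p=0.3$ and the random seed are arbitrary illustrative choices, not part of the problem:

```python
import random

# A minimal sketch of the correspondence above, assuming an illustrative
# success probability p = 0.3 (not part of the original problem).
p = 0.3
random.seed(0)

def geometric(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    n = 1
    while random.random() >= p:
        n += 1
    return n

# Draw geometric variables until the associated Bernoulli string has length >= 5000.
string = []
num_geometrics = 0                      # this is exactly N
while len(string) < 5000:
    k = geometric(p)
    string.extend([0] * (k - 1) + [1])  # k-1 failures followed by one success
    num_geometrics += 1

Y = sum(string[:5000])                  # number of 1's among the first 5000 trials
# num_geometrics == Y if string[4999] == 1, and Y + 1 otherwise (the claim below)
print(num_geometrics, Y, string[4999])
```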
Claim. We have the following equality
$$N=Y\, \mathbf{1}(Y_{5000}=1)+(Y+1)\, \mathbf{1}(Y_{5000}=0), \qquad (1)$$
where $\mathbf{1}(\cdot)$ denotes the indicator function. In particular, $$\mathsf E(N) = 4999p+1.$$
Proof. The reasoning is as follows. Consider the first $5000$ trials. Either we have a success on the $5000$th trial, in which case exactly $Y$ geometric random variables have been used to reach $5000$, and hence $N=Y$; or we do not have a success on the $5000$th trial. In that case we must wait for the next success to complete the current geometric random variable, which adds one more variable to the count, so $N=Y+1$.
Rewriting $(1)$ as
$$N=Y+\mathbf{1}(Y_{5000}=0)$$
and taking expectations gives
$$\mathsf E(N)=\mathsf E(Y)+\mathsf P(Y_{5000}=0)=5000p+(1-p)=4999p+1.$$
$\square$
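For anyone who wants to sanity-check the claim numerically, here is a rough Monte Carlo sketch; $p$, the seed and the number of replications are arbitrary choices, kept small so the runtime stays modest:

```python
import random

# Rough Monte Carlo check of E(N) = 4999*p + 1; p, the seed and the number
# of replications are illustrative choices, not taken from the original post.
p, reps = 0.3, 2000
random.seed(1)

def sample_N(p, target=5000):
    """Minimal number of geometric(p) variables whose sum is >= target."""
    total, n = 0, 0
    while total < target:
        k = 1
        while random.random() >= p:     # draw one geometric variable
            k += 1
        total += k
        n += 1
    return n

estimate = sum(sample_N(p) for _ in range(reps)) / reps
print(estimate, 4999 * p + 1)           # the two values should be close
```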
The exercise is a direct application of the following proposition, with $X=(X_1,X_2,\cdots,X_n)$ and $Y=(X_{i_1},X_{i_2},\cdots,X_{i_n})$:
If $X=(X_1,X_2,\cdots,X_n)$ and $Y=(Y_1,Y_2,\cdots,Y_n)$ are identically distributed and $f$ and $g$ are measurable functions, then:
$$ E[f(X)\mid g(X)=z]=E[f(Y)\mid g(Y)=z] \tag{1}$$
Proof:
By the defining property of conditional expectation, for every measurable set $A$,
$$\int E[f(X)\mid g(X)]I_A(g(X))\,dP_{\Omega}=\int f(X)I_A(g(X))\,dP_{\Omega}. $$
But: $$\int E[f(X)\mid g(X)]I_A(g(X))dP_{\Omega}=\int E[f(X)\mid g(\mathbf x) ]I_A(g(\mathbf x))dP_X$$
And: $$\int f(X)I_A(g(X))dP_{\Omega}=\int f(\mathbf x)I_A(g(\mathbf x))dP_X$$
Therefore, for $(1)$ to be valid, it is enough that: $$\int E[f(X)\mid g(\mathbf x)]I_A(g(\mathbf x))dP_X=\int f(\mathbf x)I_A(g(\mathbf x))dP_X$$
But this last equation, and therefore the function $\mathbf x\mapsto E[f(X)\mid g(\mathbf x)]$, depends only on the distribution of $X$; since $Y$ has the same distribution, $(1)$ follows.
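To see the proposition in action, here is a toy simulation; the choices $f(\mathbf x)=x_1$, $g(\mathbf x)=x_1+x_2+x_3$, dice-valued coordinates, and $Y$ a cyclic permutation of $X$ are purely illustrative:

```python
import random

# Toy check of the proposition: X = (X1, X2, X3) i.i.d. dice, Y = (X2, X3, X1)
# a permutation of X (so X and Y are identically distributed), f = first
# coordinate, g = sum.  All of these choices are hypothetical illustrations.
random.seed(2)
reps, z = 200_000, 10                   # condition on g(.) = 10
sum_fX = sum_fY = hits = 0

for _ in range(reps):
    x = [random.randint(1, 6) for _ in range(3)]
    y = [x[1], x[2], x[0]]
    if sum(x) == z:                     # {g(X) = z} coincides with {g(Y) = z} here
        sum_fX += x[0]                  # f(X)
        sum_fY += y[0]                  # f(Y)
        hits += 1

# Both ratios estimate the same conditional expectation (about z/3 here).
print(sum_fX / hits, sum_fY / hits)
```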
Best Answer
If $(\Omega,\mathcal{F},\mathsf{P})$ is a probability space, $A\in\mathcal{F}$ with $\mathsf{P}(A)>0$, and $X$ is a r.v., then $$ \mathsf{E}[X\mid A]=\frac{\mathsf{E}[X1_A]}{\mathsf{P}(A)}. $$
Applying this to your case and assuming that $\mathsf{P}(N=m)>0$, one gets $$ \mathsf{E}\left[\sum_{i=1}^N X_i\mid N=m\right]=\mathsf{E}\left[\sum_{i=1}^m X_i 1\{N=m\}\right]/\mathsf{P}(N=m). $$
If $N$ is independent of $\{X_i\}_{i\ge 1}$, then $1\{N=m\}$ is independent of $\sum_{i=1}^m X_i$, so the RHS reduces to $\sum_{i=1}^m\mathsf{E}X_i$.
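A quick numerical illustration of this last identity; the distributions of $N$ and the $X_i$ below are hypothetical choices, picked only so that $\sum_{i=1}^m\mathsf E X_i$ is easy to read off:

```python
import random

# Numerical illustration with hypothetical choices: X_i ~ Exponential(1)
# (so E X_i = 1), N uniform on {1,...,5} drawn independently of the X_i,
# and conditioning on N = m = 3.
random.seed(3)
reps, m = 100_000, 3
total, count = 0.0, 0

for _ in range(reps):
    N = random.randint(1, 5)
    xs = [random.expovariate(1.0) for _ in range(5)]
    if N == m:
        total += sum(xs[:N])            # sum_{i=1}^N X_i on the event {N = m}
        count += 1

print(total / count, m * 1.0)           # estimate of E[sum | N = m] vs. sum of means
```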