This paper (doi:10.1103/PhysRevE.70.056110) analytically calculates the characteristic path length (i.e., the average shortest path length) $l_{ER}$ of an Erdős–Rényi random network with $N$ vertices and average degree $\langle k \rangle$ (so the number of edges is $m = \frac{N \langle k \rangle}{2}$):
$$
l_{ER} = \frac{\ln{N} - \gamma}{\ln{\langle k \rangle}} + \frac{1}{2},
$$
or, equivalently,
$$
l_{ER} = \frac{\ln{N} - \gamma}{\ln(\frac{2m}{N})} + \frac{1}{2},
$$
where $\gamma \approx 0.57722$ is Euler's constant (the Euler–Mascheroni constant).
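As a sanity check, the two forms of the formula can be evaluated numerically and compared against a brute-force BFS estimate on a sampled $G(N, p)$ graph. This is a rough sketch of mine, not code from the paper; the function names are my own.

```python
import math
import random
from collections import deque

GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def l_er_formula(N, k_avg):
    # Analytic characteristic path length in terms of the average degree.
    return (math.log(N) - GAMMA) / math.log(k_avg) + 0.5

def l_er_from_edges(N, m):
    # Equivalent form in terms of the edge count m = N<k>/2.
    return (math.log(N) - GAMMA) / math.log(2 * m / N) + 0.5

def bfs_avg_path_length(N, p, seed=0):
    # Monte Carlo estimate: sample G(N, p), then average shortest-path
    # distances over all reachable ordered pairs via BFS from every vertex.
    rng = random.Random(seed)
    adj = [[] for _ in range(N)]
    for i in range(N):
        for j in range(i + 1, N):
            if rng.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    total = pairs = 0
    for s in range(N):
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs
```

For $N = 500$ and $\langle k \rangle = 6$ the formula gives about $3.65$, and the BFS estimate on a sampled graph typically lands close by; agreement improves as $N$ grows.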
The covariance computation is key to applying Chebyshev's inequality; it cannot be dismissed as a minor detail. For any sum $X = X_1 + X_2 + \dots + X_k$ of indicator variables, $$\mathbb E[X^2] = \mathbb E[X] + 2\sum_{i=1}^k \sum_{j=1}^{i-1} \mathbb E[X_i X_j].$$ To prove the upper half of a threshold, we are in a regime where $\mathbb E[X] \to \infty$, and we want to show that $\mathbb E[X^2] = (1 + o(1)) \mathbb E[X]^2$. In that regime, we always have $\mathbb E[X] \ll \mathbb E[X]^2$, so the contribution of the diagonal terms is insignificant. It's the double sum, where $\mathbb E[X_i X_j]$ appears, that we need to worry about.
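As a quick illustration (my own toy computation, not part of the argument), the identity can be verified exactly on $n = 4$ vertices, with the $X_i$ the indicators of the twelve $2$-edge paths, by enumerating all $2^6$ possible graphs:

```python
from itertools import combinations

n, p = 4, 0.3
edges = list(combinations(range(n), 2))

# The X_i: indicators of 2-edge paths, each stored as a frozenset of edges.
paths = []
for c in range(n):
    for a, b in combinations([v for v in range(n) if v != c], 2):
        paths.append(frozenset([frozenset([a, c]), frozenset([b, c])]))

# E[X^2] by brute force over all 2^6 possible graphs.
lhs = 0.0
for mask in range(1 << len(edges)):
    present = {frozenset(e) for i, e in enumerate(edges) if mask >> i & 1}
    prob = 1.0
    for i in range(len(edges)):
        prob *= p if mask >> i & 1 else 1 - p
    x = sum(path <= present for path in paths)
    lhs += prob * x * x

# E[X^2] via the indicator expansion: E[X] plus the off-diagonal terms,
# where E[X_i X_j] = p^(number of distinct edges in the union of the paths).
rhs = len(paths) * p**2 + sum(p ** len(pi | pj)
                              for pi in paths for pj in paths if pi != pj)
```

Both sides agree to floating-point precision, as the identity predicts.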
To analyze $\mathbb E[X_i X_j]$ for $i \neq j$, we consider two cases.
- Cases where paths $i$ and $j$ share no edges (they may still share vertices). In that case, the two paths depend on disjoint sets of edges, so by independence $\mathbb E[X_i X_j] = \mathbb E[X_i] \mathbb E[X_j] = p^4$.
- Cases where paths $i$ and $j$ share exactly one edge. In that case, $\mathbb E[X_i X_j] = p^3$, because the union of the two paths contains only three distinct edges.
It's important to count the cases of the second type. In those cases, the union of the two paths has three edges and forms either a $P_4$ subgraph (which contains $2$ ordered pairs of such paths), a $K_{1,3}$ subgraph (which contains $6$ ordered pairs), or a triangle, when the two paths live on the same three vertices (again $6$ ordered pairs). Some counting tells us that there are $12 \binom n4$ potential $P_4$ subgraphs, $4 \binom n4$ potential $K_{1,3}$ subgraphs, and $\binom n3$ potential triangles. So there are $12 \binom n4 \cdot 2 + 4 \binom n4 \cdot 6 + \binom n3 \cdot 6 < 2 n^4$ cases of the second type.
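The exact ordered-pair count (covering the $P_4$, $K_{1,3}$, and same-three-vertices triangle configurations) works out to $48\binom n4 + 6\binom n3$, which stays below $2n^4$; the brute force below confirms it for small $n$. The function name `count_sharing_pairs` is mine:

```python
from itertools import combinations
from math import comb

def count_sharing_pairs(n):
    # All 2-edge paths (P_3) on n labelled vertices: pick a centre c and two
    # distinct neighbours; represent each path by its frozenset of edges.
    paths = []
    for c in range(n):
        for a, b in combinations([v for v in range(n) if v != c], 2):
            paths.append(frozenset([frozenset([a, c]), frozenset([b, c])]))
    # Ordered pairs of distinct paths sharing exactly one edge.
    return sum(len(pi & pj) == 1
               for pi in paths for pj in paths if pi != pj)

# Check the closed form, and the 2*n^4 bound, against brute force.
for n in range(3, 8):
    assert count_sharing_pairs(n) == 48 * comb(n, 4) + 6 * comb(n, 3)
    assert count_sharing_pairs(n) < 2 * n**4
```

(Two distinct $2$-edge paths can share at most one edge, so `len(pi & pj) == 1` is exactly the second-type condition.)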
For cases of the first type, it's enough to say that there are fewer than $k^2$ of them (since there are only $k^2$ ordered pairs in total). Most pairs of paths are of the first type, so we don't need to be too careful there.
So we get
$$
\mathbb E[X^2] \le \underbrace{3 p^2 \binom n3}_{\text{diagonal terms}} + \underbrace{p^4 k^2}_{\text{first type}} + \underbrace{2 n^4 p^3}_{\text{second type}}.
$$
Meanwhile, $\mathbb E[X]^2 = p^4 k^2$, which exactly cancels the second term. So $\operatorname{Var}[X] \le 3 p^2 \binom n3 + 2 p^3 n^4$, and now you should be able to apply Chebyshev's inequality.
(This is the part that tells us that when $p \gg n^{-3/2}$, we have $X > 0$ w.h.p. Meanwhile, to prove that $X = 0$ w.h.p. when $p \ll n^{-3/2}$, showing that $\mathbb E[X] \to 0$ is enough.)
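For a concrete feel, plugging the variance bound into Chebyshev gives $\mathbb P(X = 0) \le \operatorname{Var}[X]/\mathbb E[X]^2$, and evaluating that bound for $p = n^{-1.4}$ (comfortably above $n^{-3/2}$) shows it shrinking as $n$ grows. This is a sketch of mine; the function name and the exact constants ($k = 3\binom n3$ potential paths, variance bound $3p^2\binom n3 + 2p^3 n^4$) are my own bookkeeping.

```python
from math import comb

def chebyshev_bound(n, p):
    # Upper bound on P(X = 0): Var[X] / E[X]^2, with k = 3*C(n,3) potential
    # 2-edge paths, E[X] = k * p^2, and the variance bound derived above.
    k = 3 * comb(n, 3)
    ex = k * p**2
    var_bound = 3 * p**2 * comb(n, 3) + 2 * p**3 * n**4
    return var_bound / ex**2

# Just above the threshold, take p = n**-1.4 (so p >> n**-1.5):
bounds = [chebyshev_bound(n, n**-1.4) for n in (10**2, 10**3, 10**4)]
# The bound decreases toward 0 as n grows.
```

The early values are larger than $1$ (hence vacuous), but the trend toward $0$ is exactly what "w.h.p." needs.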
Best Answer
Since each of $n-1$ possible edges out of the distinguished node is (independently) present with probability $p$, the distribution of its degree is a $\text{Binomial}(n-1,p)$ random variable.
So one way to proceed would be to analyze the behavior of $(1-p)^{n-1}$, the probability that the degree is $0$, as $n \to \infty$. In this case, that's not too terrible. But it goes against the spirit of random graphs: for harder problems, such methods no longer work, because the exact probability of the relevant events is too hard to find.
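For instance (a quick numerical aside of my own): in the sparse regime $p = c/n$, the exact isolation probability $(1-p)^{n-1}$ converges to $e^{-c}$, which is the Poisson heuristic one usually invokes instead of exact computation.

```python
import math

c = 2.0  # mean degree in the sparse regime p = c/n (illustrative choice)
# Exact probability that the distinguished node has degree 0, for growing n.
isolation = {n: (1 - c / n) ** (n - 1) for n in (10**2, 10**4, 10**6)}
limit = math.exp(-c)
# isolation[n] approaches limit (about 0.1353) as n grows.
```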
Instead, here's a more typical approach. Let $\mathbf X$ be the degree of the distinguished node.
(To compute quantities like $\mathbb E[\mathbf X^2]$, think of $\mathbf X^2$ as counting ordered pairs of edges out of the distinguished node. There are $(n-1)(n-2)$ pairs $(e_1, e_2)$ with $e_1 \neq e_2$, each present with probability $p^2$, and $n-1$ pairs $(e_1, e_1)$, each present with probability $p$, so $\mathbb E[\mathbf X^2] = (n-1)(n-2)p^2 + (n-1)p$, which can be rewritten as $(n-1)p\bigl((n-1)p+1-p\bigr)$.)
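The pair-counting identity is easy to confirm against the binomial pmf directly; a minimal sketch (function names are my own):

```python
from math import comb

def second_moment_direct(n, p):
    # E[X^2] for X ~ Binomial(n-1, p), summed term by term from the pmf.
    m = n - 1
    return sum(k * k * comb(m, k) * p**k * (1 - p) ** (m - k)
               for k in range(m + 1))

def second_moment_pairs(n, p):
    # E[X^2] via counting ordered pairs of edges at the distinguished node.
    return (n - 1) * (n - 2) * p**2 + (n - 1) * p
```

Both agree to floating-point precision for any $n$ and $p$.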