If $p$ is close to $0$ or $1$ the approximation degrades, because the binomial distribution cannot go below $0$ or above $n$, while the normal distribution extends forever in both directions. When $p$ is moderate the missing tails are negligibly small, but if there is a significant chance of hitting either endpoint, the approximation won't be so close. Of course, if $p$ is exactly $0$ or $1$ you are fine again, as the variance becomes zero and the distribution is degenerate.
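To see the edge effect concretely, here is a small standard-library Python sketch (the values $n=50$ and the comparison points are my own illustrative choices, not from the answer) comparing the exact binomial CDF to its normal approximation for a moderate $p$ and a $p$ near $0$:

```python
import math

def binom_cdf(k, n, p):
    # Exact binomial CDF via math.comb (Python 3.8+)
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def normal_cdf(x, mu, sigma):
    # Normal CDF via the error function
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

n = 50
for p in (0.5, 0.02):
    mu, sigma = n * p, math.sqrt(n * p * (1 - p))
    k = int(mu)  # compare CDFs near the mean
    exact = binom_cdf(k, n, p)
    approx = normal_cdf(k + 0.5, mu, sigma)  # with continuity correction
    print(f"p={p}: exact={exact:.4f}, normal approx={approx:.4f}")
```

With $p=0.5$ the two agree to a few decimal places; with $p=0.02$ (mean only $np=1$, so the barrier at $0$ matters) the error is an order of magnitude larger.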
Okay, after some investigation, I have learned some things about statistics. I will post this answer here in the hope that someone finds it helpful in the future. Thanks to helpful comments by @spaceisdarkgreen.
Essentially, what this boils down to is the Central Limit Theorem (Wikipedia). The "usual" CLT applies to a sum of independent, identically distributed random variables, that is, variables all drawn from the same distribution. That does not apply to the Poisson-Binomial distribution, since each variable in the sum is drawn from a Bernoulli distribution with a different mean. Thus, we need a generalization of the CLT to independent but non-identically distributed random variables. Of course, this requires some additional assumptions on the variables, but fortunately they are easily satisfied by Bernoulli random variables.
The necessary modification is provided by the Lyapunov Central Limit Theorem (Wikipedia), (MathWorld) (note that Wikipedia uses $\mathbf{E}[\cdot]$ to denote expectation, while MathWorld uses $\langle\cdot\rangle$). Also, see this answer for a related discussion.
Anyway, the Poisson-Binomial distribution satisfies the Lyapunov condition, and hence, loosely speaking, the Poisson-Binomial distribution will converge to the normal distribution with mean and variance
$$\mu = \sum_{i=1}^n p_i, \quad \sigma^2 = \sum_{i=1}^n p_i(1-p_i)$$
respectively. To confirm this, I tested with means $p_i$ spaced uniformly between $p_0 = 0.35$ and $p_{n-1}=0.65$ for multiple values of $n$. Two plots are shown below (note that the probability mass function of the Poisson-Binomial distribution was computed via Monte Carlo sampling with $N=50\,000$ points, since computing the pmf explicitly can become a little tricky).
These results suggest that in practice the convergence may be relatively quick, with reasonable agreement after only $n=6$. (Note that if you wish to sum an infinite sequence of Bernoulli random variables, you would require that the $p_i$ be bounded away from $0$ and $1$! This didn't bother me because I am interested in a finite sequence.)
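The Monte Carlo experiment described above can be sketched roughly as follows (a reconstruction, not the original code; the choices $n=20$ and the random seed are my own assumptions, while the $p_i$ spaced uniformly between $0.35$ and $0.65$ follow the answer):

```python
import math
import random

random.seed(0)

n = 20        # number of Bernoulli summands (assumed value)
N = 50_000    # Monte Carlo samples, as in the answer
p = [0.35 + 0.30 * i / (n - 1) for i in range(n)]  # p_i uniform in [0.35, 0.65]

mu = sum(p)                                      # Poisson-Binomial mean
sigma = math.sqrt(sum(q * (1 - q) for q in p))   # Poisson-Binomial std. dev.

# Each sample of the Poisson-Binomial is a sum of n independent Bernoulli draws
samples = [sum(random.random() < q for q in p) for _ in range(N)]

sample_mean = sum(samples) / N
sample_var = sum((s - sample_mean) ** 2 for s in samples) / N
print(mu, sigma**2)            # theoretical mean and variance
print(sample_mean, sample_var) # Monte Carlo estimates should be close
```

A histogram of `samples` is what gets compared against the normal density with the theoretical mean and variance.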
Best Answer
If you want to approximate $X_n\sim\mathsf{Bin}(n,p)$ with a normal distribution, then you should write $X_n=\sum_{i=1}^nB_i$ where the $B_i$ are iid and have a Bernoulli distribution with parameter $p$.
Then: $$\frac{X_n-\mathbb EX_n}{\mathsf{SD}(X_n)}=\frac{X_n-np}{\sqrt{np(1-p)}}\to U\text{ in distribution}$$where $U$ has the standard normal distribution.
Note that here:$$\frac{X_n-np}{\sqrt{np(1-p)}}=\frac{\bar B_n-p}{\sigma/\sqrt{n}}$$for $\sigma=\mathsf{SD}(B_1)=\sqrt{p(1-p)}$, showing the connection with CLT.
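This identity between the two standardized forms can be checked numerically; the short snippet below (with hypothetical values $n=100$, $p=0.3$, and observed count $x=37$) verifies they agree to floating-point precision:

```python
import math

n, p = 100, 0.3
sigma = math.sqrt(p * (1 - p))  # SD of a single Bernoulli

x = 37          # hypothetical observed number of successes
b_bar = x / n   # sample mean of the underlying Bernoulli variables

lhs = (x - n * p) / math.sqrt(n * p * (1 - p))  # standardized binomial
rhs = (b_bar - p) / (sigma / math.sqrt(n))      # standardized sample mean
print(lhs, rhs)  # identical up to floating-point rounding
```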
Edit, concerning questions raised in the comments on this question:
If a random variable $Y$ has a second moment then it has a standardized form: $$Y^*:=\frac{Y-\mu}{\sigma}$$where $\mu:=\mathbb EY$ and $\sigma^2:=\mathsf{Var}Y$.
Characteristic for this form are: $$\mathbb EY^*=0\text{ and }\mathsf{Var}Y^*=1$$
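As a quick illustration (using a uniform variable on $[0,10]$ as an arbitrary example of a variable with a second moment), the standardized form indeed has sample mean near $0$ and sample variance near $1$:

```python
import math
import random

random.seed(1)
N = 100_000

# Y uniform on [0, 10]: mu = 5, sigma^2 = 100/12 (standard facts)
mu, sigma = 5.0, math.sqrt(100 / 12)

ys = [random.uniform(0, 10) for _ in range(N)]
y_star = [(y - mu) / sigma for y in ys]  # standardized form Y*

m = sum(y_star) / N
v = sum(t * t for t in y_star) / N
print(m, v)  # approximately 0 and 1
```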
Formulation of the CLT: If $X_1,X_2,\dots$ are iid random variables that have a second moment and: $$S_n:=X_1+\cdots+X_n$$ then the standardized form $S_n^*$ converges in distribution to a random variable $Z$ that has the standard normal distribution.
If in this context $\mathbb EX_1=\mu$ and $\mathsf{Var}(X_1)=\sigma^2$, then $\mathbb ES_n=n\mu$ and $\mathsf{Var}(S_n)=n\sigma^2$, so that:$$S_n^*=\frac{S_n-n\mu}{\sigma\sqrt{n}}$$
Applying this to the special case where the $X_i$ have a Bernoulli distribution with parameter $p$, we get:$$S_n^*=\frac{S_n-np}{\sqrt{np(1-p)}}$$ In this situation $S_n$ has a binomial distribution with parameters $n$ and $p$.
Applying this to the special case where the $X_i$ have a Poisson distribution with parameter $\lambda$, we get:$$S_n^*=\frac{S_n-n\lambda}{\sqrt{n\lambda}}$$ In this situation $S_n$ has a Poisson distribution with parameter $n\lambda$.
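As a numerical sanity check of the Poisson case (the values $\lambda=2$, $n=30$, and the evaluation point $k=65$ are my own illustrative assumptions), one can compare the exact CDF of $S_n\sim\mathsf{Poisson}(n\lambda)$ with its normal approximation:

```python
import math

lam, n = 2.0, 30    # lambda per summand and number of summands (assumed)
total = n * lam     # S_n ~ Poisson(n * lambda) = Poisson(60)

def poisson_cdf(k, mu):
    # Exact Poisson CDF by summing the pmf
    return sum(math.exp(-mu) * mu**i / math.factorial(i) for i in range(k + 1))

def normal_cdf(x, mu, sigma):
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

k = 65
exact = poisson_cdf(k, total)
approx = normal_cdf(k + 0.5, total, math.sqrt(total))  # continuity correction
print(exact, approx)  # the two should agree to about two decimal places
```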