The chi-squared limiting distribution is valid only for a special type of composite hypothesis: $H_0: \theta_1=\theta_1^0,\ldots,\theta_r=\theta_r^0$, with $\theta_{r+1},\ldots,\theta_k$ left free (that is, the first $r$ parameters are fixed and the rest are unspecified), versus $H_a$, under which all $k$ parameters are unspecified.
Your null hypothesis does not have that form, and it is easy to see that the limiting distribution is not chi-squared under the null. Suppose $p$ is small (less than $0.2$) and $n$ is large. Then the maximum likelihood estimate will almost always be less than $0.2$, whether or not the restriction is imposed, so the restricted and unrestricted maxima coincide and your test statistic will almost always be $0$. In fact, its limiting distribution is degenerate.
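To see the degeneracy concretely, here is a minimal simulation sketch. It assumes, for illustration, a Binomial model with the composite null $H_0: p \le 0.2$ (the sample size and true $p$ are arbitrary choices, not from the question): when the true $p$ is well below $0.2$, the restricted and unrestricted MLEs coincide in almost every sample, so the statistic is exactly $0$.

```python
import numpy as np

rng = np.random.default_rng(0)

def lrt_stat(x, n, p0=0.2):
    """2*(unrestricted max log-lik - max log-lik under H0: p <= p0)
    for a Binomial(n, p) count x."""
    def loglik(p):
        p = min(max(p, 1e-12), 1 - 1e-12)  # guard against log(0)
        return x * np.log(p) + (n - x) * np.log(1 - p)
    phat = x / n            # unrestricted MLE
    phat0 = min(phat, p0)   # MLE under the restriction p <= p0
    return 2 * (loglik(phat) - loglik(phat0))

# True p well below 0.2: the statistic is 0 in almost every sample.
n = 1000
stats = np.array([lrt_stat(x, n) for x in rng.binomial(n, 0.1, size=10_000)])
print(np.mean(stats == 0))  # fraction of samples with statistic exactly 0
```

With $n=1000$ and true $p=0.1$, the estimate $\hat p$ is more than ten standard errors below $0.2$, so the printed fraction is essentially $1$.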
In general, likelihood-ratio tests are awkward for one-sided alternatives, because the direction of the difference is obliterated by the squaring. The correct way to develop a one-sided likelihood-ratio test here is to note that $p=0.2$ is the point of the null closest to the alternative, so we test $H_0: p=0.2$ versus $H_a: p > 0.2$. In this case, however, the null value lies on the edge of the parameter space, so the limiting distribution is not chi-squared. In fact, it is the mixture $\tfrac{1}{2}\,\delta_0 + \tfrac{1}{2}\,\chi^2_1$, where $\delta_0$ denotes a point mass at $0$.
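The mixture claim can be checked by simulation. A sketch, again using an illustrative Binomial model and sample size: under the boundary null $p=0.2$, about half the simulated statistics are exactly zero, and the tail probability beyond the upper-$0.10$ point of $\chi^2_1$ is about $0.05$, exactly as the $\tfrac{1}{2}\,\delta_0 + \tfrac{1}{2}\,\chi^2_1$ mixture predicts.

```python
import numpy as np
from scipy import stats as st

rng = np.random.default_rng(1)
n, p0 = 500, 0.2

def lrt_stat(x):
    """2*(max log-lik over p >= p0 - log-lik at p0) for a Binomial(n, p) count x."""
    phat = max(x / n, p0)  # MLE restricted to the closure of the alternative
    return 2 * (st.binom.logpmf(x, n, phat) - st.binom.logpmf(x, n, p0))

# Simulate the statistic under the boundary null p = p0.
sims = np.array([lrt_stat(x) for x in rng.binomial(n, p0, size=20_000)])

print(np.mean(sims == 0))    # roughly 0.5: the point mass at zero
c = st.chi2.ppf(0.90, df=1)  # upper-0.10 point of chi^2_1, about 2.706
print(np.mean(sims > c))     # roughly 0.05 = 0.5 * 0.10
```

A practical consequence: for a size-$0.05$ test, the critical value is the $\chi^2_1$ upper-$0.10$ point, not the usual upper-$0.05$ point.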
Best Answer
Yes, the log likelihood ratio must take on both positive and negative values. If $\log \frac{f_1(x)}{f_0(x)}$ were positive for every possible value of the observed data $x$, then $f_1(x) > f_0(x)$ would hold for all $x$, which is impossible: $f_1(x)$ and $f_0(x)$ are the probability density functions of the observations under the two hypotheses, and $f_1(x) > f_0(x)$ for all $x$ would imply $$\int_{-\infty}^{\infty} f_1(x)\,\mathrm dx > \int_{-\infty}^{\infty} f_0(x)\,\mathrm dx,$$ contradicting the fact that both integrals equal $1$. The symmetric argument rules out a log ratio that is negative for all $x$, and the same reasoning, with sums in place of integrals, applies to probability mass functions.
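A quick numerical illustration of this (the two normal densities here are an arbitrary choice for the example): the log ratio of $N(1,1)$ to $N(0,1)$ is $x - \tfrac{1}{2}$, negative below $x = \tfrac{1}{2}$ and positive above it, while both densities still integrate to $1$.

```python
import numpy as np
from scipy import stats as st
from scipy.integrate import quad

f0 = st.norm(loc=0, scale=1).pdf  # density under H0
f1 = st.norm(loc=1, scale=1).pdf  # density under H1

# The log likelihood ratio takes both signs:
print(np.log(f1(-1) / f0(-1)))  # negative (here: -1.5)
print(np.log(f1(2) / f0(2)))    # positive (here: 1.5)

# Both densities integrate to 1, so neither can dominate everywhere.
print(quad(f0, -np.inf, np.inf)[0], quad(f1, -np.inf, np.inf)[0])
```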