I'm not saying the distribution of $L$ is inappropriate, but I think it will be easier to work with a different one.
Let me give an answer that works for a general class of distributions. The only assumption I make is that the tessellation is made of convex polytopes.
First, note that if $T$ denotes the union of all faces of the cells of the tessellation (the faces of $[0,1]^n$ included), then you are interested in the variable $$F=\operatorname{card}(T\cap L)-1$$
(because a.s. the line meets the boundary of each cell it crosses in exactly two points, and the exit point from one cell is the entrance point into the next, except at the two end-points of the chord).
Let $\mathcal{L}$ be the class of all lines of $\mathbb{R}^n$, let $\mu$ be some measure on $\mathcal{L}$ (endowed with a topology coming from a parametrization of $\mathcal{L}$), and for a set $C\subseteq\mathbb{R}^n$ denote by $[C]$ the class of all lines intersecting $C$. Instead of working with a single line $L$, consider an independent Poisson point process of lines $\Pi$ on $\mathcal{L}$ with intensity measure $\mu$. I emphasize that a.s. only a finite number of lines of $\Pi$ hit $[0,1]^n$, so I can label them $L_1, L_2, \ldots$ in such a way that the $L_i$ are iid. Call $N$ the number of lines touching $[0,1]^n$. Given any fixed tessellation $T$, we have
$$\mathbb{E}\sum_i \operatorname{card}(T\cap L_i)=\sum_n \mathbb{P}(N=n)\,\mathbb{E}\Big[\sum_{i=1}^n \operatorname{card}(T\cap L_i)\Big]=\sum_n \mathbb{P}(N=n)\, n\,\mathbb{E}\,\operatorname{card}(T\cap L_1)$$
$$=(\mathbb{E}(F)+1)\mathbb{E}(N).$$
So it remains to compute $\mathbb{E}(N)$ and $\mathbb{E}\sum_i \operatorname{card}(T\cap L_i)$. We have $\mathbb{E}(N)=\mu([[0,1]^n])$ because $\Pi$ is a Poisson point process with intensity $\mu$.
Let $T$ be a deterministic tessellation, written as $T=\bigcup_f f$, where the $f$ are the facets of the tessellation. Since a.s. a given line hits each facet at most once, we have $$\mathbb{E}\sum_i \operatorname{card}(T\cap L_i)=\mathbb{E}\sum_i\sum_f \operatorname{card}(f\cap L_i)=\sum_f \mathbb{E}\,\operatorname{card}\{i:f\cap L_i\neq \emptyset\}=\sum_f \mu([f]),$$
because $(L_1,L_2,\ldots)$ are the points of a Poisson point process with intensity $\mu$.
At this point you need to make assumptions on the distribution of the lines, so I assume that $\mu$ is translation invariant. In that case it is a standard fact from integral geometry that $\mu([f])=c\,|f|$, where $c$ is a constant depending only on $\mu$ and $|f|$ is the $(n-1)$-dimensional measure of $f$. Thus you have
$$\mathbb{E}F=\frac{c\,\mathbb{E}\sum_f |f|}{\mu([[0,1]^n])}-1=c\,\frac{\mathbb{E}|T|}{\mu([[0,1]^n])}-1.$$
With normalisation conditions you can compute $c/\mu([[0,1]^n])$ explicitly. Thus the above result holds for any random tessellation whose facets are convex polytopes, with a random line segment obtained by restricting to $[0,1]^n$ a line drawn from (the normalization of) a stationary measure $\mu$ on $\mathcal{L}$. For example the uniform distribution works, but you can also choose a different directional distribution, for instance "only horizontal and vertical lines". I don't know whether the model with iid intersection points fits into this framework.
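As a sanity check of the formula, here is a small Monte Carlo sketch (my own illustration, not part of the argument). In the plane, with the translation-invariant kinematic measure $d\theta\,dd$, one has $\mu([f])=|f|\int_0^\pi|\cos t|\,dt=2|f|$, so $c=2$, and $\mu([[0,1]^2])=\int_0^\pi w(\theta)\,d\theta=4$ (the perimeter, by Cauchy's formula). For the deterministic $m\times m$ grid tessellation of $[0,1]^2$, $|T|=2(m+1)$, so the formula predicts $\mathbb{E}F = 2\cdot 2(m+1)/4-1=m$. All function names below are mine:

```python
import math, random

def clip_to_square(theta, d):
    # Line {(1/2,1/2) + d*n + t*u} with n=(cos th, sin th), u=(-sin th, cos th),
    # clipped to [0,1]^2 (Liang-Barsky style); returns the chord, or None.
    nx, ny = math.cos(theta), math.sin(theta)
    px, py = 0.5 + d * nx, 0.5 + d * ny
    ux, uy = -ny, nx
    t0, t1 = -1e9, 1e9
    for pos, u in ((px, ux), (py, uy)):
        if abs(u) < 1e-12:
            if pos < 0.0 or pos > 1.0:
                return None
        else:
            ta, tb = (0.0 - pos) / u, (1.0 - pos) / u
            if ta > tb:
                ta, tb = tb, ta
            t0, t1 = max(t0, ta), min(t1, tb)
    if t0 >= t1:
        return None
    return (px + t0 * ux, py + t0 * uy), (px + t1 * ux, py + t1 * uy)

def random_chord(rng):
    # Rejection sampling from the kinematic measure d(theta) d(d) restricted
    # to lines hitting the unit square (d measured from the square's center).
    s = math.sqrt(2) / 2  # max half-width of the square over all directions
    while True:
        theta, d = rng.uniform(0.0, math.pi), rng.uniform(-s, s)
        seg = clip_to_square(theta, d)
        if seg is not None:
            return seg

def cells_crossed(seg, m):
    # Cells of the m x m grid crossed = 1 + number of interior grid lines cut,
    # i.e. exactly F = card(T cap L) - 1 for this tessellation.
    (x1, y1), (x2, y2) = seg
    count = 1
    for i in range(1, m):
        g = i / m
        count += (min(x1, x2) < g < max(x1, x2)) + (min(y1, y2) < g < max(y1, y2))
    return count

rng = random.Random(12345)
m, trials = 3, 200000
mean_cells = sum(cells_crossed(random_chord(rng), m) for _ in range(trials)) / trials
print(mean_cells)  # the formula predicts E F = m = 3
```

The rejection step is what restricts the stationary measure to $[[0,1]^2]$; the empirical mean should match $m$ up to Monte Carlo error.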
By symmetry I can assume $p\le\frac12$. I will use natural logs.
Put $f(x)=h(p)-h(x)$ and $b_k = \binom{n}{k} p^k(1-p)^{n-k}$.
Define $k_0=\lceil pn/2\rceil$ and $k_1=n-k_0$.
The plan is: find polynomials $f_0(x),f_1(x)$ such that $f_0(x)\le f(x)\le f_1(x)$ for $k_0/n\le x\le k_1/n$. Then bound the various parts of
$$\sum_{k=0}^n b_k f_0(k/n) + \sum_{k=0}^{k_0-1} b_k\bigl(f(k/n)-f_0(k/n)\bigr) + \sum_{k=k_1+1}^n b_k\bigl(f(k/n)-f_0(k/n)\bigr) \le \sum_{k=0}^n b_k f(k/n)$$
$$\le \sum_{k=0}^n b_k f_1(k/n) + \sum_{k=0}^{k_0-1} b_k\bigl(f(k/n)-f_1(k/n)\bigr) + \sum_{k=k_1+1}^n b_k\bigl(f(k/n)-f_1(k/n)\bigr).$$
Since $f^{(iv)}(x) = 2/x^3+2/(1-x)^3$, we can use Taylor's theorem with remainder to get
$$f_j(x) = (\ln p -\ln(1-p))(x-p) + \frac{(x-p)^2}{2p(1-p)}
- \frac{(1-2p)(x-p)^3}{6p^2(1-p)^2} + \frac{j(8-8p+2p^2+p^3)(x-p)^4}{3p^3(2-p)^2}$$
for $j=0,1$. Maple now tells us
$$\sum_{k=0}^n b_k f_j(k/n)=\frac{1}{2n}-\frac{(1-2p)^2}{6p(1-p)n^2} + \frac{Bj}{n^2},$$
where $$B = \frac{(1-p)^2(8-8p+2p^2+p^3)}{p(2-p)^2}
+ \frac{(1-p)(8-8p+2p^2+p^3)(1-6p+6p^2)}{3p^2(2-p)^2n}.$$
This much is $1/(2n)+O(1/(pn^2))$.
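The Maple output can also be confirmed directly: it is an exact identity, following from the first four central moments of the binomial distribution ($\mu_1=0$, $\mu_2=p(1-p)/n$, $\mu_3=p(1-p)(1-2p)/n^2$, and the standard $\mu_4$). A quick numerical confirmation (function names are mine):

```python
import math

def f_taylor(x, p, j):
    # The polynomial f_j from above (j = 0 lower bound, j = 1 upper bound)
    t = x - p
    return ((math.log(p) - math.log(1 - p)) * t
            + t**2 / (2 * p * (1 - p))
            - (1 - 2*p) * t**3 / (6 * p**2 * (1 - p)**2)
            + j * (8 - 8*p + 2*p**2 + p**3) * t**4 / (3 * p**3 * (2 - p)**2))

def binomial_sum(n, p, j):
    # Left-hand side: sum_k b_k f_j(k/n) with b_k = C(n,k) p^k (1-p)^(n-k)
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) * f_taylor(k / n, p, j)
               for k in range(n + 1))

def closed_form(n, p, j):
    # Right-hand side: 1/(2n) - (1-2p)^2/(6p(1-p)n^2) + B*j/n^2
    B = ((1 - p)**2 * (8 - 8*p + 2*p**2 + p**3) / (p * (2 - p)**2)
         + (1 - p) * (8 - 8*p + 2*p**2 + p**3) * (1 - 6*p + 6*p**2)
           / (3 * p**2 * (2 - p)**2 * n))
    return 1 / (2*n) - (1 - 2*p)**2 / (6 * p * (1 - p) * n**2) + B * j / n**2

errs = [abs(binomial_sum(200, 0.3, j) - closed_form(200, 0.3, j)) for j in (0, 1)]
print(errs)  # both differences vanish up to floating-point rounding
```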
In the range $0\le k\le k_0-1$, $b_k$ is dominated by a geometric series with ratio $\frac12$ and $f(x)-f_j(x)=O(p\ln p)$. Using the Stirling approximation for $b_k$, we find that
$$\sum_{k=0}^{k_0-1} b_k(f(k/n)-f_j(k/n))=o(1/n)$$ for $p\ge (2+\epsilon)\ln\ln n/n$.
In the range $k_1+1 \le k\le n$, $b_k\le 2^{-3n/4}$ (using the assumption $p\le \frac12$) and $f(x)-f_j(x)=O(p^{-3})$, so this part is negligible compared to the previous part.
In summary,
$$\sum_{k=0}^n b_k f(k/n) = \frac{1+o(1)}{2n}$$
for $(2+\epsilon)\ln\ln n/n\le p\le 1-(2+\epsilon)\ln\ln n/n$, any $\epsilon\gt 0$.
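As a sanity check on the $\frac{1+o(1)}{2n}$ asymptotic, one can evaluate the sum directly for a moderate $n$ in the stated range; the log-gamma evaluation of $b_k$ below is only to avoid floating-point underflow in the tails, everything else is from the statement above:

```python
import math

def h(x):
    # Entropy in natural logs, with the convention h(0) = h(1) = 0
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log(x) - (1 - x) * math.log(1 - x)

def entropy_gap_sum(n, p):
    # sum_k C(n,k) p^k (1-p)^(n-k) * (h(p) - h(k/n)), b_k computed via lgamma
    lp, lq = math.log(p), math.log(1 - p)
    total = 0.0
    for k in range(n + 1):
        log_bk = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
                  + k * lp + (n - k) * lq)
        total += math.exp(log_bk) * (h(p) - h(k / n))
    return total

n, p = 1000, 0.3
S = entropy_gap_sum(n, p)
print(2 * n * S)  # should be 1 + o(1); the discrepancy here is of order 1/n
```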
For the case $p=a/n$ with $a=O(\ln\ln n)$, use $\binom{n}{k}=\frac{n^k}{k!}\left(1-\binom{k}{2}/n -O(k^3/n^2)\right)$ and simple bounds on the tail to find
$$\sum_{k=0}^n b_k f(k/n) = e^{-a}\sum_{k=0}^{(\ln n)^2} \frac{a^k}{k!}\left(1-\frac{\binom{k}{2}}{n}+\frac{ak}{n}-\frac{a^2}{2n}\right)f(k/n) + O((\ln n)^{O(1)}/n^2).$$
Best Answer
This is a generalization of the binomial transform of the function $f(k)$. See, for instance, the Wikipedia article on the binomial transform and, in particular, the generalizations given therein. The Prodinger reference deals specifically with your expression for $F(n)$. Alternatively, if you rewrite it as $$F(n) = (1-p)^n \sum_{k=0}^n \binom{n}{k} \left(\frac{p}{1-p}\right)^k f(k),$$ then you have a scaled version of the rising $k$-binomial transform of $f(k)$ as described in my 2006 paper with Laura Steil. At any rate, it appears the term you want is "binomial transform," and there is a small literature on its properties.
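The rewrite is just factoring $(1-p)^n$ out of each term; a quick check that the two forms agree, using an arbitrary test sequence ($f(k)=k^2$, chosen only for illustration, with the function names mine):

```python
import math

def binomial_average(n, p, f):
    # F(n) = sum_k C(n,k) p^k (1-p)^(n-k) f(k)
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) * f(k)
               for k in range(n + 1))

def rising_transform_form(n, p, f):
    # The same sum with (1-p)^n factored out: a scaled
    # rising k-binomial transform of the sequence f(k)
    r = p / (1 - p)
    return (1 - p)**n * sum(math.comb(n, k) * r**k * f(k) for k in range(n + 1))

f = lambda k: k * k  # arbitrary sequence, for illustration only
a, b = binomial_average(20, 0.3, f), rising_transform_form(20, 0.3, f)
print(a, b)  # both equal E[X^2] for X ~ Bin(20, 0.3), i.e. 40.2
```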