Thanks for all the very quick responses; they were incredibly useful! Based on these responses, I think the conjecture is now settled in the affirmative, as follows.
For each $n$, let $F_-(n)$ and $F_+(n)$ be the minimal and maximal values of $\sum_{A \in {\mathcal D}} (-1)^{|A|}$ over all downsets ${\mathcal D}$ of $[n]$, respectively. The conjecture is that $F_-(n), F_+(n)$ are the extremal values of $(-1)^r \binom{n-1}{r}$ for $r=0,\ldots,n-1$. More explicitly,
$$ F_-(n) = -\binom{n-1}{n/2}, \qquad F_+(n) = \binom{n-1}{n/2}$$
when $n$ is even,
$$ F_-(n) = -\binom{n-1}{(n-1)/2}, \qquad F_+(n) = \binom{n-1}{(n+1)/2}$$
when $n \equiv 3 \pmod{4}$, and
$$ F_-(n) = -\binom{n-1}{(n+1)/2}, \qquad F_+(n) = \binom{n-1}{(n-1)/2}$$
when $n \equiv 1 \pmod{4}$. As mentioned in the post, these bounds would be best possible.
By slicing an $n$-dimensional downset into two $(n-1)$-dimensional downsets, one obtains the inequalities
$$ F_-(n-1)-F_+(n-1) \leq F_-(n) \leq F_+(n) \leq F_+(n-1) - F_-(n-1)$$
which already gives most of the conjecture by induction and Pascal's identity; the only remaining cases that need separate verification are
$$F_+(n) = \binom{n-1}{(n+1)/2} \qquad (1)$$
when $n \equiv 3 \pmod{4}$, and
$$F_-(n) = -\binom{n-1}{(n+1)/2} \qquad (2)$$
when $n \equiv 1 \pmod{4}$.
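Since the closed forms are explicit, the induction step can be sanity-checked numerically. A small sketch (assuming the conjectured formulas above; the function name is mine) confirming that the slicing inequalities hold for the closed forms, with strict inequality occurring exactly in cases (1) and (2):

```python
from math import comb

def conjectured(n):
    """Conjectured (F_-(n), F_+(n)) from the closed forms above."""
    if n % 2 == 0:
        return -comb(n - 1, n // 2), comb(n - 1, n // 2)
    if n % 4 == 3:
        return -comb(n - 1, (n - 1) // 2), comb(n - 1, (n + 1) // 2)
    return -comb(n - 1, (n + 1) // 2), comb(n - 1, (n - 1) // 2)

for n in range(2, 40):
    lo_prev, hi_prev = conjectured(n - 1)
    lo, hi = conjectured(n)
    # slicing: F_-(n-1) - F_+(n-1) <= F_-(n) <= F_+(n) <= F_+(n-1) - F_-(n-1)
    assert lo_prev - hi_prev <= lo <= hi <= hi_prev - lo_prev
    if n % 2 == 0:
        assert lo == lo_prev - hi_prev and hi == hi_prev - lo_prev
    if n % 4 == 3:
        assert lo == lo_prev - hi_prev and hi < hi_prev - lo_prev  # case (1) is strict
    if n % 4 == 1:
        assert hi == hi_prev - lo_prev and lo > lo_prev - hi_prev  # case (2) is strict
```

So Pascal's identity closes the induction everywhere except in the two strict cases, as claimed.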
Let's show (1), as the proof of (2) is similar. Fix $n \equiv 3 \pmod{4}$, and let ${\mathcal D}$ be a downset which attains the maximal value $F_+(n)$ of $\sum_{A \in {\mathcal D}} (-1)^{|A|}$:
$$ \sum_{A \in {\mathcal D}} (-1)^{|A|} = F_+(n).$$
Now introduce the "f-vector" $(f_0,\ldots,f_n)$ of ${\mathcal D}$, where $f_i := |\{ A \in {\mathcal D}: |A|=i\}|$ is the number of elements of ${\mathcal D}$ of cardinality $i$. (This is shifted by one from the polytope conventions, I guess because $i$ points determine an $(i-1)$-dimensional simplex.) Then we have
$$ f_0 - f_1 + \ldots - f_n = F_+(n).$$
Let $r$ be the largest index for which $f_r$ is non-zero, or equivalently the largest cardinality of an element of ${\mathcal D}$. (We can treat the degenerate case when ${\mathcal D}$ is empty by hand.) If $r$ were odd, we could simply remove all $r$-element sets from ${\mathcal D}$ and increase the alternating sum, so we may assume that $r$ is even; the alternating sum then looks like $f_0 - f_1 + \ldots - f_{r-1} + f_r$.
The case $r=0$ can also be treated by hand and will be ignored. Now, we double-count. Observe that each $r$-element set in ${\mathcal D}$ has $r$ "children" in ${\mathcal D}$, namely the $(r-1)$-element subsets obtained by removing one of its $r$ elements (these lie in ${\mathcal D}$ because ${\mathcal D}$ is a downset). On the other hand, each $(r-1)$-element set can have at most $n-r+1$ "parents", and so
$$ r f_r \leq (n-r+1) f_{r-1}.$$
(EDIT: Actually we didn't need to remove the $r=0$ case if we adopted the convention $f_{-1}=0$ here.)
In particular, if $r > \frac{n+1}{2}$, then $f_r < f_{r-1}$, so we could remove both the $r$-element and $(r-1)$-element sets from the downset and again increase the sum; hence $r \leq \frac{n+1}{2}$. In fact the same argument shows that, by changing the extremiser ${\mathcal D}$ if necessary, we may assume that $r < \frac{n+1}{2}$, and thus (since $n \equiv 3 \pmod{4}$ and $r$ is even) $r \leq \frac{n-3}{2}$. In other words, every element of ${\mathcal D}$ has cardinality at most $(n-3)/2$.
Now we flip the downset to look at the complementary downset ${\mathcal D}' := \{ A \subseteq [n]: [n] \backslash A \not \in {\mathcal D} \}$. As $n$ is odd, we have $\sum_{A \in {\mathcal D}'} (-1)^{|A|} = \sum_{A \in {\mathcal D}} (-1)^{|A|}$, and so ${\mathcal D}'$ is also an extremiser. Thus, by the above argument, every element of ${\mathcal D}'$ has cardinality at most $(n+1)/2$. Equivalently (as $n$ is odd), ${\mathcal D}$ contains every subset of $[n]$ of cardinality at most $(n-3)/2$. Combining this with the previous analysis, we see that the extremum is attained by the downset consisting precisely of all subsets of $[n]$ of cardinality at most $(n-3)/2$, which gives the required value of $F_+(n)$.
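As a further sanity check, the extremal values can be verified exhaustively for very small $n$. A sketch (function names are mine; enumerating all $2^{2^n}$ families of subsets is feasible only up to about $n=4$):

```python
from itertools import combinations
from math import comb

def brute_force_extremes(n):
    """Extreme values of sum_{A in D} (-1)^{|A|} over all downsets D of [n]."""
    subs = [frozenset(c) for r in range(n + 1)
            for c in combinations(range(n), r)]
    fmin = fmax = 0  # the empty downset contributes a sum of 0
    for mask in range(1 << len(subs)):
        fam = {S for i, S in enumerate(subs) if mask >> i & 1}
        # a family is a downset iff it is closed under removing one element
        if all(A - {x} in fam for A in fam for x in A):
            s = sum((-1) ** len(A) for A in fam)
            fmin, fmax = min(fmin, s), max(fmax, s)
    return fmin, fmax

def closed_form(n):
    """The conjectured (F_-(n), F_+(n))."""
    if n % 2 == 0:
        return -comb(n - 1, n // 2), comb(n - 1, n // 2)
    if n % 4 == 3:
        return -comb(n - 1, (n - 1) // 2), comb(n - 1, (n + 1) // 2)
    return -comb(n - 1, (n + 1) // 2), comb(n - 1, (n - 1) // 2)

for n in range(1, 5):
    assert brute_force_extremes(n) == closed_form(n)
```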
Here is a streamlined description of the optimal strategies that have been found. In both strategies, there is a team that plays a special role -- call them $x$ -- and the other teams are numbered modulo $n-1$.
With $n=2k$ teams: We play $2k-1$ rounds of $k$ games each. In the $i$-th round, the games are
$$(x,i), (i-1,i+1), (i-2, i+2), \ldots, (i-k+1, i+k-1).$$
Team $x$ always waits $k$ games between playing; every other team either waits $k-1$ or $k+1$ games. (Here I mean the difference between time slots: If team $x$ plays at time $t$, then they play again at time $t+k$.)
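This schedule is easy to verify by computer. A sketch (the team labels $0,\dots,2k-2$ plus a sentinel `'x'` are my encoding) checking that every pair of teams meets exactly once and that no team plays twice within a round:

```python
from itertools import chain

def schedule_even(k):
    """Rounds for n = 2k teams: teams 0..2k-2 (taken mod 2k-1) plus 'x'.
    Round i is (x, i), (i-1, i+1), ..., (i-k+1, i+k-1)."""
    m = 2 * k - 1
    return [[('x', i)] + [((i - j) % m, (i + j) % m) for j in range(1, k)]
            for i in range(m)]

for k in range(2, 8):
    rounds = schedule_even(k)
    games = [frozenset(g) for g in chain.from_iterable(rounds)]
    n = 2 * k
    # every one of the C(n,2) pairs occurs exactly once...
    assert len(games) == len(set(games)) == n * (n - 1) // 2
    # ...and within a round each team plays at most one game
    assert all(len(set(chain.from_iterable(r))) == 2 * len(r) for r in rounds)
```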
With $n=2k+1$ teams: We play $2k$ rounds which are alternately "long" ($k+1$ games) and "short" ($k$ games), starting with a long round. In the $i$-th long round, the games are
$$(x,i), (i+1,i-1), (i+2, i-2), \ldots, (i+k-1, i-k+1), (i+k, x).$$
In the $i$-th short round, the games are
$$(i,i+1), (i-1, i+2), (i-2, i+3), \ldots, (i-k+1, i+k).$$
Every wait is either $k$ or $k+1$ games.
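The odd case can be verified the same way. A sketch (my assumption: the long and the short rounds are each indexed by $i = 0,\dots,k-1$, interleaved as described; note that team $x$ plays the first and last time slot of each long round) checking that every pair meets exactly once:

```python
from itertools import chain

def schedule_odd(k):
    """Rounds for n = 2k+1 teams: teams 0..2k-1 (taken mod 2k) plus 'x'.
    Long round i:  (x,i), (i+1,i-1), ..., (i+k-1,i-k+1), (i+k,x).
    Short round i: (i,i+1), (i-1,i+2), ..., (i-k+1,i+k)."""
    m = 2 * k
    rounds = []
    for i in range(k):
        rounds.append([('x', i)]
                      + [((i + j) % m, (i - j) % m) for j in range(1, k)]
                      + [((i + k) % m, 'x')])
        rounds.append([((i - j) % m, (i + 1 + j) % m) for j in range(k)])
    return rounds

for k in range(1, 8):
    games = [frozenset(g) for g in chain.from_iterable(schedule_odd(k))]
    n = 2 * k + 1
    # all C(n,2) pairs, each exactly once
    assert len(games) == len(set(games)) == n * (n - 1) // 2
```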
I wish I had time to make some animated graphics of these. I'd put teams $1$ through $n-1$ around a circle and $x$ at the center, and draw lines between the pairs as they occur.
To restate the question in probabilistic language, each of the $n$ children's parents independently and with uniform probabilities chooses a $k$-element subset of $[n]$; say $X_i$ is the choice of child $i$'s parents. You want the probability that there is a successful assignment of parents to slots, corresponding to a permutation $\pi$ of $[n]$, such that $\pi(i) \in X_i$ for all $i$.
I don't know how to answer this question, but here's a related one that gives a bound. Consider a given permutation $\pi$. The probability that $\pi(i) \in X_i$ for all $i$ is $(k/n)^n$. Since there are $n!$ permutations, the expected number of successful permutations is $n! (k/n)^n$, and of course this expected number is an upper bound on the probability that at least one successful permutation exists. By Stirling's approximation $n! \sim \sqrt{2\pi}\, n^{n+1/2} e^{-n}$, so $n! (k/n)^n \sim \sqrt{2\pi n}\, (k/e)^n$. Thus as long as $k \ge 3 > e$, the expected number of successful permutations will be greater than $1$ if $n$ is large enough, but if $k \le 2$ it will be less than $1$ (and go to $0$ as $n \to \infty$). For the example given with $k=4$ and $n=18$, $18!\,(4/18)^{18} \approx 11182$, while with $k=3$ and $n=18$, $18!\,(3/18)^{18} \approx 63$.
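The two numbers quoted are direct to reproduce; a one-line sketch (the function name is mine):

```python
from math import factorial

def expected_successes(n, k):
    """Expected number of permutations pi with pi(i) in X_i for all i."""
    return factorial(n) * (k / n) ** n

print(expected_successes(18, 4))  # ~ 11182
print(expected_successes(18, 3))  # ~ 63
```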
I tried simulations to find the actual probabilities for $k = 3$ and $k=4$ with $n=18$. For $k=4$ I found success in $817$ cases out of $1000$, while for $k=3$ I found success in only $388$ out of $1000$.
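Each trial of such a simulation is a bipartite perfect-matching test (children on one side, slots on the other). A sketch using Kuhn's augmenting-path algorithm (the function names, trial count, and seed are my choices):

```python
import random

def has_perfect_matching(choices, n):
    """True iff children 0..n-1 can get distinct slots, with child i
    receiving some slot in choices[i] (Kuhn's augmenting paths)."""
    match = [-1] * n  # match[slot] = child currently assigned, or -1

    def augment(i, seen):
        for s in choices[i]:
            if s not in seen:
                seen.add(s)
                if match[s] == -1 or augment(match[s], seen):
                    match[s] = i
                    return True
        return False

    return all(augment(i, set()) for i in range(n))

def success_rate(n, k, trials, rng):
    """Fraction of trials in which a successful assignment exists."""
    return sum(has_perfect_matching([rng.sample(range(n), k) for _ in range(n)], n)
               for _ in range(trials)) / trials

rng = random.Random(0)
print(success_rate(18, 4, 1000, rng))  # about 0.8 in the author's runs
print(success_rate(18, 3, 1000, rng))  # about 0.4
```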
Of course, in real life the probabilities are not likely to be uniform: there will be some very popular slots and others that nobody wants.
EDIT: I am not suggesting that with $k = O(1)$ we have a solution with high probability. The probability that a given day is not chosen by anyone (which obviously implies there is no successful permutation) is $(1 - k/n)^n \to e^{-k}$ as $n \to \infty$, and thus the expected number of days not chosen is $n e^{-k}$. If we want high probability of a successful permutation it seems reasonable to require this to go to $0$ as $n \to \infty$, and thus $k - \log(n) \to \infty$.
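Plugging in the example sizes gives a feel for this; a sketch computing the empty-slot probability $(1-k/n)^n$, its limit $e^{-k}$, and the expected number of unchosen slots:

```python
from math import exp

n = 18
for k in (3, 4):
    p_empty = (1 - k / n) ** n  # P(a given slot is chosen by no child)
    print(k, round(p_empty, 4), round(exp(-k), 4), round(n * p_empty, 2))
```

For $k=3$ this gives about $0.68$ expected unchosen slots, versus about $0.20$ for $k=4$, roughly in line with the simulated failure rates above.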