a) Yes, you're correct, the value of $\lambda$ should be 12.
What have you done for the other two?
b and c both look right to me.
The direct answer to the question of how to deal with small expected counts is that one might
(a) combine ranges of cells at the end (a very common approach),
(b) use a different (and perhaps better) goodness of fit test, or
(c) consider dropping the chi-square approximation, and see if one can deal with the discrete distribution of the test statistic more directly, perhaps by simulation.
Approach (a) can be found in many texts. There are many ways to go about combining cells, but many people simply work from one end or the other, combining cells to the left or to the right until the expected counts are sufficiently high for their purpose.
(However, the most commonly cited rule of thumb for the expected counts - that each should be at least 5 for the chi-square approximation to hold - is unnecessarily strict for the sort of approximation most people would require. Many subsequent papers have suggested less stringent rules.)
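Approach (c) can also be sketched briefly; the sample, sample size, and number of replicates below are invented for illustration. Instead of relying on the chi-square approximation, the null distribution of the statistic is simulated by parametric bootstrap, re-estimating $\lambda$ in each replicate since it was estimated from the data:

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)

def chisq_stat(sample, lam, kmax):
    """Chi-square statistic against Poisson(lam), upper tail lumped at kmax."""
    n = len(sample)
    k = np.arange(kmax + 1)
    expected = n * poisson.pmf(k, lam)
    expected[-1] += n * poisson.sf(kmax, lam)
    observed = np.bincount(np.minimum(sample, kmax), minlength=kmax + 1)
    return ((observed - expected) ** 2 / expected).sum()

# Hypothetical data; in practice, use your own sample.
data = rng.poisson(1.8, size=200)
lam_hat = data.mean()
kmax = int(data.max())
t_obs = chisq_stat(data, lam_hat, kmax)

# Parametric bootstrap: simulate the statistic's null distribution directly,
# so small expected counts no longer invalidate the reference distribution.
reps = 2000
t_sim = np.empty(reps)
for i in range(reps):
    sim = rng.poisson(lam_hat, size=len(data))
    t_sim[i] = chisq_stat(sim, sim.mean(), kmax)
p_value = (1 + (t_sim >= t_obs).sum()) / (reps + 1)
```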
The other answer by user36381 suggests that with such a large sample size, goodness of fit tests are almost certain to reject; this is true. However, I'm not so sure comparing to other reference distributions will help, since they, too, would almost certainly be rejected by a decent goodness of fit test.
(Why are you testing whether it's Poisson? If you have around a million data points, the sample itself contains a lot of information about distributional shape - do you actually need a name for the distribution?)
Best Answer
Let $X$ and $Y$ denote the number of large claims and small claims respectively in a year. Then, $N = X+Y$ is the total number of claims and is known to be a Poisson$(\lambda)$ random variable. Now, given that $N=n$, the conditions stated in the problem tell us that the conditional distribution of $X$ is binomial with parameters $(n,p)$ and that of $Y$ is binomial with parameters $(n,1-p)$. Note that conditioned on $N=n$, $X$ and $Y$ are very much dependent random variables since $Y=n-X$. But, unconditionally, $X$ and $Y$ are independent Poisson$(\lambda p)$ and Poisson$(\lambda(1-p))$ random variables.
To see why all this is so, consider that for $0 \leq r \leq n$, $$P\{X=r, Y=s \mid N=n\} = \begin{cases} \displaystyle\binom{n}{r}p^r(1-p)^{s}, & \text{if} ~s = n-r,\\ 0, & \text{if} ~ s \neq n-r. \end{cases}$$ Consequently, for any $r, s \geq 0$, $$\begin{align} P\{X = r, Y = s\} &= \sum_{n=0}^\infty P\{X=r, Y=s, N=n\}\\ &= \sum_{n=0}^\infty P\{X=r, Y=s \mid N=n\}P\{N =n\}\\ &=\binom{r+s}{r}p^r(1-p)^{s}e^{-\lambda}\frac{\lambda^{r+s}}{(r+s)!} &\scriptstyle{\text{only the $n=r+s$ term is nonzero in the sum}}\\ &= \frac{(r+s)!}{r!s!}(\lambda p)^r(\lambda(1-p))^{s} e^{-\lambda p - \lambda(1-p)}\frac{1}{(r+s)!}\\ &= e^{-\lambda p}\frac{(\lambda p)^r}{r!}\cdot e^{-\lambda(1-p)}\frac{(\lambda(1-p))^s}{s!}\\ &= P\{X=r\}P\{Y = s\} \end{align}$$ showing that $X$ and $Y$ are independent Poisson$(\lambda p)$ and Poisson$(\lambda(1-p))$ random variables respectively. Consequently, conditioned on $X = r$, $Y$ continues to be a Poisson$(\lambda(1-p))$ random variable.
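The derivation above can be checked numerically; here is a quick simulation sketch (the values of $\lambda$ and $p$ are arbitrary): simulate $N$, thin it binomially, and compare the moments of $X$ and $Y$, and their covariance, with the Poisson$(\lambda p)$, Poisson$(\lambda(1-p))$, and independence predictions.

```python
import numpy as np

rng = np.random.default_rng(42)
lam, p = 12.0, 0.3  # hypothetical parameter values

# Two-stage mechanism: N ~ Poisson(lam), then X | N=n ~ Binomial(n, p).
n_sims = 200_000
N = rng.poisson(lam, size=n_sims)
X = rng.binomial(N, p)
Y = N - X

# Under the result: X ~ Poisson(lam*p), Y ~ Poisson(lam*(1-p)),
# and X, Y independent, so their covariance should be near zero.
cov = np.cov(X, Y)[0, 1]
```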