Let us first center everything, using $\bar X_k=X_k-\lambda$ and $\bar M=M-\lambda$. Then
$$
\mathrm{Cov}(M,S^2)=\mathbb E(\bar MS^2)=\mathbb E(\bar X_1S^2)=\frac1{n-1}\mathbb E(\bar X_1U),
$$
where
$$
U=\sum\limits_{k=1}^n(\bar X_k-\bar M)^2.
$$
Note that $U$ is a linear combination of products $\bar X_k^2$ and $\bar X_k\bar X_i$ for $i\ne k$. Amongst these products, many will not contribute to the expectation of $\bar X_1U$ since $\mathbb E(\bar X_1\bar X_k\bar X_i)=0$ for every $k\ne i$ and $\mathbb E(\bar X_1\bar X_k^2)=0$ for every $k\ne1$.
Hence, one needs only the coefficient of $\bar X_1^2$ in $U$, which is $c_n=\left(\frac{n-1}n\right)^2+(n-1)\frac1{n^2}=\frac{n-1}n$: the first term comes from the summand $(\bar X_1-\bar M)^2$ and the second from the $n-1$ summands with $k\ne1$. This yields $\mathbb E(\bar X_1U)=c_n\mathbb E(\bar X_1^3)$ and $\mathrm{Cov}(M,S^2)=\frac1{n-1}c_n\mathbb E(\bar X_1^3)=\frac1n\mathbb E(\bar X_1^3)$.
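The coefficient $c_n$ can be checked symbolically. Below is a small sketch using sympy that expands $U$ for several values of $n$ and extracts the coefficient of $\bar X_1^2$ (the symbol names `x1, ..., xn` are illustrative stand-ins for the centered variables):

```python
# Verify that the coefficient of x1^2 in U = sum_k (xk - m)^2 equals (n-1)/n.
import sympy as sp

for n in [2, 3, 5, 8]:
    xs = sp.symbols(f"x1:{n + 1}")        # x1, ..., xn: the centered variables
    m = sp.Rational(1, n) * sum(xs)       # the centered sample mean
    U = sp.expand(sum((xk - m) ** 2 for xk in xs))
    c = U.coeff(xs[0] ** 2)               # coefficient of x1^2 in U
    assert c == sp.Rational(n - 1, n)
print("coefficient of x1^2 equals (n-1)/n for all tested n")
```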
Finally, the third central moment of the Poisson distribution with parameter $\lambda$ is $\mathbb E(\bar X_1^3)=\lambda$, hence
$$
\mathrm{Cov}(M,S^2)=\frac\lambda{n}.
$$
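The identity $\mathrm{Cov}(M,S^2)=\lambda/n$ is easy to sanity-check by Monte Carlo. The parameter values (`lam`, `n`, `reps`) below are arbitrary choices for illustration:

```python
# Monte Carlo check of Cov(M, S^2) = lambda/n for i.i.d. Poisson samples.
import numpy as np

rng = np.random.default_rng(0)
lam, n, reps = 3.0, 10, 200_000
x = rng.poisson(lam, size=(reps, n))
m = x.mean(axis=1)                        # sample means M
s2 = x.var(axis=1, ddof=1)                # unbiased sample variances S^2
cov = np.mean((m - m.mean()) * (s2 - s2.mean()))
print(cov, lam / n)                       # the two values should be close
```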
Let $n$ be the sample size. Since $S^2$ is computed from an i.i.d. sample $X_1,\dots,X_n$ with
$$X_i \sim N(\mu, \sigma^2),$$
we know that
$$\frac{(n-1)S^2}{\sigma^2}\sim \chi^2(n-1).$$
The sum of a Gaussian random variable and a Chi-squared random variable follows a Generalized Chi-squared Distribution. A variable $\xi$ with the Generalized Chi-squared Distribution can be written as
$$\xi = x +\sum_{i=1}^n w_i y_i \;\text{ where } x\sim N(m,s^2),\;\; w_i \in \mathbb{R},\;\; y_i \sim \chi'^2(k_i,\lambda_i),\;\text{ with } x \text{ and the } y_i \text{ independent.}$$
Note that $\chi'^2(k_i,\lambda_i)$ is the non-central Chi-squared distribution, it is related to the Chi-squared as follows:
$$\chi^2(n) = \chi'^2(n,0)$$
In our case, we want the sum of a single Chi-squared variable and a Normal, where the Chi-squared arises from the sample variance of the Normal sample:
$$\xi = x + wy \;\;\text{ where } x\sim N\left(\mu,\frac{\sigma^2}{n}\right), \;y \sim \chi'^2(n-1,0)$$
What about the weight variable $w$? If $w=1$ then $\xi$ represents the following sum:
$$\bar{X} + \frac{(n-1)S^2}{\sigma^2}$$
But we want:
$$\bar{X} + S^2$$
Therefore, to get to this sum, we need to alter the weight applied to the chi-squared variable $y$:
$$w = \frac{\sigma^2}{n-1}$$
With this we have what we need:
$$\bar{X} + S^2 \sim \xi = x + wy \quad\text{ where } x\sim N\left(\mu,\frac{\sigma^2}{n}\right),\; y \sim \chi'^2(n-1,0),\; w = \frac{\sigma^2}{n-1}$$
In summary, writing the parameters in the order (weight, degrees of freedom, non-centrality, normal mean, normal standard deviation), and noting that the normal part has standard deviation $\sigma/\sqrt{n}$:
$$\bar{X} + S^2 \sim \tilde{\chi}^2\left(\frac{\sigma^2}{n-1},\,n-1,\,0,\,\mu,\,\frac{\sigma}{\sqrt{n}}\right)$$
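A quick way to check this representation is to compare the first two moments of $\bar X + S^2$ (simulated directly) with the moments implied by $x + wy$, using the independence of $\bar X$ and $S^2$ for normal samples. The parameter choices below are arbitrary:

```python
# Check the representation X-bar + S^2 = x + w*y in the first two moments,
# where x ~ N(mu, sigma^2/n), y ~ chi^2_{n-1}, w = sigma^2/(n-1).
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 1.0, 2.0, 8, 300_000
x = rng.normal(mu, sigma, size=(reps, n))
t = x.mean(axis=1) + x.var(axis=1, ddof=1)       # samples of X-bar + S^2

w = sigma**2 / (n - 1)
mean_theory = mu + w * (n - 1)                   # E[x] + w*(n-1) = mu + sigma^2
var_theory = sigma**2 / n + w**2 * 2 * (n - 1)   # Var(x) + w^2 * Var(chi^2_{n-1})

print(t.mean(), mean_theory)                     # should be close
print(t.var(), var_theory)                       # should be close
```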
Consider the random vectors $Z=(Z_i)_{1\leqslant i\leqslant n}$ and $-Z=(-Z_i)_{1\leqslant i\leqslant n}$. Then $X=\xi(Z)$ for some odd function $\xi$ (the empirical mean is odd in $Z$) and $Y=\eta(Z)$ for some even function $\eta$ (the empirical variance is even in $Z$). The product $\zeta=\xi\cdot\eta$ is odd as well. Now, $Z$ and $-Z$ follow the same distribution, so for any odd function, $\mathrm E(\xi(Z))=\mathrm E(\xi(-Z))=-\mathrm E(\xi(Z))$, hence $\mathrm E(\xi(Z))=\mathrm E(\zeta(Z))=0$. In particular, the covariance of $X$ and $Y$ is $\mathrm E(\zeta(Z))-\mathrm E(\xi(Z))\,\mathrm E(\eta(Z))=0$.
To sum up, the result you asked for a simple proof of (that the empirical mean and empirical variance are uncorrelated) has nothing to do with Gaussianity: it holds for every distribution that is symmetric about its mean, provided the relevant moments are finite.
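The symmetry argument above is easy to illustrate numerically: for a symmetric non-Gaussian law the empirical correlation of the sample mean and sample variance is near zero, while a skewed law shows a clear correlation. The distributions (Laplace, exponential) and sample sizes below are arbitrary illustrative choices:

```python
# Correlation of sample mean and sample variance: ~0 for a symmetric law,
# clearly positive for a skewed one.
import numpy as np

rng = np.random.default_rng(2)
n, reps = 6, 200_000

def corr_mean_var(sample):
    m = sample.mean(axis=1)
    s2 = sample.var(axis=1, ddof=1)
    return np.corrcoef(m, s2)[0, 1]

sym = corr_mean_var(rng.laplace(size=(reps, n)))       # symmetric: corr near 0
skew = corr_mean_var(rng.exponential(size=(reps, n)))  # skewed: corr clearly positive
print(sym, skew)
```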