Suppose $\theta$ is the unknown quantity of interest. A necessary and sufficient condition for an unbiased estimator of a parametric function $g(\theta)$ (assuming one with finite second moment exists) to be the UMVUE is that it be uncorrelated with every unbiased estimator of zero. We can use this result to prove the uniqueness of the UMVUE whenever it exists.
If possible, suppose $T_1$ and $T_2$ are both UMVUEs of $g(\theta)$.
Then $T_1-T_2$ is an unbiased estimator of zero, so that by the result above we have
$$\operatorname{Cov}_{\theta}(T_1,T_1-T_2)=0\quad,\,\forall\,\theta$$
Or, $$\operatorname{Var}_{\theta}(T_1)=\operatorname{Cov}_{\theta}(T_1,T_2)\quad,\,\forall\,\theta$$
Therefore, $$\operatorname{Corr}_{\theta}(T_1,T_2)=\frac{\operatorname{Cov}_{\theta}(T_1,T_2)}{\sqrt{\operatorname{Var}_{\theta}(T_1)}\sqrt{\operatorname{Var}_{\theta}(T_2)}}=\sqrt\frac{\operatorname{Var}_{\theta}(T_1)}{\operatorname{Var}_{\theta}(T_2)}\quad,\,\forall\,\theta$$
Since $T_1$ and $T_2$ have the same variance by assumption, the correlation between $T_1$ and $T_2$ is exactly $1$. In other words, $T_1$ and $T_2$ are linearly related: for some constants $a$ and $b\ne 0$, $$T_1=a+bT_2 \quad\text{ a.e. }$$
Taking the variance of both sides gives $b^2=1$, so $b=\pm 1$. Now $b=-1$ is impossible: taking expectations in $T_1=a-T_2$ gives $a=2g(\theta)$, so $T_1=2g(\theta)-T_2$ a.e., which cannot hold since $T_1$ and $T_2$ do not depend on $\theta$. So $b=1$, i.e. $T_1=a+T_2$ a.e., and taking expectations gives $a=0$. Thus, $$T_1=T_2\quad\text{ a.e. }$$
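As a numerical illustration of the zero-covariance criterion (a simulation sketch, not a proof; the Poisson model, sample size, and seed are arbitrary choices), take $X_1,\dots,X_n$ i.i.d. $\mathsf{Poisson}(\theta)$: $\overline X$ is the UMVUE of $\theta$, while $X_1-X_2$ is an unbiased estimator of zero, so their covariance should vanish.

```python
import math
import random

random.seed(0)
n, theta, reps = 5, 3.0, 100_000

def poisson(lam):
    # Knuth's multiplication method for Poisson sampling
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

xbar, zero = [], []
for _ in range(reps):
    x = [poisson(theta) for _ in range(n)]
    xbar.append(sum(x) / n)   # UMVUE of theta
    zero.append(x[0] - x[1])  # an unbiased estimator of zero

mx, mz = sum(xbar) / reps, sum(zero) / reps
cov = sum((a - mx) * (b - mz) for a, b in zip(xbar, zero)) / reps
print(round(cov, 3))  # close to 0, up to Monte Carlo error
```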
By Cochran's theorem,
$$\frac{6S^2}{\sigma^2} =\sum_{i=1}^{7}\left(\frac{X_i-\overline{X}}{\sigma}\right)^2 \sim \chi^2_6$$
Denote this r.v. by $Z$.
The p.d.f. of $\chi^2_6$ is
$$
f_Z(x)=\frac{1}{2^3\Gamma(3)}x^{2} e^{-x/2}=\frac{1}{16}x^{2} e^{-x/2}, \ x>0
$$
Calculate
$$
E\left(\frac1{S^2}\right)=E\left(\frac{\sigma^2}{6S^2}\cdot\frac{6}{\sigma^2}\right)=\frac{6}{\sigma^2}E\left(\frac{1}{Z}\right).
$$
$$
E\left(\frac{1}{Z}\right) = \int\limits_0^\infty \frac1x \frac{1}{16}x^{2} e^{-x/2}\, dx = \frac{1}{16}\int\limits_0^\infty x e^{-x/2}\, dx = \frac{4}{16}=\frac14.
$$
Finally,
$$E\left(\frac1{S^2}\right)= \frac{6}{4\sigma^2}$$
and the constant $A=6/4=1.5$.
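As a sanity check (a minimal simulation sketch; the value of $\sigma$, the number of replications, and the seed are arbitrary), repeatedly drawing samples of size $7$ from a normal distribution and averaging $1/S^2$ should land near $1.5/\sigma^2$:

```python
import random

random.seed(1)
n, sigma, reps = 7, 2.0, 100_000

total = 0.0
for _ in range(reps):
    x = [random.gauss(0.0, sigma) for _ in range(n)]
    xbar = sum(x) / n
    s2 = sum((xi - xbar) ** 2 for xi in x) / (n - 1)  # sample variance
    total += 1.0 / s2

est = total / reps
print(round(est * sigma**2, 2))  # should be near A = 1.5
```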
Your reasoning is correct, except that the MLE is not the UMVUE of the population variance.
A complete sufficient statistic for $p$ is $T=\sum\limits_{i=1}^N X_i$, which has a $\mathsf{Bin}(nN,p)$ distribution.
Now $E_p[T]=nNp$ and $\operatorname{Var}_p[T]=nNp(1-p)$ for all $p\in(0,1)$.
Again, $$E_p[T^2]=\operatorname{Var}_p[T]+(E_p[T])^2=nNp(1-p)+n^2N^2p^2$$
Or, $$E_p[T^2-T]=nNp^2(nN-1)$$
That is, $$E_p\left[\frac{T(T-1)}{N(nN-1)}\right]=np^2$$
Combining this with $E_p\left[\frac TN\right]=np$, you have an unbiased estimator of the population variance based on the complete sufficient statistic $T$ (and hence the UMVUE):
$$E_p\left[\frac TN-\frac{T(T-1)}{N(nN-1)}\right]=np-np^2=np(1-p)\quad,\forall\,p\in(0,1)$$
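This unbiasedness can be confirmed exactly by summing the estimator against the $\mathsf{Bin}(nN,p)$ pmf; the values of $n$, $N$, and $p$ below are arbitrary test choices.

```python
import math

n, N, p = 4, 6, 0.3  # arbitrary test values
nN = n * N

def binom_pmf(t, m, q):
    # P(T = t) for T ~ Bin(m, q)
    return math.comb(m, t) * q**t * (1 - q)**(m - t)

# E[T/N - T(T-1)/(N(nN-1))] summed exactly over the Bin(nN, p) pmf
expected = sum(
    binom_pmf(t, nN, p) * (t / N - t * (t - 1) / (N * (nN - 1)))
    for t in range(nN + 1)
)
print(abs(expected - n * p * (1 - p)) < 1e-9)  # True: matches np(1-p)
```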
With $\overline X=\frac TN$, the sample variance $S^2=\frac1{N-1}\sum\limits_{i=1}^N (X_i-\overline X)^2$ is also unbiased for the population variance. So by the Lehmann–Scheffé theorem, $E\left[S^2\mid T\right]$ is also a UMVUE of $np(1-p)$.
As UMVUE is unique whenever it exists, you can say
$$E\left[S^2\mid T\right]=\frac TN-\frac{T(T-1)}{N(nN-1)}\tag{*}$$
This can be rewritten in terms of $\overline X$ of course.
A direct way to obtain $(*)$ would be to proceed using linearity of expectation.
I think it should be something like
\begin{align} E\left[S^2\mid T=t\right]&=E\left[\frac{1}{N-1}\sum_{i=1}^N\left(X_i-\frac tN\right)^2\mid T=t\right] \\&=E\left[\frac{1}{N-1}\left(\sum_{i=1}^N X_i^2-\frac{t^2}{N}\right)\mid T=t\right] \\&=\frac{N}{N-1}E\left[X_1^2\mid T=t\right]-\frac{t^2}{N(N-1)}, \end{align}
where the last step uses the fact that $X_1,\dots,X_N$ are exchangeable given $T=t$, so each $E\left[X_i^2\mid T=t\right]$ equals $E\left[X_1^2\mid T=t\right]$.
Now we only have to recall that, given $T=t$, $X_1$ has a hypergeometric distribution (its value counts how many of the $t$ total successes fall among the first $n$ of the $nN$ Bernoulli trials).
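Under the assumption that $P(X_1=k\mid T=t)=\binom nk\binom{n(N-1)}{t-k}\big/\binom{nN}{t}$, a quick numerical check (with arbitrary small values of $n$, $N$, $t$) confirms that plugging $E\left[X_1^2\mid T=t\right]$ into the last expression of the display above reproduces $(*)$:

```python
import math

n, N, t = 4, 6, 10  # arbitrary test values with 0 <= t <= n*N
nN = n * N

def hyper_pmf(k):
    # P(X_1 = k | T = t): hypergeometric pmf
    return math.comb(n, k) * math.comb(nN - n, t - k) / math.comb(nN, t)

# E[X_1^2 | T = t] computed directly from the pmf
m2 = sum(hyper_pmf(k) * k**2 for k in range(min(n, t) + 1))

# last line of the conditional-expectation computation
lhs = (N / (N - 1)) * m2 - t**2 / (N * (N - 1))
# formula (*)
rhs = t / N - t * (t - 1) / (N * (nN - 1))
print(abs(lhs - rhs) < 1e-9)  # True: the two expressions agree
```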