The possible values of the triplet $(A,B,C)$ are $\{(1,1,1),(1,1,-1),\cdots,(-1,-1,-1)\}$. Based on a sample, the probabilities of the $8$ different outcomes can be estimated. Let those probabilities be denoted by $p_{1,1,1},p_{1,1,-1},\cdots,p_{-1,-1,-1}$.
The entropy of $(A,B,C)$, by definition, is
$$H(A,B,C)=-\left(p_{1,1,1}\log (p_{1,1,1})+p_{1,1,-1}\log(p_{1,1,-1})+\cdots+p_{-1,-1,-1}\log (p_{-1,-1,-1})\right).$$
Or, in general, if $\{p_1,p_2,\cdots,p_n\}$ is the pmf of a discrete random variable, then the corresponding entropy is
$$H=-\sum_{i=1}^np_i\log(p_i).$$
(The base of $\log$ is taken to be $2$ in this context.)
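As an illustration of the definition above, a short sketch in Python (not part of the original answer) that computes the base-$2$ entropy of an arbitrary pmf:

```python
import math

def entropy(pmf):
    """Shannon entropy (base 2) of a discrete pmf given as a list of probabilities.

    Terms with p = 0 are skipped, following the convention 0 * log(0) = 0.
    """
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# A uniform pmf over 8 outcomes, as for the triplet (A, B, C) when all
# outcomes are equally likely, attains the maximum entropy log2(8) = 3 bits.
print(entropy([1/8] * 8))  # -> 3.0
```

Note that the uniform distribution maximizes entropy, so $3$ bits is an upper bound for any pmf on $8$ outcomes.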
Example for $A,B$:
For instance, the estimate is $p_{1,1}=\frac{2}{8}=\frac{1}{4}$, because in the given sample of $8$ elements the pair $(1,1)$ occurs $2$ times. Similarly, $p_{1,-1}=\frac{1}{4}$, $p_{-1,1}=\frac{1}{4}$, and $p_{-1,-1}=\frac{1}{4}$. So
$$H(A,B)=-\log\left(\frac{1}{4}\right)=2.$$
But this is only a rough estimate, since the sample is small.
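The estimate above can be reproduced numerically. The sample itself is not shown, so the sketch below assumes a hypothetical sample of $8$ pairs in which each of the four outcomes occurs twice, matching the counts stated above:

```python
import math
from collections import Counter

# Hypothetical sample of 8 (A, B) pairs consistent with the counts above:
# each of the four possible pairs occurs twice, so each estimate is 2/8 = 1/4.
sample = [(1, 1), (1, 1), (1, -1), (1, -1), (-1, 1), (-1, 1), (-1, -1), (-1, -1)]

counts = Counter(sample)
pmf = [c / len(sample) for c in counts.values()]

# Entropy estimate from the empirical pmf.
H = -sum(p * math.log2(p) for p in pmf)
print(H)  # -> 2.0
```

With a larger sample, the empirical frequencies (and hence the entropy estimate) would be more reliable.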
Best Answer
Here's my attempt. We're using this definition for $H(X)$, for a discrete r.v. $X$ with probabilities $p_1, p_2, \ldots, p_n$: $$ H(X)=-\sum_{i=1}^n p_i \log_2(p_i) $$
We know that the sum of two i.i.d. discrete uniform r.v.s on $\{1,\ldots,n\}$ is a discrete triangular r.v.; call it $Z$. Since $Z$ is symmetric, even though its support is $\{2,\ldots,2n\}$, we only need to evaluate the sum up to the midpoint. We also know that the probabilities $z_i$ of $Z$ are $\frac{1}{n^2}, \frac{2}{n^2},\ldots,\frac{n}{n^2},\frac{n-1}{n^2},\ldots,\frac{1}{n^2}$. Hence, we have $$ \begin{aligned} H(Z) &= -2\sum_{i=1}^{n-1} z_i \log_2(z_i) - z_n \log_2(z_n)\\ &= -2\sum_{i=1}^{n-1} \frac{i}{n^2} \log_2 \left( \frac{i}{n^2} \right) -\frac{n}{n^2}\log_2 \left( \frac{n}{n^2}\right) \\ &= - \frac{2}{n^2} \sum_{i=1}^{n-1} i [\log_2(i) - 2\log_2(n)] + \frac{\log_2(n)}{n}\\ &= - \frac{2}{n^2} \left(\sum_{i=1}^{n-1} i \log_2(i) - 2\log_2(n) \sum_{i=1}^{n-1} i \right) + \frac{\log_2(n)}{n}\\ &= - \frac{2}{n^2} \left[\sum_{i=1}^{n-1} \log_2(i^i) - 2\log_2(n) \frac{n(n-1)}{2} \right] + \frac{\log_2(n)}{n}\\ &= \frac{2(n-1) \log_2(n)}{n}- \frac{2}{n^2} \log_2 \left(\prod_{i=1}^{n-1} i^i \right) + \frac{\log_2(n)}{n}\\ &= \frac{(2n-1) \log_2(n)}{n}- \frac{2}{n^2} \log_2 \mathcal{H}(n-1). \end{aligned} $$ In the derivation, $\mathcal{H}(x)$ is the hyperfactorial function. Since the triangular pmf is symmetric for every $n$, the same formula holds whether $n$ is even or odd.
As a check, if $n=6$ (a fair die), then the entropy of the sum of two dice is about $3.2744$ bits (checked with WolframAlpha). I am not familiar with this area, so I do not know if this is the actual result, but it seems reasonable.
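The closed form can also be checked numerically against a direct computation from the pmf. The sketch below (not part of the original answer) does this in Python, evaluating the hyperfactorial term in log space as $\log_2 \mathcal{H}(n-1)=\sum_{i=1}^{n-1} i\log_2 i$ to avoid huge integers:

```python
import math

def entropy_direct(n):
    """Entropy (base 2) of the sum of two i.i.d. uniform r.v.s on {1,...,n},
    computed by enumerating the pmf of the sum directly."""
    pmf = [sum(1 for a in range(1, n + 1) for b in range(1, n + 1) if a + b == k) / n**2
           for k in range(2, 2 * n + 1)]
    return -sum(p * math.log2(p) for p in pmf)

def entropy_closed_form(n):
    """The closed form (2n-1)log2(n)/n - (2/n^2) log2(H(n-1)), with the
    hyperfactorial H(n-1) evaluated in log space."""
    log2_hyperfact = sum(i * math.log2(i) for i in range(1, n))
    return (2 * n - 1) * math.log2(n) / n - 2 * log2_hyperfact / n**2

# The two computations agree for several n, including the n = 6 check above.
for n in (2, 5, 6):
    assert abs(entropy_direct(n) - entropy_closed_form(n)) < 1e-9

print(round(entropy_closed_form(6), 4))  # -> 3.2744
```

Both routes give $\approx 3.2744$ bits for $n=6$, matching the WolframAlpha check above.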