You have to be more careful with what your outcomes are and what their probabilities are. From what I see you have 6 outcomes, let's call them $x_1,\dots,x_6$, with probabilities $p_1,\dots,p_6$ given in your list.
The outcomes can have cardinal values, e.g. throwing an (unfair) die: $x_1 = 1,\dots, x_6 = 6$.
They can also be nominal, such as ethnicity: $x_1 =$ Black, $x_2 =$ Caucasian, etc.
In the first case, it makes sense to define mean and variance
$$
\overline x = \sum_{i=1}^{6} p_ix_i,
\qquad
\mathbb V = \sum_{i=1}^{6} p_i (x_i-\overline x)^2.
$$
The variance measures the (quadratic) spread around the mean.
Note that this definition differs from yours.
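In code, the two formulas above are straightforward to evaluate. The probabilities below are made up for illustration (the actual values from your list are not reproduced here):

```python
# Hypothetical probabilities for six outcomes (must sum to 1).
p = [0.1, 0.1, 0.2, 0.2, 0.2, 0.2]
x = [1, 2, 3, 4, 5, 6]  # cardinal outcomes, e.g. faces of an unfair die

# Weighted mean, then the quadratic spread around it, as defined above.
mean = sum(pi * xi for pi, xi in zip(p, x))
var = sum(pi * (xi - mean) ** 2 for pi, xi in zip(p, x))
print(mean, var)
```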
In the second case, mean and variance do not make any sense, since you cannot add black to caucasian or scale them, square them etc.
The entropy, on the other hand, can be defined in both cases! Intuitively, it measures the uncertainty of the outcome.
Note that, as Mike Hawk pointed out, entropy does not care what the outcomes actually are. They can be $x_1 = 1,\dots, x_6 = 6$, or $x_1 = 100,\dots, x_6 = 600$, or ($x_1 =$ Black, $x_2 =$ Caucasian, etc.); the result depends only on the probabilities $p_1,\dots,p_6$. The variance, on the other hand, will be very different for the first two cases (by a factor of $10\,000$) and will not exist in the third case.
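This invariance is easy to check numerically. The sketch below (again with made-up probabilities) shows that entropy is unchanged when the outcomes are rescaled by 100, while the variance grows by the square of that factor:

```python
import math

def entropy(p):
    # Shannon entropy in bits; it depends only on the probabilities.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mean_var(p, x):
    m = sum(pi * xi for pi, xi in zip(p, x))
    return m, sum(pi * (xi - m) ** 2 for pi, xi in zip(p, x))

p = [0.1, 0.1, 0.2, 0.2, 0.2, 0.2]      # hypothetical probabilities
small = [1, 2, 3, 4, 5, 6]
big = [100, 200, 300, 400, 500, 600]    # same outcomes, scaled by 100

h = entropy(p)                          # identical for either labeling
_, v_small = mean_var(p, small)
_, v_big = mean_var(p, big)
print(h, v_big / v_small)               # ratio is 10000, up to rounding
```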
Your definition of variance is very unconventional: it measures the spread of the actual probability values instead of the outcomes. I think this can theoretically be made sense of, but I very much doubt it is the quantity you wish to consider (especially as a medical doctor).
It is definitely not meaningful to compare it to entropy, which measures the uncertainty of the outcome. Entropy is maximal when all outcomes have equal probability $1/6$, whereas that same situation yields the minimal value $0$ for your definition of variance.
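The opposition between the two quantities can be seen directly. Below, `prob_variance` is my reading of your definition (the variance of the probability values themselves, not of the outcomes), and the skewed probabilities are hypothetical:

```python
import math

def entropy(p):
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def prob_variance(p):
    # Variance of the probability values themselves -- my reading of
    # the questioner's definition, NOT the variance of the outcomes.
    m = sum(p) / len(p)
    return sum((pi - m) ** 2 for pi in p) / len(p)

uniform = [1/6] * 6
skewed = [0.5, 0.3, 0.1, 0.05, 0.03, 0.02]  # hypothetical, sums to 1

print(entropy(uniform), prob_variance(uniform))  # maximal entropy, zero "variance"
print(entropy(skewed), prob_variance(skewed))    # lower entropy, positive "variance"
```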
Hope this helps.
Best Answer
A discrete distribution with infinitely many outcomes can have infinite entropy (e.g. $p_n \propto \frac{1}{n \log^2 n}$, whose entropy sum diverges like $\sum \frac{1}{n \log n}$). I don't see much to interpret here. Actually, one might argue that, on the contrary, the surprising thing is that some (most) discrete distributions with infinitely many outcomes have finite entropy, i.e. they produce a finite amount of information, so they can be described (on average) with a finite number of bits.
I think that the most natural example of infinite entropy is a continuous distribution (say, uniform on $[0,1]$), which can be regarded as the limit of discrete uniform distributions with an increasingly large number of values. It's not surprising that a real number in $[0,1]$ has infinite entropy, because to describe it you need an infinite number of bits (think of its binary fractional representation). Put another way, a real number in $[0,1]$ can encode an arbitrarily large amount of information.
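This limit picture is easy to illustrate: a uniform distribution on $n$ values has entropy $\log_2 n$ bits, which grows without bound as $n \to \infty$:

```python
import math

def entropy(p):
    # Shannon entropy in bits.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# For a uniform distribution on n values the entropy is log2(n),
# diverging as n grows -- the discrete shadow of the fact that a
# uniform real number in [0,1] carries infinitely many bits.
for n in [2, 64, 1024]:
    print(n, entropy([1 / n] * n))  # 1.0, 6.0, 10.0 bits
```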