[Math] Mixed Strategy Nash Equilibrium of Rock Paper Scissors with 3 players

game-theory, nash-equilibrium

It seems like most game theory tutorials focus on 2-player games, and algorithms for finding Nash equilibria often break down with 3+ players. So here is a simple question:

Is the profile in which every player mixes $(\frac{1}{3},\frac{1}{3},\frac{1}{3})$ the only Nash equilibrium in a 3-player game of Rock Paper Scissors? How can we show this analytically?

Edit: Payoff matrices below, in terms of P1's payoff.

P1=Rock
                         P3

                Rock    Paper   Scissors
             ----------------------------
    Rock     |   0   |   -1   |    0.5  |
             |--------------------------|
P2  Paper    |  -1   |   -1   |    0    |
             |--------------------------|
    Scissors |  0.5  |    0   |    2    |
             ----------------------------

P1=Paper
                         P3

                Rock    Paper   Scissors
             ----------------------------
    Rock     |   2   |   0.5  |    0    |
             |--------------------------|
P2  Paper    |  0.5  |    0   |   -1    |
             |--------------------------|
    Scissors |   0   |   -1   |   -1    |
             ----------------------------

P1=Scissors
                         P3

                Rock    Paper   Scissors
             ----------------------------
    Rock     |  -1   |    0   |   -1    |
             |--------------------------|
P2  Paper    |   0   |    2   |   0.5   |
             |--------------------------|
    Scissors |  -1   |   0.5  |    0    |
             ----------------------------

Best Answer

The procedure for finding a mixed-strategy Nash equilibrium is no different with three players than with two.

As in the two-player case, the key point is that if it is optimal for you to randomize over several actions, the expected payoff of each of those actions must be the same (assuming that agents are expected-utility maximizers). For example, if you randomize over two actions, say Rock and Paper, but $U((1,0,0), s_{-i}) > U((0,1,0), s_{-i})$, then you are certainly not optimizing.

Let $p_i(s)$ be the probability that player $i = 1,2,3$ plays action $s = r,p,s$ (rock, paper, scissors). In a mixed-strategy Nash equilibrium in which player 1 plays all three actions with positive probability,

$U_1((1,0,0),s_{-1}) = 0*p_2(r)*p_3(r) + (-1)*p_2(r)*p_3(p) + ...$

must be equal to

$U_1((0,1,0),s_{-1}) = 2*p_2(r)*p_3(r) + (0.5)*p_2(r)*p_3(p) + ...$

which must itself be equal to

$U_1((0,0,1),s_{-1}) = (-1)*p_2(r)*p_3(r) + (0)*p_2(r)*p_3(p) + ...$

This indifference condition should hold for every player $i = 1,2,3$, which, together with the requirement that each player's probabilities are nonnegative and sum to one, leaves you with a simple system of equations to solve. A fully mixed strategy profile is a Nash equilibrium if and only if it solves this system.
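As a quick numerical check of the indifference condition (a sketch in Python with NumPy; the index ordering 0 = Rock, 1 = Paper, 2 = Scissors is an assumed encoding of the payoff tables above), the uniform profile does make player 1 indifferent across all three actions:

```python
import numpy as np

# P1's payoff tensor U[i, j, k] from the tables above
# (0 = Rock, 1 = Paper, 2 = Scissors).
U = np.array([
    [[0, -1, .5], [-1, -1, 0], [.5, 0, 2]],    # P1 = Rock
    [[2, .5, 0], [.5, 0, -1], [0, -1, -1]],    # P1 = Paper
    [[-1, 0, -1], [0, 2, .5], [-1, .5, 0]],    # P1 = Scissors
], dtype=float)

q2 = q3 = np.full(3, 1/3)   # both opponents mix uniformly

# Expected payoff of each of P1's pure actions against (q2, q3):
# E[a] = sum_{j,k} U[a, j, k] * q2[j] * q3[k]
expected = np.einsum('ajk,j,k->a', U, q2, q3)
print(expected)   # → [0. 0. 0.]  (all equal, so P1 is indifferent)
```

Running the same computation for players 2 and 3 (using the symmetry of the game) gives the same result, so the uniform profile satisfies the whole system.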

From there, it should not be too hard to determine whether there are other mixed-strategy Nash equilibria.
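The indifference system covers fully mixed profiles; candidate equilibria with smaller support can be checked directly. As one such sketch (Python with NumPy; the action encoding 0 = Rock, 1 = Paper, 2 = Scissors is an assumption), a brute-force pass over all 27 pure profiles shows that this game has no pure-strategy equilibrium:

```python
import numpy as np
from itertools import product

# P1's payoff tensor U[i, j, k] from the tables above
# (0 = Rock, 1 = Paper, 2 = Scissors).
U = np.array([
    [[0, -1, .5], [-1, -1, 0], [.5, 0, 2]],    # P1 = Rock
    [[2, .5, 0], [.5, 0, -1], [0, -1, -1]],    # P1 = Paper
    [[-1, 0, -1], [0, 2, .5], [-1, .5, 0]],    # P1 = Scissors
], dtype=float)

def is_pure_ne(i, j, k):
    # A profile is a pure Nash equilibrium iff each player's action is a
    # best response to the other two. By symmetry, player 2's payoff at
    # (i, j, k) is U[j, i, k] and player 3's is U[k, i, j].
    return (U[i, j, k] == max(U[a, j, k] for a in range(3)) and
            U[j, i, k] == max(U[a, i, k] for a in range(3)) and
            U[k, i, j] == max(U[a, i, j] for a in range(3)))

pure_ne = [prof for prof in product(range(3), repeat=3) if is_pure_ne(*prof)]
print(pure_ne)   # → []  (no pure-strategy equilibrium)
```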