I didn't find this in any book, so I'm asking: does a game always have exactly one mixed-strategy Nash equilibrium, or can it have none? I already know that there can be several pure-strategy Nash equilibria (or none at all).
[Math] Mixed-strategy Nash equilibria
game-theory nash-equilibrium
Related Solutions
In general, when you face the problem of finding a mixed Nash equilibrium in a 2-player game, you can use the best response functions (BRFs). With the BRFs you can solve both finite table games and continuous games.
Let $S_1$ and $S_2$ be the sets of strategies for players 1 and 2 respectively, and let $x_1 \in S_1$ and $x_2 \in S_2$ be the strategies played by each player. The payoff functions are $f_1(x_1, x_2)$ for player 1 and $f_2(x_1,x_2)$ for player 2.
The BRF of player 1, $\beta_1(x_2)$, is a "function" which returns the best strategy (or strategies) that player 1 should choose when player 2 plays a given strategy $x_2$. The quotation marks are there because for every $x_2$ there can be more than one best strategy for player 1, so it is more correct to say that $\beta_1(x_2)$ is a set (i.e. $\beta_1$ is a correspondence). The same construction is used for $\beta_2(x_1)$. Namely, we have the following:
$$ \beta_1(x_2) = \{ x_1 \in S_1 : f_1(x_1,x_2) \geq f_1(y, x_2) ~ \forall y \in S_1\}$$ $$ \beta_2(x_1) = \{ x_2 \in S_2 : f_2(x_1,x_2) \geq f_2(x_1, z) ~ \forall z \in S_2\}$$
Once you have built the BRFs, you have to solve the following system:
$$ \left\{ \begin{array}{l} x_1^* \in \beta_1(x_2^*) \\ x_2^* \in \beta_2(x_1^*) \end{array}\right.$$
All the solutions $(x_1^*, x_2^*)$ are Nash equilibria. They can be mixed or pure.
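For a finite game, this mutual-best-response condition can be checked directly over all pure profiles. A minimal sketch in Python (the payoff matrices below are an assumed Prisoner's-Dilemma-style example, not taken from the text):

```python
import numpy as np

# Assumed 2x2 bimatrix game: rows are player 1's pure strategies,
# columns are player 2's. A holds player 1's payoffs, B player 2's.
A = np.array([[3, 0],
              [5, 1]])
B = np.array([[3, 5],
              [0, 1]])

def best_response_1(col):
    """beta_1: set of row indices maximizing f_1 against column `col`."""
    payoffs = A[:, col]
    return set(np.flatnonzero(payoffs == payoffs.max()))

def best_response_2(row):
    """beta_2: set of column indices maximizing f_2 against row `row`."""
    payoffs = B[row, :]
    return set(np.flatnonzero(payoffs == payoffs.max()))

# A pure profile (r, c) is a Nash equilibrium iff each strategy
# belongs to the best-response set against the other.
equilibria = [(r, c)
              for r in range(A.shape[0])
              for c in range(A.shape[1])
              if r in best_response_1(c) and c in best_response_2(r)]
print(equilibria)  # → [(1, 1)]
```

This only enumerates pure profiles; mixed equilibria require the continuous extension discussed below.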
Example with continuous strategies
A classical example is the Cournot duopoly. Two firms produce the same good and act on the same market. They must decide the quantities $x_1$ and $x_2$ of the good to produce. Produced quantities must not be greater than the demand $D$, so $S_1 = S_2 = [0, D]$.
The firms are different in the sense that they have different costs of production (say $c_1$ and $c_2$ are the unit costs for firm 1 and firm 2 respectively). The payoff functions are:
$$f_1(x_1,x_2) = k(D - x_1 - x_2)x_1 - c_1x_1$$ $$f_2(x_1,x_2) = k(D - x_1 - x_2)x_2 - c_2x_2$$
where $k$ is a positive constant.
The way to evaluate the BRFs is to maximize each payoff function with respect to the player's own strategy while the opponent's strategy is held fixed. We use derivatives to maximize (note that, since $k>0$, each payoff function has a negative second derivative in its own variable, which guarantees that the stationary point is a maximum):
$$\frac{\partial f_1}{\partial x_1} = k(D - x_2) - 2kx_1 - c_1$$ $$\frac{\partial f_2}{\partial x_2} = k(D - x_1) - 2kx_2 - c_2$$
and we equate them to 0 in order to find the maximum:
$$ \left\{ \begin{array}{l} \frac{\partial f_1}{\partial x_1} = 0 \Rightarrow x_1 = \frac{k(D-x_2)-c_1}{2k} = \beta_1(x_2) \\ \frac{\partial f_2}{\partial x_2} = 0 \Rightarrow x_2 = \frac{k(D-x_1)-c_2}{2k} = \beta_2(x_1) \end{array}\right.$$
The last equalities on the right hold since, for every fixed $x_2$ (resp. $x_1$), we can find the best $x_1$ (resp. $x_2$) that firm 1 (resp. firm 2) can adopt. It is worth noting that in this case the BRFs are ordinary real-valued functions, since each opponent strategy has a unique best response.
At this point, we can solve the system:
$$ \left\{ \begin{array}{l} x_1^* = \beta_1(x_2^*) \Rightarrow x_1^* = \frac{kD-2c_1+c_2}{3k} \\ x_2^* = \beta_2(x_1^*) \Rightarrow x_2^* = \frac{kD-2c_2+c_1}{3k} \end{array}\right.$$
and you obtain the Nash equilibrium!
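As a sanity check, the closed-form equilibrium can be verified numerically against the best-response functions; the parameter values below are arbitrary assumptions:

```python
# Numerical check of the Cournot best responses and equilibrium.
# k, D, c1, c2 are assumed illustrative values.
k, D, c1, c2 = 1.0, 10.0, 1.0, 2.0

beta1 = lambda x2: (k * (D - x2) - c1) / (2 * k)  # firm 1's best response
beta2 = lambda x1: (k * (D - x1) - c2) / (2 * k)  # firm 2's best response

# Closed-form equilibrium from the derivation above
x1_star = (k * D - 2 * c1 + c2) / (3 * k)
x2_star = (k * D - 2 * c2 + c1) / (3 * k)

# At equilibrium, each firm's quantity is its own best response
# to the other's equilibrium quantity.
print(x1_star, beta1(x2_star))  # both should agree
print(x2_star, beta2(x1_star))
```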
About the usage of BRFs with finite table games
In this case, you have two payoff matrices, say $A, B \in \mathbb{R}^{2 \times 2}$. From these you can build your payoff functions:
$$f_1(x_1,x_2) = [x_1 ~~~(1-x_1)]~A~[x_2 ~~~(1-x_2)]^T$$ $$f_2(x_1,x_2) = [x_1 ~~~(1-x_1)]~B~[x_2 ~~~(1-x_2)]^T$$
At this point, you act as in the previous example. Note that $S_1 = S_2 = [0, 1]$ instead of $\{0, 1\}$ because you have to extend to a continuous situation if you want to find the mixed equilibria.
In this case, the BRFs will give you the mixed Nash equilibria (if at least one exists). Sometimes they will also find pure equilibria, but in general you have to check the boundary of the strategy square $[0,1] \times [0,1]$ separately, since the interior first-order conditions can miss equilibria that lie on the border.
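To make this concrete, here is a sketch for Matching Pennies (an assumed example, not discussed above), a game with no pure equilibrium whose unique mixed equilibrium falls out of the indifference conditions:

```python
import numpy as np

# Matching Pennies: zero-sum, no pure equilibrium,
# one fully mixed equilibrium at x1 = x2 = 1/2.
A = np.array([[ 1, -1],
              [-1,  1]])  # player 1's payoffs
B = -A                    # player 2's payoffs (zero-sum)

def f1(x1, x2):
    """Expected payoff of player 1 under mixes x1, x2 (bilinear form)."""
    return np.array([x1, 1 - x1]) @ A @ np.array([x2, 1 - x2])

def f2(x1, x2):
    """Expected payoff of player 2 under mixes x1, x2."""
    return np.array([x1, 1 - x1]) @ B @ np.array([x2, 1 - x2])

# Interior equilibrium: each player's mix makes the other indifferent.
# Player 1 indifferent between rows: f1(1, x2) = f1(0, x2)
#   => x2 - (1 - x2) = -x2 + (1 - x2)  =>  x2 = 1/2; similarly x1 = 1/2.
x1_star = x2_star = 0.5

# Check the indifference: against x2*, any mix gives player 1 the same payoff.
print(f1(0.3, x2_star), f1(0.9, x2_star))  # both 0.0
```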
If you like, you can think of a pure strategy as a mixed strategy in which a player has a 100% chance of picking a certain strategy.
The equilibrium definition is the same for both pure and mixed strategy equilibria ("even after announcing your strategy openly, your opponents can make any choice without affecting their expected gains"). The difference is that in a mixed equilibrium, you are announcing your probability distribution, not the strategy that it randomly produces.
Example: Rock-Paper-Scissors. There are no pure strategy equilibria: if I announce "I'm definitely going to play Rock!" then clearly my opponent will choose Paper; if I know they're going to play Paper then I don't want to play Rock anymore, so this is not stable. However, if I announce "I'm going to secretly roll a die, play Rock if it shows 1-2, Scissors for 3-4, and Paper for 5-6!" then my opponent is equally happy with any choice he makes. If he therefore chooses the same strategy as me, then I am equally happy with any choice I make, so this is a mixed equilibrium.
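The indifference claim is easy to verify numerically. A small sketch (the $\pm 1/0$ payoff convention is an assumption):

```python
import numpy as np

# Rock-Paper-Scissors payoffs for the row player, rows/cols ordered
# (Rock, Paper, Scissors): +1 for a win, -1 for a loss, 0 for a tie.
A = np.array([[ 0, -1,  1],
              [ 1,  0, -1],
              [-1,  1,  0]])

mix = np.array([1/3, 1/3, 1/3])  # the announced "die roll" strategy

# By symmetry the opponent's payoff matrix is also A, so A @ mix gives
# the opponent's expected payoff for each pure reply against the mix.
payoffs = A @ mix
print(payoffs)  # [0. 0. 0.] -- every reply is equally good, so he's indifferent
```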
Also, your statement "the linear equations of the mixed one can only give one or infinite number of results" isn't true - there are many games with an in-between number of equilibria (Chicken, for example, has 3 in a two-player version of the game). As an aside, a randomly-generated game will have a finite, odd number of equilibria with probability 1, but that's about all you can say about the number of equilibria.
Best Answer
Pure strategies can be seen as special cases of mixed strategies, in which some strategy is played with probability $1$. In a finite game, there is always at least one mixed strategy Nash equilibrium. This has been proven by John Nash[1].
There can be more than one mixed (or pure) strategy Nash equilibrium and in degenerate cases, it is possible that there are infinitely many. In a well-defined sense (open and dense in payoff-space), almost every finite game has a finite and odd number of mixed strategy Nash equilibria.
A typical example of a game with more than one equilibrium is Battle of the Sexes, which has two pure strategy equilibria and one completely mixed equilibrium, meaning every strategy is played with positive probability.
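The three equilibria of Battle of the Sexes can be checked directly; the payoff values below are one common (assumed) parametrization:

```python
import numpy as np

# Battle of the Sexes, strategies (Opera, Football): the players want to
# coordinate but prefer different events. Assumed payoffs 2/1 on coordination.
A = np.array([[2, 0],
              [0, 1]])  # player 1's payoffs
B = np.array([[1, 0],
              [0, 2]])  # player 2's payoffs

def payoffs(p, q):
    """Expected payoffs (player 1, player 2) when p, q are the
    probabilities of the first strategy for each player."""
    pv, qv = np.array([p, 1 - p]), np.array([q, 1 - q])
    return pv @ A @ qv, pv @ B @ qv

# Pure equilibria: (Opera, Opera) -> p = q = 1, (Football, Football) -> p = q = 0.
# Completely mixed equilibrium from the indifference conditions:
#   player 2 indifferent: 1*p = 2*(1-p)  =>  p* = 2/3
#   player 1 indifferent: 2*q = 1*(1-q)  =>  q* = 1/3
p_star, q_star = 2/3, 1/3

# Against q*, player 1 earns the same from either pure strategy:
print(payoffs(1, q_star)[0], payoffs(0, q_star)[0])  # both 2/3
```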
[1]: J. Nash, "Non-Cooperative Games", Annals of Mathematics 54 (1951). http://www.cs.upc.edu/~ia/nash51.pdf