[Math] Definition of a game in game theory

game-theory, reference-request

I've been looking for a definition of a game in game theory. I'd like to know if there is a definition shorter than that of von Neumann and Morgenstern in Theory of Games and Economic Behavior, and not as vague as "interactive decision problem" or "situation of conflict, or any other kind of interaction". I've started studying the proof of the existence of Nash equilibria via Brouwer's fixed-point theorem, and I'm hoping to find a definition that lets me understand concepts such as normal-form games and mixed strategies without excessive complexity. I'd appreciate any bibliographic suggestions. Thank you!

Best Answer

I know this question already has an accepted answer, but games are usually defined according to their form and their information structure. Therefore, the definition of a normal-form game differs from that of, say, an extensive-form game of incomplete information. I usually define a game in normal form (the simplest possible form) as follows:

A normal-form game is a tuple
$$G = \langle N, (S_i)_{i \in N}, (u_i)_{i \in N} \rangle,$$
where $N = \{1, \dots, n\}$ is a finite set of players, $S_i$ is the nonempty set of pure strategies of player $i$, and $u_i : S_1 \times \cdots \times S_n \to \mathbb{R}$ is the payoff function of player $i$.
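For instance, the Prisoner's Dilemma fits this definition with $N = \{1, 2\}$, $S_1 = S_2 = \{C, D\}$, and (one standard choice of) payoffs

$$\begin{array}{c|cc} & C & D \\ \hline C & (-1,-1) & (-3,0) \\ D & (0,-3) & (-2,-2) \end{array}$$

where the first entry of each pair is $u_1$ and the second is $u_2$.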

Notice that normal-form games are usually represented by payoff matrices, as you probably know already. Also be aware that a mixed strategy can be defined as a probability distribution over the pure strategies of a given player, if that helps (see the sketch below). Finally, let me tell you that proving Nash's theorem might be easier using Kakutani's fixed-point theorem, since it applies directly to the (set-valued) best-response correspondence.
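To make the mixed-strategy idea concrete, here is a minimal Python sketch of my own (using the standard matching-pennies payoffs, not anything from the question) that checks by brute force that no pure-strategy equilibrium exists, and that the uniform mixture makes the opponent indifferent, which is exactly why $(1/2, 1/2)$ against $(1/2, 1/2)$ is a Nash equilibrium:

    import numpy as np

    # Matching pennies: the row player wins (+1) on a match, loses (-1) otherwise.
    A = np.array([[ 1, -1],
                  [-1,  1]])   # row player's payoffs
    B = -A                     # zero-sum game: column player's payoffs

    # No pure-strategy equilibrium: a cell (i, j) would have to be a mutual
    # best response, i.e. A[i, j] maximal in its column and B[i, j] in its row.
    pure_eq = [(i, j) for i in range(2) for j in range(2)
               if A[i, j] == A[:, j].max() and B[i, j] == B[i, :].max()]
    print("pure equilibria:", pure_eq)      # -> []

    # Against the uniform mixed strategy (1/2, 1/2), every pure strategy of the
    # opponent earns the same expected payoff, so no deviation is profitable.
    p = q = np.array([0.5, 0.5])
    print("row's payoffs vs q:", A @ q)     # -> [0. 0.]  (indifferent)
    print("col's payoffs vs p:", p @ B)     # -> [0. 0.]  (indifferent)

The indifference seen here is general: in any mixed equilibrium, every pure strategy played with positive probability must earn the same expected payoff.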

There are many bibliographical references out there, but you may choose the one you prefer depending on your needs. Two introductory but precise books are "A Primer in Game Theory" by Gibbons and "An Introduction to Game Theory" by Osborne. You may also like "A Course in Game Theory" by Osborne and Rubinstein, which is more advanced. (I have fully read only the second one and use the others occasionally as references, so I am just sharing my very personal opinion.)

Good luck!
