I've been looking for a definition of "game" in game theory. I'd like to know if there is a definition shorter than that of von Neumann and Morgenstern in Theory of Games and Economic Behavior and not as vague as "interactive decision problem" or "situation of conflict, or any other kind of interaction". I've started studying the proof of the existence of Nash equilibria via Brouwer's fixed-point theorem, and I hope to find a definition that lets me understand concepts such as normal-form games and mixed strategies without excessive complexity. I'd appreciate any bibliographic suggestions. Thank you!
[Math] Definition of game in game theory
game-theory · reference-request
Related Solutions
As @JordanMahar mentions, Fudenberg and Tirole is the standard graduate-level text. But I would start with Game Theory for Applied Economists by Gibbons. It is very readable.
Prerequisites for Gibbons are minimal. A little algebra and probability will do just fine.
There are many books on game theory. I am mostly familiar with those written by economists. The fact that the books listed below are (mostly) written by economists does not mean they are not rigorous! But it does mean that certain aspects are taken for granted, for example in the choice of examples covered.
Undergraduate books:
- Osborne "An Introduction to Game Theory" - Probably the most comprehensive introduction to game theory. It is undergraduate but uses math.
- Gibbons "Game Theory for Applied Economists" - Another good introduction.
These are probably the most mathematical and most complete undergraduate textbooks (though I might be unfamiliar with some newer ones).
Graduate books:
There are three classic textbooks for graduate level game theory.
- Fudenberg and Tirole - Game Theory
- Myerson - Game Theory: Analysis of Conflict
- Osborne and Rubinstein - A Course in Game Theory
Of these three I personally recommend the last one. It is the shortest of the three, but the most elegant. It covers the standard topics taught to economics PhD students in their first year, and you may be able to get it for free from Ariel Rubinstein's website. Fudenberg and Tirole covers a broader set of topics than Osborne and Rubinstein; importantly, it covers mechanism design and auctions (also core topics). But it covers a lot of material, and some of it is outside what I would call the "core". I have not used the book by Myerson, so I can't comment on it, but I have heard it is a nice companion to either of the other two.
More advanced books:
- Mailath and Samuelson - Repeated Games and Reputations: Long-Run Relationships
- Bolton Dewatripont - Contract Theory
- Zamir, Maschler & Solan - Game Theory
- Krishna - Auction Theory
Once you have learned the basics of game theory, it is time to choose more specialized topics. Mailath and Samuelson focuses on more recent developments in repeated games, with a particular focus on the role of reputation. Bolton and Dewatripont, as the name suggests, focus on the design of optimal contracts. The book by Zamir, Maschler & Solan is a great modern reference. It is really an encyclopedia of game theory, and I would never suggest anyone go through all of it. I doubt there are more than a handful of researchers who know ALL of that material. But it is a great way to get a quick start on a particular topic of interest. Finally, the book by Krishna is the reference for auction theory, which has found a lot of applications outside academia.
Another good reference is the "Handbook of Game Theory", which consists of three volumes. Again, it is really a reference for researchers who want a quick introduction to a particular topic rather than a textbook to learn from.
P.S. This guide is written from the perspective of an economist.
Best Answer
I know this question already has an accepted answer, but games are usually defined depending on their form and their information structure. The definition of a normal-form game is therefore different from that of, say, an extensive-form game of incomplete information. I usually define a game in normal form (its simplest possible form) as:
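A standard way to write this down (a sketch, using common notation such as Osborne and Rubinstein's) is as a tuple

$$G = \langle N, (S_i)_{i \in N}, (u_i)_{i \in N} \rangle,$$

where $N$ is a finite set of players, $S_i$ is the (nonempty) set of pure strategies available to player $i$, and $u_i \colon \prod_{j \in N} S_j \to \mathbb{R}$ is player $i$'s payoff function.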
Notice that normal-form games are usually represented by matrices, as you probably know already. Also be aware that a mixed strategy can be defined as a probability distribution over the pure strategies of a given player, if that helps. Finally, let me tell you that proving Nash's theorem may be easier using Kakutani's fixed-point theorem.
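To make the mixed-strategy and equilibrium conditions concrete, here is a minimal sketch in Python (plain Python, no external libraries; the game and the `is_nash` helper are illustrative, not from any particular textbook). It checks whether a mixed-strategy profile of a two-player bimatrix game is a Nash equilibrium, using the fact that it suffices to check pure-strategy deviations:

```python
# A minimal sketch: verifying a mixed-strategy Nash equilibrium of a
# 2-player bimatrix game. Matching pennies at (1/2, 1/2) is used as the
# example; it has a unique equilibrium, and it is in mixed strategies.

def expected_payoff(payoff, p, q):
    """Row player's expected payoff given mixed strategies p (rows), q (cols)."""
    return sum(p[i] * q[j] * payoff[i][j]
               for i in range(len(p)) for j in range(len(q)))

def is_nash(A, B, p, q, tol=1e-9):
    """True if (p, q) is a Nash equilibrium: no pure-strategy deviation
    improves either player's expected payoff."""
    v1 = expected_payoff(A, p, q)
    v2 = expected_payoff(B, p, q)  # B holds the column player's payoffs
    # Best pure deviation for each player: put all mass on one strategy.
    best1 = max(expected_payoff(A, [1.0 if k == i else 0.0 for k in range(len(p))], q)
                for i in range(len(p)))
    best2 = max(expected_payoff(B, p, [1.0 if k == j else 0.0 for k in range(len(q))])
                for j in range(len(q)))
    return best1 <= v1 + tol and best2 <= v2 + tol

A = [[1, -1], [-1, 1]]   # row player's payoffs (matching pennies)
B = [[-1, 1], [1, -1]]   # column player's payoffs (zero-sum)
print(is_nash(A, B, [0.5, 0.5], [0.5, 0.5]))  # True: (1/2, 1/2) is the NE
print(is_nash(A, B, [1.0, 0.0], [1.0, 0.0]))  # False: no pure profile is
```

The reduction to pure-strategy deviations works because a player's expected payoff is linear in their own mixed strategy, so the best response value is always attained at some pure strategy.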
As for bibliographical references, there are many out there, and you may choose the one you prefer depending on your needs. Two introductory yet precise books are "A Primer in Game Theory" by Gibbons and "An Introduction to Game Theory" by Osborne. You may also like "A Course in Game Theory" by Osborne and Rubinstein, which is more advanced (I have only read the second one; I use them occasionally as references, so I am just sharing my very personal and not fully informed opinion).
Good luck!