# All Science Fair Projects


# Nash equilibrium

In game theory, the Nash equilibrium (named after John Nash) is a solution concept for games involving two or more players. If there is a set of strategies for a game with the property that no player can benefit by changing his strategy while the other players keep their strategies unchanged, then that set of strategies and the corresponding payoffs constitute a Nash equilibrium.
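The definition can be made concrete in code. The following sketch (not from the original article; the function name, payoff matrices, and numbers are illustrative) checks whether a pair of pure strategies is a Nash equilibrium of a two-player game: each player's payoff at the candidate profile must match the best payoff available to them when the other player's choice is held fixed.

```python
def is_nash_equilibrium(payoff1, payoff2, row, col):
    """Check whether (row, col) is a pure-strategy Nash equilibrium.
    payoff1[r][c] and payoff2[r][c] are the payoffs to players 1 and 2
    when player 1 plays strategy r and player 2 plays strategy c."""
    # Player 1 must have no profitable deviation in their row choice.
    best_for_1 = max(payoff1[r][col] for r in range(len(payoff1)))
    # Player 2 must have no profitable deviation in their column choice.
    best_for_2 = max(payoff2[row][c] for c in range(len(payoff2[0])))
    return payoff1[row][col] == best_for_1 and payoff2[row][col] == best_for_2


# A small illustrative 2x2 game:
p1 = [[3, 0],
      [5, 1]]
p2 = [[3, 5],
      [0, 1]]
print(is_nash_equilibrium(p1, p2, 1, 1))  # True: neither player can gain alone
print(is_nash_equilibrium(p1, p2, 0, 0))  # False: player 1 gains by switching
```

Note that the check only considers unilateral deviations; whether the players could jointly do better is irrelevant to the definition.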

The concept of the Nash equilibrium originated in Nash's dissertation, Non-cooperative games (1950). Nash showed that the various solutions for games that had been proposed earlier all yield Nash equilibria.

A game may have many Nash equilibria, or none. The Brouwer fixed point theorem provides sufficient, though not necessary, conditions for the existence of a Nash equilibrium. Brouwer proved that every continuous function f: S → S mapping a non-empty, compact and convex set S into itself has a fixed point, that is, a point x* such that x* = f(x*). In a game context, if each player's strategy set is non-empty, compact and convex, and every player's payoff function is continuous and quasi-concave in that player's own strategy, then the game has a Nash equilibrium.

A game can have a pure-strategy Nash equilibrium or a Nash equilibrium in its mixed extension (that is, allowing a probability distribution over strategies).

Nash was able to prove that, if we allow mixed strategies (players choose strategies randomly according to preassigned probabilities), then every n-player game in which every player can choose from finitely many strategies admits at least one Nash equilibrium of mixed strategies.

If a game has a unique Nash equilibrium and is played among completely rational players, then the players will choose the strategies that form the equilibrium.


## Examples

### Competition game

Consider the following two-player game: both players simultaneously choose a whole number from 0 to 10. Both players then win the minimum of the two numbers in dollars. In addition, if one player chooses a larger number than the other, then he has to pay \$2 to the other. This game has a unique Nash equilibrium: both players have to choose 0. Any other choice of strategies can be improved if one of the players lowers his number to one less than the other player's number. If the game is modified so that the two players win the named amount if they both choose the same number, and otherwise win nothing, then there are 11 Nash equilibria.
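The claims about this game can be checked by brute force, since each player has only eleven strategies. The sketch below (my own illustration; function names are not from the article) enumerates every strategy pair and keeps those from which neither player can profitably deviate, for both the original game and the modified same-number variant.

```python
def pure_nash_equilibria(payoff):
    """Enumerate pure-strategy Nash equilibria of a symmetric two-player
    game. payoff(a, b) is player 1's payoff when player 1 chooses a and
    player 2 chooses b; by symmetry, player 2's payoff is payoff(b, a)."""
    choices = range(11)  # whole numbers from 0 to 10
    equilibria = []
    for a in choices:
        for b in choices:
            # Keep (a, b) if no unilateral deviation improves either payoff.
            if (all(payoff(a2, b) <= payoff(a, b) for a2 in choices) and
                    all(payoff(b2, a) <= payoff(b, a) for b2 in choices)):
                equilibria.append((a, b))
    return equilibria


def competition(a, b):
    # Both win min(a, b); whoever named the larger number pays $2 to the other.
    if a > b:
        return min(a, b) - 2
    if a < b:
        return min(a, b) + 2
    return a


def modified(a, b):
    # Win the named amount only if both players choose the same number.
    return a if a == b else 0


print(pure_nash_equilibria(competition))       # [(0, 0)] -- unique
print(len(pure_nash_equilibria(modified)))     # 11 -- one per matching pair
```

The enumeration confirms the text: in the original game only (0, 0) survives, because from any matched pair (k, k) with k > 0 a player gains by naming k − 1 and collecting the $2 transfer, while in the modified game every matched pair (0, 0) through (10, 10) is an equilibrium.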

### Coordination game

The coordination game is a classic (symmetric) two player, two strategy game, with payoff matrix

|                            | Player 2 adopts strategy 1 | Player 2 adopts strategy 2 |
|----------------------------|----------------------------|----------------------------|
| Player 1 adopts strategy 1 | A                          | B                          |
| Player 1 adopts strategy 2 | C                          | D                          |

where the payoffs satisfy A > C and D > B. The players thus have to agree on one of the two strategies in order to receive a high payoff; if they fail to coordinate, a lower payoff results. An example of a coordination game is the setting in which two firms with compatible products must each elect one of two available technologies as the market standard. If both firms agree on the chosen technology, high sales are expected for both. If the firms do not agree on the standard technology, few sales result. Both coordinated strategy pairs are Nash equilibria of the game.

Driving on a road, and having to choose either to drive on the left or to drive on the right of the road, is also a coordination game. For example, with payoffs 100 meaning no crash and 0 meaning a crash, the coordination game can be defined with the following payoff matrix:

|                    | Drive on the Left | Drive on the Right |
|--------------------|-------------------|--------------------|
| Drive on the Left  | 100               | 0                  |
| Drive on the Right | 0                 | 100                |

In this case there are two pure-strategy Nash equilibria: both players drive on the left, or both drive on the right. If we admit mixed strategies (where a pure strategy is chosen at random, subject to some fixed probability), then there are three Nash equilibria: the two pure-strategy equilibria, in which each player plays one side with probability 100%, and a third in which each player drives on either side with probabilities (50%, 50%).
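The (50%, 50%) profile is an equilibrium because of an indifference property: against an opponent who mixes 50/50, every strategy of your own yields the same expected payoff, so no unilateral deviation can improve on it. A minimal sketch, using the driving-game payoffs above (the function name is my own):

```python
def expected_payoff(p_left_self, p_left_other):
    """Expected payoff in the driving game when you drive on the left
    with probability p_left_self and the opponent with p_left_other.
    Payoffs: 100 for matching sides, 0 for a crash."""
    p_match = (p_left_self * p_left_other
               + (1 - p_left_self) * (1 - p_left_other))
    return 100 * p_match


# Against a 50/50 opponent, every strategy gives the same expected payoff,
# so (50%, 50%) against (50%, 50%) is a Nash equilibrium.
print(expected_payoff(0.5, 0.5))   # 50.0
print(expected_payoff(1.0, 0.5))   # 50.0
print(expected_payoff(0.25, 0.5))  # 50.0
```

The same computation shows why the two pure equilibria give a higher payoff (100 each) than the mixed one (50 each): mixing makes a crash happen half the time.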

### Prisoner's dilemma

The Prisoner's dilemma has one Nash equilibrium: when both players defect. However, "both defect" is inferior to "both cooperate", in the sense that the total jail time served by the two prisoners is greater if both defect. The strategy "both cooperate" is unstable, as a player could do better by defecting while their opponent still cooperates. Thus, "both cooperate" is not an equilibrium. As Ian Stewart put it, "sometimes rational decisions aren't sensible!"
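This can be verified numerically. The sketch below uses one common choice of jail terms (the specific numbers are an assumption, not from the article): one year each if both cooperate, three each if both defect, and a lone defector goes free while the other serves five. Since jail time is a cost, a profitable deviation is one that reduces your own term.

```python
COOPERATE, DEFECT = 0, 1

# years[(a, b)] = (player 1's jail term, player 2's jail term).
# Illustrative classic values; the article does not fix specific numbers.
years = {(COOPERATE, COOPERATE): (1, 1),
         (COOPERATE, DEFECT):    (5, 0),
         (DEFECT,    COOPERATE): (0, 5),
         (DEFECT,    DEFECT):    (3, 3)}


def is_equilibrium(a, b):
    # Fewer years is better: no unilateral switch may shorten your own term.
    return (all(years[(a2, b)][0] >= years[(a, b)][0] for a2 in (0, 1)) and
            all(years[(a, b2)][1] >= years[(a, b)][1] for b2 in (0, 1)))


equilibria = [(a, b) for a in (0, 1) for b in (0, 1) if is_equilibrium(a, b)]
print(equilibria)  # [(1, 1)]: only (DEFECT, DEFECT)
```

Only mutual defection survives, even though it totals six years served against two for mutual cooperation, which is exactly the tension the text describes.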

## Stability

The concept of stability, useful in the analysis of many kinds of equilibria, can also be applied to Nash equilibria.

A Nash equilibrium for a mixed-strategy game is stable if a small change (specifically, an infinitesimal change) in probabilities for one player leads to a situation where two conditions hold:

1. the player who did not change has no better strategy in the new circumstance
2. the player who did change is now playing with a strictly worse strategy

If both conditions are met, then a player who makes the small change in their mixed strategy will return immediately to the Nash equilibrium. The equilibrium is then said to be stable. If condition one does not hold, the equilibrium is unstable. If only condition one holds, then there are likely to be an infinite number of optimal strategies for the player who changed. John Nash showed that the latter situation could not arise in a range of well-defined games.

We have both stable and unstable equilibria in the Coordination game example above.

The equilibria in which each player plays one side with probability 100% are stable. If either player changes their probabilities slightly, both players are worse off, but the player who did not change has no better strategy available, and the player who did change is playing strictly worse, so play returns to the equilibrium.

In the case of the (50%,50%) equilibrium, there is instability. If either player changes their probabilities, then the other player immediately has a better strategy at either (0%, 100%) or (100%, 0%).
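This instability can be made concrete with the driving-game payoffs: an arbitrarily small perturbation away from 50% makes the opponent's best response a pure strategy. A sketch (function names are my own):

```python
def best_response(p_left_other):
    """Best-response probability of driving left in the driving game
    (payoffs: 100 for matching sides, 0 for a crash), against an opponent
    who drives left with probability p_left_other."""
    # Driving left yields 100 * p_left_other in expectation;
    # driving right yields 100 * (1 - p_left_other).
    if p_left_other > 0.5:
        return 1.0   # always drive left
    if p_left_other < 0.5:
        return 0.0   # always drive right
    return 0.5       # indifferent: every mixture is a best response


# Any tiny deviation from 50% flips the opponent to a pure strategy,
# so the mixed equilibrium does not survive small perturbations.
print(best_response(0.51))  # 1.0
print(best_response(0.49))  # 0.0
print(best_response(0.50))  # 0.5
```

By contrast, near the pure equilibria a small perturbation leaves the opponent's best response unchanged, which is the stability described above.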

Stability is crucial in practical applications of Nash equilibria, since the mixed strategy of each player is not perfectly known, but has to be inferred from the statistical distribution of their actions in the game. In such cases, unstable equilibria are very unlikely to arise in practice, since any minute change in the observed proportions of each strategy will lead to a change in strategy and the breakdown of the equilibrium.

Note that stability of the equilibrium is connected to, but not the same thing as the stability of a strategy.