The Values of Cooperation

Every day you encounter strangers on the street, people you have never met before and may never meet again.  Sometimes these encounters turn into commercial transactions, such as buying a cup of coffee or filling your car with gas.  What keeps you from bending the rules to maximize your personal gain?  Certainly it would be nice if you could fill up your tank and simply drive away without paying.  Is the threat of a fine or jail time reason enough to behave cooperatively?  Or are there other factors, perhaps even subconscious ones, that make cooperation the logically superior choice?

Interestingly enough, the effects and consequences of cooperation have been tested and re-tested in a series of game theory simulations.

The Prisoner’s Dilemma is a game in which two individuals are placed at odds with one another and given two clear-cut choices: cooperation or defection.  The rules are simple: each player independently decides whether or not to cooperate with the other.  When both players cooperate, each receives a middling reward of 3 points (we’ll denote this C).  In the case of mutual defection, in which neither player cooperates, each is punished for the lack of cooperation and given only 1 point (denoted D).  However, when one individual cooperates and the other defects, the cooperative player receives the sucker’s payoff, S, of 0 points, while the defecting player receives the temptation payoff, T, of 5 points.  The payoffs are therefore ordered T>C>D>S, with point values of 5, 3, 1, 0.

A question arises: when does it make sense to cooperate?  In a single interaction, the best option is clearly to defect and grab the maximum 5 points.  Yet individual rationality leads to a worse outcome for both players, since the 3 points each would have earned by cooperating is greater than the 1 point each earns by defecting; hence the dilemma.
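To make the payoff structure concrete, here is a minimal Python sketch, not taken from Axelrod’s book, that encodes the point values above and checks that defection earns more in a single round no matter what the other player does (the PAYOFFS table and the score function are purely illustrative names):

```python
# Moves: 'C' = cooperate, 'D' = defect.  Point values follow the
# T > C > D > S ordering described above (5, 3, 1, 0).

PAYOFFS = {
    ('C', 'C'): (3, 3),  # mutual cooperation
    ('D', 'D'): (1, 1),  # mutual defection
    ('C', 'D'): (0, 5),  # sucker vs. temptation
    ('D', 'C'): (5, 0),
}

def score(my_move, their_move):
    """Return (my points, their points) for a single round."""
    return PAYOFFS[(my_move, their_move)]

# Whatever the other player does, defecting earns more in a one-shot game:
for their_move in ('C', 'D'):
    print(their_move,
          "-> if I cooperate:", score('C', their_move)[0],
          " if I defect:", score('D', their_move)[0])
```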

However, this is just the single-shot case; what if the individuals might meet again?  Would that change their answers?  To find out, Robert Axelrod organized a tournament in which computer programs were submitted to play an iterated Prisoner’s Dilemma against one another, to see which strategy was most effective, i.e. which received the highest “score.”  The conclusion of the experiment was that the most effective strategy was TIT FOR TAT.

TIT FOR TAT was one of the simplest strategies submitted, yet, as the results proved, the most effective.  Its policy is straightforward – start by cooperating, retaliate when defected against, yet always remain forgiving and willing to return to cooperation.  What is important to note about this game, and about life in general, is that it is not a zero-sum game.  That is, there are plenty of ways for individuals to interact with mutually beneficial gains on both ends.
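A rough sketch of the TIT FOR TAT policy and of an iterated game, reusing the score helper from the earlier snippet; the function names and the 200-round game length are illustrative assumptions, not details of Axelrod’s tournament:

```python
def tit_for_tat(my_history, their_history):
    """Cooperate first; afterwards simply copy the other player's last move."""
    return 'C' if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    """A maximally 'mean' strategy: defect every round."""
    return 'D'

def play(strategy_a, strategy_b, rounds=200):
    """Play an iterated Prisoner's Dilemma and return both total scores."""
    history_a, history_b = [], []
    total_a = total_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a, history_b)
        move_b = strategy_b(history_b, history_a)
        pts_a, pts_b = score(move_a, move_b)
        total_a += pts_a
        total_b += pts_b
        history_a.append(move_a)
        history_b.append(move_b)
    return total_a, total_b

print(play(tit_for_tat, tit_for_tat))    # mutual cooperation: (600, 600)
print(play(tit_for_tat, always_defect))  # retaliation kicks in: (199, 204)
```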

Cooperation derives its power from the possibility of another encounter between the same two people.  Although the short-sighted player may see that the immediate gains of defection are greater than those of cooperation, the possibility of meeting again means that the choices made today determine not only the immediate outcome, but the subsequent ones as well.

Even the most egotistical and self-centered people come to realize that they gain more through cooperation.  Not surprisingly, the computer programs in the tournament that scored best were those that made sure never to be the first to defect, which we will call “nice” programs.

Intrigued by the results, Axelrod then conducted a follow-up experiment.  He published the results of the first tournament, making it clear and explicit that TIT FOR TAT had the highest score, to see whether programmers would learn from their mistakes.  Not surprisingly, more “nice” programs were submitted this time than “mean” programs – those that try to gain a higher score by sneaking in an occasional defection.

Such programs, however, once again proved to be failures, as TIT FOR TAT again received the highest score.  What this case study reveals is that it does not pay to be greedy.  By trying to sneak in the occasional defection, “mean” programs set off a long chain of recriminations and counter-recriminations.  Thus, a single defection can produce an echo effect, with both sides suffering the consequences.  TIT FOR TAT’s success comes from its clarity.  “Its niceness prevents it from getting into unnecessary trouble.  Its retaliation discourages the other side from persisting whenever defection is tried.  Its forgiveness helps restore mutual cooperation” (54).  Entries in the tournament were too competitive for their own good.  Even expert strategists did not give enough consideration to the importance of forgiveness.
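To illustrate the echo effect, here is a hypothetical “mean” variant of TIT FOR TAT that sneaks in a single defection (the choice of round 10 is arbitrary), again reusing the helpers sketched above:

```python
def sneaky(my_history, their_history):
    """TIT FOR TAT with one sneaked-in defection (round 10 here is arbitrary)."""
    if len(my_history) == 9:
        return 'D'
    return 'C' if not their_history else their_history[-1]

# One defection against TIT FOR TAT triggers an endless back-and-forth of
# retaliations, and both totals fall well below steady mutual cooperation:
print(play(tit_for_tat, tit_for_tat))  # (600, 600)
print(play(tit_for_tat, sneaky))       # (502, 507) with this setup
```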

The mutual gains of cooperation can be illustrated through countless everyday examples.  Take taxes, for example.  It is safe to assume that no one enjoys paying them.  Yet by paying taxes, and ignoring the temptation of the immediate gain enjoyed by not paying, we all relish the benefits in the long run: better schooling, roads, safety, and health.

One particular case study in which the benefits of cooperation were exemplified, surprisingly enough, was trench warfare in WWI.  As the trench lines stabilized, nonaggression emerged between the battalions positioned across from each other.  A standard policy of reciprocation took effect: for every one of our men killed, we’ll aim to kill two of yours, and vice versa.

While targeting the enemy’s supply wagons may limit their rations, their response is quite simple: they will prevent you from drawing yours.  A live-and-let-live system became established, lasting until senior officers forced raids on the opposing trenches.  The TIT FOR TAT approach is on display here: soldiers cooperatively maintained nonaggression where it could be sustained, yet demonstrated their capability to retaliate when needed.  Arguably the most important lesson this provides is that friendship is not required for cooperation to work.  Under certain circumstances, cooperation still makes the most sense, even among antagonists.

However, we cannot always expect others to be so willing to cooperate.  There remain individuals who are enticed by the immediate gains of defection for their own benefit, but can this behavior be changed?  Axelrod’s results show us that it can.  In a society composed of “mean” entities, it takes a mere 5% of “nice,” like-minded individuals to challenge the governing norm.  That’s it.  All it takes is 5% of the population cooperating and working together to undermine the short-sighted immediate gains of defection.


“Never underestimate the power of a small group of committed people to change the world.  In fact, it is the only thing that ever has.”

-Margaret Mead


This is because a population of “nice” individuals works so well together.  Though the majority of their interactions were with “mean” individuals, whenever they were paired with another player willing to cooperate, their gains were substantial.  Secondly, Axelrod’s experiment revealed that people’s strategies can change.  In the second round of the Prisoner’s Dilemma tournament, a majority of the submitted programs were “nice,” whereas “mean” programs had made up the majority in the first tournament.  The better a strategy does, the more representation it gains.  The process essentially simulates survival of the fittest, and it shows that a small group of cooperative individuals can overcome the short-sightedness of those unwilling to cooperate.
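To see why such a small cluster can suffice, here is a rough population simulation built on the same play helper; the population size of 100, the round-robin format, and the 200-round games are my own illustrative assumptions, with only the 5% split taken from the discussion above:

```python
# A population of 100: 5 "nice" TIT FOR TAT players among 95 defectors.
# Everyone plays everyone else once; we compare average scores per game.
population = [tit_for_tat] * 5 + [always_defect] * 95

totals = [0] * len(population)
games = [0] * len(population)
for i in range(len(population)):
    for j in range(i + 1, len(population)):
        score_i, score_j = play(population[i], population[j])
        totals[i] += score_i
        totals[j] += score_j
        games[i] += 1
        games[j] += 1

nice_avg = sum(totals[:5]) / sum(games[:5])
mean_avg = sum(totals[5:]) / sum(games[5:])
print(nice_avg, mean_avg)  # roughly 215 vs. 200: the small cluster comes out ahead
```

The cooperators lose a little in each game against a defector, but the handful of high-scoring games they play with each other is enough to lift their average above that of the defectors, who only ever earn the meager mutual-defection payoff among themselves.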

However, to experience the full benefits of cooperation, there are a few considerations to keep in mind.  Arguably the most important among these is to avoid being envious.  As stated earlier, much of life is not zero-sum; both parties can gain from an interaction, albeit sometimes at different levels.  Yet most people resort to the only standard of comparison readily available – the success of the other player.

While it may be demoralizing to see the other player doing better, it shouldn’t matter.  The proof is in the TIT FOR TAT strategy.  Although it was overall the most successful strategy in the tournaments, it never won a single game.  In fact, it couldn’t; TIT FOR TAT always lets the other player defect first, so the best it can possibly do is tie.
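The same helpers make this easy to check: TIT FOR TAT never comes out ahead in any individual pairing, yet its cumulative score is what carries the tournament (the opponents below are just the illustrative strategies defined earlier):

```python
for opponent in (tit_for_tat, always_defect, sneaky):
    mine, theirs = play(tit_for_tat, opponent)
    print(f"vs {opponent.__name__}: TIT FOR TAT {mine}, opponent {theirs}")

# vs tit_for_tat:   600 vs 600  (a tie)
# vs always_defect: 199 vs 204  (a narrow loss)
# vs sneaky:        502 vs 507  (another narrow loss)
```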

“TIT FOR TAT won the tournament, not by beating the other player, but by eliciting behavior from the other player which allowed both to do well” (112).  The other player in the Prisoner’s Dilemma should not be regarded as someone you are trying to compete with.  Both sides can, in fact, benefit from mutual cooperation.

In this world we live in, we are always adapting – adapting to the environment and to the people it is composed of.  By individually adopting this mindset of cooperation, we are not only bettering ourselves, but bettering the world around us as well.