The model we developed focuses on a single encounter between two civilizations. Its main purpose is to determine the most reasonable strategy for an emerging civilization. As can be seen from our results, presented on the "Conclusion" page, almost three quarters of all encounters result in one of the participating civilizations not surviving. On the other hand, due to the specifics of our model, any civilization that decides to hide will survive. Cooperation, as a strategy, is quite a dangerous endeavor: it proved beneficial in only 12.18% of all encounters.
It is important to note, however, that since we cannot establish real values for some of the parameters of the model, our conclusions may differ from the real world. For example, the assumed 50% chance of discovering another civilization in close vicinity may in reality be as high as 100% or as low as 0%.
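This sensitivity can be probed directly with a Monte Carlo sweep over the unknown discovery probability. The sketch below is illustrative only: the per-strategy survival chances when exposed are assumptions, not the exact rules of our encounter model.

```python
import random

def encounter(strategy, p_discover, rng):
    """Return True if the focal civilization survives one encounter.

    Outcome rules are illustrative assumptions, not the model's exact rules.
    """
    if rng.random() >= p_discover:
        return True                  # never discovered: trivially survives
    if strategy == "hide":
        return True                  # per the model, hiding always survives
    if strategy == "cooperate":
        return rng.random() < 0.5    # assumed survival chance once exposed
    return rng.random() < 0.26       # "attack": assumed survival chance

def survival_rate(strategy, p_discover, trials=100_000, seed=0):
    """Estimate survival probability for a strategy at a given p_discover."""
    rng = random.Random(seed)
    return sum(encounter(strategy, p_discover, rng) for _ in range(trials)) / trials

# Sweep the unknown discovery probability to see how conclusions shift.
for p in (0.0, 0.5, 1.0):
    print(p, survival_rate("cooperate", p))
```

Note that hiding is insensitive to `p_discover` by construction, while the attractiveness of cooperation degrades as discovery becomes more likely, which is exactly why the unknown parameter matters.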
Our results make the concept of the Great Filter seem like a reasonable one. If the real values are close to those produced by our model, then it is indeed beneficial to hide, as this strategy almost guarantees the survival of the civilization that adopts it.
"But in this dark forest, there's a stupid child called humanity, who has built a bonfire and is standing beside it shouting, 'Here I am! Here I am!'"
Liu Cixin, "The Dark Forest"
There are also a number of improvements that may be introduced to our model:
Taking into account all that is discussed above, our model is a solid basis for modeling a more general cosmic society. For example, it can be embedded in an evolutionary game theory framework.
Suppose we have a galactic society consisting of a number of independent civilizations, which can be regarded as a population. In light of recent discoveries made by the Kepler space telescope, there are reasons to believe that almost every star in the galaxy has a planetary system, and that many of those systems host planets located inside a Goldilocks zone. This means that a significantly large number of civilizations is plausible.
Game rules are defined by our encounter model, which may or may not be expanded with the suggestions stated previously. Another suggestion that applies to this scenario, and should be considered in this model, is knowledge sharing between cooperating civilizations: information about encounters, possible alliances, and possible technology sharing.
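A minimal sketch of such knowledge sharing might pool the encounter records of two cooperating civilizations. The record structure (pairs of a civilization identifier and an observed attitude) is an assumption for illustration, not part of the original model.

```python
def share_knowledge(records_a, records_b):
    """Pool encounter records of two cooperating civilizations.

    Each argument is a set of (civilization_id, observed_attitude) pairs;
    after sharing, both partners hold the union of what either knew.
    """
    merged = records_a | records_b
    return merged, merged

# Example: two civilizations exchange what they learned from past encounters.
a = {("civ-3", "malicious")}
b = {("civ-7", "benevolent")}
a2, b2 = share_knowledge(a, b)
```

After sharing, each partner can factor the other's observations into its own depth-of-suspicion reasoning without having met those civilizations itself.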
We suggest replication rules based on the number of successful encounters for each civilization.
Replication need not copy a society exactly: small mutations may still occur. For example, an increase or decrease in technology level may represent the progress or stagnation of a society, a change in the effective depth of suspicion may mirror previous experiences, and, of course, a change of attitude is also possible.
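The replication step described above can be sketched as fitness-proportional sampling with small mutations. The field names, mutation step sizes, and attitude-flip probability below are assumptions for illustration, not values from the original model.

```python
import random
from dataclasses import dataclass

@dataclass(frozen=True)
class Civilization:
    tech: float        # technology level
    suspicion: float   # effective depth of suspicion
    attitude: str      # "benevolent" or "malicious"

def mutate(civ, rng, step=0.1, p_flip=0.05):
    """Copy a civilization, applying small mutations (assumed magnitudes)."""
    attitude = civ.attitude
    if rng.random() < p_flip:  # occasional change of attitude
        attitude = "malicious" if attitude == "benevolent" else "benevolent"
    return Civilization(
        tech=max(0.0, civ.tech + rng.uniform(-step, step)),       # progress/stagnation
        suspicion=max(0.0, civ.suspicion + rng.uniform(-step, step)),
        attitude=attitude,
    )

def replicate(population, fitness, rng):
    """Next generation: sample parents in proportion to successful encounters."""
    weights = [fitness[c] + 1 for c in population]  # +1 keeps every weight positive
    parents = rng.choices(population, weights=weights, k=len(population))
    return [mutate(p, rng) for p in parents]
```

Iterating `replicate` over many generations, with fitness recomputed from fresh rounds of encounters, would let stable strategy mixes emerge rather than being fixed in advance.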
Preemptive strikes as used by benevolent civilizations can actually be found in real-world history. For example, the infamous Bush Doctrine describes a strategy of "preemptive strikes" as a defense against an immediate threat to the security of the United States. Reasoning about the possible attitude of the United States is left as an exercise to the reader.
One may argue that performing any kind of preemptive strike is not really "benevolent", but that question is outside the scope of this project.
In this article, which describes techniques used to provide nuclear deterrence during the Cold War, it is clearly stated that human agents cannot be expected to behave rationally in all situations; that is, humans are prone to launching a retaliatory strike even when everything else is already lost.