On this page, observations of various kinds will appear that are related to big history teaching, as well as to big history
in general.
HOW CAN WE EXAMINE CHANCE AND NECESSITY IN BIG HISTORY?
March 2, 2017
Introduction
How can we examine chance and necessity in big history? The thoughts expressed in this short essay were provoked by Walter Alvarez’s book A Most Improbable Journey: A Big History of Our Planet and Ourselves (2016). In the last chapter Alvarez argues that, because
everything in big history very much depends on chance and contingency, the current situation on our planet is most improbable.
I could not agree more with such a view. Yet we do observe our own planet, and a great many galaxies, stars, and now also a considerable
number of other planets. They are all different, of course, but they also exhibit certain similarities. That is exactly why we can
use the words ‘galaxies,’ ‘stars,’ and ‘planets’ to characterize these objects and, by doing so, put them into these specific categories.
This leads to the following question: because, as Walter Alvarez argues, virtually all processes are improbable seen from the point of
view of probability theory, how is it possible that processes leading to similar forms of greater complexity actually do occur?
At the beginning of chapter 2 of my book, I stated my general position on these questions as follows:
'explaining the past always
implies striking a balance between chance and necessity. This point of view had already been expressed by the natural philosopher
Democritus of ancient Greece (460–370 BCE), while more recently French biochemist Jacques Monod (1910–76 CE) said essentially the
same (with proper reference to Democritus). My explanatory scheme is about necessity. It consists of general trends that not only
make possible certain situations but also constrain them. Yet within these boundaries there is ample room for chance.'
So much for the quotation. In sum: even though chance effects are everywhere, they do not produce only pure chaos. How is that possible? What
causes these combined chance effects to produce all the levels of complexity that we witness today, including ourselves?
While writing my book, first published in 2010, I did not examine chance effects any further, although I did, for instance, point out the
need for constrained processes while discussing the emergence of life at the end of chapter 4.
In his 2012 keynote speech during the first IBHA conference in Allendale, Michigan, Walter Alvarez explained the importance of chance in big history. In fact,
the last chapter of his book can be seen as a further elaboration of the thoughts he then shared with all of us.
How improbable are processes and outcomes?
Before examining possible causes of structure and regularities, let us first consider the starting point of his argument: namely that virtually all processes in big history are improbable in their specific outcomes. In order to do so, let us throw a die a few times.
What would be the chance of throwing a six in a single throw? Obviously, if the die is fair, so that nothing biases the outcomes, that chance is one out of six. However many times we throw, the chance of any predetermined single outcome remains the same. Nothing new here.
But what happens when we consider a specific sequence of, for example, ten throws? Perhaps surprisingly, the probability of exactly replicating that entire sequence is (1/6)^10 = 0.000000017. That is a far smaller number, in fact about ten million times smaller, than the probability of throwing a certain specific outcome in any single throw, which is 1/6 = 0.17. In other words, replicating a specific sequence of throws is far less likely than obtaining a specific outcome of one single throw.
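The two probabilities can be checked with a few lines of Python; this is a minimal sketch of the dice arithmetic described above, using exact fractions:

```python
from fractions import Fraction

# Probability of one predetermined outcome in a single throw of a fair die
p_single = Fraction(1, 6)

# Probability of exactly replicating one specific sequence of ten throws:
# the throws are independent, so the individual probabilities multiply
p_sequence = p_single ** 10

print(float(p_single))             # about 0.17
print(float(p_sequence))           # about 0.000000017
print(int(p_single / p_sequence))  # 10077696: roughly ten million times smaller
```

The ratio between the two probabilities is exactly 6^9 = 10,077,696, which is why the sequence is about ten million times less likely than the single outcome.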
How much time would be needed to have a reasonable probability of replicating a certain predetermined outcome, or a certain specific sequence? Let us assume that a person is able to throw a die and write down the result once every 2 seconds.
Let us first look at the probability of replicating a certain outcome. If we want to have
a reasonable probability of throwing a five, for instance, we would need to throw six times on average. Throwing six times does not
guarantee the outcome of a five, of course. But it offers a reasonable probability. This means that on average we will need 6 x 2
= 12 seconds to achieve our goal, even though this goal will not be guaranteed within such a time span.
For replicating a specific sequence of ten throws with a similar probability, however, we would need much more time on average, namely 12^10 seconds ≈ 1,963 years. That is far longer than the time needed on average to reach a specific single outcome, which is only 12 seconds, as we just saw.
This difference becomes even more impressive as soon as the number of throws is increased. What would happen if one throws
a die more than ten times? How long ago would that person have had to start throwing to have a reasonable probability of replicating
that specific sequence and end up in our present at the end of that exercise? This is the result:
Number of throws : Time required
10 : 1,963 years (Roman empire)
11 : 23,561 years (coldest period of last ice age)
12 : 282,728 years (before Homo sapiens)
13 : 3,392,732 years (period of Australopithecus)
14 : 41,278,243 years (long before humans emerged)
15 : 488,553,449 years (life moves onto land)
16 : 5,862,641,391 years (before our solar system emerged)
17 : 70,351,696,690 years (long before the universe emerged)
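This table can be reproduced with a short loop that follows the arithmetic used in the text: a sequence of n throws is taken to require 12^n seconds on average, converted here with a 365-day year (the year length that matches the figures above):

```python
# Reproduce the table: a sequence of n throws is assumed to take 12**n
# seconds on average (6 possibilities x 2 seconds per throw, compounded
# over the positions in the sequence).
SECONDS_PER_YEAR = 365 * 24 * 3600  # 365-day year, matching the text

for n in range(10, 18):
    years = 12 ** n / SECONDS_PER_YEAR
    print(f"{n} throws: {years:,.0f} years")
```

Each extra throw multiplies the required time by 12, which is why the figures escalate from centuries to several times the age of the universe within just eight rows.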
Apparently, standing a reasonable chance of replicating a specific sequence of only 17 throws requires about five times as much time as the entire history of the universe (some 13.8 billion years). That is how improbable even this very simple specific sequence is.
If such a very simple sequence is already so improbable, virtually
all processes in big history must be extraordinarily improbable in their specific sequences. Such a conclusion is much in line with
Walter Alvarez’s argument, but perhaps even more extreme. As mentioned earlier, this raises the profound question of why structured
processes would occur at all.
More about that below. But let us first raise an objection: one could argue that all of this does
not matter, because the probability of the outcome of any throw is still 1/6 with each and every throw. This is surely correct. But
what would happen if at every throw the specific sequence is connected to the outcome of that particular throw?
To model that, let us assume that with each throw the die is slightly changed at random, for instance by the random replacement of a few atoms on the surface of the die by different ones, while this change does not affect the probability of throwing a specific number, which remains one out of six.
While this probability remains the same, the die itself will change randomly with each throw. And because that change of the die is random, the probability of replicating it is even lower than the chance of any particular outcome per throw, which remains 1/6. In this case, not only is the specific sequence extremely unlikely to be replicated, but the specific outcome, the specific arrangement of atoms in the die, is even more improbable.
In my opinion, this model of coupling sequence with outcome is much closer to what we observe in the real world than the first sequence model outlined above. If such a simple thought experiment very soon results in a most improbable specific sequence and outcome, what is the probability that all of cosmic evolution, which consists of zillions of often far more complex processes, happened exactly the way it did? Obviously very, very close to zero.
In other words, no conclusion seems possible other than that virtually all specific sequences and outcomes, large and small, are unique and highly improbable. They are very unlikely to be replicated exactly the same way within the presumed age of the universe.
This conclusion raises a few important questions.
1. Given these observations, to what extent has probability theory, based on chance and statistics, focused on outcomes while perhaps neglecting the probability of a certain specific sequence (because it does not seem to matter at first sight)?
I do not yet have a good answer to this question. A short exploration on the Internet has not yet yielded any clear answers. But it may well be that the theory of stochastic processes deals with this problem; the modeling of complex processes such as weather forecasting, especially, may include such approaches. Yet I am not aware of a generalized, formalized approach. This is a question that specialists in probability theory may be able to answer.
2. What could be gained by developing a probability theory that combines specific sequences with outcomes? I do not know, but this may be a promising research field, unless it is already covered by the theory of stochastic processes.
What about constraints in these processes?
One may wonder whether all these arguments are, perhaps, to some extent an artefact of our own possibly prejudiced starting point, namely
that we are looking for a specific sequence and outcome of big history that we observe and then start wondering about the chance that
it would have happened in such a specific way. What would happen, one may wonder, if we looked at these processes from some greater
distance?
Surely, throwing a die ten times or more will yield a very random specific sequence. Yet this sequence will always stay within certain boundaries. It will never fluctuate below one or above six, simply because that is the range of numbers on the die. Seen from that point of view, all these random sequences are actually very similar. The same can be argued for the changes of the die. Surely, the die may change a little as a result of the random replacement of a few atoms. But it will remain very recognizably a die.
In other words, the similarities of these processes and outcomes are the result of certain specific constraints that are
operating, in this case the range of numbers on the die as well as our assumption that the random replacement of a few atoms does
not change the die very much. This is a very fundamental conclusion, because it means that these constraints make specific processes
possible and, in fact, likely, even though these specific processes themselves may still be very random within those constraints.
A simple example from daily life would be observing a barometer every day. The barometer on the wall of our house fluctuates between
‘storm’ (relatively low pressure) and ‘very dry’ (relatively high pressure). The needle rarely reaches those extreme values, though,
and never goes beyond them. It spends most of its time fluctuating on ‘variable’ and a bit above or below that (between ‘nice weather’
and ‘rain’).
The specific sequence of these fluctuations throughout the year will, in all likelihood, never be repeated. Yet
the boundaries are unambiguous and predictable, because they result from the properties of Earth’s atmosphere and the location of
the barometer. A very similar argument can be made for a thermometer measuring outside temperatures or, in fact, for any instrument
measuring long-term trends within a relatively stable environment.
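The bounded fluctuations of such an instrument can be mimicked with a random walk that is clipped to the instrument's scale. The concrete numbers below (a pressure scale from 960 to 1050 hPa, daily steps of a few hPa) are my own illustrative assumptions, not values from the text:

```python
import random

random.seed(42)  # reproducible run of an otherwise random process

# Toy barometer: a random walk clipped to the instrument's scale.
# The scale endpoints and step size are illustrative assumptions.
LOW, HIGH = 960.0, 1050.0   # 'storm' ... 'very dry', in hPa
pressure = 1005.0           # start near 'variable'

readings = []
for _ in range(365):        # one reading per day for a year
    pressure += random.gauss(0.0, 5.0)        # random daily change
    pressure = min(HIGH, max(LOW, pressure))  # the constraint at work
    readings.append(pressure)

# The specific sequence is essentially unrepeatable, yet every value
# stays within the fixed boundaries of the scale.
print(min(readings), max(readings))
```

Running this repeatedly with different seeds yields different specific sequences every time, but the minimum and maximum never leave the scale: random in detail, predictable in its boundaries.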
Let us now apply a few more constraints to the die to see what will happen. We could, for instance, mount a little magnet on one of its sides, say the side with number three. If we were to throw the die many times on an iron table, quite likely the opposite side, number four, would end up on top most, if not all, of the time. As a result, both the outcome and the sequence of this experiment would become extremely predictable. We could also make both the sequence and the outcome very predictable simply by putting the same number on all six sides.
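A quick simulation makes the loaded-die argument concrete. The bias strength below (side four on top 95 percent of the time) is an assumed figure for illustration; the text only says "most, if not all, of the time":

```python
import random

random.seed(7)

# Toy model of the magnet-loaded die on an iron table: side 3 is pulled
# toward the table, so the opposite side, 4, ends up on top with high
# probability. The 0.95 bias is an illustrative assumption.
def biased_throw(p_four=0.95):
    if random.random() < p_four:
        return 4
    return random.choice([1, 2, 3, 5, 6])

results = [biased_throw() for _ in range(10_000)]
share_of_fours = results.count(4) / len(results)
print(share_of_fours)  # close to 0.95: the outcome is now highly predictable
```

With the constraint tightened in this way, both the outcome distribution and, consequently, the typical sequence become far more predictable than with a fair die.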
An example of a similarly constrained process would be the emergence of salt crystals from a salt solution in water from which the water is evaporating. This process increases the salt concentration to the point where crystals begin to form, because the solution has become saturated, which means that it cannot hold a higher concentration of salt.
The formation of these crystals may be triggered by impurities in the solution. Yet the general structure of such crystals is very regular, namely cubic, because in the prevailing circumstances this structure is mostly dictated by the properties of the sodium and chloride ions that they consist of. That is what constrains the crystal formation process. Yet all crystals will be slightly different as well, especially in their sizes.
Let us now imagine that we try to remove all constraints. If, for instance, we could make a die with an endless number of sides, all with different numbers on them, both the sequence and the outcome would become entirely unpredictable. In other words, without constraints, throwing such a die would result in a totally random sequence of numbers (the process of throwing the die would still be fairly regular, of course).
Let us now introduce selection into the process of throwing a standard die. Let us assume that we are allowed
to repeat the first throw until we get it right, and then proceed with the next throw, etc. How long would it take in this case to
achieve a reasonable probability of replicating a specific sequence of 17 throws? The answer is simple: 6 x 2 x 17 seconds = 3.4 minutes.
That is a lot quicker than about five times the history of the universe. In other words: introducing a specific form of selection can change very improbable sequences into reasonably likely ones. This thought experiment may offer a rough first-order model of how biological evolution works.
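The selection scheme can be simulated directly. Under the assumptions used in the text (a fair die, 2 seconds per throw), each position needs 6 throws on average, so a sequence of 17 throws needs about 6 x 17 = 102 throws, i.e. roughly 3.4 minutes:

```python
import random

random.seed(1)

def throws_with_selection(target):
    """Throws needed when each position may be retried until it matches
    the target sequence, after which we move on to the next position."""
    total = 0
    for wanted in target:
        while True:
            total += 1
            if random.randint(1, 6) == wanted:
                break
    return total

target = [random.randint(1, 6) for _ in range(17)]
trials = [throws_with_selection(target) for _ in range(10_000)]
mean_throws = sum(trials) / len(trials)
print(mean_throws)           # close to 6 * 17 = 102 throws on average
print(mean_throws * 2 / 60)  # close to 3.4 minutes at 2 s per throw
```

The decisive difference with the unconstrained case is that selection makes the expected cost grow linearly with sequence length (6n throws) instead of exponentially (6^n throws).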
Let us now imagine that we are able to influence the throws intentionally, simply by putting the right number
on top by hand (in this case we would not really throw the die anymore). Obviously, in such a case we would need only 17 x 2 seconds
= 34 seconds to replicate any specific sequence of 17 throws. That is even a lot quicker than 3.4 minutes.
Would such a thought experiment represent a rough first-order approach to cultural evolution? If so, this may illustrate and explain why cultural evolution
proceeds so much quicker than biological evolution.
In conclusion: to understand the probability of the emergence and development of relatively structured processes, we must examine both chance effects and the constraints involved (necessity). That is why I argued along the lines quoted earlier at the beginning of chapter two of my book. It seems to me that a systematic theoretical exploration of this theme in big history has barely begun, while it may yield most delicious fruits.
In most of the themes discussed in Walter Alvarez’s book, the force of gravity provides such a constraint, which makes possible many of the examples described in
such exquisite detail. It is, of course, not the only constraint that is operating on our planet. All the natural forces can be seen
as providing constraints, and all of them play a role in Alvarez’s book.
Do relatively structured processes involve more than constraints alone?
Posing this question is almost answering it. Clearly there is more to the emergence of structured processes
than only constraints. What is needed for such processes to occur are certain favorable circumstances, which in my book are called
Goldilocks circumstances.
These favorable circumstances very much depend on the type of complexity involved. Within the context of this little essay, a relevant question may be: what would be the Goldilocks circumstances for throwing a die? The obvious answer is: those circumstances that allow the die to exist, as well as those that make it possible to throw it.
While these conditions may seem obvious, in most of the universe dice do not exist, and neither do the conditions for throwing them. Throwing a die in space, even as close to Earth as inside the International Space Station (ISS), would not work for lack of any noticeable effects of gravity. That is the case because the ISS, including everything inside it, is in free fall around the Earth. As a result, little or no gravity is experienced inside those amazing living quarters in space. Inside the ISS, the die would simply float around and would hardly ever end up on one side, if at all.
More generally, Goldilocks circumstances are those opportune circumstances that allow processes to occur, while the constraints limit them and, by doing so, also make them possible. Perhaps one can interpret constraints as Goldilocks boundaries, beyond which these processes cannot occur anymore. But already in the case of the die that seems a little too artificial to me. One could easily make a die with more, or fewer, than six numbers, all of which could be thrown without any problems within the same further Goldilocks circumstances. Yet these processes and their outcomes would be different.
If this is already the case with this very simple experiment, it may be wise to make a distinction between Goldilocks circumstances and
their boundaries on the one hand and other types of constraints on the other hand. But I immediately admit that this distinction may
sometimes be vague.
This subject may, therefore, represent unexplored territory from a theoretical point of view. To my knowledge,
a careful exploration of big history in terms of constraints and Goldilocks circumstances has barely begun. Further systematic
investigations could yield wonderful results. Such an investigation should first of all include making inventories of the available
knowledge about these aspects in all academic fields, and then seeking to systematize them within the field of big history.
While studying human history in the 1980s and early 1990s, the Dutch sociologist Joop Goudsblom and I agreed on the formulation that we were not trying to explain how history had to happen, but that we instead sought to explain how it could have happened. We had reached that conclusion because of the role of chance and contingency that we observed. It seems to me that the same reasoning very much applies to all of big history.
In conclusion
Chance and necessity depend on the amount of detail examined, and on the factors that constrain chance and make it possible. Clearly, chance and necessity are both important aspects of all processes.
If we observe structures, they will be unique, yet they will also be constrained and facilitated by certain circumstances. In order
to understand those situations better, we need to examine all these aspects. Without constraints and Goldilocks circumstances big
history would only consist of utterly irreproducible chaos without any structure whatsoever.
How could we proceed further along these lines?
I see the following possibilities. We could try to assess to what extent processes in reality conform to this general model, ranging from our personal histories all the way to the history of the universe. As a result, a whole field may be open for further investigation.
In many cases, these processes may influence one another, thus leading to more complicated circumstances.
How would we analyze that both qualitatively and perhaps also mathematically? Again, a whole new field may be open to exploration.
Perhaps weather models do already incorporate such schemes to some extent. These are only my first thoughts. There may well be many
more possibilities.
Returning to Walter Alvarez’s book
In chapter ten of his book A Most Improbable Journey, Walter Alvarez convincingly argued how improbable our current situation is. Yet in the preceding chapters he hardly mentioned this theme at all. Instead he nicely demonstrated some of the geological constraints and opportunities that have contributed to shaping human history and which, jointly, have made this most improbable journey possible.
Postscript
On February 1, 2019, the word 'trajectory' was replaced by the word 'sequence' following a suggestion by US economist Dennis Flynn.