Understanding Black Swan Events
Humans often lack the ability to define or understand randomness, and instead fall back on decision-making methods that rely heavily on intuition. This intuitive approach to prediction and decision-making can have profound consequences, both for individuals and for society, often at the expense of accuracy.
Nassim Nicholas Taleb coined the term ‘Black Swans’ to describe events that are thought to be impossible, yet redefine one’s understanding of the world. In his book The Black Swan, Taleb examines these seemingly random events and their pitfalls. The book is a guide to understanding one’s own shortcomings in logical decision-making and prediction, recognizing that the human desire to fit information into easy-to-understand narratives clouds judgement, learning to make better use of ignorance, and seeing why clinging to one’s beliefs is detrimental.
Humans are wired to derive meaning from all the information available in the environment. This ability to compartmentalize stimuli into meaningful information has enabled science and the scientific method, complex mathematical models, and even philosophizing about the nature of being.
However, these accomplishments do not mean that humans are actually good at interpreting information. In fact, humans are predisposed to be narrow-minded and to cling to their beliefs about how the world functions.
Given the constant evolution and growth of human knowledge, a dogmatic approach makes no sense. For example, just two hundred years ago, doctors and scientists were supremely confident in their knowledge of medicine. Today, that very confidence seems ludicrous.
A dogmatic approach to beliefs makes humans blind to anything that falls outside the paradigms of what is believed to be true, and can result in massive surprises. More often than not, the surprise stems not from the randomness of the event itself, but from the fact that human perceptions and outlook are too narrow to accommodate it. These surprises are the ‘Black Swans’ that propel humans to fundamentally reconsider their beliefs and perceptions of the world.
The term derives from a simple historical event. For centuries, people assumed that all swans were white, so all imaginations, depictions, and associations of ‘swanness’ involved the colour white. When black swans were discovered, that single observation fundamentally changed the understanding of what a swan could be.
It is thus necessary to accept that not every event is a white swan: not all events will have the outcomes previously experienced or take the course previously believed.
The Earth-Shattering Consequences Of Black Swans
Let’s look at an example of a ‘black swan’ event. John always places his bets on his favourite horse, Rocket. Over the last several races, her track record, her build, the skill of her jockey, and the lack of competition have made Rocket a safe bet.
With this in mind, John gambles everything on another sure-shot win. However, when the starting pistol is fired, Rocket refuses to budge at all. Such a surprising turn of events is a classic black swan: even though John had all the information and had placed his bets carefully, he lost everything!
On the other hand, the owner of the horse, Mr Wallis, knew that Rocket had been acting up the previous evening, and had placed his bets on another horse. The same black swan event had no effect on Mr Wallis, because he simply had a piece of information John did not.
Black Swan events affect everyone differently. How a black swan event affects someone depends on the information that person possesses. Simply put, the more relevant information one has, the less likely one is to be blindsided by such an event.
Moreover, Black Swan events differ in the scale of their impact. They can affect individuals as well as whole communities and societies. They can change the course of how the whole world works, impacting areas as varied as theology, physics, and philosophy. Copernicus’s proposition that the Earth is not the centre of the universe challenged not only long-held popular belief but also the authority of the Bible and, with it, the ruling Catholic Church.
This Black Swan event affected Copernicus as an individual, and also marked a new beginning for European society’s beliefs.
How Logical Fallacies Fool Us
Human beliefs are based on information acquired in the past. Humans tend to create narratives because past experiences are seen as an indication of the future. This tendency is a fallacy: it ignores the unknown factors that could throw curveballs into the belief system, leaving humans open to mistakes.
Turkeys living on a farm are fed, watered, and cared for by the farmer. Conditioned by past experience, they are oblivious to the fact that one day they will be Thanksgiving dinner, eaten by the very people who fed and cared for them.
Humans are like those turkeys, believing the fallacy that the future can be predicted from past experience. Such fallacies can lead to dire consequences.
Confirmation Bias is one such fallacy: people look for evidence that supports their own beliefs, to the extent that they ignore evidence that contradicts them. Worse, when contradictory evidence cannot be ignored, people are unlikely to accept it and will even seek out sources that help undermine it.
People who think climate change is a conspiracy theory are likely to get upset if they see a documentary called “The Undeniable Evidence for a Changing Climate.” In fact, they are more likely to search the internet for ‘climate change hoax’ than for ‘evidence for and against climate change.’
Belief in such fallacies is woven into human nature. These are bad habits that aren’t easily shaken off.
Accurate Predictions Are Difficult Due To The Way The Brain Categorizes Information
The human brain evolved to survive in the wild, adapting fast and learning to deal with danger quickly. Today, those dangers have largely been replaced by complexity, and the categorization shortcuts the brain evolved for survival are of little use.
The Narrative Fallacy, in which humans use linear narratives to describe situations, is an example of how humans incorrectly categorize information. To make sense of the mass of information it is bombarded with daily, the brain retains only what it considers important. For example, the brain chooses to remember what was eaten for breakfast rather than the different types of cars seen on the way to work.
Moreover, the brain tries to turn the unconnected bits of information it retains into coherent narratives. For example, when a person reflects on how she became who she is, she selects only the events in her life that seem important and arranges them in order. She might connect her love of music to her memory of her mother singing to her every night before bed.
This method of creating connected, coherent narratives, however, does not give one a meaningful understanding of how the world works. It ignores the countless factors that could have contributed to any one event, and it considers only the past. The fact of the matter is that even the tiniest, most insignificant event can have massive, unpredictable consequences.
Differentiating Between Scalable And Non-Scalable Information
Humans, unfortunately, struggle to differentiate between types of information, despite having invented many models and methods for categorizing it. Understanding the difference between scalable and non-scalable information, in particular, is crucial.
Non-scalable information, like body weight and height, has definite physical lower and upper limits. For example, a person may weigh 1,000 pounds, but it is impossible for anyone to weigh 10,000 pounds. These physical limits make it possible to compute meaningful averages and predictions.
Scalable information, on the other hand, such as wealth or sales figures, has no such limits. There is no ceiling on how many copies a digital album could sell on iTunes: sales on digital platforms are not constrained by the number of physical copies manufactured, and online transactions are not limited by physical currency. A billion digital copies could, in principle, be sold.
To get an accurate picture of how the world works, understanding the difference between scalable and non-scalable information is important. Applying the rules of scalable data to non-scalable information only leads to errors.
For instance, one might try to measure the wealth of England’s population by working out per-capita income: total income divided by the number of citizens.
But since wealth is scalable, a small percentage of the population may hold a large share of it, and the per-capita average can paint a wildly inaccurate picture.
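A minimal sketch, with entirely made-up numbers, of why a per-capita average misleads for a scalable quantity like wealth: one extreme outlier drags the mean far away from what a typical person actually holds, while the median is unaffected.

```python
def mean(values):
    # The per-capita approach: total divided by head count.
    return sum(values) / len(values)

def median(values):
    # The middle value: robust to a single extreme outlier.
    s = sorted(values)
    mid = len(s) // 2
    return s[mid] if len(s) % 2 else (s[mid - 1] + s[mid]) / 2

# Hypothetical population: nine citizens of modest means, one billionaire-adjacent outlier.
wealth = [30_000] * 9 + [100_000_000]

print(mean(wealth))    # over 10 million -- the "average citizen" looks rich
print(median(wealth))  # 30,000 -- what a typical citizen actually holds
```

The mean here exceeds ten million even though nine out of ten people hold only 30,000. For non-scalable data like height, no single observation can distort the average this way, which is why averaging is safe there but not here.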
Overconfidence In Beliefs
Humans are, by nature, risk-averse. To stay safe from harm, people try to assess and manage risk, measuring it as accurately as possible while trying not to miss out on opportunities.
To do this, people evaluate possible risks and then measure the probability of those risks materializing. For example, when buying insurance, people try to choose a policy that protects against the worst-case scenario yet is not a waste of money. They make this decision by weighing the threat of accident or disease against the consequences should either occur.
Unfortunately, humans fall prey to the ludic fallacy: the tendency to be overconfident that all the risks one needs protection from are known. People treat risk like a game, with probabilities and rules determined before play begins.
Casinos illustrate the point. Their major threats may not be thieves or lucky gamblers, but utterly unpredictable events, like an employee who fails to submit the casino’s earnings to the IRS, or the kidnapping of the casino owner’s child.
The fact of the matter is that no matter how carefully one tries to assess or calculate risk, there is too much uncertainty to factor in all of it.
Ignorance Is Not Always Bliss
There are two contradicting thoughts, ‘knowledge is power’ and, ‘ignorance is bliss’. While both have their merits depending on the situation, it is far better to be aware of what one doesn’t know.
Focusing only on the knowledge one has limits one’s perception of the possibilities in any event, creating the perfect conditions for black swans. Consider a person in the US in 1929. He plans to purchase stocks in a company and has been studying market trends from 1920 to 1928. Assessing the highs and lows of the past eight years, he notices that the trend is generally upwards, so he invests all his savings in the market. The next day, however, the market crashes and he loses everything.
This person focused only on the information he had about the past eight years. Had he also studied the booms and busts of earlier decades, he would probably have been better prepared.
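The lookback-window problem can be sketched with hypothetical prices: a series containing an old crash followed by eight periods of steady gains. An investor who examines only the recent window sees no drop at all; the full history tells a very different story. The numbers and the `worst_drop` helper are invented for illustration.

```python
def worst_drop(series):
    """Largest one-step percentage fall in the series (negative = a drop)."""
    changes = [(b - a) / a for a, b in zip(series, series[1:])]
    return min(changes)

# Hypothetical yearly prices: an early boom-and-bust, then steady gains.
prices = [100, 95, 40, 55, 70,                 # crash buried in older history
          72, 78, 85, 90, 96, 101, 108, 115]   # the recent upward trend

recent = prices[-8:]  # the investor studies only the last eight "years"

print(worst_drop(recent))  # positive: not a single down year in the window
print(worst_drop(prices))  # the full history contains a ~58% crash
```

The recent window contains nothing but gains, so any risk estimate built from it alone assigns zero probability to a crash. The black swan is not that crashes are impossible, but that they were outside the data the model was allowed to see.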
Poker players, especially good ones, understand what they don’t know. They know the rules of the game and the cards they hold, and they are aware that an opponent could hold better cards. But they are also aware of what they don’t know: the opponent’s strategy, and how much the opponent can stand to lose.
Simply by being aware of what they don’t know, they can strategize and make a better assessment of the risks of playing their hand.
Understanding Limitations Leads to Better Choices
We have established that it is necessary to have a good understanding of the tools one uses to make predictions. However, it is far more important to understand the limitations of those tools.
While knowing one’s limitations isn’t a sure-shot strategy for escaping the consequences of every blunder, it can reduce the number of bad decisions one makes. For example, a person who is aware of his susceptibility to confirmation bias finds it easier to recognize when he is only looking for information that confirms his beliefs.
Thus, if one understands that humans tend to organize everything into neat, causal narratives that simplify the world’s complexity, one will be more likely to seek out additional information and get a better view of the whole picture.
Even a small amount of this kind of critical analysis, and knowing one’s own shortcomings can give a person an edge over others. For example, if one is aware that there could be unforeseeable risks in an opportunity, one will be careful to not heavily invest in it, even if it seems extremely promising.
Even if one cannot grasp the full complexity of the world or conquer its seeming randomness, one can at least mitigate the damage of one’s ignorance.
Humans are bad at making predictions. They have full confidence in their knowledge and underestimate their ignorance. Black Swan events catch humans out because of their over-reliance on seemingly sensible methods, their inability to define or understand randomness, and the basic human biology that leads to bad decisions.
Nothing can truly prepare one for Black Swan events that can change the course of one’s life, or even the world. However, simply being aware can make all the difference!