Cognitive Biases in Complex Environments
Macroeconomics, markets, and the human body are highly complex systems with near-infinite inputs, interactions, and potential outcomes. As such, determining truths in these areas requires intense scrutiny and caution.
It is nearly impossible to find the "truth" in these areas because it is so difficult to conduct rigorous studies, with control groups and falsifiable claims, in a complex environment. Naval Ravikant says, "You never have the counterexample on the economy. You can never take the US economy and run two different experiments at the same time."
When attempting to understand something as complicated as a nation's economy, it can be tempting to seek out evidence, data, and narratives that support our desired, predetermined conclusion.
With so many inputs, interactions, and variables in the workings of a national economy, we can almost always find data points and stories that align with our narrative of the truth.
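To make that concrete, here is a minimal sketch of my own (not drawn from any study - the indicator count, sample size, and correlation threshold are all arbitrary): generate one random "outcome" series and 200 unrelated noise "indicators," and chance alone hands you a handful of convincing-looking correlations to cite.

```python
import random
import math

random.seed(42)

def pearson(x, y):
    """Pearson correlation between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# One "outcome" series (say, 40 quarters of growth figures) and 200
# candidate "explanations" -- every series is independent noise.
quarters = 40
outcome = [random.gauss(0, 1) for _ in range(quarters)]

convincing = []
for i in range(200):
    indicator = [random.gauss(0, 1) for _ in range(quarters)]
    r = pearson(outcome, indicator)
    if abs(r) > 0.3:  # looks like a real relationship
        convincing.append((i, r))

print(f"{len(convincing)} of 200 pure-noise indicators 'explain' the outcome")
for i, r in sorted(convincing, key=lambda t: -abs(t[1]))[:3]:
    print(f"  indicator {i}: r = {r:+.2f}")
```

With enough variables to sift through, "the data supports my view" is almost guaranteed to be true for every view.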
As with most complex systems, we seek simple answers to support the narrative we have already formed. Nutrition works the same way - our complex bodies respond to many inputs over a long time scale. Vegans, keto supporters, vegetarians, and Mediterranean diet advocates will all champion their diet and reference the countless studies showing that it is best. However, like a national economy, a human body with near-infinite inputs over a long time frame is nearly impossible to test rigorously.
Walter Willett notes in Eat, Drink, and Be Healthy, "Nutrition scientists usually can't exert the same kind of control over their research subjects that chemists and zoologists can. Instead they must work with unpredictable, independent, mostly uncontrollable subjects: people."
The sheer amount of data and the variability of interactions between groups and individuals are hallmarks of complex systems. This feeds complaints of fake news and failures to report the facts. Outside of clearly egregious examples, each side of a particular narrative can cherry-pick facts and stories to support its argument. In the national economy or the human body, there is almost always new information to support a given narrative.
The next time you walk into a bookstore, take a look at the endless shelves of expert nutrition advice and political-economic declarations. We don't see the same number of shelves devoted to the truths of calculus or physics. While science and math are technically complex, they are controlled and falsifiable.
We do the same thing when we give quick explanations for a complex trajectory of life, such as a career: "You know it's all about relationships in this business," or "To be successful in today's world you have to work hard - that's what I did to get here." We love simple answers to complex, random, long-time-horizon events.
This complexity allows the sides of an issue to hold wildly different (and strongly held) opinions. Once we form a conclusion, it is difficult to let it go, and we interpret data and news to conform to it.
This is what I call the narrative lens problem. Once people are introduced to an idea or narrative that they judge sound and logical, they begin to see the world through this lens. Tied closely to confirmation bias and echo chambers, they fit new narratives and data to their established view of the world.
To better understand the world, it is essential to be aware of the lens we are viewing it through. Exploring this lens problem has made me more skeptical of anyone who preaches the "truth" about complex environments.
We can find examples of the narrative lens problem in every complex domain. Passionate people in politics, macroeconomics, business, nutrition, and beyond will seek explanations for their view of the world and discount views that go against their lens. (I am not excluded - as with most of my writing, this is a self-diagnosis.)
It's human nature to find quick explanations for complex problems. The tendency is evolutionary and helps us draw conclusions from an environment we can never fully understand. Daniel Kahneman, the author of Thinking, Fast and Slow, has researched many cognitive biases, and his work on heuristics in judgment and decision making applies directly to this problem. Heuristics are simple strategies humans use to quickly form judgments, make decisions, and find solutions to complex problems. They are evolutionary and useful in day-to-day life.
While a heuristic is useful for knowing that the next chair you sit on will not collapse beneath you, it is not useful for developing truths about how the world functions. Kahneman writes of the availability heuristic: how readily an idea comes to mind influences how important we perceive it to be (we fear shark attacks partly because they are so easy to imagine). When we begin to view the world through a certain lens, more and more of the information we consume gets tied to that lens. As a result, more supporting evidence and narratives are available in our minds whenever we encounter new data, news, or stories. This ties closely to confirmation bias.
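As a toy illustration of how a lens feeds on itself (my own sketch, not Kahneman's - the signal sizes and the 0.5 discount are arbitrary assumptions): a reader who gives evidence against their current lean only half weight tends to end up near-certain far more often than a reader who weighs everything equally, even when the evidence stream is perfectly balanced.

```python
import random
import math

random.seed(7)

def final_belief(signals, discount):
    """Accumulate evidence in log-odds form. Evidence that cuts against
    the current lean is scaled down by `discount` (1.0 = no bias)."""
    log_odds = 0.0
    for s in signals:
        if s * log_odds < 0:  # signal disagrees with the current lean
            s *= discount
        log_odds += s
    return 1 / (1 + math.exp(-log_odds))  # back to a probability

trials = 1000
confident = {"fair": 0, "biased": 0}
for _ in range(trials):
    # Perfectly balanced evidence: equal chance of a small signal
    # for or against the claim.
    signals = [random.choice([+0.2, -0.2]) for _ in range(100)]
    for name, discount in [("fair", 1.0), ("biased", 0.5)]:
        p = final_belief(signals, discount)
        if p > 0.95 or p < 0.05:
            confident[name] += 1

for name, count in confident.items():
    print(f"{name} reader ends up near-certain in {count / trials:.0%} of runs")
```

Once the biased reader leans one way, agreeable evidence counts in full and disagreeable evidence counts for half, so the lean reinforces itself - the echo chamber in miniature.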
Cognitive biases have been studied extensively, and my research has led me to a simple conclusion - we are far less rational than we think. Kahneman, who spent his career understanding cognitive biases, says that he still doesn't believe he can control his own: "I've been studying that stuff for over 50 years and I don't think that my intuitions have really significantly improved."
Complex environments should breed skepticism rather than confidence. When every action has near-infinite inputs and consequences, our confidence in a policy or in a person's judgment should fall. We should show far more empathy for each side of an issue, giving each the benefit of the doubt and the time to explain. In reality, both sides are probably a little right and a little wrong. Notice that hearing our side is probably a little wrong causes defensiveness and discomfort - that is precisely my point.
A challenge to this approach is that it goes against human nature (again). Once our worldview is established, we join an “in-group” and in turn develop an “out-group" enemy. We are hardwired to think this way. In Blueprint, Nicholas A. Christakis argues that in-group/out-group bias is innate:
"Prejudicial treatment of out-groups starts when people are very young and it does not seem to vary much with age, which suggests that the capacity for intergroup cognition is innate. Additional evidence for this comes from brain-scan studies that show that there are particular regions of the human brain devoted to social categorization. And as we have seen, the existence of even minimal groups can elicit in-group bias. While xenophobia can arise even without conflict, conflict between groups can surely exacerbate people's dislike of other groups."
The difference between your group's opinion and the out-group's is exacerbated by the evolutionary drive to fight for your own group.
Recent punditry has discussed tribalism at length. While tribalism is usually criticized, we may be evolutionarily inclined to support our tribe. And once you begin to view the world through your tribe's lens, it is very difficult to look away.
Identifying your lens
We live in an age of information abundance. With a constant flow of information, we must be able to recognize which lens we are viewing the world through. Had you been viewing the world through the opposite lens, would you have come to the same conclusion?
We must be careful not to force every new piece of information through the same lens. Are we putting narrative before information? Constant awareness of your lens is nearly impossible, but recognizing that it shapes your view of the world is a healthy start.