It is a worthwhile exercise to periodically check ourselves to ensure that we are thinking properly about a matter of interest to us. This is the proverbial introspection, a sort of kicking the tires and tightening the nuts and bolts of our mind. As a man thinks, so he is…or so a Biblical proverb says. It is this perspective that stimulated my thinking when I came across Daniel Kahneman's book titled Thinking, Fast and Slow (TFS).
For those who don't know Mr. Kahneman, he is an eminent scholar who has influenced many in the behavioral finance field. His writings have explored how rational people actually behave in the course of living life, and what he has found is often at odds with the rational being that populates many economic and finance models. Some of his conclusions are quite eye-opening, such that all of us need to take heed, lest we become the architects of our own downfall.
TFS points out that our mind is bifurcated in the way it processes information: one part takes in information quickly and intuitively, without doing much deliberate thinking, while the other takes a slower, more rigorous analytical approach to reach conclusions. Which part of the mind we use depends on the nature of the activity we are undertaking. An easy task, like brushing your teeth, takes far less effort than, say, studying chemistry. Often without being aware of it, we shift from the analytical mode to the effortless mode and end up making gross errors.
This becomes problematic when we fail to account for the fact that we have certain behavioral limitations, aka biases or heuristics. We must be aware of them and establish rules so that we do not fall into their traps.
Here I will point out a few of the behavioral limitations, as noted in TFS:
1. Anchoring - this is when we reach a conclusion based on a recent piece of information that may or may not be connected to the subject of the inquiry. For example, if I were to say 20, and then ask how many islands there are in the Caribbean, your answer would likely revolve around 20.
2. Regression to the mean - performance tends to average out over time, so an unusually high or low observation can be expected to be followed by ones closer to the average.
3. Narrative fallacy has to do with the fact that we prefer simple, coherent stories over complex or statistical explanations.
4. Hindsight bias has to do with looking back at experiences and re-interpreting them to fit what we now know. This typically comes out as, “I knew this was going to happen.” But in reality, that “knowing” only exists because we have re-interpreted the past in light of the outcome.
5. Loss aversion tells us that we dislike losing an amount more than we enjoy gaining it. For example, if we were to measure this, we might find that a person dislikes losing $100 more than they enjoy winning $150.
6. Endowment effect has to do with the idea that we attribute more value to something simply because we own it (or used to own it).
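The loss-aversion point above can be made concrete with a small sketch. The snippet below uses a simplified, linear version of a prospect-theory-style value function; the coefficient of 1.5 is an illustrative assumption chosen to match the $100-loss vs. $150-gain example, not a figure from TFS itself.

```python
# Illustrative sketch of loss aversion (assumed coefficient, not from TFS).
LOSS_AVERSION = 1.5  # assumption: losses loom ~1.5x larger than gains

def felt_value(change_in_dollars):
    """Subjective value of a gain or loss relative to the status quo."""
    if change_in_dollars >= 0:
        return float(change_in_dollars)
    # Losses are amplified by the loss-aversion coefficient.
    return LOSS_AVERSION * change_in_dollars

# Losing $100 "feels" about as bad as winning $150 feels good:
print(felt_value(-100))  # -150.0
print(felt_value(150))   # 150.0
```

Under this toy model, a coin flip that pays $150 on heads but costs $100 on tails feels like a break-even proposition, even though its expected dollar value is clearly positive.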
All of us are prone to these biases, which is reason enough to be careful before readily accepting "expert" opinion, particularly in complex fields (where standard errors are high, to put it in statistical terms). Examples abound in fields that revolve around forecasting events. Discerning whether an expert's track record reflects genuine talent or skill, or simply pure chance and luck, is a task we must engage in.