The normalcy bias, or normality bias, is a mental state people enter when facing a disaster. It causes people to underestimate both the likelihood of a disaster and its possible effects. As a result, individuals may fail to prepare adequately and, on a larger scale, governments may fail to include the populace in their disaster preparations.
The normalcy bias rests on the assumption that because a disaster has never occurred, it never will. It can leave people unable to cope with a disaster once it does occur: those affected by the bias have difficulty reacting to something they have not experienced before. People also tend to interpret warnings in the most optimistic way possible, seizing on any ambiguity to infer a less serious situation.
The opposite of the normalcy bias is overreaction, or the "worst-case thinking" bias,[1][2] in which small deviations from normality are treated as signals of an impending catastrophe.
https://en.wikipedia.org/wiki/Normalcy_bias
Help spread the messages in this guide by sharing it with your colleagues https://www.researchgate.net/publication/323254437_GUIDE_TO_EFFECTIVE_RISK_MANAGEMENT_30
More information about RISK-ACADEMY, our training courses and services at www.riskacademy.blog

ISO31000 Integrating Risk Management
Alex Sidorenko, known for his risk management blog http://www.riskacademy.blog, has created a 25-step program to integrate risk management into decision making, core business processes, and the overall culture of the organization.