Topic: Business Strategy, Decision Making, Evidence Based Management, Statistics
Publication: Harvard Business Review (APR 2012)
Article: Good Data Won’t Guarantee Good Decisions
Authors: S. Shah, A. Horne, and J. Capellá
Reviewed By: Megan Leasher
When we were in grade school, we learned that 1 + 1 = 2. We quickly came to celebrate our success in figuring out what came after the equal sign. That celebration built faith; blind faith that we should always believe the result of an analysis.
But in business, it’s not quite so simple. We should not automatically rejoice in what we see after an equal sign; we first need to judge what went into the numbers. This concept is the focus of a study conducted by the Corporate Executive Board, which classified 5,000 employees at 22 global companies into one of three categories: those who always trust analysis over judgment, those who always rely on their gut, and those who balance analysis and judgment. The Board advocates the last, “balanced” group, as its research found that these employees demonstrated higher productivity, effectiveness, market-share growth, and engagement than those in the other two groups. However, the Board also found that only 38% of employees and 50% of senior executives fell into this “balanced” group. Taken together, the findings argue for cultivating both analysis and judgment in decision-making at all levels of an organization.
The authors present several ideas for how organizations can begin to shift toward a culture of applying appropriate insight and judgment to their data analysis. First and foremost, they argue that data must be made accessible and presented in usable formats that enable analysis. A dual focus must be placed on both the data and the judgment: increase data literacy and statistical expertise while simultaneously training employees in how to use the data correctly, encouraging both dialogue and dissent throughout the interpretation.
But this is easier said than done. You have to know what to trust and distrust in data. You have to learn if and how metrics support the strategy and growth of an organization. You have to learn what kinds of caveats and errors can be found within the data. You have to learn how the data were collected, what might be wrong with the collection process, and what important information might have been ignored. You have to know how to interpret and proceed when multiple metrics of performance give you competing answers; not all data play nice with each other. You have to know which data are worth analyzing and which should be abandoned altogether. Sometimes running away screaming is the appropriate response.
Analysis isn’t just about writing a formula and clicking “run” or “execute” to crunch the numbers. After all, data without method is just numbers in columns and rows. It’s about a series of critical, incremental, and ethical judgment calls before and after each iteration within an analysis. Some of the judgment calls come from understanding the content and context of the data, some come from a grounding in organizational and industry knowledge, and some come from an understanding of the past, present, and future strategy of the organization. And yes, some judgment calls come from pure statistical knowledge. The true expertise comes from a constant interplay and interdependence of all of these factors.
Regardless of the challenges presented, the authors are clear that decisions should never be made by data or one’s gut alone; analysis is critical, but so is applying corresponding judgment.
human resource management, organizational industrial psychology, organizational management