(First published in the WAO/FACTOR newsletter in July 2013)
Decision Making – Finally, I am now discussing the last part of the Analytics Governance model I started presenting in February. It is also the most complex, because it has been the subject of very few articles, blog posts, or book chapters.
Mainly, it deals with making sure people will actually use analytics results. Don’t laugh, or be naïve enough to believe that because you’ve invested tons of money in solutions and people, your analytics will de facto be an important part of managers’ and executives’ decisions. Tsk! Tsk! Nope! There is a long distance between “I think you should do X” and “Sure I will” (with actual action following the statement).
Throughout the years I have been involved in Digital Analytics (11 years now), the number one complaint I hear from analysts is that people don’t seem to care about their reports, analyses, dashboards, etc. This is really strange, isn’t it? Not the complaint, but the fact that people would think they can afford to ignore analytics. I mean, if analysts are in the business of generating insights and knowledge, in short, in the truth business, why is it that so many managers can still ignore those truths in their decision-making process?
Many factors are at play here: trust in the whole analytics process, trust in data reliability, trust in how the data is interpreted and the relevancy of that interpretation, politics, the way the company measures individual performance, etc. I believe another factor is simply a lack of governance, i.e. clear rules that state: each decision must take analytics into account, and managers should document why they reject its results when they do. By this I mean a manager would still be allowed to reject the analyst’s conclusions and recommendations, but that rejection, along with what was decided instead, should be documented somewhere.
This brings some accountability and responsibility to the end of the analytical process, while allowing for some degree of freedom. After all, the decision maker is usually the one bearing all the risks. By documenting what we do with analytics results (adopt or reject them), we’re trying to make the whole process a learning one, and to admit that analytics don’t get it right all the time.
We, in the Digital Analytics world (and in the analytics one in general, I gather), like to make fun of the HIPPOs (Highest Paid Person’s Opinion) who will make decisions based on their gut rather than using our brilliant analyses. Apart from being an insufferable cliché repeated ad nauseam on Twitter, the HIPPO question is deeper and more complex than mocking how clueless those managers are supposed to be. What if those managers treated analytics that way because it failed them? By failure I mean not being able to give clear direction on where to go. What if analytics just did not deliver what was expected by people whose necks are on the line, after all? What if they do what they do because they understand many things we don’t, things that carry a lot of weight in the company?
I will readily admit that this is still a very fuzzy part of our overall Governance model. How does one enforce such a set of rules, or even create them? How can one enforce analytics without imposing them, i.e. without taking away that part of freedom that makes managing anything interesting? Frankly, I don’t know. Sure, there must be some “executive buy-in” somewhere in there (isn’t there always, for anything to work in a company?). I haven’t seen any company seriously addressing that dimension, but I know analytics often fail because this was never discussed in the open. Ask any analyst.
Jim Novo, in this newsletter, offers an amazing discussion of the relationship between analytics and “gut feel”, and suggests ways to mix the two. I would like to suggest that companies need to make that dimension of the analytical function more official, to create an actual process that will allow them to revisit past decisions, so that they can better understand how those decisions were made and learn from them with the 20/20 hindsight of later facts and results. Maybe the “gut” was right more often than we expected. In such cases, what would that say about our analytics? Companies need to figure it out.
Calling “HIPPOs” clueless is just too easy.