Messy Analytics Conclusion of a 3-Part Series

Originally published 10 May 2011

It is easy to criticize, as I did in my previous two articles on analytics. In Part 1 I argued against fact-based management, claiming it is hard to determine truths of fact in the first place. In Part 2 I continued my criticism, this time against what I think is a contradiction in terms. Because it is too easy to criticize, I realize I should present an alternative vision as well. That is what I do in Part 3, introducing the concept of “messy analytics.” Don’t bother Googling it – I came up with the term myself, although what I mean by it is based on some pretty basic common wisdom.

In the analytical world, truths of reason have become truths of fact. You don't get a credit card if the major source of income on your bank account does not equal income code '03' (I am just making up an example here). You don't get good car insurance if you have the wrong ZIP code. Your flight gets canceled if the weather model predicts ash from a volcanic eruption. “Computer says no.” The analytics behind the models have become so complex that we cannot do anything but believe their outcomes. Consider the cause of the credit crunch – financial analytics chopping up and repackaging subprime mortgages. The analytics themselves (the truths of reason) have become the product itself (the truth of fact). All this occurs while, by definition, a model of reality and reality itself are not in sync. This is a recipe for disaster. Even so, we manage the nice and clean logic of the model instead of coping with the messy reality.

I am not suggesting we need to get rid of analytics. People suffer from bounded rationality – our brain capacity is insufficient to oversee complex problems and come to rational and optimized conclusions. Computers are a great help in better decision-making. However, as reality is messy, I suggest we adopt what I call messy analytics. This means various things.

First, when you do statistical analysis, resist the temptation to remove the outliers. Improbable scores or data points are usually filtered out of the dataset as noise that is "messing up" the model. However, the outliers might actually represent the most interesting bits. They could be the early warning of a coming black swan, or could represent new business opportunities that others – following best practices – neatly filter out. If the model is your lens, you won't see any change coming. You won't get any weird new ideas. What you see is what you've always seen. All the model does is confirm your hypothesis. Outliers deserve extra attention.
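One way to give outliers that extra attention is to flag them for review rather than silently drop them. Here is a minimal sketch: the sales figures, the 3-MAD threshold, and the function names are all illustrative assumptions, not a prescription.

```python
# Sketch: flag outliers for human review instead of deleting them.
# The threshold of 3 median absolute deviations is an arbitrary
# illustrative choice; tune it to your own data.

import statistics

def flag_outliers(values, threshold=3.0):
    """Split values into (typical, outliers) using the median absolute
    deviation (MAD), which is less distorted by the outliers themselves
    than the mean and standard deviation would be."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values) or 1e-9
    typical = [v for v in values if abs(v - med) / mad <= threshold]
    outliers = [v for v in values if abs(v - med) / mad > threshold]
    return typical, outliers

# Invented daily sales numbers with one implausible spike.
sales = [102, 98, 105, 97, 101, 99, 480, 103, 100, 96]
typical, suspects = flag_outliers(sales)
print("review these, don't delete them:", suspects)  # → [480]
```

The point of the design is that the suspicious values end up in front of an analyst, where they can turn out to be a data-entry error, an early warning, or a new opportunity.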

Secondly, I suggest intentionally introducing noise into the model and the analysis. This is based on a principle called perturbation. If you throw a "wrench in the machine," the people in the process need to figure out how to deal with that disturbance. In the case of an analysis, the analysts would have to figure out what the noise really means. Or, in the case of a learning algorithm, how the system would explain it. As the future cannot be predicted, the only thing you can do is be ready for it. Deliberately introducing disturbances helps people train for the moment when real change occurs (think of a fire drill). A real-life application of this can be found at airports. The X-ray machines are programmed to show forbidden objects in luggage, and it is the task of the security officer to notice those objects. Missing too many of them means the officer is not sharp enough! The same thing should be done in analytics as well.
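The airport technique translates directly to a data pipeline: plant labeled synthetic anomalies into a feed and score how many the monitoring step catches. The sketch below is a toy under invented assumptions – the record values, the `detector` rule, and the threshold are all made up for illustration.

```python
# Sketch of airport-style perturbation for analytics: mix known,
# labeled synthetic anomalies into real records, then check whether
# the detection step (human or algorithmic) actually catches them.

import random

def inject_anomalies(records, anomalies, seed=42):
    """Mix labeled synthetic anomalies into real records at random
    positions. Returns a list of (value, is_planted) pairs."""
    rng = random.Random(seed)  # fixed seed so drills are reproducible
    mixed = [(r, False) for r in records] + [(a, True) for a in anomalies]
    rng.shuffle(mixed)
    return mixed

def detector(value, limit=200):
    """Stand-in for whatever check the analyst or algorithm performs."""
    return value > limit

records = [101, 95, 110, 99, 104]   # invented "real" measurements
planted = [950, 1200]               # deliberately implausible values

hits = misses = 0
for value, is_planted in inject_anomalies(records, planted):
    if is_planted:
        if detector(value):
            hits += 1
        else:
            misses += 1

print(f"caught {hits} of {hits + misses} planted anomalies")  # → caught 2 of 2
```

Just as with the X-ray drill, a falling catch rate over time is itself a signal: the detector (or the analyst) has gone stale.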

Furthermore, there should be more qualitative research. One effective methodology is scenario analysis. The principle is very simple. Create between two and four scenarios that each challenge a major assumption the business model is based on (such as increasing prices, decreasing costs, or limited or abundant availability of a certain resource), and reason about what happens if that assumption suddenly no longer holds. Scenarios are narratives that describe how your organization would fare in such an imagined world. Instead of narratives, you could also create computer games in which you need to act in such a possible reality. Again, the aim is not to be right about the future, but to be ready for it. The narrative is not supposed to be precise and complete, but it should be imaginative and provide a call to action.
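The mechanics of challenging one assumption at a time can even be sketched in a few lines. The margin model and every number below are invented purely to illustrate the exercise; real scenarios would of course be narratives, not formulas.

```python
# Sketch: stress a deliberately simple business model against two to
# four scenarios, each breaking one baseline assumption. All figures
# are fictional.

def margin(price, unit_cost, volume):
    """Toy business model: contribution margin."""
    return (price - unit_cost) * volume

baseline = dict(price=10.0, unit_cost=6.0, volume=1000)

scenarios = {
    "baseline":          baseline,
    "price war":         {**baseline, "price": 7.0},      # prices collapse
    "input costs spike": {**baseline, "unit_cost": 9.0},  # costs rise
    "resource shortage": {**baseline, "volume": 400},     # supply dries up
}

for name, s in scenarios.items():
    print(f"{name:20s} margin = {margin(**s):8.0f}")
```

The numbers are not the point; the conversation they provoke ("what would we actually do in a price war?") is.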

Other types of qualitative research include ethnographic research or even participative research. One insurance company I know of has videotaped customers going through their interactions with the company, recording not only action and time, but also body language and other types of soft data. In fact, "Undercover Boss" is a TV show in the United Kingdom and the United States in which the CEO of a company goes undercover and works on the front lines for a few weeks, learning about the real work and real problems of the employees. This is quite a different approach from drilling down in a productivity report.

Scientists will be quick to point out the Hawthorne Effect: the observation that measurement itself already impacts the behavior of the subject being studied. The term comes from the Hawthorne Works, an American factory that conducted a series of worker productivity experiments in the 1920s. One of the tests focused on the impact of the factory's illumination on the workers’ productivity. They found that better lighting improved productivity, which was expected. However, they also found that dimming the lights or leaving them alone improved productivity too. Indeed, it was the very act of people walking around measuring productivity that impacted the productivity. This is a curse for science, because you try to study a subject as objectively as possible and don't want to alter its behavior. You can't control all variables, and there are no truths of reason in this style of analysis. You mess with reality, but then again, hopefully reality messes with you too and you learn something.

Perhaps the most straightforward advice I can give is to simply talk to people if you want to figure something out. Don't send surveys, as they only measure the things you thought of and rule out the possibility of serendipity. Talk to people. Ask customers and partners what they like about you and what they don't, and ask suppliers if they find it easy to do business with you. Ask your staff how they are doing, and how they think the company is doing.

Messy? Certainly, that's the good bit. It's as close to truths of fact as you can come.


SOURCE: Messy Analytics

  • Frank Buytendijk

    Frank's professional background in strategy, performance management and organizational behavior gives him a strong perspective across many domains in business and IT. He is an entertaining speaker at conferences all over the world, and was recently called an “intellectual provocateur” and described as “having an unusual warm tone of voice.” His work is frequently labeled as provocative, deep, truly original, and out of the box. More down to earth, his daughter once described it as “My daddy sits in airplanes, stands on stages, and tells jokes.” Frank is a former Gartner Research VP, and a seasoned IT executive. Frank is also a visiting fellow at Cranfield University School of Management, and author of various books, including Performance Leadership (McGraw-Hill, September 2008), and Dealing with Dilemmas (Wiley & Sons, August 2010). Frank's newest book, Socrates Reloaded, is now available and is highly recommended. Click here for more information on how to get your copy today.

    Editor's Note: More articles and a link to his popular blog are available in Frank's BeyeNETWORK Expert Channel. Be sure to visit today!

