Five Steps to Keep an Organization from Fumbling Its Analytics Implementation

Originally published 7 December 2009

Enterprise-wide analytics programs can be a boon to an organization – particularly during challenging times. Getting proper information and putting it into the right hands at the right time (and taking the right action as a result) can mean the difference between a company’s life and death, between its success and mere survival.

Yet, many analytics implementations fall short of the target. Organizations frequently invest in significant upgrades to their analytical capabilities, but never reap the expected rewards.

This is borne out by Accenture research showing that two-thirds of the large companies in the United States and United Kingdom believe they need to improve their analytical capabilities. With this belief comes the growing understanding that better managing their businesses and improving their decision making requires more than gut instinct or politics as usual.

Factors that Undermine Implementations

Five key factors are largely responsible for the best-laid plans falling short. For an organization to establish an analytics program that will help it get the right information, ask the right questions and make the right decisions, it needs to take a hard look at the following five concerns that can undermine any implementation:
  1. Management fails to embrace analytics. What’s needed, regardless of the industry, is a management team that embraces the idea of using analytics to guide the business and help it make key decisions. Analytics is more than technology, and it starts at the top with a CEO who views analytics as a way to run the business and make decisions. Yet a critical part of this is also a CIO with the mind-set of “How do I turn data into a business asset?” and the readiness to create the right interface between business and IT.

  2. Failures in data quality assurance (i.e., “garbage in, garbage out”). The biggest failures we see with analytics, as one would expect, are those where the underlying data quality is poor. As a result, people spend more time arguing about the data than about what it means or what they should do about it. When the initial data quality is flawed or in question, everything that flows from it in the analytical process suffers. The ingredients, after all, need to be right for the meal to be satisfying. Any failure creates a lack of faith in the data driving the analysis, and that is where data quality most often sabotages analytical implementations. This means you may have to be selective about the data you feed into your model; less can be more when you focus on the most pertinent information and leave out questionable numbers (a small illustration of this kind of filtering appears after this list).

  3. Unstructured data. This is particularly important in certain fields, such as patient care or credit risk analysis, where there is a need to combine access to non-system or unstructured data, whether voice, video or simple documents, with the structured data already in place in order to extract the most information from enterprise-wide analytics.

  4. Legacy systems. For organizations that have been running business intelligence programs and collecting data for years, and are now looking to mine insights they can funnel to the front line, questions remain about which information processing capabilities to keep, which to adapt and which to scrap. This can become an arduous process fraught with confrontation and difficult decisions. Some companies will require wholesale investments in new tools and processes, while others will simply need to tweak existing offerings.

  5. An inability to decide what “success” is. It makes little sense to invest in analytics without first knowing the questions the organization wants answered. The question should not be “How do we get all the data we possibly can into the system?” but rather “How do we select the data we need to put in?” As such, a “look before you leap” approach is needed. Organizations that are successful in the early stages do not dive into systemic reform; they first take a step back and ask where specific information will be used in the business, what it will be used for, and how it will make a difference. Many programs fail because decision makers never articulated a clear goal. Without a target, it is impossible to achieve the necessary objectives.
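
As a small illustration of the selectivity mentioned in point 2, the sketch below is a minimal Python/pandas example of a pre-model quality gate. It is not drawn from the article itself; the file name, column names and revenue bound are illustrative assumptions. It keeps only records that pass basic checks and sets questionable rows aside for review rather than feeding them silently into a model.

    # A minimal sketch of a pre-model data-quality gate (illustrative only).
    # The file name, column names and revenue bound are assumptions.
    import pandas as pd

    def quality_gate(df: pd.DataFrame) -> pd.DataFrame:
        """Keep rows that pass basic checks; set the rest aside for review."""
        passes = (
            df["region"].notna()                        # every record must be attributable
            & df["revenue"].between(0, 1_000_000_000)   # reject impossible or placeholder values
            & pd.to_datetime(df["date"], errors="coerce").notna()  # reject unparseable dates
        )
        rejected = df[~passes]
        if len(rejected):
            # Questionable rows go to a review queue instead of into the model.
            rejected.to_csv("quality_review_queue.csv", index=False)
        return df[passes]

    clean = quality_gate(pd.read_csv("sales_extract.csv"))

The particular checks matter less than the habit they represent: decide explicitly which records are trustworthy enough to drive a decision, and keep the rest out of the analysis until someone has reviewed them.
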
Those organizations that can effectively address these five concerns and get to the point where real-time analytic data drives their decision making will, as a result, make better and more timely decisions. Today’s tough economic times dictate this more than ever.



  • Royce Bell
    Royce is CEO of Accenture Information Management Services. During his more than 23 years in management consulting, technology and outsourcing services, he has worked with a range of large multinational companies, notably Shell and BP, and several government institutions. Bell sits on the board of the Oxford University Business Economic Programme. He has lived and worked for extensive periods in the Far East and France. Bell has a Bachelor’s and Master’s in Natural Sciences and Law from Cambridge University. A keen theatre supporter, he also sits on the corporate advisory group of the National Theatre in the UK.
 
