When Data Can’t Be Trusted, Master Data Management Becomes “Plan B”

Originally published 4 August 2009

For some time now, data has been moving around the enterprise at a faster pace. Banks, retailers, communications companies, healthcare providers and high-tech firms – they all have a more urgent need to share data across business applications, end users and core systems. The emergence of new technologies, most notably message bus solutions, facilitates these new levels of data exchange. But this data sharing reveals other problems, namely different versions of the same data.

The fact is that there are more constituents, more lines of business and more business uses for customer data than ever before. But as quickly as solutions for data movement were being introduced, problems with the data itself became more visible. Two different data fields that meant the same thing nevertheless looked different, and were interpreted very differently. This was true for established data types like “Social Security number” that implied strict formatting rules, but also for data types like “address” where the rules might be less rigorous. Ironically, the trend of master data management (MDM) – the automation of data reconciliation across and between systems for various types of reference data – was greeted heartily by business and IT executives who were still fundamentally change-averse. Yet, many didn’t know what to do or where to start.

Let me remind you that master data management is not a new solution to an old problem, but rather a new solution to a new problem. Simply put, it’s a paradigm shift. The reasons for this are fundamental to what MDM is. Labels aside, the true value of any MDM solution is its ability to acquire, asynchronously or in real time, enterprise data from the heterogeneous systems and online stores where that data originates. While many vendors rushed to hang the MDM shingle, or re-crafted their marketing messages to retrofit their products into the MDM rubric, the best MDM solutions are demand-driven: they match and correct data when it’s used. And, not to put too fine a point on it, they don’t touch the data if it’s not being used.
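That demand-driven behavior can be pictured with a minimal sketch. Everything here is an assumption for illustration, not any vendor’s implementation: the `DemandDrivenHub` class, the `standardize_ssn` rule, and the use of plain dictionaries as stand-ins for source systems.

```python
# Minimal sketch of demand-driven matching (all names are illustrative
# assumptions, not a real product's API): records are matched and corrected
# only when a consumer asks for them; unused data is never touched.

def standardize_ssn(raw):
    """Normalize a Social Security number to the strict NNN-NN-NNNN format."""
    digits = "".join(ch for ch in raw if ch.isdigit())
    if len(digits) != 9:
        raise ValueError(f"not a valid SSN: {raw!r}")
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:]}"

class DemandDrivenHub:
    def __init__(self, sources):
        self.sources = sources   # heterogeneous source systems (dicts here)
        self.golden = {}         # corrected "golden" records, built lazily

    def get(self, key):
        # Correct the record the first time it is used, then cache it.
        if key not in self.golden:
            raw = next(s[key] for s in self.sources if key in s)
            self.golden[key] = {"ssn": standardize_ssn(raw["ssn"])}
        return self.golden[key]
```

Requesting a record triggers its correction; records no one asks for stay exactly as their source systems hold them.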

As data is used for different purposes, in different ways, by different constituencies, it further erodes to the point where it becomes generally regarded as untrustworthy. Even direct source system reports, usually considered more reliable, are difficult to interpret. And as you add more source systems, ETL programmers need to assess the impact on their expanding collection of ETL jobs and modify them accordingly. Point-to-point code that had worked before stops working. This becomes even more unwieldy when you begin to consider subscribing to third-party data.

A sad turning point comes when mistrust of the data becomes part of the company’s culture. Following the inevitable pressure from business colleagues, CIOs eventually come around to the idea of going to “Plan B” and fixing the data “once and for all.” CIOs generally try to resist “Plan Bs” because they almost always involve disruptive technology. So the search begins for a solution that is flexible, authoritative and permanent. Yet because existing systems and users still need data from the company’s data warehouse, CRM and EII environments, any solution needs to cause minimal system or business disruption.

Mindful of these requirements, the CIOs take a hard look at MDM for a single point of reference for data matching, integration and two-way propagation. A new MDM hub prevents “one off,” custom solutions for every new system linkage. And it eases the burden of having to propagate data from new sources and support different integration rules within and between operational systems.

With “Plan B,” the MDM hub assumes the role of a data reference system. Systems in need of data call the MDM hub for the reference record, and then retrieve the actual data. In tandem, you have the continued propagation of data to and from different systems. In this way, companies finally have access to authoritative versions of data for a range of applications while avoiding drastic, disruptive change.
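The two-step call pattern described above can be sketched roughly as follows. The names (`MdmHub`, `register`, `reference`, `fetch_customer`) and the in-memory registry are assumptions invented for this example, not a specific product’s API.

```python
# Illustrative two-step lookup: a consumer asks the hub WHICH system holds
# the authoritative record (the reference record), then fetches the actual
# data from that system. Existing systems keep serving their own data; the
# hub only brokers.

class MdmHub:
    def __init__(self):
        self.registry = {}  # master key -> (source system name, local key)

    def register(self, master_key, source, local_key):
        self.registry[master_key] = (source, local_key)

    def reference(self, master_key):
        """Step 1: return the reference record (where the real data lives)."""
        return self.registry[master_key]

def fetch_customer(hub, systems, master_key):
    source, local_key = hub.reference(master_key)  # step 1: reference lookup
    return systems[source][local_key]              # step 2: actual retrieval
```

Because consumers go through the hub rather than hard-coding a source, adding or changing a source system means updating one registry entry instead of rewriting every point-to-point link.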

Because it works so well, the MDM hub becomes affectionately known as the “master chef” for enterprise data. It ensures that all systems and users have “standard ingredients” – and standard mixing instructions. If the ingredients change – say, a source system redefines its data – other ingredients could be affected, and the chef regulates the entire recipe. The MDM team’s unofficial but oft-quoted motto becomes: “Never cook with rotten eggs.”

Master data management applies consistent business rules to data that ensure its reliability. That reliability is informed by the data’s ability to reflect real-world facts. And – to be truly useful across business processes, technologies and decisions – data changes when real-world facts change, with as little latency as possible.
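One way to picture “consistent business rules” is a single rule set applied identically to every record, wherever it is used. The specific rules below (state and ZIP normalization) are invented for illustration.

```python
# Hypothetical sketch: one shared rule set, applied the same way to every
# record, is what makes the resulting data consistently reliable.

RULES = {
    "state": lambda v: v.strip().upper()[:2],                     # "tx " -> "TX"
    "zip":   lambda v: "".join(c for c in v if c.isdigit())[:5],  # keep 5 digits
}

def apply_rules(record):
    """Apply each governing rule to its field; pass other fields through."""
    return {k: (RULES[k](v) if k in RULES else v) for k, v in record.items()}
```

When a real-world fact changes – a customer moves, say – re-running the same rules over the updated record yields an equally reliable result, which is why low-latency propagation matters.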

The need to retrofit existing processes to new tools confronts an historical resistance to change. But MDM isn’t a substitute for existing technologies like data warehousing, CRM or EII, each of which addresses a very specific set of needs. Rather, MDM is nothing less than a disruptive innovation that, again, solves new problems with a new solution.

But that doesn’t mean your existing solutions aren’t still important. Indeed, MDM can enhance their performance, their accuracy and their overall business value.

As companies increasingly recognize MDM’s promise as a solution to their most urgent business problems, research leads them to a range of vendor solutions. As you do your own research, you should first understand your company’s unique requirements for accurate, meaningful master data. Then, consider those requirements as you evaluate your MDM options. MDM is at its most effective when linking multiple systems for the purposes of once-and-done data reconciliation. Not only will you be able to capitalize on MDM’s economies of scale each time systems are added or changed, but you’ll also reap the benefits of data consistency, validity and accuracy. Then, when you get asked about your Plan B, you’ll give it an A+.


  • Jill Dyché

    Jill is a partner and co-founder of Baseline Consulting, a technology and management consulting firm specializing in data integration and business analytics. Jill is the author of three acclaimed business books, the latest of which is Customer Data Integration: Reaching a Single Version of the Truth, co-authored with Evan Levy. Her blog, Inside the Biz, focuses on the business value of IT.

    Editor's Note: More articles and resources are available in Jill's BeyeNETWORK Expert Channel. Be sure to visit today!


