Originally published 2 May 2007
In the first part of this series, we came to the conclusion that introducing a master data management (MDM) system may cause changes to data, applications, processes and workflows, user interfaces, documents and people. In this second part, I will look at each of these in detail.
MDM Change Management – Data
Looking at Figure 1, the introduction of a centralised MDM system with a single system of record could potentially mean that all the “local” system of entry (SOE) master data subsets shown in the diagram could be phased out. In theory, this could be achieved without changing the SOE applications by using enterprise information integration (EII) software to create a virtual view of each application’s data model. EII products would need to support distributed heterogeneous two-phase commit in this solution. Some MDM products include an EII product as part of the solution that may help with this (e.g., Siperian with its Activity Manager). It may also be the case that you already have an EII product in your shop (e.g., one that comes with a reporting tool, such as those from Composite Software and Cognos ReportNet). This nonintrusive approach is shown in Figure 2, where the EII on-demand data integration software slides between the SOE applications and their data to create virtual views of the underlying data. These virtual views are replicas of the SOE system data models that map to both the local application-specific data still in use and the central master data.
Note that the multiple systems of entry still maintain the master data; they simply do not know they are being redirected to the MDM system for that data. A perfectly valid question here is, “Is EII always a valid option?” I think the answer is no; it really depends on the application. Consider IBM CICS and IMS mainframe applications, which may be very high-volume transaction processing systems. The kind of architecture in Figure 2 may not be able to sustain that transaction volume, and I would certainly want to benchmark any EII software in this kind of architecture to validate whether or not it is viable. EII products may also have limitations on update processing. Other questions can be raised against EII in update mode, including how data is validated and cleansed when updating through an EII virtual view before the master data changes reach the MDM system. If you want to enforce this kind of data validation and cleansing during update, then invoking data integration services may be needed to update master data. What is certainly clear is that even if EII is viable, it is a tactical option until more serious re-engineering can be budgeted and implemented. The positive benefit is that it might be possible to consolidate the data without changing the applications. But even if this is achievable, why would you ever want to leave it this way, with some of your data centralised and some not? It seems to me that once you head down this road, you must surely be intent on completing the job of centralising core master data and even transaction data. It is true to say that application-specific operational reporting should continue to work when using EII or an MDM registry product such as Initiate or Purisma in this manner.
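The redirection idea behind Figures 1 and 2 can be illustrated with a minimal sketch. All class, field and method names here are hypothetical, and a real EII product would wrap the two updates in distributed two-phase commit, which is elided:

```python
# Minimal sketch of an EII-style virtual view: the SOE application keeps
# reading and writing its familiar local record layout, but master data
# attributes are transparently redirected to the central MDM store while
# application-specific attributes stay local. All names are illustrative.

class MdmStore:
    """Stand-in for the central master data hub."""
    def __init__(self):
        self._master = {}  # customer_id -> master data attributes

    def get(self, customer_id):
        return self._master.get(customer_id, {})

    def put(self, customer_id, attrs):
        self._master.setdefault(customer_id, {}).update(attrs)


class VirtualCustomerView:
    """Replica of the SOE's customer data model: master data fields map
    to the hub, application-specific fields map to the local database."""
    MASTER_FIELDS = {"name", "address"}           # held centrally
    LOCAL_FIELDS = {"credit_terms", "sales_rep"}  # application specific

    def __init__(self, mdm, local_db):
        self._mdm = mdm
        self._local = local_db  # dict: customer_id -> local attrs

    def read(self, customer_id):
        row = dict(self._local.get(customer_id, {}))
        row.update(self._mdm.get(customer_id))  # splice in master data
        return row

    def write(self, customer_id, attrs):
        master = {k: v for k, v in attrs.items() if k in self.MASTER_FIELDS}
        local = {k: v for k, v in attrs.items() if k in self.LOCAL_FIELDS}
        # A real EII product would do both updates under distributed
        # heterogeneous two-phase commit; that coordination is elided.
        if master:
            self._mdm.put(customer_id, master)
        if local:
            self._local.setdefault(customer_id, {}).update(local)
```

The point of the sketch is the split in `write`: the SOE still issues one update against what looks like its own data model, unaware that part of it now lands in the hub.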
The alternative to EII is that multiple SOEs would have to be changed to use common master data services to access and maintain common master data (see Figure 3). The problem is that this requires applications to be changed if master data is consolidated into a master data store. This could be done for custom-built applications but is difficult to achieve for packages. Again, the multiple systems of entry still maintain the master data.
Looking at Figure 3, you may well ask whether the common CRUD services surrounding the master data are enough. It is possible that they are not. In fact, they are definitely not enough if you want the changes made to the master data hub by disparate line-of-business systems of entry to be consistently validated, cleansed and correctly transformed every time. This would require applications to call common master data integration or data quality services (or even to invoke them on an event-driven basis). These data integration or data quality services could then themselves call the underlying CRUD services to maintain the data, only after the data has been validated and cleansed.
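The layering just described can be sketched as follows. The cleansing rules, class names and fields are assumptions for illustration, not any particular product's API; the point is that the quality service sits in front of the CRUD service so every SOE gets identical validation and cleansing:

```python
# Sketch: a common data quality service that validates and standardises
# master data before delegating to the underlying CRUD service, so that
# disparate SOEs cannot write unclean data into the hub. The specific
# cleansing rules shown here are illustrative only.

class CustomerCrudService:
    """Underlying CRUD service over the master data store."""
    def __init__(self):
        self.store = {}

    def create(self, customer_id, record):
        self.store[customer_id] = record


class CustomerQualityService:
    """Common service all SOEs call; it cleanses and validates, then
    delegates to the CRUD service only when the data is clean."""
    def __init__(self, crud):
        self._crud = crud

    def create(self, customer_id, record):
        cleaned = self._cleanse(record)
        self._validate(cleaned)
        self._crud.create(customer_id, cleaned)  # clean data only

    @staticmethod
    def _cleanse(record):
        out = dict(record)
        # Collapse whitespace and standardise capitalisation and postcode.
        out["name"] = " ".join(out.get("name", "").split()).title()
        out["postcode"] = out.get("postcode", "").upper().replace(" ", "")
        return out

    @staticmethod
    def _validate(record):
        if not record["name"]:
            raise ValueError("customer name is mandatory")
        if not record["postcode"]:
            raise ValueError("postcode is mandatory")
```

Because the CRUD service is only reachable through the quality service, validation cannot be bypassed by any one line-of-business application.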
MDM Change Management – Applications
Having looked at the impact on data, the next area to consider is the impact of MDM on applications. Figure 4 shows that application functionality needs to be changed if the MDM system is a system of record (SOR) and the data is only held centrally. Figure 5, on the other hand, shows that application functionality needs to be removed if the MDM system is both a SOR and an SOE and the data is only held centrally. This means that you have to identify what application functionality in which applications needs to be changed or removed.
MDM Change Management – User Interfaces
With respect to user interfaces, if the MDM system becomes a central system of entry, then all disparate master data SOEs need to be prevented from changing master data. This means going through all the user input screens and online forms to identify master data entry fields that will need to be removed from these screens. In the case of “green screen” terminal applications (e.g., IBM 3270 mainframe applications), the user interface needs to be re-engineered to remove these fields while still allowing master data to be updated centrally. One way to do this is to use a product like Software AG ApplinX or IBM HATS to “wrap” the green screen application so that the 3270 terminal data stream user interface is “screen scraped” to produce XML for presentation in a portal as portlets, taking care not to include the master data entry fields from the old green screens. One or more new portlets could then be built to manage master data entry directly into the central MDM system (see Figure 6). In this way, the user still sees one screen in their browser and may be completely unaware that they are updating multiple systems through a single user interface.
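The filtering step in this wrap-and-scrape approach can be reduced to a small sketch. The 3270 field names and the flat field map are assumptions for illustration; real products such as ApplinX or HATS have their own configuration mechanisms for this:

```python
# Hypothetical sketch of the filtering step: the scraped green screen
# arrives as a flat map of field name -> value. Master data entry fields
# are stripped out before the remaining fields are rendered in the
# wrapped legacy portlet; the stripped fields belong instead to a new
# portlet that posts master data directly to the MDM system.

MASTER_DATA_FIELDS = {"CUST-NAME", "CUST-ADDR"}  # assumed 3270 field names

def split_screen_fields(scraped_fields):
    """Partition a scraped screen into fields that stay on the wrapped
    legacy portlet and fields that move to the MDM entry portlet."""
    legacy = {k: v for k, v in scraped_fields.items()
              if k not in MASTER_DATA_FIELDS}
    master = {k: v for k, v in scraped_fields.items()
              if k in MASTER_DATA_FIELDS}
    return legacy, master
```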
While this kind of user interface re-engineering may be possible for custom-built applications, the obvious problem is packaged applications, in that you may not be able to re-engineer the user interface of such applications to remove or disable master data fields. You stand a chance here if the packaged application is portlet based, as is the case with the latest releases of some of the major packaged applications on the market today. If you cannot disable master data entry fields on a packaged application SOE user interface, then you will most likely have to tolerate master data changes made via these SOEs and design for two-way synchronisation and conflict resolution with the central MDM system, which is the main master data SOE.
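To make the conflict resolution requirement concrete, here is one deliberately simple policy, sketched under the assumption that each field carries a last-updated timestamp. Real MDM products offer much richer, configurable survivorship rules (source ranking, trust scores, and so on):

```python
# Sketch of a simple last-update-wins conflict resolution policy for
# two-way synchronisation between a packaged-application SOE and the
# MDM hub. Each record maps field -> (value, last_updated_timestamp).

def merge_records(hub_record, soe_record):
    """Return the merged record: for each field, the most recently
    updated value survives, whichever system it came from."""
    merged = dict(hub_record)
    for field, (value, ts) in soe_record.items():
        if field not in merged or ts > merged[field][1]:
            merged[field] = (value, ts)
    return merged
```

Per-field timestamps rather than per-record timestamps matter here: two systems may each have changed different attributes of the same customer, and a whole-record policy would needlessly discard one side's changes.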
MDM Change Management – Document Workflow and System Processes
With respect to document workflow and system processes, MDM systems can also cause changes here. For example, if there are multiple ways to add a new customer in an order entry process, these may need to be changed to use common master data services. In addition, some tasks (e.g., document flows to other departments) may need to be removed from processes because access to shared master data makes them redundant.
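The order entry example can be sketched as follows. The service, matching key and function names are hypothetical; the idea is that every process path that used to create a customer locally now routes through one shared service, which also gives a natural place for duplicate detection:

```python
# Sketch: an order entry process that used to create customers locally
# now calls a shared master data service first. Matching in the service
# (here a crude name-plus-postcode key) replaces the many per-application
# "add customer" paths. All names are illustrative.

class MasterCustomerService:
    def __init__(self):
        self._by_key = {}
        self._next_id = 1

    def get_or_create(self, name, postcode):
        """Return the existing master customer id for this party, or
        create one; duplicate entries collapse onto one golden record."""
        key = (name.strip().lower(), postcode.replace(" ", "").upper())
        if key not in self._by_key:
            self._by_key[key] = self._next_id
            self._next_id += 1
        return self._by_key[key]


def enter_order(master_service, name, postcode, items):
    """Order entry step: resolve the customer via the shared service
    instead of inserting into a local customer table."""
    customer_id = master_service.get_or_create(name, postcode)
    return {"customer_id": customer_id, "items": items}
```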
MDM Change Management – People and Documents
Finally, the introduction of an MDM system may even affect people. People may no longer be required to perform specific tasks associated with master data. Examples of such tasks might include manual credit checking of customers, passing documents about customers and/or products between departments, and creating and maintaining master data. These may be duplicate, overlapping, conflicting or redundant tasks and are therefore likely to be phased out. This is a sensitive issue and requires careful handling. The processes, applications and screens that people use may change, so re-training may also be needed.
With respect to documents, it may well be the case that certain documents and/or forms that people have been using will change as a result of the introduction of an MDM system. For example, master data fields may need to be removed from these documents, or some documents may be eliminated completely because they are redundant.
In most enterprises, the introduction of master data management is often part of a larger business integration initiative aimed at simplifying systems and infrastructure and consolidating data so that it can be shared across applications and processes. While simplification is a good thing in general, it is clear that introducing master data management requires careful planning of change that can ripple right across the enterprise and beyond. A good understanding of business processes is a critical success factor in implementing MDM, not only to identify all the master data needed but also to understand the full impact of introducing MDM on business operations. Without process analysis, MDM will focus purely on data integration and data synchronisation, and IT may be caught unaware of the impact on their business colleagues. The result may be that half a job is done, with the unexpected realisation late in a project that much greater change is needed, beyond what was budgeted and planned for. IT may then be under pressure to deliver such unplanned changes in a hurry to try to minimise the impact on the business. The way to avoid this is to learn how your business processes work at an early stage in the project while preparing to implement master data management. In this way, you should stay on track and on budget without experiencing nasty surprises.