Judgment Day on Enterprise Data, Part 1

Originally published 1 November 2006

Over the last few years, the number of cross-industry and industry-specific compliance initiatives being introduced through legislation and regulatory bodies has increased rapidly. These compliance initiatives range from tougher, fully auditable accounting practices to specific requirements in areas such as anti-money laundering and information security.

The main reason most of these initiatives have been introduced is to improve the quality of business by raising standards of business practice.

Almost all of these compliance initiatives rely on one thing – data. Moreover, that data must be secure, correct, complete and trusted to avoid the heavy financial penalties and prosecutions that can be triggered if it is proven to be wrong. Regulatory and governmental bodies often require companies to provide data in a variety of regulatory reports covering business performance and other information as part of their compliance initiatives. In addition, compliance may also involve tightening up and monitoring business processes so that auditors can verify that data and documents associated with customers, orders and products are properly managed and tracked as they flow through the enterprise.

One of the problems businesses face with these regulations is that the data needed for monitoring and regulatory reporting may have been created and stored in many different systems across the enterprise. This data may be housed in files and databases in multiple operational systems, in data warehousing systems and even in multiple office documents such as spreadsheets. Data in these systems could have been created in a number of ways. It may have come into the enterprise in files attached to emails, in inbound electronic messages, as keyboard or mobile device input from customers, partners or employees inside or outside the enterprise and even from internal programs creating and passing data to each other via batch files or electronic messages.

Most organisations do not have any consistent way to check and validate this data. Each system normally acts independently of all others in this regard, if data validation is done at all. In other words, there is no guarantee that all the data coming through different channels into these systems is consistently checked, correct or complete, let alone trusted. For this reason, many people create “their own data” in spreadsheets, which they feel they can trust, and send it to others as email attachments, thereby creating more versions of the data and more uncertainty over whether that data is correct. On this basis, it could be argued that if financial reports are compiled from data cut and pasted together from any of these systems, executives could be standing up at annual general meetings and quarterly reviews in front of shareholders quoting financial figures with no real guarantee or knowledge of whether the data is correct. Equally, it could be argued that there is no guarantee that regulatory reports compiled in a similar fashion and submitted to regulatory bodies and government organisations are correct.
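
To make the idea of consistent validation concrete, here is a minimal sketch, in Python, of how a single shared set of validation rules might be applied to records regardless of which channel they arrive through. The field names, rules and channel labels are illustrative assumptions, not taken from any particular system.

# Minimal sketch: one shared set of validation rules applied to every record,
# whichever channel it arrives through. Field names, rules and channel labels
# are hypothetical examples, not taken from any real system.

from datetime import datetime

def _is_iso_date(value):
    """Return True if value parses as an ISO date (YYYY-MM-DD)."""
    try:
        datetime.strptime(value, "%Y-%m-%d")
        return True
    except (TypeError, ValueError):
        return False

# Shared rules: each maps a field name to a predicate the value must satisfy.
VALIDATION_RULES = {
    "customer_id": lambda v: isinstance(v, str) and v.strip() != "",
    "order_total": lambda v: isinstance(v, (int, float)) and v >= 0,
    "order_date":  _is_iso_date,
}

def validate_record(record, channel):
    """Apply the same rules to a record from any channel (email attachment,
    message queue, keyed input, batch feed) and return a list of errors."""
    errors = []
    for field, rule in VALIDATION_RULES.items():
        if field not in record:
            errors.append(f"{channel}: missing field '{field}'")
        elif not rule(record[field]):
            errors.append(f"{channel}: invalid value for '{field}': {record[field]!r}")
    return errors

# Example: the same checks run on records from two different entry points.
print(validate_record({"customer_id": "C123", "order_total": 99.5,
                       "order_date": "2006-11-01"}, "message_queue"))
print(validate_record({"customer_id": "", "order_total": -10}, "spreadsheet_upload"))

The point of the sketch is simply that the rules live in one place, so every entry point enforces the same definition of “correct and complete” rather than each system deciding for itself.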

Even within a data warehouse environment that has been created to integrate data, discrepancies can exist. These are due mainly to:

  • Lack of investment in the data quality tools needed for data warehouse development, implying that in many cases people may not be following best practice in technology selection or in development processes. 

  • Isolated development of multiple individual data marts by different development teams that do not coordinate and collaborate. 

  • Lack of enforcement of common names and data definitions for the same data used across multiple data marts and multiple business intelligence (BI) tools; such definitions should have been agreed when taking data from customer relationship management (CRM), enterprise resource planning (ERP) and other operational systems.


Isolated development of multiple financial data marts in different business units, for example, can lead to these data marts using different names for the same data, different formulae for the same metrics across multiple business intelligence tools, and different extract, transform and load (ETL) translations to pre-calculate the same metric when populating it in different data marts. Even worse is the same name with different formulae for a metric in different business views within the same BI tool. In that case, users don’t know that the metric in their view is calculated differently from the one in someone else’s view. Once BI tools and packaged analytic applications are integrated into enterprise portals, the root of the problem is lost in a mass of information and users simply don’t know what produces the metrics they see. Hence, they may not even be aware that there is a problem, or they may simply mistrust the output.
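
As a rough illustration of how this “same metric, different formula” problem might be detected, the short Python sketch below compares metric definitions collected from different data marts and BI views and flags any metric that is calculated in more than one way. The mart names, metric names and formulae are invented for the example.

# Illustrative sketch: flagging metrics that are defined differently in
# different data marts / BI views. All names and formulae below are invented.

from collections import defaultdict

# Each entry: (data mart or BI view, metric name, formula as recorded there).
metric_definitions = [
    ("finance_mart", "gross_margin", "(revenue - cost_of_sales) / revenue"),
    ("sales_mart",   "gross_margin", "(revenue - cost_of_sales - freight) / revenue"),
    ("emea_bi_view", "gross_margin", "(revenue - cost_of_sales) / revenue"),
]

def find_conflicting_metrics(definitions):
    """Group definitions by metric name and report any metric that is
    calculated with more than one distinct formula."""
    formulas_by_metric = defaultdict(set)
    sources_by_metric = defaultdict(list)
    for source, metric, formula in definitions:
        formulas_by_metric[metric].add(formula)
        sources_by_metric[metric].append((source, formula))
    return {
        metric: sources_by_metric[metric]
        for metric, formulas in formulas_by_metric.items()
        if len(formulas) > 1
    }

conflicts = find_conflicting_metrics(metric_definitions)
for metric, sources in conflicts.items():
    print(f"Metric '{metric}' has conflicting definitions:")
    for source, formula in sources:
        print(f"  {source}: {formula}")

In practice this kind of check presupposes that metric definitions are recorded somewhere central in the first place, which is exactly the discipline that isolated data mart projects tend to skip.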

Given this background, it is not surprising that many CFOs tread the compliance road very carefully, often retreating back into their ERP shell and trying to rely on data in one system, or else relying on spreadsheet reports and personal databases constructed in an attempt to reconcile discrepancies in underlying data taken from multiple systems.

So the question is: How do you know whether the data used in regulatory reports to satisfy compliance regulators and legislators is right or wrong? Are you playing compliance roulette with bad data? Is your CFO standing on a house of cards built from multiple spreadsheets whose data cannot be guaranteed to be correct? Do you trust your data enough to tell it to a judge?

If the answer to any of these questions is “no” or “not sure,” then action is needed to create “rock solid” trusted data once and for all to achieve compliance, transparency and total business confidence. This article looks at how this problem can be solved and offers suggestions on how to approach getting better data from your critical systems.

The Burden of Enterprise Compliance and Key Compliance Initiatives
As the world starts to clean up its act in the aftermath of major publicly quoted corporate collapses, new legislation is being introduced and more and more regulatory bodies are turning their attention to strengthening their own procedures for regulating their markets. The burden of compliance is therefore growing, with new regulations appearing and still more to come. This burden is beginning to weigh heavily on the shoulders of executives such as CEOs and CFOs, who typically carry ownership of it. In particular, it is the vast number of different compliance initiatives that is putting pressure on executives. The number of compliance initiatives that enterprises have to deal with varies by vertical industry and is also influenced by where in the world a company trades.

Some country-specific compliance initiatives apply to all companies irrespective of industry. A good example is the Sarbanes-Oxley (SOX) legislation in the United States, which applies to any publicly quoted company trading in the U.S. and filing financial reports with U.S. financial regulators. In some parts of the world, such as Europe, the compliance problem is often more acute than elsewhere, mainly because individual European Union countries introduce different country-specific legislation and regulations to implement European Union policy. The result is growing concern over the complexity of the regulatory workload, with some industries such as financial services and insurance carrying a significant burden.

In addition to the now infamous SOX, another regulatory requirement getting executive attention is International Financial Reporting Standards (IFRS). On 1 January 2005, IFRS became the EU standard for the accounting practices of listed companies. For many CFOs in EU firms, this set of regulations is just as important as SOX. IFRS demands a more complete picture of, and much more insight into, listed companies’ business performance. It forces companies to do much more forecasting and to hold and provide much more detailed and accurate information. For example, EU listed companies now have to disclose the precise daily movement on foreign currencies they hold and guarantee that these figures are correct. They also have to provide auditable summaries of share-based payments, inventories, cash flow, events after the balance sheet date, borrowing costs, revenue, leases and much more.

The Impact of Poor Data Quality on Enterprise Compliance
Without control of data quality in all the areas where data enters the business and where data flows between systems, the data foundation upon which enterprise compliance depends starts to crumble. Flaws in data accuracy, completeness and integrity at the outset will ripple right through a company, potentially causing:

  • Defects in processes that may result in additional operational costs, operational delays and compliance violations (e.g., because missing data cannot be recorded in audit logs).
  • Questions over accuracy of performance metrics in compliance reports leading to audits and potential penalties if the data produced is proven to be incorrect.
  • Business unit compliance reports not reconciling at enterprise levels.
  • Incorrect decisions and incorrect actions.
  • Increases in reconciliation activities in order to resolve discrepancies in data.

Monitoring processes via business activity monitoring (BAM) adds another level of complexity and risk when it involves real-time automation of decisions and actions. In particular, it is important to avoid incorrect decisions and actions based on monitoring events for noncompliance in business processes. If bad data causes incorrect automated actions during process monitoring (e.g., perhaps because bad data indicates a control threshold has been exceeded), then the impact on the business could be very damaging. As an example, what if bad data results in incorrect detection of fraud or incorrect detection of money laundering? This could damage reputations and may even attract legal action against the company for defamation. In addition, the business could still remain noncompliant.
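
One hedge against this risk is to validate the event data itself before any automated action fires. The Python sketch below shows one possible pattern: a monitored event only triggers an automated alert if it passes basic plausibility checks; otherwise it is routed to a person for review. The threshold, field names and handler functions are all hypothetical, invented for illustration.

# Hypothetical sketch: guarding an automated compliance action with a data
# quality check, so a control-threshold breach only triggers an action when
# the event data itself looks plausible. Thresholds, field names and the
# handler functions are invented for the example.

TRANSACTION_LIMIT = 10_000.00  # example monitoring threshold

def is_plausible(event):
    """Basic sanity checks on the monitored event before acting on it."""
    amount = event.get("amount")
    return (
        isinstance(amount, (int, float))
        and 0 <= amount < 1_000_000_000     # reject absurd values likely caused by bad data
        and bool(event.get("account_id"))   # must identify an account
        and event.get("currency") in {"USD", "EUR", "GBP"}
    )

def handle_event(event, raise_alert, queue_for_review):
    """Only automate the action when the data looks trustworthy; otherwise
    route the event to a human instead of acting on possibly bad data."""
    if not is_plausible(event):
        queue_for_review(event, reason="failed data quality checks")
        return
    if event["amount"] > TRANSACTION_LIMIT:
        raise_alert(event)

# Example wiring with simple stand-in handlers.
handle_event(
    {"account_id": "A-42", "amount": 25_000.0, "currency": "USD"},
    raise_alert=lambda e: print("ALERT raised for", e["account_id"]),
    queue_for_review=lambda e, reason: print("Queued for review:", reason),
)

The design choice here is simply that suspect data never drives an automated decision on its own; it is diverted to manual review, which limits the damage a single bad value can do.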

While this is enough of a problem in itself, it is not until you look at some of the penalties associated with noncompliance that you realise why executives like CEOs and CFOs should care about the impact that poor data quality can have.

For example, Section 302 of the Sarbanes-Oxley Act says that corporate officers must certify the accuracy of financial statements. If the statements contain falsehoods or are misleading, officers can face exorbitant fines and even time in prison. In addition, if a lawsuit results from a company’s failure to accurately reflect its financial condition, shareholders may also have the right to file lawsuits alleging that they were misled.

Several financial regulatory bodies also have statements on fraud, including provisions that, if someone at the company is proven to have committed fraud, expose management to lawsuits. Furthermore, an insurer providing compliance liability cover for a corporate officer can rescind cover on any policies even if the proven fraudulent activity does not involve that director or officer, leaving the insured executive liable for the full force of all legal costs and all fines levied on him or her. Even if reporting mistakes are unintentional, the legal costs and financial ramifications of defending the business are tremendous.

Besides penalties in the form of fines, noncompliance with IFRS, for example, can lead to de-listing of the company, a reduced share price and therefore a reduction in overall corporate value. Given that many executives’ personal compensation packages are linked to share price, it is in their own interests to invest in data quality to remain compliant.

In vertical industries there are also consequences. For example, noncompliance can impact financial institutions in many ways including:

  • Loss of profitable business.
  • Liquidity problems through sudden withdrawal of funds.
  • Termination of correspondent banking facilities.
  • Investigation costs and fines/penalties.
  • Asset seizures.
  • Loan losses.
  • Use of senior management’s time in doing damage control.
  • Declines in the stock value of the financial institution concerned.

Simply put, noncompliance carries significant adverse consequences. It is not just about the penalties; it is about reputation, maintaining good relationships with key investors, and customer confidence. All of these are threatened by poor data quality. It is also clear that there is no point doing an excellent job of implementing fully compliant processes and producing all the required regulatory reports on time, in the right formats and to all the appropriate bodies if the data in them is flawed. Enterprise compliance mandates trusted, “rock solid” data, and control over enterprise data quality is fundamental to creating that trust.

My next article will cover the requirements for creating trusted data as well as the technical architecture and components needed to establish enterprise data quality.
