
Your Data Quality Wish List Part Two

by Len Dubois, Sr. Vice President, Marketing & Sales Support, Trillium Software

For the second installment of our data quality “Wish List” series, let's tackle another common concern from data professionals: “I wish I knew how poor data quality was manifesting itself in my business.”

You and I both know that without data quality and data governance processes in place, business processes can easily become mired in inefficiency. From a marketing perspective, the inability to accurately identify a campaign inquiry as originating from an existing customer or from a prospect can set off a chain of communications that signals your organization has no clear idea who its customers are. Not a week goes by that I don't receive some type of marketing outreach from an organization where I am already a customer: a provider asking me to sign up for the internet/cable/TV package I already have, a bank inviting me to apply for a credit card I already possess, or my own mortgage provider asking whether I would like to change my rate to one higher than my current rate.

Additionally, call center operators have been known to enter abbreviated or erroneous data to save time, and even third-party data from trusted partners enters your organization already rife with inconsistencies, inaccuracies and errors. Data from these diverse source systems must conform to the operational standards and formats approved by the business partners who use it to make decisions about how to grow the business. When evaluating the systems that support marketing, call centers, fraud detection, compliance and even accounting applications, “bad data” is usually seen as the culprit that impacts the business.
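To make the idea of conforming data to approved standards concrete, here is a minimal sketch of the kind of rule-based standardization involved. The abbreviation table and function name are hypothetical examples, not part of any Trillium Software product; real standardization engines handle far more cases.

```python
# Illustrative sketch only: normalizing abbreviated call-center entries
# to an approved standard before they reach downstream systems.
# The rules and names below are hypothetical examples.
STREET_ABBREVIATIONS = {"st": "Street", "ave": "Avenue", "rd": "Road", "blvd": "Boulevard"}

def standardize_street(raw: str) -> str:
    """Expand common street abbreviations and normalize capitalization."""
    out = []
    for word in raw.strip().split():
        key = word.lower().rstrip(".")  # treat "St" and "st." the same
        out.append(STREET_ABBREVIATIONS.get(key, word.title()))
    return " ".join(out)

print(standardize_street("123 main st."))  # -> 123 Main Street
```

Rules like these are simple individually, but applying them consistently across every source system is what keeps operational data in the formats the business has agreed on.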

To zero in on which areas are creating the biggest problems and having the most impact on your business, you must understand the relationship between data and processes, and the relationship between those processes and financial results. This involves working with both line-of-business managers and the accounting or finance department. Both of these groups have detailed knowledge about how the business is measured and where money is spent. Each can help identify existing metrics that you can leverage to demonstrate the potential impact of better data quality, for example: average campaign response lift, quarterly cost of third-party data appends, average call time at the call center, or total bad debt per quarter. After establishing those associations, you can work with the users of specific systems to detail the business functions they perform and determine the relationship between specific business functions and the data they require. That information will enable you to quantify the financial impact of the data quality challenges you uncover. It should surprise no one that some challenges will be associated with automated systems and some with the personnel using the applications.
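The quantification step above boils down to simple arithmetic once the metrics are in hand: multiply record volumes by defect rates by a per-defect cost supplied by finance. The sketch below shows the shape of that calculation; every issue name, volume, rate, and cost figure is a hypothetical placeholder, not data from any real assessment.

```python
# Illustrative sketch: tying data quality defect rates to the business
# metrics identified with finance and line-of-business managers.
# All figures below are hypothetical examples.

def annual_impact(records_per_year, defect_rate, cost_per_defect):
    """Estimated yearly cost of one data quality issue."""
    return records_per_year * defect_rate * cost_per_defect

# Hypothetical issues surfaced by profiling, with per-defect costs from finance:
issues = {
    "duplicate customer records": annual_impact(1_000_000, 0.04, 2.50),  # wasted mailings
    "invalid phone numbers":      annual_impact(250_000, 0.07, 1.10),    # extra call time
    "stale third-party appends":  annual_impact(500_000, 0.02, 3.00),    # re-purchase cost
}

total = sum(issues.values())
for issue, cost in sorted(issues.items(), key=lambda kv: -kv[1]):
    print(f"{issue:30s} ${cost:>10,.2f}")
print(f"{'TOTAL':30s} ${total:>10,.2f}")
```

Even a rough table like this, built from metrics the business already tracks, is usually enough to rank issues and start the stakeholder conversation.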

Once you’ve established just how much bad data is impacting your organization, you also need a plan for how you’re going to not only address it, but ensure these inefficiencies don’t find their way back into business processes. Data governance has a critical role to play here, as it is a function that spans both technology and business resources. However, data governance should be more than a committee that hands IT organizations a list of requirements and complaints about data quality, or about the systems and applications the business uses to collect and utilize data. To be successful, it needs to be a cross-functional team committed to aligning data requirements with the objectives and goals of the organization as well as the key business processes involved. The approach should be iterative, focusing on improving the quality of data, realizing business benefits, engendering process change, and moving toward operational data governance. Ask yourself, “Why are we doing this?”, “How do we know what to improve?” and “How will success be proven?”

While organizations are increasingly aware of the threat of bad data and the potential it has to wreak havoc on their business, few truly know just how much of a problem it is for their specific organization. By understanding the relationship between your data and the business processes it supports, and, further, the relationship between those processes and business results, your company can put a price tag on “bad data” and motivate stakeholders to start fixing the problem.

For more information and examples of what I’m talking about, check out this article by my colleague Ed Wrazen about Building a Powerful Case for Data Quality with Metrics.

Len Dubois
Sr. Vice President, Marketing & Sales Support, Trillium Software

Len has been with Harte-Hanks for thirteen years and has more than fifteen years' experience selling and marketing high-technology solutions. He is responsible for the strategic development and execution of worldwide marketing initiatives for Trillium Software. He created the Trillium Software System® brand, which has been recognized as one of the top enterprise solutions in the data quality industry. Prior to joining Harte-Hanks Trillium Software, Mr. Dubois was a Marketing Manager for Epsilon Data Management, Inc. He has spoken at data quality conferences in both the U.S. and the UK. In addition, he has authored many articles on data quality and CRM.

Blog Posts | LinkedIn | Twitter
