Archive for the ‘LinkedIn’ Tag

Dots On A Map Provide Unique Insights Into Data Quality

This was a presentation I originally prepared back in 2005, but it is probably even more applicable in 2009 given the impact a GIS tool can have on visualizing data quality – customer addresses on a map! The next time you conduct a customer "data" assessment, try this! You can also see a high-level data profile I prepared for this trade area of specific customers.


What Different Routines Do You Consider Important When “Data Profiling” In Order To Reveal The Quality Of Information In A Data Source?

There are several different types of data quality tools in the marketplace today, and they essentially exist to do a few important things – cleanse, validate, correct, and enhance your data.

In order to better understand what the "quality expectation" is for YOUR CLIENT, a baseline (or scorecard) must be established for each source system. Data profiling is an ideal way to reveal the results and share them with others in order to make an informed decision and rank your findings.
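As a hedged illustration of what such a baseline might look like, here is a minimal profiling sketch in Python – the column names, sample rows, and chosen metrics are my own assumptions, not the output of any particular tool:

```python
from collections import Counter

def profile_column(values):
    """Basic quality metrics for one column of string values."""
    total = len(values)
    non_empty = [v for v in values if v.strip()]
    counts = Counter(non_empty)
    return {
        "completeness": len(non_empty) / total if total else 0.0,
        "distinct": len(counts),               # cardinality
        "most_common": counts.most_common(3),  # top values, for eyeballing
    }

def profile_source(rows):
    """Build a scorecard: one metrics dict per column of a source extract."""
    return {col: profile_column([r[col] for r in rows]) for col in rows[0]}

# Hypothetical extract from one source system
rows = [
    {"name": "Acme Corp", "zip": "60601"},
    {"name": "Acme Corp", "zip": ""},
    {"name": "Widget Co", "zip": "10001"},
]
scorecard = profile_source(rows)
print(scorecard["zip"]["completeness"])  # 2 of 3 rows populated
```

Running a sketch like this per source system gives you a comparable set of numbers to rank findings against.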

Address Quality Extends Beyond CASS and NCOA

Several consultants have asked me over the last 6-8 months why their respective firms don't just build their own address management solution (code from scratch!) and purchase/license the CASS, NCOA, etc. content directly from the USPS.

My answer, in short, is "let me tell you some reasons why not".

1.) First and most important, many of the well-known original suppliers of postal coding solutions in place today had to go through (and still go through) rigorous certification processes to ensure their software and subsequent updates (versions) continue to comply with USPS guidelines. (Note: Many of these well-known suppliers have been around since the early 1980s, when I began my career at Metromail. Now, that's old. LOL)

2.) The volume of "bugs" reported to these same vendors over the years by their respective client bases (2,000+ clients in some cases) leaves them best positioned to minimize risk for each "new license" of their products and services sold.

3.) The "people" behind the design, creation, and ongoing development of these (existing) postal products and services have 15+ years (minimum) of experience in the industry, versus a newly formed team that may have little or no knowledge of this process.

4.) It is hard to envision a new "postal coding" engine overcoming the barrier to entry in 2009 (with the exception of a new add-on service you may want to bolt onto an existing postal coding engine). But that's my opinion.

In summary, my advice is to stick to creating some kind of exception process, or a client-specific data governance process (or standard), using an existing vendor solution that already has an established relationship with the USPS.

Here is a good example of one software supplier who exemplifies several of my points above:

GrayHair Software, Inc. goes beyond CASS/NCOA as major sources to power their address quality (best practice) offering, which includes other alliances like the UAA Clearinghouse.

Let me explain further:

Here is a brief excerpt from an article published last year by GrayHair Software, Inc. Note: One of the executives at GrayHair is a past work associate of mine – Raymond Chin, Vice President of Product Management & Development. (See point #3 above.)

Hold that thought, and read about how providers today are enhancing traditional "postal" offerings to expand beyond traditional USPS CASS and NCOA content!

Publication: Business Wire
Date: Wednesday, April 9, 2008

GrayHair Software, Inc. and UAA Clearinghouse today introduced the most comprehensive set of offerings for managing Address Quality and reducing Undeliverable-As-Addressed (UAA) mail. By using source data from the USPS® next-generation Intelligent Mail® Barcode with change-of-address data from publishing and telecommunications organizations, best-of-breed solutions are now available for suppressing and/or redirecting addresses.

This will improve responses and reduce the cost of business mailings, thus enhancing the return-on-investment of direct mail programs, and making a significant contribution to the bottom line.

The article goes on… you can read the rest by going to:

(End of article, excerpt.)

In summary, consultants, "do your research"… find existing companies like GrayHair Software to support your basic client needs (with confidence), plus any other unique requirements.

Additional note:

Address Quality (Best Practices) today provides more "value" than just saving postage ($$$) and improving the deliverability of a piece of mail… like in days past.

The benefit of good address quality (best practices) is also a big plus for customer data integration (CDI) initiatives… resulting in improved customer match/merge/link/search scenarios, especially in customer (MDM) hubs, where clients today are centralizing disparate customer data sources across the enterprise into a single view of a customer.
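To make the matching point concrete, here is a minimal sketch of standardizing addresses before comparing them – the abbreviation table and cleanup rules are illustrative assumptions only; a real CASS-certified engine does far more:

```python
import re

# Tiny illustrative abbreviation table (real engines use full USPS suffix tables)
ABBREV = {"street": "st", "avenue": "ave", "road": "rd", "suite": "ste"}

def standardize(addr):
    """Lowercase, strip punctuation, and normalize common suffixes."""
    addr = re.sub(r"[.,#]", " ", addr.lower())
    return " ".join(ABBREV.get(w, w) for w in addr.split())

a = standardize("123 Main Street, Suite 4")
b = standardize("123 MAIN ST STE 4")
print(a == b)  # standardized forms now match exactly
```

Without the standardization step, these two records would fail an exact match and create a duplicate customer in the hub; with it, they link.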

To my fellow consultants… and postal software vendors… you are welcome to add your own comments or share with us your unique “product” differentiators.

Enterprise Data Quality Blog

Here is the link to the most recent article on my Enterprise Data Quality blog:

Premier-International’s EPACTL Tool (Applaud)

Premier-International is based in Chicago and has software and consulting services:

What is Applaud?

Applaud is the only "EPACTL" tool – a single software product with integrated tools to extract, profile, analyze, cleanse, transform, and load data.

EPACTL is a new breed of software that provides integrated tools to accomplish all requirements of data quality and data migration/consolidation projects.
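As a hedged sketch of what those steps look like in sequence, here is a toy pipeline – the data, rules, and function names are my own assumptions, not Applaud's actual API:

```python
# Toy pipeline mirroring extract -> profile -> analyze/cleanse -> transform -> load.

def extract():
    """Pull raw rows from a (hypothetical) legacy source."""
    return [{"id": "1", "email": "A@EXAMPLE.COM "}, {"id": "2", "email": ""}]

def profile(rows):
    """Measure quality before touching the data."""
    filled = sum(1 for r in rows if r["email"].strip())
    return {"email_completeness": filled / len(rows)}

def cleanse(rows):
    """Drop rows that fail a simple validity rule."""
    return [r for r in rows if r["email"].strip()]

def transform(rows):
    """Normalize types and formats for the target system."""
    return [{"id": int(r["id"]), "email": r["email"].strip().lower()} for r in rows]

def load(rows, target):
    """Append to the (in-memory) target."""
    target.extend(rows)

target = []
stats = profile(extract())
load(transform(cleanse(extract())), target)
print(stats, target)
```

The point of an integrated tool is that these stages share one metadata model instead of being stitched together from separate products.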

After reviewing the website, here are some of the key service offerings I would like to share, taken directly from their website to avoid misrepresentation:

1.) Data Migration and Data Conversion – Migrating data from legacy systems to a new replacement system.

2.) Data Consolidation – Consolidating data from multiple instances of the same system or multiple disparate systems.

3.) Data Cleansing – Cleansing data and supporting data quality initiatives.

4.) Data Quality Audits – Performing data quality audits.

5.) Data Integration – Constructing interfaces between on-going systems.

6.) Data Management for IT – Building customized data management solutions.

7.) Data Management for Employee Benefits – Delivering customized data management solutions for employee benefit consultants and actuaries.

8.) Rapid Application Development – Using Applaud’s RAD tools to deliver dynamic system solutions fast.

If you want to learn more about Applaud and Premier International, visit…

If there are any readers out there who have knowledge of Premier-International or Applaud, please feel free to comment.

From TDAN: 11 Predictions About Data Quality Space

Diby Malakar has written an interesting article on possible upcoming trends regarding Data Quality given the current economic climate:

Read this article and more at TDAN – The Data Administration Newsletter.

Gartner Says… Companies Want to Get The Data Right

Here is a good (< 10 minute) video on MDM from Gartner (November 2008) by Ted Friedman, Vice President covering Data Integration, Data Quality, and Data Warehousing.

My high-level notes:

1.) Ties to critical business initiatives are a must.

2.) Gartner is seeing “pre-packaged” product offerings on the rise.

3.) Appliance offerings also… “Datawarehouse in a Box”.

4.) Challenges of Data Integration and Data Quality – automating data transformation and data cleansing routines.

5.) Companies getting even more serious about Data Quality given the regulatory issues.

6.) Data Quality and its impact on lost productivity, inaccurate data, etc.

7.) Companies want to get the data right.

8.) Business issues > IT issues, according to Gartner.

9.) A key question to ask your client: what does Data Quality mean to you?

10.) There are several dimensions – identify key metrics, and they must be fact-based.

11.) Data Quality tools continue to emerge in the industry.

12.) Information Management issues are top of mind.

If your company has a Master Data Management (MDM) offering you would like to share – click here – and it will take you to another blog, where you can request to have your company name added to the "links" section of the blog. Include a brief description, as well.

Here is Ted’s video:

Data Governance Offerings

Those of you with one or more data hygiene processes, procedures, and standards are probably good candidates for creating a data governance program within your organization – or, in the case of a vendor, a niche offering that captures your best practices and lessons learned since its inception.

To find out more about data governance and companies who have “niche” offerings in this area:

Please visit this blog:

Interact Direct Marketing Offers FREE Data Quality Analysis!

For a limited time, Interact Direct Marketing is offering a FREE Data Quality analysis – just tell them you read about it here. They are experts in cleaning and enhancing your customer files, databases, and mailing lists. They support many of the world's largest companies, as well as hundreds of smaller organizations. Their services are provided in a controlled and secure environment where protection of your data is their number one priority.

The main data center is in Canada but they operate throughout North America.

This offer stands for a limited time, or until someone notifies us to discontinue it.

Furthermore, if anyone has used this secure service and would like to comment on their past experience with Interact Direct Marketing, please let our readers know more.

Thanks to Dave Anderson for making this offer possible.

Seamless Data Quality for SAP

Using the IBM Information Server Data Quality module:

1.) Duplicate Check

2.) Error Tolerant Search

* IBM OmniFind and IBM Information Server – QualityStage
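For a rough idea of what a duplicate check with error-tolerant matching involves, here is a stdlib-only Python sketch – the similarity threshold and sample names are assumptions, and products like QualityStage use much richer phonetic and probabilistic matching:

```python
from difflib import SequenceMatcher

def similar(a, b, threshold=0.85):
    """Error-tolerant comparison: true if the strings are 'close enough'."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def duplicate_check(records):
    """Return pairs of records that look like duplicates of each other."""
    return [(x, y) for i, x in enumerate(records)
            for y in records[i + 1:] if similar(x, y)]

customers = ["Mueller GmbH", "Muller GmbH", "Acme Inc"]
print(duplicate_check(customers))  # flags the Mueller/Muller near-match
```

An exact-match duplicate check would miss the "Mueller"/"Muller" pair entirely; the error-tolerant comparison is what catches real-world typos and spelling variants.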

Listen to Details With Illustrative Examples: