As if ensuring accurate, clean data and managing it wasn’t
already a big enough challenge, new, emerging sources such as mobile apps and social media are making it even more difficult. With
new collection points, insurers are struggling
to capture accurate contact data. Among respondents to the Experian QAS survey described below, the median share of inaccurate data in an organization’s existing database was 35 percent.
Church Pension Group (CPG) recently initiated a master data management (MDM) program as a result of an abundance of data.
“Because we have different lines of businesses, we have different transactional systems those business units use,” said Danette
Patterson, manager, Enterprise Data, at the insurer, which provides retirement, health and life insurance benefits to Episcopal clergy and lay employees.
“Our first step was getting a centralized database in place, using
Oracle’s Customer Data Hub (CDH) to manage that data and exchange it across systems for elements that we have in common.”
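For illustration only (this is not CPG's actual Oracle CDH configuration, and the field names and survivorship rule below are hypothetical), a hub of this kind can be pictured as a merge of the customer elements the business units share, with a simple rule deciding which system's value survives:

```python
# Hypothetical sketch: build one "hub" record from per-system customer records.
# For each shared field, the most recently updated non-empty value wins.
def merge_to_hub(records, shared_fields=("name", "address", "email", "phone")):
    hub = {}
    for field in shared_fields:
        candidates = [r for r in records if r.get(field)]
        if candidates:
            # ISO dates sort correctly as strings
            newest = max(candidates, key=lambda r: r["last_updated"])
            hub[field] = newest[field]
    return hub

pension = {"source": "pension", "last_updated": "2011-06-01",
           "name": "Jane Doe", "address": "12 Elm St", "email": ""}
health = {"source": "health", "last_updated": "2011-09-15",
          "name": "Jane Doe", "address": "12 Elm Street",
          "email": "jdoe@example.org"}

print(merge_to_hub([pension, health]))
# {'name': 'Jane Doe', 'address': '12 Elm Street', 'email': 'jdoe@example.org'}
```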
Like Church Pension Group, other insurers have taken notice. Experian’s July 2011 report reveals that 90 percent of the
respondents plan to invest in initiatives related to data quality
“in the next 12 months.” So, by our calculations, right now.
According to the report, the reason for the inaccurate data
appears to stem from the way insurers clean it—manually. Only
27 percent of respondents currently use in-house software tools
to cleanse contact data. The most popular tools are point-of-capture address verification, back-office software tools for existing
data and e-mail verification.
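To make those categories concrete, here is a minimal sketch of the kind of checks a back-office cleansing tool performs on contact data: a syntactic e-mail test and simple address standardization. It is not any vendor's product; real services also verify deliverability against postal and mailbox reference data.

```python
import re

# Purely syntactic e-mail check; real verification also tests deliverability.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

# A tiny, hypothetical abbreviation table for address standardization.
STREET_ABBREVIATIONS = {"street": "St", "avenue": "Ave", "road": "Rd"}

def looks_like_email(value):
    """Return True if the value at least resembles an e-mail address."""
    return bool(EMAIL_RE.match(value.strip()))

def standardize_address(line):
    """Normalize case and common street-type words."""
    words = line.strip().title().split()
    return " ".join(STREET_ABBREVIATIONS.get(w.lower(), w) for w in words)

print(looks_like_email("jdoe@example.org"))   # True
print(looks_like_email("jdoe@@example"))      # False
print(standardize_address("12 elm STREET"))   # 12 Elm St
```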
Another technology that helps insurers refine data for use
is de-duplication, which employs sophisticated algorithms to
weed out redundancies in data and compress it to a fraction of
its former size. Major storage vendors IBM, EMC and NetApp offer de-duplication within their product lines, but take different
approaches. There also are third-party vendors that will de-duplicate data for carriers as a service.
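Storage-level de-duplication of the kind IBM, EMC and NetApp sell operates on blocks of stored data; for contact records the analogous idea is record-level de-duplication. The sketch below only illustrates that idea, collapsing rows that normalize to the same key; it is not a description of any vendor's algorithm.

```python
# Record-level de-duplication sketch: keep one row per normalized key.
def normalize(record):
    """Matching key from name and e-mail, ignoring case and spacing."""
    name = "".join(record["name"].lower().split())
    email = record["email"].lower().strip()
    return (name, email)

def dedupe(records):
    """Keep the first record seen for each normalized key."""
    seen = {}
    for record in records:
        key = normalize(record)
        if key not in seen:
            seen[key] = record
    return list(seen.values())

rows = [
    {"name": "Jane Doe",  "email": "JDOE@example.org"},
    {"name": "jane  doe", "email": "jdoe@example.org"},
    {"name": "John Roe",  "email": "jroe@example.org"},
]
print(len(dedupe(rows)))  # 2 -- the two Jane Doe rows collapse into one
```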
“Technologies must be selected according to the business
case,” says Kurt Hausermann, senior manager, enterprise risk
management at Deloitte AG and author of the Deloitte report.
“A profiling tool, which shows the content of your data, is a good
starting point. Cleansing tools help you to standardize data and
remove duplicates (e.g. in master data or marketing data).”
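In the spirit of the profiling step Hausermann describes, the following sketch shows the sort of output a very small profiling pass might produce: per-field fill rates and distinct-value counts over a list of records. It is a simplified assumption of what such a tool does, not a specific product.

```python
# Minimal data-profiling pass: fill rate and distinct values per field.
def profile(records):
    fields = sorted({f for r in records for f in r})
    report = {}
    for field in fields:
        values = [r.get(field) for r in records]
        filled = [v for v in values if v not in (None, "")]
        report[field] = {
            "fill_rate": len(filled) / len(records),
            "distinct": len(set(filled)),
        }
    return report

sample = [
    {"name": "Jane Doe", "state": "NY", "email": "jdoe@example.org"},
    {"name": "John Roe", "state": "ny", "email": ""},
]
for field, stats in profile(sample).items():
    print(field, stats)
# email {'fill_rate': 0.5, 'distinct': 1}   half the e-mails are missing
# name  {'fill_rate': 1.0, 'distinct': 2}
# state {'fill_rate': 1.0, 'distinct': 2}   "NY" vs. "ny" needs standardizing
```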
However, technologies such as de-duplication, data profiling and data quality assessment tools are just one of three dimensions in a successful strategy, according to Hausermann. Two
other dimensions—data risk management and data governance
—have to exist.
Also, before looking to tools, insurers need to consider two different approaches to data quality, traditional and emerging, according to Mark Gorman, CEO and founder, The Gorman Group Insurance Consultancy.
In his report, “From the Backroom to the Boardroom: The
Evolution of Data Quality in the Insurance Market,” Gorman indicates that in the traditional approach the focus is on fixing data
quality issues. Issues are identified after reports have
been generated and before the reports are shared
with a broader audience. In the emerging approach, data is recognized as driving information
for applications, such as predictive analytic
models, management dashboards, revenue
and expense forecasting, account profiling
and segmentation, etc.
The appropriate approach varies
among organizations, but Gorman says
picking one is the first step to a successful
data quality environment. “An organization
can do both [traditional and emerging], especially if siloed,” Gorman says. “However, the emerging method, once implemented, requires senior management support and will take precedence.”
In developing its MDM program, CPG started with tools but
discovered that every unit has different processes, policies and
data validations and verifications that may not match those of
other business units. “We found different things that we needed to
establish to ensure that the data was cleansed and maintained
appropriately,” Patterson says. “From that we established a cen-
tralized team to manage that data and build processes and rules
identified through data mining, monitoring and reporting. You
need to form various committees and an organizational structure
that has a certain hierarchy in order to ensure that all of the
units are aligned, especially when you have so many lines of
businesses.” —Carrie Burns
With the onslaught of Big Data and more sophisticated business
intelligence and analytics, data quality will be more important
than ever to insurers in 2012. Experts say they can’t capitalize on either
without clean, accessible data. Poor contact data affects a variety of operations, from underwriting to policy service. But from
a cost perspective, inaccurate contact data can have a significant
impact on insurers. In a September 2011 report “Why is Data
Quality Important for Insurers?” Deloitte lists four reasons:
1. Poor data quality is expensive: it costs an estimated 10 percent of revenue.
2. Management needs valid, complete, consistent information.
3. Controlled data quality is mandatory for compliance.
4. Good data quality helps increase customer value.
Data is coming at insurers from an increasing variety of nontraditional sources. In fact, results from a survey of 100 insurance
industry respondents connected with data management across IT,
marketing, operations and finance reveal mobile and social media
as growing data capture points. Eighty-four percent of the respondents to the survey—“The Dilemma of Multichannel Contact Data
Accuracy” from Experian QAS, a division of Experian Marketing
Services—currently capture customer contact data through mobile applications. Ninety-six percent see the use of mobile platforms growing, and 95 percent communicate via social media.
Top data errors:
• Spelling mistakes
• Outdated data
• Incomplete data
• Duplicate data
Source: Experian QAS report “The Dilemma of Multichannel Contact Data Accuracy”
Data lineage and traceability:
• Establishing the lineage of the data: identifying source systems and recording the transformations that were applied and the reports using the data
• Enabling the traceability of the data: identifying downstream applications and reports that make use of the data
Source: The Gorman Group report “From the
Backroom to the Boardroom: The Evolution of
Data Quality in the Insurance Market”
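The lineage and traceability items listed above can be captured as plain metadata attached to a dataset. The structure below is a hypothetical sketch, not something prescribed in the Gorman report.

```python
# Hypothetical lineage/traceability record for one dataset.
lineage = {
    "dataset": "policyholder_contacts",
    "sources": ["pension_admin", "health_claims"],         # upstream systems
    "transformations": ["standardize_address", "dedupe"],  # steps applied
    "used_by": ["lapse_model", "management_dashboard"],    # downstream uses
}

def upstream_of(meta):
    """Lineage: which source systems feed this dataset."""
    return meta["sources"]

def downstream_of(meta):
    """Traceability: which applications and reports consume this dataset."""
    return meta["used_by"]

print(upstream_of(lineage))    # ['pension_admin', 'health_claims']
print(downstream_of(lineage))  # ['lapse_model', 'management_dashboard']
```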