
How to simplify and accelerate a data quality management project

"I don't have any data quality problems. "We manage our data via an ERP system, so it's good quality. ". Who hasn't heard that? But things have changed in recent years, with the arrival of cloud computing and, above all, big data. Today, data quality is a key issue, particularly in the banking and insurance sector, where Solvency II regulations require data quality management. Éric Gacia, Consulting Director for Banque Assurance at Micropole, looks at recent changes in this area.

According to a 2011 PricewaterhouseCoopers (PwC) survey, "90% of companies believe it is essential to have a data quality strategy, but only 15% actually address it." Why? Companies often lack internal sponsors to drive this critical effort. The goal, however, is crucial: to know the data, and to define an internal data dictionary and a common vocabulary so that the whole company shares a consistent vision of its data.

A regulatory requirement

Solvency II emphasizes data quality and data governance through very concrete regulatory requirements. It has therefore become a genuine business issue in the banking and insurance sector, whereas two or three years ago the profitability and relevance of such projects had not been demonstrated. "At a large banking and insurance group, one of my clients, everyone is now on board with this initiative because no one has indicators on data quality, which makes the data complicated to use and generates discrepancies, sometimes significant ones, between the various reports produced. Another, smaller organization told me it was running its marketing campaigns 'blind' because it didn't trust the data it was using. Management has become very sensitive to this topic. There is real maturity, especially since ROI is achieved quickly. The key is to gain confidence in the data you use: to know it in order to exploit it better, to save time, and to focus on higher-value tasks tied to your core business."

What governance for what data?

Governance encompasses all the rules and procedures to be followed. It is necessary to set up a data quality steering committee involving the many stakeholders concerned: general management, the IT department, the risk management department, management control, and so on. The next step is to set up an operational working group that will analyze quality indicators, propose and follow up on action plans to improve quality, and proactively analyze the impact of data-related changes (new products, new management applications, etc.). Data quality management has led to the creation of a new position in companies: the Data Manager. A true conductor, the Data Manager ensures the follow-up of existing or future indicators, the integrity of the repositories, and the steering of improvement actions to measure their effectiveness.
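To make the idea of quality indicators concrete, here is a minimal sketch of the kind of metrics such a working group might track, such as completeness and duplicate rates. The field names and sample records are illustrative assumptions, not taken from any client described above.

```python
# Hypothetical sketch: two simple data quality indicators a Data Manager
# might monitor. Field names and sample records are illustrative.

def completeness_rate(records, field):
    """Share of records where `field` is present and non-empty."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def duplicate_rate(records, key_field):
    """Share of records whose key value appears more than once."""
    if not records:
        return 0.0
    counts = {}
    for r in records:
        key = r.get(key_field)
        counts[key] = counts.get(key, 0) + 1
    dupes = sum(c for c in counts.values() if c > 1)
    return dupes / len(records)

customers = [
    {"id": "C1", "email": "a@example.com"},
    {"id": "C2", "email": ""},               # missing email
    {"id": "C1", "email": "b@example.com"},  # duplicate id
]

print(f"email completeness: {completeness_rate(customers, 'email'):.0%}")
print(f"id duplicate rate:  {duplicate_rate(customers, 'id'):.0%}")
```

Tracked over time, such indicators give the steering committee the objective view of data quality that, as noted above, many organizations lack.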

While all of the company's data is concerned, external data is harder to control. Integrating it requires a data quality firewall that acts as a quality gate (checking compliance with quality standards, managing rejections, triggering manual correction actions, etc.) before the data enters internal operational applications. A data stewardship portal can also be made available to data-provider partners to make them accountable and give them the tools they need to guarantee the quality of their data.
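The firewall described above can be sketched as a simple rule-based filter: incoming partner records are checked against quality rules, and any record that fails is set aside with the failing rules recorded, ready for manual correction. The rules and field names below are illustrative assumptions.

```python
# Hypothetical sketch of a "data quality firewall": each incoming record
# is validated against a set of rules before integration; rejects carry
# the list of failed rules for manual correction. Rules are illustrative.
import re

RULES = {
    "policy_id": lambda v: bool(v),  # mandatory, non-empty
    "email": lambda v: bool(re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", v or "")),
    "premium": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def firewall(records):
    """Split records into (accepted, rejected-with-reasons)."""
    accepted, rejected = [], []
    for rec in records:
        failures = [f for f, check in RULES.items() if not check(rec.get(f))]
        if failures:
            rejected.append({"record": rec, "failed": failures})
        else:
            accepted.append(rec)
    return accepted, rejected

incoming = [
    {"policy_id": "P-001", "email": "client@example.com", "premium": 120.0},
    {"policy_id": "", "email": "not-an-email", "premium": -5},
]
ok, ko = firewall(incoming)
print(len(ok), "accepted;", len(ko), "rejected:", ko[0]["failed"])
```

Exposing the rejection reasons to the partner, for instance through the stewardship portal mentioned above, is what makes providers accountable for the quality of the data they supply.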


The need to involve business managers

To carry out a data quality governance project, various internal sponsors are essential. All departments are involved: general management, finance, marketing, risk management, internal control, accounting, etc. Very often, each department still manages its own data, neither shared nor centralized, which creates significant discrepancies in the figures. A cross-functional understanding of the approach makes it possible to build a reliable data repository shared by all.
