
How Micropole supported Scor in automation, performance and governance, in particular through the deployment of Databricks

A look back at SCOR, the world's fourth-largest reinsurer, present in over 160 countries with 3,500 employees. The Group offers its customers an innovative and diversified range of solutions and services for risk control and management.

Issues

As part of the redevelopment of the OMEGA reinsurance management system (OMEGA 2.0), the evolution of reporting became a strategic focus to support the technical and functional transformations.
The main objective was to adapt SAP BusinessObjects-based reporting to new technological and business requirements, while integrating specific user requests. The project challenges were as follows:

  • Automate the generation and distribution of reporting packs for underwriters.
  • Improve claims management with an optimized architecture and a complete overhaul of indicators and universes.
  • Standardize and migrate the reporting environment from Sybase to Netezza and then DB2-Azure for greater performance and scalability.
  • Prepare the transition to a modern Data Platform based on Databricks to improve governance and exploitation of analytical data.
Thanks to Micropole's expertise, we were able to modernize our architecture, automate our processes and improve our data governance. Today, we have an agile, high-performance and scalable platform, perfectly aligned with our strategic ambitions.
Patrick POUX
Head of IT & Data - SCOR

Method

Micropole supported SCOR with the implementation of a structured methodology based on the principles of agility, process industrialization and rigorous data governance, in order to deliver robust and scalable solutions, while guaranteeing smooth adoption by end-users.

Automated generation and distribution of reports

1. Scoping workshops with underwriters:

  • Identification of the specific needs of sales teams.
  • Definition of critical KPIs to include in summary reports.
  • Mapping of data flows and interactions between Business Objects and the SCOR Intranet.

2. Project development in iterative mode (Agile - Scrum)

3. Industrialization and automation:

  • Development of an automated script to generate and format SAP BusinessObjects documents as PDFs.
  • Implementation of a scheduled batch job to guarantee regular execution.
  • Integration with the SCOR messaging system to distribute documents automatically to underwriters.
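As an illustration, the distribution step above can be sketched in Python. The SMTP host, sender address, and file layout below are hypothetical placeholders, not SCOR's actual configuration; the sketch assumes the PDFs have already been exported by the generation script.

```python
import smtplib
from email.message import EmailMessage
from pathlib import Path

# Hypothetical settings -- real values would come from configuration.
SMTP_HOST = "smtp.example.com"
SENDER = "reporting@example.com"

def build_report_mail(pdf_path: Path, recipient: str) -> EmailMessage:
    """Wrap one generated reporting pack in an e-mail message."""
    msg = EmailMessage()
    msg["From"] = SENDER
    msg["To"] = recipient
    msg["Subject"] = f"Reporting pack: {pdf_path.stem}"
    msg.set_content("Please find attached the latest reporting pack.")
    msg.add_attachment(
        pdf_path.read_bytes(),
        maintype="application",
        subtype="pdf",
        filename=pdf_path.name,
    )
    return msg

def distribute(pdf_dir: Path, recipients: list[str]) -> int:
    """Send every PDF in pdf_dir to each underwriter; return the count sent."""
    sent = 0
    with smtplib.SMTP(SMTP_HOST) as smtp:
        for pdf in sorted(pdf_dir.glob("*.pdf")):
            for recipient in recipients:
                smtp.send_message(build_report_mail(pdf, recipient))
                sent += 1
    return sent
```

In practice the `distribute` call would be triggered by the scheduled batch, so the generation, formatting, and mailing steps run end to end without manual intervention.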


Datamart modeling and implementation

1. Data design and governance:

  • Structuring business needs through collaborative workshops.
  • Definition of an indicator governance model.
  • Creation of a shared business glossary.

2. Datamart modeling and implementation:

  • Implementation of 7 universes integrating 483 dimensions and key indicators.
  • Optimization of data models by applying best practices for efficient datamart structuring.

3. Continuous integration and business validation:

  • Development under Git to ensure rigorous tracking of changes.
  • User acceptance phase with real-life demonstrations.
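The datamart structure underlying such universes can be illustrated with a minimal star schema: a fact table of claims joined to dimension tables. The table and column names below are hypothetical, and SQLite stands in for the actual database purely for the sketch.

```python
import sqlite3

# Illustrative star schema for a claims datamart: one fact table keyed
# to two dimension tables. All names are hypothetical.
SCHEMA = """
CREATE TABLE dim_line_of_business (lob_id INTEGER PRIMARY KEY, lob_name TEXT);
CREATE TABLE dim_period (period_id INTEGER PRIMARY KEY, year INTEGER, quarter INTEGER);
CREATE TABLE fact_claims (
    claim_id    INTEGER PRIMARY KEY,
    lob_id      INTEGER REFERENCES dim_line_of_business(lob_id),
    period_id   INTEGER REFERENCES dim_period(period_id),
    paid_amount REAL
);
"""

def total_paid_by_lob(conn):
    """Aggregate indicator: total paid claims per line of business."""
    return conn.execute(
        """SELECT d.lob_name, SUM(f.paid_amount)
           FROM fact_claims f
           JOIN dim_line_of_business d USING (lob_id)
           GROUP BY d.lob_name
           ORDER BY d.lob_name"""
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
conn.executemany("INSERT INTO dim_line_of_business VALUES (?, ?)",
                 [(1, "Property"), (2, "Casualty")])
conn.executemany("INSERT INTO dim_period VALUES (?, ?, ?)", [(1, 2024, 1)])
conn.executemany("INSERT INTO fact_claims VALUES (?, ?, ?, ?)",
                 [(1, 1, 1, 100.0), (2, 1, 1, 50.0), (3, 2, 1, 25.0)])
```

Keeping indicators as simple aggregations over a well-normalized fact table is what makes the universes predictable to query and easy to govern.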


Migration to DB2-Azure and modernization of Reporting

1. Audit and analysis of existing systems: identification of critical dependencies between systems and benchmarking of performance to anticipate possible regressions.

2. Iterative, controlled migration:

  • Implementation of a hybrid data pipeline to ensure a gradual transition and redesign of stored procedures.
  • Automated load and performance testing to guarantee optimal response times for reporting.

3. Skills training and support for teams: training sessions on the new architecture and SQL best practices, plus full documentation of the new querying and performance-optimization procedures.
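A gradual, batched migration with a post-copy consistency check might look like the following sketch. SQLite stands in for the Sybase/Netezza/DB2 endpoints, and the table name is illustrative; a real migration would also compare checksums, not just row counts.

```python
import sqlite3

def migrate_table(src, dst, table, batch_size=1000):
    """Copy a table from src to dst in batches, then verify row counts."""
    # Read the column list to build the right number of placeholders.
    cols = [row[1] for row in src.execute(f"PRAGMA table_info({table})")]
    placeholders = ",".join("?" for _ in cols)
    cur = src.execute(f"SELECT * FROM {table}")
    while True:
        rows = cur.fetchmany(batch_size)
        if not rows:
            break
        dst.executemany(f"INSERT INTO {table} VALUES ({placeholders})", rows)
    dst.commit()
    # Post-migration check: the target must hold exactly the source rows.
    n_src = src.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    n_dst = dst.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    assert n_src == n_dst, f"{table}: {n_dst}/{n_src} rows migrated"
    return n_dst
```

Migrating in bounded batches keeps memory use flat and lets the pipeline resume or roll back table by table, which is what makes the transition "gradual and controlled" rather than a big-bang cutover.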


Setting up a Data Platform on Databricks
 

1. Analysis and POC:

2. Industrializing the platform:

  • Deployment of a Databricks Lakehouse architecture to centralize data.
  • Automation of ETL workflows with Databricks Delta Live Tables and orchestration via Azure Data Factory.

3. Training and ramp-up of teams:

  • Technical coaching sessions for Data Analysts and Data Scientists.
  • Integration with Tableau Software for an advanced user experience.
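The Lakehouse layering that Delta Live Tables automates can be illustrated with a toy bronze/silver/gold pipeline in plain Python. The record fields are hypothetical; the point is only the progressive refinement from raw landing to business-ready aggregates.

```python
# Toy illustration of bronze/silver/gold medallion layering; plain
# Python stands in for Delta Live Tables. Field names are hypothetical.

def bronze(raw_records):
    """Bronze: land the raw records unchanged for auditability."""
    return list(raw_records)

def silver(bronze_rows):
    """Silver: cleanse -- drop rows missing an amount, normalise types."""
    cleaned = []
    for row in bronze_rows:
        if row.get("amount") is None:
            continue
        cleaned.append({"lob": row["lob"].strip().title(),
                        "amount": float(row["amount"])})
    return cleaned

def gold(silver_rows):
    """Gold: business-level aggregate, total amount per line of business."""
    totals = {}
    for row in silver_rows:
        totals[row["lob"]] = totals.get(row["lob"], 0.0) + row["amount"]
    return totals
```

In the actual platform, each layer is a managed Delta table, the cleansing rules become declarative expectations, and Azure Data Factory triggers the refresh; the Tableau dashboards then read only from the gold layer.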

Benefits

  • Improved customer care and claims management
  • Boosting sales efficiency and customer loyalty
  • Automation and productivity gains
  • Successful technology adoption and integration


Outlook and next steps:

  • Generalize Databricks: extend its use to other business areas and integrate more real-time data flows.
  • Continue Databricks adoption by integrating advanced AI and machine learning capabilities.
  • Strengthen data governance with data lineage and advanced monitoring tools.
  • Evolve towards a self-service BI model.