Companies across every industry face the same challenge: how to migrate massive amounts of enterprise data and then access it quickly, all while keeping system performance up to par throughout the process.
Traditional Relational Database Management Systems (RDBMS) bring many complexities: they can inhibit performance, falter under heavy volume, and slow down deployment. MongoDB’s document-based, distributed database addresses the performance and volume issues. But when it comes to speeding up time to market, the right auxiliary tools are still needed.
Capgemini, a MongoDB partner and global leader in digital transformation, provides the final piece of the puzzle with a new tool rooted in automated intelligence. In this blog, we’ll explore three key Capgemini solutions that help customers modernize to MongoDB:
- Tools that expedite time to market
- Migration from legacy systems to MongoDB
- New development using MongoDB as the backend database
Whether your company is developing a new database or migrating from legacy systems to MongoDB, Capgemini’s new Database Convert & Compare (DCC) tool can help. Below, we’ll detail how DCC works, then walk through a few recent client examples and the benefits they reaped.
Tool: Database Convert & Compare (DCC)
A powerful tool developed by the Capgemini team, DCC streamlines activities like database migration, data comparison, validation, and more. The tool performs data transformations customized to the source and target databases in scope. When migrating from an RDBMS to MongoDB, DCC achieves roughly 70% automation, leaving about 30% for manual retrofit at the database level.
How does DCC work?
In the context of RDBMS-to-NoSQL migration, DCC performs the migration in three stages.
1) Assessment
- Source database schema assessment – DCC extracts the source schema and generates a detailed inventory of data objects such as tables, views, stored procedures, and indexes. It also reports the data volume of each table, which helps estimate the data migration time from source to target (see the sketch after this list)
- Apply analytics to prepare recommendations for the target database structure. The target structure varies based on parameters such as:
  - Table relationships (one-to-many, many-to-many, one-to-one)
  - Indexes applied to tables for performance requirements
  - Column data types
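DCC itself is proprietary, but the assessment stage is easy to picture. Below is a minimal sketch, assuming a SQLAlchemy-compatible source database and a placeholder connection string, of the kind of inventory pass described above: enumerate tables, columns, indexes, and foreign keys, and record row counts to feed migration-time estimates.

```python
# Minimal schema-assessment sketch (not DCC itself): inventory tables,
# columns, indexes, and row counts from a source RDBMS via SQLAlchemy.
from sqlalchemy import create_engine, inspect, text

engine = create_engine("postgresql://user:pass@source-host/appdb")  # placeholder DSN
inspector = inspect(engine)

inventory = []
with engine.connect() as conn:
    for table in inspector.get_table_names():
        columns = inspector.get_columns(table)    # name, type, nullable, ...
        indexes = inspector.get_indexes(table)    # feeds index recommendations
        fks = inspector.get_foreign_keys(table)   # reveals 1:N / N:M relationships
        rows = conn.execute(text(f'SELECT COUNT(*) FROM "{table}"')).scalar()
        inventory.append({
            "table": table,
            "columns": [(c["name"], str(c["type"])) for c in columns],
            "indexes": [i["name"] for i in indexes],
            "foreign_keys": [fk["referred_table"] for fk in fks],
            "row_count": rows,                    # drives migration-time estimates
        })

# Largest tables first: these dominate the data migration window.
for item in sorted(inventory, key=lambda i: i["row_count"], reverse=True):
    print(f"{item['table']}: {item['row_count']} rows, "
          f"{len(item['columns'])} columns, {len(item['indexes'])} indexes")
```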
2) Schema Migration
- Customize the tool to apply the recommendations from step 1.2, generating the script for the target database
- Target schema script preparation – DCC generates a complete database schema script, except for a few object types such as stored procedures and views (a sketch of this step follows this list)
- Produce a detailed report of the schema migration, including objects that couldn’t be migrated
- Manual intervention is required to port the source database’s business logic, stored procedures, and views to the target environment and application
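Again, DCC’s actual generator is proprietary; the sketch below only illustrates what applying such recommendations to a MongoDB target can look like with PyMongo. The customers table, its fields, and the embedded accounts array are all hypothetical, standing in for a one-to-many relationship folded into a single document.

```python
# Illustrative schema-migration sketch (not DCC's generator): create a target
# collection per source table, with a JSON Schema validator and matching indexes.
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")   # placeholder target cluster
db = client["appdb"]

# A hand-written mapping standing in for the recommendations from step 1.2.
target_schema = {
    "customers": {
        "validator": {"$jsonSchema": {
            "bsonType": "object",
            "required": ["customer_id", "name"],
            "properties": {
                "customer_id": {"bsonType": "int"},
                "name": {"bsonType": "string"},
                # 1:N relationship folded into an embedded array of subdocuments
                "accounts": {"bsonType": "array"},
            },
        }},
        "indexes": [[("customer_id", ASCENDING)]],
    },
}

for name, spec in target_schema.items():
    db.create_collection(name, validator=spec["validator"])
    for keys in spec["indexes"]:
        db[name].create_index(keys, unique=True)
```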
3) Data Migration
- Column mapping – the assessment report generates an inventory of source database table fields alongside the recommended target schema structure; it also provides a recommended field mapping from source to target based on the adopted recommendations and DCC customization
- Post-migration data validation script – once data migration is complete, DCC generates a data validation script that takes the field mapping from the related assessment and recommendation reports into consideration
- Data migration script for execution – DCC allows the setup and configuration of different scripts for data migration (sketched after this list), such as:
- One-time data migration from source to target
  - Daily batch runs to keep source and target data in sync
  - Intermittent data validation during data migration (if any discrepancies are found in validation, the job stops and generates a report with the potential root cause of the issue)
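As a rough illustration of the one-time migration and validation scripts described above (not DCC’s actual output), the following sketch copies rows in batches according to a hypothetical field mapping, then halts with a report if source and target row counts diverge.

```python
# One-time migration plus post-migration validation, sketched with SQLAlchemy
# and PyMongo. Table, collection, and column names are illustrative.
from sqlalchemy import create_engine, text
from pymongo import MongoClient

src = create_engine("postgresql://user:pass@source-host/appdb")  # placeholder
dst = MongoClient("mongodb://localhost:27017")["appdb"]["customers"]

# Field mapping as produced by the assessment report (hypothetical columns).
FIELD_MAP = {"cust_id": "customer_id", "cust_name": "name"}

with src.connect() as conn:
    batch = []
    for row in conn.execute(text("SELECT cust_id, cust_name FROM customers")):
        doc = {target: row._mapping[source] for source, target in FIELD_MAP.items()}
        batch.append(doc)
        if len(batch) == 1000:                 # insert in batches for throughput
            dst.insert_many(batch)
            batch.clear()
    if batch:
        dst.insert_many(batch)

    # Validation pass: stop and report if source and target counts diverge.
    src_count = conn.execute(text("SELECT COUNT(*) FROM customers")).scalar()
    dst_count = dst.count_documents({})
    if src_count != dst_count:
        raise RuntimeError(f"Row-count mismatch: source={src_count}, target={dst_count}")
```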
The Capgemini team has successfully implemented and deployed the DCC tool for various banking customers for end-to-end RDBMS-to-NoSQL migration, including application retrofit and rewiring using other capable tools such as CAP360.
Case study 1: Migration from Mainframe to MongoDB for a Large European Investment Bank
A large banking client faced significant challenges: difficulty growing and scaling, low resilience and increased risk, and rising costs driven by the advent of mobile banking and a significant increase in volume. To help the client evolve more quickly, Capgemini built an Operational Data Platform to offload expensive mainframe operations and to store and process customer transactions for business operations, analysis, and reporting.
The Challenge:
- Inefficient and slow to meet customer demand for new digital banking services due to heavy reliance on legacy infrastructure and apps
- Continued growth in traffic and the launch of new digital services led to increased operating costs and decreased performance
- The mainframe was a single point of failure for many applications; outages resulted in poor customer service, brand erosion, and regulatory concerns
The Approach:
An analysis of digital channels revealed that 92% of traffic was generated by 25 interaction types, 85% of which were read-only. To offload these operations from the mainframe, an operational data lake (ODL) was created. The MongoDB-based ODL was updated in near real time via change data capture and a messaging queue, powering existing apps, new digital services, and other APIs.
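The blog doesn’t specify the exact CDC pipeline, but the pattern is a familiar one. The sketch below assumes Kafka as the messaging queue, with a hypothetical topic and transaction-id field: change events are consumed and applied to the ODL as idempotent bulk upserts to keep replication lag low.

```python
# Hedged sketch of the ODL update path: consume change events from a message
# queue (Kafka here, as one common choice) and upsert them into MongoDB in
# near real time. Topic and field names are assumptions, not the bank's feed.
import json
from kafka import KafkaConsumer
from pymongo import MongoClient, UpdateOne

consumer = KafkaConsumer(
    "mainframe.transactions",                    # hypothetical CDC topic
    bootstrap_servers="broker:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
odl = MongoClient("mongodb://odl-cluster:27017")["bank"]["transactions"]

buffer = []
for event in consumer:
    txn = event.value
    # Idempotent upsert keyed on the source transaction id.
    buffer.append(UpdateOne({"_id": txn["txn_id"]}, {"$set": txn}, upsert=True))
    if len(buffer) >= 500:                       # apply in bulk to keep lag low
        odl.bulk_write(buffer, ordered=False)
        buffer.clear()
```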
Outcome and Benefits:
- Accelerated time to market for new digital services, including personalization
- Improved stand-in capability to support resiliency during planned and unplanned mainframe outages
- Reduced the number of read-only transactions hitting the mainframe (MIPS cost), freeing up resources for additional growth
- Saved the customer over 80% in year-on-year post-migration costs. The new MongoDB database seamlessly handles 25 million+ transactions per day and stores over 30 months of history: ~13 billion transactions held in 114 million documents
Case study 2: Migration of Large-scale Database from Legacy to MongoDB for US-based Insurance Customer
A US-based insurance client’s data was spread across 100+ disparate systems, making data aggregation a cumbersome process. The client wanted access to the many data points surrounding a single customer without hindering the performance of the overall system.
The Challenge:
- Reconciling different data schemas from multiple systems into a single schema is problematic and, in many cases, impossible.
- When adding new data sources, it is difficult to iterate on the schema quickly. Providing access to the data within the “single view” requires ad hoc queries as well as multi-layer indexing and aggregation, which are complicated for relational databases to support.
- Lack of personalization and of the ability to provide context-based experiences in real time results in lost business opportunities.
The Approach:
To assist customer service reps in real time, we built “The Wall,” a single-view application that pulls disparate data from legacy systems for analytics. Additionally, we designed a flexible data model to aggregate disparate data into a single data store.
MongoDB’s expressive query language and secondary indexes can reach any field in real time, making data access faster and easier.
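To make the single-view idea concrete, here is a minimal sketch with illustrative collection and field names (not the client’s actual model): one document per customer aggregates data from several legacy systems, and a secondary index on an embedded field keeps ad hoc lookups fast.

```python
# Sketch of the single-view pattern: one document per customer aggregating
# data from several legacy systems, queryable on any field.
from pymongo import MongoClient, ASCENDING

db = MongoClient("mongodb://localhost:27017")["insurance"]
db.customers.insert_one({
    "customer_id": 1001,
    "name": "Jane Doe",
    "policies": [                                # from the policy admin system
        {"type": "auto", "premium": 1200},
        {"type": "home", "premium": 900},
    ],
    "claims": [{"claim_id": "C-88", "status": "open"}],        # claims system
    "interactions": [{"channel": "phone", "date": "2020-06-01"}],
})

# Secondary index on an embedded field keeps ad hoc lookups fast.
db.customers.create_index([("claims.status", ASCENDING)])
print(db.customers.find_one({"claims.status": "open"}, {"name": 1, "claims": 1}))
```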
Our approach was designed around four key foundations:
- Document Model – a rich and flexible data store. A single document can hold up to 16 MB of data, and 20+ data types provide flexibility in managing data
- Versatility – a variety of structured and unstructured data models defined
- Analytics – a strong aggregation framework to bring together all the data related to a single customer (sketched after this list)
- Workload Isolation – operational and analytical workloads run in parallel on the same cluster
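As a sketch of the analytics and workload-isolation foundations (reusing the illustrative single-view model above), the pipeline below computes total premium per customer server-side with the aggregation framework, while the read preference routes analytical reads to secondary nodes so operational traffic on the primary is undisturbed.

```python
# Illustrative aggregation over the single-view collection: total premium per
# customer, computed server-side, with analytical reads routed to secondaries.
from pymongo import MongoClient, ReadPreference

client = MongoClient("mongodb://localhost:27017")      # placeholder cluster
# Route analytical queries to secondaries, keeping the primary for operations.
analytics_db = client.get_database(
    "insurance", read_preference=ReadPreference.SECONDARY_PREFERRED)

pipeline = [
    {"$unwind": "$policies"},                          # one record per policy
    {"$group": {
        "_id": "$customer_id",
        "name": {"$first": "$name"},
        "total_premium": {"$sum": "$policies.premium"},
    }},
    {"$sort": {"total_premium": -1}},
]
for row in analytics_db.customers.aggregate(pipeline):
    print(row)
```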
Outcome and Benefits:
Our largest insurance customer attained a single view of the customer within 90 days. Another insurance customer achieved a 360-degree view of 13 million customers on MongoDB Enterprise Advanced. And yet another customer, in healthcare, achieved as much as a 3x reduction in processing times and increased processing throughput with 50% less hardware.
Ready to accelerate your digital transformation? Capgemini and MongoDB can help you re-envision your data and advance your business processes so you can focus on innovation. Reach out today to get started.