
Case Study: Testing Data Lake Applications in Financial Services

Analytics & Modeling - Data Mining
Analytics & Modeling - Predictive Analytics
Functional Applications - Enterprise Resource Planning Systems (ERP)
Finance & Insurance
Business Operation
Quality Assurance
Software Design & Engineering Services
System Integration
The Bank issued a Request for Information (RFI) to evaluate a test data solution that would enable multiple teams to perform a complete range of testing operations in a highly efficient and scalable manner. They focused on synthetic test data generation because of its ability to produce highly controlled data variations in multiple data formats and its inherent data security. They were looking for a solution that would meet their needs for automated unit testing, exhaustive functional testing, and performance testing.

GenRocket responded to the RFI with a complete Test Data Automation solution and participated in a rigorous Proof of Concept (POC). The combined RFI/POC process included several test data challenges, which were incorporated into four use cases reflecting the Bank's application testing requirements. The Bank also wanted to evaluate the management and scalability of the system. The POC requirements are briefly outlined below.

System Setup and User Account Management
The Bank required the platform to provide control over access to the system and its resources, along with reporting on various aspects of system operations.
A multi-national banking and financial services corporation required comprehensive test data automation for testing its data lake applications. The company offers financial products for retail banking, direct banking, commercial banking, investment banking, wholesale banking, private banking, asset management, and insurance services. They operate in more than 40 countries and rank as one of the world's largest banks.

Data lakes are repositories for large amounts of data collected from multiple sources in a raw and native format. They eliminate information silos by combining data from diverse sources such as electronic banking systems, IoT devices, social media sites, and internal collaboration systems. Data may be stored in structured, semi-structured, or unstructured formats. Increasingly, banks are using data lakes to turn big data into actionable business intelligence that drives profitable business outcomes. Electronic data is growing at a phenomenal rate due to the rise in online banking and the digital transformation of the customer experience.
GenRocket worked closely with one of its global IT services partners to jointly conduct the POC with the Bank. The process showed how GenRocket can combine speed of provisioning with full control over data quality in a way no other test data solution can match. This successful POC resulted in the selection of GenRocket as the best solution for the Bank's data lake testing requirements. Several GenRocket capabilities combined to make this evaluation a success.

Modular Architecture
GenRocket's component-based architecture provides the flexibility to design any variation or volume of test data with assured referential integrity. Powerful data generators and receivers allow the Bank's QA team to conduct exhaustive testing with extensive control over data combinations, patterns, and permutations. The ability to query external data sources allowed testers to combine real-world production data with controlled synthetic data, while the use of synthetic data provided total security and compliance with privacy laws.

Model-Based Test Data
Because GenRocket's test data is based on the customer's data model, any database schema, DDL file, or metadata contained in a CSV file can be used to structure test data that accurately reproduces the original database or file format. Data models can be imported and immediately used to define test data scenarios for generating real-time test data on demand.
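To illustrate the general idea of model-driven synthetic data generation with referential integrity, the sketch below shows a simplified Python example. It is a hypothetical illustration only, not GenRocket's actual API: the schema names, field lists, and helper functions are assumptions made for the example, as if they had been derived from a DDL file or CSV metadata.

```python
import csv
import random
import string

# Hypothetical schemas, as they might be derived from a DDL file or CSV metadata.
ACCOUNT_SCHEMA = ["account_id", "customer_name", "branch_code", "balance"]
TXN_SCHEMA = ["txn_id", "account_id", "amount", "currency"]

def random_name():
    """Produce a fully synthetic customer name (no real customer data involved)."""
    return "".join(random.choices(string.ascii_uppercase, k=8))

def generate_accounts(count):
    """Generate synthetic parent (account) records."""
    return [
        {
            "account_id": i + 1,
            "customer_name": random_name(),
            "branch_code": f"BR{random.randint(100, 999)}",
            "balance": round(random.uniform(0, 1_000_000), 2),
        }
        for i in range(count)
    ]

def generate_transactions(accounts, count):
    """Generate synthetic child (transaction) records whose account_id values
    always reference an existing account, preserving referential integrity."""
    return [
        {
            "txn_id": i + 1,
            "account_id": random.choice(accounts)["account_id"],
            "amount": round(random.uniform(-5_000, 5_000), 2),
            "currency": random.choice(["USD", "EUR", "GBP"]),
        }
        for i in range(count)
    ]

def write_csv(path, schema, rows):
    """Write generated rows as flat CSV, one common data lake ingestion format."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=schema)
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    accounts = generate_accounts(1_000)
    transactions = generate_transactions(accounts, 10_000)
    write_csv("accounts.csv", ACCOUNT_SCHEMA, accounts)
    write_csv("transactions.csv", TXN_SCHEMA, transactions)
```

Because every child record draws its foreign key from the generated parent set, the output remains internally consistent no matter how large the volume, which is the property the case study highlights as assured referential integrity.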
File 1 = 1,000 records
File 2 = 100,000 records
File 3 = 1,000,000 records
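Assuming the three files above were flat extracts used for volume and performance testing (the file names and record layout below are hypothetical), a minimal Python sketch of producing the same tiered volumes might look like this:

```python
import csv
import random

# Hypothetical record layout; the actual file formats were not specified in the case study.
FIELDS = ["record_id", "account_id", "amount"]

# The three volume tiers from the POC: 1,000 / 100,000 / 1,000,000 records.
VOLUME_TIERS = {"file1.csv": 1_000, "file2.csv": 100_000, "file3.csv": 1_000_000}

def generate_file(path, record_count):
    """Stream synthetic records to disk row by row so that even the
    1,000,000-record file is produced without holding all rows in memory."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(FIELDS)
        for i in range(record_count):
            writer.writerow([
                i + 1,
                random.randint(1, 10_000),
                round(random.uniform(-5_000, 5_000), 2),
            ])

for path, count in VOLUME_TIERS.items():
    generate_file(path, count)
```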