Snowflake Migration QA Lead Resume
SUMMARY:
- Has 15+ years of experience in Data Analytics and Quality Assurance across ETL, Data Warehouse and Big Data implementations, including defining ETL requirements and creating mappings and validation scripts. Involved in test plan creation and in defining the strategy and approach for data validation and certification, with a focus on requirement analysis and ETL & MDM solutions for data-driven customer applications using Informatica with databases such as Oracle, DB2, Hive and the Snowflake cloud database. Also has 3+ years of experience in mobile application and functional testing.
- Extensive experience in analyzing business & functional requirements, data analysis, use case design, business acceptance certification, data profiling, cleansing, and data verification and validation methodologies for data warehouses
- Strong experience in capturing business requirements for various source dataflows and converting those requirements into ETL mappings based on business rules and priorities.
- Worked on solution approaches for project proposals and estimation; created and defined ETL integration & migration test plans and strategies covering ETL and MDM tools, BI reports and Big Data cloud validation.
- Designed and implemented automated data quality assurance for metadata, data and converted Informatica code as part of the Snowflake cloud data migration.
- Experience in Master Data Management (MDM), including creation of data profiling rules, validation of match and merge rules, and data stewardship.
- Contributed value-add features to the organization-specific, customizable automation tool for metadata validation and data comparison between source and target systems: table to table, file to file and file to table (a reconciliation sketch follows this summary).
- Certified SAFe Agilist; applied SAFe on the Confidential project
- Strong analytical skills that help customers bridge the gap between business and technical development of data transformation rules and ingestion in a data lake environment
- Guide and mentor offshore and onshore team members in preparing ETL designs, mappings and data validation scenarios, SQL scripts, defect root cause analysis and data comparison for the various transformation and business rules.
- Played a lead/manager role in testing projects that brought multiple new sources, enhancements and emergency fixes into the data warehouse, and trained teams on cloud systems such as AWS.
- Work on solution approaches for project proposals and estimation covering various ETL and MDM tools, BI report testing and Big Data cloud testing.
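A minimal illustration of the table-to-table reconciliation approach described above; the schema, table and column names (stg.customer_src, dw.customer_tgt, customer_id, etc.) are hypothetical placeholders, not objects from any client system:

```sql
-- Illustrative table-to-table reconciliation between a staged source extract
-- and the target warehouse table (all object names are hypothetical).
-- 1. Row-count check
SELECT (SELECT COUNT(*) FROM stg.customer_src) AS src_row_count,
       (SELECT COUNT(*) FROM dw.customer_tgt)  AS tgt_row_count;

-- 2. Row-level differences in both directions
--    (MINUS works on Oracle and Snowflake; DB2 uses EXCEPT instead)
SELECT 'missing_in_target' AS issue, d.*
FROM  (SELECT customer_id, first_name, last_name FROM stg.customer_src
       MINUS
       SELECT customer_id, first_name, last_name FROM dw.customer_tgt) d
UNION ALL
SELECT 'extra_in_target' AS issue, d.*
FROM  (SELECT customer_id, first_name, last_name FROM dw.customer_tgt
       MINUS
       SELECT customer_id, first_name, last_name FROM stg.customer_src) d;
```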
TECHNICAL SKILLS:
Cloud Computing: Snowflake Cloud, AWS, Hive, Salesforce
ETL Tools: Informatica, MSBI
Databases: MS SQL Server 2008, Oracle 11g, DB2, Snowflake
Master Data Management Tools: Oracle Customer Data Hub, Oracle Product Data Hub (PIM)
Agile Methodology Tools: JIRA and Rally
Other Utilities: HP Application Lifecycle Management 12.20, Microsoft Team Foundation Server, MS Office, Unix
EXPERIENCE:
Confidential
Snowflake Migration QA Lead
Responsibilities:
- Created Snowflake Migration test plan and test strategy
- Created separate test execution plans for history migration, incremental data loads and catch-up loads
- Created the source data availability plan for incremental and catch-up loads
- Created the strategy for the data loads required for validation and defined the parallel load approach to ensure the quality of the code conversion
- Actively participate in regular discussions with the various Product Owners (SMEs), marketing leads and architects to understand the business requirements.
- Create the metadata validation plan and guide the team in creating scripts that run against and compare the two environments (see the metadata comparison sketch after this list)
- Analyze the end-to-end Informatica data flows and define the validation approach for history and incremental data loads
- Responsible for driving the entire QA migration project, from requirement gathering to test plan creation, working with the various teams to articulate the dependencies between applications and test runs
- Ensured requirement coverage by tracking new requirements and changes as CRs
- Create and review milestones and various testing approaches.
- Review the DB2 and Snowflake validation scripts and ensure PII fields are masked and roles are created as per the organization's access policy
- Involved in various phases of the project development cycle: requirement gathering, design, SIT, UAT support and production implementation support
- Guide the onsite and offshore team on application flow, test runs, prioritization of deliverables and defect tracking
- Train the team to run the conversion tool that converts Informatica mappings into the format required by Snowflake
- Guide the team in validating the converted Informatica mappings and the Snowflake-specific DDL, which defines how tables are created in Snowflake for the corresponding source data types
- Prepare daily and weekly status reports and review project milestones, risks and dependencies with the customer and business teams
- Drive defect triage calls with the application and DBA teams and bring conflicts between teams to closure
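A minimal sketch of the metadata comparison described in this list, assuming a hypothetical SALES schema: column metadata is extracted from the DB2 source, staged in Snowflake (QA_STG.DB2_COLUMNS is an assumed staging table), and compared against Snowflake's own information schema. In practice a DB2-to-Snowflake data type mapping would be applied before flagging type mismatches.

```sql
-- Step 1 (run on DB2): extract column metadata for the schema being migrated.
--   SELECT tabname, colname, typename, length
--   FROM syscat.columns
--   WHERE tabschema = 'SALES';
-- Step 2 (run on Snowflake): load that extract into a staging table and flag
-- columns that are missing or typed differently after migration.
SELECT COALESCE(src.tabname, tgt.table_name)  AS table_name,
       COALESCE(src.colname, tgt.column_name) AS column_name,
       src.typename                           AS db2_type,
       tgt.data_type                          AS snowflake_type
FROM   qa_stg.db2_columns src
FULL OUTER JOIN sales_db.information_schema.columns tgt
       ON  UPPER(src.tabname) = tgt.table_name
       AND UPPER(src.colname) = tgt.column_name
WHERE  src.colname IS NULL                    -- not present in the DB2 extract
   OR  tgt.column_name IS NULL                -- not created in Snowflake
   OR  UPPER(src.typename) <> tgt.data_type;  -- type mismatch (apply type mapping first)
```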
Confidential
Analytical Test Lead
Responsibilities:
- Identified customer feedback and generated metadata for the customer survey using Informatica and DB2, creating business rules to pull records from the ODS sourced from Salesforce
- Analyze and load voice call and other unstructured data into a machine-learning-enabled tool to identify sentiment and customer emotions.
- Actively participate in regular discussions with the various Product Owners (SMEs), marketing leads and architects to understand the business requirements for Confidential.
- Ensured requirement coverage by tracking new requirements and changes to project requirements as user stories
- Prioritize requirements based on business discussions and roll them up to the release deliverables in the Agile model
- Create the test strategy and design from source-to-target mapping documents and the required data attributes to analyze and understand customer expectations
- Create and review the architecture, dataflow diagrams and requirement documents, and prepare the test plan, milestones and testing approaches.
- Create DB2 and Oracle scripts to validate PII obfuscation and ensure data security is enabled and protected (see the sketch after this list)
- Involved in various phases of the project development cycle: requirement gathering, design, SIT, UAT support and production implementation support
- Interacted with clients, business analysts and FSD owners for clarifications and issues, and passed the gathered inputs on to the offshore team.
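A minimal example of the kind of PII obfuscation check referred to above; the table, columns and masking patterns are hypothetical, and any row returned indicates a value that escaped masking:

```sql
-- Illustrative PII obfuscation check, runnable on DB2 or Oracle
-- (ods.customer_contact and the masking patterns are hypothetical).
SELECT customer_id, ssn, email
FROM   ods.customer_contact
WHERE  ssn   NOT LIKE 'XXX-XX-%'               -- SSN should be masked
   OR  email NOT LIKE '%@masked.example.com';  -- email domain should be obfuscated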
Confidential
Test Lead - Big Data QA Lead
Responsibilities:
- Actively participate in regular discussions with the various Product Owners (SMEs) to understand the business requirements for the Customer Data Lake project.
- Educated the offshore Big Data QA and development teams on the business context and guided the entire team towards the required solutions
- Ensured requirement coverage by tracking new requirements and changes to project requirements as JIRA user stories
- Review all deliverables related to the Big Data requirement documents and create the test strategy, plan and scenarios/use cases
- Validated and automated Customer Data Lake data validation using the Cognizant Information Value Management tools Data Comparator and TASQ
- Created DB2 SQL scripts to automatically validate PII obfuscation and ensure data security is enabled and protected (a sketch follows this list)
- Customer 360-degree tools are used to expose reports to data stewards so they can create/update record hierarchies
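A minimal sketch of an automated DB2 obfuscation check of the kind mentioned above (the table and columns are hypothetical); it returns a violation count per table so the run can be marked pass/fail without manual inspection:

```sql
-- Illustrative automated check: a non-zero count fails the run.
SELECT 'CUST_PROFILE' AS table_name,
       COUNT(*)       AS unmasked_pii_rows
FROM   lake.cust_profile
WHERE  national_id  NOT LIKE 'XXX%'   -- national ID should be masked
   OR  phone_number NOT LIKE '***%';  -- phone number should be masked
```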