
Data and ETL Quality Engineer Resume


SUMMARY

  • Almost 6 years of IT experience in data quality testing across ETL, Big Data, and Business Intelligence reporting/dashboards.
  • Extensively used ETL tools such as Informatica PowerCenter 9.6/9.1/8, Informatica Data Quality (IDQ) 9.6/9.1, Informatica DVO 9.6.0, Microsoft SSIS, and IBM DataStage for extracting, transforming, loading, and cleansing data from various sources to various targets, in batch and real time, and for automating ETL testing.
  • Knowledge of data warehouse approaches (top-down, Inmon; bottom-up, Kimball) and dimensional modeling methodologies (star schema and snowflake schema).
  • Working experience with Agile methodologies and ceremonies, including the Scrum process and Kanban.
  • Extensive experience in ETL/data warehouse backend testing and Business Intelligence report testing.
  • Excellent knowledge in requirement analysis, test planning, test strategies and test schedules.
  • Extensive experience in testing and reviewing of dimensional modeling (Star and Snowflake) of data warehouse.
  • Experience in Change Data Capture (CDC), daily load strategies for data warehouses and data marts, slowly changing dimensions (Type 1, Type 2, and Type 3), surrogate keys, and data warehouse concepts.
  • Experience in debugging, troubleshooting, bug fixing and Performance tuning of Informatica mappings and Data Warehouse loads to resolve any data inconsistencies across loads.
  • Used Informatica Data Validation Option (DVO) to complete data testing quickly and easily by creating rules that test the data being transformed during the data integration process.
  • Experience in leading, communicating, and managing SLAs and the expectations of senior management and affected stakeholders during the planning and rollout of project releases.
  • Experience with Microsoft Azure Data Lake, Azure Data Factory, Azure SQL Data Warehouse, Azure Blob Storage, and Azure Storage Explorer.
  • Knowledge of AWS Lambda, S3, EC2, SQS, RDS, EMR, Aurora, and Glue.
  • Possess specific experience performing testing including Backend, Frontend, Regression, Functional, System, Interface, Usability, Black box, White box, Integration testing and User Acceptance Testing.
  • Proficient in troubleshooting database issues and experienced in using performance monitor/profiler to resolve deadlocks and long-running queries.
  • Experience in Business Intelligence testing in various reports using IBM Cognos, SAP Business Objects, MicroStrategy, Microsoft Power BI and Microsoft SSRS.
  • Excellent knowledge of Microsoft 365, Microsoft Visio, and Microsoft TFS.
  • Strong analytical and quantitative skills, with the ability to use data and metrics to back up assumptions and recommendations and to drive actions.
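The slowly-changing-dimension experience above rests on one invariant that is worth testing on every load: a Type 2 dimension may carry many historical rows per business key but exactly one current row. A minimal sketch of that check, using illustrative table and column names (not from any actual engagement):

```python
import sqlite3

# Hypothetical customer_dim table illustrating a Type 2 slowly changing
# dimension: many historical rows per business key are allowed, but each
# key must have exactly one current row (is_current = 1).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer_dim (
    customer_sk INTEGER PRIMARY KEY,   -- surrogate key
    customer_id TEXT,                  -- business key
    city        TEXT,
    is_current  INTEGER
);
INSERT INTO customer_dim VALUES
    (1, 'C001', 'Austin',  0),
    (2, 'C001', 'Dallas',  1),
    (3, 'C002', 'Houston', 1);
""")

# Business keys violating the "exactly one current row" rule.
violations = conn.execute("""
    SELECT customer_id
    FROM customer_dim
    GROUP BY customer_id
    HAVING SUM(is_current) <> 1
""").fetchall()
print(violations)  # an empty list means the Type 2 history is consistent
```

The same HAVING-clause pattern extends to other SCD invariants, such as non-overlapping effective-date ranges per business key.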

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter, Informatica DVO, Microsoft SSIS, IBM DataStage, Microsoft Azure Databricks, Microsoft Azure Data Factory (ADF), Confidential Web Services

Business Intelligence and Reporting Tools: Microsoft SSRS, Microsoft Power BI, MicroStrategy, IBM Cognos, SAP Business Objects and Crystal Reports

Database: Microsoft SQL Server, Teradata, Oracle PL/SQL, IBM DB2, Microsoft Azure database, Confidential RDS

IT Service Management: Incident Management, Change Management, Problem Management, Knowledge management, Project Management

ALM/DevOps Tools: ServiceNow, BMC Remedy, HP Service Manager, HP Quality Center, Microsoft TFS, Jira, Trello

Languages: SQL, Unix shell scripting

PROFESSIONAL EXPERIENCE

Confidential

Data and ETL Quality Engineer

Responsibilities:

  • Work closely with developers, project owners, and BAs to develop and execute thorough test suites in all phases of the software development cycle
  • Develop test strategies and test plans/designs, execute test cases, and manage defects for the ETL & BI systems
  • Develop and execute detailed ETL-related functional, performance, integration, and regression test cases and documentation
  • Created and executed ETL test cases covering source-to-target row count validation, data transformation validation, data length validation, duplicate data checks, audit checks, and performance checks.
  • Created and tested source-to-target mappings (STM)
  • Developed rules and mapplets that are commonly used in different mappings
  • Performed data analysis, identified data dependencies with the source systems to create test data sets.
  • Used various transformations such as Address Validator, Parser, Joiner, Filter, and Match to develop the maps
  • Worked in data warehouse migration project from Teradata to Microsoft SQL Server
  • Worked for migration of mapping from Informatica Power Center to Microsoft SSIS
  • Worked on Big data Hadoop projects
  • Created Hive queries which helped analysts spot emerging trends by comparing fresh data with historical claim metrics.
  • Created complex mappings in Power Center Designer using Aggregate, Expression, Filter, Sequence Generator, Union, Lookup, Joiner, XML Source Qualifier and Stored procedure transformations.
  • Worked on PowerCenter tools such as Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Worked on multiple projects using the Informatica Developer tool (IDQ), versions 9.1.0 and 9.5.1.
  • Analyzed records loaded into staging table that are extracted from several tables in FACETS core database.
  • Involved in migration of the maps from IDQ to PowerCenter
  • Applied the rules and profiled the source and target table's data using IDQ.
  • Worked on different environments with different source and target databases like Teradata, DB2, and SQL server.
  • Validated various Power BI, MicroStrategy dashboards and SSRS reports based on business requirements
  • Facilitated UAT with Product owners and Business owners
  • Worked on Microsoft Team Foundation Server/Azure DevOps for Agile to create Epics, Features, User Stories and tasks
  • Engage with business users to gather requirements, design visualizations and provide training to use self-service BI tools.
  • Developed and maintained multiple Power BI dashboards/reports and content packs
  • Created POWER BI Visualizations and Dashboards as per the requirements
  • Actively participated and conducted Agile ceremonies like Stand-ups, User Story grooming, Sprint Review, Sprint Planning, Story pointing, prioritization meeting and Retro meetings.
  • Working on the Microsoft Azure suite to implement ETL and data migration solutions using Azure SQL Database, Azure Data Lake Storage (ADLS), Azure Data Factory (ADF) V2, and Azure SQL Data Warehouse.
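The source-to-target count and duplicate checks listed above reduce to a handful of SQL queries that can be driven from a script. A minimal sketch, with hypothetical table names and sample data standing in for real source and target systems:

```python
import sqlite3

# Hypothetical source/target tables for two common ETL checks:
# row-count reconciliation and duplicate detection on the target key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_orders (order_id INTEGER, amount REAL);
CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
INSERT INTO src_orders VALUES (1, 10.0), (2, 20.5), (3, 7.25);
INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.5), (3, 7.25);
""")

# Count validation: the load must move every source row.
src_count = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
assert src_count == tgt_count, f"count mismatch: {src_count} vs {tgt_count}"

# Duplicate check: no order_id should appear more than once in the target.
dupes = conn.execute("""
    SELECT order_id, COUNT(*) FROM tgt_orders
    GROUP BY order_id HAVING COUNT(*) > 1
""").fetchall()
assert dupes == [], f"duplicate keys in target: {dupes}"
print("counts match, no duplicates")
```

In practice the same queries are parameterized per table pair and scheduled after each load, so a regression in the ETL surfaces as a failed assertion rather than a downstream report defect.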

Tools: Informatica PowerCenter 10/9.6, Informatica Data Quality (IDQ) 9.6, Teradata, Microsoft SQL Server 2014, MicroStrategy, Microsoft Power BI, Microsoft SSIS, Microsoft SSRS, Microsoft TFS, ServiceNow, Microsoft Azure, HDFS, Hive, HP Tidal Jobs, Agile, Microsoft Office 365, Unix

Confidential

Data and ETL Quality Engineer

Responsibilities:

  • Wrote test scenarios, generic test cases, and detailed positive and negative test cases for ETL.
  • Performed functional testing and prepared regression test scripts for ETL process.
  • Executed test scripts per defined ETL testing processes.
  • Manually performed integration and regression testing, documented bugs and worked with development team to resolve issues.
  • Extensively used Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate score cards, create and validate rules and provided data for business analysts for creating the rules.
  • Used Informatica DVO table pair tests, value tests, and single-table constraint tests to validate source-to-target mappings.
  • Performed backend validation testing for Oracle database using Toad (test SQL of Source and destination using latest data mapping).
  • Wrote shell scripts to back up daily test data.
  • Participated in the design, development, implementation and maintenance of an on-going enterprise-wide Data Factory program.
  • Participated in data analysis in support of different business data requirements definition, UAT and training activities.
  • Assisted in the translation of other business data requirements into conceptual, logical, and physical data models.
  • Participated in identification of data sources for the required data attributes.
  • Developed data maps to document data attributes against data sources and data targets, including identification and documentation of data transformation algorithms.
  • Assisted in knowledge transfer to other technical staff members on business data processes.
  • Imported various Sources, Targets, and Transformations using Informatica Power Center Server Manager, Repository Manager and Designer.
  • Responsibilities included source system analysis, data transformation, loading, validation for data marts, operational data store and data warehouse.
  • Used heterogeneous sources such as Oracle, flat files, and SQL Server, and imported stored procedures from Oracle for transformations.
  • Created dashboard and report test plans and verified counts between summary and detail reports across various SAP BusinessObjects dashboards
  • Designed and coded maps, which extracted data from existing, source systems into the data warehouse.
  • Used Dimensional Modeling Techniques to create Dimensions, Cubes and Fact tables.
  • Worked on Power Center Tools like designer, workflow manager, workflow monitor and repository manager.
  • Tested standard and Ad hoc reports and undergone data validation for the Cognos reports.
  • Worked on multiple projects using the Informatica Developer tool (IDQ), versions 9.1.0 and 9.5.1.
  • Worked in the production support cycle to address ETL job failures and dashboard failures
  • Performed root cause analysis of ETL job failures and data anomalies
  • Participated in technical architecture documentation, project design, and implementation discussions
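The DVO table pair and value tests mentioned above boil down to set-difference queries between a source table and its target. A minimal sketch using SQLite's EXCEPT (the role MINUS plays in Oracle); the tables, columns, and the seeded mismatch are illustrative only:

```python
import sqlite3

# Sketch of a "table pair" value test: rows present in the source but
# missing or altered in the target, and vice versa.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_members (member_id INTEGER, plan_code TEXT);
CREATE TABLE tgt_members (member_id INTEGER, plan_code TEXT);
INSERT INTO src_members VALUES (1, 'PPO'), (2, 'HMO');
INSERT INTO tgt_members VALUES (1, 'PPO'), (2, 'EPO');  -- seeded mismatch
""")

src_only = conn.execute("""
    SELECT member_id, plan_code FROM src_members
    EXCEPT
    SELECT member_id, plan_code FROM tgt_members
""").fetchall()
tgt_only = conn.execute("""
    SELECT member_id, plan_code FROM tgt_members
    EXCEPT
    SELECT member_id, plan_code FROM src_members
""").fetchall()
print(src_only)  # [(2, 'HMO')] -- source value the target lost
print(tgt_only)  # [(2, 'EPO')] -- value the target holds instead
```

Running both directions matters: the first query catches dropped or corrupted rows, the second catches rows the load invented or transformed incorrectly.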

Tools: Informatica Power Center 9.6.1/9.5, Informatica Data Quality (IDQ) 9.6, IBM DB2 9.7, SQL Server 2008, Business Objects XI R2, SQL, PL/SQL, XML, T-SQL, IBM Cognos, HP Quality Center, Microsoft Visio, MicroStrategy, MS Office Suite, Unix

Confidential

ETL/BI Quality Engineer

Responsibilities:

  • Involved in QA Analysis, Design, and Testing of the application.
  • Created Test strategy and Test Plans, reviewed requirements to ensure they were clear and testable
  • Executed Test Scripts and Test Cases using manual testing procedures.
  • Prepared test design documents and QA project development design documents.
  • Coordinated test activities with all testing resources
  • Performed Regression testing on corporate and personal documents and fixed the errors.
  • Tested Data Stage ETL jobs according to data mapping requirements.
  • Verified that data moved correctly from the source database to the destination database by writing SQL queries.
  • Worked on Value added routines in Facets and provider and subscriber modules.
  • Involved in validating claims and invoices arriving as Electronic Data Interchange (EDI) transactions.
  • Involved in validating the communications, syntax, and compatibility of information between EDI transactions.
  • Involved in testing the XML files and checked whether data is parsed and loaded to staging tables.
  • Worked on improving stored procedure and trigger performance.
  • Conducted Black Box (Functionality Testing), User Acceptance, and Regression testing.
  • Created Traceability Matrix to ensure that all the requirements are covered by the test cases
  • Involved in testing Cognos reports for data quality and cosmetic accuracy according to requirements.
  • Conducted status report meetings with internal team on a weekly basis and Documented and tracked status meeting minutes and activities
  • Provided clear concise feedback to Development team on recurring errors both on individual and team with aim of long-term reduction of defects found in final releases
  • Provided on-call support for mission-critical (24x7) and problematic production databases, and owned problem resolution end to end
  • Worked extensively on migrating databases from servers in one data center to another and upgrading database platforms to newer versions
  • Documented technical details of issues resolved and assisted the team with building and maintaining technical repositories.
  • Worked closely with the infrastructure team to perform SQL Server installations and to configure hardware and software to function optimally with the DBMS; planned and implemented database upgrades as needed
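The XML-to-staging validation described above has one core check: every record element in the inbound file must land as exactly one staging row. A minimal sketch; the XML layout and table name are illustrative stand-ins, not a real EDI format:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical inbound claims file and staging table; the completeness
# check compares parsed element count against the staged row count.
xml_doc = """
<claims>
  <claim id="A100" amount="125.00"/>
  <claim id="A101" amount="89.50"/>
</claims>
"""
claims = ET.fromstring(xml_doc).findall("claim")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_claims (claim_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO stg_claims VALUES (?, ?)",
    [(c.get("id"), float(c.get("amount"))) for c in claims],
)

staged = conn.execute("SELECT COUNT(*) FROM stg_claims").fetchone()[0]
assert staged == len(claims), "staging row count differs from XML claim count"
print(staged)  # 2
```

Field-level checks (required attributes present, amounts parseable as numbers) follow the same pattern, failing the load before bad records reach the warehouse.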

Tools: MS SQL Server 2005, MS SQL Server Reporting Services 2005, MS SQL Server Integration Services 2005, Microsoft SSRS, Flat Files, DB2, TOAD 9.5, MS Office, XML, Windows Server 2003, Mercury Quality Center 9.2, Lotus Notes 7.0.3, Agile Scrum, Web services, HTML and JavaScript
