
Sr. ETL/Backend/QA Tester Resume


San Antonio, TX

Professional Experience:

  • 6+ years of IT experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, BI, SOA, and client/server applications.
  • Excellent testing experience in all phases and stages of the Software Testing Life Cycle and Software Development Life Cycle (SDLC), with good working knowledge of testing methodologies, disciplines, tasks, resources, and scheduling.
  • Solid back-end testing experience writing and executing SQL, PL/SQL, and T-SQL.
  • Strong database experience, including MS SQL Server 2000/2005/2008, Oracle 11g/10g/9i/8i, Teradata V2R6/12/13, and DB2 8.x.
  • Conducted functional, integration, regression, UAT, and performance testing of applications.
  • Very good understanding of data warehousing concepts, data analysis, and data warehouse architecture and design.
  • Experienced in planning and executing system, integration, functional, regression, performance, and UAT testing.
  • Expertise in creating test plan documents, developing test strategy documents, and preparing traceability matrices.
  • Expertise in designing test scenarios and scripting test cases to test applications.
  • Automated and scheduled Informatica jobs using UNIX shell scripting.
  • Experience in UNIX shell scripting and configuring cron jobs for ETL loads.
  • Extensively used ETL methodology to support data extraction, transformation, and loading in a corporate-wide ETL solution using tools such as Informatica.
  • Expertise in testing complex business rules by creating mappings and various transformations.
  • Strong in testing stored procedures, functions, triggers, and packages using PL/SQL.
  • Good experience in data modeling using Star and Snowflake schemas.
  • Good understanding of OOP concepts.
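The back-end testing summarized above typically reduces to comparing source and target result sets with SQL (row counts plus a minus/EXCEPT query). A minimal sketch of that pattern using Python's built-in sqlite3; the table and column names here are invented for illustration, not taken from any project below:

```python
import sqlite3

# In-memory database standing in for the warehouse under test.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_customer (id INTEGER, name TEXT)")
cur.execute("CREATE TABLE tgt_customer (id INTEGER, name TEXT)")
cur.executemany("INSERT INTO src_customer VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob"), (3, "Cy")])
cur.executemany("INSERT INTO tgt_customer VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob")])

# Count check: did every source row reach the target?
src_count = cur.execute("SELECT COUNT(*) FROM src_customer").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_customer").fetchone()[0]

# Minus check: which source rows are missing from the target?
missing = cur.execute(
    "SELECT id, name FROM src_customer EXCEPT SELECT id, name FROM tgt_customer"
).fetchall()

print(src_count, tgt_count, missing)  # 3 2 [(3, 'Cy')]
```

In Oracle the set operator would be MINUS rather than EXCEPT; the structure of the check is the same.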

Technical Skills:

OPERATING SYSTEMS

Windows XP/NT/95/2000, OS/2, Sun Solaris 2.6/2.7, Linux 6.x/7.x/8.x

LANGUAGES KNOWN

C, PL/SQL 8.x/2.x, SQL*Plus, Java, Shell Scripting, PERL

REPORTING TOOLS/OLAP

Crystal Reports 11, Business Objects 5.1/6.5.2/XI R2/XI R3, Cognos 6.5/7.0/8.x, SSRS, SSAS

SCRIPTING LANGUAGES

VBScript, JavaScript, Perl, XSLT

DATABASES

IBM DB2 8.x, Oracle 9i/10g, Teradata V2R6, Sybase 12.3, SQL Server 2000/2005/2008, Netezza NPS 8050

ETL TOOLs

Ab Initio (GDE 1.14/1.13/1.12, Co>Op 2.14/2.13/2.12), EME, Data Profiler, Informatica 8.6/8.1/7.1/6.2/5.1 (Power Mart/Power Center: Designer, Workflow Manager, Workflow Monitor, Server Manager, Power Connect), DataStage 8.1/7.5.1, SSIS/DTS

WEB SERVERS

Java Web Server 2.0, Netscape Enterprise Server, WebLogic 6.0

DATAMODELLING TOOLS

Erwin 3.5.1/4.x/7.x, Designer 2000

DATAWAREHOUSING

Star Schema, Snowflake Schema, Kimball Methodology, Bill Inmon Methodology

Confidential, San Antonio, TX Aug '11 - Present
Sr. ETL /Backend/QA Tester - Master Data Management, Experian CheetahMail Project

Responsibilities:
  • Analyzed business requirements, system requirements, and data mapping requirement specifications interacting with client, developers and QA team.
  • Involved in rigorous meetings with offshore testing team and DWH Lead to plan and implement the testing efforts.
  • Created test data for all ETL mapping rules to test the functionality of the Informatica workflows.
  • Wrote complex SQL queries for data validation to verify the SSIS packages and business rules.
  • Tested parent and child packages, data flow transformations, control flows, and scripting tasks.
  • Tested OLAP cubes for various business calculations and wrote MDX to validate them, working with MDX data types such as Scalar, Dimension, Hierarchy, Level, Member, Tuple, and Set.
  • Worked with SQL*Plus, SQL*Loader, Data Pump, Exp/Imp, TKPROF, Oracle scheduler, Advanced Queues, Object management.
  • Worked with MYSQL performance tuning and optimization. (query optimization, index tuning, caching and buffer tuning)
  • Tested reports using SSRS functionality such as queries, slice and dice, drill down, @Functions, and formulas.
  • Converted each report to Excel format and tested the resulting .xls files.
  • Tested all SSRS report formats, including Excel, PDF, CSV, XML, TIFF, and others.
  • Responsible for identifying and defining the Key Performance Indicators in SSAS.
  • Tested metadata, formulas, dimensions, measure groups, and aggregations in the cube.
  • Tested the KPIs, cube hierarchies, and cascading parameters.
  • Tested SSRS report functionality such as queries, slice and dice, drill down, cross tab, master-detail, ad hoc reports, and dashboards.
  • Tested Stored Procedures, Complex Queries, Triggers, UDF and Views using SQL Server 2005
  • Involved in understanding Logical and Physical Data model using Erwin Tool.
  • Prepared the guides for UAT.
  • Wrote complex SQL queries for data validation to verify the ETL mapping rules.
  • Trained the users before UAT on how to test and document the test results. Also, assisted the users during UAT. Took part in Triage Meetings with the required parties after defect analysis to prioritize defect resolution.
  • Tested several dashboards and deployed them across the organization to monitor the performance.
  • Used SQL tools to run queries and validate the data loaded into the target tables.
  • Validated the data of reports by writing SQL queries against ODS.
  • Involved in user training sessions and assisting in UAT (User Acceptance Testing).
  • Developed advanced SQL queries to extract, manipulate, and calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted.
  • Performed validation tests to ensure that the developed functionality meets the specifications prior to UAT testing.
  • Used Mercury Quality Center to document issues found during the test automation process, suggested appropriate solutions and prioritized defects for resolution in coordination with the development team
  • Functioned as the Onsite / Offshore coordinator and Team Lead
  • Created and executed Test scripts for system validation and user acceptance testing (UAT)
  • Created ETL execution scripts for automating jobs.
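A recurring check in the cube and measure-group testing above is that a stored rollup matches the sum of its detail rows. A hedged sketch of that cross-check in Python/sqlite3; the fact/aggregate schema and figures are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE fact_sales (region TEXT, amount REAL)")
cur.execute("CREATE TABLE agg_sales (region TEXT, total REAL)")
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [("East", 100.0), ("East", 50.0), ("West", 75.0)])
cur.executemany("INSERT INTO agg_sales VALUES (?, ?)",
                [("East", 150.0), ("West", 75.0)])

# Recompute the aggregate from detail and diff it against the stored rollup;
# any row returned here is a defect candidate.
mismatches = cur.execute("""
    SELECT d.region, d.total, a.total
    FROM (SELECT region, SUM(amount) AS total
          FROM fact_sales GROUP BY region) AS d
    JOIN agg_sales AS a ON a.region = d.region
    WHERE d.total <> a.total
""").fetchall()

print(mismatches)  # [] means detail and aggregate agree
```

Against a real cube the left side would be an MDX or relational query over the measure group rather than a sqlite subquery, but the diff-against-detail idea carries over.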

Environment: Informatica 8.6/8.1, MS SQL Server 2005/2008/2008 R2, SSRS, SSAS, MS Excel, MS Access, Oracle 10g, MySQL 5.0, Windows 2007, SQL Server 2008, Quality Center 9.2, T-SQL, SQL, MOSS, TOAD, Flat Files

Confidential, Richmond, VA Jun '09 - Jul '11
Sr ETL Tester - Customer Data Integration

Responsibilities:
  • Analyzed business requirements, system requirements, and data mapping requirement specifications interacting with client, developers and QA team.
  • This project delivered Customer Data Integration (CDI): the systems, processes, and rules required to harmonize disparate customer data into a best version of the truth.
  • Involved in rigorous meetings with offshore testing team and DWH Lead to plan and implement the testing efforts.
  • Wrote complex SQL queries for data validation to verify the SSIS packages and business rules.
  • Mapped metadata from the legacy source system to target database fields and was involved in creating Ab Initio DMLs.
  • Performed ETL data warehouse and database testing of the Ab Initio ETL process.
  • Created test plans and performed unit testing for the Ab Initio graphs.
  • Analyzed and verified flat-file, mainframe IMS, and XML data using Ab Initio Data Profiler.
  • Executed SQL queries in TOAD to validate back-end information and compared it with the information retrieved from the Ab Initio data warehouse.
  • Tested parent and child packages, data flow transformations, control flows, and scripting tasks.
  • Tested OLAP cubes for various business calculations and wrote MDX to validate them, working with MDX data types such as Scalar, Dimension, Hierarchy, Level, Member, Tuple, and Set.
  • Tested the SAS jobs in batch mode through UNIX shell scripts
  • Involved in code changes for SAS programs and UNIX shell scripts
  • Tested and Automated SAS jobs running on a daily, weekly and monthly basis using Unix Shell Scripting
  • Worked with SQL*Plus, SQL*Loader, Data Pump, Exp/Imp, TKPROF, Oracle scheduler, Advanced Queues, Object management.
  • Worked with MYSQL performance tuning and optimization. (query optimization, index tuning, caching and buffer tuning)
  • Converted each report to Excel format and tested the resulting .xls files.
  • Tested all SSRS report formats, including Excel, PDF, CSV, XML, TIFF, and others.
  • Maintain, improve and support current ETL and data integration processes.
  • Tested metadata, formulas, dimensions, measure groups, and aggregations in the cube.
  • Tested the KPIs, cube hierarchies, and cascading parameters.
  • Tested SSRS report functionality such as queries, slice and dice, drill down, cross tab, master-detail, ad hoc reports, and dashboards.
  • Involved in understanding Logical and Physical Data model using Erwin Tool.
  • Prepared the guides for UAT.
  • Wrote complex SQL queries for data validation to verify the ETL mapping rules.
  • Trained the users before UAT on how to test and document the test results. Also, assisted the users during UAT. Took part in Triage Meetings with the required parties after defect analysis to prioritize defect resolution.
  • Tested several dashboards and deployed them across the organization to monitor the performance.
  • Used SQL tools to run queries and validate the data loaded into the target tables.
  • Validated the data of reports by writing SQL queries against ODS.
  • Involved in user training sessions and assisting in UAT (User Acceptance Testing).
  • Developed advanced SQL queries to extract, manipulate, and calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted.
  • Performed validation tests to ensure that the developed functionality meets the specifications prior to UAT testing.
  • Used Mercury Quality Center to document issues found during the test automation process, suggested appropriate solutions and prioritized defects for resolution in coordination with the development team
  • Functioned as the Onsite / Offshore coordinator and Team Lead
  • Created and executed Test scripts for system validation and user acceptance testing (UAT)
  • Created ETL execution scripts for automating jobs.
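The "best version of the truth" harmonization this project describes can be tested by checking that duplicate source records collapse to exactly one survivor per customer match key. A simplified sketch; the match key (lowercased email) and the latest-update survivorship rule are assumptions for illustration, not the project's actual rules:

```python
# Raw records with duplicates differing only in case and recency.
records = [
    {"email": "ann@example.com", "name": "Ann", "updated": "2010-01-05"},
    {"email": "ANN@example.com", "name": "Ann B.", "updated": "2011-03-02"},
    {"email": "bob@example.com", "name": "Bob", "updated": "2009-12-01"},
]

# Group by the match key and keep the most recently updated record
# as the survivor (ISO dates compare correctly as strings).
survivors = {}
for rec in records:
    key = rec["email"].lower()
    if key not in survivors or rec["updated"] > survivors[key]["updated"]:
        survivors[key] = rec

print(len(survivors))  # 2 -- three raw records harmonized to two customers
print(survivors["ann@example.com"]["name"])  # Ann B. (latest record wins)
```

A test here asserts both the survivor count and which attribute values survived, mirroring how a tester would verify an MDM merge rule against expected output.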

Environment: Ab Initio (GDE 1.14, Co>Op 2.14), EME, Data Profiler, Teradata V2R6 (MLOAD, FLOAD, FAST EXPORT, BTEQ), SSIS/DTS, MS SQL Server 2005/2008, Quality Center 9.2, T-SQL, SQL, MOSS, TOAD, Flat Files

Confidential, Columbus, OH Mar '08 - Apr '09
DWH Tester - Enterprise Data Warehouse (EDW)

Responsibilities:

  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Developed test plans based on the test strategy; created and executed test cases based on the test strategy, test plans, and ETL mapping document.
  • Wrote complex SQL queries against different databases for the data verification process.
  • Designed the data flow diagrams using VISIO.
  • Prepared the Test Plan and Testing Strategies for Data Warehousing Application.
  • Preparation of technical specifications and Source to Target mappings.
  • Extensively used SQL programming in backend and front-end functions, procedures, packages to implement business rules and security
  • Written test cases to test the application manually in Quality Center and automated using Quick Test Pro.
  • Responsible for creating manual test scripts to include Functional Test, Regression Test, UAT, Migration Test and Study Configuration Test.
  • Analyzed SQL code to ensure Business Objects queries the correct data from the database.
  • Defects identified in the testing environment were communicated to the developers using the defect tracking tool Mercury Test Director.
  • Developed scripts, utilities, simulators, data sets, and other programmatic test tools as required to execute test plans.
  • Tested a number of complex ETL mappings, mapplets and reusable transformations for daily data loads.
  • Extensively used Informatica power center for extraction, transformation and loading process.
  • Created test cases for ETL mappings and design documents for production support.
  • Set up, monitored, and used the Job Control System in Development/QA/Prod.
  • Worked extensively with flat files and Excel sheet data sources; wrote scripts to convert Excel sheets to flat files.
  • Scheduled and automated jobs to run in a batch process.
  • Effectively communicated testing activities and findings in oral and written formats.
  • Reported bugs and tracked defects using Test Director 6.5
  • Worked with ETL group for understating mappings for dimensions and facts.
  • Extracted data from various sources like Oracle, flat files and SQL Server.
  • Worked on issues with migration from development to testing.
  • Extensively tested several Cognos reports for data quality, fonts, headers, and cosmetics.
  • Performed UAT testing
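Flat-file sources like the ones above are usually validated by reconciling the delimited file against the loaded staging table. A minimal sketch with Python's csv module and sqlite3; the pipe-delimited layout and table name are invented for illustration:

```python
import csv
import io
import sqlite3

# A pipe-delimited flat file standing in for a real extract
# (a real test would open the file SQL*Loader consumed).
flat_file = io.StringIO("id|name\n1|Ann\n2|Bob\n")

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_customer (id INTEGER, name TEXT)")
cur.execute("INSERT INTO stg_customer VALUES (1, 'Ann')")
cur.execute("INSERT INTO stg_customer VALUES (2, 'Bob')")

# Compare the file's rows to what actually landed in the staging table.
file_rows = {(int(r["id"]), r["name"])
             for r in csv.DictReader(flat_file, delimiter="|")}
table_rows = set(cur.execute("SELECT id, name FROM stg_customer").fetchall())

print(file_rows == table_rows)  # True when the load dropped nothing
```

Set comparison catches both dropped and spurious rows in one check; per-column diffs can then localize any mismatch.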

Environment: IBM InfoSphere DataStage, UNIX Shell Scripting, Business Objects 6.5, Oracle 10g, Mercury Test Director 6.5, QTP 7.2, SQL*Loader, Cognos 7.0, Oracle 8i, SQL Server 2000, Erwin 3.5, Windows 2000, TOAD 7.6

Confidential, Irving, TX Aug '06 - Jan '08
DWH Tester (Customer Data Mart - Global Consumer Group)

Responsibilities:

  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Developed test plans based on the test strategy; created and executed test cases based on the test strategy, test plans, and ETL mapping document.
  • Wrote complex SQL queries against different databases for the data verification process.
  • Designed the data flow diagrams using VISIO.
  • Prepared the Test Plan and Testing Strategies for Data Warehousing Application.
  • Preparation of technical specifications and Source to Target mappings.
  • Extensively used SQL programming in backend and front-end functions, procedures, packages to implement business rules and security
  • Written test cases to test the application manually in Quality Center and automated using Quick Test Pro.
  • Responsible for creating manual test scripts to include Functional Test, Regression Test, UAT, Migration Test and Study Configuration Test.
  • Analyzed SQL code to ensure Business Objects queries the correct data from the database.
  • Defects identified in the testing environment were communicated to the developers using the defect tracking tool Mercury Test Director.
  • Developed scripts, utilities, simulators, data sets, and other programmatic test tools as required to execute test plans.
  • Tested a number of complex ETL mappings, mapplets and reusable transformations for daily data loads.
  • Extensively used Informatica power center for extraction, transformation and loading process.
  • Created test cases for ETL mappings and design documents for production support.
  • Set up, monitored, and used the Job Control System in Development/QA/Prod.
  • Worked extensively with flat files and Excel sheet data sources; wrote scripts to convert Excel sheets to flat files.
  • Scheduled and automated jobs to run in a batch process.
  • Effectively communicated testing activities and findings in oral and written formats.
  • Reported bugs and tracked defects using Test Director 6.5
  • Worked with ETL group for understating mappings for dimensions and facts.
  • Extracted data from various sources like Oracle, flat files and SQL Server.
  • Worked on issues with migration from development to testing.
  • Extensively tested several Cognos reports for data quality, fonts, headers, and cosmetics.
  • Performed UAT testing
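The Excel-to-flat-file conversion scripts mentioned above can be sketched roughly as follows. The resume does not show the actual scripts or column layout, so this is purely illustrative, converting CSV (an Excel-exportable format) to pipe-delimited output; real .xls input would need a spreadsheet library:

```python
import csv
import io

# CSV input as Excel would export it.
csv_in = io.StringIO('id,name\n1,"Ann"\n2,"Bob, Jr."\n')
flat_out = io.StringIO()

# Re-emit every row pipe-delimited; csv handles the quoted comma correctly.
reader = csv.reader(csv_in)
writer = csv.writer(flat_out, delimiter="|", lineterminator="\n")
for row in reader:
    writer.writerow(row)

print(flat_out.getvalue())
# id|name
# 1|Ann
# 2|Bob, Jr.
```

Note the embedded comma in "Bob, Jr." no longer needs quoting once the delimiter is a pipe, which is exactly why such conversions must go through a CSV parser rather than a naive string replace.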

Environment: Informatica Power Center 7.1 (Designer, Workflow Manager, Workflow Monitor), UNIX Shell Scripting, Business Objects 6.5, Oracle 10g, Mercury Test Director 6.5, QTP 7.2, SQL*Loader, Cognos 7.0, Oracle 8i, SQL Server 2000, Erwin 3.5, Windows 2000, TOAD 7.6

EDUCATIONAL QUALIFICATION:

Bachelor of Technology in Computer Science
Masters in Software Engineering
