
Sr. ETL/BI Tester Resume


Dayton, OH

SUMMARY

  • Over 7 years of Software Quality Assurance (QA) experience testing Data Warehouse, Database (ETL & BI), Web, and Client-Server systems and applications across various industries.
  • Experience in defining Testing Methodologies; creating Test Plans and Test Cases; and verifying and validating Application Software and Documentation based on standards for Software Development and effective QA implementation in all phases of the Software Development Life Cycle (SDLC).
  • Strong in Software Analysis, Planning, Design, Development, Testing, Maintenance and Augmentation for various applications in data warehousing, metadata repositories, data migration, data mining and Enterprise Business Intelligence.
  • Expert in ETL, Data Warehousing, front-end and BI testing.
  • Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export through the use of multiple ETL tools such as Ab Initio and Informatica PowerCenter.
  • Worked with Regulatory Report Authoring & Publishing management systems (e.g., Dossier, Liquent Insight) and Documentum. Adhered to HIPAA standards for health insurance & Medicare claims; responsibilities ranged from functional requirement specification to testing claims for HIPAA-compliant format and ensuring all NCPDP formatting standards were met.
  • Extensive experience in writing SQL to validate the database systems and for backend database testing.
  • Extensive experience with IBM Mainframe for analyzing the data in designing the ETL Process.
  • Extensively worked on design and implementation of Database Management Systems such as Oracle 10g / 9i / 8i / 7.x, DB2 UDB, MS SQL Server, and MS Access.
  • Implemented all stages of development processes including extraction, transformation and loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter: Designer (Source Analyzer, Warehouse Designer, Transformation Developer, Mapping and Mapplet Designer), Repository Manager, Workflow Manager, and Workflow Monitor.
  • Worked extensively in UNIX (AIX) and Linux environment, used Shell Scripts for automating batch transfers, table space management, automated backup, user group maintenance, security and custom report generation.
  • Strong experience in using case tools such as ERwin for database designing. Carried out Business process Modeling and Entity Relationship diagrams as precursors to Workflow Development, re-engineering, functionality enhancements, migration to new technology and value-addition towards existing business solutions.
  • Comprehensive knowledge of Ralph Kimball’s data modeling concepts including Dimensional Data Modeling and Star/Snowflake Schema Modeling.
  • Extensive experience in performing Black Box, Regression, Integration, and User Acceptance testing.
  • Experienced working with Excel Pivot and VBA macros for various business scenarios.
  • Effective both independently and in a team; worked as a Team Leader. Excellent communication and interpersonal skills, with the ability to convey technical information at all levels. Excels in research, analysis, and problem solving.
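Much of the ETL validation experience summarized above reduces to comparing a source data set against its warehouse target with SQL. A minimal sketch, using an in-memory SQLite database in place of Oracle/Teradata and invented table names, of the row-count and duplicate-record checks such testing relies on:

```python
import sqlite3

# In-memory stand-ins for a source staging table and a warehouse target table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_customers (cust_id INTEGER, name TEXT)")
cur.execute("CREATE TABLE tgt_customers (cust_id INTEGER, name TEXT)")
cur.executemany("INSERT INTO src_customers VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob"), (3, "Cat")])
# The target deliberately contains a duplicate to show the check firing.
cur.executemany("INSERT INTO tgt_customers VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob"), (2, "Bob"), (3, "Cat")])

# Row-count reconciliation: source vs. target.
src_count = cur.execute("SELECT COUNT(*) FROM src_customers").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_customers").fetchone()[0]

# Duplicate detection in the target, a common ETL defect.
dups = cur.execute(
    "SELECT cust_id, COUNT(*) FROM tgt_customers "
    "GROUP BY cust_id HAVING COUNT(*) > 1").fetchall()

print(src_count, tgt_count, dups)  # counts differ; cust_id 2 is duplicated
```

The same GROUP BY/HAVING pattern runs unchanged on Oracle, DB2, SQL Server, and Teradata.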

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 8.6.1/8.1/7.1/6.2/5.1, Ab Initio GDE 1.15/1.14/1.13/1.12, Co>Operating System 2.15/2.14/2.12/2.11, EME

Databases: Oracle 10g/9i/8i/7.x, Mainframe via web3270, DB2 UDB, MS SQL Server 2008/2005/2000/7.0, MS Access, Teradata V2R6

Development Languages: SQL*Plus, T-SQL, PL/SQL 2.2/8.x, UNIX Shell Scripting, Benthic Software Golden 5.7/GoldView32/ImpExp32, TOAD 7/7.6, SQL*Loader, VBA

Operating Systems: UNIX (AIX), Linux, MS-DOS, Windows Vista/XP/2000/NT/98/95

Data Mining Tools: Mantas 5.5/4.1.1

Scheduling Tools: Tivoli Workload Scheduler (TWS) for z/OS, CA7

BI Tools: Cognos 8, Crystal Reports, Business Objects XI R3, SSRS, SSAS, OLAP

Data Modeling Tools: ERwin 4.1/4.0

Methodologies: Ralph Kimball’s Data modeling, Star and Snowflake Schema Modeling

Testing Tools: Mercury Quality Center 9.0/8.0, Test Director 7.6/7.0, QuickTest Professional 6.5/5.6, WinRunner 7.5/7.0

CAD and CAM Tools: AutoCAD 2000/14/12, MicroStation 95, GerbTool 7.3

Workflow Tools: PuTTY, WinSCP3, MS-Project, MS-Excel, MS-PowerPoint, MS-Word.

Management Tools: Peregrine ServiceCenter 5.1, HP Project and Portfolio Management Center (ITG) 7.1

PROFESSIONAL EXPERIENCE

Confidential, Dayton, OH

Sr. ETL/BI Tester

Responsibilities:

  • Reviewed the Business Requirement Documents and the Functional Specification.
  • Prepared Test Plan from the Business Requirements and Functional Specification.
  • Developed Test Cases for Deployment Verification, ETL Data Validation, Cube Testing and Report testing.
  • Used HP Quality Center to perform manual testing; logged and tracked defects in Bugzilla until each bug was fixed.
  • Worked with Informatica PowerCenter - Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer, and Transformation Developer.
  • Verified that all data remained synchronized after troubleshooting, and used SQL to verify and validate test cases.
  • Extensive experience in writing SQL and PL/SQL scripts to validate the database systems and for backend database testing.
  • Involved in extensive data validation using SQL queries and back-end testing.
  • Involved in front-to-back testing for all European and Asia Pacific regions.
  • As a QA Tester, performed both functional and back-end testing.
  • Wrote complex SQL, PL/SQL Testing scripts for Backend Testing of the data warehouse application.
  • Created high level use case diagrams, process composition diagrams and data modeling
  • Reviewed Informatica mappings and test cases before delivering to Client.
  • Worked as ETL Tester responsible for the requirements / ETL Analysis, ETL Testing and designing of the flow and the logic for the Data warehouse project.
  • Wrote several UNIX scripts for invoking data reconciliation.
  • Experienced in writing complex SQL queries for extracting data from multiple tables.
  • Testing has been done based on Change Requests and Defect Requests.
  • Preparation of System Test Results after Test case execution.
  • Performed Functional, Regression, Data Integrity, System, Compatibility testing
  • Extensively executed T-SQL queries to view successful data transactions and to validate data in the SQL Server database.
  • Extensively used Informatica power center for extraction, transformation and loading process.
  • Used TOAD to perform manual tests on a regular basis; wrote shell scripts and SQL queries in the project's UNIX and Oracle environment.
  • Wrote SQL queries to validate source data versus data in the data warehouse including identification of duplicate records.
  • Experienced in writing test cases, test scripts, and test plans; executed test cases and reported and documented the results using Mercury Quality Center.
  • Prepared Test status reports for each stage and logged any unresolved issues into Issues log.
  • Used T-SQL for Querying the SQL Server database for data validation.
  • Writing the test scripts for manual testing.
  • Tested the data warehouse ETL process using SSIS (Integration Services).
  • Involved with ETL test data creation for all the ETL mapping rules.
  • Preparing and supporting the QA and UAT test environments.
  • Tested different detail, summary reports and on demand reports.
  • Communicated discrepancies determined in testing to impacted areas and monitored resolution.
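A recurring pattern behind the source-versus-target validation bullets above is a set-difference query: rows present in the source but absent from the target indicate a failed or incomplete load. A hedged sketch with SQLite and invented table names (Oracle spells the operator MINUS rather than EXCEPT):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_claims (claim_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_claims (claim_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_claims VALUES (?, ?)",
                [(100, 25.0), (101, 40.0), (102, 15.5)])
# Simulate a load that dropped claim 102.
cur.executemany("INSERT INTO tgt_claims VALUES (?, ?)",
                [(100, 25.0), (101, 40.0)])

# Rows in the source that never made it to the target.
missing = cur.execute(
    "SELECT claim_id, amount FROM src_claims "
    "EXCEPT SELECT claim_id, amount FROM tgt_claims").fetchall()
print(missing)
```

An empty result proves the load is complete; any rows returned go straight into the defect report.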

Environment: SQL, PL/SQL, UNIX, IBM AIX 5.5, DB2, Teradata V2R6, Sybase 12.5, PuTTY, Shell Scripting, Business Objects XI R3, XML Files, VSAM COBOL Files, IBM Mainframe, Informatica 8.6.1, XML Spy 2010, Oracle 10g, TOAD 10, CA ERwin 4.0

Confidential, Middletown, CT

ETL/BI Test Analyst

Responsibilities:

  • Reviewed the Business Requirement Documents and the Functional Specification.
  • Prepared Test Plan from the Business Requirements and Functional Specification.
  • Carried out data profiling for multiple loan feeds.
  • Performed ETL testing based on ETL mapping document for data movement from source to target.
  • Transformed the EDI Data into the format understandable to Back End Systems
  • Designed Connection Models for Validation and transformation of HIPAA transactions using EDI Source and HIPAA.
  • Developed, executed, and maintained Test Plans, Test Cases, Test Scripts, and Test Data for manual testing approaches using the tracking & defect management tool HP Quality Center (QC).
  • Configured Quick Test Professional with Quality center.
  • Involved in processing claims in FACETS and validating the full-cycle process to ensure checks and 835s were generated.
  • Configured THG (TriZetto's HIPAA Gateway) for HIPAA 834, 837, and 835 transactions.
  • Converted HIPAA EDI 270 and 276 transactions across versions.
  • Involved in preparing complex test files for 270/271 and 276/277 transactions.
  • Extensively used SQL programming in backend and front-end functions, procedures, packages to implement business rules and security
  • Performed backend database testing by writing SQL and PL/SQL scripts to verify data integrity
  • Developed SQL Stored Procedures and Queries for Back end testing.
  • Extensively used Informatica to load data from Flat Files to Teradata, Teradata to Flat Files and Teradata to Teradata.
  • Automated detailed test cases by using Quick Test Pro.
  • Used Quick Test Pro to write the automated test scripts by using various actions and reusable actions.
  • Drafted Test Plan for flow of data from front end to back end legacy applications
  • In a team environment, planned testing strategy for end to end flow through the system to link the front end functionality to the back end
  • Wrote test procedures for back end validation via functional specifications
  • Prepared SQL scripts for test data preparation for Back End functionality testing
  • Performed backup project management duties.
  • Wrote several complex SQL queries to validate the data transformation rules for ETL testing.
  • Wrote extensive UNIX shell scripts for data-parsing and text-parsing needs, including archiving old data, running back-end jobs, and setting up job dependencies.
  • Performed extensive data validations against Data Warehouse
  • Loaded flat-file data into Teradata tables using UNIX shell scripts.
  • Responsible for verifying business requirements, ETL Analysis, ETL test and design of the flow and the logic for the Data warehouse using Informatica and Shell Scripting
  • Tested several Informatica Mappings to validate the business conditions.
  • Conditional testing of constraints based on the business rules
  • Designed and executed the test cases on the application as per company standards and tracked the defects using HP Quality Center 9.2
  • Designed and prepared scripts to monitor uptime/downtime of different system components
  • Written Test Cases for ETL to compare Source and Target database systems.
  • Monitored the data movement process through Data Extract to flat files through Informatica execution flows and Loading data to Data mart through NZLOAD utility.
  • Testing the ETL data movement from Oracle Data mart to Teradata on an Incremental and full load basis.
  • Developed the ETL process to automate the testing process and also to load the test data into the testing tables.
  • Used Excel Pivoting for various running totals, total sales for trades, highest performance of trades
  • Written several VBA macros.
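The Excel pivot work above (running totals, total sales per trade) boils down to group-by aggregation plus a cumulative sum. An illustrative Python sketch of the same logic; the trade records and field names are invented for the example:

```python
from collections import defaultdict
from itertools import accumulate

# Hypothetical trade records: (trade_date, symbol, sale_amount).
trades = [
    ("2010-01-04", "IBM", 120.0),
    ("2010-01-04", "GE",   80.0),
    ("2010-01-05", "IBM",  60.0),
    ("2010-01-06", "GE",   40.0),
]

# Pivot-style total sales per symbol.
totals = defaultdict(float)
for _, symbol, amount in trades:
    totals[symbol] += amount

# Running total of sales in date order, as an Excel pivot's cumulative column.
running = list(accumulate(amount for _, _, amount in trades))

print(dict(totals), running)
```

In the VBA version, the same aggregation is a Scripting.Dictionary keyed by symbol; the structure of the computation is identical.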

Environment: VBA, SQL, PL/SQL, Excel Pivot, Informatica PowerCenter 8.6.1, Oracle 10g/9i, Mainframe via Web3270, VSAM Files, Copy Books, Business Objects 4.1/5.1.5/6, DB2 UDB, WinSQL, UNIX (AIX), Linux, PuTTY, WinSCP3, Mercury Quality Center 9.0/8.0, AutoSys, Teradata, TOAD, XML

Confidential, Philadelphia PA

Data/SQL/ETL Tester - HIPAA

Responsibilities:

  • Responsible for analyzing various data sources such as flat files, ASCII Data, EBCDIC Data, Relational Data (Oracle, DB2 UDB, MS SQL Server) from various heterogeneous data sources.
  • Delivered files in various formats (e.g., Excel, tab-delimited text, comma-separated text, pipe-delimited text).
  • Wrote test scripts for the Medical and Dental Modules and Claims/Adjudication of Claims processing
  • Worked with Quality center to document test results and validate test scripts.
  • Used Quality Center in routing defects, defect resolution and assigning business cases to business users.
  • Developed inline view queries and complex SQL queries and improved the query performance for the same.
  • Expertise in gap analysis of 4010/5010 of all transactions such as 837, 835, 999.
  • Expertise in EDI transactions used in healthcare industry and good knowledge of HIPAA X12.
  • Experience as an EDI analyst performing verification and validation of the EDI transactions.
  • Tested HIPAA EDI Transactions & Code Sets Standards such as 276, 277, 834, 835, 837, and 997.
  • Processed inbound/outbound file testing with an EDI connection tool for different file formats such as EDIFACT and ANSI X12.
  • Involved in automation of test cases using QTP.
  • Did functional testing using QTP
  • Did extensive work with ETL testing including Data Completeness, Data Transformation & Data Quality for various data feeds coming from source.
  • Executed campaign based on customer requirements
  • Followed company code standardization rule
  • Developed several VBA macros.
  • Identified issues, information, and behaviors during the adoption of a proprietary information management system.
  • Accelerated the rate of adoption of the system, improved the quality of the data being input and generated, and promoted accountability among staff and users.
  • Developed code to introduce additional reports, reverse-engineered data models to their business meaning, and instructed users on the account management and implementation process advantages derived from the system.
  • Identified and documented deficiencies in the proprietary information management system during initial implementation.
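The EDI X12 testing described above centers on segment and element structure: an ANSI X12 interchange is a string of segments terminated by `~`, with `*` separating the elements inside each segment. A minimal, illustrative parser; the sample 837-style content is fabricated and is not a valid claim:

```python
# Minimal ANSI X12 parsing: segments end with "~", elements split on "*".
# The sample content below is invented, not a valid 837 transaction.
raw = "ISA*00*          *00*ZZ*SENDER~ST*837*0001~CLM*ABC123*500~SE*3*0001~"

segments = [seg.split("*") for seg in raw.strip("~").split("~")]
segment_ids = [seg[0] for seg in segments]

# A simple structural check of the kind used in format validation:
# every transaction set opened by ST must be closed by SE.
is_balanced = segment_ids.count("ST") == segment_ids.count("SE")

print(segment_ids, is_balanced)
```

Real validation (4010/5010 gap analysis, 999 acknowledgments) layers code-set and loop rules on top of exactly this segment/element structure.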

Environment: VBA, Excel, SSIS, Ab Initio Co>Operating System V2.12, GDE V1.13, ERwin 4.1, Oracle 9i, DB2 UDB, SQL Server 2000, SQL, PL/SQL, PuTTY, WinSCP3, TOAD, SQL Developer, UNIX (AIX), Shell Scripts, Mercury Quality Center 8.0, Microsoft Office 2003, Windows XP/2000, Teradata V2R6, MLOAD, FLOAD, Teradata SQL Assistant, BTEQ

Confidential, Manhattan, New York

ETL/SQL Analyst

Responsibilities:

  • Designed the data flow diagrams using VISIO.
  • Worked with Data Profiling for various needs of business.
  • Scrubbed data to accurately generate customer pulls; provided output files in various formats based on customer requests.
  • Developed automated scripts for functional testing and Data driven testing of the application using QTP.
  • Extensively worked with QuickTest Pro to design and develop various manual and goal-oriented scenarios for the application.
  • Worked on calling shell scripts from post-session and pre-session events. Extensively involved in implementing performance tuning techniques for the queries.
  • Tested different reports using Cognos ReportNet and Crystal Reports
  • Involved in the Maintenance and Production Support
  • Extensively used various types of transformations such as Expression, Joiner, Update strategy, Look up, filter for developing mappings.
  • Optimized QTP scripts for Regression testing of the application with various data sources and data types.
  • Executed regression tests at each new build in QTP.
  • Development of mappings as per the technical specifications approved by the client.
  • Created sessions and workflows to run with the logic embedded in the mappings using Power center Designer.
  • Used workflow manager for session management, database connection management and scheduling of jobs.
  • Reviewed specifications for feasibility of customer list pull criteria and commit date
  • Executed Campaign based on customer request and reviewed execution log to keep track of the number of records going into and coming out of every job step
  • Prepared the Test Plan and Testing Strategies for Data Warehousing Application.
  • Delivered files in various formats (e.g., Excel, tab-delimited text, comma-separated text, pipe-delimited text).
  • Tested Database Triggers, Stored Procedure, Functions and Packages.
  • Optimized queries using rule based & cost based approach.
  • Executed campaign based on customer requirements
  • Followed company code standardization rule
  • Preparation of technical specifications and Source to Target mappings.
  • Extensively used SQL programming in backend and front-end functions, procedures, packages to implement business rules and security
  • Tuned database and SQL statements and schemas for optimal performance.
  • Expertise in SQL queries for the cross verification of data.
  • Designed documentation for the developed graphs.
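Delivering the same customer pull as tab-, comma-, or pipe-delimited text, as described in the bullets above, is a one-parameter change with Python's csv module. An illustrative sketch; the record layout and field names are made up:

```python
import csv
import io

# Hypothetical customer-pull records, header row first.
rows = [["cust_id", "name", "state"],
        ["1001", "Smith", "PA"],
        ["1002", "Jones", "NJ"]]

def render(rows, delimiter):
    """Serialize rows to delimited text using the requested delimiter."""
    buf = io.StringIO()
    csv.writer(buf, delimiter=delimiter, lineterminator="\n").writerows(rows)
    return buf.getvalue()

comma = render(rows, ",")   # comma-separated text
tab = render(rows, "\t")    # tab-delimited text
pipe = render(rows, "|")    # pipe-delimited text
print(pipe)
```

In production the delimiter would come from the customer's file-format request rather than being hard-coded.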

Environment: Ab Initio Co>Operating System v2.11, GDE v1.12, ERwin 4.1, TOAD 7.6, Oracle 9i, DB2 UDB, Cognos 7.0 series, SQL, PL/SQL, UNIX (AIX), Shell Scripts, Windows 2000/NT.

Confidential, Manhattan, New York

ETL Analyst/Tester

Responsibilities:

  • Responsible for building the management process, making sure controls are being followed, assisting in the requirements design, and coordinating the efforts of performers in the Enterprise Information Management Department.
  • Performed unit and integration testing at the Informatica and database levels.
  • Created technical specifications, mapping documents and managed test cases.
  • Participated in development of an estimation tracking tool for level of effort projections.
  • Gathered information, compiled findings and prepared management reports for staffing levels.
  • Developed database applications for managing IT staffing requirements and monitoring the status of outstanding requisitions.
  • Wrote several complex SQL queries to validate data conditions per the mapping document.
  • Provided data analysis, identified trends, summarized results and generated reports for Card Solutions Delivery reorganization effort.
  • Extracted data from different sources like, Oracle, Flat files, Xml & loaded into Operational Data Source and tested the same.
  • Extensively used SQL and PL/SQL Procedures and Worked in both UNIX and Windows environment.
  • Worked on loading of data from several flat files to XML Targets.
  • Created UNIX shell scripts for Informatica ETL tool to automate sessions.
  • Developed graphic representation of various metrics used in the forecasting, budgeting and procurement processes for the Merchandising Department.
  • Utilized Access database to collect data from multiple database sources using ODBC methods.
  • Monitored sessions using the workflow monitor, which were scheduled, running, completed or failed. Debugged mappings for failed sessions.
  • Extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings and sessions.
  • Responsible for Unit Testing of the Mappings created according to the business requirements.
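The flat-file-to-XML-target loading mentioned above can be sketched with the standard library alone. A minimal illustration, assuming a comma-delimited source; the element names and sample records are invented:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Stand-in flat-file content; in the project this came from actual source files.
flat_file = io.StringIO("emp_id,name\n101,Adams\n102,Baker\n")

# Build the XML target: one <employee> element per flat-file record.
root = ET.Element("employees")
for rec in csv.DictReader(flat_file):
    emp = ET.SubElement(root, "employee", id=rec["emp_id"])
    ET.SubElement(emp, "name").text = rec["name"]

xml_out = ET.tostring(root, encoding="unicode")
print(xml_out)
```

Unit testing such a mapping then reduces to counting elements and spot-checking field values against the source file, mirroring the record-count reconciliation done on the database side.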

Environment: Oracle 7.3, SQL, PL/SQL, SQL*Plus, MDL, AutoCAD 14, GerbTool, MicroStation 95, Mainframe, Smallworld, Windows 2000/NT/98/95, Microsoft Office 97, Informatica 6.1, UNIX, Perl, Shell Scripting, XML
