ETL / QA Analyst Resume
Hartford, CT
PROFESSIONAL SUMMARY:
- Around 6 years of experience in manual and automation testing of Data Warehouse applications using Oracle, UNIX, Mainframe, DataStage, Informatica, Ab Initio and BI Reporting.
- Involved in various manual testing types such as System Testing (ST), System Integration Testing (SIT), User Acceptance Testing (UAT), Progression, Regression, Load, Performance and End-to-End Testing.
- Well-versed with all stages of Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
- Extensive knowledge in Banking Domain and experience in ETL process consisting of data transformation, sourcing, mapping, conversion and loading.
- Prepared Test Strategy, Test Plan, detailed Test Cases for functional and non-functional requirements, and Test Scripts and Test Scenarios by decomposing business requirements to support quality.
- Experience in maintaining Traceability Matrix to ensure comprehensive test coverage of requirements.
- Expertise in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying Data Mismatch.
- Proficient in Back-End/Database Testing by developing and executing SQL queries, Mainframe JCLs and UNIX scripts.
- Expertise in using HP Quality Center tool for Defect Management, Defect Tracking and Defect Status Reporting.
- Ability to analyze the root cause of a defect and present it clearly when handing the problem over to development.
- Experience in conducting Test Case Reviews with business and project teams and preparing Daily Status Reports and Metrics.
- Clear understanding of business procedures and ability to work as an individual or as a part of a team.
- Good experience in leading a team and working with onsite-offshore team model projects.
- Experience in bringing new team members up to speed with project scope, process, business application flow and testing deliverables.
- A very good team player with excellent Communication (Verbal and Technical), Presentation and Reporting skills.
- Strong working knowledge of major operating systems; tested applications in Windows 98/NT/2000/XP and UNIX environments.
TECHNICAL SKILLS:
Operating Systems: Windows 98/2000/NT/XP, UNIX, LINUX
Programming Languages: C, C++, Java, SQL, PL/SQL, JCL, Cobol, CICS, Rexx
Scripting Languages: XML, HTML, XHTML, Shell Scripting, Perl
Application Development: Visual Basic 6.0, Oracle 9i/10g/11g
Packages: MS-Office, Visual Studio 2005, Informatica, Ab Initio
ETL/BI Tools: Informatica, Ab Initio, DataStage, Autosys, Toad
Testing Tools: HP Quality Center 9.0/11.0 (ALM), QTP 9.0
Defect Tracking: HP Quality Center 9.0/11.0 ALM, Rational Clearquest
Databases: Oracle 9i/10g/11g, SQL Server 2005/2008, MySQL, MS Access
Database Tools: TOAD 4.5, SQL Navigator 4.0, SQL Server 2005, SQL Developer
Scheduler Tools: BMC Control-M, UNIX Multi Scheduler, Autosys
PROFESSIONAL EXPERIENCE:
Confidential, Hartford, CT
ETL / QA Analyst
Responsibilities:
- Worked closely with business in understanding and documenting the Business Requirement Document (BRD) for accuracy and completeness.
- Defined the test strategy and developed the full functional testing plan and a single Master Test Plan (MTP) for the front-end (IAS) and back-end (GDW) teams.
- Created detailed test cases for each test phase to ensure complete coverage; test cases incorporated both positive and negative test conditions.
- Identified duplicate records in the staging area before data file/Excel sheet sources were processed, to make sure the correct data was captured for loading into target tables.
- After job runs, validated the load process to make sure the target tables were populated according to the data mapping and satisfied the transformation rules.
- Raised PDU (Partition Data Usage), by listing out source tables, running self estimate scripts and by creating driver tables based on the business logic.
- Prepared test data by modifying the sample data in the source systems and setting up environment to cover all the requirements and scenarios.
- Supported the reverse flow in Phase 2B, which required a strong understanding of the data and of data relationships at the table and attribute levels.
- Consolidated ‘Regression Suite’ for Mainframe-Unix-Database test cases to reuse ETL transformation test cases which represent daily, weekly and monthly loading of Data.
- Developed back-end validation test script which will check the database automatically for data integrity in accordance with business standards of the company.
- Performed environment cleanup, backup of tables and files for reuse, zipping, archiving and documentation to help Phase 2B and other teams.
- Tracked and reported the issues/defects to the project team and management during Phase 2A and Phase 2B test cycles.
- Collected daily tasks accomplished from front end team, generated reports and sent DSR to the internal IAS-GDW project streams and to the client.
- Completed testing within the deadline despite numerous application issues, and uploaded a combined Test Summary Report (TSR) covering the GDW and IAS teams.
- Documentation and test cases for the GDW forward and reverse flows were recognized; Phase 2A was a single-resource GDW project that received 7/7 feedback from the client.
Environment: Ab-Initio, UNIX AIX 5.2, Autosys Scheduler, HP Quality Center 9.0/11.0 (ALM), Oracle 9i, SQL Server 2008
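The staging-area duplicate check described above can be sketched with a simple GROUP BY/HAVING query. This is a minimal illustration using SQLite in Python; the table and column names (stg_account, acct_id, branch) are hypothetical placeholders, not the project's actual schema.

```python
import sqlite3

# Hypothetical staging table; names are illustrative, not from the project.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE stg_account (acct_id TEXT, branch TEXT)")
con.executemany("INSERT INTO stg_account VALUES (?, ?)",
                [("A1", "HFD"), ("A2", "HFD"), ("A1", "HFD")])

# Flag natural-key values that occur more than once before the load runs.
dupes = con.execute("""
    SELECT acct_id, COUNT(*) AS cnt
    FROM stg_account
    GROUP BY acct_id
    HAVING COUNT(*) > 1
""").fetchall()
print(dupes)  # [('A1', 2)]
```

In practice the same query would run against the real staging table with the project's natural key in the GROUP BY clause.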
Confidential, Atlanta, GA
ETL / BI Test Analyst
Responsibilities:
- Provided inputs into the project plan for timelines and resources required. Prepared test plans/test schedules in sync with the Test Manager, the IT development team and the front-end IDM interface team.
- Participated in walkthroughs, was involved in requirements gathering and followed up with the Business Analysts to work towards the resolution of specification issues.
- Worked with development team to ensure feasibility of design and to translate business requirements into Informatica technical specifications.
- Tested a number of complex Informatica mappings, mapplets and reusable transformations for daily data loads from source flat files to target RDBMS tables.
- Prepared KT plans, trained new members, assigned them tasks, and conducted KT sessions for the front-end team to show exactly what happens in the back end.
- Formulated methods to perform positive and negative testing against requirements and set up peer reviews for each module to minimize development errors.
- Verified scenarios based on in-depth banking knowledge covering accounts, customers, cards, transactions and BI reporting.
- Intensively analyzed Mainframe and Unix jobs based on flow diagrams provided. A job scheduler list for Control-M was created for all flows from various sources.
- Since this project extracted data from almost all sources, over 1000 GDW jobs were used; created a Reusable Test Case Suite that other GDW projects can use in the future.
- Found a bug in existing production behavior through in-depth analysis of the card libs of jobs; raised potential defects and received appreciation from managers and the development team.
- Created an AML Solution Document specifically for the AML GDW flow, since this flow was run after 10 years without any onsite help, and resolved all the unusual errors.
- Checked out Mainframe elements from Endevor, transferred files from GSYS (production) to TSYS (testing), promoted packages from DEV to TEST to PROD using Harvest, and used WinSCP and FTP to transfer files between servers.
- Validated Mainframe job runs, took screenshots of the logs, verified Mainframe input-output files, did JCL conversion of jobs, browsed copybook layouts, checked pseudo codes and used FileAid.
- Reviewed manual testing methods and implemented an automated test strategy by generating automation scripts. Also created partition list, transfer list, copy-backup table list to run all flows in a single stretch.
- Scheduled automated jobs to run as a batch process and in parallel maintained a Batch Calendar recording batch run dates, processed-file details and batch completion information.
- Coordinated with different users in the UAT process; participated in User Acceptance Testing (UAT), executed UA test cases and signed off on the application.
Environment: Mainframe, UNIX AIX 5.2, BMC Control-M software, Informatica, HP Quality Center 9.0/11.0 (ALM), Oracle 9i, SQL, Datastage 7.5, IBM DB2, Windows XP.
Confidential, GA
ETL / BI Test Analyst
Responsibilities:
- Interacted with senior peers and subject matter experts to learn more about the data before combining projects.
- This was a combined project in a single environment. Led the team, conducted KT sessions for new team members and contributed to developing knowledge transfer documents.
- Put together the test cases for Day1-Day2 scenarios of both projects and did Unit Testing for all modules and packages.
- Promoted and Demoted packages containing new or updated scripts and codes using Harvest. Verified the modifications in card libs by comparing them with older version.
- Deployed DDL packages in UNIX to make changes in the database and rechecked them to make sure the indexes, compression and partitioning were as expected.
- Maintained data integrity and security using integrity constraints and database triggers, working in TOAD.
- Wrote complex SQL queries using Case Logic, Intersect, Minus, Subqueries, Inline Views, and Union to validate the data populated in tables.
- Project included two databases NAB1 and GDWG2PROD. Created DB-link for data migration since the jobs were getting decommissioned.
- Planned and Prepared testing artifacts such as Master Test Plan (MTP), Detailed Test Plan (DTP) and Test Summary Report (TSR) for this project.
- Provided weekly status report to the Project Manager and discussed issues related to quality and deadlines.
- Supported various teams even after testing while implementation such as data maintenance team and production support.
- Came up with an 'Implementation Plan to Close-off Arrangements'; received appreciation for finding an approach to close the accounts using blank files in UNIX.
- The combined test execution approach for the MLC MySuper and TERP Migration projects, together with upstream quality and improved test efficiency, led to greater customer satisfaction and cost savings.
Environment: Ab-Initio, UNIX AIX 5.2, Autosys Scheduler, HP Quality Center 9.0/11.0 (ALM), Oracle 9i, SQL Server 2008
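The MINUS-style source-to-target validation mentioned above can be sketched as follows. SQLite's EXCEPT plays the role of Oracle's MINUS here, and the src/tgt tables are hypothetical stand-ins for real source and target tables; a clean load requires both directions of the comparison to come back empty.

```python
import sqlite3

# Hypothetical source and target tables for illustration only.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE src (id INTEGER, amt REAL)")
con.execute("CREATE TABLE tgt (id INTEGER, amt REAL)")
con.executemany("INSERT INTO src VALUES (?, ?)", [(1, 10.0), (2, 20.0), (3, 30.0)])
con.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, 10.0), (2, 25.0)])

# Rows present in source but not in target (Oracle MINUS ~= SQLite EXCEPT),
# and vice versa; both result sets must be empty for a clean load.
missing = con.execute(
    "SELECT id, amt FROM src EXCEPT SELECT id, amt FROM tgt ORDER BY id").fetchall()
extra = con.execute(
    "SELECT id, amt FROM tgt EXCEPT SELECT id, amt FROM src ORDER BY id").fetchall()
print(missing)  # [(2, 20.0), (3, 30.0)]  -> dropped or transformed rows
print(extra)    # [(2, 25.0)]             -> rows the transformation changed or invented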
Confidential
DW / ETL Test Analyst
Responsibilities:
- Created highly structured test cases in HP ALM for UNIX jobs of Eclipse, CAPSIL and MNB data capture systems.
- Set up the test environment for staging area, loading the staging area with data from multiple sources.
- Executed numerous UNIX jobs from all three flows; faced and resolved many job failures, UNIX errors and database errors.
- Noted down all errors encountered during UNIX job runs, Autosys scheduling and database loading for future reuse.
- Involved in troubleshooting, resolving, escalating data related issues and validating them to improve data quality.
- Ensured error logs and audit tables are generated and populated properly after respective UNIX job run.
- Wrote shell scripts using UNIX Korn shell for file transfers, error log creations and log file cleanup process.
- Extensively used Autosys for all three workflows to schedule jobs, run scheduled jobs at specific times and troubleshoot errors.
- Conducted sessions on Autosys, Control-M and Multi Scheduler, project-wise and portfolio-wise, in the testing Tools forum and Knowledge Hub.
- Attended client calls, updated testing status, conducted and actively participated in reviews, walkthroughs of test cases.
- Effectively communicated testing activities and findings in oral and written formats such as Daily Status Reports (DSR) with details of executed, passed, failed test cases and defect status.
- Received appreciation from clients for fast completion of all three data capture system regression tests while maintaining quality in limited time.
Environment: Mainframe, JCL, UNIX AIX 5.2, Oracle 9i, Autosys Scheduler, HP Quality Center 9.0/11.0 (ALM), SQL, PL/SQL
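The log-file cleanup handled by the Korn shell scripts above can be illustrated in Python. The seven-day retention policy and the file names are assumptions for the sketch, not the project's actual scripts.

```python
import os
import pathlib
import tempfile
import time

# Hypothetical cleanup policy: delete *.log files older than max_age_days,
# mirroring the kind of Korn-shell log cleanup described above.
def cleanup_logs(log_dir, max_age_days=7):
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for p in pathlib.Path(log_dir).glob("*.log"):
        if p.stat().st_mtime < cutoff:  # older than the retention window
            p.unlink()
            removed.append(p.name)
    return sorted(removed)

# Demo with two files: one fresh, one backdated 30 days.
d = tempfile.mkdtemp()
old = pathlib.Path(d, "old_job.log"); old.write_text("x")
new = pathlib.Path(d, "new_job.log"); new.write_text("y")
os.utime(old, (time.time() - 30 * 86400,) * 2)  # backdate mtime
removed = cleanup_logs(d)
print(removed)  # ['old_job.log']
```

A production script would typically archive (zip) before deleting, as the cleanup-and-archiving bullets above describe.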
Confidential
DW Test Engineer
Responsibilities:
- Reviewed the business requirement and design documents to understand the process and raised documentation defects when found.
- Performed extensive data mapping according to business logic for 174 new fields derived from existing fields.
- Prepared test cases to compare the source and target tables of the AS-IS and TO-BE flows; Phase 1 was treated as AS-IS and the new fields were populated in TO-BE.
- Created SQL queries to perform source to target testing on DB2 database to check data completeness, validity, uniqueness, data integrity, data transformation, data quality, initial and incremental load tests.
- Scheduled batches of jobs using UNIX Multi Scheduler and in parallel monitored job logs and error logs.
- Performed performance testing to validate the additional space and time taken by the job populating the TMP RRA ACCOUNT table, given its record size and volume.
- Created generalized queries, temporary tables and views so that the whole team could use them for data validation, sanity checks and AS-IS/TO-BE comparison.
- Raised defects for many transformation logic errors and data type mismatches found through source-target comparison.
- Updated the Job Matrix sheet with the start and end times of each shell script to measure run-time changes as the record count increased.
- Manually validated extremely large mainframe files based on sampling. To automate this process, a tool (MFCR) was created to compare millions of records within 1 hour.
- Extended the tool to verify mainframe files with different formats/copybooks/layouts, different lengths, different record counts and blank/missing/extra fields.
- Documented and demoed the tool and submitted it to Kshop once the tool was approved in Prime (Confidential Tools Repository).
- All ST, SIT and performance testing were performed by following standard test methodologies and got an ELF rating of 7/7.
Environment: Mainframe GSYS, TSYS, JCL, UNIX AIX 5.2, Oracle 9i, Autosys Scheduler, HP Quality Center 9.0/11.0 (ALM), SQL
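A fixed-width file comparison of the kind MFCR performed can be sketched as below. The copybook layout (field name, start offset, length) is hypothetical; the real tool handled millions of records and a range of layouts, lengths and missing-field cases.

```python
# Hypothetical copybook layout: (field name, start offset, length).
LAYOUT = [("acct", 0, 5), ("bal", 5, 8), ("status", 13, 1)]

def parse(line, layout=LAYOUT):
    """Slice one fixed-width record into named, stripped fields."""
    return {name: line[start:start + length].strip()
            for name, start, length in layout}

def diff_records(file_a_lines, file_b_lines):
    """Yield (line_no, field, a_value, b_value) for every mismatching field."""
    for i, (a, b) in enumerate(zip(file_a_lines, file_b_lines), start=1):
        ra, rb = parse(a), parse(b)
        for field in ra:
            if ra[field] != rb[field]:
                yield (i, field, ra[field], rb[field])

# Two tiny sample "files"; record 2 differs in the balance field.
a = ["00001  100.00A", "00002  250.00A"]
b = ["00001  100.00A", "00002  275.00A"]
diffs = list(diff_records(a, b))
print(diffs)  # [(2, 'bal', '250.00', '275.00')]
```

For genuinely huge files, streaming both inputs line by line (as the generator above already allows) keeps memory flat regardless of record count.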
Confidential
DW Test Engineer
Responsibilities:
- Created test cases after thorough analysis of the BRD document and uploaded them to test the new fields in Quality Center 9.0 and 11.0 (ALM).
- Peer reviewed the test cases designed and mapped test cases to the business requirements.
- Pulled all test cases for the ST and SIT cycles into the Test Lab and passed them in HP ALM as testing progressed.
- Test cases mainly targeted data completeness, data transformations, data quality, performance and scalability.
- Created a Job Analysis sheet after detailed analysis of flow diagrams to determine the right order of UNIX and Mainframe jobs, the input-output files and tables they use, and the shell scripts used by the jobs.
- Involved in complex database testing on a risk based approach, with multiple regression cycles, huge number of batch job runs and UNIX file validations.
- Monitored job state changes, end-start time differences and events raised by scheduler for performance checking.
- Maintained an execution order for each cycle to note down the job order, the time taken by each job, the number of records populated/generated, and the errors faced by a job along with their solutions.
- Prepared complex SQL queries joining a number of tables to verify records loaded into the databases before and after job runs, based on business requirements.
- Raised Defects after intensive data validation and verification. Analyzed defects and worked with development team to resolve them.
- Extracted large reports from the database and checked that they were generated without any truncation or missing records.
Environment: Oracle 10g, Oracle 11g, UNIX, SQL, HP Quality Center (ALM), Windows
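The join-based load verification described above can be illustrated with an orphan-record check: a LEFT JOIN whose unmatched side comes back NULL flags loaded rows with no matching parent row. The customer/account tables here are hypothetical stand-ins, shown with SQLite in Python.

```python
import sqlite3

# Hypothetical parent/child tables for a post-load integrity check.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customer (cust_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE account  (acct_id INTEGER, cust_id INTEGER);
    INSERT INTO customer VALUES (1, 'Ann'), (2, 'Bob');
    INSERT INTO account  VALUES (10, 1), (11, 2), (12, 3);
""")

# LEFT JOIN keeps every account; a NULL customer side marks an orphan
# that the load produced without a valid parent.
orphans = con.execute("""
    SELECT a.acct_id
    FROM account a
    LEFT JOIN customer c ON c.cust_id = a.cust_id
    WHERE c.cust_id IS NULL
""").fetchall()
print(orphans)  # [(12,)] -> account 12 references a missing customer
```

The same pattern, run before and after a job, isolates integrity problems introduced by that specific load.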