
Sr. Data Quality Analyst (DWH) Resume

NYC, NY

SUMMARY

  • 7+ years of IT experience in Quality Assurance for ETL, BI, Web-based and Client/Server applications using Manual and Automated testing tools.
  • Wrote extensive SQL and PL/SQL scripts to test the ETL flow, Data Reconciliation, Initialization, Change Data Capture, Delta Processing, and Incremental processes (a representative reconciliation query appears after this list).
  • Test Design: Extensive experience in drafting Test Flows, Test Plans, Test Strategies, Test Scenarios, Test Scripts, Test Specifications, Test Summaries, Test Procedures, Test Cases & Test Status Reports.
  • Test Methodologies: Strong knowledge of test methodologies: Object-Oriented test methodology, Service-Oriented Architecture, Top-to-Bottom and Bottom-to-Top test methodologies, and QA Validations & QA Compliance to ensure Quality Assurance control.
  • Testing Types: Experience in User Acceptance Testing, Performance Testing, GUI Testing, System & Functional Testing, Integration Testing, Regression Testing, and Data-Driven & Keyword-Driven Testing
  • Test Management tools: Used Test Director, Quality Center and Business Process Testing for management of the test plan, test design, test execution and defect logging phases. Used Rational Quality Manager for management of the STLC, including requirements gathering, risk analysis, project planning, scheduling, testing, defect tracking, management and reporting.
  • Rational Suite: Experience with the IBM Rational Suite - maintaining test requirements and test flows using Requisite Pro, automating tests using Rational Robot and Rational Functional Tester, test management using Rational Test Manager, defect tracking using Rational Clear Quest, and version control using Rational Clear Case.
  • Defect Tracking Tools: Efficiently performed Defect Tracking using various tools like Quality Center, Test Director, Rational Clear Quest, Remedy Tool, PVCS Tracker, Bugzilla.
  • Database Testing: Proficient in writing SQL queries to perform data-driven tests; involved in front-end and back-end testing. Strong knowledge of RDBMS concepts. Developed SQL queries against Oracle databases to conduct DB testing. Good knowledge of Oracle Data Integrator. Worked with data files and data file label parameters; strong in writing UNIX Korn shell scripts.
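
A minimal sketch of the kind of SQL reconciliation referred to above, written for Oracle; the table and column names (src_customer, dw_customer, customer_id) are illustrative placeholders, not actual project objects.

    -- Row-count reconciliation between source and target
    SELECT (SELECT COUNT(*) FROM src_customer) AS src_cnt,
           (SELECT COUNT(*) FROM dw_customer)  AS tgt_cnt
    FROM dual;

    -- Records present in the source but missing or altered in the target
    SELECT customer_id, customer_name FROM src_customer
    MINUS
    SELECT customer_id, customer_name FROM dw_customer;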

TECHNICAL SKILLS

Test Analysis: Functional Requirement Gathering & Analysis, Analyzing Business Rules, Test Flows, Test Planning, Causal Analysis of Defects, System Change Request, Impact Analysis

Test Management: MS Project Test Scheduling, Test Director, Quality Center, Rational Quality Manager, IBM DOORS Traceability Matrix for Requirements and Defects

Testing Methodologies: Object-Oriented, Keyword-Driven and Data-Driven testing, Top-to-Bottom and Bottom-to-Top test methodology, Requirement-based, Risk-based and Priority-based test methodology

Mercury Interactive: QuickTest Pro 9.5/9.2/8.0, Business Process Testing, WinRunner 9.2/9.0/8.0/7.5, LoadRunner 9.2/9.0/8.0, TestDirector 9.0/8.0/7.5, Quality Center 9.2/8.2

ETL Tools: Informatica 7.1/8.x, DataStage 7.5/8.1, Ab Initio (GDE 1.14, Co>Op 2.14), SSIS

IBM Rational Suite: Rational Robot 2003/7.0.2/6.0.0, Rational Functional Tester 8.0/7.0, Rational Performance Tester 8.0/7.0, Rational Clear Quest 7.1/7.0.1, Rational Clear Case 7.1/7.0.1/7.0.0

Silk Suite: SilkTest 2006/8.5/8.0/7.6/7.5

Bug Reporting: Rational Clear Quest, Test Director, PVCS tracker, Remedy Tool, MS Access, Bugzilla, Siebel Service Request

Version Control: PVCS Tracker, Rational Clear Case 7.1/7.0.1/7.0.0, Visual SourceSafe 2005, Concurrent Versions System (CVS)

Scripting languages: TSL, SQABasic, JavaScript, VBScript, HTML, XML, XSLT, PERL, Shell Scripting

Programming Languages: C, C++, Java, Visual Basic 5.0, AS/400 CL

Operating Systems: Windows XP/NT/2000, DOS, Mac OS X 10.4, AIX UNIX, Linux

Databases: Oracle 10g/9i/8i/7.3, MS SQL Server 6.5/7.0/7.5/2005, Sybase 12.0/11.0, MS Access

Web Development: HTML, XML, VBScript, JavaScript

MS Suite: MS Office 2007 (Word, Excel, PowerPoint, Outlook), MS Visio, MS Project

PROFESSIONAL EXPERIENCE

Confidential, NYC, NY

Sr. Data Quality Analyst (DWH)

Responsibilities:

  • Analyzed and evaluated the existing as-is system, incorporated Business Process Improvements (BPI) into the enhanced future (to-be) system, and developed the test requirements.
  • Wrote complex SQL queries to validate EDW data against EDM source data, including identification of duplicate records and data quality checks based on the Mapping/Transformation rules (see the first validation sketch after this list).
  • Used Quality Center to create manual (IEEE format) and automated tests, build test cycles, run tests and report and track defects. Generated reports and graphs to review the progress of test planning, tests run and defects tracked.
  • Extensively used ETL methodology for testing and supporting data extraction, transformation and loading processes in a corporate-wide ETL solution using Informatica.
  • Worked on Informatica PowerCenter tools - Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer, and Transformation Developer.
  • Promoted Unix/Informatica application releases from development to QA and to UAT environments as required.
  • Worked with Data Conversion projects for testing data quality after migration
  • Experience in Data Validation, Data Modeling, Data Flows, Data Profiling, Data Mining, Data Quality, Data Integration, Data Verification, and Data loading
  • Extensively involved in back-end testing of data quality by writing complex SQL queries
  • Created SAS datasets by extracting data from various sources, using complex DATA step logic.
  • Tested the SAS jobs in batch mode through UNIX shell scripts
  • Tested remote SAS sessions running the jobs in parallel mode to reduce extraction time, as the datasets were generated simultaneously
  • Experienced in handling QA teams on different projects
  • Supervised the QA team in accomplishing their tasks as per the deadlines.
  • Ensured data integrity and verified all data modifications and calculations during database migration using ETL tools; developed test cases to validate the ETL data migration
  • Identified the primary key (logical/physical) and applied update or insert logic accordingly
  • Deleted the target data before processing, based on the logical or physical primary key
  • Designed and executed test cases on the application as per company standards
  • Prevented multiple runs of the same load by flagging processed dates (see the run-control sketch after this list)
  • Wrote test cases for ETL to compare the source and target database systems.
  • Tested records with logical deletes implemented using flags
  • Interacted with senior peers and subject matter experts to learn more about the data
  • Identified duplicate records in the staging area before the data was processed
  • Wrote extensive test scripts for back-end validations
  • Ensured that the mappings were correct
  • Conducted data validation testing
  • Used reusable actions, utilizing the flexible functionality of QuickTest Professional
  • Experience in using automation tools - QuickTest Professional, WinRunner and Quality Center - in client/server environments
  • Experience in Mercury Quality Center - designing test steps, mapping requirements to tests, executing tests manually, defect logging and defect reporting
  • Configured QuickTest Professional with Quality Center.
  • Performed UAT (User Acceptance Testing) to verify the requirements and the look and feel of the applications.
  • Performed backend database testing by writing SQL and PL/SQL scripts to verify data integrity
  • Wrote complex SQL, PL/SQL Testing scripts for Backend Testing of the data warehouse application.
  • Used SQL and PL/SQL to validate the data going into the Data Warehouse
  • Analyzed the as-is system, which loads the monthly/weekly/daily multi-source data files for the data mart using PL/SQL procedures, and converted them into Informatica code.
  • Performed back-end testing using SQL queries and was involved in writing PL/SQL queries
  • Automated detailed test cases by using Quick Test Pro
  • Tested the source and target databases for conformance to specifications
  • Performed conditional testing of constraints based on the business rules
  • Identified and requested test resources such as QA engineers, software (SW) and hardware (HW)
  • Performed functional testing using QTP for end-to-end application testing.
  • Wrote SQL scripts to test the mappings.
  • Wrote UNIX shell scripts for cleanup, error logging, text parsing, job scheduling and job sequencing.
  • Developed Traceability Matrix of Business Requirements mapped to Test Scripts to ensure any Change Control in requirements leads to test case update.
  • Debugged SQL statements and stored procedures
  • Developed regression test scripts for the application
  • Involved in metrics gathering, analysis and reporting to the concerned teams
  • Tested the Oracle PL/SQL test programs
  • Conducted review sessions on test cases and regression test scripts in Quality Center
  • Developed the Test Plan in IEEE format and implemented it
  • Involved in testing procedures, functions, packages and triggers as a part of backend testing using Toad.
  • Identified and recommended training required for the testing effort
  • Conducted Training & Knowledge Transfer Sessions on new applications to QA Analysts
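
The first validation sketch referenced above: duplicate-record detection plus a mapping/transformation-rule check. The staging and EDW table names (stg_account, edw_account) and the status-code mapping are hypothetical, shown only to illustrate the pattern.

    -- Duplicate business keys in the staging feed
    SELECT account_no, COUNT(*) AS dup_cnt
    FROM stg_account
    GROUP BY account_no
    HAVING COUNT(*) > 1;

    -- Transformation rule check: target status_desc must follow the documented
    -- mapping ('A' -> 'ACTIVE', 'C' -> 'CLOSED', anything else -> 'UNKNOWN')
    SELECT s.account_no, s.status_cd, t.status_desc
    FROM stg_account s
    JOIN edw_account t ON t.account_no = s.account_no
    WHERE NVL(t.status_desc, '?') <> CASE s.status_cd
                                       WHEN 'A' THEN 'ACTIVE'
                                       WHEN 'C' THEN 'CLOSED'
                                       ELSE 'UNKNOWN'
                                     END;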
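
The run-control sketch referenced above: delete-before-load keyed on the primary key, and a processed-date flag check that prevents a second run of the same batch. The table names (dw_txn, stg_txn, etl_run_control), process name and date are hypothetical.

    -- Delete target rows for the incoming batch before re-processing,
    -- keyed on the logical/physical primary key carried in the staging feed
    DELETE FROM dw_txn t
    WHERE EXISTS (SELECT 1 FROM stg_txn s WHERE s.txn_id = t.txn_id);

    -- Guard against multiple runs: the load is skipped or failed when the
    -- business date is already flagged as processed in the control table
    SELECT COUNT(*) AS already_processed
    FROM etl_run_control
    WHERE process_name = 'LOAD_DW_TXN'
      AND business_date = TO_DATE('2012-06-30', 'YYYY-MM-DD')
      AND status = 'PROCESSED';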

Environment: Informatica PowerCenter 9.1.0, Teradata V2R6, SQL Assistant 6.0, TOAD, WinSQL, Windows XP, HP Quality Center 9.2, Oracle 10g, SQL Server 2005/2008, SQL, PL/SQL, Cognos 8.2.1, XML, XSLT, XSD, XML Spy 2008, PERL, Shell Scripting, UNIX, VB Script, VSAM Files, COBOL II, JCL, File-Aid.

Confidential, Charlotte, NC

Data Quality Analyst (DWH)

Responsibilities:

  • Analyzed the Functional requirements using Scenarios & DDD (Detailed Design Document)
  • Ran SQL queries to verify the number of records from Source to Target and validated the referential integrity, Time variance, Missing records, and Nulls/Defaults/Trim-spaces rules as per the design specifications (see the integrity-check sketch after this list).
  • Verified correctness of data after the transformation rules were applied on source data.
  • Coordinated execution of User Acceptance Testing, regression and integration testing with multiple departments.
  • Supported the extraction, transformation and load (ETL) process for a Data Warehouse from legacy systems using Informatica, and provided technical support and hands-on mentoring in the use of Informatica for testing.
  • Automated and scheduled the Informatica jobs using UNIX Shell Scripting.
  • Reviewed Informatica mappings and test cases before delivering to Client.
  • Worked with the development team to ensure testing issues were resolved based on defect reports.
  • Wrote complex SQL queries to validate target data based on the business requirements.
  • Generated detailed bug reports, go/no-go reports and comparison charts.
  • Analyzed the as-is system, which loads the monthly/weekly/daily multi-source data files for the data mart using PL/SQL procedures, and converted them into Informatica code.
  • Performed back-end testing using SQL queries and was involved in writing PL/SQL queries
  • Validated report data by writing SQL queries in PL/SQL Developer against OD
  • Analyzed source files to determine data quality standards
  • Resolved data quality issues for multiple feeds coming from Mainframe Flat Files
  • Created detailed Data Quality workbooks to define Data Quality standards and metrics
  • Generated graphs using SAS/Graph and the SAS Graphic Editor
  • Imported data from mainframe flat files with copybooks and relational database (Oracle) files into SAS files per detailed specifications.
  • Used SAS/Graph and SAS/STAT to generate the plots of the variables and regression lines
  • Wrote Base SAS and macro code to validate the data sets before running the statistical models
  • Tracked defects and reviewed, analyzed and compared results using Quality Center.
  • Created test scripts using Quality Center and performed regression testing on new versions of the software.
  • Used Quality Center for bug reporting
  • Participated in MR/CR review meetings to resolve the issues.
  • Defined the Scope for System and Integration Testing
  • Identified field and data defects, with the required supporting information, in the DataStage ETL process across various jobs and one-to-one mappings.
  • Worked extensively with database Procedures, Functions, Cursors and Joins to validate data (see the PL/SQL sketch after this list).
  • Wrote and executed test cases for each functionality based on the interfaces interacting with the workflow application
  • Responsible for deploying the builds and maintaining the test machines.
  • Prepared and submitted summarized audit reports and took corrective actions
  • Involved in uploading master and transactional data from flat files, preparing test cases, and sub-system testing.
  • Documented and published test results; troubleshot and escalated issues
  • Prepared various test documents for the ETL process in Quality Center.
  • Involved in test scheduling and milestone planning with the dependencies
  • Performed functional testing of email notifications for DataStage job failures, aborts and data issues.
  • Identified, assessed and communicated potential risks associated with the testing scope, product quality and schedule
  • Identified & presented effort estimations related to testing efforts to Project Management Team
  • Conducted test case review and revision of surrogate key generation in DataStage to uniquely identify master data elements for newly inserted data.
  • Responsible for understanding, and training others on, the enhancements or new features developed
  • Conducted load testing and provided input into capacity planning efforts.
  • Sent package install requests for new builds and verified the proper packages were installed.
  • Tested the database to check field-size validation, check constraints and stored procedures, cross-verifying the field sizes defined within the application against the metadata.
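
The integrity-check sketch referenced above: an orphan-key (referential integrity) check and a null/default/trim-spaces rule check. The fact/dimension names (fact_sales, dim_customer) are illustrative, not the project's actual objects.

    -- Referential integrity: fact rows whose customer key has no dimension parent
    SELECT f.sale_id, f.customer_key
    FROM fact_sales f
    LEFT JOIN dim_customer d ON d.customer_key = f.customer_key
    WHERE d.customer_key IS NULL;

    -- Null/default/trim rules: a mandatory column must be populated (default
    -- substituted for blanks) and stored without leading/trailing spaces
    SELECT customer_key, customer_name
    FROM dim_customer
    WHERE customer_name IS NULL
       OR customer_name <> TRIM(customer_name);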
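
The PL/SQL sketch referenced above: a minimal cursor-based validation block, assuming a hypothetical edw_balance table whose ledger and reported amounts are expected to reconcile.

    DECLARE
      CURSOR c_bal IS
        SELECT account_no, ledger_amt, reported_amt
        FROM edw_balance;
      v_bad PLS_INTEGER := 0;
    BEGIN
      FOR r IN c_bal LOOP
        -- flag any row where the two amounts do not reconcile
        IF NVL(r.ledger_amt, 0) <> NVL(r.reported_amt, 0) THEN
          v_bad := v_bad + 1;
          DBMS_OUTPUT.PUT_LINE('Mismatch for account ' || r.account_no);
        END IF;
      END LOOP;
      DBMS_OUTPUT.PUT_LINE('Rows failing reconciliation: ' || v_bad);
    END;
    /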

Environment: Informatica PowerCenter 8.6.1, Windows XP, SQL, PL/SQL, PERL, Shell Scripting, QTP 9.2, SQL Server 2005, Quality Center 9.0, LoadRunner 7.0, Oracle 10g, Java, UNIX AIX 5.2, VB Script, Cognos 7.0.1

Confidential, New York, New York

Data Quality Analyst (DWH)

Responsibilities:

  • Analyzed new business requirements for each release and worked with analysts and the business team on any gaps found during test analysis
  • Wrote the SQL queries on data staging tables and data warehouse tables to validate the data results.
  • Incremental development/testing of Enterprise Data Warehouse.
  • Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing).
  • Responsible for designing, developing and testing the software (Informatica, PL/SQL, UNIX shell scripts) to maintain the data marts (load data, analyze using OLAP tools).
  • Used various checkpoints in the Informatica designer to check the data being transformed
  • Debugging and Scheduling Informatica Jobs/Informatica Mappings and monitoring error logs.
  • Tested several Informatica mappings and ran for test data loading.
  • Executed sessions and batches in Informatica and tracked the log file for failed sessions.
  • Wrote SQL queries to validate that actual test results matched the expected results
  • Checked naming standards, data integrity and referential integrity.
  • Responsible for monitoring data for porting to current versions.
  • Analyzed and tested various Hyperion reports for Hierarchies, Aggregation, Conditions and Filters.
  • Checked the reports for any naming inconsistencies and to improve user readability.
  • Compared the actual results with the expected results and validated the data using a reverse-engineering approach, i.e., backward navigation from target to source (see the backward-navigation sketch after this list).
  • Used Clear Quest for defect tracking and reporting, updating the bug status and discussed with developers to resolve the bugs.
  • Analyzed data and created detailed reports utilizing proprietary tools and third-party technologies to identify and resolve data quality issues, ensure quality assurance and drive data quality initiatives
  • Formulated various post-production data quality and error-handling SQL scripts (see the error-handling sketch after this list)
  • Worked on analyzing data quality and data profiling.
  • Created test scripts using Quality Center and performed regression testing on new versions of the software.
  • Used Quality Center for bug reporting; tracked and reported the bugs in Quality Center.
  • Prepared defect reports and test summary report documents.
  • Checked the status of ETL jobs in UNIX.
  • Interacted with Business Analyst, Database Administrators, development team and End Users.
  • Prepared test plans and test scenarios; wrote and executed test cases
  • Identified test conditions according to the BRDs and Change Requests
  • Wrote and executed UNIX test cases on EDI formats and verified the translation responses
  • Performed reliability testing of the UNIX servers
  • Responsible for executing batch jobs and populating test data for other teams
  • Executed Complex SQL queries within UNIX shell scripts in the UNIX environment.
  • Responsible for FTP jobs to transfer data between UNIX test servers.
  • Responsible for Production Monitoring during peak hours
  • Performed regression testing for each and every release using WinRunner
  • Used Test Director & Rational DDTS as Test Management tool.
  • Responsible for doing System Testing, Functional Testing, Integration Testing, Load Testing and Smoke Testing.
  • Extensively worked on Backend using SQL Queries to validate the data in the database.
  • Used Rational Clear Case for version controlling
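
The backward-navigation sketch referenced above: starting from the warehouse target and navigating back to the staging source to confirm every loaded row has a valid origin and an unaltered value. Table and column names (dw_order, stg_order) are illustrative placeholders.

    SELECT t.order_key, t.order_no, t.order_amt
    FROM dw_order t
    LEFT JOIN stg_order s ON s.order_no = t.order_no
    WHERE s.order_no IS NULL                     -- loaded row with no source
       OR t.order_amt <> ROUND(s.order_amt, 2);  -- value drifted in transformation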
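
The error-handling sketch referenced above: a post-production data quality script that copies rows violating a rule into an error/audit table for follow-up. The dq_error_log table, source table and rule name are hypothetical.

    INSERT INTO dq_error_log (run_date, rule_name, natural_key, error_detail)
    SELECT SYSDATE,
           'NEGATIVE_BALANCE',
           account_no,
           'balance_amt=' || TO_CHAR(balance_amt)
    FROM dw_account
    WHERE balance_amt < 0;
    COMMIT;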

Environment: Informatica 6.x/5.1, PERL, UNIX Shell Scripting, SQL, PL/SQL, XML, XSLT, XSD, Teradata V2R5, IBM DB2, Flat Files, VSAM, COBOL II, MVS, JCL, Copy Books, Rational Clear Quest, Clear Case, Autosys, Cognos 6.0

Confidential, Charlotte, NC

Systems Analyst

Responsibilities:

  • Responsible for extensive testing of the different modules of the web-based/Internet application and for the complete testing life cycle of those modules.
  • Performed black-box/functional testing, regression testing, performance testing and load testing on the AUT.
  • Pre-testing phase involved understanding/analyzing project vision, goals, specifications and requirements.
  • Developed test cases and test scripts in TSL collaboratively with other testers to test the functional requirements.
  • Carried out Smoke Test to judge the software acceptance criteria for testing.
  • Involved in Automation Frameworks and Methodologies
  • Initially performed manual testing on the applications for Functionality Testing and then developed automated execution of test cases using WinRunner for Regression Testing.
  • Used SQL tools to run SQL queries and validate the data loaded into the target table
  • Validated report data by writing SQL queries in PL/SQL Developer against OD
  • Involved in extensive data validation using SQL queries and back-end testing
  • Queried the database extensively through SQL to check records behind the application (a representative check appears after this list)
  • Used GUI, bitmap, database and text checkpoints in WinRunner TSL scripts for validations.
  • Attended change request meetings, made subsequent changes in the test plan, and wrote test cases according to the change requests.
  • Parameterized the fixed values in checkpoint statements, created data tables for the parameters, and wrote functions to read new data from the table on each iteration - performed data-driven testing.
  • Responsible for keeping up with the test schedule and interacting with software engineers to ensure clear communications on requirements and defect reports.
  • Experienced in writing test cases, test scripts and test plans, executing test cases, and reporting and documenting the test results using Quality Center
  • Involved in identifying unused test cases in the Test Plan in Quality Center and prepared a consolidated document of the test cases to delete.
  • Conducted performance testing of the application using Load Runner
  • Performed process based testing of the applications
  • Executed manual and automated testing of J2EE-based applications running on BEA WebLogic servers.
  • Provided test cases for Java/J2EE environments (JUnit, JBuilder) and SOA applications for web-based client/server systems.
  • Analyzed, reported and kept track of defects using Test Director.
  • Submission of test results, weekly status reports and test summaries to the QA lead and project manager
  • Conducted Training Sessions and Knowledge Transfer Sessions to develop skill sets of team members
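
The back-end check referenced above, paired with a front-end test step: after the GUI submits a record, the same record is queried directly to confirm the persisted values match what the screen accepted. The table, columns and key value are illustrative only.

    SELECT order_no, order_status, order_total, last_update_dt
    FROM app_order
    WHERE order_no = 'ORD-10001';   -- key captured during the manual/WinRunner step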

Environment: Windows, J2EE, JUnit, JBuilder, WinRunner 7.5, Test Director 7.5, LoadRunner 6.0, PL/SQL, ASP, and XML/DHTML
