Senior Big Data QA Analyst Resume
New York, NY
SUMMARY
- 8 years of experience in Software Testing, UAT, and Support, covering SDLC phases such as Requirements Analysis, Manual Testing, UAT, and Production Support in enterprise-level technical environments.
- Core skill areas: SQL, manual QA testing, database testing, DWH/ETL development and testing, business reports testing, and web services testing.
- Good exposure to Data Warehouse concepts: dimensional modelling, Star and Snowflake schemas, data migration, and defining testing approaches for ETL projects.
- Strong understanding of ETL concepts such as source to target mapping and data modelling.
- Good understanding of QA methodology, processes, software life cycle and bug tracking system. Familiar with Agile and Waterfall methodologies.
- Strong Experience in Functional, Regression, Integration, End-to-End and User Acceptance Testing.
- Experienced in writing complex SQL queries and in ETL testing on Oracle, Teradata, SQL Server, and Hadoop systems such as BigInsights and Hive.
- Working experience with the Hadoop Distributed File System (HDFS), Hadoop Pig scripting, and data validation on Hadoop using Hive Query Language.
- Experienced in end-to-end ETL testing, including data validation testing, data transformation testing, data load process testing, and business reports testing.
- Working knowledge of data access tools such as TOAD, Teradata SQL Assistant, and SQL Server Management Studio.
- Working knowledge of UNIX file commands and exposure to shell scripting.
- Expertise in test case design, test execution, defect management, defect tracking, and report generation for delivering a high-quality product.
- Experience working with test management tools (HP Quality Center, Target Process), test automation tools (QTP), defect management tools (Bugzilla), and business reporting tools (Business Objects).
- Experience in Web Services and API Testing using SoapUI.
- Excellent capability to explore, learn and understand newer business domains and technology.
- Coordinated well with teams; managed and led a team of 5 members. Ability to balance multiple tasks, adapt to rapidly changing priorities, and meet tight deadlines.
- Good oral and written communication skills. Good analytical skills and problem solving abilities.
- Able to engage in full software testing using a variety of manual techniques and to work with the team to determine testing requirements; keep current with technology, tools, and software quality assurance practices and share knowledge with others.
- Good onsite experience.
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter, Confidential DI Suite, SSIS.
Databases: Oracle 10g/11g, Teradata, MSSQL, DB2, Hadoop (BigInsights and Hortonworks), Cassandra, Netezza.
Test Management Tool: Quality Center (ALM), Target Process and Bugzilla.
Automation Tools: QTP.
Web Services Testing Tool: SoapUI.
BI Tools: Business Objects.
Scripting Languages: C++, SQL/PL-SQL, Hadoop Pig scripting, VBScript
Data Access Tools: TOAD, Teradata SQL Assistant, SQL Server Management Studio
Operating System: Windows, UNIX.
PROFESSIONAL EXPERIENCE
Confidential, New York, NY
Senior Big Data QA Analyst
Responsibilities:
- Reviewed high-level business requirements, technical specifications, and data mapping documents, and created detailed test scenarios.
- Wrote complex SQL queries based on the data mapping rules document to validate dimensional and fact data from different source systems.
- Performed integration testing of Hadoop packages for ingestion, transformation, and loading of massive structured and unstructured data into BigInsights from flat files, relational databases such as Oracle and DB2, and Hadoop systems (Hive).
- Verified Direct/Indirect/Wildcard data loading from Flat files.
- Involved in test data preparation and executing workflows using the test data.
- Prepared complex Hive SQL queries to test data correctness in BigInsights.
- Verified BigInsights function validation in Confidential DI Suite.
- Verified data loading, data transformation, and data validation tests such as key constraint checks, null value checks, duplicate record checks, source-to-target record count comparisons using SQL queries, derived-column checks from source to target tables, and parameter replacement checks.
- Verified data validation and data loading into target flat files and bad files using Hadoop Distributed File System (HDFS) commands.
- Wrote scripts using Hadoop Pig and UNIX shell scripting.
- Verified the Session logs and error tables for rejected records in the feed.
- Verification of issue resolution and data consistency as per business logic.
- Involved in defects Re-testing, Regression testing, Smoke testing, Sanity testing and System testing.
- Support business users with UAT activities including coordination and defect reporting.
- Validating major changes going into Production as part of Major/Minor Releases.
- Involved in Business reports testing using complex SQL queries.
- Involved in test planning, test case development and execution, Defects reporting and follow up.
- Generated reports through ClearQuest for presentation at the defect status meeting.
- Involved in Confidential Planning, daily Scrum meetings, Confidential review meetings and retrospective meetings.
Environment: Hadoop BigInsights, Oracle, Pig, Cassandra, DB2, Confidential DI Suite, Toad, Adobe Flex, Java, HP-ALM 11.5, QTP, OBIEE, Linux 6.1.
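The source-to-target checks listed above (record counts, null values, duplicate keys) can be sketched with plain SQL. This is a minimal illustration using Python's sqlite3 as a stand-in for Oracle/Hive; the table and column names (src_customer, tgt_dim_customer, cust_id) are hypothetical, not from the actual project.

```python
import sqlite3

# sqlite3 stands in for the real source and target databases.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_customer (cust_id INTEGER, name TEXT);
    CREATE TABLE tgt_dim_customer (cust_id INTEGER, name TEXT);
    INSERT INTO src_customer VALUES (1, 'A'), (2, 'B'), (3, NULL);
    INSERT INTO tgt_dim_customer VALUES (1, 'A'), (2, 'B'), (3, NULL);
""")

# 1. Source-to-target record count test.
src_count = cur.execute("SELECT COUNT(*) FROM src_customer").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_dim_customer").fetchone()[0]
assert src_count == tgt_count, "row counts differ"

# 2. Null check on a business column that should be populated.
nulls = cur.execute(
    "SELECT COUNT(*) FROM tgt_dim_customer WHERE name IS NULL").fetchone()[0]
print(f"null names in target: {nulls}")

# 3. Duplicate-key check on the dimension's natural key.
dupes = cur.execute("""
    SELECT cust_id, COUNT(*) FROM tgt_dim_customer
    GROUP BY cust_id HAVING COUNT(*) > 1
""").fetchall()
assert dupes == [], "duplicate keys found"
```

In practice the same three queries would run against the actual source and target systems, with the counts and result sets compared in the test evidence.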
Confidential, Overland Park, KS
Senior QA Engineer (ETL)
Responsibilities:
- Reviewed high-level business requirements, technical specifications, and data mapping documents, and created detailed test scenarios.
- Wrote complex Hive SQL queries based on data mapping documents to validate dimensional and fact data from different source systems.
- Involved in POC designs and stream implementation using Confidential DI Suite.
- Verified Data Loading into Hive from Flat files, Oracle and Teradata based on the Client requirements.
- Verified Direct/Indirect/Wildcard data loading from Flat files.
- Performed data load tests, data transformation tests, and data validation tests such as key constraint checks, null value checks, duplicate record checks, source-to-target record count comparisons using SQL queries, derived-column checks from source to target tables, and parameter replacement checks.
- Verified Hive Partitions created by date field.
- Verified Data validation and data loading into the target flat files and bad files by using UNIX commands.
- Wrote scripts using Hadoop Pig and UNIX shell scripting.
- Verified the Session logs and error tables for rejected records in the feed.
- Validated hotfixes before release to UAT and Production.
- Involved in defects Re-testing, Regression testing, Smoke testing, Sanity testing and System testing.
- Validating major changes going into Production as part of Major/Minor Releases.
- Validation of Critical Defects during SIT, UAT and End to End Testing.
- Involved in Business reports testing using OBIEE.
- Test Planning and Status Reporting. Creation and implementation of QA integration and functional test plans using user stories and backlog listings.
- As team lead, generated reports through ClearQuest for presentation at the defect status meeting.
- Involved in Confidential Planning, daily Scrum meetings, Confidential review meetings and retrospective meetings.
Environment: Hive, Confidential DI Suite, Pig, Cassandra, Hadoop, Oracle, Teradata, Toad, Teradata SQL Assistant, Quality Center, QTP, Linux.
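The Hive partition check mentioned above (partitions created by a date field) amounts to verifying that every record lands in the partition directory matching its date value. A hedged sketch, with illustrative HDFS-style paths and a hypothetical `load_date` partition key:

```python
# Each tuple is (partition file path, the record's date field value).
# Paths and field names are illustrative, not from a real cluster.
records = [
    ("/warehouse/orders/load_date=2014-01-01/part-0000", "2014-01-01"),
    ("/warehouse/orders/load_date=2014-01-02/part-0000", "2014-01-02"),
]

def partition_date(path: str) -> str:
    # Extract the value of the load_date= partition key from the path.
    for segment in path.split("/"):
        if segment.startswith("load_date="):
            return segment.split("=", 1)[1]
    raise ValueError(f"no load_date partition in {path}")

mismatches = [(p, d) for p, d in records if partition_date(p) != d]
assert mismatches == [], f"rows in wrong partition: {mismatches}"
print("all records match their partition date")
```

On a live cluster the paths would come from `hdfs dfs -ls` output and the date values from a Hive query against each partition.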
Confidential
Senior ETL QA Engineer
Responsibilities:
- Understood the requirements from the business analyst to balance the customer units.
- Executed POC designs and streams with test data made equivalent to the existing stored procedures.
- Achieved significant improvement in the throughput time with the test data provided by the customer.
- Performed reconciliation to match the logic to the client’s existing design.
- Debugged several phases of execution in the process.
- Extracted and Transformed data within Oracle using DI Suite.
- Scheduled workflows/streams at the specified times based on client requirements using the scheduler tool.
- Designed and developed all the tables, views for the system in Oracle database.
- Developed complex SQL queries for data and logic validation.
- Verified and Executed UNIX scripts using Confidential Studio.
- Performed data validations as part of back-end testing by developing several kinds of SQL queries.
- Involved in Business reports testing using Business Objects.
- Logged and verified defects in Quality Center.
- Involved in defects Re-testing, Regression testing, Smoke testing.
- Developed Test Plans, Test Cases and Test Scripts for Data Validation testing.
- Reproduced scheduler issues in the local environment and raised defects.
- Coordinated with developers and the development lead on scheduler timing issues.
Environment: Oracle 10g, Toad, Confidential DI Suite 3.2, Quality Center, Business Objects, Win-SQL, Linux 6.1.
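The reconciliation work described above (matching the new logic against the client's existing design) is typically done with a MINUS-style set comparison between the legacy output and the new stream output. A minimal sketch, using sqlite3 in place of Oracle and hypothetical table names (`legacy_out`, `stream_out`):

```python
import sqlite3

# sqlite3 stands in for Oracle; EXCEPT plays the role of Oracle's MINUS.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE legacy_out (acct INTEGER, balance REAL);
    CREATE TABLE stream_out (acct INTEGER, balance REAL);
    INSERT INTO legacy_out VALUES (1, 100.0), (2, 250.5);
    INSERT INTO stream_out VALUES (1, 100.0), (2, 250.5);
""")

# Rows produced by the legacy design but missing from the new stream.
missing = cur.execute(
    "SELECT acct, balance FROM legacy_out EXCEPT SELECT acct, balance FROM stream_out"
).fetchall()
# Rows produced by the new stream that the legacy design never emitted.
extra = cur.execute(
    "SELECT acct, balance FROM stream_out EXCEPT SELECT acct, balance FROM legacy_out"
).fetchall()
assert missing == [] and extra == [], (missing, extra)
print("stream output reconciles with legacy design")
```

Running the comparison in both directions matters: an empty MINUS in one direction alone would miss rows the new design adds.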
Confidential
Senior QA Engineer (ETL)
Responsibilities:
- Involved in POC (proof of concept) development on DI Suite for various clients.
- Analysed Source-to-Stage and Stage-to-Target mapping documents based on facts and dimensions, and validated the DDL and business-rule transformations.
- Involved in data validation checks such as key constraint checks, null values, duplicate record checks, source-to-target record count tests, derived columns from source to target tables, and parameter replacement checks.
- Incorporated agile methodology and Scrum techniques to manage requirements and enhance the evolving Workday application.
- Involved in Confidential product GUI testing, Functional and non-functional testing.
- Involved in defects Re-testing, Regression testing, Smoke testing, Sanity testing and System testing.
- Involved in debugging designs and validating functions for various databases such as Oracle, Teradata, MSSQL, Netezza, Hortonworks, and BigInsights.
- Verified data load and extraction properties such as column delimiters, date and time formats (YMD, DMY, MDY, Y2MD, etc.), data loading with File and Pipe options for databases, and fixed-width data loading for flat files.
- Verified Direct/Indirect/Wildcard data loading from Flat files.
- Involved in DI Suite metadata testing by creating, deleting, and updating objects, designs, and streams.
- Verified DI Suite functionality and performance by creating and executing sample designs and streams.
- Involved in test data generation for various databases such as Oracle, Netezza, Teradata, Hortonworks, and BigInsights, covering all supported datatypes.
- Verified the transformations (Expression, Filter, Joiner, Aggregator, Union, Minus, Splitter) in DI Suite.
- Verified data extraction and loading using different extraction and load types: JDBC and OCI for Oracle; FastExport, FastLoad, MultiLoad, and TPump for Teradata; JDBC and BCP for MSSQL; and JDBC, BIExtract, and BILoad for BigInsights.
- Scheduling the Streams/Workflows using Daily/Weekly/Monthly calendar, monitoring the scheduled streams and run times.
- Responsible for reproducing client issues in the local environment and finding the root cause of those issues.
- Preparing manual test cases and test data for Scenarios.
- Developed Test Plans, Test Cases and Test Scripts for Data Validation testing.
- Responsible for preparing the defect status matrix and discussing high-priority issues with the test lead and development leads.
Environment: Confidential DI Suite, Oracle, Netezza, Teradata, MSSQL, BigInsights, Hortonworks, QTP, Business Objects, Quality Center, Bugzilla, Testopia, Linux 6.1.
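The date/time format verification described above (YMD, DMY, MDY options) boils down to confirming that each configured format decodes the loader's sample values to the same calendar date. A small illustrative check; the format strings and sample values below are assumptions mirroring the options named in the bullet, not the tool's actual configuration:

```python
from datetime import datetime

# Hypothetical mapping of the tool's format options to strptime patterns.
FORMATS = {
    "YMD": "%Y-%m-%d",
    "DMY": "%d-%m-%Y",
    "MDY": "%m-%d-%Y",
}

# The same calendar date written in each format.
samples = {
    "YMD": "2013-07-24",
    "DMY": "24-07-2013",
    "MDY": "07-24-2013",
}

for name, text in samples.items():
    parsed = datetime.strptime(text, FORMATS[name])
    # Every representation should decode to the same date.
    assert (parsed.year, parsed.month, parsed.day) == (2013, 7, 24), name
print("all date formats decode to the same date")
```

The useful negative test is the inverse: feeding a DMY string to the MDY format should either fail to parse or produce a different date, which is exactly the defect this check is meant to catch.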
Confidential
QA Engineer
Responsibilities:
- Involved in SRS (System Requirement Specification) document preparation - Functional and Non-Functional requirements.
- Involved in Test Plans, Test Cases and Test Scripts creation and modification for all the HRMS Modules (Manager, Employee, Leaves).
- Prepared test data for all fields (Username, Password, Description, Leave-on Date, etc.) manually and using the Win-SQL tool.
- Validated all the Fields by entering test data and verified data storage in the database.
- Prepared SQL queries for HRMS table creation and executed them against the backend database.
- Prepared SQL queries to retrieve data from multiple tables using multiple Joins.
- Involved in functional testing covering GUI testing, input domain testing, error handling, output testing, and database testing.
- Involved in non-functional testing covering usability, software compatibility, hardware compatibility, web services testing, and security testing.
- Used Black box testing techniques (Boundary Value Analysis, Equivalence Class Partitions, State Transition Flow) while writing Functional test cases and test data preparation.
- Involved in Defects retesting, Regression testing and Smoke testing of the Product.
Environment: SQL Server 2005, Team Foundation Server and Win-SQL.
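The black-box techniques named above (Boundary Value Analysis, Equivalence Class Partitioning) can be sketched for a single field. This is a hedged illustration: the username-length range of 6 to 20 characters is a hypothetical requirement, not one from the HRMS project.

```python
# Boundary Value Analysis for a valid integer range [lo, hi]:
# the classic test values are the two invalid neighbours plus the
# in-range boundaries.
def boundary_values(lo: int, hi: int) -> list[int]:
    # lo-1 and hi+1 fall in the invalid equivalence classes;
    # lo, lo+1, hi-1, hi are the in-range boundary cases.
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

# Hypothetical validation rule under test: username length 6..20.
def is_valid_length(n: int, lo: int = 6, hi: int = 20) -> bool:
    return lo <= n <= hi

cases = boundary_values(6, 20)
results = {n: is_valid_length(n) for n in cases}
print(results)
```

Equivalence partitioning then lets one representative per class (e.g. length 12 for the valid partition) stand in for the rest, keeping the manual test case count small.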