Senior QA Engineer Resume
Herndon, VA
SUMMARY
- Highly skilled software development professional with more than 14 years of experience in software design, development, testing and integration.
TECHNICAL SKILLS
Big Data Technologies: Apache Spark, Hadoop, Elastic MapReduce (EMR), HDFS, Hive, Kinesis Data Streams, Kinesis Firehose, Kinesis Data Analytics, Kafka, Sqoop, AWS Glue, AWS Data Pipeline, AWS Athena, Redshift Spectrum, AWS DMS, SCT, S3DistCp, AWS Elasticsearch and Amazon QuickSight.
Cloud Infrastructure: AWS
Programming Languages: Python, Scala, Java, Ruby, COBOL, PL/1, SQL, Stored Procedure, CICS, HTML
Databases: AWS Redshift, DynamoDB, RDS, Aurora, Oracle, Netezza, Confidential DB2, MySQL, SQL Server, PostgreSQL and IMS.
Version Control: GitHub, SVN, BitBucket
AWS Services: Lambda, Step Functions, IoT, SNS, SQS and SageMaker.
Operating Systems: Windows, Linux, Confidential z/OS and z/VM
Others: Maven, Postman, Apache Airflow, Informatica PowerCenter, Ab Initio, JIRA, Rally, Jenkins, Alteryx, AWS CodeCommit, AWS CodeBuild, AWS CodeDeploy, Tableau, GEMS 3.4, AutoSys, PuTTY, TOAD, SQL Developer, Altova XMLSpy, Eclipse, Confidential Rational DOORS 9.6, HP ALM, WinSCP, ER/Studio, Selenium, Cucumber, UFT/QuickTest Professional and HP Quality Center (ALM)
PROFESSIONAL EXPERIENCE
Confidential, Herndon, VA
Senior QA Engineer
Responsibilities:
- Interact with various business user groups to gather requirements and prepare the Test Plan, Test Strategy and test cases covering all transformation logic as per the requirements.
- Validate the data from the data sources through each step of the extract and transformation process, including the final load into the data warehouse.
- Write SQL scripts to extract data from various databases and systems and compare against data warehouse.
- Compare data between source and target using SQL/Spark SQL and Hive (see the PySpark comparison sketch after this list).
- Create test plans, test cases, test scripts, and requirements traceability matrix.
- Test ETL pipelines that populate the data warehouse.
- Orchestrate Spark applications using AWS Step Functions and AWS Lambda (see the EMR step-submission sketch after this list).
- Develop and test Spark jobs to process the data from various source systems in EMR.
- Migrate on-premises applications to cloud infrastructure (AWS).
- Test Extract, Transform and Load (ETL) Informatica mappings, sessions and workflows in a data warehouse environment.
- Develop and test Lambda functions to process data based on various event triggers such as AWS S3 and SNS (see the Lambda handler sketch after this list).
- Create Hive tables and work with them using HiveQL.
- Load and validate dimension and fact tables in Redshift (see the Redshift validation sketch after this list).
- Validate various reports generated from Enterprise DataWarehouse.
- Perform unit, functional and regression testing.
- Write Gherkin test cases implementing BDD with Cucumber.
- Test various APIs using Postman.
- Build test automation using Java/Selenium.
- Automate business and functional test suites and test cases, and capture the test results using Cucumber.
- Perform Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
- Responsible for bug tracking, defect logging and defect retesting.
- Develop and test AutoSys JILs/Jobs to trigger Informatica workflows.
- Write complex data integration programs in Oracle 12c PL/SQL.
- Write CloudFormation templates (CFT) to set up and build AWS infrastructure for various resources.
- Participate in Scaled Agile Framework (SAFe) ceremonies such as Program Increment Planning, Sprint Planning, User Story Grooming, Daily Standups, Sprint Review (System Demo), Retrospectives and Backlog Grooming.
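A minimal PySpark sketch of the source-to-target comparison mentioned above; the database and table names (src_db.orders, tgt_db.dim_orders) are illustrative placeholders rather than the actual project schema, and both tables are assumed to share the same columns.

```python
from pyspark.sql import SparkSession

# Minimal sketch of a source-vs-target row comparison; all names are placeholders.
spark = (
    SparkSession.builder
    .appName("source_target_compare")
    .enableHiveSupport()  # read tables registered in the Hive metastore
    .getOrCreate()
)

source = spark.table("src_db.orders")      # extracted source data
target = spark.table("tgt_db.dim_orders")  # loaded warehouse table

# Rows present in the source but missing (or different) in the target.
missing_in_target = source.exceptAll(target.select(source.columns))
# Rows present in the target that have no matching source row.
extra_in_target = target.select(source.columns).exceptAll(source)

print("missing in target:", missing_in_target.count())
print("extra in target:", extra_in_target.count())
```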
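A minimal sketch of how a Lambda function invoked from a Step Functions state machine might submit a Spark step to an EMR cluster; the cluster id field, script location and state-machine input shape are assumptions for illustration.

```python
import boto3

emr = boto3.client("emr")

def handler(event, context):
    """Submit a spark-submit step to EMR; the state machine supplies the cluster id."""
    cluster_id = event["ClusterId"]  # assumed state-machine input field
    script = event.get("Script", "s3://example-bucket/jobs/transform.py")  # placeholder path

    response = emr.add_job_flow_steps(
        JobFlowId=cluster_id,
        Steps=[{
            "Name": "spark-transform",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": ["spark-submit", "--deploy-mode", "cluster", script],
            },
        }],
    )
    # A later state can poll this step id to wait for completion.
    return {"StepId": response["StepIds"][0]}
```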
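A minimal sketch of a Lambda handler reacting to S3 and SNS event triggers; the processing step is reduced to a print statement and all names are placeholders.

```python
import json

def handler(event, context):
    """Route incoming S3 and SNS event records to a simple processing step."""
    for record in event.get("Records", []):
        if "s3" in record:
            # S3 object notification (e.g. ObjectCreated)
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            print(f"processing s3://{bucket}/{key}")
        elif record.get("EventSource") == "aws:sns":
            # SNS delivers the message body as a JSON string
            message = json.loads(record["Sns"]["Message"])
            print("processing SNS message:", message)
    return {"status": "ok"}
```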
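A minimal sketch of a Redshift fact/dimension validation check using psycopg2; the connection details, table names and join key (fact_sales, dim_customer, customer_key) are placeholders, not the actual warehouse schema.

```python
import psycopg2

# Placeholder connection details; in practice these would come from configuration.
conn = psycopg2.connect(
    host="example-cluster.redshift.amazonaws.com",
    port=5439, dbname="dw", user="qa_user", password="example",
)

# Fact rows whose dimension key has no matching dimension row (orphan keys).
ORPHAN_KEYS_SQL = """
    SELECT COUNT(*)
    FROM fact_sales f
    LEFT JOIN dim_customer d ON f.customer_key = d.customer_key
    WHERE d.customer_key IS NULL;
"""

with conn, conn.cursor() as cur:
    cur.execute(ORPHAN_KEYS_SQL)
    orphans = cur.fetchone()[0]
    assert orphans == 0, f"{orphans} fact rows reference missing dimension keys"
```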
Confidential, Estates, IL
Senior Programmer Analyst
Responsibilities:
- Developed new mainframe COBOL-DB2 programs to create orders flowing from the DOS application to High Jump, and to process receipts, load confirmations and inventory snapshots from High Jump to DOS.
- Enhanced COBOL DB2 programs for Inventory adjustments, Credit Memo transactions, Disposition moves and Order updates from High Jump to DOS.
- Developed complex SQL queries and batch QMF procedures to generate various reports like Overdue order report, Shipment Overdue report and Finance report.
- Developed and enhanced batch jobs to execute newly created and enhanced COBOL DB2 programs.
- Created test cases for the DOS application covering both batch and online components.
- Performed Unit / System / Performance testing of DOS application.
- Performed integration testing with the High Jump, OTM and Confidential I2K application teams.
- Researched existing physical and logical models and data dictionaries of the DOS application.
- Responsible for gathering and understanding new business requirements and facilitating the transfer of specifications to the development team.
- Performed various reviews and ensured that every deliverable met the quality standards.
Confidential, Gaithersburg, MD
Application Developer/Production Support Specialist/Team Lead
Responsibilities:
- Developed High-level and low-level design documents for development work undertaken.
- Analyzed and identified the relationships and dependencies of software program with database and real-time components.
- Developed online and batch programs for the Hourly Servicing invoicing application using COBOL, DB2, PL/1, SQL stored procedures and CICS.
- Developed key modules of the Mainframe application to upgrade warranty using PL/1, AAS, CICS and DB2.
- Developed new mainframe PL/1 programs that use CICS TDQs for problem determination and write data to CICS journals.
- Developed new mainframe programs in SCLM using PL/1 and DB2 to facilitate connectivity and calls from the front-end web-based application to the procedures.
- Developed new mainframe programs using PL/1, CICS to initialize the CICS CWA and connect new CICS regions.
- Solved severity 1 issues in the production region that could lead to software system outages.
- Developed ER diagrams, Physical, Logical models and Data dictionaries.
- Worked as the on-site coordinator for the team, communicating with customers on request priorities and advising the team of changes in priority.
- Developed new programs (using PL/1, DB2 and CICS) to parse the data sent to MES and validate it to recognize feature conversions.
- Created new mainframe jobs using JCL that read, delete and update DB2 data and copy CICS journals.
- Created new NetView FTP, VSAM and bind jobs using JCL.
- Created test cases and performed Unit / System / Performance / Integration Testing to improve the performance of DB2 Stored Procedures.
- Created test cases for Life and Fire Insurance applications which involve both Batch and Online.
- Deployed the software programs to UAT and production regions using CMVC, VAGEN.
- Developed new stored procedures using Confidential DB2 V9.1.
- Developed the test case matrix defining the test strategies that would help achieve the testing goals in the most efficient manner.
- Enhanced existing mainframe AAS, PL/1, COBOL, CICS and DB2 stored procedures to support feature conversions, improve efficiency, read and write data from MQ and DB2, and support the migration from Power 6 to Power 7 machines.
- Formulated estimates, development plans and implementation checklists.
- Handled many production problems involving DB2 stored procedures when the web-based application (client) accessed DB2 (server) through stored procedures.
- Hands-on experience developing business process diagrams, system architecture diagrams, system component diagrams, data flow diagrams and batch process flow diagrams for Life Insurance applications.
- Performed various reviews and ensured that every deliverable met the quality standards.
- Performed in-depth analysis and due diligence of the Life Insurance DB2 databases and SQL stored procedures invoked by the web-based front-end Life Insurance application.
- Performed functional testing of development deliverables and provided support to the customer during UAT.
- Prepared the Statement of Work (SOW), high-level design and low-level design as per the templates provided in the On Demand Process Asset Library (OPAL) quality process.
- Prepared fall-out plan and worked as a build manager to release the code to User acceptance testing.
- Prepared installation plans for cutover to production.
- Prepared the cost and time estimation for the proposed work.