
Test Data Subject Matter Expert/ Test Data Management Engineer & Service Virtualization Engineer Resume


SUMMARY:

  • 5+ years in Information Technology, with expertise in ETL testing for Data Warehouse/Data Mart development, data analysis for critical Online Transaction Processing (OLTP) and Data Warehousing (OLAP)/Business Intelligence (BI) applications, and Test Data Management.
  • 2 years of experience with CA Test Data Manager.
  • Involved in the assessment, pilot, and implementation phases of Test Data Management systems for various clients.
  • Manual and automated software testing and software quality assurance experience, including metrics definition and planning; full understanding of the Software Development Life Cycle (SDLC).
  • Worked across project life cycle phases such as requirements, analysis, and testing; technical abilities complemented by strong communication and user-interaction skills.
  • Complete understanding of the entire testing life cycle, including both positive and negative testing.
  • Experience using the HP Quality Center (HPQC) tool.
  • Knowledge of the bug life cycle and SDLC concepts.
  • Extensive experience in writing and executing test cases.
  • Strong logical and analytical skills combined with excellent communication skills.
  • Worked as a QA test analyst with knowledge of production, pre-production, and QA runs and testing methodologies.
  • Certified CA Test Data Manager Implementation Proven Professional.

CORE COMPETENCIES:

Databases: SQL Server, Oracle, DB2.

Reporting Tools: Microsoft Analysis Services, Business Objects, SSRS, Tableau reports.

ETL Tools: SAP BODS

TDM Tools: CA Datamaker, CA Agile Requirements Designer, CA Fast Data Masker, CA TDM Portal, GT Subset, Javelin.

Programming Languages: SQL

Productivity: Microsoft Office, Microsoft Excel

Methodologies: Waterfall, Agile

Software Testing: Manual Testing, Test Planning, Test Case Design, Regression Testing, Co-existence Testing, User Acceptance Testing, Release Planning, Defect Meetings.

QA Tools: Quality Center, CA Test Data Manager, HP Agile Manager.

PROFESSIONAL EXPERIENCE:

Confidential

Test Data Subject Matter Expert/ Test Data Management Engineer & Service Virtualization Engineer

Responsibilities:

  • Involved in the selection of an appropriate TDM tool for the client.
  • Installed the CA TDM tool on the server machine and on work laptops.
  • Completed the initial setup of the tool against different types of databases.
  • The initial four months were a pilot phase that included conducting POCs demonstrating different TDM features such as data masking, data subsetting, synthetic data generation, and business-scenario-specific data.
  • All developed POCs were made available to testers through the Test Data Management Portal.
  • Virtualized services that were not yet available so that teams could continue testing without losing time.
  • Implemented multiple use cases for synthetic data generation, data masking of PII (SSN, phone number, names, and address), and data subsetting (filtering data based on specific criteria); a SQL sketch follows this list.
  • Worked on different source formats such as XML and flat files.
  • Worked on file masking (XML, CSV, flat files) and table masking.

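The masking and subsetting above were configured through CA TDM rather than hand-written SQL; the sketch below only illustrates the underlying idea, using hypothetical table and column names (customer, ssn, phone_num, state_cd) and Oracle-style syntax.

  -- Illustrative only: hypothetical customer table; the real masking was driven by CA TDM rules.
  -- Mask PII in place: randomize SSN and phone, replace names with traceable test values.
  UPDATE customer
  SET    ssn        = LPAD(TRUNC(DBMS_RANDOM.VALUE(100000000, 999999999)), 9, '0'),
         phone_num  = '555' || LPAD(TRUNC(DBMS_RANDOM.VALUE(0, 9999999)), 7, '0'),
         first_name = 'TESTFN_' || customer_id,
         last_name  = 'TESTLN_' || customer_id;

  -- Subsetting: copy only rows matching a test criterion into a smaller test schema.
  INSERT INTO test_schema.customer
  SELECT *
  FROM   prod_schema.customer
  WHERE  state_cd  = 'TN'
    AND  enroll_dt >= DATE '2016-01-01';

In practice the same substitution has to be applied consistently wherever a value reappears so that referential integrity is preserved, which is what the TDM tool's masking rules take care of.
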
TOOLS: CA Test Data Manager v4.2, CA Agile Requirements Designer, CA DevTest Solutions v10, GT Subset, Javelin.

Confidential

Test Data Subject Matter Expert

Responsibilities:

  • Conducted interviews with stakeholders on test-data-related challenges.
  • Participated in the TDM tool evaluation process.
  • Involved in creating a high-level TDM roadmap document.
  • Involved in creating the strategy document for the TDM implementation.
  • Participated in the final presentation of the TDM assessment phase.
  • Involved in creating the plan for the pilot phase.

TEST DATA MANAGEMENT TEST LEAD

Confidential

Responsibilities:

  • Worked as Test Data Management test lead on the CA TDM tool for the TennCare project (State of Tennessee).
  • Worked on masking different file types such as X12 EDI, XML, and DAT files, as well as HL7 messages.
  • Worked on masking tables for different subsystems such as Eligibility & Enrollment, Third Party Liability (TPL), Providers, Internet, and data warehousing.
  • Worked on different CA TDM modules such as data cloning, data subsetting, and test matching.
  • Masked the data consistently so that valid 834 and 271 transactions could still be generated (see the sketch after this list).
  • Generated synthetic data for future tests.
  • Updated the Scramble database with seed data.
  • Served as team lead, from scratch through go-live, in an engagement with CA Technologies to implement masked data for TennCare.
  • Managed a team of three along with two consultants from CA Technologies.
  • Actively involved in all phases of the project.
  • The project followed Agile methodology; HP Agile Manager (AGM) was used to track project status.
  • Created user stories in HP AGM and actively participated in sprint meetings.

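Generating valid 834 and 271 transactions from masked data mainly requires that member identifiers be replaced consistently everywhere they occur. A hedged sketch of that cross-reference pattern follows; member_id_xref, enrollment, and the column names are placeholders, not TennCare schema objects.

  -- Placeholder names only. Build a one-to-one map of real to masked member IDs,
  -- then apply the same map to every table and file extract that carries the ID.
  CREATE TABLE member_id_xref AS
  SELECT member_id AS real_id,
         'T' || LPAD(ROW_NUMBER() OVER (ORDER BY member_id), 9, '0') AS masked_id
  FROM   (SELECT DISTINCT member_id FROM enrollment);

  UPDATE enrollment e
  SET    e.member_id = (SELECT x.masked_id
                        FROM   member_id_xref x
                        WHERE  x.real_id = e.member_id);
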
TOOLS USED: HP ALM, HP AGM, CA TDM, CA Fast Data Masker, CA EDI, CA Agile Requirements Designer.

CO-EXISTENCE AND REGRESSION OFFSHORE TEST LEAD

Confidential

Responsibilities:

  • Worked as team lead for the Regression and Co-Existence team and handled a team of three.
  • Interacted constantly with the client to understand the requirements for a successful project go-live after regression.
  • Established the Test Data Management charter to support an iterative, repeatable process for creating, maintaining, and protecting data in non-production environments in support of end-to-end testing of key business processes.
  • Established and managed service levels for the TDM shared service, including transparency through reporting and valuation of the service.
  • Ensured alignment with Quality Shared Services best practices, tools, and standards.
  • Responsible for implementing advanced testing practices to improve quality, reduce overall cost, and ensure the protection of data.
  • Worked with the golden set of data for a successful production go-live.
  • Analyzed the functional requirements.
  • Authored test cases for new functionality per the specifications.
  • Modified/updated existing test cases based on specification changes.
  • Performed functional, regression, co-existence, re-testing, and ad-hoc testing.
  • Analyzed and logged defects in Rational ClearQuest and tracked them to closure.
  • Used Quality Center for centralized control over the entire project (test management).
  • Peer-reviewed the test cases and logged defects.
  • Participated in KT (knowledge transfer) sessions aimed at specific modules/functionalities.

Environment: SAP BODS (ETL), Oracle ODI, Tableau, Excel reports, HPQC, Teradata.

QA Analyst

Confidential

Responsibilities:

  • Reviewed functional and design specifications to ensure full understanding of individual deliverables.
  • Performed backend database testing in Teradata, including validating stored procedures, jobs, and triggers (see the SQL sketch after this list).
  • Identified test requirements from specifications, mapped test case requirements, and designed the test coverage plan.
  • Developed, documented, and maintained functional test cases and other test artifacts such as test data, data validations, harness scripts, and automated scripts.
  • Executed and evaluated manual and automated test cases and reported test results.
  • Held and facilitated test plan/test case reviews with cross-functional team members.
  • Identified potential quality issues per the defined process and escalated them immediately to management.
  • Ensured that validated deliverables met functional and design specifications and requirements.
  • Isolated, replicated, and reported defects and verified defect fixes.

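A minimal example of the kind of backend check referred to above, assuming a hypothetical audit trigger on a customer table; all object names are illustrative, not the client's, and the ANSI-style syntax may need small adjustments per platform (Oracle vs. Teradata).

  -- Hypothetical check: an UPDATE on customer should fire a trigger that writes one audit row.
  UPDATE customer
  SET    email = 'qa_check@example.com'
  WHERE  customer_id = 1001;

  -- Expect exactly one new audit row for this key within the test window.
  SELECT COUNT(*) AS audit_rows
  FROM   customer_audit
  WHERE  customer_id = 1001
    AND  audit_ts   >= CURRENT_TIMESTAMP - INTERVAL '5' MINUTE;

  ROLLBACK;  -- leave the test data unchanged after the check
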
Environment: Oracle 11g, Toad, Crystal Reports 9, Teradata

QA data Analyst

Confidential

Responsibilities:

  • Worked as an ETL development practitioner.
  • Worked on SAP BODS to create jobs, workflows, scripts, and datastores to load the data quality tables.
  • Involved in data quality analysis.
  • Prepared data quality rules and business-level data checks, and generated reports based on the EDW data at different stages of the project.
  • Worked on Teradata to prepare SQL for validating the data in EDW tables and semantic views.
  • Wrote and executed scripts to validate and test loads and to check data integrity between the source and target databases (see the sketch after this list).

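One hedged example of the source-to-target integrity scripts mentioned above; the schema, table, and column names (src.sales_txn, edw.sales_txn, txn_id) are placeholders rather than the client's EDW objects.

  -- Placeholder names. Any row returned indicates a source/target mismatch after the load.
  SELECT txn_id, txn_amt, txn_dt FROM src.sales_txn
  MINUS
  SELECT txn_id, txn_amt, txn_dt FROM edw.sales_txn;

  -- Duplicate check on the target's business key.
  SELECT   txn_id, COUNT(*) AS dup_cnt
  FROM     edw.sales_txn
  GROUP BY txn_id
  HAVING   COUNT(*) > 1;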