
Sr. Test Data Engineer Resume


SUMMARY

  • Experience performing ETL operations (Data Extraction, Data Transformation, and Data Loading) with Informatica Power Center versions 10.x, 9.x, 8.x, and 7.x.
  • Experience in performing Data Analysis (Oracle, Microsoft SQL Server), Data Modeling (ER Studio), Data De-identification (following HIPAA rules to identify PHI/PII), data validation activities, environment profiling, and domain categorization.
  • Implemented test data to serve various IT environments (DEV, QA, UAT).
  • Worked on Test Data Generation and the TDM Governance Model, and implemented self-service capabilities for serving test data users.
  • Experience in all the phases of the Data Warehouse project life cycle.
  • Implemented Data Profiling and Data Masking methodologies using PL/SQL scripting, the CA TDM (Computer Associates Test Data Management) process, and GRID tools (Fast Data Masker, GT Data Maker, Javelin).
  • Performed database virtualization activities and built sandbox databases using the Delphix Self-Service portal.
  • Interacted with end users and functional analysts to identify and develop Business Specification Documents (BSD) and Source-to-Target Mappings (STM), and transformed them into technical requirements.
  • Worked with Business Analysts to understand the requirements and the Source-to-Target Mapping document.
  • Experience in Logical and Physical Data Modeling and Dimensional Data Modeling (Star Schema/Snowflake), including Fact and Dimension tables.
  • Experience with Informatica components such as transformations, reusable transformations, mappings, mapplets, sessions, Event Wait tasks, schedulers, worklets, workflows, and workflow monitoring.
  • Worked closely with the IT Environments process for creating stack environments (Development and QA/UAT).
  • Extensive knowledge of Informatica transformations such as Source Qualifier, Aggregator, Lookup (connected and unconnected), Rank, Joiner, Filter, Router, Sorter, Sequence Generator, Union, Update Strategy, Stored Procedure, Java, Normalizer, and XML transformations.
  • Experience working with databases such as Oracle, MySQL, DB2, and SQL Server.
  • Experience working with JIL files, JSON objects, flat files, and XML files.
  • Strong experience in writing SQL for Data Analysis.
  • Experience with performance tuning of database queries.
  • Extensively worked with Slowly Changing Dimensions (Type I and Type II); a minimal Type II sketch follows this list.
  • Good experience with Change Data Capture (CDC) for pulling delta data.
  • Experience in writing efficient SQL queries and PL/SQL scripts, and in fine-tuning queries.
  • Extensively worked on creating mappings using Informatica Designer and processing tasks using Workflow Manager to configure data flows from multiple data sources (relational, flat file, XML, and application) to targets.
  • Experience in the maintenance and enhancement of existing ETL processes, improving load processes for better efficiency.
  • Worked on project implementations in both Waterfall and Agile methodologies.
  • Ability to work independently with minimum supervision and initiative to learn new technologies and tools quickly.
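
A minimal sketch of the SCD Type II pattern noted above, driven by CDC delta rows. The customer_stg and customer_dim tables, their columns, and the customer_dim_seq sequence are hypothetical, for illustration only.

    -- Hypothetical SCD Type II load driven by CDC delta rows (Oracle SQL).
    -- Step 1: expire the current dimension row for each changed record.
    UPDATE customer_dim d
       SET d.eff_end_date = SYSDATE,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s            -- CDC delta rows
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.phone <> d.phone));

    -- Step 2: insert a fresh "current" version for each new or changed record.
    INSERT INTO customer_dim
           (customer_key, customer_id, address, phone,
            eff_start_date, eff_end_date, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, s.phone,
           SYSDATE, DATE '9999-12-31', 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');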

TECHNICAL SKILLS

  • 10.2/10.0/9.6.1/9.5.1/9.1.1/8.6.1/8.5/8.1.1/7.1.2 , IDQ 9.6
  • Requirement Analysis, Business Analysis, detail design, data flowdiagrams, data definition table, Business Rules, data modeling, Data
  • Warehousing, system integration.
  • Dimensional Data Modeling (Star Schema, Snowflake, FACT-Dimensions)Conceptual, Physical and Logical Data Modeling, ER Models, OLAP, OLTP concepts.
  • Data De-identification, Data Profiling, Data Masking, Data Domain
  • Categorization, Data Integration, CATDM, Data Sub-setting, GRID Tools, Fast Data Maker.
  • Oracle 12c/11g/10g/9i/8i, MY SQL, DB2, SQL Server 2012/2008.
  • (RDBMS) (PL/SQL Developer, SQL Server Management Studio, OracleDeveloper tool).
  • SQL, PL/SQL, Unix Shell Scripting, XML.
  • Windows, Unix.
  • Agile & waterfall methodology, MS Office Tools, ER Studio.

PROFESSIONAL EXPERIENCE

Confidential

Sr. Test Data Engineer

Responsibilities:

  • Responsible for creating functional, usable non-production environments from Production databases by eliminating PHI (Protected Health Information) and PII (Personally Identifiable Information) following HIPAA rules (Health Insurance Portability and Accountability Act of 1996).
  • Built appropriate test data sets to support various areas of the organization, including QA (Quality Assurance), Application Development (APP DEV) users, and business users in User Acceptance Testing (UAT) and training initiatives.
  • Coordinated with multiple stakeholders to ensure that test data needs were addressed proactively, appropriately testing critical business systems with production-like data that does not contain any PHI/PII.
  • Worked on ingesting data from Production environments into a staging area to create and maintain Test Data Management servers on Delphix Engines that serve the TDM team.
  • Performed the de-identification of data across the various data sources in the non-production environment.
  • Implemented environment profiling and PHI/PII profiling across the SQL Server and Oracle servers, mapping PHI columns to the PHI types in the seed list created in the TDM-owned Oracle database (Test Data Management team repository); see the profiling sketch after this list.
  • Created application-level data models (relationships among entities and attributes for Oracle and SQL databases) by analyzing business data flows, data rules (HIPAA), and database objects for IT-supported database systems under test, creating and maintaining the enterprise architecture using the ER Studio Data Architect application.
  • Created PL/SQL subsetting scripts to mask the PHI columns across various IT environments as part of the quarterly TDM Database Refresh process, based on business requirements provided by the SMEs, to provide input data for testing in non-prod environments (UAT/QA/DEV); a subsetting sketch follows this list.
  • Used the CA Technologies (Broadcom) Fast Data Masker tool to mask sensitive PHI/PII data in the Delphix-created staging database.
  • Built packages for the SQL code using the Jenkins continuous integration tool, and installed those packages using the CARA tool (Computer Associates Release Automation).
  • Performed data validation activities and audited the masked and subset data by executing customized SQL scripts in SQL Server Management Studio and Oracle SQL Developer; see the audit sketch after this list.
  • Worked on file transfers over the SFTP and FTP protocols between Windows and remote Unix servers using WinSCP.
  • Worked with Informatica admins to run pmcmd commands that execute Informatica workflows in the respective DEV environments.
  • Created and maintained procedure documentation for DevOps Test Data Management activities in a secured SharePoint web application.
  • Designed ETL Informatica mappings using the Power Center Designer tool, configured sessions and workflows using the Power Center Workflow Manager tool for data movement between multiple data sources (Oracle, SQL Server), and analyzed data movement using the Power Center Workflow Monitor tool.
  • Created JIL files and updated existing JIL jobs to automate the execution of ETL Informatica workflows using CA Workload Automation AE (AutoSys).
  • Worked closely with the Release Management team to create manifests in Team Foundation Server (TFS) Visual Studio to invoke the non-prod AutoSys jobs, import JIL files on UNIX servers, and execute PL/SQL queries as part of TDM routines.
  • Worked on automating PHI/PII analysis and the execution of CA Javelin jobs.
  • Developed various Informatica jobs to improve the process of identifying PHI/PII across several NON-PROD environments, involving bulk data movement and analysis across database objects.
  • Performed peer reviews on Test Data Management refresh jobs (PL/SQL scripts, DDLs, DMLs, Informatica jobs, FDM batch files, Javelin workflows) and provided the necessary troubleshooting updates to the team.
  • Worked on designing the automation preparation process, including Test Data Governance run books to implement test data management strategies, inventory lists for the TDM Refresh process, and repeatable DB tasks.
  • Implemented test data solutions by creating tiles for synthetic data generation using the GT Data Maker tool and CA TDM Portal to improve services for business, QA, and DEV users.
  • Designed a reusable Data Processor transformation using the Informatica Developer tool and integrated it into the TDM-owned Informatica Power Center repository to parse various file formats (JSON, XML, Excel, flat files, etc.).
  • Provided test data manifests and folders to the Release Management team periodically as part of the TDM Refresh process to execute the PL/SQL scripts, FDM batch files (masking PHI/PII data), and Informatica jobs using the TFS CARA tool.
  • Created multiple tickets in the CA Service Catalog and IT Service Management portals for the IT Security and DBA teams to access and modify database roles and grantee permissions for the TDM team's DB Refresh process, provisioning PHI-free data to APP/DEV users.
  • Generated physical databases from the subset and masked data in virtual databases using the Delphix V2P (Virtual-to-Physical) generation process.
  • Using the Delphix Self-Service portal, created multiple sandboxes from existing NON-PROD and PROD snapshots/copies of several Oracle databases, and worked on reverting, bookmarking, and reinstating those sandbox databases.
  • Cloned the physical masked/subset database to generate multiple virtual databases and introduced them to lower NON-PRODUCTION environments (APPLICATION, DEVELOPMENT, QA/UAT teams), providing those teams with PHI/PII-free data that complies with HIPAA and supports the organization's HITRUST certification.
  • Supported the DEV/UAT/QA teams by provisioning Delphix-generated vFiles, in which the application server and application tier are created and configured virtually for use against the VDBs (virtual databases).
  • Created the necessary test cases for unit testing to validate the Informatica jobs, subsetting scripts, masking batch files, and auditing SQL scripts.
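
A minimal sketch of the dictionary scan behind the PHI/PII profiling step above, assuming Oracle. The name patterns are illustrative; the real mapping was driven by the TDM seed list, not these hard-coded patterns.

    -- Hypothetical profiling query: scan the Oracle data dictionary for
    -- column names that look like PHI/PII candidates so they can be mapped
    -- to PHI types in the TDM seed list. Patterns are illustrative only.
    SELECT c.owner, c.table_name, c.column_name, c.data_type
      FROM all_tab_columns c
     WHERE c.owner NOT IN ('SYS', 'SYSTEM')
       AND (UPPER(c.column_name) LIKE '%SSN%'
            OR UPPER(c.column_name) LIKE '%DOB%'
            OR UPPER(c.column_name) LIKE '%PHONE%'
            OR UPPER(c.column_name) LIKE '%EMAIL%')
     ORDER BY c.owner, c.table_name, c.column_id;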
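
A minimal sketch of the subsetting-plus-masking pattern described above. The prod/nonprod schemas and the member and claim tables are hypothetical; the masking expressions stand in for the Fast Data Masker rules actually used.

    -- Hypothetical subsetting script: copy a slice of production data into
    -- the non-prod schema, masking PHI columns inline (Oracle SQL).
    INSERT INTO nonprod.member (member_id, first_name, last_name, ssn, dob)
    SELECT m.member_id,
           'FN_' || m.member_id,                            -- mask first name
           'LN_' || m.member_id,                            -- mask last name
           LPAD(MOD(ORA_HASH(m.ssn), 1000000000), 9, '0'),  -- hash-mask SSN
           TRUNC(m.dob, 'YYYY')                             -- coarsen DOB to year
      FROM prod.member m
     WHERE MOD(m.member_id, 10) = 0;                        -- ~10% subset

    -- Child rows follow the subset keys to preserve referential integrity.
    INSERT INTO nonprod.claim (claim_id, member_id, claim_amt)
    SELECT c.claim_id, c.member_id, c.claim_amt
      FROM prod.claim c
     WHERE EXISTS (SELECT 1 FROM nonprod.member n
                    WHERE n.member_id = c.member_id);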
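
A minimal sketch of the post-masking audit queries described above, reusing the same hypothetical tables as the subsetting sketch; expected results are noted in the comments.

    -- Hypothetical audit checks after masking and subsetting (Oracle SQL).
    -- Check 1: no production SSN should survive masking. Expect 0 rows.
    SELECT COUNT(*) AS leaked_ssn_rows
      FROM nonprod.member n
      JOIN prod.member p ON p.member_id = n.member_id
     WHERE n.ssn = p.ssn;

    -- Check 2: row counts should match the expected subset ratio (~10%).
    SELECT (SELECT COUNT(*) FROM nonprod.member) AS subset_rows,
           (SELECT COUNT(*) FROM prod.member)    AS source_rows
      FROM dual;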

Environment: Oracle 11g/12c, MSSQL Server 12.0/14.0/15.0, Windows Server, Unix Server, Informatica Power Center 10.4.0, 10.2.0 HF2, 9.6.x, 9.5.x, Informatica Developer, PL/SQL, ER Studio, Microsoft Visual Studio, Team Foundation Server (TFS), Delphix Self-Service portal, Computer Associates Release Automation tool (CARA), Jenkins, GRID Tools (Fast Data Masker, Javelin), CA Self Service Portal, AutoSys, WinSCP, PuTTY.

Confidential

Data Modeler / Data Analyst

Responsibilities:

  • Studied in-house requirements for the Data warehouse to be developed.
  • Conducted one-on-one sessions with business users to gather data warehouse requirements.
  • Analyzed database requirements in detail with the project stakeholders by conducting Joint Requirements Development sessions.
  • Developed a Conceptual model using Erwin based on requirements analysis.
  • Developed normalized Logical and Physical database models to design OLTP system for insurance applications.
  • Created dimensional model for the reporting system by identifying required dimensions and facts using Erwin r7.1.
  • Used forward engineering to create a Physical Data Model with DDL that best suited the requirements from the Logical Data Model; a DDL sketch follows this list.
  • Worked with Database Administrators, Business Analysts and Content Developers to conduct design reviews and validate the developed models.
  • Identified, formulated and documented detailed business rules and Use Cases based on requirements analysis.
  • Facilitated development, testing and maintenance of quality guidelines and procedures along with necessary documentation.
  • Responsible for defining the naming standards for the data warehouse.
  • Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch data from legacy DB2 and SQL Server database systems.
  • Exhaustively collected business and technical metadata and maintained naming standards.
  • Used Erwin for reverse engineering to connect to existing database and ODS to create graphical representation in the form of Entity Relationships and elicit more information.
  • Used Informatica Designer, Workflow Manager and Repository Manager to create source and target definition, design mappings, create repositories and establish users, groups and their privileges.
  • Used Model Mart of Erwin for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
  • Extracted data from the databases (Oracle and SQL Server, DB2, FLAT FILES) using Informatica to load it into a single data warehouse repository.
  • Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT).
  • Integrated the work tasks with relevant teams for smooth transition from testing to implementation.
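
A minimal sketch of the kind of star-schema DDL the forward-engineering step above would emit. The date_dim and policy_fact tables are hypothetical insurance-reporting examples, not the actual models.

    -- Hypothetical forward-engineered DDL: one dimension, one fact (Oracle SQL).
    CREATE TABLE date_dim (
        date_key     NUMBER(8)     PRIMARY KEY,   -- YYYYMMDD surrogate key
        calendar_dt  DATE          NOT NULL,
        fiscal_qtr   VARCHAR2(6)   NOT NULL
    );

    CREATE TABLE policy_fact (
        policy_key   NUMBER(12)    NOT NULL,
        date_key     NUMBER(8)     NOT NULL REFERENCES date_dim (date_key),
        premium_amt  NUMBER(12,2)  NOT NULL,
        claim_cnt    NUMBER(6)     DEFAULT 0,
        CONSTRAINT policy_fact_pk PRIMARY KEY (policy_key, date_key)
    );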

Environment: Erwin r7.1, Informatica 7.1.3, Windows XP/NT/2000, SQL Server 2005/2008, SQL, Oracle 10g, DB2, MS Excel, Mainframes, MS Visio, Rational Rose, Requisite Pro.

Confidential

Program Developer

Responsibilities:

  • Worked in a team setting on design and development activities using approved database and software development tools and methodologies.
  • Modified existing software to correct errors, adapt it to new hardware, upgrade interfaces, and improve performance.
  • Performed root cause analysis on issues and provided effective, timely technology resolutions.
  • Developed information systems by building and installing software solutions.
  • Performed analysis, design, development, unit testing, and documentation for small-to-medium system implementations.
  • Demonstrated working knowledge of Object-Oriented Analysis/Design and web development/architectural methodologies.
  • Designed and developed software systems, using scientific analysis and mathematical models to predict and measure outcomes and consequences of design.
