Informatica Developer/Informatica Cloud Developer Resume
Tampa, FL
SUMMARY
- Substantial experience in the areas of requirement analysis, dimensional data modeling, data profiling, ETL/ELT design, development and deployment for implementing data integration solutions.
- Experience in interacting with business users to create and/or understand requirements and transform them into logical, physical data models, ETL/ELT and BI solutions.
- Strong background in design, development, testing, implementation and support of end-to-end (data extract to data at rest) data integration solutions for Enterprise Data Warehouse (EDW) systems.
- Good experience working with both Waterfall and Scrum-based Agile SDLC methodologies.
- Good knowledge of OLTP and OLAP concepts and terminology such as 3NF data models, Star and Snowflake schemas (Kimball methodology), Facts, Dimensions, ODS and Data Marts.
- Experience in conceptual & logical data modeling and in translating them to physical models such as database design, database de-normalization and database objects (schemas & tables) creation.
- Experience in creating database schemas, queries, joins, views, indexes, stored procedures, triggers, sequences and subqueries using PL/SQL, T-SQL, PostgreSQL and DB2 SQL.
- Experience in performing Data Profiling, Data Analysis, Data Quality Assessment and Ad-hoc querying against large datasets belonging to source and target databases or files.
- Extensive experience in translating business or functional requirements into ETL technical specifications and creating source to target mapping specifications/documents.
- Experience in working with various ETL Tools such as Informatica PowerCenter 9.x, IBM InfoSphere DataStage 8.x and SSIS 2008/2010/2012.
- Hands-on experience in extracting data from multiple sources such as Flat Files (delimited & CSV), XML Files, MS SQL Server, Oracle and legacy systems such as AS400 & z/OS.
- Good understanding in working with MPP databases such as Teradata, Netezza and Greenplum as ETL targets for large Data Warehouses.
- Good knowledge of Microsoft Azure Cloud concepts such as Azure SQL Data Warehouse, Azure Data Lake and Azure Analytics, and of using SOAP Web Services/REST APIs in Informatica Cloud Services (ICS).
- Good knowledge of various phases of STLC with respect to ETL testing, such as Smoke testing, Functional testing, System Integration testing, Performance and Regression testing.
- Extensive hands-on experience in working with job scheduling tools such as Autosys, Control-M and TWS and with version control tools such as Tortoise SVN, Microsoft TFS and GitHub.
- Strong knowledge of performance tuning, optimization & troubleshooting of SQL queries across Oracle, SQL Server and DB2 relational database systems (RDBMS).
- Expertise in creating, publishing, updating and maintaining process guidelines for key ITIL disciplines such as Incident management, Change management and Release management for ETL production deployments.
TECHNICAL SKILLS
Programming Languages: UNIX/Linux Shell Scripting, PL/SQL, T-SQL, PostgreSQL 9.1, Windows Batch Scripting, PowerShell Scripting.
Tools: Informatica PowerCenter 9.x, IBM® Data Studio 4.1.0.0, Advanced Query Tool 7.0, IBM iSeries Navigator 12.0, Oracle SQL Developer, WinSQL 5.2, Toad 12.8.0.49, MicroStrategy 9.x, Autosys 4.5, BMC Control-M, PuTTY 0.60, FileZilla 3.7.3, JIRA 4.0, Apache SVN 1.8.2, MS Project 2013, HP Service Manager v4.2, MS Office Suite 2013.
Databases: Teradata 13.0, SQL Server 2005/2008/2008 R2/2010/2012, DB2/400, DB2 for z/OS, Oracle 10g/11g, MS Access 2012, Greenplum 4.3.1.
PROFESSIONAL EXPERIENCE
Confidential, Tampa, FL
Informatica Developer/Informatica Cloud Developer
Environment: Informatica PowerCenter 9.6.1 HF4, Informatica Cloud Spring 2017, Oracle 12c (Oracle Exadata), OBIEE 11g, MS SQL Server 2016, Azure SQL Data Warehouse, Linux 3.10.
Responsibilities:
- Collaborated with Enterprise IT leadership, project managers, business systems analysts, developers, architects and portfolio directors to understand and discuss various Data Integration initiatives.
- Participated in Data Integration design reviews with Enterprise Architecture team prior to the solution development phase.
- Created integration task flows between multiple applications (external vendor systems) and SAP-ERP systems utilizing Informatica Cloud Services (ICS).
- Participated in project walkthroughs and design discussions of Lease Accounting ERP integration project with External Vendors, Project managers, Enterprise Architects and Business System analysts.
- Developed ETL mappings using Informatica Cloud Services to extract data from flat files and load into SQL Server 2016 database targets.
- Worked with Senior developers & Application Architects to finalize architectural and technical requirements of new ETL Integrations for Lease Accounting and EDW Sales data integration projects.
- Created mappings using Informatica Cloud Services to extract data from SQL Server 2016 staging tables and load into flat file targets.
- Maintained Informatica PowerCenter ETL source code, Oracle 12c DDL scripts using versioning systems such as Microsoft TFS and SQL Server 2016 DDL scripts using Visual Studio Git.
- Interacted with external vendors to understand and discuss technical requirements of Lease Accounting ERP integration project to ensure that vendors are aligned on internal IT standards and policies.
- Worked with Informatica Cloud Services components such as task flow, mapping configuration task, mapping designer, data synchronization and data replication tasks.
- Performed CRUD operations with DDL and DML commands on Oracle 12c and SQL Server 2016 database tables using PL/SQL and T-SQL commands respectively.
- Supported admins in migrating/promoting the Informatica PowerCenter and Informatica Cloud ETL code from lower environments to higher environments.
- Assisted the QA team in defining Test plan, in devising Test case scenarios, in performing ETL testing and data validation as per business, functional and technical requirements.
- Participated in code review sessions with Application Architects to ensure that Informatica PowerCenter ETL and Informatica Cloud ETL coding and naming standards are met.
- Created ETL mappings using Informatica Cloud Services to extract/synchronize data from the Salesforce (SFDC) platform and load it into the SQL Server 2016 database, and synced data between SFDC and Azure SQL Data Warehouse using Informatica Cloud Real Time (ICRT).
- Performed production data analysis on ODS and EDW Oracle 12c tables using PL/SQL queries to determine the root causes for production data issues on ad-hoc basis.
- Created source to target ETL mapping specs and transformation rules to support ETL/ELT development for EDW Sales and Lease Accounting ERP Data Integration projects.
- Worked with fellow developers within the system integrations team to determine appropriate technical requirements for creating various ETL solutions within Informatica landscape.
- Worked with Business Systems Analysts and Senior developers to define strategies to fix historical data on Oracle Exadata platform using Informatica PowerCenter ETL solution.
- Developed ETL mappings using Informatica PowerCenter to generate parameter files and to update/fix the problems with historical data in EDW and therefore in OBIEE BI reporting layers.
- Worked with various Linux commands and created Linux scripts to rename, move files and to automate file transfers.
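The file rename/move automation described above can be sketched as a small POSIX shell function; the directory layout and the date-stamp naming convention are illustrative assumptions, not the actual production script:

```shell
# stage_extracts: move *.csv extracts from an inbox into a work directory,
# adding a YYYYMMDD stamp to each name so reruns do not collide.
# Both paths are hypothetical, passed in by the caller.
stage_extracts() {
    inbox=$1
    workdir=$2
    mkdir -p "$workdir"
    stamp=$(date +%Y%m%d)
    for f in "$inbox"/*.csv; do
        [ -e "$f" ] || continue          # skip the literal glob when inbox is empty
        base=$(basename "$f" .csv)
        mv "$f" "$workdir/${base}_${stamp}.csv"
    done
}
```

In practice a script like this would be wired into the Informatica pre/post-session command hooks or a scheduler job rather than run by hand.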
Confidential - Sunrise, FL
Application Architect
Environment: Informatica PowerCenter 9.6.1 HF1, IBM DB2 for z/OS, UNIX, BMC Control-M.
Responsibilities:
- Worked on ETL projects which were based on Scrum framework for Agile methodology.
- Analyzed various Informatica mappings and mapplets to understand the implementation of existing (ETL architecture, design and code semantics) business rules across different consumer markets.
- Worked with Stakeholders, Product Owners, Business Analysts and System Analysts to gather and/or understand business requirements and to define, create, update, test and maintain consumer market based business rules.
- Assisted business analysts, stakeholders and technical project managers in performing data validation as per business scenarios during UAT and post production deployment.
- Supported ETL code migration activities including production deployment and validated Informatica ETL code post deployment.
- Maintained business parameters and transformation/business rules for various markets using DB2 database tables and Informatica reusable mapplets respectively.
- Evaluated data quality by analyzing large volumes of data using SQL queries and resolved data quality issues by fixing the ETL Informatica transformation rules/code (data quality rules) wherever necessary.
- Suggested measures and provided guidelines to fix/improve the data quality based on issues identified during analysis of production data.
- Fixed transformation/business rules in Informatica transformations and tested the ETL code across multiple environments such as Dev and QA/UAT to achieve consistent results.
- Extensively queried mainframe DB2 tables using SQL queries to fulfill ad-hoc reporting, production data analysis and data extract requests.
- Analyzed DB2 stored procedures to debug a production data issue and generated stored procedure output to replicate the issue in lower environments and determine the appropriate type of fix.
- Created various MS Excel (delimited & CSV) reports and XML reports on ad-hoc and regular basis to provide data extracts containing customer financial information to business stakeholders.
- Implemented re-usable business/transformation rules using Informatica PowerCenter mapplets across international consumer markets to report customer credit data to various credit bureaus.
- Worked with Informatica designer components such as Source Analyzer, Target Designer, Mapplet Designer, Transformation Developer and Mapping Designer.
- Worked on extracting data from all potential source systems and extensively performed data lineage analysis using Informatica to track data origin, data movement and data transformation to enable data debugging.
- Extensively used Informatica debugger to perform step-wise debugging and to regenerate lost output.
- Used various UNIX commands to edit different types of files, move files into and out of UNIX server and to remove control M/junk characters in UNIX files.
- Worked with XML Parser and XML Generator transformations using Informatica PowerCenter to parse and generate XML files.
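The control-M character cleanup mentioned above typically reduces to a short tr filter; a minimal sketch (the in-place temp-file pattern is an assumption about how the cleanup was applied):

```shell
# strip_crlf: remove carriage-return (^M) characters left behind by
# Windows-to-UNIX file transfers, rewriting the file in place via a
# temporary copy so a failed run never truncates the original.
strip_crlf() {
    file=$1
    tr -d '\r' < "$file" > "$file.tmp" && mv "$file.tmp" "$file"
}
```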
Confidential, Scottsdale, AZ
ETL Informatica Developer
Environment: Informatica PowerCenter 9.5.1, SQL Server 2008 R2, Oracle 11g, Oracle EBS 11i, Linux, MicroStrategy 9.5.1
Responsibilities:
- Involved in full phase EDW implementation right from dimensional data modeling, database design & implementation along with design and development of ETL/ELT solution.
- Extensively created T-SQL and PL/SQL queries to perform source data analysis and data profiling to evaluate source data format and data quality.
- Analyzed and worked on various design approaches for creating different dimensions and facts in the process of building a data mart.
- Worked with Business Analyst, Information Management Architect and BI Architect to finalize functional and technical requirements for Member Integration EDW.
- Analyzed source data and performed dimensional data modeling to create an EDW Star schema from flat file source structure.
- Defined, designed, developed and implemented conceptual, logical and physical Star schema data model for Member Integration EDW.
- Worked with project stakeholders and business analysts to understand business and functional requirements and translated them into ETL/ELT technical specification and design.
- Designed, developed and implemented Fact tables and SCD type-1 dimension tables in Star schema for Member Integration EDW.
- Participated in ETL/ELT requirement design activities to support development of Member Integration EDW through data modeling (logical and physical) and ETL/ELT solution creation activity.
- Worked with Reporting Analyst to identify and understand source data systems, source data files and data organization.
- Analyzed business user requirements to define data requirements for new analytics and reporting projects.
- Developed source to target mapping specs and transformation rules to support ETL/ELT development.
- Created Informatica 9.5.1 ETL mappings to integrate Bonfils, Blood Source, LifeShare, LifeStream and CBS data from flat files to Oracle EDW.
- Created Informatica 9.5.1 ETL mappings to integrate KRONOS and RUT source data from MS SQL Server to CTS DataMart on Oracle EDW.
- Created Informatica 9.5.1 ETL mappings to integrate from Oracle EBS source data to EDW DataMart on Oracle EDW.
- Worked with Informatica 9.5.1 tasks such as session, email, command and event wait tasks.
- Worked with conformed and role-playing dimensions while creating Star schema for Member Integration EDW & CTS DataMart.
- Created Oracle PL/SQL triggers and sequences using Toad 12.8 to generate surrogate keys for EDW dimension tables.
- Used Informatica debugger to evaluate code, monitor data movement and troubleshoot the mappings.
- Performed CRUD operations using DDL and DML PL/SQL commands on Oracle RDBMS tables.
- Prepared and executed ETL unit test plans and unit test cases to resolve data issues with Member Data Integration EDW.
Confidential - Madison, WI
Data Warehouse Analyst/Lead
Environment: Informatica PowerCenter 9.6, Oracle 11g, Greenplum 4.3.1, PostgreSQL 9.1, Autosys v4.5
Responsibilities:
- Participated in Technical design discussions and walkthroughs of Fusion initiative’s Data Warehouse and Big data projects.
- Analyzed the business requirements to determine In-scope and out-of-scope development items of Fusion program’s Data Warehouse and Big Data projects.
- Involved in managing, leading, assisting and guiding the offshore team in devising test plans & test cases to perform ETL testing.
- Queried Oracle and Greenplum tables using PL/SQL and PostgreSQL respectively to analyze various data sets.
- Extensively worked on Greenplum using PostgreSQL to mock up data, perform data analysis and address business questions.
- Worked with Informatica PowerCenter components such as Source Analyzer, Target Designer, Transformation developer, Mapping Designer.
- Led the development and test efforts and coordinated with onshore and offshore resources.
- Worked for Fusion Data Warehouse projects based on Scrum framework for Agile methodology.
- Created ETL mappings to load data from various external sources like Flat files, Oracle, SQL Server into Greenplum database.
- Performed ELT activity from Oracle to Greenplum using Greenplum utilities and shell scripts.
- Created ad-hoc queries, views and functions in Greenplum to facilitate data availability and access for project stakeholders, business and reporting analysts.
- Prepared and reviewed the necessary test strategy and test plans for ETL and Big Data Projects.
- Participated in ETL code walkthrough sessions to understand and determine the best data Extract, Transform and Loading approach from source to target system.
Confidential - Tampa, FL
Consultant
Environment: Informatica PowerCenter 9.1, Linux/UNIX, SQL Server 2010/2012, Oracle 11g, Autosys v4.5.
Responsibilities:
- Involved in implementation of Loan IQ data mart right from data modeling, database design & implementation along with design and development of ETL/ELT solution.
- Created and maintained ETL process run books, deployment guidelines & support documents to support incident, change management and release management for ETL production deployments.
- Created and executed Unit, SIT, Smoke, Functional, Positive, Negative, Performance, Regression test plans, test cases for testing ETL mappings in Dev, QA, UAT & Stage environments.
- Developed ETL solutions using Informatica PowerCenter 9.1 for Loan IQ data mart and supported code migration of those ETL objects into UAT, stage and production environments.
- Created, scheduled and system tested the Autosys JIL scripts & the Autosys scheduler flow to execute the ETL workflows in Loan IQ Data Mart across Dev, Test and Prod environments.
- Created windows PowerShell and Batch scripts to delete windows server logs, to restart windows services and to execute T-SQL stored procedures in MS SQL Server.
- Worked with Informatica Designer components such as Source Analyzer, Target Designer, Transformation developer, Mapping Designer.
- Created shell scripts to transfer files, move file content, automate job runs and to archive files in UNIX server and configured Informatica pre/post-session components to run UNIX commands/shell scripts.
- Created and implemented logical and physical data model involving SCD type 1 dimensions for Loan IQ Data Mart.
- Extensively performed source data analysis using T-SQL queries to determine the impacts of change in source system to the existing Informatica ETL mappings and workflows.
- Used Oracle SQL developer to query NCDI Oracle data sources using PL/SQL to understand source data organization and data quality.
- Created and executed T-SQL queries using SQL Server Management Studio to perform post deployment data validation testing.
- Created SQL Server database tables, indexes, views, stored procedures, primary & foreign keys using T-SQL to implement a physical data model for Loan IQ DataMart.
- Developed Informatica 9.1 mappings, sessions and workflows (technical solutions) that extract, transform and load data from and to multiple data sources & targets such as Flat files, SQL Server and Oracle RDBMS.
- Prepared, reviewed ETL test cases, test plans and SQL test scripts to assist business analysts for data validation during SIT, functional and UAT testing phases.
- Created and published source to target mapping documents to reflect data transformation logic and data flow.
- Debugged ETL mappings using Informatica debugger to evaluate code and to view transformation output.
- Actively participated in ETL technical design & code review sessions and maintained ETL code using Informatica PowerCenter and Tortoise SVN versioning systems.
- Worked on Scrum software development framework in Agile project methodology with hands-on experience in using Atlassian JIRA issue tracking tool for Agile projects.
- Used Informatica PowerCenter repository manager to compare ETL code, to export and import ETL objects into various environments.
- Worked extensively with Informatica PowerCenter 9.x transformations like Source Qualifier, Expression, Filter, Router, Lookup, Update Strategy, Sequence Generator, Joiner and Union transformations.
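The file-archiving step called out above (compress processed files past a retention window and move them aside) might look roughly like this; the 7-day retention default and directory names are assumptions for illustration:

```shell
# archive_processed: gzip files older than a retention window in a
# landing directory and move the compressed copies into an archive
# directory. Retention (days) defaults to 7 -- an assumed policy.
archive_processed() {
    landing=$1
    archive=$2
    days=${3:-7}
    mkdir -p "$archive"
    find "$landing" -maxdepth 1 -type f -mtime +"$days" | while read -r f; do
        gzip "$f" && mv "$f.gz" "$archive/"
    done
}
```

A job like this would normally be scheduled (e.g. via Autosys) after the nightly ETL load completes.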
Confidential
Applications Developer I
Environment: Informatica PowerCenter 9.1, SQL Server 2008/2008 R2, Oracle10g, IBM DB2/400, MicroStrategy 9.1.
Responsibilities:
- Extensive knowledge of HIPAA compliance and guidelines for keeping health information private and secure.
- Strong knowledge in working with Healthcare PHI data especially with Healthcare/Pharmacy claims data along with training and certification in Pharmacy Benefits Management (PBM) level 0 and level 1.
- Worked with PharmMD (external vendor/client) to effectively load, validate and integrate MTM cyclic data from source files into Enterprise Data Mart and generated Medication Therapy Management weekly/monthly extracts for the vendor.
- Extensively queried data sources (OLTP) and Data Warehouse (OLAP) using SQL to conduct source and target data analysis on pharmacy (Medicare & Medicaid) claims data involving ICD-9 Medical diagnosis codes.
- Handled regular and ad-hoc data load, data extract and report requests and generated MS Excel reports and flat files for the Medication Therapy Management (MTM) DataMart.
- Analyzed RxClaim source and target data systems to develop ETL & ELT layer using Informatica PowerCenter 9.1.
- Worked with Informatica PowerCenter 9.1 designer components such as Source Analyzer, Target Designer, Transformation Developer and Mapping Designer.
- Worked extensively on Informatica Workflow Manager, Workflow Monitor to create, edit and run workflows, tasks, shell scripts.
- Created ETL mappings and workflows using Informatica PowerCenter 9.1 to move data from disparate sources, viz. legacy systems (IBM iSeries AS400), relational databases (SQL Server & Oracle) and flat files, to common targets (DB2) such as Staging, EDW and other Data Marts.
- Served as Subject Matter Expert for the Medication Therapy Management, E-Biz and CSRF - SFDC Data Marts.
- Operated and maintained Eligibility, Enterprise Claims Data Mart (EDM - Claims) and other Enterprise Data Warehouse projects using Informatica PowerCenter 9.1.
- Executed SQL queries using Advanced Query Tool (DB2) to perform post deployment data validation.
- Generated ad-hoc Business Intelligence reports for end business users using MicroStrategy 9.1.
- Handled ad-hoc data load requests by manually running ETL processes on demand and created DB2 SQL queries to handle ad-hoc data & report requests.
- Worked on rotational support model, provided direct & on-call support for production EDW-ETL Informatica load processes and coordinated support tasks with onsite and offshore team.
- Actively supported EDW-ETL production releases, performed post deployment support and issue troubleshooting.
- Performed troubleshooting of Informatica 9.1 ETL workflow failures and handled production incidents.
- Created UNIX Shell scripts and Windows Batch scripts to automate file transfers from the Data Warehouse to the customer file landing gateway/zone.
- Prepared and published Data Warehouse load (ETL) process support run books with all the job details, known errors and troubleshooting steps.
- Maintained issue tracker to track ongoing EDW (ETL&ELT) production issues and their resolution methods.
- Worked with Change data capture (CDC) for AS400 (DB2/400) sources and with SCD Type 2 dimensions for EDW targets.
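A landing-zone delivery step like the file-transfer automation described above can be sketched as follows; the landing path and the .md5 sidecar convention are illustrative assumptions (the real job pushed files through a customer gateway):

```shell
# deliver_to_landing: copy an extract into the customer landing zone and
# write an MD5 checksum sidecar so the receiving side can verify the
# transfer completed intact. Paths here are hypothetical.
deliver_to_landing() {
    src=$1
    landing=$2
    mkdir -p "$landing"
    cp "$src" "$landing/" || return 1
    name=$(basename "$src")
    # Write the checksum from inside the landing dir so the .md5 file
    # records a bare filename rather than an absolute path.
    ( cd "$landing" && md5sum "$name" > "$name.md5" )
}
```

The receiving system can then run `md5sum -c extract.dat.md5` in the landing directory to confirm the file arrived uncorrupted.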