Sr. Informatica/ETL Developer Resume
Emeryville, CA
SUMMARY
- Over 7 years of IT experience in the analysis, design, development, testing, maintenance and implementation of complex data warehousing applications using ETL tools such as Informatica and databases such as Oracle and SQL Server 2005 in Windows and UNIX environments.
- Over 6 years of strong experience implementing the Extraction, Transformation & Loading (ETL) life cycle using Informatica PowerCenter/PowerExchange v9.x/8.x.
- Extensively worked with PL/SQL in Oracle 11g/10g/9i, PostgreSQL, SQL Server, DB2 and Teradata, including procedures, functions, triggers and SQL*Plus, as well as the SFDC application.
- Experience in Data masking of sensitive elements using ILM (Information Lifecycle Management)
- Experience with Informatica Designer, Informatica Workflow Manager, Workflow Monitor and Repository Manager.
- Strong experience in identifying user requirements, designing systems, writing program specifications, and coding and implementing systems.
- Worked on data analysis and profiling for source and target systems and good knowledge of Data Warehousing concepts, staging tables, Dimensions, Facts and Star Schema.
- Expertise in developing and running Mappings, Sessions/tasks, Workflows, Worklets and Batch processes on Informatica server.
- Involved in Database design, entity relationship modeling and dimensional modeling using Star and Snowflake schemas.
- Experience with data modeling using Erwin 7.x/4.x tools.
- Extensively worked with mappings using transformations such as Filter, Joiner, Router, Source Qualifier, Expression, Normalizer, Union, Update Strategy, Unconnected/Connected Lookup and Aggregator, as well as SCD Types 1, 2 and 3.
- Very good knowledge of reporting tools such as OBIEE, BO and Tableau.
- Experience in unit testing each developed mapping.
- Extensively worked on creating, executing test cases and test scripts using Manual/Automated methods.
- Good working knowledge of Informatica CDC.
- Worked in Production support team for maintaining the mappings, sessions and workflows to load the data in Data warehouse.
- Excellent technical and analytical skills with a clear understanding of the design goals of ER modeling for OLTP and dimensional modeling for OLAP.
- Experience in UNIX Shell Scripting.
- Sound theoretical and practical background in the Principles of Operating systems and Multi-threaded applications.
- Worked on Big Data, with experience in HDFS, MapReduce jobs, Hive, Sqoop, NoSQL and Pig.
- Experience in integration of various data sources like Oracle, DB2, Teradata, SQL Server, MS Access, XML and Flat files into the Staging Area.
- Effectively tuned ETL frameworks with hash partitioning and sorting for performance and scalability.
- Strong knowledge of other ETL tools (DataStage 8.1, Talend 4.2) and working knowledge of the reporting tools BO and OBIEE.
- Extensively used TOAD 12.5/9.0/8.5 to access Oracle databases and Control Center to access DB2 databases.
TECHNICAL SKILLS
OPERATING SYSTEMS: Windows XP Professional, Windows Server 2003, HP-UX, Linux
ETL TOOL: Informatica PowerCenter v9.x/8.x/7.x, DataStage 8.1, Talend 4.2, SQL*Loader, ILM (Information Lifecycle Management), IDQ
DIMENSIONAL DATA MODELING: Data modeling, star schema modeling, snowflake modeling, fact and dimension tables, physical and logical data modeling, ERwin 4.1.2/3.x, Oracle Designer, ER Studio
REPORTING TOOLS: OBIEE, BO, Tableau
SCHEDULING TOOLS: Tidal, Dollar Universe
LANGUAGES: C, C++, PL/SQL, SQL
SCRIPTING LANGUAGES: JavaScript, UNIX Shell, Python
DATABASES/BIG DATA: Oracle 9i/10g/11g, SQL Server 2005/2008, MS Access, DB2, Teradata, Hive, NoSQL, Pig, MapReduce
OTHER TOOLS: Autosys, TOAD, PuTTY, Telnet, WinSCP, Erwin 7.x/4.x, Remedy, DB Solo 4, MS PowerPoint, Visio, SharePoint, Mercury Quality Center, version control tools (PVCS), SQL Developer, CAD 2000/02/05/CAM, ANSYS 5.4
PROFESSIONAL EXPERIENCE
Confidential, Emeryville, CA
Sr. Informatica/ETL Developer
Responsibilities:
- Designing a new architecture for Confidential membership billing.
- Creating new stories per business requirements and providing solutions for them.
- Loading flat files into the staging area and performing data profiling on insurance-related data.
- Incrementally pulling Confidential travel and invoice-related data from Sybase into the Oracle staging area (see the sketch below).
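A minimal sketch of the watermark-driven incremental pull described above, assuming a control table that stores the last extraction timestamp; all object names are hypothetical, with src_invoice standing in for the Sybase source:

```sql
-- Hypothetical sketch of a watermark-driven incremental pull into staging.
-- Object names are illustrative; src_invoice stands in for the Sybase
-- source (e.g. exposed through a gateway or extracted to a work table).
INSERT INTO stg_invoice (invoice_id, member_id, amount, updated_ts)
SELECT s.invoice_id, s.member_id, s.amount, s.updated_ts
FROM   src_invoice s
WHERE  s.updated_ts > (SELECT last_extract_ts
                       FROM   etl_control
                       WHERE  job_name = 'INVOICE_INCR_LOAD');

-- Advance the watermark only after a successful load.
UPDATE etl_control c
SET    c.last_extract_ts = (SELECT NVL(MAX(updated_ts), c.last_extract_ts)
                            FROM   stg_invoice)
WHERE  c.job_name = 'INVOICE_INCR_LOAD';
COMMIT;
```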
Confidential, San Jose, CA
Sr. Informatica/Hadoop Developer
Responsibilities:
- Writing technical design documents based on business requirements.
- Analyzing all ETLs and preparing the TSD based on mapping complexity.
- Prepared an architecture document for migrating existing Teradata Informatica workflows to a Hadoop-compatible architecture.
- Loading data from Oracle into HDFS.
- Pulling history data from Teradata and Oracle using Sqoop and loading into Hive tables.
- Converting ETLs to Informatica Big Data Edition.
- Creating external Hive tables and writing custom Hive queries that are pushed down to Hadoop (illustrated in the sketch below).
- Creating incremental mappings in Informatica Big Data Edition for huge volumes of daily incremental data.
- Pulling incremental data from Teradata and loading it into external Hive tables using merge logic every day.
- Loading flat-file data into HDFS incrementally, implementing all business logic in Hadoop and pushing the results to Teradata for reporting.
- Used Informatica mappings extensively to load data using transformations such as Source Qualifier, Expression, Router, Union, Joiner and Filter.
- Performance-tuning Hive SQL, aiming to match the runtimes of the corresponding Teradata SQL.
- Loading incremental/processed data from Hadoop to Teradata (3NF tables).
Environment: Informatica 9.6.1, Informatica Big Data Edition, Sqoop, Hive 0.13, Pig, MapReduce, MapR, Oracle RDBMS 11g, Teradata, WinSCP, TOAD 10, UNIX, PuTTY, Remedy, Dollar Universe, Tidal, Tableau
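A hypothetical HiveQL sketch of the external-table and daily-merge pattern used in this role. Hive 0.13 (per the environment above) has no MERGE statement, so the merge is expressed as an INSERT OVERWRITE that keeps the newest row per key; the table names and HDFS path are illustrative:

```sql
-- Hypothetical HiveQL sketch; table names and the HDFS path are illustrative.
CREATE EXTERNAL TABLE IF NOT EXISTS cust_incr (
  cust_id    BIGINT,
  cust_name  STRING,
  updated_ts STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/stage/cust_incr';

-- Daily merge: union history with the day's increment and keep the
-- newest version of each key (Hive 0.13 has no MERGE statement).
INSERT OVERWRITE TABLE cust_hist
SELECT cust_id, cust_name, updated_ts
FROM (
  SELECT u.cust_id, u.cust_name, u.updated_ts,
         ROW_NUMBER() OVER (PARTITION BY u.cust_id
                            ORDER BY u.updated_ts DESC) AS rn
  FROM (
    SELECT cust_id, cust_name, updated_ts FROM cust_hist
    UNION ALL
    SELECT cust_id, cust_name, updated_ts FROM cust_incr
  ) u
) ranked
WHERE rn = 1;
```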
Confidential, San Jose, CA
Sr. Informatica Developer
Responsibilities:
- Working with the business ops team and Business Analysts to gather actual requirements from the business and collaborating with them to write BRDs.
- Based on the BRDs, writing FSDs for each quarterly Confidential requirement according to time and resource availability.
- Worked in all phases of the PDLC, from requirements, design, development and testing through production support.
- Creating Informatica Mappings to load data using transformations like Source Qualifier, Aggregator, Expression, Router, Union, Joiner, Connected and Unconnected lookups, Filters, sequence generator and Update Strategy.
- Landing data safely from Teradata into the Oracle staging environment as part of daily and weekly incremental extraction jobs.
- Implemented analytical functions (see the sketch below).
- Identifying long-running jobs (Informatica jobs, Oracle procedures), working with the performance team and implementing changes based on its recommendations.
- Performed tuning of Informatica sessions by implementing partitioning at the Informatica level; increasing block size, data cache size and sequence buffer length with the help of DBAs; and using target-based commit intervals and SQL overrides.
- Modifying existing procedures and shell scripts based on new enhancements.
- Generating weekly and monthly Anomaly and DDR reports, validating them and sending them to the business team.
- Creating Uprocs and tasks and setting up dependencies in Dollar Universe to schedule and run workflows.
- Wrote and evaluated test cases for unit, integration and system testing; opened cases in QC and Remedy.
Environment: Informatica 9.6.1, Oracle RDBMS 11g, Teradata, shell programming, MS Visio, WinSCP, TOAD 9, UNIX, PuTTY, Remedy, Dollar Universe, OBIEE, BO
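A hypothetical sketch of the analytical-function work in this role, computing running totals and ranks over load statistics; the table and columns are illustrative:

```sql
-- Hypothetical use of Oracle analytical functions over load statistics.
-- The etl_run_stats table and its columns are illustrative.
SELECT job_name,
       run_date,
       rows_loaded,
       SUM(rows_loaded) OVER (PARTITION BY job_name
                              ORDER BY run_date)    AS running_total,
       RANK() OVER (PARTITION BY job_name
                    ORDER BY rows_loaded DESC)      AS volume_rank
FROM   etl_run_stats;
```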
Confidential, San Francisco, CA
Sr. Informatica Developer/Lead
Responsibilities:
- Gathering requirements from end users and conducting sessions with client SMEs to thoroughly understand client requirements.
- Prepared the mapping design document and STM based on the HLD (High-Level Design document); involved in preparing HLDs with Business Analysts.
- Extracted source data from MRM, Siebel and SFDC, loading it into the staging area (Oracle).
- Landing the cloud (Salesforce) data safely into the staging area (Oracle) using Informatica.
- Created Informatica Mappings to load data using transformations like Source Qualifier, Aggregator, Expression, Router, Union, Joiner, Connected and Unconnected lookups, Filters, sequence generator and Update Strategy.
- Merging Roche Canada data and Confidential data in the staging area based on survivor/victim rules, and finally loading into SFDC objects.
- Involved in disabling and enabling DCR rules in SFDC before and after executing ETL process.
- Implemented delta logic in the staging area.
- Developed Type 2 Slowly Changing Dimension mappings (see the SQL sketch below).
- Created mapping to create parameter files in UNIX to run the workflows
- Used parameters and variables extensively in all the mappings, sessions and workflows for easier code modification and maintenance and for incremental load
- Implemented incremental logic and error-handling mechanisms for all mappings in the production environment.
- Involved in Informatica IDQ implementation.
- Supporting UAT and fixing issues raised in Mercury Quality Center.
- Performed tuning of Informatica sessions by implementing partitioning at the Informatica level; increasing block size, data cache size and sequence buffer length with the help of DBAs; and using target-based commit intervals and SQL overrides.
- Wrote and evaluated test cases for unit, integration and system testing; opened cases in QC.
Environment: Informatica 8.6.1, Oracle RDBMS 10g, SFDC, PL/SQL, shell programming, WinSCP, TOAD, UNIX, PuTTY, Remedy, Force.com Explorer
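A hypothetical SQL sketch of the Type 2 SCD logic behind those mappings: expire the current dimension row when a tracked attribute changes, then insert the new version. All object names (dim_customer, stg_customer, the sequence) are illustrative:

```sql
-- Hypothetical Type 2 SCD logic: expire the current row, then insert
-- the new version. Object names are illustrative.
UPDATE dim_customer d
SET    d.eff_end_date = SYSDATE,
       d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.cust_id = d.cust_id
               AND    s.address <> d.address);   -- tracked attribute changed

-- Insert new versions for changed keys (just expired above) and brand-new keys.
INSERT INTO dim_customer
       (cust_key, cust_id, address, eff_start_date, eff_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.cust_id, s.address, SYSDATE, NULL, 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.cust_id = s.cust_id
                   AND    d.current_flag = 'Y');
COMMIT;
```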
Confidential, San Jose, CA
Sr. Informatica Developer
Responsibilities:
- Worked in all phases of the PDLC, from requirements, design, development and testing through production support.
- Prepared the mapping document based on the HLD (High-Level Design document); involved in preparing HLDs with Business Analysts.
- Prepared the production run-book for the support team.
- Extracted source data from SFDC and PostgreSQL into the staging area (Oracle), using the DB Solo 4 UI and Force.com.
- Landing the cloud (Salesforce) data safely into the staging area (Oracle) using Informatica Cloud Services.
- Created Informatica Mappings to load data using transformations like Source Qualifier, Aggregator, Expression, Router, Union, Joiner, Connected and Unconnected lookups, Filters, sequence generator and Update Strategy.
- Developed Slowly Changing Dimension Mappings of type-2
- Created a mapping to generate parameter files in UNIX to run the workflows (see the sketch below); used parameters and variables extensively in all mappings, sessions and workflows for easier code modification, maintenance and incremental loads.
- Performed tuning of Informatica sessions by implementing partitioning at the Informatica level; increasing block size, data cache size and sequence buffer length with the help of DBAs; and using target-based commit intervals and SQL overrides.
- Creating Uprocs and tasks and setting up dependencies in Dollar Universe to schedule and run workflows.
- Wrote and evaluated test cases for unit, integration and system testing; opened cases in QC, Remedy and RTN (RightNow).
Environment: Informatica 9.1.0, Oracle RDBMS 10g, PostgreSQL, SFDC, PL/SQL, shell programming, MS Visio, WinSCP, TOAD 9, UNIX, PuTTY, DB Solo 4, Remedy, RTN, Dollar Universe, ER Studio (Embarcadero)
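A hypothetical sketch of the source query behind the parameter-file-generating mapping: each output row becomes one line of an Informatica workflow parameter file carrying the incremental watermark. The folder, workflow and control-table names are illustrative:

```sql
-- Hypothetical source query for a mapping that writes an Informatica
-- parameter file; each row becomes one line of the file. The folder,
-- workflow and control-table names are illustrative.
SELECT '[DWH_FOLDER.WF:wf_daily_incr_load]' AS param_line FROM dual
UNION ALL
SELECT '$$LAST_EXTRACT_DATE=' ||
       TO_CHAR(last_extract_ts, 'MM/DD/YYYY HH24:MI:SS')
FROM   etl_control
WHERE  job_name = 'DAILY_INCR_LOAD';
```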
Confidential, Redwood City, CA
Informatica Developer
Responsibilities:
- Converted business requirements into technical requirement documentation (LSD and ETL design documentation).
- Designed the ETL framework, logging activity, control tables and error handling by interacting with the ETL integration consultants and architects.
- Developed the Informatica technical mapping document to pull data from different source systems and integrate them.
- Built workflow recovery, with mappings built for recovery, tuning and error handling.
- Debugged and performance-tuned sources, targets, mappings and sessions.
- Developed several reusable transformations, mapplets and workflow tasks.
- Developed Type 2 Slowly Changing Dimension mappings.
- Created transformations for loading data into targets, using SQL, Joiner, Update Strategy, Lookup, Union, Normalizer and other transformations.
- Interacted with SMEs to understand enterprise-level operations and processes.
- Created complex mappings using various transformation tasks and fine-tuned them.
- Built packages, procedures and functions using PL/SQL and performed performance tuning.
- Worked on Oracle utilities and features: SQL*Loader, UTL_FILE, SMTP, export/import, triggers, EXPLAIN PLAN, cursors, views, materialized views and error-handling mechanisms.
- Assisted with exception handling, records, arrays, table partitioning and bulk collects (see the PL/SQL sketch below).
- Responsible for deploying releases according to the project plan.
- Provided production support and monitored the loads.
- Created RFCs and SRs for enhancements and obtained deployment approvals.
- Wrote and evaluated test cases for unit, integration and system testing.
- Performed health checks to ensure the source and target data were accurate and valid.
Environment: Informatica 9.0/8.6.1, OBIEE, Oracle RDBMS 11g, PL/SQL, WinSQL, MS SQL Server, shell programming, MS Visio, WinSCP, TOAD 9, UNIX AIX, Erwin
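A hypothetical PL/SQL sketch of the bulk-collect load pattern with exception handling referenced above; the cursor, tables and batch size are illustrative:

```sql
-- Hypothetical bulk-collect load with row-level exception handling.
-- Table and cursor names are illustrative.
DECLARE
  CURSOR c_src IS
    SELECT cust_id, cust_name FROM stg_customer;
  TYPE t_src IS TABLE OF c_src%ROWTYPE;
  l_rows    t_src;
  bulk_errs EXCEPTION;
  PRAGMA EXCEPTION_INIT(bulk_errs, -24381);  -- ORA-24381: FORALL errors
BEGIN
  OPEN c_src;
  LOOP
    FETCH c_src BULK COLLECT INTO l_rows LIMIT 1000;  -- batched fetch
    EXIT WHEN l_rows.COUNT = 0;
    BEGIN
      FORALL i IN 1 .. l_rows.COUNT SAVE EXCEPTIONS
        INSERT INTO dim_customer (cust_id, cust_name)
        VALUES (l_rows(i).cust_id, l_rows(i).cust_name);
    EXCEPTION
      WHEN bulk_errs THEN
        -- Log each failed row instead of aborting the whole batch.
        FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
          DBMS_OUTPUT.PUT_LINE('Row ' ||
            SQL%BULK_EXCEPTIONS(j).ERROR_INDEX || ': ' ||
            SQLERRM(-SQL%BULK_EXCEPTIONS(j).ERROR_CODE));
        END LOOP;
    END;
  END LOOP;
  CLOSE c_src;
  COMMIT;
END;
/
```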
Confidential, Atlanta, GA
Sr. Informatica Developer/Lead
Responsibilities:
- Involved in analysis, requirements gathering, function/technical specifications and development, deploying and testing.
- Prepared LLDs based on the HLDs to meet the business requirements
- Created Informatica Mappings to load data using transformations like Source Qualifier, Aggregator, Expression, Router, Union, Joiner, Connected and Unconnected lookups, Filters, sequence generator and Update Strategy.
- Used parallel processing capabilities, Session-Partitioning and Target Table partitioning utilities.
- Extracted source data from IMS DB using PowerExchange.
- Used the Debugger in debugging some critical mappings to check the data flow from instance to instance.
- Developed the mapping to pull information from different tables, using a SQL override to join the tables instead of a Joiner transformation to improve performance (see the sketch below).
- Created parameter files in UNIX to run the workflows.
- Scheduled sessions to update the target data using Workflow Manager.
- Developed all the mappings according to the design document and mapping specs provided and performed unit testing.
- Reviewed and validated the ETL mappings and data samples loaded in the test environment for data validation.
- Incorporated policies/rules/plans for masking sensitive elements using the ILM (Information Lifecycle Management) tool.
- Performed tuning of Informatica sessions by implementing database partitioning; increasing block size, data cache size and sequence buffer length with the help of DBAs; and using target-based commit intervals and SQL overrides.
- Performed data validation, reconciliation and error handling in the load process.
- Worked on code migration from Informatica 8.6.1 to Informatica 9.0.1.
- Performed unit testing to validate the data loads in different environments.
- Resolved defects raised by the QA team and updated them in Quality Center.
- Provided support for daily and weekly batch loads.
Environment: Informatica PowerCenter 9.0.1/8.6.1, PowerExchange 9.0, ILM, TOAD, PL/SQL, flat files, IMS DB, Oracle 10g, DB2 UDB, mainframes, UNIX.
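A hypothetical sketch of such a Source Qualifier SQL override: the join executes in the database rather than in a Joiner transformation. The table names and the $$LAST_EXTRACT_DATE mapping parameter are illustrative:

```sql
-- Hypothetical Source Qualifier SQL override: the join runs in the
-- database instead of a Joiner transformation. Table names and the
-- $$LAST_EXTRACT_DATE mapping parameter are illustrative.
SELECT o.order_id,
       o.order_date,
       c.cust_name,
       p.product_name
FROM   orders o
JOIN   customers c ON c.cust_id    = o.cust_id
JOIN   products  p ON p.product_id = o.product_id
WHERE  o.order_date >= TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY')
```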
Confidential, Ann Arbor, MI
Informatica Consultant/Developer
Responsibilities:
- Involved in all phases of the PDLC, from requirements, design, development, testing, training and rollout to field users through production support.
- Performed data profiling on source system data and analyzed the current reports at the client side to gather requirements for design inception.
- Prepared high-level design documents such as the TAD and low-level documents such as ETL specification documents.
- Analyzed logical data models, forward-engineered the physical data models using the Erwin tool and executed them in the DEV environment.
- Worked with the new features in Informatica 8.6.1 and migrated all jobs from 7.x to 8.x.
- Designed and modified more than 20 Informatica ETL jobs.
- Modified existing jobs according to client requirements and loaded data into staging tables.
- Took steps to improve the quality of identity data.
- Designed jobs using complex stages such as Web Services Client (source), XML Input, Complex Flat File and Hashed File.
- Worked in the staging area, developing and supporting loads.
- Involved in designing relational models for ODS and data marts using Kimball methodology
- Extracted and transformed data from high volume data sets of delimited files and relational sources to load into target Database
- Used parameters and variables extensively in all the mappings, sessions and workflows for easier code modification and maintenance
- Analyzed existing SQL queries, tables and indexes for performance tuning and advised based on the loading time.
- Effectively used error-handling logic for data and process errors (see the sketch below).
- Performed performance tuning to increase throughput at both the mapping and session level for large data files by increasing the target-based commit interval.
- Prepared unit test reports and executed the unit testing queries.
- Supported UAT and fixed issues raised in QA.
- Provided post-production support for the project.
Environment: Informatica 8.6.1, Oracle 10g, WinSCP, PuTTY, TOAD, SQL Developer, TortoiseSVN.
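A hypothetical sketch of one form the data-error handling could take, using Oracle's DML error logging to divert bad rows to an error table instead of failing the load; table names are illustrative:

```sql
-- Hypothetical data-error handling via Oracle DML error logging:
-- bad rows land in an error table instead of failing the load.
-- Table names are illustrative.
BEGIN
  DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'TGT_SALES');
END;
/

INSERT INTO tgt_sales (sale_id, sale_date, amount)
SELECT sale_id, TO_DATE(sale_date_txt, 'YYYY-MM-DD'), amount
FROM   stg_sales
LOG ERRORS INTO err$_tgt_sales ('DAILY_LOAD')
REJECT LIMIT UNLIMITED;
```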
Confidential
SQL Consultant
Responsibilities:
- Worked with analysts and data source systems experts to map requirements to ETL code.
- Responsible for implementing data integration from source systems into Oracle data marts using stored procedures, functions and triggers (see the sketch below).
- Applied business and application knowledge to design the data loads for consistency and integrity.
- Worked with production load standards and error handling.
- Worked with IMS Data to validate Sample History module data. Assisted in performance tuning by running test runs.
- Created the transformation routines to transform and load the data. Tuned SQL queries for better performance.
- Unit-tested the code and migrated it from Dev to QA and Prod.
- Provided post-production support for the application.
Environment: Oracle 8i, PL/SQL, Erwin, TOAD, MS Visio, HP-UX, SQL*Loader, Windows NT
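A hypothetical sketch of a load procedure of the kind described above, moving validated staging rows into a data mart table; all object names are illustrative:

```sql
-- Hypothetical load procedure: moves validated staging rows into a
-- data mart table. All object names are illustrative.
CREATE OR REPLACE PROCEDURE load_sales_mart (p_batch_id IN NUMBER) AS
BEGIN
  INSERT INTO mart_sales (sale_id, region, amount, load_batch_id)
  SELECT s.sale_id, s.region, s.amount, p_batch_id
  FROM   stg_sales s
  WHERE  s.batch_id = p_batch_id
  AND    s.amount IS NOT NULL;          -- basic consistency check

  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    RAISE;                              -- surface the error to the caller
END load_sales_mart;
/
```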