ETL Developer/Salesforce Data Migration Specialist Resume
Minneapolis, MN
SUMMARY
- Informatica Developer with 7+ years of IT experience in the analysis, design, development, implementation, and troubleshooting of Data Mart / Data Warehouse applications using ETL tools such as Informatica PowerCenter 9.x/8.x.
- Worked as a team member on multiple data warehousing and Salesforce.com data migration/integration engagements as an ETL Developer, using different versions of Informatica PowerCenter as the ETL tool.
- Extensive experience in Salesforce.com Data Migration using Informatica with SFDC Connector and Apex Data Loader.
- Involved in all phases of the data warehouse project life cycle. Designed and developed ETL architecture to load data from various sources such as DB2 UDB, Oracle, flat files, XML files, Sybase, and MS SQL Server into Oracle, Teradata, XML, and SQL Server targets.
- Designed, developed, and supported application solutions with a focus on Teradata architecture and its utilities, such as FastLoad, MultiLoad, FastExport, TPump, and BTEQ (a BTEQ sketch follows this summary).
- Expertise in implementing complex business rules by creating robust mappings, mapplets, shortcuts, and reusable transformations using Unconnected and Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, and other transformations.
- Good understanding of relational database management systems such as Oracle and SQL Server; extensively worked on data integration using Informatica for the extraction, transformation, and loading of data from various database source systems.
- Extensively worked with the Informatica tools Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
- Hands-on experience identifying and resolving performance bottlenecks at various levels, such as sources, targets, mappings, and sessions. Expertise in session partitioning and in tuning session and lookup/aggregator/joiner caches for performance.
- Good experience in performing and supporting Unit testing, System Integration testing, UAT and production support for issues raised by application users.
- Strong in UNIX shell scripting. Developed UNIX scripts using the pmcmd utility and scheduled ETL loads using utilities such as Autosys.
- Excellent technical and professional client-interaction skills. Interacted with technical, functional, and business audiences across different phases of the project life cycle.
- Enthusiastic about and quick to adapt to new technologies, with strong technical, analytical, problem-solving, and presentation skills.
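A minimal sketch of the kind of Teradata BTEQ wrapper script this summary refers to, assuming a ksh environment; the TDPID, logon credentials, and table name are hypothetical placeholders:

```sh
#!/bin/ksh
# Hypothetical BTEQ wrapper: run a post-load row-count check against Teradata.
bteq <<EOF
.LOGON tdprod/etl_user,etl_password;
SELECT COUNT(*) FROM edw.customer_dim;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.QUIT 0;
EOF
rc=$?
[ $rc -ne 0 ] && echo "BTEQ check failed with return code $rc" >&2
exit $rc
```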
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter 8.x/9.x
Reporting Tools: Cognos, Business Objects
Programming Skills: Shell Scripting, SQL, PL/SQL
Databases: Oracle 9i/10g/11g, SQL Server 2008/2005/2000, Teradata V2R4/V2R5, DB2
Methodologies: Data Modeling (Logical, Physical), Dimensional Modeling (Star/Snowflake)
Operating Systems: UNIX (Sun-Solaris, HP/UX, IBM AIX), Linux, Windows NT/XP
PROFESSIONAL EXPERIENCE
Confidential, MINNEAPOLIS, MN
ETL Developer/Salesforce Data Migration Specialist
ENVIRONMENT: Informatica PowerCenter 9.6.1, SUSE Linux, PowerExchange for Salesforce.com with Bulk API, DB2, Oracle 11g, MS SQL Server 2012, Teradata, Salesforce Data Loader, Autosys
RESPONSIBILITIES:
- Worked with the business to gather requirements and prepared requirements documents per Confidential standards.
- Created mapping specifications and unit test cases per the requirements.
- Worked with delimited and fixed-width flat files and loaded data into Salesforce.com using direct and indirect methods.
- Extensively used the Salesforce Bulk API to load 27 million customer records in less than 6 hours.
- Used the Salesforce pipeline lookup transformation to look up Salesforce objects directly.
- Used pass-through partitioning to improve job runtime during data normalization.
- Developed UNIX scripts for splitting/sorting files, masking data, and running workflows from the command line (see the file-prep sketch after this list).
- Extensively worked with different transformations such as Aggregator, Expression, Router, Filter, Lookup, and Sorter.
- Created reusable mapplets and transformations.
- Created user-defined functions and called them in multiple mappings.
- Used mapping variables to remove duplicates based on certain fields.
- Created Teradata external loader connections such as MLoad, MLoad Upsert, MLoad Update, FastLoad, and TPump in the Informatica Workflow Manager while loading data into target tables in the Teradata database.
- Created command tasks to invoke UNIX scripts from Informatica.
- Worked with workflow variables to pass values to the command task rather than passing hard-coded values.
- Used the upsert operation in Informatica to load data into Salesforce.com using a unique external ID.
- Created UNIX scripts to invoke the jobs using pmcmd and automated them with the Autosys scheduling tool.
- Implemented performance tuning on targets, sources, mappings, and sessions to provide maximum efficiency and performance.
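A minimal sketch of the kind of file-prep script described in this list, assuming ksh and a pipe-delimited extract with an SSN in the third field; the file names and record layout are hypothetical:

```sh
#!/bin/ksh
# Hypothetical pre-load file prep: sort, mask, and split a pipe-delimited extract.
IN=customer_extract.dat

# Sort by the first field (e.g., customer id) so downstream loads see ordered input.
sort -t'|' -k1,1 "$IN" > sorted.dat

# Mask the SSN in field 3, keeping only the last four digits
# (assumes the 123-45-6789 layout).
awk -F'|' 'BEGIN { OFS = "|" } { $3 = "XXX-XX-" substr($3, 8) } 1' sorted.dat > masked.dat

# Split into 500,000-line chunks so parallel sessions can process the pieces.
split -l 500000 masked.dat customer_part_
```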
Confidential, DALLAS, TX
ETL Developer
RESPONSIBILITIES:
- Prepared Mapping specifications and wrote Unit Test cases for end-to-end data validation.
- Designed and developed ETL mappings using transformation logic to extract data from SQL Server and Oracle.
- Designed and developed complex ETL mappings making use of Source Qualifier, Joiner, Update Strategy, Connected and Unconnected Lookup, Rank, Expression, Router, Filter, Aggregator, Sequence Generator, and SQL transformations.
- Used Type 1 and Type 2 slowly changing dimension methodologies to capture historical data according to business rules.
- Created user-defined functions and used them in multiple expressions in the mappings.
- Created worklets, workflows, and reusable and non-reusable tasks such as Session, Command, Email, and Assignment.
- Automated the load process using UNIX shell scripts and dynamically assigned partition ranges to the parameter file (see the parameter-file sketch after this list).
- Identified bottlenecks in targets, sources, mappings, and sessions using performance-tuning techniques.
- Developed UNIX scripts for pre-load data manipulation.
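A minimal sketch of how a shell script might write partition date ranges into an Informatica parameter file; the folder, workflow, session, parameter names, and path are hypothetical:

```sh
#!/bin/ksh
# Hypothetical parameter-file generator: assign the current month's date
# range to mapping parameters before the workflow starts.
START_DATE=$(date '+%Y-%m-01')
END_DATE=$(date '+%Y-%m-%d')
PARAM_FILE=/apps/infa/params/wf_monthly_load.param

cat > "$PARAM_FILE" <<EOF
[SALES_DW.WF:wf_monthly_load.ST:s_m_load_orders]
\$\$START_DATE=$START_DATE
\$\$END_DATE=$END_DATE
EOF
```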
Confidential, DALLAS, TX
ETL Developer
RESPONSIBILITIES:
- Member of the core ETL team involved in gathering requirements, performing source system analysis, and developing ETL jobs to migrate data from the source to the target DW hosted on Teradata.
- Studied the existing Salesforce implementation and the ER model of the Salesforce schema.
- Analyzed the business requirement document and created functional requirement document mapping all the business requirements.
- Designed and developed ETL mappings using transformation logic to extract data from various source systems.
- Created Teradata external loader connections such as MLoad, MLoad Upsert, MLoad Update, FastLoad, and TPump in the Informatica Workflow Manager while loading data into target tables in the Teradata database.
- Involved in the design and development of complex ETL mappings and stored procedures in an optimized manner. Used PowerExchange for mainframe sources.
- Designed and developed complex ETL mappings making use of Source Qualifier, Joiner, Update Strategy, Connected and Unconnected Lookup, Rank, Expression, Router, Filter, Aggregator, Sequence Generator, and SQL transformations.
- Created mapplets and reusable transformations.
- Used mapping parameters and variables.
- Automated the load process using UNIX shell scripts and dynamically assigned partition ranges to the parameter file.
- Used parallel processing capabilities, session partitioning, and target table partitioning utilities.
- Developed a process for error handling and automatic reloading (see the retry sketch after this list).
- Created reusable objects in Informatica for easy maintainability and reusability.
- Extensively used the Debugger to trace errors in mappings.
- Involved in developing test plans and test scripts to test the data based on the business requirements.
- Developed UNIX scripts for pre-load data manipulation.
- Supported migration of ETL code from development to QA and then from QA to production.
- 24/7 on-call support for daily batch jobs.
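A minimal sketch of the kind of error-handling/auto-reload wrapper described in this list, built around pmcmd; the service, domain, folder, and workflow names are hypothetical, and PM_USER/PM_PASS are assumed environment variables holding Informatica credentials:

```sh
#!/bin/ksh
# Hypothetical auto-reload wrapper: rerun a failed workflow up to 3 times.
MAX_TRIES=3
try=1
while true; do
    pmcmd startworkflow -sv INT_SVC -d Domain_DW -uv PM_USER -pv PM_PASS \
        -f SALES_DW -wait wf_stage_load
    rc=$?
    [ $rc -eq 0 ] && break
    if [ $try -ge $MAX_TRIES ]; then
        echo "wf_stage_load failed after $MAX_TRIES attempts (rc=$rc)" >&2
        exit $rc
    fi
    try=$((try + 1))
    sleep 300    # wait five minutes before reloading
done
```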
Confidential
Asst. System Engineer/ETL Developer
ENVIRONMENT: Informatica PowerCenter 8.1.1 (ETL), Cognos ReportNet 1.1, MS SQL Server 2005, Oracle 9i, Teradata, IBM AIX 4.3.3/5.1
RESPONSIBILITIES:
- Understanding existing business model and customer requirements.
- Studying the business logic for different custom objects in Salesforce environment.
- Understanding the data architecture of the legacy system and integrating/automating the data load from the legacy system using ETL tools.
- Designed, developed, and supported application solutions with a focus on Teradata architecture and its utilities, such as FastLoad, MultiLoad, FastExport, TPump, and BTEQ.
- Designed and developed mappings and mapplets using the Informatica Source Analyzer, Warehouse Designer, Transformation Developer, and Mapplet Designer.
- Created and developed numerous mappings using Expression, Router, Joiner, Lookup, Update Strategy, Stored Procedure, and other transformations.
- Extensively worked with different active and passive transformations, and was involved in generating, modifying, and processing the data.
- Carried out extraction, transformation, and loading of data from different sources such as MS Access, flat files, MS SQL Server, and Oracle.
- Created stored procedures, triggers, and functions to support the data load to the Data Warehouse.
- Involved in the design and development of a multidimensional star schema.
- Involved in the creation of various snapshots, such as transaction-level, periodic, and accumulating snapshots.
- Designed and developed pre- and post-session routines and batch execution routines.
- Used session partitioning and dynamic lookup and index caches to improve performance.
- Used the Debugger and breakpoints to view transformation output and debug mappings.
- Implemented different tasks in workflows, including Session, Command, Decision, Timer, Assignment, Event-Wait, Event-Raise, and Email tasks.
- Wrote UNIX shell scripts to invoke sessions using the pmcmd command (see the wrapper sketch after this list).
- Created reports using ReportNet functionality such as cross-tabs and master-detail reports.
- Customized reports using breaks, filters, and sorts.
- Involved in creation of Unit and System Test Cases.
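A minimal sketch of the kind of pmcmd wrapper script referenced in this list; the service, domain, folder, and workflow names are hypothetical, and PM_USER/PM_PASS are assumed environment variables holding Informatica credentials:

```sh
#!/bin/ksh
# Hypothetical pmcmd wrapper: start a workflow, wait for completion,
# and surface the return code to the scheduler.
pmcmd startworkflow -sv INT_SVC -d Domain_DW -uv PM_USER -pv PM_PASS \
    -f LEGACY_DW -wait wf_nightly_load
rc=$?
if [ $rc -ne 0 ]; then
    echo "wf_nightly_load failed (pmcmd rc=$rc)" >&2
fi
exit $rc
```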