ETL/Informatica Developer and Salesforce Integrator Resume
VA
SUMMARY:
- 7+ years of experience in Information Technology, with a strong background in database development, data warehousing, and ETL processes using Informatica Power Center 9.x/8.x/7.x.
- Good knowledge of data warehouse concepts and principles: Star Schema, Snowflake, SCDs, surrogate keys, and normalization/denormalization.
- Experience integrating various data sources with multiple relational databases such as Oracle and SQL Server, and loading data from fixed-width and delimited flat files.
- Extensive experience developing ETL for data extraction, transformation, and loading using Informatica Power Center and Informatica Cloud Real Time (ICRT).
- Well acquainted with Informatica Designer components: Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
- Worked extensively with complex mappings using transformations such as Source Qualifier, Expression, Filter, Joiner, Router, Union, Unconnected/Connected Lookup, and Aggregator.
- Strong experience developing sessions/tasks, worklets, and workflows using Workflow Manager tools: Task Developer, Workflow Designer, and Worklet Designer.
- Experience with advanced Informatica techniques such as dynamic caching, memory management, and parallel processing to increase performance throughput.
- Familiar with cloud components and connectors for making API calls to access data in cloud storage (Google Drive, Salesforce, Amazon S3, Dropbox) in Talend Open Studio.
- Experience monitoring and scheduling jobs using UNIX (Korn and Bourne shell) scripting.
- Involved in migration processes across Development, Test, and Production environments.
- Excellent backend skills in creating SQL objects such as tables, stored procedures, views, indexes, triggers, and user-defined data types and functions.
- Experience in SSIS package development and implementation for a variety of tasks.
- Experience migrating data from Excel, flat files, and Oracle to MS SQL Server using MS SQL Server DTS and SSIS.
- Extensively involved in Optimization and Performance Tuning of mappings and sessions in Informatica by identifying and eliminating bottlenecks, memory management and partitioning.
- Experience using Informatica command line utilities such as pmcmd to execute workflows in non-Windows environments.
- Extensively used Informatica Repository Manager and Workflow Monitor.
- Experience in debugging mappings. Identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
- Hands on experience in Performance Tuning of sources, targets, transformations and sessions.
- Good experience in documenting the ETL process flow for better maintenance and analyzing the process flow.
- Knowledge in developing reports using Business Intelligence tools like OBIEE, SSRS and Cognos.
- Highly motivated to take independent responsibility as well as ability to contribute and be a productive team member.
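As a minimal sketch of the pmcmd usage listed above, a small wrapper that assembles a startworkflow call (the service, domain, folder, and workflow names here are hypothetical; a real invocation would also pass -u/-p credentials):

```shell
#!/bin/sh
# Assemble a pmcmd startworkflow invocation (all names below are
# hypothetical placeholders, not taken from any actual environment).
build_pmcmd_cmd() {
  sv="$1"; dom="$2"; fld="$3"; wf="$4"
  printf 'pmcmd startworkflow -sv %s -d %s -f %s -wait %s' "$sv" "$dom" "$fld" "$wf"
}

cmd=$(build_pmcmd_cmd INT_SVC_DEV Domain_Dev DWH_LOADS wf_daily_load)
echo "$cmd"
```

With -wait, pmcmd blocks until the workflow finishes and returns a nonzero exit code on failure, which lets a UNIX scheduler chain dependent jobs.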
TECHNICAL SKILLS:
ETL Tools: Informatica 9.1/8.6/7.1/6.2 (Power Center/Power Mart), Informatica Cloud Real Time (ICRT), Talend Open Studio (TOS) for Data Integration 5.5.2
Data Modeling: Erwin 4.0/3.5, Star Schema Modeling, Snowflake Modeling
Databases: Teradata 14, Oracle 10g/9i/8i, PL/SQL, MS SQL Server 2005/2000, DB2
OLAP Tools: Cognos 7.0/6.0, SSRS
Languages: SQL, PL/SQL, Unix Shell Script
Tools: Toad, SQL* Loader, SQL Navigator, Putty, MS-Office, VMWare Workstation
Operating Systems: Windows 2003/2000/NT, Sun Solaris, Linux
PROFESSIONAL EXPERIENCE:
Confidential, VA
ETL/Informatica Developer and Salesforce Integrator
Responsibilities:
- Created Data Synchronization tasks in ICRT to insert, update, or upsert data into Salesforce, depending on the type of data that needed to be updated.
- Created Data Replication tasks in ICRT to extract the GUIDs of Salesforce records, enabling updates on Salesforce objects.
- Tuned the synchronization and replication jobs to reduce runtimes and eliminate data contention in bulk data jobs.
- Worked on Informatica Cloud (SaaS), creating mappings and data synchronization tasks to perform daily and incremental loads from Salesforce objects to Oracle and SQL Server databases.
- Migrated objects from dev to test and test to prod environments using org sandbox credentials.
- Created custom schedules and task flows in Informatica Cloud to run data synchronization and mapping tasks at regular intervals.
- Imported workflow XMLs from Power Center into Informatica Cloud.
- Created XML mappings for objects whose data is uploaded using the Salesforce Data Loader tool.
- Participated in team meetings to analyze requirements of data load.
- Closely monitored activity logs.
- Created and used parameter files in tasks, and ran batch files as post-processing commands in Informatica Cloud to update the $$ parameter values in the file for the next incremental load.
- Downloaded Secure Agents from Informatica Cloud and configured PCs to run as Secure Agents.
- Used Apex Data Loader to read/write CSV files to Salesforce objects.
Environment: Informatica Power Center 9.6.0, Informatica Cloud Real Time (ICRT), Salesforce, Data Loader, Oracle 11g, SQL, PL/SQL, TOAD, SQL Server 2008, T-SQL, SQL Navigator
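The parameter-file maintenance described above can be sketched as a post-processing shell step that rewrites a $$ variable so the next incremental load picks up only newer rows (the file path, parameter name, and dates are made up for illustration):

```shell
#!/bin/sh
# Rewrite $$LastRunDate in a task parameter file after a successful load
# (file path and parameter name below are illustrative, not actual).
PARMFILE=/tmp/sf_sync.parm

# Seed a parameter file in the $$name=value form the tasks expect
printf '$$LastRunDate=2015-01-01\n' > "$PARMFILE"

# Post-processing step: replace the old watermark with this run's date
RUN_DATE=2015-06-01   # in a real job: RUN_DATE=$(date +%Y-%m-%d)
sed 's/^\$\$LastRunDate=.*/\$\$LastRunDate='"$RUN_DATE"'/' "$PARMFILE" > "$PARMFILE.tmp" \
  && mv "$PARMFILE.tmp" "$PARMFILE"

cat "$PARMFILE"
```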
Confidential, Richmond, VA
ETL/ Informatica Developer
Responsibilities:
- Involved in full Software Development Life Cycle (SDLC) - Business Requirements Analysis, preparation of Technical Design documents, Data Analysis, Logical and Physical database design, Coding, Testing, Implementing, and deploying to business users.
- Involved in gathering business requirements, logical modeling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
- Created technical design documents for the team based on the plan review and business specification documents.
- Responsible for gathering the data requirements for different test requirements of the project.
- Created complex workflows to load data from different upstream source systems into common tables in the Teradata warehouse.
- Responsible for creating database objects such as tables, stored procedures, triggers, and functions using T-SQL to structure the stored data and maintain the database efficiently.
- Created mapping documents for source-to-stage and stage-to-target mappings.
- Created UDFs for handling multiple date formats while masking dates.
- Coded functions and stored procedures to implement user-defined logic and execute the required queries whenever needed.
- Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
- Managed all technical aspects of the ETL mapping process with other team members.
- Developed and modified UNIX shell scripts as part of the ETL process.
- Monitored data quality, generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL scripts.
- Worked extensively with Teradata SQL Assistant to analyze the existing data and implemented new business rules to handle various source data anomalies.
- In moving data from the ODS to the Teradata warehouse using Informatica, I was also involved in tuning Teradata SQL statements: using EXPLAIN, analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, and applying hash functions.
- Used Informatica and BTEQ scripts to load flat files into databases.
- Worked with the client's proprietary WLM scheduler to run the ETL jobs.
Environment: Informatica Power Center 9.6.0, Teradata 14, Oracle 11g, SQL, PL/SQL, TOAD, Erwin, Unix Shell, SQL Server 2008, T-SQL, SQL Navigator, JIRA
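A minimal sketch of the BTEQ flat-file loads mentioned above: a shell step that writes out a BTEQ import script (the logon string, table, and file names are placeholders; the actual scripts were client-specific):

```shell
#!/bin/sh
# Generate a BTEQ script that imports a comma-delimited flat file into a
# staging table (logon, table, and file names below are placeholders).
BTEQ_SCRIPT=/tmp/load_stg_orders.bteq

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_password;
.IMPORT VARTEXT ',' FILE = /data/in/orders.csv;
.REPEAT *
USING (order_id VARCHAR(10), order_dt VARCHAR(10), amount VARCHAR(12))
INSERT INTO stg.orders (order_id, order_dt, amount)
VALUES (:order_id, :order_dt, :amount);
.LOGOFF;
.QUIT;
EOF

# In the real job this would be executed as: bteq < "$BTEQ_SCRIPT"
cat "$BTEQ_SCRIPT"
```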
Confidential, Richmond, VA
Informatica Developer
Responsibilities:
- Worked closely with the end users in writing the functional specifications based on the business needs.
- Responsible for gathering business requirements, prepare Source to Target Mapping specifications and Transformation rules.
- Developed mappings/reusable objects/ transformations by using mapping designer and transformation developer in Informatica Power Center.
- Extracted data from mainframe sources using an MVS CV database connection and loaded the data into the ODS.
- Developed mappings, sessions and Workflows to load the Historical and Daily data in to ODS tables.
- Developed SCD Type 2 methodology to insert and update the records for the daily load in the target tables.
- Used SQL override to join the source table with a date-range table to extract data from the source for the daily load.
- Developed PL/SQL stored procedures, functions, views, triggers, and packages per business requirements.
- Created reusable components such as user-defined functions (UDFs), mapplets, transformations, tasks, and worklets.
- Created Unconnected Lookup transformations to look up data from the source and target tables.
- Created variable ports in Expression transformations to define the "Effective Date" and "Expiration Date" for each record.
- Developed mappings and sessions to copy the data from ODS to Reporting database.
- Developed Clone Workflows to copy the data from Development Environment to Volume and Test Environments.
- Developed the ETL process to transfer data from staging tables to the Star Schema.
- Created Promote requests to move the objects from Development folder to Volume and Test Repositories.
- Defined global variables in the parm file for each session and specified the path of the parm file at the workflow level.
- Created Unit Test documents for the mappings to test the data load.
- Involved in data validation from Source to staging to Target.
- Conducted UAT with the end users before production release.
- Resolved Defects logged in the Mercury Quality Center during UAT.
- Involved in Performance tuning for sources, targets, mappings, sessions and server.
- Created test cases and completed unit, integration and system tests for Data Warehouse.
Environment: Informatica Power Center 9.1, DB2, Oracle 11g, SQL, PL/SQL, TOAD, Erwin, SQL SERVER 2008, T-SQL,SQL Navigator, JIRA
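The SCD Type 2 handling above (expire the current row, then insert the new version with fresh effective/expiration dates) can be sketched for a single changed record; the column layout and the 9999-12-31 high date are assumptions, not the actual table design:

```shell
#!/bin/sh
# SCD Type 2 sketch for one changed record: close the current version by
# stamping its expiration date, then open a new version effective on the
# load date. Assumed columns: key,value,effective_date,expiration_date.
scd2_rows() {
  key="$1"; old_val="$2"; old_eff="$3"; new_val="$4"; load_date="$5"
  # expired version: effective date unchanged, expiration = load date
  printf '%s,%s,%s,%s\n' "$key" "$old_val" "$old_eff" "$load_date"
  # current version: effective = load date, open-ended expiration
  printf '%s,%s,%s,9999-12-31\n' "$key" "$new_val" "$load_date"
}

scd2_rows 1001 VA 2014-01-01 NY 2015-06-01
```

In the actual mappings this logic lived in Expression transformation variable ports and an Update Strategy, not in shell; the sketch only shows the date-stamping rule.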
Confidential, Manhattan, NY
Senior Informatica Developer
Responsibilities:
- Worked as part of DBO migration team and involved in migrating existing Sybase Stored Procedures to Teradata.
- Converted complex Sybase stored procedures to Teradata.
- Used the client's proprietary Event Engine to trigger and monitor jobs.
- Debugged stored procedures when the corresponding job failed in the Event Engine by analyzing the logs.
- Documented the common errors that occur when converting a Sybase procedure to Teradata, along with best practices for avoiding them.
- Analyzed complex Sybase stored procedures and modified them to suit the Teradata database.
- Worked as part of the ETL migration team, migrating Ab Initio graphs to MDF (Meta Data Driven Framework).
- Analyzed complex Ab Initio graphs and implemented the same functionality in the client-specific MDF architecture, with Informatica mappings where necessary.
- After conversion to MDF, the jobs were triggered and monitored using corresponding nodes in the Event Engine; when a node failed, we analyzed the logs and debugged it.
- Involved in identifying bugs in existing mappings by analyzing the data flow, evaluating transformations and fixing the bugs so that they conform to the business needs.
- Involved in the Migration process from Development, Test and Production Environments.
- Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for pre- and post-session performance management.
- Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.
- Identified and tuned sources, targets, mappings, and sessions to improve performance. Scheduled and ran extraction and load processes, and monitored sessions and batches using Informatica Workflow Manager.
- Mentored the new team members on Event Engine usage and debugging of jobs using Event Engine.
Environment: Informatica Power Center 8.6, Teradata 14, Sybase, Unix, MDF, Ab Initio, HP Quality Center.