
ETL Developer Resume


Summary

  • 5+ years of experience in designing and implementing Data Marts and Data Warehouses using ETL tools such as DMExpress, SQL Server Integration Services and Informatica PowerCenter 8.1.1.
  • Experience in designing and developing complex mappings using varied transformation logic such as Unconnected and Connected Lookups, Router, Filter, Rank, Expression, Aggregator, Joiner, Update Strategy and Key Generation.
  • Worked extensively in the Transportation, Financial, Healthcare and Sales domains.
  • Proficient in using Informatica Workflow Manager, Workflow Monitor and pmcmd (the Informatica command-line utility) to create, schedule and control workflows, tasks and sessions.
  • Specialized knowledge of Change Data Capture techniques such as Slowly Growing Target, Slowly Changing Dimensions (Type 1, Type 2 and Type 3) and simple pass-through mappings (a minimal SCD Type 2 sketch follows this list).
  • Strong experience in performance tuning of both ETL processes and queries, guided by data-volume-to-run-time ratios.
  • Experience with Pushdown Optimization, Source Qualifier optimization and partitioning methods in Informatica performance tuning.
  • Developed SSIS packages for incremental data capture and historical data.
  • Expertise in CDI (Customer Data Integrity) process.
  • Experience in the application of metadata for packaged applications from Siebel, Oracle and SAP.
  • Thorough experience in all stages of Software Development Life Cycle (SDLC) with deep understanding of various phases like Requirements gathering, Analysis, Design, Development and Testing.
  • Experience in modeling Transactional Databases and Data Warehouses using tools like Erwin and ER/Studio.
  • Familiar with Sybase ASE, Netezza and Oracle environments.
  • Experience in developing logical and physical models and implementing them in Oracle.
  • Excellent skills in retrieving and manipulating data by writing simple and complex SQL queries, and in optimizing the SQL and PL/SQL routines used to access the repository.
  • Experience in integrating data from various sources such as Oracle, DB2 UDB, Sybase and Teradata using ODI.
  • Experience in working with models designed to populate multi-dimensional models, star and snowflake schemas, for data warehouses and data marts.
  • Documented design procedures, operating instructions, test procedures and troubleshooting procedures.
  • Expertise in ODBC, OLEDB, Stored Procedures, Triggers.
  • Proficient in Perl and UNIX shell scripting.
  • Versatile team player with excellent analytical, presentation and interpersonal skills with an aptitude to learn new technologies.
  • Excellent written and oral communication skills and a team-player with a results-oriented attitude.
  • Developed VB Scripts to kick off daily jobs through Tidal.
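
A minimal sketch of the SCD Type 2 pattern referenced above, written as two plain Oracle SQL statements; the CUSTOMER_STG staging table, CUSTOMER_DIM target and CUSTOMER_DIM_SEQ sequence are hypothetical names used only for illustration, not objects from any project below.

    -- Step 1: expire the current dimension row when a tracked attribute changed.
    UPDATE customer_dim d
       SET d.effective_end_dt = TRUNC(SYSDATE) - 1,
           d.current_flag     = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND (s.city <> d.city OR s.segment <> d.segment));

    -- Step 2: insert a new current version for changed or brand-new customers.
    INSERT INTO customer_dim
           (customer_key, customer_id, city, segment,
            effective_start_dt, effective_end_dt, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.city, s.segment,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');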

Skills

ETL Tools : Informatica (Power Center 8.1.1), DMExpress, SQL Server Integration Services.
Databases : Netezza, Oracle 11g/10g/9i/8i/8.0/7.0, SQL Server 2000, DB2, Teradata, MS Access.
BI Tools : OBIEE (Oracle BI Enterprise Edition) 10.1.3 and Siebel Analytics 7.8/7.7/7.5
Data Modeling : Erwin, ER/Studio, Toad Data Modeler, MS Visio, Sybase Power Designer
Languages : Visual Basic, SQL*Plus, PL/SQL, NZSQL, Transact SQL, Shell Script, C/C++, PHP, HTML, XML
Operating System : Windows 98/NT/2000/XP/Vista/7, UNIX (Linux, HP-UX, Solaris).
Job Schedulers : Tidal, Autosys
Other Software : JIRA, MS Office, MS Project, Win SQL, TOAD 10.5.

Professional Experience

Confidential May '10 – Present ETL Developer

Project: Managed/Franchised Visibility

Description: Marriott has more than 50% of its properties franchised. Previously, all properties were treated as participating properties, which meant every property could see every other property's quotes, opportunities and contacts. For confidentiality reasons, a new hierarchy was set up that divides the properties into participating and non-participating, thereby restricting the data of non-participating properties and limiting their visibility to other properties. Developed new mappings and modified existing ones to absorb the hierarchical change in the warehouse without affecting the reports.

Responsibilities:

  • Responsible for the development, support and maintenance of the ETL (Extract, Transform and Load) processes using Informatica PowerCenter.
  • Translated high-level design specifications into simple ETL coding and mapping standards; designed the mapping document that serves as the guideline for ETL coding.
  • Created mappings to move data from various source systems into the Data Warehouse.
  • Worked with the Informatica tools Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations.
  • Created different transformations for loading data into targets, such as Source Qualifier, Joiner, Update Strategy, Connected and Unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator.
  • Created reusable transformations and Mapplets to use in multiple mappings.
  • Implemented the Slowly Changing Dimension (Type 2) methodology and developed mappings to keep track of historical data.
  • Used Workflow Manager to create various tasks and used the Workflow Monitor to monitor the workflows.
  • Worked extensively with flat files, since the data from the various legacy systems arrived as flat files.
  • Involved in all phases of the SDLC: requirements, design, development and testing.
  • Scheduled workflows using UNIX.
  • Documented Design and Unit Test plan for Informatica mappings, design and validation rules.
  • Scheduled and monitored batch jobs in the UNIX environment using shell scripts, depending on server load.
  • Wrote stored procedures, functions, triggers and UNIX shell scripts to support and automate the ETL process (a hedged sketch follows this list).
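
A hypothetical sketch of the kind of supporting PL/SQL routine written for these loads, assuming an ETL_LOAD_AUDIT table; the object and column names are illustrative only, not taken from the project.

    -- Records one audit row per mapping run so operations can track load status.
    CREATE OR REPLACE PROCEDURE log_load_audit (
        p_mapping_name IN VARCHAR2,
        p_rows_loaded  IN NUMBER,
        p_status       IN VARCHAR2
    ) AS
    BEGIN
        INSERT INTO etl_load_audit (mapping_name, rows_loaded, status, load_ts)
        VALUES (p_mapping_name, p_rows_loaded, p_status, SYSTIMESTAMP);
        COMMIT;
    END log_load_audit;
    /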

Environment: Informatica PowerCenter 8.1.1, Netezza, DB2, Oracle 11g, SQL, PL/SQL, Windows 2003 Server, UNIX, Toad, ERwin 4.1, WinSQL, VB, FileZilla, MS Visio, MS Project.

Confidential Jun '09 – May '10 ETL Developer

Project: Dell 90-Day Triggers
Description: Dell had 5 triggers that it wanted to send to its customers over a period of 90 days from the date of purchase. We received daily feeds from Dell consisting of buyer data. Built code to select the best contact and maintain data integrity throughout the process.
Developed the tasks and scripts, and coded the packages to perform data hygiene and validate the data according to the functional requirements.

Responsibilities:

  • Designed and developed Informatica logic and DMExpress tasks and jobs based on the business requirements.
  • Extracted data from a variety of flat files and loaded it into Netezza using DMExpress or external tables, depending on file size and the transformations required (see the external-table sketch after this list).
  • Created automated DMExpress tasks and jobs that run daily or weekly depending on the refresh rate.
  • Worked on the Netezza Admin Console when the issues were not solved at the session/workflow level.
  • Developed Netezza scripts wherever in-database transformations were faster than using an ETL tool.
  • Made extensive use of Netezza Performance Server (NPS) to improve analysis at the source at streaming speeds.
  • Used the NPS system, which combines commodity hardware and Linux with a high-performance database, to lower the total cost.
  • Developed DTS packages for requirements whose transformations were too complex for DMExpress.
  • Used DMExpress and DTS packages with Netezza in an optimized manner, maintaining performance at higher throughput.
  • Analyzed which tool fit each task and thereby improved performance.
  • Worked extensively on different types of transformations such as Join, Partition, Merge, Expression, Router, Filter, Connected and Unconnected Lookup, and Normalizer.
  • Used Sybase PowerDesigner to define entity relationships (ER diagrams) for OLTP and OLAP (data warehouse).
  • Created sessions, worklets and workflows for carrying out test loads.
  • Extensively used Tidal to set up DTS packages, DMExpress tasks and jobs, and FTP jobs to run daily, weekly and monthly.
  • Used VBScripts with auto-environment variables to run tasks and jobs at the required times.
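
A hedged sketch of the Netezza external-table load path mentioned above; the file path, table and column names are hypothetical, chosen only to illustrate the technique.

    -- Expose the daily flat-file feed as an external table.
    CREATE EXTERNAL TABLE ext_buyer_feed (
        buyer_id      INTEGER,
        email_addr    VARCHAR(255),
        purchase_date DATE
    )
    USING (
        DATAOBJECT ('/data/feeds/buyer_feed_daily.csv')
        DELIMITER ','
        SKIPROWS 1
    );

    -- Land the feed into a staging table before data-hygiene and validation steps.
    INSERT INTO stg_buyer_feed
    SELECT * FROM ext_buyer_feed;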

Environment: Netezza, DMExpress, Informatica PowerCenter 8.5, WinSQL, SQL Server 2000, SQL, Sybase PowerDesigner, Tidal, FileZilla, VB Editor.

Confidential Oct '08 – Jun '09 Informatica Developer

Description:

IMAKE Consulting, Inc. provides software and services to the structured finance community. It offers Analytics On-Demand, an Internet-based analytical tool that provides solutions to the structured finance market, including loan-level analysis and cash flow generation, securitization structure and collateral modeling, cash flow analysis, monthly bond factor calculation and investor reporting.

Responsibilities:

  • Designed and developed Informatica Mappings based on the business requirements.
  • Interacted with DBA and Source systems and resolved the Issues.
  • Extracted data from flat files and loaded it into Oracle using Informatica PowerCenter.
  • Created shortcuts for sources and targets, and created reusable transformations in the shared folder.
  • Extracted transactional data from Mainframe sources and loaded to Teradata tables using Power Exchange and Power Center.
  • Extensively used Informatica client tools – Source Analyzer, Warehouse designer, Mapping Designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Informatica Workflow Manager.
  • Worked on the Informatica Admin Console when the issues were not solved at the session/workflow level.
  • Involved in designing and implementing Change Data Capture techniques such as Slowly Growing Target, Slowly Changing Dimensions (Type 1, Type 2 and Type 3) and simple pass-through mappings (see the incremental-extract sketch after this list).
  • Involved in creating the repositories and setting up the repository service and integration service.
  • Involved in the complete Software Development Life Cycle (SDLC) of the data warehousing / decision support system.
  • Worked extensively on different types of transformations such as Source Qualifier, Expression, Router, Filter, Update Strategy, Connected and Unconnected Lookup, and Normalizer.
  • Used PL/SQL routines to access the repository.
  • Wrote UNIX shell scripts that consolidated multiple scripts and ran them sequentially as well as in parallel.
  • Experience in UNIX shell scripting, PL/SQL and SQL scripting for automation of ETL processes.
  • Defined entity relationships (ER diagrams) and designed the physical databases for OLTP and OLAP (data warehouse).
  • Created sessions, worklets and workflows for carrying out test loads.
  • Created the Deployment groups to move the workflows from one environment to the other.
  • Designed/created the plan for the Master workflows that are to be scheduled to run on daily and monthly basis.
  • Checked session and error logs to troubleshoot problems, and used the Debugger for complex troubleshooting.
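
A hedged sketch of a high-water-mark incremental extract, one common way to implement the Change Data Capture approach referenced above; ETL_CONTROL, ORDERS_SRC and ORDERS_STG are hypothetical names used only for illustration.

    -- Pull only the rows changed since the last successful extract.
    INSERT INTO orders_stg (order_id, customer_id, order_amt, last_update_ts)
    SELECT o.order_id, o.customer_id, o.order_amt, o.last_update_ts
      FROM orders_src o
     WHERE o.last_update_ts > (SELECT last_extract_ts
                                 FROM etl_control
                                WHERE source_name = 'ORDERS_SRC');

    -- Advance the watermark only after the staging load succeeds.
    UPDATE etl_control
       SET last_extract_ts = (SELECT MAX(last_update_ts) FROM orders_stg)
     WHERE source_name = 'ORDERS_SRC';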

Environment: Informatica PowerCenter 8.5, Oracle 10g, Toad 9, UNIX Perl scripts, UNIX server, HP-UX 11i, Toad Data Modeler, Teradata, SQL, Autosys.

Confidential Oct '07 – Sep '08 ETL Consultant

Description: Beltway Consulting is a leading business technology solutions company that provides its clients with the finest resources and solutions in the industry.

Responsibilities:

  • Responsible for the development, support and maintenance of the ETL (Extract, Transform and Load) processes using Informatica PowerCenter.
  • Translated high-level design specifications into simple ETL coding and mapping standards; designed the mapping document that serves as the guideline for ETL coding.
  • Created mappings to move data from various source systems into the Data Warehouse.
  • Worked with the Informatica tools Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations.
  • Created different transformations for loading data into targets, such as Source Qualifier, Joiner, Update Strategy, Connected and Unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator.
  • Created reusable transformations and Mapplets to use in multiple mappings.
  • Performed Reverse Engineering of the legacy application using DDL scripts in Erwin, and developed Logical and Physical data models for Central Model consolidation.
  • Worked with DBAs to create a best fit Physical Data Model from the Logical Data Model using Erwin.
  • Wrote stored programs (procedures and functions) to perform data transformations and integrated them with the Informatica mappings and the existing application (a hedged sketch follows this list).
  • Used Workflow Manager for creating, validating, testing and running sequential and concurrent batches and sessions, and scheduled them to run at specified times.
  • Used mapping parameters and variables.
  • Parameterized all variables and connections at all levels on Windows NT.
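
A hypothetical example of the kind of stored function used for data transformations and called from mappings, assuming Oracle 10g for REGEXP_REPLACE; the function name and formatting rule are illustrative, not taken from the project.

    -- Standardizes a raw phone value to NNN-NNN-NNNN when ten digits are present.
    CREATE OR REPLACE FUNCTION standardize_phone (p_phone IN VARCHAR2)
        RETURN VARCHAR2
        DETERMINISTIC
    AS
        v_digits VARCHAR2(20);
    BEGIN
        v_digits := REGEXP_REPLACE(p_phone, '[^0-9]', '');
        IF LENGTH(v_digits) = 10 THEN
            RETURN SUBSTR(v_digits, 1, 3) || '-' ||
                   SUBSTR(v_digits, 4, 3) || '-' ||
                   SUBSTR(v_digits, 7, 4);
        END IF;
        RETURN p_phone;  -- leave anything else unchanged for manual review
    END standardize_phone;
    /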

Environment: Informatica PowerCenter 8.1, Workflow Manager, Workflow Monitor, Erwin 4.0/3.5.2, TOAD 8.6.1.0, PL/SQL, Flat files, XML, Oracle 10g/9i.

Confidential Dec '06 – Sep '07 Database Developer

Responsibilities:

  • Designed the module-wise Oracle 8.0 database structure and implemented it on the client server
  • Developed tables, views, stored procedures, functions, and user-defined data types
  • Created table structures along with triggers and indexes, including bitmap and function-based indexes
  • Monitored indexes and analyzed their status for performance tuning and query optimization
  • Maintained the data integrity and security using constraints and database triggers
  • Involved in programming database triggers for automatically updating the tables and views
  • Used various control structures including CASE, DECODE, IF-THEN-ELSE, FOR loops and WHILE loops while developing procedures
  • Implemented table partitioning and sub-partitioning to improve performance and data management (see the sketch after this list)
  • Troubleshot performance issues within packages and stored procedures using EXPLAIN PLAN and DBMS_OUTPUT
  • Responsible for tuning the application with the help of team members & Oracle support
  • Improved the query performance using better joins and hints appropriately
  • Developed packages, procedures and functions for complex reports
  • Used SQL*Loader for loading the data into the tables from files
  • Developed batch programs using UNIX shell scripts
  • Participated in the design and code reviews and verified compliance with the project’s plan
  • Documented the whole process flow, PL/SQL packages, locations and descriptions, and possible error messages
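
A hedged sketch of the range partitioning and index choices described above; the table, column and partition names are hypothetical and only illustrate the technique.

    -- Range-partition a large transaction table by date for easier maintenance.
    CREATE TABLE sales_txn (
        txn_id      NUMBER,
        txn_date    DATE,
        cust_name   VARCHAR2(100),
        status_flag CHAR(1),
        amount      NUMBER(12,2)
    )
    PARTITION BY RANGE (txn_date) (
        PARTITION p_2007_q1 VALUES LESS THAN (TO_DATE('01-APR-2007', 'DD-MON-YYYY')),
        PARTITION p_2007_q2 VALUES LESS THAN (TO_DATE('01-JUL-2007', 'DD-MON-YYYY')),
        PARTITION p_max     VALUES LESS THAN (MAXVALUE)
    );

    -- Bitmap index on a low-cardinality flag; local so it stays partition-aligned.
    CREATE BITMAP INDEX idx_sales_status ON sales_txn (status_flag) LOCAL;

    -- Function-based index supporting case-insensitive lookups on customer name.
    CREATE INDEX idx_sales_cust_upper ON sales_txn (UPPER(cust_name));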

Environment: Oracle 8i, SQL, PL/SQL, SQL*Plus, Sun Solaris 2.6, SQL*Loader, UNIX

Confidential Jan '05 – Mar '06 Data Warehouse Developer

Description:
J C Solutions works within your technical and business requirements to develop computer or web software applications that will meet or exceed your expectations.
Our business solutions in the learning domain help make learning interesting, interactive and fun! They have the right blend of content (instruction) and cutting-edge technologies that offer the best benefits.

Responsibilities:

  • Applied data design and modeling, system study, design and development using the Ralph Kimball methodology of dimensional modeling (see the star schema sketch after this list).
  • Imported data from various sources and loaded it into the target database, creating transformations with Informatica PowerCenter 5.1 Designer (Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer).
  • Used PowerConnect to extract data from PeopleSoft systems.
  • Designed and developed Informatica Mappings and Sessions based on user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Used transformations such as Connected and Unconnected Lookup, Aggregator, Update Strategy, Router and Sequence Generator.
  • Developed and scheduled Workflows using task developer, worklet designer and workflow designer in Workflow manager and monitored the results in Workflow monitor.
  • Wrote PL/SQL stored procedures using Toad.
  • Did performance tuning at source, transformation, target, and workflow levels.
  • Involved in creating and managing global and local repositories and assigning permissions using Repository Manager. Also migrated repositories between development, testing, QA and production systems.
  • Upgraded the repository from version 5.1 to 6.1 by performing a backup, restore and upgrade.
  • Used PMCMD, PMREP and UNIX shell scripts for workflow automation and repository administration.
  • Used PVCS for version control and AutoSys for Job scheduling.
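
A hedged sketch of a Kimball-style star schema of the kind described above; the fact and dimension names are hypothetical, not taken from the project.

    -- Dimensions carry surrogate keys; the fact table references them.
    CREATE TABLE dim_product (
        product_key   NUMBER PRIMARY KEY,      -- surrogate key
        product_id    VARCHAR2(30),            -- natural/business key
        product_name  VARCHAR2(100),
        category      VARCHAR2(50)
    );

    CREATE TABLE dim_date (
        date_key      NUMBER PRIMARY KEY,      -- e.g. 20051231
        calendar_date DATE,
        fiscal_month  VARCHAR2(10)
    );

    CREATE TABLE fact_sales (
        date_key      NUMBER NOT NULL REFERENCES dim_date (date_key),
        product_key   NUMBER NOT NULL REFERENCES dim_product (product_key),
        quantity_sold NUMBER,
        sales_amount  NUMBER(12,2)
    );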

Environment: Informatica PowerCenter 6.1, PowerConnect, Business Objects, PowerPlay, SQL, PL/SQL, Oracle 8i, TOAD, Flat Files, UNIX and Windows 2000/NT.

Education: M.S. in Computer Science and B.E. (Bachelor of Engineering).
