
Senior Informatica / ETL Developer Resume


Chicago, IL

SUMMARY:

  • Over 8 years of experience in Information Technology and Data Warehousing, building ETL applications using Informatica (Power Center/Power Mart)
  • Implemented data warehousing methodologies for Extraction, Transformation and Loading using Informatica Designer, Repository Manager, Workflow Manager, Workflow Monitor, Repository Server Administration Console.
  • Extensively worked with ETL Informatica transformations including Source Qualifier, Connected and Unconnected Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter and Sequence Generator, and created complex mappings.
  • Strong experience in designing and developing Business Intelligence solutions in Data Warehousing/Decision Support Systems using Informatica Power Center 9.1/8.6/8.1/7.1
  • Good Experience in developing Unix Shell Scripts for automation of ETL process.
  • Worked on Exception Handling Mappings for Data Quality, Data Profiling, Data Cleansing, Data Validation
  • Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, Packages, joins and hash indexes in Teradata database.
  • Strong knowledge of OBIEE 10g/11g as a Business Intelligence tool and Data Extraction using Informatica as ETL tool.
  • Extensively worked with Teradata utilities like BTEQ, Fast Export, Fast Load, Multi Load to export and load data to/from different source systems including flat files.
  • Implemented Slowly Changing Dimensions Type 1 and Type 2 methodology for accessing the full history of accounts and transaction information.
  • Involved in understanding requirements and in modeling activities for the attributes identified from different source systems like Oracle, Teradata, SQL Server and flat files.
  • Expertise in Data modeling techniques like Dimensional Data modeling, Star Schema modeling, Snowflake modeling, and FACT and Dimensions tables.
  • Expertise in implementing complex business rules by creating reusable transformations, Mapplets and Mappings, and Mapping, Session and Workflow Variables.
  • Experience in staging, validating and loading the data into Teradata Warehouse using Informatica.
  • Good understanding of relational database management systems like Oracle, Teradata, DB2, and SQL Server; extensively worked on data integration using Informatica for the extraction, transformation and loading of data from various database source systems and from mainframe COBOL and VSAM files.
  • Experienced in performance optimization of mappings in Informatica.
  • Experienced in using the Oracle SQL*Loader feature for bulk loading data from flat files into database tables.
  • Involved in Relational database design and development of data warehouse data feeds.
  • Involved in the complete Software Development Life Cycle (SDLC), from business analysis through development, testing, deployment and documentation
  • Very strong in relational databases (RDBMS) and data modeling; designed and built data warehouses and data marts using Star Schema and Snowflake Schema for faster and more effective querying.
  • Good experience in using SQL, PL/SQL, Indexes, Functions, Procedures, Triggers and Cursors.
  • Excellent team player, self-motivated, with good communication skills.
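The Slowly Changing Dimension Type 2 methodology mentioned above (expire the current dimension row when a tracked attribute changes, insert a new current row, and keep the full history) can be sketched in Python as an illustration; the real implementation was Informatica Lookup/Update Strategy mappings, and the column names here (`account_id`, `eff_date`, `end_date`, `is_current`) are hypothetical:

```python
from datetime import date

def apply_scd_type2(dim_rows, incoming, key="account_id", today=None):
    """Illustrative SCD Type 2 logic: when a tracked attribute of an
    existing row changes, end-date the current version and append a new
    current version; brand-new keys are simply inserted. Preserves full
    history of every key. (Sketch only; column names are hypothetical.)"""
    today = today or date.today()
    # Index the currently-active version of each dimension key.
    current = {r[key]: r for r in dim_rows if r["is_current"]}
    for rec in incoming:
        old = current.get(rec[key])
        if old is None:
            # New key: insert as the first (current) version.
            dim_rows.append({**rec, "eff_date": today, "end_date": None, "is_current": True})
        elif any(old.get(c) != rec.get(c) for c in rec if c != key):
            # Attribute changed: expire the old version, add a new one.
            old["end_date"] = today
            old["is_current"] = False
            dim_rows.append({**rec, "eff_date": today, "end_date": None, "is_current": True})
    return dim_rows
```

In a warehouse this corresponds to an Update Strategy marking the old row DD_UPDATE and the new row DD_INSERT within the same mapping run.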

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 9.1/8.6/8.1/7.1/6.1 (Power Center Repository, Informatica Designer, Workflow Manager, Workflow Monitor), Power Exchange, Developer (IDQ) and Analyst (IDE).

Databases: MS SQL Server 2005/2008, Oracle 8i/9i/10g, Teradata, IBM DB2, Sybase

Data Modeling Tools: ERWIN 4.0/3.4, Star Schema Modeling, Snow Flake Modeling, Fact and Dimensions tables, Entities, Attributes

Database Skills: Cursors, Stored Procedures, Functions, Views, Triggers and Packages

Client Side Skills: SQL, T-SQL, PL/SQL, UNIX Shell Scripting, XML.

Web Servers: IIS v5.0/6.0, Apache Tomcat

Operating System: Windows 2000/2003/XP/Vista/Windows 7, UNIX, Linux.

Methodologies: Master Data Modeling Logical/ Physical, Star/ Snowflake Schema, FACT& Dimension Tables, ETL, OLAP.

PROFESSIONAL EXPERIENCE:

Confidential, Chicago IL

Senior Informatica / ETL Developer

Responsibilities:

  • Studied and understood the existing subject areas, source systems, target system, operational data, jobs, deployment processes and Production Support activities.
  • Interacted with the Business Users to analyze the Business Requirements, High Level Document (HLD), Low Level Document (LLD) and transform the business requirements into the technical requirements.
  • Designed and developed end-to-end ETL process from various source systems to staging area, from staging to Data Marts.
  • Designed and created complex mappings using SCD Type II involving transformations such as Expression, Source Qualifier, Joiner, Aggregator, Connected and Unconnected Lookup, Update strategy, Sequence Generator, Router and Filter Transformations.
  • Performed Performance Tuning of sources, targets, mappings and sessions.
  • Extracted data from various sources, including legacy databases on the mainframe and Salesforce, into Oracle and SQL Server tables using Informatica mappings and sessions.
  • Worked closely with the BO team to meet users' ad hoc and other business reporting needs.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Created reusable transformations, mappings, mapplets, workflows, worklets, and database connections using Mapping Designer and Workflow Manager.
  • Has been involved in creating test cases, test scripts and test data for testing of Informatica mappings and workflows.
  • Involved in Performance tuning for sources, targets, mappings, sessions and server.
  • Used Push down optimization and partitioning for performance improvements
  • Has been involved in unit testing of Informatica mappings and workflows.
  • Involved in enhancement or modification of existing components to improve performance.
  • Wrote scripts for scheduling the various data cleansing and load processes; designed, deployed, and maintained the batch processes using UNIX shell scripts.
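The data cleansing and validation steps above typically route rows the way an Informatica Router/Filter does: rows passing every rule continue to the target load, failing rows are captured with a reason code for an exception table. A minimal sketch in Python (illustrative only; the rule names are hypothetical):

```python
def route_rows(rows, rules):
    """Router-style validation: apply each named rule to every row.
    Rows passing all rules go to the 'good' (target) group; failing rows
    are collected with the list of rules they violated, for loading into
    an exception/error table. (Illustrative sketch, not Informatica.)"""
    good, bad = [], []
    for row in rows:
        failed = [name for name, rule in rules.items() if not rule(row)]
        if failed:
            bad.append({"row": row, "errors": failed})
        else:
            good.append(row)
    return good, bad
```

Keeping the rules in a dictionary mirrors the reusable-transformation idea: the same validation set can be shared across mappings instead of being re-coded per load.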

Environment: Informatica PowerCenter 9.5.1, PowerExchange 9.1, Information Lifecycle Management (ILM), Salesforce, Oracle 11g, SQL Server, Flat Files, Work Bench, Toad, SQL Worksheet, Erwin 4.0, Autosys, Business Objects 4.0, Windows XP, UNIX, Quality Center.

Confidential, VA

Informatica / ETL Developer

Responsibilities:

  • Involved in Building scalable, highly available, and reliable business intelligence systems
  • Developed Extraction, Transformation and Loading (ETL) processes to acquire and load data spanning across various applications (enterprise, cloud), data centers, and social graphs
  • Experience in Performance tuning and Optimization of Cache with Informatica and OBIEE.
  • Used the concept of staged mapping to perform asynchronous Web Services request and response as part of Informatica mappings
  • Participated in setting the strategy for data warehousing systems; set standards and best practices for programming, security, and operations areas
  • Collaborated with developers, stakeholders and subject-matter experts to establish technical vision and analyze trade-offs between usability and performance needs
  • Played a key role in the setting of standards for both the physical and logical database schema and table design.
  • Modifying the UNIX scripts as a part of Upgrade and making sure they point to the correct directories.
  • Participated in design and specification of the hardware and storage architecture used to support data warehousing systems
  • Collaborated with global delivery teams
  • Involved with data modeling using ER and dimensional modeling techniques
  • Experience in defining security standards aligned with corporate data governance and compliance policies
  • Wrote UNIX Shell Scripting for Informatica Pre-Session, Post-Session Scripts and also to run the workflows.
  • Involved in developing the OBIEE Repository at three layers (Physical, Business Model and Presentation), Interactive Dashboards and drill-down capabilities using global and local filters and security setups
  • Involved in designing prototypes and working closely with implementation teams in engineering and operations
  • Used OBIEE 11g for creating RPDs, reports and dashboards.
  • Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functionality requirements.
  • Processed COBOL source files through Power Exchange using data maps
  • Implemented various workflows using transformations such as SQL Transformation, XML Transformation, Normalizer, Lookup, Aggregator and Stored Procedure, and scheduled jobs for different sessions.
  • Performance Tuning of Sessions and Mappings.
  • Tuned performance of Informatica sessions for large data files by implementing pipeline partitioning and increasing block size, data cache size, sequence buffer length, and target-based commit interval.
  • Worked on DB2.
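The target-based commit interval mentioned above bounds transaction size during large loads: rows are buffered and flushed to the target every N rows rather than one giant commit. A minimal sketch in Python under those assumptions (illustrative only; `write_batch` is a hypothetical stand-in for the target writer):

```python
def load_in_batches(rows, commit_interval, write_batch):
    """Illustrate target-based commit: buffer incoming rows and flush
    them through write_batch every `commit_interval` rows, with a final
    flush for any remainder. Returns the number of commits issued.
    (Sketch of the session setting, not the Informatica engine itself.)"""
    buffer, commits = [], 0
    for row in rows:
        buffer.append(row)
        if len(buffer) >= commit_interval:
            write_batch(buffer)   # one commit per full buffer
            commits += 1
            buffer = []
    if buffer:
        write_batch(buffer)       # commit the trailing partial batch
        commits += 1
    return commits
```

Smaller intervals shrink rollback segments and lock duration; larger ones reduce commit overhead, which is the trade-off being tuned in the bullet above.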

Confidential, Dallas, TX

Sr. Informatica Developer

Responsibilities:

  • Worked as a Sr. Informatica Developer and participated in all the project phases starting from Requirements gathering through Deployment of this Program (SDLC).
  • Worked as part of an onsite-offshore model. As a developer, I took responsibility for creating checklists for coding standards, naming conventions, etc. I also developed reusable code components to maintain standard ETL load practices, such as a table analyze script, stage pre-load, and file archiving that retained two months of archived files and logs.
  • Used Full PDO (Pushdown Optimization) in Informatica for loading data.
  • Creating mappings to load data using different transformations
  • Troubleshot the ETL process developed for conversions and implemented various techniques for enhancing the performance.
  • Extensively involved in creating design documents for loading data into Data warehouse and worked with the data modeler to change/update the Data warehouse model when needed.
  • Used Teradata utilities FastLoad, MultiLoad and TPump to load data
  • Wrote Teradata Macros and used various Teradata analytic functions
  • Excellent knowledge on ETL tools such as Informatica, SAP BODS to load data to Teradata by making various connections to load and extract data to and from Teradata efficiently.
  • Developed ETL processes to load data into dimensional model from various sources using Informatica Power Center 9.5.1
  • Developed Mapplets and reusable transformations that were used in different mappings and across different folders
  • Designed and developed the error handling mechanism to be used for all the Informatica jobs, which load data into the data warehouse.
  • Extensively used warehouse designer of Informatica Power center Created different target definitions.
  • Worked on billion-record tables (performance tuning for historical loads)
  • Created robust and complex workflows and worklets using Informatica Workflow Manager and troubleshot data load problems
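The shared error handling mechanism referred to in the bullets above typically wraps every load job the same way: run the job, record the job name and failure reason in a common error log, and report a status without aborting the rest of the batch. A minimal sketch in Python (illustrative; names like `job_name` and the log structure are hypothetical):

```python
def run_with_error_handling(job_name, job_fn, error_log):
    """Common wrapper used by all jobs: execute job_fn, and on failure
    append a record (job name + error text) to the shared error_log and
    report FAILED instead of raising, so sibling jobs keep running.
    (Illustrative sketch of a shared ETL error mechanism.)"""
    try:
        result = job_fn()
        return {"job": job_name, "status": "SUCCEEDED", "result": result}
    except Exception as exc:
        error_log.append({"job": job_name, "error": str(exc)})
        return {"job": job_name, "status": "FAILED", "result": None}
```

Centralizing this logic is what makes the mechanism reusable "for all the Informatica jobs": each job supplies only its own body, never its own error plumbing.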

Environment: Informatica Power Center 9.1 Hotfix 2 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Oracle 11g, SeaQuest, HPDM, SQL Server, Teradata, UNIX, Toad, Control-M.

Confidential, Carlsbad, CA

EDW Informatica Developer

Responsibilities:

  • Worked in all the project phases starting from Requirements gathering through Deployment of this Program.
  • ETL Tool Informatica is used for Design and Development of the code.
  • Data is sourced from 22 different work units, which span a wide variety of data sources including Oracle, flat files, MS Access and spreadsheets.
  • ETL flows are developed from Source to Stage, Stage to Work tables and Stage to Target Tables.
  • Used Conversion process for VSAM to ASCII source files using Informatica Power Exchange
  • Used BTEQ and SQL Assistant (Query man) front-end tools to issue SQL commands matching the business requirements to Teradata RDBMS
  • Extracted data from various sources such as DB2, Oracle, Teradata fixed width and de-limited flat files, cleansed and loaded data into Teradata and Flat file targets using Informatica and PL/SQL
  • Source and Target Definitions are imported from Source Analyzer and Target Designer connecting to databases using Relational and ODBC Connections.
  • Developed Informatica Mappings for the complex business requirements provided using different transformations like Normalizer, SQL Transformation, Expression, Aggregator, Joiner, Lookup, Sorter, Filter, and Router and so on.
  • Worked in developing Mapplets and Re-usable Transformations for reusability and reducing effort.
  • Created Work Flows with Command Tasks, Worklets, Decision, Event Wait and Monitored sessions by using workflow monitor.
  • Migrated Informatica Folders from Development Env to Test and System Test Env and Worked with Admins to migrate the same to Production environments.
  • Wrote PL/SQL procedures for reconciliation of financial data between source and target to automate testing phases and help business for preliminary validation.
  • Wrote UNIX scripts, environment files for Informatica.
  • Developed Metadata driven code for effective utilization and maintenance using technical metadata, business metadata and process metadata.
  • Used parameter files in Informatica to externalize business logic instead of hardcoding it in the mappings.
  • Generated BO reports to test standardized reports as per business requirements.
  • Tuned Mappings and Mapplets for best Performance on ETL Side and Created Indexes and Analyzed tables periodically on Database side.
  • Organized the dataflow and developed many AutoSys jobs for Scheduling MINT program and moved to production.
  • Served as a primary resource on the production support team; joined emergency calls when application outages occurred and resolved defects as they were raised.
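Informatica parameter files of the kind mentioned above use `[section]` headers to scope `$$NAME=value` assignments per workflow or session, so mappings read business values at run time instead of hardcoding them. A simplified parser in Python to illustrate the format (the section and parameter names shown are hypothetical):

```python
def parse_param_file(text):
    """Parse a simplified Informatica-style parameter file: lines like
    [Section] start a scope; subsequent NAME=value lines are collected
    under it; blank lines and '#' comments are skipped. Returns a dict
    of {section: {name: value}}. (Illustrative sketch, not the real
    PowerCenter parser.)"""
    params, section = {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue                      # comment or blank line
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]          # open a new scope
            params[section] = {}
        elif "=" in line and section:
            name, _, value = line.partition("=")
            params[section][name.strip()] = value.strip()
    return params
```

Externalizing values this way is what lets the same mapping run unchanged across environments; only the parameter file differs per promotion.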

Environment: Informatica Power Exchange, Power Center 8.6, Power Connect, Teradata, Data Explorer and Data Quality, DataFlux, Erwin 3.5, OBIEE 10.1.3, DB2, Oracle 10g, SQL*Loader, XML, UNIX, Win NT, TOAD, AutoSys.

Confidential

ETL Developer

Responsibilities:

  • Developed Informatica jobs for Conversions which involved loading the mainframe extract data into staging tables
  • Troubleshot the ETL process developed for conversions and implemented various techniques for enhancing the performance.
  • Developed and maintained custom PL/SQL packages and procedures for analysis of source system data and also for data validation
  • Extensively involved in creating design documents for loading data into the data warehouse and worked with the data modeler to change/update the data warehouse model when needed.
  • Developed ETL processes to load data into dimensional model from various sources using Informatica Power Center 8.1.
  • Developed Mapplets and reusable transformations that were used in different mappings and across different folders
  • Designed and developed the error handling mechanism to be used for all the Informatica jobs, which load data into the data warehouse.
  • Extensively used the Warehouse Designer of Informatica PowerCenter and created different target definitions.
  • Created robust and complex workflows and worklets using Informatica Workflow Manager and troubleshot data load problems

Environment: Informatica Power Center 8.1(Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Oracle 9i, SQL Server, UNIX, Toad.

Confidential, New York, NY

DW Informatica Developer

Responsibilities:

  • Involved in meetings with the business for gathering requirements and identifying the source data.
  • Developed SQL packages for analyzing the source system data and for generating the extract scripts used to load data into staging tables.
  • Performed gap analysis and determined the cross-reference tables needed.
  • Developed complex mappings and workflows using PowerCenter Designer and Workflow Manager for loading data into the data warehouse from staging.
  • Troubleshot the Informatica jobs, identified the performance bottlenecks, and adopted various techniques to improve performance.
  • Performed unit testing on the Informatica jobs developed and documented the test results.
  • Involved in updating the shell scripts and parameter files, which are used to run the workflows daily, with the new Informatica jobs developed.
  • Involved in integration testing of the jobs developed and migration of jobs from one environment to another.
  • Developed Informatica jobs for loading data into DataMart from data warehouse using reusable transformations and Mapplets.
  • Involved in Informatica administrative tasks: repository and folder management, user permissions, version control, metadata reporting, and server management.
  • Worked on performance tuning for mappings by optimizing the sources, transformations, targets and sessions.
  • Developed test scripts for Unit testing the Informatica jobs developed for loading data into data mart.
  • Involved in production support for the data warehouse Informatica jobs

Environment: Informatica Power Center 8.1, Informatica Power Connect (Repository Manager, Designer, Server Manager, Workflow Monitor, Workflow Manager), Power Exchange, Power Analyzer, ETL, Oracle 11g/10g, Teradata V2R5, PL/SQL, ODI, UNIX commands, Trillium 11, Excel and shell scripting, MS Windows XP Professional 2002, UNIX.
