Sr. Informatica Developer Resume

Raleigh, NC

SUMMARY:

  • Senior Informatica Developer with 12+ years of IT experience and a strong background in ETL and data warehousing using Informatica PowerCenter 9.x/8.x/7.x. Experience in planning, designing, developing and implementing data warehouses/data marts, with experience in both relational and multidimensional database design.
  • Experience in using Informatica PowerCenter 9.6.1/9.5/9.1/8.6.1/8.1.1/7.1 - Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager, Workflow Monitor, Repository Manager and Admin Console.
  • Extensive experience in developing complex mappings using varied transformations such as Source Qualifier, Connected and Unconnected Lookups, Router, Filter, Sorter, Normalizer, Expression, Aggregator, Joiner, Union, Update Strategy, Stored Procedure and Sequence Generator.
  • Expertise in design and implementation of Confidential - slowly changing dimension types (1, 2 and 3) and Confidential - change data capture.
  • Experienced in loading data, troubleshooting and debugging mappings, performance tuning of Informatica (sources, targets, mappings and sessions), and fine-tuning transformations to make them more efficient in terms of session performance.
  • Experience in implementing the complex business rules by creating re-usable transformations, developing complex Mapplets and Mappings.
  • Instrumental in setting up standard ETL naming standards and best practices throughout the ETL process (transformations, sessions, workflow names, log files, input, variable and output ports).
  • Database experience using Teradata 14.10/14, PostgreSQL, H2 Database, Oracle 11g/10g/9i/8.x/7.x, MS SQL Server 2005/2000, AS400, DB2 and MS Access. Also used SQL Editors such as Teradata SQL Assistant, H2 Console, TOAD, SQL PLUS and SQL Analyzer.
  • Experience in using Teradata Parallel Transporter (TPT) load protocols like LOAD, UPDATE, STREAM & EXPORT.
  • Experience in using Teradata Macros, Query banding and BTEQ Scripts on Linux/Unix.
  • Strong experience using T-SQL, PL/SQL Procedures/Functions, Triggers and Packages.
  • Experience in Unix/Linux Operating System and Shell scripting.
  • Experience in integration of various data sources like Teradata, Oracle, PostgreSQL, S3, HDFS, MS SQL Server, Flat Files, and XML Definitions.
  • Good understanding of Views, Synonyms, Indexes, Joins, and Sub-Queries.
  • Working knowledge of data warehouse techniques and practices, experience including ETL processes, dimensional data modeling (Star Schema, Snow Flake Schema, FACT & Dimension Tables), OLAP.
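
The slowly-changing-dimension expertise above can be illustrated with a minimal Type 2 sketch as a Teradata BTEQ script. This is only an illustration: the database, table and column names, and the logon line, are hypothetical placeholders.

```shell
#!/bin/sh
# Sketch of a Type 2 SCD load as a BTEQ script (all names and credentials are placeholders).
cat > scd2_customer.bteq <<'EOF'
.LOGON tdprod/etl_user,password;

/* Expire current rows whose tracked attributes changed in staging */
UPDATE dim_customer
SET    eff_end_dt = CURRENT_DATE - 1,
       current_flag = 'N'
WHERE  current_flag = 'Y'
AND    customer_id IN (
         SELECT s.customer_id
         FROM   stg_customer s
         JOIN   dim_customer d
           ON   d.customer_id = s.customer_id
          AND   d.current_flag = 'Y'
         WHERE  s.addr_line1 <> d.addr_line1);

/* Insert new current versions for changed and brand-new keys */
INSERT INTO dim_customer (customer_id, addr_line1, eff_start_dt, eff_end_dt, current_flag)
SELECT s.customer_id, s.addr_line1, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
LEFT JOIN dim_customer d
  ON   d.customer_id = s.customer_id
 AND   d.current_flag = 'Y'
WHERE  d.customer_id IS NULL;

.LOGOFF;
EOF
echo "staged scd2_customer.bteq"
```

In production the script would be run as `bteq < scd2_customer.bteq`; by contrast, Type 1 overwrites attributes in place and Type 3 keeps a previous-value column.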

TECHNICAL SKILLS:

  • Informatica PowerCenter 9.6.1/9.5/9.1/8.6.1/8.1.1/7.1
  • Admin Console
  • Informatica Developer
  • OWB10g
  • Apache Spark
  • Azkaban
  • GitHub
  • PostgreSQL
  • Amazon S3
  • HDFS
  • Teradata 13.10/14/14.10/15.10
  • Oracle 11g/10g/9i/8i
  • MS SQL Server 2005/2000
  • MS Access
  • DB2
  • AS400
  • H2 Database
  • SQL, PL/SQL
  • C, C++, Data Structures
  • Unix Shell Script
  • Teradata SQL Assistant
  • H2 Console
  • SQL*Plus
  • PL/SQL Developer
  • Toad
  • SQL*Loader
  • Sqoop
  • Windows Server 2003/2008
  • UNIX/MS - DOS/Linux

PROFESSIONAL EXPERIENCE:

Sr. Informatica Developer

Confidential, Raleigh, NC

Responsibilities:

  • Created functional requirements documents based on business requirements
  • Created Internal Wiki documents with information about the process design and production support steps.
  • Design, develop/modify ETL mappings based on technical specifications
  • Perform unit and integration testing
  • Design and develop ETL Load Audit Control Architecture and processes for reporting and monitoring
  • Designed/Developed a framework to enable the reusability of tables and sessions across the project.
  • Used Confluence as the organizational wiki to discuss work, upload process documents, etc.
  • Created Sqoop scripts to ingest data from HDFS to Teradata and from SQL Server to HDFS and to PostgreSQL.
  • Used Azkaban scheduler to run, monitor Hadoop Jobs.
  • Used H2 Console to access H2 database and other databases using JDBC drivers.
  • Used Amazon S3 as storage for all the data.
  • Used Apache Spark to connect to various API and to ingest data.
  • Upgraded Informatica from 9.5.1 to 9.6.1 on Windows server and client.
  • Created Batch scripts to execute workflows, schedule/unschedule jobs, import/export users to Admin console, FTP scripts, BTEQ and TPT scripts.
  • Involved in Informatica server configurations, PowerCenter server/client and Power Exchange Installations, setting up Admin Console, Repository migrations and Database backup/restores.
  • Assist with day-to-day operational tasks related to team responsibilities, and provide both ongoing and off-hour support and maintenance on ETL / DW solutions
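
The Sqoop movements described above might look like the following staged job script. All connection strings, credential files, paths and table names are placeholders, and the HDFS-to-Teradata export assumes a Teradata-capable Sqoop connector is installed; the script is staged here rather than executed.

```shell
#!/bin/sh
# Sketch of the Sqoop ingestion jobs (hypothetical hosts, tables and paths).
cat > sqoop_jobs.sh <<'EOF'
#!/bin/sh
# HDFS -> Teradata (assumes a Teradata Sqoop connector on the classpath)
sqoop export \
  --connect jdbc:teradata://tdhost/DATABASE=edw \
  --username etl_user --password-file /user/etl/.pw \
  --table SALES_FACT \
  --export-dir /data/warehouse/sales_fact \
  --input-fields-terminated-by '\t'

# SQL Server -> HDFS
sqoop import \
  --connect 'jdbc:sqlserver://sqlhost;databaseName=orders' \
  --username etl_user --password-file /user/etl/.pw \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4
EOF
echo "staged sqoop_jobs.sh"
```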

Environment: Informatica PowerCenter 9.5.1/9.6.1, Teradata 14.10, SAP, SQL Server 2008, Teradata SQL Assistant, Rally Agile tool, Informatica Scheduler, Globalscape, SQL, Batch Scripts, Windows Server 2008/2003, GitHub, Amazon S3, Sqoop, Azkaban Scheduler, H2 Console, Spark, HDFS, Shell Scripts, Confluence.

Sr. Informatica Developer

Confidential, CA

Responsibilities:

  • Created functional requirements documents based on business requirements
  • Created Data Lineage / Source-to-Target mappings with appropriate data rules and transformation.
  • Perform Data Profiling on new source system(s), data research / analysis, and gap analysis
  • Design ETL processes and develop / modify ETL programs based on technical specifications
  • Perform unit, integration and regression testing
  • Design and develop ETL Load Audit Control Architecture and processes for reporting and monitoring
  • Involved in the design of an ETL framework to load transactional data from SAP to the ODS.
  • Used BCI Mappings to activate data sources and to pull the data from SAP.
  • Created Native BTEQ scripts to load historical data from AS400 to Teradata.
  • Used TPT LOAD protocol to load large volumes of data to Teradata.
  • Created Batch scripts to execute workflows, schedule/unschedule jobs, import/export users to Admin console, FTP scripts, BTEQ and TPT scripts.
  • Involved in Informatica server configurations, PowerCenter server/client and Power Exchange Installations, setting up Admin Console, Repository migrations and Database backup/restores.
  • Assist with day-to-day operational tasks related to team responsibilities, and provide both ongoing and off-hour support and maintenance on ETL / DW solutions
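
A hedged sketch of the kind of native BTEQ script used for the historical loads above; the database, table and logon values are hypothetical placeholders, and the script is staged rather than executed here.

```shell
#!/bin/sh
# Sketch of a native BTEQ historical-load script (all names are placeholders).
cat > hist_load.bteq <<'EOF'
.LOGON tdprod/etl_user,password;
.SET ERRORLEVEL UNKNOWN SEVERITY 8;

/* Land the AS400 history, staged to Teradata, into the target table */
INSERT INTO edw.order_hist
SELECT * FROM stg.order_hist_as400;

.IF ERRORCODE <> 0 THEN .QUIT 12;
.LOGOFF;
.QUIT 0;
EOF
# Production run: bteq < hist_load.bteq > hist_load.log 2>&1
echo "staged hist_load.bteq"
```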

Environment: Informatica PowerCenter 9.5/9.1, Teradata 14, SAP, SQL Server 2008/2005, AS400, iSeries Navigator, Teradata SQL Assistant, Rally Agile tool, Informatica Scheduler, Globalscape, SQL, Batch Scripts, Windows Server 2008/2003, TortoiseGIT.

Sr. ETL Developer

Confidential, Tempe, AZ

Responsibilities:

  • Based on the business requirements, created Functional design documents and Technical design specification documents for ETL Process.
  • Developed a standard ETL framework to enable the reusability of similar logic across the board.
  • Designed, developed and implemented ETL processes to extract, transform and load data from inbound flat files and various source systems into Confidential using Informatica PowerCenter.
  • Developed mappings to extract data from SQL Server, Oracle, Flat files and load into Teradata, Oracle using the PowerCenter.
  • Created complex mappings and reusable transformations. Made use of mapping variables, mapping parameters.
  • Worked extensively on different types of transformations like Source qualifier, expression, Aggregator, Router, filter, update strategy, Connected and Un-connected lookup, sorter, Normalizer, SQL transformation and sequence generator.
  • Created workflows, worklets and used all the other tasks like email, command, decision, event wait, event raise and assignment tasks in the Workflow Manager.
  • Designed and developed process to handle very high volumes of data using Teradata Parallel Transporter (TPT) load protocols like LOAD and UPDATE.
  • Expertise in writing BTEQ scripts on Linux/Unix. Used BTEQ scripts in pre/post load processes for inserting/updating the process activity tables. Also used Teradata Macros and Query banding.
  • Used Informatica Push Down Optimization (PDO) to push the transformation processing from the PowerCenter engine into the relational database to improve performance.
  • Wrote shell scripts to execute the workflows, clean up the files and execute BTEQ scripts.
  • Used Autosys to Schedule and control the batch jobs
  • Assisted and worked on performance testing, data quality assessment & production deployments.
  • Involved in jobs scheduling, monitoring and production support in a 24/7 environment.
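
The TPT LOAD approach above can be sketched as a tbuild job script. This is a simplified illustration: `$LOAD` and `$FILE_READER` are standard TPT operator templates, while the job, schema and table names are hypothetical; the script is staged, not run.

```shell
#!/bin/sh
# Sketch of a Teradata Parallel Transporter job using the LOAD operator.
cat > sales_load.tpt <<'EOF'
DEFINE JOB load_sales_fact
DESCRIPTION 'Bulk load of a high-volume flat file via the LOAD operator'
(
  APPLY ('INSERT INTO edw.sales_fact VALUES (:sale_id, :sale_dt, :amount);')
  TO OPERATOR ($LOAD[2])
  SELECT * FROM OPERATOR ($FILE_READER[2]);
);
EOF
# Production run: tbuild -f sales_load.tpt -v jobvars.txt -j load_sales_fact
echo "staged sales_load.tpt"
```

The UPDATE operator would be substituted when the target already holds data or needs upserts, per the LOAD/UPDATE distinction mentioned above.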

Environment: Informatica PowerCenter 9.1, Teradata 13.10, Oracle 10g, Teradata SQL Assistant, TOAD, SQL*Loader, Autosys, NDM, SQL, PL/SQL, UNIX Shell Scripts, Linux/Solaris, TortoiseSVN.

Sr. ETL Developer

Confidential, Columbia, MD

Responsibilities:

  • Involved with Business Analysts in gathering requirements.
  • Involved in designing Logical/Physical Data Models.
  • Developed PowerCenter mappings to extract data from various databases and flat files and load into Confidential using Informatica 8.6.1.
  • Created complex mappings using different transformations like Filter, Router, Connected & Unconnected Lookups, Stored Procedure, Joiner, Update Strategy, Union, Expression and Aggregator transformations to pipeline data to Confidential. Also made use of variables and parameters.
  • Worked on multiple enhancement projects, conversion from PowerCenter 8.1.1 to 8.6.1 and OWB to Informatica.
  • Created T-SQL procedures to detect Confidential methodologies and developed Slowly Changing Dimension logic for Type 1 Confidential.
  • Created indexes and primary keys, and performed other performance tuning at the database level.
  • Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.
  • Created Technical Design Document and worked with operations team to resolve production issues, created testing metrics using MS-Excel
  • Wrote shell scripts in UNIX to execute the workflow in a loop to process ‘n’ number of files, and FTP scripts to pull the files from the FTP server to the Linux server.
  • Worked with the reporting team to generate reports from Confidential using Cognos.
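
The ‘n’-file loop wrapper described above can be sketched in plain shell. The inbox path, file pattern and workflow invocation are hypothetical; the per-file pmcmd call is shown as a comment, and two dummy files are created so the sketch runs end to end.

```shell
#!/bin/sh
# Sketch of a wrapper that runs a workflow once per arrived file.
INBOX=./inbox
mkdir -p "$INBOX"
touch "$INBOX/feed_001.dat" "$INBOX/feed_002.dat"   # simulate file arrivals for the sketch

processed=0
for f in "$INBOX"/feed_*.dat; do
  [ -e "$f" ] || continue
  # Production step per file (placeholder names):
  #   pmcmd startworkflow -sv int_svc -d dom -f FOLDER -wait wf_load_feed
  echo "processing $f"
  mv "$f" "$f.done"
  processed=$((processed + 1))
done
echo "processed $processed files"
```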

Environment: Informatica PowerCenter 8.6/8.1, SQL Server 2005, TOAD, Rapid SQL, Oracle 10g (RAC), T-SQL, PL/SQL, UNIX Shell Scripts, FTP and OWB.

Informatica Developer

Confidential, CA

Responsibilities:

  • Responsible for Business Analysis and Requirements Collection and involved in Designing Technical Specification.
  • Transforming high-level design spec to simple ETL coding. Designing mappings according to the mapping standards, naming standards and warehouse standards for future application development.
  • Worked on Informatica PowerCenter 8.6.1/7.1.3 tool -Source Analyzer, Data warehousing designer, Mapping Designer & Mapplet, Transformations, Work Flow Manager (Task Developer, Worklets, and Work Flow Designer) and Work Flow Monitor.
  • Extensively used Informatica PowerCenter to load data from Flat Files to DB2, Flat Files to SQL Server, DB2 to XML files, Mainframe to DB2, and Flat Files to Oracle.
  • Created mappings using the transformations like Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy and Joiner transformations.
  • Developed and scheduled various pre and post-sessions commands and workflows for all mappings to load data from source files to target tables.
  • Created one-time mappings and workflows that were easy to understand and use.
  • Worked with Variables and Parameters in the mappings.
  • Developed PL/SQL and UNIX Shell Scripts for scheduling the sessions in Informatica.
  • Involved in Performance tuning for sources, targets, mappings and sessions.
  • Used breakpoints and various test conditions with Debugger tool to test the logic and the validity of the data moving through the mappings.
  • Created logical and physical data models and maintained relationships between tables using Erwin. Well versed with reverse engineering processes in Erwin.
  • Created, validated, tested and ran the sequential and concurrent batches and sessions. Migrated mappings from Development to Testing and performed Unit Testing and Integration Testing.
  • Created deployment groups, migrated the code into different environments.
  • Worked closely with reporting team to generate various reports.
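
Scheduling sessions from shell, as described above, typically wraps pmcmd; a hedged sketch follows, in which the integration service, domain, folder and workflow names are placeholders and credentials are assumed to come from the environment.

```shell
#!/bin/sh
# Sketch of a pmcmd wrapper used to kick off a workflow from cron/shell.
cat > run_wf_daily.sh <<'EOF'
#!/bin/sh
# Placeholder service/domain/folder/workflow names; credentials from the environment.
pmcmd startworkflow -sv int_svc_dev -d dom_dev \
  -u "$INFA_USER" -p "$INFA_PASS" \
  -f SALES_DM -wait wf_daily_load
rc=$?
[ $rc -eq 0 ] || echo "wf_daily_load failed with rc=$rc" >&2
exit $rc
EOF
chmod +x run_wf_daily.sh
echo "staged run_wf_daily.sh"
```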

Environment: Informatica PowerCenter 7.1, SQL Server 2000, TOAD, Oracle 9i, DB2, SQL, PL/SQL, UNIX Shell Scripts, Erwin.

Sr. ETL Developer

Confidential, Honolulu, HI

Responsibilities:

  • Interacted with the business community and gathered requirements based on changing needs. Incorporated identified factors into Informatica mappings to build the Confidential.
  • Developed a standard ETL framework to enable the reusability of similar logic across the board. Involved in System Documentation of Dataflow and methodology.
  • Assisted in designing Logical/Physical Data Models, forward/reverse engineering using Erwin 4.0.
  • Developed mappings to extract data from SQL Server, Oracle, Flat files and load into Confidential using the PowerCenter.
  • Developed common routine mappings. Made use of mapping variables, mapping parameters and variable functions.
  • Used Informatica Designer to create complex mappings using different transformations like Filter, Router, Connected & Unconnected Lookups, Stored Procedure, Joiner, Update Strategy, Expression and Aggregator transformations to pipeline data to Confidential.
  • Developed Slowly Changing Dimension for Type 1 Confidential
  • Performed data extraction and data transfer from and to the SQL Server database using utilities/tools like BCP and BULK INSERT.
  • Wrote SQL queries, triggers and PL/SQL procedures to retrieve data from the DWH and to apply and maintain the business rules.
  • Created indexes and primary keys, and performed other performance tuning at the database level.
  • Conducted Database testing to check Constraints, field size, Indexes, Stored Procedures, etc.
  • Created testing metrics using MS-Excel
  • Implemented various Performance Tuning techniques on Sources, Targets, Mappings, Workflows and database tuning.
  • Written shell scripts in UNIX to execute the workflow.
  • Involved in generating reports from Confidential using Cognos.
  • Tracked, reviewed and analyzed defects.
  • Performed Configuration Management to Migrate Informatica mappings/sessions /workflows from Development to Test to production environment.
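
The BCP and BULK INSERT transfers mentioned above might look like the following; the server, table and path names are hypothetical, and both scripts are staged rather than executed here.

```shell
#!/bin/sh
# Sketch of exporting with bcp and reloading with BULK INSERT (placeholder names).
cat > bcp_out.sh <<'EOF'
#!/bin/sh
# Export a table to a pipe-delimited file using a trusted connection (-T)
bcp dwh.dbo.customer out /data/out/customer.dat -S sqlhost -T -c -t '|'
EOF

cat > bulk_insert.sql <<'EOF'
BULK INSERT dwh.dbo.customer_stg
FROM '/data/out/customer.dat'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n', TABLOCK);
EOF
echo "staged bcp_out.sh and bulk_insert.sql"
```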

Environment: Informatica PowerCenter 8.6, SQL Server 2005/2000, TOAD, Oracle 10g, SQL, PL/SQL, UNIX Shell Scripts, Erwin.

ETL Consultant

Confidential, Atlanta, GA

Responsibilities:

  • Involved in Analysis, Requirements Gathering and documenting Functional & Technical Specification.
  • Created Stored Procedures to transform the Data and worked extensively in PL/SQL for various needs of the transformations while loading the data.
  • Used Informatica Designer for developing mappings, using transformations including aggregation, update, lookup and summation. Developed sessions using Server Manager and improved session performance.
  • Created transformations like Aggregate, Expression, Filter, Sequence Generator, Joiner, and Stored procedure transformations.
  • Created reusable transformations called mapplets and used them in mappings in case of reuse of the transformations in different mappings.
  • Created and used parameters and variables.
  • Involved in the process design documentation of the Data Warehouse Dimensional Upgrades. Extensively used Informatica for loading the historical data from various tables for different departments.
  • Involved in creating Technical Specification Document (TSD) for the project.
  • Involved in gathering business requirements, logical modeling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
  • Involved in the development of Confidential and populating the data marts using Informatica.
  • Created sessions to run the mappings. Created mapplets to improve the Performance

Environment: Informatica 7.1, Oracle 9i, PL/SQL, SQL, Sybase, Windows 2000

Confidential, Durham, NC

ETL Developer

Responsibilities:

  • Extensive experience with ETL processes using Informatica PowerCenter 8.x/7.x.
  • Participated in exporting the sources, targets, mappings, workflows, tasks, etc., importing them into the new Informatica 8.x, and testing and reviewing to make sure that all the workflows execute as per the design documents and tech specs.
  • Developing the Mappings using needed Transformations in Informatica tool according to technical specifications
  • Created/modified various Informatica Mappings and workflows for the successful migration of the data from various source systems to DB2 which meets the business requirements for reporting and analysis.
  • Extensively used XML Source, XML target and XML Parser transformations
  • Imported Source and Target tables from their respective databases
  • Developing Mapplets and Transformations for migration of data from existing systems to the new system using Informatica Designer
  • Excellent understanding of Star Schema modeling, Snowflake modeling.
  • Extensively worked on Informatica PowerCenter transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator and Sorter.
  • Interacted with the offshore team and the database team to resolve any issues.
  • Preparing the Documentation for the mapping according to the designed logic used in the mapping.
  • Preparing Test Cases and executing them along with the Testing team
  • Participated in weekly status meetings, and conducting internal and external reviews as well as formal walkthroughs among various teams, and documenting the proceedings.

Environment: Informatica PowerCenter 8.6 and 9.x, Unix, PL/SQL, SQL Server 2008, Windows 7, Oracle, XML Files, Flat Files, Mercury Quality Center, WinFTP, DB2, Visio, Office 2010.

Confidential, McLean, VA

ETL Developer

Responsibilities:

  • Developed the needed one-to-one (1-1) mappings in Informatica according to technical specifications.
  • Importing Source and Target tables from their respective databases
  • Developed Informatica Mappings and workflows for migration of data from existing systems to the new system using Informatica Designer
  • Preparing the Documentation for the mapping according to the designed logic used in the mapping
  • Ran jobs in UNIX; using existing shell scripts, customized new jobs and ran workflows in the UNIX environment.
  • Performed performance tuning at the mapping and transformation level.
  • Preparing Test Cases and worked on executing the cases.

Environment: Informatica PowerCenter 9.x, Unix, PL/SQL, Windows 7, Oracle, Mercury Quality Center, WinFTP, DB2, Visio, Office 2010
