Senior Informatica ETL Developer Resume

Buffalo, NY

SUMMARY

  • Over 7 years of IT experience in all phases of the Software Development Life Cycle (SDLC), including user interaction, business analysis/modeling, design, development, integration, planning, testing, and documentation for data warehouse applications, ETL processing, and distributed applications.
  • Strong expertise in ETL tools: Informatica Power Center 9.6 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), IDQ, PowerExchange, and ETL concepts.
  • Experienced in SQL and PL/SQL programming: stored procedures, functions, cursors, triggers, views, materialized views, indexes, table partitioning, and query performance tuning.
  • Worked with various transformations such as Normalizer, Expression, Rank, Filter, Aggregator, Lookup, Joiner, Sequence Generator, Sorter, SQL, Stored Procedure, Update Strategy, and Source Qualifier.
  • Data modeling: knowledge of dimensional data modeling, star schema, snowflake schema, and fact and dimension tables.
  • Hands-on experience tuning mappings and identifying and resolving performance bottlenecks at the source, target, mapping, and session levels.
  • Extensive experience with data extraction, transformation, and loading (ETL) from disparate data sources such as relational databases (Teradata, Oracle, SQL Server, DB2), VSAM, XML, and flat files.
  • Experience working with PowerExchange to process VSAM files.
  • Designed and developed Informatica mappings, including Type 1, Type 2, and Type 3 slowly changing dimensions (SCD); a Type 2 sketch follows this list.
  • Coordinated with business users, BI teams, the functional design team, and the testing team during all phases of project development to resolve issues.
  • Worked with the Informatica Data Quality (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, and exception handling.
  • Performed data profiling and analysis using Informatica Data Quality (IDQ).
  • Identified and eliminated duplicates in datasets using IDQ matching components such as Bigram Distance, Edit Distance, and Hamming Distance.
  • Worked with various IDQ transformations such as Standardizer, Match, Association, Parser, Weighted Average, Comparison, Consolidation, Decision, and Expression.
  • Basic knowledge and understanding of Informatica Cloud.
  • Experienced in Teradata SQL programming and Teradata utilities such as FastLoad, MultiLoad, TPump, and Teradata Parallel Transporter (TPT).
  • Experienced in using advanced Informatica features such as pushdown optimization (PDO) and pipeline partitioning.
  • Good hands-on experience writing UNIX shell scripts to process data warehouse jobs.
  • Experience working with big data Hadoop stack tools: HDFS, Hive, Pig, and Sqoop.
  • Expert in importing and exporting data between HDFS/Hive and relational databases using Sqoop (a sample import follows this list).
  • Experience in performance tuning HiveQL and Pig scripts.
  • Applied various techniques at both the database and application levels to find bottlenecks and improve performance.
  • Good skills in defining standards, methodologies and performing technical design reviews.
  • Executed software projects for banking and financial services.
  • Good communication skills, interpersonal skills, self-motivated, quick learner, team player.
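
A minimal sketch of the Type 2 SCD load pattern referenced above, written as a UNIX shell script driving SQL*Plus. The table, column, sequence, and connection names (dim_customer, stg_customer, $ORA_CONN) are illustrative assumptions, not details from the projects below.

#!/bin/ksh
# scd2_load.sh -- close out changed dimension rows, then insert new current versions.
# dim_customer, stg_customer, dim_customer_seq and $ORA_CONN are placeholders.
sqlplus -s "$ORA_CONN" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE ROLLBACK;

-- Step 1: expire the current version of any row whose tracked attributes changed.
UPDATE dim_customer d
   SET d.eff_end_dt   = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.addr <> d.addr OR s.status <> d.status));

-- Step 2: insert a fresh current version for new keys and for keys expired above.
INSERT INTO dim_customer
      (customer_key, customer_id, addr, status,
       eff_start_dt, eff_end_dt, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.addr, s.status,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');

COMMIT;
EOF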
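
The Sqoop import/export work mentioned above generally takes the shape below; the JDBC URL, credentials, and table names are placeholders.

# Import an Oracle table into Hive (connection details are illustrative).
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user -P \
  --table CUSTOMERS \
  --hive-import \
  --hive-table staging.customers \
  --split-by CUSTOMER_ID \
  --num-mappers 4

# Export summarized results back out of HDFS to a relational table.
sqoop export \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user -P \
  --table CUSTOMER_SUMMARY \
  --export-dir /user/hive/warehouse/staging.db/customer_summary \
  --input-fields-terminated-by '\001'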

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 9.6, Informatica Data Quality (IDQ), Informatica Cloud, DataStage 8.7, Informatica PowerExchange, Pentaho

Languages: SQL, PL/SQL, UNIX Shell Scripting

Methodologies: Agile, RUP, Scrum, Waterfall

Databases: Teradata 14/13/V2R12/V2R6/V2R5, Oracle 11g/10g/9i, DB2, SQL Server 2005/2008, Netezza

Operating Systems: Windows, UNIX, Linux

IDEs/Tools: Eclipse, PL/SQL Developer, TOAD, Teradata SQL Assistant, SQL*Loader, Erwin 3.5

BI Reporting Tools: Crystal Reports, Business Objects, OBIEE

Scheduling Tools: Control-M, Autosys, Tidal

Big Data Technologies: Hadoop, HDFS, Map Reduce, Hive, Pig, HBase, Sqoop, Oozie

Tracking Tools: JIRA, VersionOne

PROFESSIONAL EXPERIENCE

Confidential, Buffalo, NY

Senior Informatica ETL Developer

Responsibilities:

  • Coordinated with business users to understand business needs and translate them into functional requirements.
  • Translated requirements and high-level design into detailed functional design specifications.
  • Extracted data from Verizon files and loaded it into target systems such as Oracle and SQL Server using UNIX scripts.
  • Migrated data landed in Oracle and AS/400 systems to Frontier system interface files using Informatica Power Center.
  • Designed and developed various Informatica mappings using transformations like Expression, Aggregator, External Procedure, Stored Procedure, Normalizer, Lookup, Filter, Joiner, Rank, Router and Update Strategy
  • Verified the data quality of landed files and performed data validation using the Informatica Data Validation Option (DVO) and Data Quality (IDQ) tools.
  • As part of the conversion, converted subscriber information, high-speed internet, and treatment files to Frontier systems using Informatica Power Center.
  • Created a Power Center Integration Service and associated it with the Power Center repository so that the Data Validation Option (DVO) and IDQ repositories could grant users the required access.
  • Integrated IDQ mappings and rules as mapplets within Power Center mappings.
  • Created Stored Procedures, Functions, Packages and Triggers using PL/SQL.
  • Wrote complex SQL queries involving multi-table joins, and generated queries to check data consistency and to update tables per business requirements.
  • Extensively used SQL tools such as TOAD, Rapid SQL, and Query Analyzer to run queries and validate data.
  • Extensively involved in debugging Informatica mappings, testing stored procedures and functions, performance tuning, and unit testing of Informatica sessions, batches, and target data.
  • Implemented restart strategies and error handling techniques to recover failed sessions and reprocess errored/rejected data.
  • Used UNIX shell scripts to automate pre-session and post-session processes (a sketch follows this list).
  • Used ERStudio to analyze and optimize database and data warehouse structure.
  • Used Autosys scheduler to schedule and run the Informatica workflows on a daily/weekly/monthly basis.
  • Developed Oracle PL/SQL packages, procedures, functions, and triggers.
  • Involved in documentation of unit test cases and system test cases
  • Supported the BI team by extracting operational data from multiple sources, merging and transforming the data to facilitate enterprise-wide reporting and analysis and delivering the transformed data to coordinated data marts.
  • Modified and created parameter files in development, test, and production environments using UNIX (a generation sketch follows this list).
  • Copied data files between Windows and UNIX.
  • Troubleshot problems by checking session and error logs; also used the Debugger for complex troubleshooting.
  • Implemented error handling logic for all CDC mappings
  • Helped the QA team to develop test scripts for the validation of the process. Also performed unit testing and system testing.
  • Provided production support for the daily Informatica loading process (Informatica 9.5).
  • Executed test cases for code changes.
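
A simplified sketch of the pre-/post-session automation and failure handling described in this list; all paths, the pmcmd service/folder names, and the workflow name are assumptions.

#!/bin/ksh
# pre_post_session.sh -- stage the source file, run the workflow, archive or alert.
SRC_DIR=/data/inbound            # illustrative paths
ARCH_DIR=/data/archive
BAD_DIR=/data/reject

# Pre-session: confirm both the data file and its trigger file have arrived.
if [ ! -f "$SRC_DIR/customers.dat" ] || [ ! -f "$SRC_DIR/customers.done" ]; then
    echo "$(date) source file not ready" >> load.log
    exit 1
fi

# Start the workflow and wait for it to finish (service names are placeholders).
pmcmd startworkflow -sv INT_SVC -d DOMAIN_DEV -u "$INFA_USER" -p "$INFA_PWD" \
      -f FOLDER_CONV -wait wf_load_customers
rc=$?

# Post-session: archive on success; quarantine the file and log on failure.
if [ $rc -eq 0 ]; then
    mv "$SRC_DIR/customers.dat" "$ARCH_DIR/customers.dat.$(date +%Y%m%d%H%M%S)"
    rm -f "$SRC_DIR/customers.done"
else
    mv "$SRC_DIR/customers.dat" "$BAD_DIR/" 2>/dev/null
    echo "$(date) wf_load_customers failed rc=$rc" >> load.log
    exit $rc
fi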
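
Parameter file maintenance like that mentioned above is easy to script; the folder, workflow, and parameter names below are hypothetical.

#!/bin/ksh
# gen_param_file.sh -- regenerate an Informatica parameter file for an environment.
ENV=${1:-dev}                    # dev | test | prod
RUN_DT=$(date +%Y-%m-%d)
PARAM_FILE=/infa/params/wf_load_customers.${ENV}.parm

# [folder.WF:workflow] is the standard parameter file section header.
cat > "$PARAM_FILE" <<EOF
[FOLDER_CONV.WF:wf_load_customers]
\$\$RUN_DATE=$RUN_DT
\$\$ENV_CODE=$ENV
\$DBConnection_SRC=ORA_${ENV}_SRC
\$DBConnection_TGT=ORA_${ENV}_TGT
EOF

echo "wrote $PARAM_FILE"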

Environment: Informatica Power Center 9.6/9.6.1, Oracle 11g, UNIX, AS/400, PL/SQL, SQL Server, Embarcadero ERStudio Data Architect, TOAD 12.8, Autosys, Putty, Erwin 8.0, WinSCP, SQL*Loader, HPQC

Confidential, Fort Wayne, IN

Senior Informatica ETL Developer

Responsibilities:

  • Moved data from Lincoln systems to Frontier systems (Oracle and DB2 databases) using SQL*Loader on UNIX with custom control files (an invocation sketch follows this list).
  • Involved in the source data (AT&T data) landing process; wrote scripts to load source files in txt, csv, delimited, and Oracle dump formats.
  • Responsible for the creation of several Informatica mappings, sessions and workflows to load data from Oracle to DB2 databases according to the specification document
  • Designed and developed Informatica mappings to perform the Extraction, Transform and Load process by studying the business requirement from the users and the mapping documents
  • Extensively used various active and passive transformations such as Source Qualifier, Lookup, Router, Aggregator, Filter, Joiner, Expression, Sequence Generator, Sorter, and Rank.
  • Involved in the extraction of data from flat files and loading into Oracle tables using custom built Perl scripts
  • Used Informatica Debugger to check the errors in the mapping and made appropriate changes in the mappings to generate the required results
  • Prepared the Standard Operating Procedure (Knowledge Transfer) document, which provides necessary information, required for the Maintenance and Operation of the application
  • Created an ETL process to generate XML files from Oracle tables using Informatica.
  • Extensively worked with XML files as sources and targets, using the XML Generator and XML Parser transformations.
  • Implemented Informatica Data Quality solutions to do data cleansing, data matching and reporting
  • Worked with tools like TOAD, SQL Developer, SQL Plus to connect to Oracle and DB2 databases to write queries and analyze data.
  • The client implemented the Salesforce application for business customers only and Microsoft Dynamics for all customers.
  • Involved in requirement analysis, ETL design, and development for extracting data from source systems such as Teradata, Oracle, and flat files and loading it into the Salesforce application.
  • Involved in real-time data loads from AS/400 DB2 into SQL Server staging.
  • Created tables in SQL Server 2012 Management Studio and extracted data from SQL Server to load into Dynamics.
  • Did Performance tuning and Optimization of mappings for better performance and efficiency
  • Good exposure to Teradata; used FastLoad, MultiLoad, FastExport, and TPump to load data into and extract data from Teradata staging (a FastLoad sketch follows this list).
  • Worked with business users, testing team and application development teams on analyzing, resolving and documenting the defects using HP ALM tool
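
A minimal SQL*Loader sketch of the kind of UNIX-driven load described in the first bullet above; the control file contents, table name, and connect string are illustrative.

#!/bin/ksh
# load_subscribers.sh -- bulk load a pipe-delimited file into an Oracle staging table.
cat > subscribers.ctl <<'EOF'
LOAD DATA
INFILE 'subscribers.dat'
BADFILE 'subscribers.bad'
APPEND INTO TABLE stg_subscribers
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(subscriber_id, service_type, start_dt DATE 'YYYY-MM-DD', status)
EOF

sqlldr userid="$ORA_CONN" control=subscribers.ctl log=subscribers.log errors=100
rc=$?
if [ $rc -ne 0 ]; then
    echo "sqlldr failed rc=$rc"
    exit $rc
fi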
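
The Teradata FastLoad usage noted above follows this general shape; the TDP id, credentials, database, and table names are placeholders.

#!/bin/ksh
# fload_stage.sh -- FastLoad a pipe-delimited file into an empty Teradata staging table.
fastload <<'EOF'
LOGON tdprod/etl_user,secret;            /* placeholder credentials */
DATABASE stage_db;

BEGIN LOADING stg_orders
      ERRORFILES stg_orders_err1, stg_orders_err2
      CHECKPOINT 100000;

SET RECORD VARTEXT "|";
DEFINE order_id (VARCHAR(18)),
       order_dt (VARCHAR(10)),
       amount   (VARCHAR(15))
FILE = /data/inbound/orders.dat;

INSERT INTO stg_orders (order_id, order_dt, amount)
VALUES (:order_id, :order_dt, :amount);

END LOADING;
LOGOFF;
EOF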

Environment: Informatica Power Center 9.5/9.5.1 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Linux, Oracle 11g, DB2, Flat files, XML Files, SQL Developer, WinSCP, Putty, Text Pad

Confidential, Piscataway, NJ

Informatica Developer

Responsibilities:

  • Prepared mapping documents, the System Requirement Specification, and the High-Level Design Document as part of the project.
  • Extracted data from flat files and AS/400 sources and loaded it into Oracle staging and target tables, applying business rules per the requirements.
  • Designed and developed mapping using various transformations like Expression, Lookups, Joiner, Filter, Sorter, Aggregator, Router, Union, Update strategy, source qualifier transformation and Sequence generator
  • Implemented SCD Type1 and SCD Type2 techniques to update slowly changing dimension tables
  • Created reusable components such as reusable transformations, mapplets, and worklets.
  • Created different reusable tasks like email tasks to send email and command tasks to create, rename and delete files
  • Scheduled and ran workflows for extraction and loading processes and monitored them using Workflow Monitor.
  • Implemented error handling by routing invalid records to error tables, and reloading them to target tables.
  • Developed complex mapping logic using transformations such as Expression, Lookup (connected and unconnected), Joiner, Filter, Sorter, Router, Update Strategy, Source Qualifier, unconnected Stored Procedure, Transaction Control, and Sequence Generator.
  • Did Performance tuning and Optimization of mappings for better performance and efficiency
  • Modified and created parameter files in development, test, and production environments using UNIX.
  • Wrote Oracle PL/SQL procedures and functions and called them from Informatica mappings.
  • Implemented change data capture (CDC) on Teradata to update data in Salesforce and Dynamics target tables using control-table logic (a sketch follows this list).
  • Created reusable components such as reusable transformations (Expression, Aggregator, Sequence Generator, Joiner, Router, Lookup, Rank, Filter), mapplets, and worklets for use across projects.
  • Merged mappings wherever possible to minimize maintenance, optimize the ETL system, and reduce space usage in the Salesforce application.
  • Created Informatica mappings with PL/SQL procedures/functions/triggers to implement business rules for loading data.
  • Wrote complex SQL scripts to avoid Informatica Joiners and Lookups, improving performance given the heavy data volumes.
  • Defined Target Load Order Plan and Constraint based loading for loading data correctly into different Target Tables
  • Troubleshot problems by checking session and error logs; also used the Debugger for complex troubleshooting.
  • Developed unit test scripts and test cases for unit testing and system testing.
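
A sketch of the control-table-driven CDC mentioned above, using BTEQ from a shell script; the control table layout, database names, and credentials are assumptions.

#!/bin/ksh
# cdc_extract.sh -- extract only rows changed since the last successful run.
bteq <<'EOF'
.LOGON tdprod/etl_user,secret;
.EXPORT DATA FILE = /data/outbound/cust_delta.dat;

/* Export rows newer than the stored high-water mark for this table. */
SELECT c.customer_id, c.addr, c.status, c.upd_ts
  FROM edw.customer c
 WHERE c.upd_ts > (SELECT last_run_ts
                     FROM etl.cdc_control
                    WHERE table_nm = 'CUSTOMER');

.EXPORT RESET;

/* Advance the high-water mark only after the extract completes. */
UPDATE etl.cdc_control
   SET last_run_ts = CURRENT_TIMESTAMP
 WHERE table_nm = 'CUSTOMER';

.LOGOFF;
.QUIT;
EOF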

Environment: Informatica Power Center 8.6.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Repository Administration Console), Oracle Developer, TOAD, Oracle 11g, UNIX

Confidential

ETL Developer

Responsibilities:

  • Analyzed source and target systems for loading data into CDW and to create external data feeds.
  • Involved in all phases of SDLC design and development.
  • Involved in gap analysis and source-to-target mapping (STM) documentation to map flat file fields to relational table data in the CDW.
  • Developed mappings, mapplets, and reusable transformations to load external data into CDW.
  • Extracted data from CDW to create delimited and fixed width flat file data feeds that go out to external sources.
  • Analyzed Design documents and developed ETL requirement specs.
  • Created reusable objects and shortcuts for commonly used flat file and relational sources.
  • Developed a shell script to append a date/time stamp to output XML file names, remove empty delta files, and FTP the output XML files to different servers (sketched after this list).
  • Validated logical data model relationships and entities; determined data lineage by including all associated systems in the data profile.
  • Extensive data profiling experience, validating data patterns and formats.
  • Integrated data into CDW by sourcing it from different sources like Oracle, Flat Files and Mainframes (DB2 and VSAM) using Power Exchange.
  • Created UNIX scripts to handle flat file data, such as merging delta files with full files and concatenating the header, detail, and trailer parts of files.
  • Developed mappings that load data into Teradata tables using SAP definitions as sources.
  • Created mappings to read parameterized data from tables to create parameter files.
  • Used the XML Source Qualifier with XML source definitions to represent the data elements that the Informatica server reads when executing a session against XML sources.
  • Used the XML Parser transformation to extract XML data from messaging systems.
  • Used ODI for ELT-style data integration: extracted data from multiple sources, ran it through several transformation processes, and loaded it into the final target.
  • Used SAS for data entry, retrieval, management, report writing, and statistical analysis.
  • Developed complex transformations and mapplets using Informatica to extract, transform, and load (ETL) data into data marts, the enterprise data warehouse (EDW), and the operational data store (ODS).
  • Used a message broker to translate messages between interfaces.
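
The stamp-and-ship script described above looked roughly like the following; the host, credentials, and directories are placeholders.

#!/bin/ksh
# ship_xml.sh -- stamp output XML files, drop empty delta files, FTP the rest.
OUT_DIR=/data/outbound
STAMP=$(date +%Y%m%d_%H%M%S)

for f in "$OUT_DIR"/*.xml; do
    [ -e "$f" ] || continue
    if [ ! -s "$f" ]; then             # remove empty delta files
        rm -f "$f"
        continue
    fi
    mv "$f" "${f%.xml}_${STAMP}.xml"   # append the date/time stamp
done

# Push the stamped files to the downstream server.
ftp -n remote.host <<EOF
user ftpuser ftppass
binary
prompt
cd /incoming/xml
mput $OUT_DIR/*_${STAMP}.xml
EOF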

Environment: Informatica Power Center, Power Exchange, Windows, Oracle, Toad for Oracle, BODS, UNIX, workflow scheduler, Perl.
