Lead Informatica Developer Resume

Indianapolis, IN

SUMMARY

  • 8 ½ years of experience in designing, developing, and maintaining large business applications involving data migration, integration, conversion, and data warehousing.
  • Experience includes thorough domain knowledge of Banking, Insurance (and reinsurance), Healthcare, Pharmacy, and Telecom industries.
  • Experience working with various versions of Informatica PowerCenter client and server tools.
  • Reviewed business requirements, assessed gaps, defined business processes, and delivered project roadmaps including documentation, initial source data definitions, mappings, detailed ETL development specifications, and operations documentation.
  • Expertise in data warehousing, ETL architecture, data profiling and business analytics warehouse (BAW)
  • Designed and developed Informatica mappings enabling the extract, transport, and loading of data into target tables in Teradata.
  • Created workflows, worklets, and tasks to schedule the loads at the required frequency using Workflow Manager and passed the data to Microsoft SharePoint.
  • Designed and developed Informatica mappings for data loads and data cleansing.
  • Created complex mappings using Aggregator, Expression, and Joiner transformations.
  • Involved in generating reports from Data Mart using OBIEE and working with Teradata.
  • Expertise in tuning the performance of mappings and sessions in Informatica and determining the performance bottlenecks.
  • Experience in creating pre-session and post-session scripts to ensure timely, accurate processing and ensuring balancing of job runs.
  • Experience in integration of various data sources such as SQL Server, Oracle, Teradata, flat files, and mainframe DB2.
  • Strong experience in complex PL/SQL packages, functions, cursors, triggers, views, materialized views, T-SQL, DTS.
  • Thorough knowledge of OLAP architectures such as DOLAP, MOLAP, ROLAP, and HOLAP.
  • In-depth knowledge of designing fact and dimension tables and physical and logical data models using Erwin 4.0, including forward and reverse engineering.
  • Experience in creating UNIX shell scripts and Perl scripts.
  • Knowledge in development of reports using Business Objects, Cognos, and MicroStrategy.
  • Knowledge of installing and configuring the Informatica server with SQL Server and Oracle; able to handle Informatica administrator tasks such as configuring DSNs, creating connection strings, copying and moving mappings and workflows, and creating folders.
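
The pre-/post-session balancing checks mentioned above can be sketched as a small shell function; the count files and their one-number-per-file convention are illustrative assumptions for this sketch, not the actual scripts used.

```shell
# post_session_balance: illustrative post-session balancing check.
# Compares the row count written by the extract step against the row
# count reported by the load step; returns non-zero on a mismatch so
# the scheduler can fail the job run. File-based counts are an
# assumption for this sketch.
post_session_balance() {
    src=$(cat "$1")   # rows extracted from source
    tgt=$(cat "$2")   # rows loaded into target
    if [ "$src" -eq "$tgt" ]; then
        echo "BALANCED: $src rows extracted, $tgt rows loaded"
    else
        echo "OUT OF BALANCE: extracted=$src loaded=$tgt" >&2
        return 1
    fi
}

# Example: counts produced by the extract and load steps of one run
echo 100 > /tmp/src_count.txt
echo 100 > /tmp/tgt_count.txt
post_session_balance /tmp/src_count.txt /tmp/tgt_count.txt
```

A wrapper like this would typically run as a post-session command, with the non-zero return code surfacing the imbalance to the workflow monitor.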

TECHNICAL SKILLS

Data Warehousing: Informatica PowerCenter, PowerExchange for DB2, Metadata Reporter, Data Profiling, Data Cleansing, Star & Snowflake Schema, Fact & Dimension Tables, Physical & Logical Data Modeling, DataStage, Erwin

Business Intelligence Tools: Business Objects, Cognos

Databases: MS SQL Server, Oracle, Sybase, Teradata, MySQL, MS-Access, DB2

Database Tools: SQL*Plus, SQL*Loader, Export/Import, TOAD, SQL Navigator, SQL Trace

Development Languages: C, C++, XML, SQL, T-SQL, PL/SQL, UNIX Shell Scripting

Other Tools and Technologies: MS Visual SourceSafe, PVCS, Autosys, crontab, Mercury Quality Center

PROFESSIONAL EXPERIENCE

Confidential, Indianapolis IN

Lead Informatica Developer

Responsibilities:

  • Acted as technical lead and managed a team to create mappings from flat files and connect them with XML sources.
  • Created mappings connecting the XML sources with payload DB2 databases and WPS tables in the event store databases.
  • Created a successful integration of flat files, web services, and databases across the network; also worked with PowerExchange CDC features to capture environment changes.
  • Created several Informatica mappings in all the EDW environments, ran the workflows, and monitored them.
  • Created audit and control processes using Teradata metadata tables, ran scripts using audit and control statistics, and scheduled jobs using Autosys.
  • Created several design patterns, standards documents, and ETL strategy documents explaining the processes above.
  • Tested the databases using PL/SQL and analyzed several mapping scenarios.
  • Created all required paths and folders accessible from UNIX and ran the Informatica workflows in UNIX using crontab.
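
A crontab-driven workflow run like the one above can be sketched with a generic logging wrapper; the pmcmd arguments and the crontab entry shown in comments use placeholder service, folder, and workflow names, not the project's real configuration.

```shell
# run_workflow: illustrative cron wrapper around a workflow command.
# Writes timestamped start/end lines to a log so unattended cron runs
# leave an audit trail, and propagates the command's exit code.
run_workflow() {
    log=${LOG_FILE:-/tmp/wf_run.log}
    echo "$(date '+%Y-%m-%d %H:%M:%S') START: $*" >> "$log"
    "$@"
    rc=$?
    echo "$(date '+%Y-%m-%d %H:%M:%S') END rc=$rc: $*" >> "$log"
    return $rc
}

# In cron, the wrapped command would be pmcmd; names below are placeholders:
#   run_workflow pmcmd startworkflow -sv INT_SVC -d DOMAIN \
#       -u "$PM_USER" -p "$PM_PASS" -f EDW_FOLDER wf_daily_load
# Illustrative crontab entry for a 2 AM daily run:
#   0 2 * * * /opt/etl/run_workflow.sh >> /tmp/cron_wf.log 2>&1

run_workflow echo "workflow started"
```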

Environment: Teradata 14, Erwin, Informatica PowerCenter 9.1, Informatica PowerExchange, SQL Server 2005/2000, Oracle 11, BAW, Teradata 6, SQL, PL/SQL, IBM AIX, UNIX shell scripts, Cognos, Mercury Quality Center, MDM

Confidential

Lead ETL/BI Data Warehouse Developer (Senior Informatica)

Responsibilities:

  • Acted as technical lead in modeling, estimation, requirement analysis, and design of mapping documents, planning with ETL and BI tools, MDM, and Toad across various source environments.
  • Coordinated offshore ETL development for the EDW and weblogs; planned analysis with deliverables including ERDs, data models, data flow diagrams, use cases, gap analyses, and process flow documents; expert understanding of Ralph Kimball methodology.
  • Worked on upgrading and data analysis of the ERIC (Employment Resource Information Center) data model; practiced Informatica B2B PowerCenter and PowerExchange standards similar to Axway, including implementation of integrated EDI applications using the Gateway protocol and Enterprise Application Integration (EAI) for business processes across applications.
  • Extensively worked with Teradata on data extraction, transformation, and loading from source to target systems using BTEQ, FastLoad, MultiLoad, and TPump.
  • Worked with Teradata 14 and wrote BTEQ, FastLoad, MultiLoad, and TPump scripts for loading data to the target data warehouse.
  • Designed and developed ELT (extract, load, transform) solutions for bulk transformations of client data coming from mainframe DB2 and customized COBOL code.
  • Maintained database activities regularly using Toad and troubleshot several issues with Toad.
  • Experience using all PowerExchange Change Data Capture (CDC) features, including the easy-to-use graphical interface, noninvasive change capture, and codeless access to captured changes.
  • Developed complex PL/SQL procedures and packages as part of transformation and data cleansing.
  • Developed UNIX shell scripts to control the process flow for Informatica workflows handling high-volume data; also scheduled jobs using Autosys.
  • Set up batches and sessions to schedule the loads at the required frequency using PowerCenter Workflow Manager; accessed mainframe DB2 and AS/400 systems and performed COBOL coding.
  • Extensively used the Informatica Debugger to validate mappings and gain troubleshooting information about data and error conditions; also participated in file administration tasks.
  • Designed and developed Informatica mappings using Informatica 9.1, enabling the extract, transport, and loading of data into target tables in Teradata.
  • Created workflows, worklets, and tasks to schedule the loads at the required frequency using Workflow Manager and passed the data to Microsoft SharePoint.
  • Created complex mappings using Aggregator, Expression, and Joiner transformations; also worked on the administration of Oracle 11 and Oracle 10.
  • Used Source Analyzer and Warehouse designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.
  • Performed Configuration Management to migrate Informatica mappings/sessions /workflows from Development to Test to production environment.
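
A BTEQ-driven Teradata load of the kind described above might be scripted roughly as follows; the logon string, schema, and table names are placeholders, and the actual bteq submission is left as a comment so the sketch stays self-contained.

```shell
# make_bteq_script: write an illustrative BTEQ script for moving
# validated rows from a staging table into a target table.
# The logon string and all object names are placeholder assumptions.
make_bteq_script() {
    out=$1
    cat > "$out" <<'EOF'
.LOGON tdprod/etl_user,etl_password

/* Move validated rows from staging into the target table */
INSERT INTO edw.customer_dim
SELECT * FROM stage.customer_stg
WHERE load_status = 'VALID';

.IF ERRORCODE <> 0 THEN .QUIT 8
.QUIT 0
EOF
}

make_bteq_script /tmp/load_customer.bteq
# A real run would then submit it, e.g.: bteq < /tmp/load_customer.bteq
cat /tmp/load_customer.bteq
```

The `.IF ERRORCODE` check makes the script exit with a non-zero code on SQL failure, so a calling shell script or scheduler can detect the failed load.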

Environment: Teradata 14, Erwin, Informatica PowerCenter 9.1, Informatica PowerExchange, SQL Server 2005/2000, Oracle 11, BAW, Teradata 6, SQL, PL/SQL, IBM AIX, UNIX shell scripts, Cognos, Mercury Quality Center, MDM

Confidential, Columbus Ohio

Senior Informatica Developer

Responsibilities:

  • Interacted with business community and gathered requirements based on changing needs. Incorporated identified factors into Informatica mappings to build the Data Mart.
  • Developed a standard ETL framework to enable the reusability of similar logic across the board. Involved in System Documentation of Dataflow and methodology for EDW and weblogs.
  • Developed mappings to extract data from SQL Server, Oracle, Teradata 12, and flat files and load it into the data mart using PowerCenter; acted as PR4/PR3 programmer for manipulating CDC modules; also used Pentaho for ETL.
  • Developed common routine mappings; made use of mapping variables, mapping parameters, and variable functions; worked on SQL Server Integration Services (SSIS) and Reporting Services (SSRS).
  • Used Informatica Designer in Informatica 9.0 to create complex mappings using transformations such as Filter, Router, connected and unconnected Lookup, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the data mart.
  • Developed Slowly Changing Dimension Type 1 (SCD Type 1) mappings and worked on ETL ODI applications.
  • Used mapplets in mappings, saving valuable design time and effort; also worked with Informatica PowerExchange CDC features for capturing database inserts.
  • Used Informatica Workflow Manager to create, schedule, execute, and monitor sessions, worklets, and workflows; accessed information in mainframe DB2 via COBOL code.
  • Wrote procedures and queries to retrieve data from the DWH and implemented them in the data mart; also connected Informatica with Teradata 12 and performed migrations for e-commerce applications.
  • Performed data extraction and transfer from and to the SQL Server database using utilities/tools such as Toad and BULK INSERT; worked on a contingency plan using SQL queries.
  • Used stored procedures as data providers to retrieve data from scheduled tables and complex queries.
  • Developed a centralized schema console using Business Analytics Warehouse (BAW); wrote analytical queries; designed and developed ELT (extract, load, transform) solutions and ensured only validated data was loaded.
  • Developed core system components utilizing SQL, Oracle, Informatica, Maestro, and Harvest.
  • Wrote SQL queries, triggers, and PL/SQL procedures to apply and maintain the business rules.

Environment: Informatica PowerCenter 9.0/8.6, SQL Server 2005/2000, Oracle 11i/10g, Teradata 12, SQL, PL/SQL, IBM AIX, UNIX shell scripts, Cognos, Erwin, STAR Team, Remedy, Maestro job scheduler, Mercury Quality Center

Confidential, Lansing MI

Sr. Informatica Developer

Responsibilities:

  • Interacted with business community and gathered requirements based on changing needs.
  • Developed a standard ETL framework to enable the reusability of similar logic across the board. Involved in System Documentation of Dataflow and methodology.
  • Identified all the dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables; accessed AS/400 mainframe DB2 systems with COBOL.
  • Tested drill-down, drill-up, and pivot reports generated from Cognos.
  • Used components such as Run Program and Run SQL to run UNIX and SQL commands in Ab Initio and Pentaho.
  • Assisted in designing logical/physical data models and in forward/reverse engineering using Erwin 4.0.
  • Checked performance tuning/debugging at different levels such as workflows, mappings, and databases; documented findings using Microsoft Office; performed shell scripting.
  • Conducted database testing to check constraints, field sizes, indexes, stored procedures, etc.
  • Created testing metrics using MS Excel.
  • Used Source Analyzer and Warehouse Designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.
  • Performed configuration management to migrate Informatica mappings/sessions/workflows from development to test to production; also troubleshot data issues with Oracle Warehouse Builder.
  • Developed Web Intelligence and full-client reports using Cognos.
  • Performed System Testing, Regression Testing, Acceptance Testing, Functional Testing and Stress Testing.
  • Created reports like Master/Detail reports, Cross Tab reports, slice and dice reports, and drill down reports.

Environment: Informatica, SQL Server 2000, Teradata 6, Oracle 9i, DB2, SQL, PL/SQL, mainframes, Sun Solaris, UNIX shell scripts, Cognos 8, Erwin, Autosys, Remedy.

Confidential, Lansing MI

Sr. Informatica developer

Responsibilities:

  • Worked closely with business users while gathering requirements, analyzing data and supporting existing reporting solutions.
  • Involved in gathering of business scope and technical requirements and created technical specifications.
  • Developed complex mappings and SCD Type I, Type II, and Type III mappings in Informatica to load data from various sources using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Router, and SQL.
  • Created complex mapplets for reuse; deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing development time.
  • Created synonyms for copies of time dimensions; used the Sequence Generator transformation to create sequences for generalized dimension keys, the Stored Procedure transformation for encoding and decoding functions, and the Lookup transformation to identify slowly changing dimensions.
  • Fine-tuned existing Informatica maps for performance optimization; also used MQ Series for passing distributed data and worked on PowerCenter and PowerExchange B2B.
  • Worked on Informatica Designer tools - Source Analyzer, Warehouse designer, Mapping Designer, Mapplet Designer, Transformation Developer and Server Manager to create and monitor sessions and batches.
  • Involved in the development of Informatica mappings and also tuned for better performance.
  • Debugged mappings by creating logic that assigns a severity level to each error and sends the error rows to an error table so they can be corrected and reloaded into the target system.
  • Involved in the Unit testing, Event & Thread testing and System testing.
  • Analyzed existing system and developed business documentation on changes required.
  • Made adjustments in Data Model and SQL scripts to create and alter tables.
  • Extensively involved in testing the system from beginning to end to ensure the quality of the adjustments made to accommodate the source system upgrade.
  • Worked on various issues in existing Informatica mappings to produce correct output.
  • Maintained database relationships and data models.
  • Provided intensive end-user training (for both power users and end users, in Report Studio and Query Studio) with excellent documentation support.
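
Type II change detection of the sort these SCD mappings perform can be illustrated on flat files with a small shell function; the key|value layout, delimiter, and sample data are assumptions for the sketch, not the production design.

```shell
# scd_changes: illustrative SCD Type II change detection on flat files.
# Prints the keys whose attribute value differs between the current
# dimension extract and the new source extract. Each input file holds
# one key|value pair per line (an assumption for this sketch); a real
# mapping would then expire the old row and insert a new version.
scd_changes() {
    cur=$1   # current dimension: key|value
    new=$2   # new extract:       key|value
    awk -F'|' 'NR==FNR { cur[$1]=$2; next }
               ($1 in cur) && cur[$1] != $2 { print $1 }' "$cur" "$new"
}

# Sample data: customer C2 changed tier from Silver to Platinum
printf 'C1|Gold\nC2|Silver\n'   > /tmp/dim_cur.txt
printf 'C1|Gold\nC2|Platinum\n' > /tmp/src_new.txt
scd_changes /tmp/dim_cur.txt /tmp/src_new.txt   # prints: C2
```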

Environment: Informatica, Oracle 10g/9i, SQL, SQL Developer, Windows 2008 R2/7, Toad

Confidential, Seattle WA

ETL Informatica developer

Responsibilities:

  • Involved in the full project life cycle, from analysis to production implementation and support, with emphasis on identifying sources and validating source data, developing the required logic and transformations, and creating mappings to load the data into the business intelligence database.
  • Based on the EDS business requirements created Functional design documents and Technical design specification documents for ETL Process.
  • Reviewed data models using Erwin tool to find out data model dependencies.
  • Designed and developed ETL solutions in Informatica PowerCenter 8.6 and Toad; performed shell scripting.
  • Designed the ETL process and created ETL design and system design documents.
  • Developed code to extract, transform, and load (ETL) data from inbound flat files and various databases into outbound files using complex business logic.
  • Created automated shell scripts to transfer files among servers using FTP, SFTP protocols and download files from web servers and hosted files.
  • Developed Informatica mappings, enabling the ETL process for large volumes of data into target tables for some e-commerce based applications.
  • Designed and developed process to handle high volumes of data and high volumes of data loading in a given load window.
  • Effectively used all kinds of data sources to process the data, finally creating load-ready files (LRFs) as outbound files that are inputs to the BID.
  • Created workflows, worklets, and tasks to schedule the loads at the required frequency using the Maestro scheduling tool; created Maestro control files to handle job dependencies.
  • Wrote BTEQ scripts in Teradata and ran them via Korn shell scripts in HP-UX and Sun OS environments.
  • Created dashboards using Crystal Xcelsius for senior-management business decision-making for BO Mobile interfaces.
  • Performed server management tasks using the Central Configuration Manager (CCM) and Central Management Console (CMC); worked in UNIX and Windows.
  • Worked on SAP BI reporting, which involved query building, filtering, free characteristics, restricted key figures, variables, and query variants using BEx Analyzer.
  • Extensively worked on BO XI 3.1 for reporting purposes.
  • Involved in ad hoc query development and data mining.
  • Extensively worked in InfoView to create Web Intelligence and Desktop Intelligence reports over the universe created.
  • Created MLOAD, FastLoad, and TPump control scripts to load data to the BID.
  • Created control files to define job dependencies and for scheduling using the Maestro tool.
  • Involved in job scheduling, monitoring, and production support in a 24/7 environment.
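
The automated FTP/SFTP file transfers mentioned above can be sketched as a function that builds an sftp batch file; the host, remote path, and file names are placeholders invented for this illustration.

```shell
# make_sftp_batch: write an illustrative sftp batch file that uploads
# a set of load-ready files. Remote path and file names are
# placeholders; a real run would follow with something like:
#   sftp -b /tmp/put_lrf.batch etl_user@target_host
make_sftp_batch() {
    out=$1; shift
    : > "$out"                       # truncate any previous batch file
    echo "cd /inbound/lrf" >> "$out" # remote landing directory (placeholder)
    for f in "$@"; do
        echo "put $f" >> "$out"
    done
    echo "bye" >> "$out"
}

make_sftp_batch /tmp/put_lrf.batch claims_20240101.lrf members_20240101.lrf
cat /tmp/put_lrf.batch
```

Driving sftp from a generated batch file keeps the transfer list data-driven, so the same script can ship whatever LRFs a given load produces.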

Environment: Informatica PowerCenter 8.6, Business Objects, ETL, Teradata V2R5 as BID, Oracle 10g/9i/8i, HP-UX, Sun OS, Perl scripting, Erwin, PL/SQL, Maestro for scheduling.

Confidential, Sacramento, California

ETL Informatica Developer

Responsibilities:

  • Based on the requirements created Functional design documents and Technical design specification documents for ETL Process.
  • Designed and developed Informatica mappings enabling the extract, transport, and loading of data into target tables.
  • Created workflows, worklets, and tasks to schedule the loads at the required frequency using Workflow Manager.
  • Designed and developed Informatica mappings for data loads and data cleansing.
  • Created complex mappings using Aggregator, Expression, and Joiner transformations.
  • Designed and developed processes to handle high volumes of data loading in given load intervals.
  • Involved in testing of Stored Procedures and Functions, Unit and Integrating testing of Informatica Sessions, Batches and the Target Data.
  • Designed and developed table structures, stored procedures, and functions to implement business rules.
  • Designed and implemented mappings using SCD and CDC methodologies.
  • Designed and developed process to handle high volumes of data and large volumes of data loading in a given load window.
  • Extensively involved in migration of ETL environment, Informatica, Database objects.
  • Involved in splitting the enterprise data warehouse environment and the Informatica environment into three, one per company.
  • Involved in job scheduling, monitoring, and production support in a 24/7 environment.

Environment: Informatica PowerCenter 7.1.1, Teradata V2R6, MicroStrategy, MS SQL Server 2000, Oracle 10g/9i/8i, Trillium, HP-UX, Perl scripting, Windows 2000, Erwin 4.2, PL/SQL.

Confidential, Portland Oregon

ETL Informatica Developer

Responsibilities:

  • Involved in Analysis, Requirements Gathering and documenting Functional & Technical specifications
  • Analyzed and created Facts and Dimension tables.
  • Designed ER diagrams, the logical model (relationships, cardinality, attributes, and candidate keys), and the physical database (capacity planning and object creation) for Oracle per business requirements using Erwin.
  • Used DB2, legacy systems, Oracle, and Sybase as sources and Oracle as the target.
  • Developed Informatica Power Center mappings for data loads and data cleansing.
  • Wrote stored procedures in PL/SQL and Unix Shell Scripts for automated execution of jobs
  • Wrote shell scripts for Informatica pre-session and post-session commands.
  • Designed technical layout considering Standardization, Reusability, and Scope to improve if need be.
  • Documented the purpose of the data warehouse (including transformations, mapplets, mappings, sessions, and batches) to help personnel understand the process and incorporate changes as necessary.
  • Developed complex mappings to extract source data from heterogeneous databases (Teradata, SQL Server, Oracle) and flat files, applied the proper transformation rules, and loaded it into the data warehouse.
  • Involved in identifying bugs in existing mappings by analyzing data flow, evaluating transformations using Debugger.
  • Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.
  • Worked closely with the production control team to schedule shell scripts, Informatica workflows, and PL/SQL code in Autosys.
  • Conducted database testing to check constraints, field sizes, indexes, stored procedures, etc.
  • Tracked, reviewed, and analyzed defects.
  • Conducted UAT (User Acceptance Testing) with the user community.
  • Developed Korn shell scripts to run from Informatica pre-session and post-session commands; set up on-success and on-failure emails to send reports to the team.
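
The on-success/on-failure email hooks described above can be sketched as a small shell function; the mailx pipeline and recipient address are commented placeholders so the sketch remains self-contained and runnable.

```shell
# notify_run: illustrative on-success/on-failure post-session hook.
# Builds a status line from the session's exit code and job name;
# a real version would pipe the body to mailx (recipient below is a
# placeholder), whereas this sketch just echoes it.
notify_run() {
    rc=$1
    job=$2
    if [ "$rc" -eq 0 ]; then
        echo "SUCCESS: $job completed"       # | mailx -s "$job OK" etl-team@example.com
    else
        echo "FAILURE: $job exited rc=$rc"   # | mailx -s "$job FAILED" etl-team@example.com
    fi
}

notify_run 0 wf_daily_load    # prints: SUCCESS: wf_daily_load completed
notify_run 8 wf_daily_load    # prints: FAILURE: wf_daily_load exited rc=8
```

Keeping the notification in a post-session script rather than inside the mapping means the same hook covers every workflow the scheduler runs.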

Environment: Informatica, Oracle 9i, PL/SQL, Cognos Impromptu 6.0, Cognos PowerPlay 6.6, Erwin 4.0, UNIX, Windows NT.

Confidential

ETL Informatica Developer

Responsibilities:

  • Used Informatica Workflow Manager to create, schedule, execute, and monitor sessions, worklets, and workflows.
  • Created new mappings according to business rules to extract data from different sources, transform and load target databases.
  • Debugged the failed mappings and fixed them.
  • Conducted database testing to check constraints, field sizes, indexes, stored procedures, etc.
  • Tracked, reviewed, and analyzed defects.
  • Modified the mappings according to the new changes and implemented the persistent cache in several mappings for better performance.
  • Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows for E-commerce based applications.
  • Involved in writing stored procedures and shell scripts for automating the execution of jobs in pre and post sessions to modify parameter files, prepare data sources.
  • Identified the issues in sources, targets, mappings and sessions and tuned them to improve performance.
  • Created and used reusable mapplets and worklets to reduce redundancy.
  • Developed robust Informatica mappings and fine-tuned them to process large volumes of input records.

Environment: Informatica PowerCenter, Oracle 9i, SQL, PL/SQL, Solaris, MS Visio, MS Access
