ETL Informatica Developer Resume
Richardson, TX
SUMMARY
- 7+ years of IT experience in the analysis, design, development and implementation of data warehouses and data marts using Informatica Power Center with Oracle, MS SQL Server, DB2 and Teradata databases.
- Experience with Talend Open Studio and Talend Integration Services.
- Strong expertise using Informatica Power Center client tools (Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
- Extensive work experience in ETL processes consisting of data sourcing, data transformation, mapping and loading of data from multiple source systems into Data Warehouse using Informatica Power Center.
- Strong knowledge of data warehouse methodologies and concepts, including staging (STG), ODS and post-ODS layers, Star and Snowflake schemas, the SDLC and ERD processes.
- Proficient in implementing complex business rules through Informatica transformations, Workflows/Worklets and Mappings/Mapplets.
- Hands-on experience in tuning mappings and in identifying and resolving performance bottlenecks at various levels, such as sources, targets, mappings and sessions.
- Experience with Informatica Server Manager to create sessions and workflows to run with the logic embedded in the mappings.
- Experience in Installation, Configuration, and Administration of Informatica Power Center/Power Mart Client, Server and having exposure to FTP and Release Management.
- Experience in production support, resolving critical issues and coordinating with teams to ensure successful resolution of reported incidents.
- Good Knowledge in using the data Integration tool Pentaho for designing ETL jobs in the process of building Data warehouses and Data Marts.
- Expertise in taking backups, recovery of workflows, sessions and mappings from repositories.
- Experience in dimensional modeling to design star schema databases.
- Experience documenting processes and training team members by creating and maintaining technical documentation, deployment diagrams and checklists.
- Knowledge in creating cubes by using Pentaho Schema Workbench.
- Experience in Data warehouse OLAP reporting using Business Objects.
- Experience in data validation using Informatica DVO, creating complex views against the target data.
- Good working skills in writing SQL, PL/SQL and database triggers, and in Oracle SQL query tuning (a trigger sketch follows this list).
- Experience writing daily batch jobs and complex UNIX shell scripts to automate ETL processes.
- Involved in SQL tuning and Informatica performance tuning; tuned Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length and the target-based commit interval.
- Expert in Unit Testing, Integration testing and Performance Testing.
- Worked in different SDLC environments, including Agile, Iterative and Waterfall methodologies.
- Committed team player with multitasking capabilities, excellent interpersonal skills, and strong verbal and written communication skills, with the ability to express complex business concepts in technical terms.
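As an illustration of the PL/SQL and database-trigger work referenced above, the following is a minimal sketch; the CUSTOMER_DIM table and its audit columns are hypothetical stand-ins for the actual warehouse tables.

```sql
-- Minimal sketch (hypothetical CUSTOMER_DIM table and audit columns):
-- a row-level trigger that stamps audit metadata on insert and update.
CREATE OR REPLACE TRIGGER trg_customer_dim_audit
    BEFORE INSERT OR UPDATE ON customer_dim
    FOR EACH ROW
BEGIN
    IF INSERTING THEN
        :NEW.created_date := SYSDATE;
        :NEW.created_by   := USER;
    END IF;
    :NEW.updated_date := SYSDATE;
    :NEW.updated_by   := USER;
END;
/
```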
TECHNICAL SKILLS
Hardware/OS: Windows XP/2000/2003, Sun Solaris, AIX, UNIX, Linux; Servers: Apache Tomcat, IIS
ETL Tools: Informatica Power Center (Repository Admin Console, Repository Manager, Designer, Workflow Manager, Workflow Monitor), Talend Open Studio and Talend Integration Services.
Tools: Toad 7.4, Erwin Data Modeler, Oracle Designer Professional, SQL*Plus, PL/SQL Developer, SQL Developer, Test Director, Autosys, Control-M, VersionOne, HP Quality Center.
Languages: SQL, PL/SQL, Shell Scripting
Databases: Oracle 9i/10g/11g, DB2, Sybase, SQL Server 2005 & 2008
Reporting Tools: Crystal Reports, Cognos Report, SSRS, Business Objects
Methods: Dimensional modeling, Business intelligence reporting, Agile methodology
PROFESSIONAL EXPERIENCE
Confidential, Richardson, TX
ETL Informatica Developer
Responsibilities:
- Design, development and documentation of the ETL (Extract, Transform and Load) strategy to populate the target MetaSolv system from the various source systems.
- Worked on Informatica Power Center 9.1 client tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager and Workflow Monitor.
- Involved in design and development of complex ETL mappings.
- Implemented partitioning and bulk loads for loading large volumes of data.
- Based on the requirements, used various transformations like Source Qualifier, Normalizer, Expression, Filter, Router, Update strategy, Sorter, Lookup, Aggregator, Joiner and Sequence Generator in the mapping.
- Developed Mapplets, Worklets and Reusable Transformations for reusability.
- Identified performance bottlenecks and Involved in performance tuning of sources, targets, mappings, transformations and sessions to optimize session performance.
- Identified bugs in existing mappings/workflows by analyzing the data flow and evaluating transformations.
- Performance tuning by session partitions, dynamic cache memory, and index cache.
- Implemented update strategies, incremental loads, change data capture and incremental aggregation.
- Extensively worked on various Look up Caches like Static, Dynamic, Persistent, and Shared Caches.
- Developed workflow tasks like Email, Event wait, Event Raise, Timer, Command and Decision.
- Created stored procedures in PL/SQL (a sketch of this kind of procedure follows this list).
- Used the PMCMD command to start, stop and ping the Informatica server from UNIX, and created UNIX shell and Perl scripts to automate the process.
- Created UNIX shell scripts and invoked them as pre-session and post-session commands.
- Developed Documentation for all the routines (Mappings, Sessions and Workflows).
- Involved in scheduling the workflows using UNIX scripts.
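The following is a minimal sketch of the kind of PL/SQL stored procedure referenced above, here merging staged rows into a target table as a post-session step; STG_SERVICE_ORDER and TGT_SERVICE_ORDER are hypothetical names, not the actual MetaSolv tables.

```sql
-- Minimal sketch: merge staged rows into a target table
-- (hypothetical STG_SERVICE_ORDER / TGT_SERVICE_ORDER tables).
CREATE OR REPLACE PROCEDURE load_service_order AS
BEGIN
    MERGE INTO tgt_service_order t
    USING (SELECT order_id, status, last_modified
             FROM stg_service_order) s
       ON (t.order_id = s.order_id)
     WHEN MATCHED THEN
          UPDATE SET t.status        = s.status,
                     t.last_modified = s.last_modified
     WHEN NOT MATCHED THEN
          INSERT (order_id, status, last_modified)
          VALUES (s.order_id, s.status, s.last_modified);
    COMMIT;
EXCEPTION
    WHEN OTHERS THEN
        ROLLBACK;
        RAISE;  -- surface the failure to the calling session/workflow
END load_service_order;
/
```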
Environment: Informatica Power Center 9.1, Oracle 11g, Flat Files, PL/SQL, ERWIN 7.3 Data Modeling tool, Windows 2000, UNIX scripting.
Confidential, Gardner, KS
Talend/Informatica Developer
Responsibilities:
- Design, Development and Documentation of the ETL (Extract, Transformation & Load) strategy to populate the Data Warehouse from the various source systems.
- Used Talend components such as tMap, tOracleInput, tOracleOutput and tFlowToIterate.
- Responsible for business analysis and system analysis; gathered user/business requirements from end users and business representatives.
- Analyzed business requirements and created mapping specification documents.
- Maintained documentation revision control to organize and track revision histories for all documents.
- Developed jobs in Talend Enterprise Edition across the source, staging, conversion, intermediate and target layers.
- Tracked and coordinated approval and implementation of Informatica ETL changes.
- Worked on Talend ETL and used features such as context variables and components like tFileExist, tFileCompare, tETLAggregate, tFileOutput, etc.
- Utilized this platform for affiliates to load large data files on demand and to extract data from sources.
- Used Informatica for Data extraction related to claims transactions.
- Used Informatica for data loads into the data warehouse from disconnected data sources such as affiliate data files.
- Created ETL jobs using the Informatica data integration platform to process the data files.
- Analyzed data and situations to develop plans and solve problems using Informatica, and supported business functions.
- Worked with Explain Plan and TKPROF to tune SQL queries using rule-based optimization (RBO) and cost-based optimization (CBO); see the sketch after this list.
- Created Talend jobs to load data into various Oracle tables; utilized Oracle stored procedures and wrote a small amount of Java code to capture global map variables and use them in the jobs.
- Scheduled and executed sessions and sequential and concurrent batches for proper execution of mappings; sent notification e-mails using Server Manager.
- Extensively used Joins, Triggers, Stored Procedures and Functions in Interaction with backend database using PL/SQL.
- Designed automation of ETL jobs using Informatica batches and pre-session and post-session UNIX scripts.
- Extraction, transformation and loading of data from various file formats like .csv, .xls, .txt and various delimited formats using Talend.
- Analyzed the Source and Target Data Models in order to provide suggestions to the business owners.
- Analyzed Source and Target Data Sources and validate the relationships between tables.
- Created data mapping documents between different databases and flat files (fixed-length and comma-separated).
- Provided impact analysis on the change requests as part of the change management process.
- Responsible for project documents, project plans, team member coordination, scheduling meetings, task assignment, monitoring effort on a timely basis, daily and weekly status reporting, quality checks, diagnosis and issue resolution, project tracking and project documentation.
- Worked as liaison between Development and Quality Assurance.
- Developed, Reviewed and Validated the SQL Queries and PL/SQL procedures for optimization.
- Developed the Test plans for quality assurance based on functional requirements.
- Made changes to the requirements document and kept version control in SharePoint so that all team leaders and other responsible associates could stay current with the latest information and artifacts.
- Reviewed Test cases to ensure all the functional scenarios are captured.
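A minimal sketch of the Explain Plan / TKPROF workflow mentioned above; the CLAIM_FACT table and predicate are hypothetical examples, not the project's actual queries.

```sql
-- Generate and display the optimizer's plan for a candidate query
-- (hypothetical CLAIM_FACT table).
EXPLAIN PLAN FOR
SELECT claim_id, claim_amount
  FROM claim_fact
 WHERE claim_date >= TO_DATE('2013-01-01', 'YYYY-MM-DD');

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- For TKPROF: enable SQL trace on the session, re-run the query, then
-- format the resulting trace file outside the database with tkprof.
ALTER SESSION SET SQL_TRACE = TRUE;
```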
Environment: Talend 5.1.2, Informatica, Java, TIRKS, CIRAS, MSS, SQL and MS Word, Excel, MS Project, SharePoint.
Confidential, San Diego, CA
Oracle/ETL Developer
Responsibilities:
- Analysis, Requirements gathering, function/technical specification, development, deploying and testing.
- Involved in the Design, Development, Testing phases of Data warehouse.
- Designed logical and physical database layouts using Erwin and was involved in design and data modeling using a star schema.
- Created Informatica mappings for initial load and daily updates.
- Involved in fixing invalid mappings, testing of stored procedures and functions, unit and integration testing of Informatica Mappings, Sessions, Workflows and the target data.
- Involved in creating new table structures and modifying existing tables to fit into the existing Data Model.
- Performed data manipulations using various Informatica Transformations like Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Joiner, Connected and Unconnected Lookup.
- Developed several mappings to load data from multiple sources to data warehouse.
- Developed and tested stored procedures, functions and packages in PL/SQL for data ETL and performed data conversions (a package sketch follows this list).
- Involved in troubleshooting the load failure cases, database problems.
- Extensively used mapping parameters, mapping variables and parameter files.
- Informatica Metadata Reporter installation and Configuration.
- Generated Reports using Metadata Reporter.
- Customization of Informatica Metadata Reporter.
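A minimal sketch of a PL/SQL ETL package of the kind described in this role; the package, table and column names (ETL_CONVERT_PKG, STG_CUSTOMER, DW_CUSTOMER) are hypothetical.

```sql
CREATE OR REPLACE PACKAGE etl_convert_pkg AS
    PROCEDURE load_customer;
END etl_convert_pkg;
/

CREATE OR REPLACE PACKAGE BODY etl_convert_pkg AS
    -- Loads staged customers into the warehouse table, applying a
    -- simple data-conversion rule to the name column.
    PROCEDURE load_customer IS
    BEGIN
        INSERT INTO dw_customer (customer_id, customer_name, load_date)
        SELECT src_id,
               INITCAP(TRIM(src_name)),
               SYSDATE
          FROM stg_customer
         WHERE src_id IS NOT NULL;
        COMMIT;
    END load_customer;
END etl_convert_pkg;
/
```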
Confidential, Chicago, IL
Oracle/ETL Developer
Responsibilities:
- Worked with technical/business analysts for DQ requirements, business analysis & project coordination.
- Developed mappings to extract data from various source systems and load them into target tables.
- Worked with various levels of software developers to load data into the data warehouse and identify potential problem areas in the source system.
- Wrote SQL code to extract, transform and load data, ensuring compatibility with all tables and customer specifications.
- Created complex mappings which involved Slowly Changing Dimensions, implementation of Business Logic and capturing the deleted records in the source systems.
- Created workflows and worklets with parallel and sequential sessions that extract, transform, and load data to one or more targets.
- Worked on various tuning issues and fine-tuned transformations to make them more efficient in terms of performance.
- Developed PL/SQL stored procedures for database updates and to create the necessary indexes on the target tables (see the sketch after this list).
- Created and scheduled sessions and jobs to run on demand, on a schedule or only once using Workflow Manager.
- Extensively used Autosys for Scheduling the Sessions and tasks. Performed Unit testing, Integration testing and System testing of Informatica mappings.
- Used Windows scripting and scheduled PMCMD to interact with the Informatica server from command mode; modeled and populated the business rules using mappings into the repository for metadata management.
- Created scripts for better handling of incoming source files, such as moving files from one directory to another and extracting information such as dates from file names for continuously incoming sources.
- Involved in Unit testing, integration testing, system testing and UAT.
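A minimal sketch of a post-load PL/SQL procedure of the kind described above, applying a database update and creating an index on the target table; SALES_FACT and the index name are hypothetical.

```sql
CREATE OR REPLACE PROCEDURE post_load_maintenance AS
BEGIN
    -- Example database update applied after the nightly load
    UPDATE sales_fact
       SET load_status = 'PROCESSED'
     WHERE load_status = 'NEW';
    COMMIT;

    -- DDL must run through dynamic SQL inside PL/SQL
    EXECUTE IMMEDIATE
        'CREATE INDEX idx_sales_fact_date ON sales_fact (sale_date)';
EXCEPTION
    WHEN OTHERS THEN
        ROLLBACK;
        RAISE;
END post_load_maintenance;
/
```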
Environment: Informatica Power Center 8.1, Oracle 10g, SQL Server Management Studio 2008, SQL*Plus, Windows XP and AIX UNIX, Agile methodology, VersionOne, HP Quality Center, Flat Files, PL/SQL and UNIX scripting.
Confidential, Dallas, TX
Informatica Developer
Responsibilities:
- Involved in requirement analysis, ETL design and development for extracting data from the source systems and loading it into the data mart.
- Involved in all phases of the SDLC, from requirements, design, development, testing and training through rollout to field users, and provided production environment support.
- Interacted with the source system teams to gather requirements.
- Gathered requirement and prepared mapping documents and the data flow documents.
- Extensively used ETL to load data from multiple sources into the staging area (SQL Server 2008) using Informatica Power Center 7.1; worked with pre- and post-session commands and extracted data from the transaction system into the staging area. Knowledge of identifying fact and dimension tables.
- Worked with power Center tools like Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Created complex Type 1 and Type 2 SCD mappings to update slowly changing dimension tables (a Type 2 SCD sketch in SQL follows this list).
- Experience analyzing user requirements and translating them into system data structure designs.
- Extensively used Informatica transformations such as Source Qualifier, Rank, Router, Filter, Lookup, Joiner, Aggregator, Normalizer and Sorter, along with all transformation properties.
- Solid Expertise in using both connected and unconnected Lookup Transformations.
- Extensively worked with various Lookup caches like Static cache, Dynamic cache and Persistent cache.
- Involved in writing T-SQL Procedures Using SQL Server.
- Involved in developing and debugging Sybase and SQL Server procedures.
- Extensively worked with join types such as normal join, full outer join, master outer join and detail outer join in the Joiner transformation.
- Developed reusable transformations and reusable mapplets.
- Worked with session parameters, mapping parameters and mapping variables for incremental loading.
- Created reusable transformations and mapplets and used them in mappings.
- Extensively used transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, Transaction Control and Stored Procedure.
- Tuned sources, targets, mappings and sessions to improve data load performance.
- Performed unit testing on the Informatica code by running it in the Debugger and writing simple test scripts in the database, tuning it by identifying and eliminating bottlenecks for optimum performance.
- Involved in Production Support.
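A minimal SQL sketch of the Type 2 SCD pattern implemented in the mappings above: the current row is expired when a tracked attribute changes, and a new current version is inserted. CUSTOMER_DIM, CUSTOMER_STG and the tracked ADDRESS column are hypothetical.

```sql
-- Expire the current dimension row when a tracked attribute changed
UPDATE customer_dim d
   SET d.current_flag       = 'N',
       d.effective_end_date = SYSDATE
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM customer_stg s
                WHERE s.customer_id = d.customer_id
                  AND s.address    <> d.address);

-- Insert a new current version for new or changed customers
INSERT INTO customer_dim
       (customer_id, address, effective_start_date, effective_end_date, current_flag)
SELECT s.customer_id, s.address, SYSDATE, NULL, 'Y'
  FROM customer_stg s
 WHERE NOT EXISTS (SELECT 1
                     FROM customer_dim d
                    WHERE d.customer_id  = s.customer_id
                      AND d.current_flag = 'Y');
```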
Environment: Informatica Power Center 7.1, Informatica Power Exchange 7.1, Mainframes, SQL Server 2005, Netezza, UNIX, PuTTY and MS Visio.
Confidential
Database Developer
Responsibilities:
- Designed, developed and modified tables, views, materialized views, stored procedures, packages and functions.
- Coded PL/SQL packages and procedures to perform data loading, error handling and logging, and tied the procedures into an existing ETL process.
- Used SQL*Loader to load data from files provided by the interfacing applications.
- Created new procedures, functions, triggers, materialized views, packages, simple, ref and traditional cursors, dynamic SQL and table functions as part of project/application requirements.
- Created partitioned tables and partitioned indexes to improve the performance of the applications
- Created records, tables and collections (nested tables and arrays) to improve query performance by reducing context switching (see the bulk-processing sketch after this list).
- Wrote PL/SQL Database triggers to implement the business rules in the application
- Optimized many SQL statements and PL/SQL blocks by analyzing execution plans, and created and modified triggers, SQL queries and stored procedures for performance improvement.
- Developed various backend application programs, such as views, functions, triggers, procedures and packages, using SQL and PL/SQL to support top-management decision making.
- Implemented module logic using triggers and integrity constraints.
- Good understanding of database objects and ability to triage issues
- Involved in PL/SQL code review and modification for the development of new requirements
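A minimal PL/SQL sketch of the collection-based bulk processing referenced above, using BULK COLLECT and FORALL so a batch crosses the SQL/PL-SQL boundary once rather than once per row; ORDERS_STG and ORDERS_TGT are hypothetical tables assumed to share the same row structure.

```sql
DECLARE
    TYPE t_order_tab IS TABLE OF orders_stg%ROWTYPE;
    l_orders t_order_tab;
BEGIN
    -- Fetch the whole staging set in one round trip
    SELECT *
      BULK COLLECT INTO l_orders
      FROM orders_stg;

    -- One context switch for the whole batch instead of one per row
    FORALL i IN 1 .. l_orders.COUNT
        INSERT INTO orders_tgt VALUES l_orders(i);

    COMMIT;
END;
/
```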