
Etl Informatica Tech Lead Resume


St Louis, MO

PROFESSIONAL SUMMARY

  • Over 9 years of IT experience, including extensive functional and technical experience in Data Warehousing, Data Modeling, Business Intelligence (BI), Decision Support Systems (DSS), OLAP/ROLAP/MOLAP, ETL tools, product development, report generation, and programming in PL/SQL and shell scripting.
  • Involved in Business Analysis, Logical and Physical Data Modeling, and Conceptual Data Modeling.
  • Modeled and implemented data mart / data warehousing solutions.
  • Involved in the full Project Life Cycle of the Data Warehouse Projects.
  • Knowledge of transaction-system databases, data modeling design approaches (ER diagrams), and business intelligence approaches.
  • Expertise in designing, developing, and implementing data extraction, staging, transformation, and loading (ETL) procedures to populate an operational data store (ODS) and enterprise data warehouse (EDW) from multiple data sources.
  • Involved in installation, configuration and upgrade of Informatica.
  • Involved with administration of Informatica repository and user administration.
  • Well versed in developing and maintaining metadata repositories.
  • Experience in UNIX Shell Scripting.
  • Expertise in fine-tuning the mapping and sessions by identifying the bottlenecks to improve the performance.
  • Extensive experience in application design and implementation for financial and business data according to business needs and requirements in the Financial, Banking, and Insurance domains.
  • Proficient in PL/SQL Stored Procedures, Triggers, Packages along with tools like TOAD, Rapid SQL and PL/SQL Developer.
  • Experience in Systems Analysis, Relational and Dimensional Data Modeling, Design, and Development on Oracle databases.
  • Experience in Informatica administration.
  • Extensive experience in design and development of ETL processes from DB2, Oracle, SQL Server, and Flat files, Mainframe Files, Excel Spreadsheets.
  • Expertise in the concepts of building Data warehouse/DataMarts; Data Access Layer, Data Staging Layer (ETL) and Data Presentation Layer.
  • Well experienced with Oracle and DB2 databases for data warehouses; developed data marts and integrated data with heterogeneous source systems.
  • Expertise in the concepts of building Fact Tables, Dimensional Tables, handling Slowly Changing Dimensions and usage of Surrogate Keys. Extensive experience in backup and recovery process of Informatica Repository.
  • Expertise in Developing Mappings and Mapplets/Transformations between Source and Target using Informatica Designer.
  • Experience in scheduling Sessions and Batches in Informatica Server Manager.
  • Experienced in Data Mining, Data Validation, and documentation.
  • Experience with data cleansing issues and Meta Data Management in the implementation of ETL methodology in Data Extraction, Transformation and Loading using Informatica.
  • Proven experience in Data Profiling, Analysis and Data cleansing.
  • Involved in functional and technical systems analysis and design, systems architectural design, presentations, process/data flow design, system impact analysis, star and snowflake schemas, slowly changing dimensions, gap analysis, and documentation.
  • Conducted reviews and coordinated with offshore teams, including onsite-offshore coordination.
  • Team player with good interpersonal and communication skills and the ability to work in a team environment.

TECHNICAL EXPERIENCE

ETL Tools : Informatica PowerCenter 8.x/7.x/6.x, Informatica PowerMart 6.0
Business Intelligence : Business Objects XI/6.x, Cognos Impromptu 7.1, Transformer, PowerPlay
(Cognos Series 7 and 8.x)
Data Modeling : Dimensional Data Modeling, Star Schema Modeling, Snowflake
Modeling, Fact and Dimension Tables, Physical and Logical Data
Modeling, Erwin 3.5/4.0/4.1, Microsoft Visio 2000
RDBMS : Oracle 10g/9i/8i, Sybase, IBM DB2 UDB, MS SQL Server 2000,
MS Access 7.0/2000; SQL, PL/SQL, SQL*Plus
Tools : TOAD, SQL*Loader, Erwin 4.5/4.1/4.0, XML, Visio
Languages : SQL, PL/SQL, UNIX Shell Scripting (KSH), Perl, C, C++, COBOL, .NET
Operating Systems : Windows 2000/2003, OS/390 (Mainframe), UNIX AIX, AS/400

PROFESSIONAL EXPERIENCE

Confidential, St.Louis, MO
ETL Informatica Tech Lead 06/09 to Present

Confidential Securities maintains data related to securities through various applications such as the Brokerage Data Warehouse (BDW), Operational Data Store (ODS), and Advisory Compensation. The BDW application deals with the maintenance of current as well as history data related to Accounts, Reps, Securities, Positions, Holdings, Tax Lots, and Mutual Funds in the Brokerage Data Warehouse (BDW). The ODS application behaves as a staging area for BDW: it is an operational data store in which all the data coming from various source systems is staged and then, after going through various transformations, loaded into the warehouse (BDW). During this period, worked on various data management, data integration, data conversion, and data migration project implementations as part of the Wachovia Securities and Wells Fargo Advisors merger at Wells Fargo Advisors (WFA).

Responsibilities:

  • Involved in all SDLC phases including Requirement Analysis, Client Interaction, Design, Coding, Testing, Support and Documentation.
  • Involved in Supporting ETL Applications production jobs and changing existing code based on business needs.
  • Interacted with business community and gathered requirements based on changing needs.
  • Analyzed and translated functional specifications and change requests into technical specifications.
  • Responsible for data quality profiling, Data Modeling, ETL and gap analysis.
  • Extensively worked on Informatica 8.6 for designing and developing the ETL processes.
  • Worked with heterogeneous sources from various channels like Oracle, DB2, flat files, and COBOL files.
  • Developed database schemas such as star schema and snowflake schema used in relational, dimensional, and multidimensional modeling.
  • Designed and created complex source to target mappings using various transformations.
  • Coding and debugging; resolved day-to-day technical problems.
  • Designed and implemented conformed dimensions and Type1 and Type2 Slowly changing dimensions.
  • Responsible for choosing proper partitioning methods for performance enhancements
  • Participated in functional meetings to understand various source systems to be consolidated and involved in designing the data model for the Warehouse.
  • Built mapping variables/parameters and created parameter files to enable flexible runs of sessions/mappings based on changing variable values.
  • Developed stored procedures on Oracle database to impart business logic into mappings and trigger pre/post session activities through shell scripts.
  • Worked with Memory cache for static and dynamic cache for the better throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations.
  • Involved in performing data mining, cleansing techniques.
  • Involved in enhancements and maintenance activities of the data warehouse including performance tuning, rewriting of stored procedures for code enhancements, creating aggregate tables, and modifying target tables.
  • Involved in the creation of oracle, DB2 Tables, Table Partitions, and Indexes. Involved in writing PL/SQL Procedures, Functions and packages to synchronize the BDW data.
  • Implemented various integrity constraints for data integrity like Referential integrity using primary key and foreign keys relationships. Developed and tested stored procedures and functions, cursors, packages for data ETL.
  • Involved in upgrading Informatica Power Center.
  • Involved in Fine-tuning SQL overrides for performance Enhancements.
  • Worked on database connections, SQL joins. Optimized Query Performance, Session Performance and Reliability.
  • Created sessions, database connections and batches using Informatica server manager. Responsible for scheduling, monitoring and supporting production ETL jobs
  • Scheduled the sessions to extract, transform, and load data into the warehouse database per business requirements. Involved in reviewing and approving existing ETL jobs.
  • Developed Shell Scripts to facilitate data loads, process implementation, streamlining Operations and to integrate different load processes.
  • Involved in unit, integration, and user acceptance testing.
  • Migrated code from development to QA and then to production.
  • Converted PL/SQL procedures to Informatica mappings.
  • Written documentation to describe program development, logic, coding, testing, changes and corrections. Developed triggers and stored procedures for data verification and processing.
  • Extensively worked with file transfers between internal and external environments UNIX, Mainframe, and AS400.
  • Extensively worked with file transfer protocols and utilities such as NDM (Connect:Direct), FTP, and copy.
  • Extensively worked on multiple data management projects with onsite and offshore development Teams.
  • Extensively worked on production DQ issues.
  • Supporting production jobs based on tickets and on-call support.
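The shell scripts mentioned above for facilitating data loads typically gate a load on the arrival of a trigger file from the upstream transfer (mainframe/AS400). A minimal sketch; the paths, file names, and polling limits are hypothetical placeholders, and the load command is simulated with an echo:

```shell
#!/bin/sh
# Hypothetical sketch of a trigger-file check run before an ETL load.
# All names are illustrative; a temp dir stands in for /data/inbound.

SRC_DIR=$(mktemp -d)
TRIGGER_FILE="$SRC_DIR/positions_$(date +%Y%m%d).trg"
MAX_WAIT=3                                # polling attempts

wait_for_trigger() {
    i=0
    while [ "$i" -lt "$MAX_WAIT" ]; do
        if [ -f "$TRIGGER_FILE" ]; then
            return 0                      # file arrived, safe to load
        fi
        sleep 1
        i=$((i + 1))
    done
    return 1                              # give up; page operations
}

: > "$TRIGGER_FILE"                       # simulate the upstream transfer
if wait_for_trigger; then
    # In the real job this step would kick off the Informatica session
    # or an SQL*Loader run instead of echoing.
    echo "starting load for $(basename "$TRIGGER_FILE" .trg)"
fi
```

In production the polling window would be much longer and a missed trigger would raise an alert rather than silently exit.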

Environment: Informatica PowerCenter 8.6/8.1 (Designer, Workflow Manager, Workflow Monitor), TOAD, Oracle 10g, DB2, flat files, COBOL files, PL/SQL, Erwin 4.1, UNIX AIX, shell scripting (KSH), Mainframe, AutoSys job scheduler, Harvest version control system.

Confidential, Jersey City, NJ 10/07 to 06/09
Sr. ETL Informatica Designer /Developer

CAMEO is the firm's global margin exposure and collateral management system. It offers a single platform to manage client exposure across all business lines, along with operational tools to create margin calls, book collateral pledges/receives, approve payments, and generate/distribute client reports, and it provides rich reporting for risk management, financial control, exposure management, and front-office business analytics. CAMEO integrates with Lehman Risk to provide VAR/scenario Risk-Based Margin and NYSE Customer Portfolio Margining for clients seeking to improve leverage potential. The CAMEO re-engineering objective is to design and develop a single integrated data warehouse for all the firm's PME (post margin exposure) and collateral-calculated business lines.

Responsibilities:

  • Extensively involved in ETL conversion and analysis of the existing PL/SQL system.
  • Involved in developing and maintaining metadata repositories.
  • Developed logical and physical data models that capture the current and future state of the data elements and data flows using Erwin 4.5/4.1.
  • Involved in dimensional modeling to design and develop STAR schemas and Snowflake schemas.
  • Interact with business users and source systems to define, agree and document incoming data feed specifications.
  • Involved in conversion of PL/SQL source code (stored procedures and functions) into ETL code using Informatica PowerCenter 8.1.
  • Created complex and robust mappings, mapplets to convert load jobs. Analyze the source data structure and perform the Logical mappings.
  • Used different transformations such as Source Qualifier, Joiner, Filter, Router, Lookup, Union, Aggregator, Normalizer, and Update Strategy to transform the data.
  • Involved in delta load process.
  • Involved in designing and tuning most of the Informatica custom transformations.
  • Involved in Debugging and performance tuning for Load Transformations.
  • Used Parameter files and variables in mappings and sessions.
  • Created reusable objects such as mapplets and reusable transformations to incorporate the common functionality between the future margin center cust load and the rest of the batch loads.
  • Extensively involved in redesigning the load flow, eliminating intermediate steps and applying optimizations to achieve significant ETL conversion performance gains.
  • Enhanced session performance, and improved response times, through extensive performance tuning of the mappings, ETL Procedures and processes.
  • Created repository, users, groups and their privileges using Informatica Repository Manager. Responsible for Backup and Recovery procedures for Production and Development servers.
  • Worked with old legacy systems main frames.
  • Designed and implemented Type1 and Type2 Slowly changing dimensions.
  • Interact with end users in a support role to fulfill requests and resolve issues.
  • Involved in identifying common processing patterns across margin centers and creating a conceptual architectural model with technology choices.
  • Identified and tracked the slowly changing dimensions, heterogeneous Sources and determined the hierarchies in dimensions.
  • Involved in Implementing ETL jobs exception handling and framework.
  • Generated SQL Loader scripts and Shell scripts for automated daily load processes. Developed triggers and stored procedures for data verification and processing.
  • Responsible for choosing proper partitioning methods for performance enhancements.
  • Worked on database connections, SQL joins. Optimized Query Performance, Session Performance and Reliability.
  • Involved in planning, executing, and documenting testing of database and ETL programs.
  • Created detailed work plans, post-implementation checks, rollback/back-out plans, and troubleshooting plans for individual loads in the source/target environments (Stage/Prod).
  • Involved in the creation of oracle Tables, Table Partitions, Materialized views and Indexes and PL/SQL stored procedures, functions, triggers and packages. Developed triggers and stored procedures for data verification and processing.
  • Involved in Fine-tuning SQL overrides for performance Enhancements.
  • Involved in unit, integration, and user acceptance testing.
  • Setting up Batches and Sessions to Schedule the Loads at regular intervals.
  • Involved in writing shell scripts to schedule, automate the ETL jobs.
  • Conducted reviews and Co-ordination with offshore and onsite development teams.
  • Extensively worked on production data quality issues and tickets.
  • Responsible for monitor production status and support production schedule jobs.
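Parameter files like those referenced above let one mapping run against different load dates and schemas. A minimal illustrative sketch of generating such a file from a shell script; the folder, workflow, session, and parameter names are invented for the example, not taken from the actual project:

```shell
#!/bin/sh
# Hypothetical sketch: write an Informatica-style session parameter
# file for a given load date. All section and parameter names are
# placeholders; a temp file stands in for the real parameter file path.

PARAM_FILE=$(mktemp)
LOAD_DATE=${1:-$(date +%Y-%m-%d)}

# The [folder.WF:workflow.ST:session] header scopes the parameters
# to one session; \$\$ escapes keep literal $$ in the output.
cat > "$PARAM_FILE" <<EOF
[CAMEO.WF:wf_margin_delta.ST:s_m_margin_delta]
\$\$LOAD_DATE=$LOAD_DATE
\$\$SOURCE_SCHEMA=STG
EOF

echo "wrote parameter file for load date $LOAD_DATE"
grep '^\$\$LOAD_DATE' "$PARAM_FILE"
```

A scheduler would regenerate this file before each run, so reruns for a prior business date only require passing a different argument.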

Environment: Informatica PowerCenter 8.1, PowerExchange/PowerConnect, Designer, Workflow Manager, Workflow Monitor, Oracle 10g, DB2, TOAD, Business Objects XI 3.0, PL/SQL, HP-UX, AIX 4.3.3, shell scripting (KSH), Windows NT 4.0, Mainframes, SQL*Loader, AutoSys, VB application.

Confidential, St. Louis, MO 11/06 to 10/07

Sr. ETL Informatica Designer /Developer

UMB Bank handles commercial and consumer banking operations and offers financial products, services, ideas and solutions to customers and clients in 50 states. The Project objective is to design and develop a Single Integrated Data Warehouse for reporting the loan data for the organization, which enables the Loan servicing Department to track the loans and improve the overall performance. This Data Warehouse which replaced the legacy mainframe reporting tools is an Enterprise level source for ad-hoc and summarized loan information capable of supporting the growing loan portfolio and increased user queries.

Responsibilities:

  • Involved in the business requirements gathering and analysis.
  • Involved in support of ETL application batch processes on a rotational, on-call basis.
  • Involved in dimensional modeling to design and develop STAR schemas using ERWIN4.1 to identify fact and dimension tables.
  • Interact with business users and middleware team to conform data sources into facts and dimensions of the star-schema data model. Conducted peer design reviews.
  • Worked with heterogeneous sources from various channels like Oracle, SQL Server, flat files, and web logs.
  • Created complex and robust mappings and mapplets.
  • Used different transformations such as Joiner, Look-up, Rank, Expressions, Aggregator and Sequence Generator
  • Involved in Fine-tuning SQL overrides for performance Enhancements.
  • Extensively used ETL Tool Informatica to load data from Flat Files, Mainframe Files and DB2 and developed reusable Mapplets and Transformations and designed data Mappings between source systems and warehouse components using Mapping Designer.
  • Involved in designing and tuning most of the Informatica custom transformations.
  • Responsible for choosing proper partitioning methods for performance enhancements.
  • Involved in unit testing of mappings and mapplets; also involved in integration testing and user acceptance testing.
  • Scheduled the sessions to extract, transform, and load data into the warehouse database per business requirements. Involved in reviewing and approving existing ETL jobs.
  • Created sessions, database connections and batches using Informatica server manager. Responsible for scheduling, monitoring and supporting production ETL jobs.
  • Involved in writing shell scripts for load data and process data.
  • Involved in writing shell scripts to schedule, automate the ETL jobs.
  • Scheduled and monitored transformation processes using Informatica server manager.
  • Used Parameter files in mappings and sessions
  • Involved in creating and managing repositories.
  • Involved in creating Ad-hoc and summarized reports.
  • Involved in unit, integration and user acceptance testing of ETL Applications.
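Shell scripts that schedule and automate ETL jobs, as described above, commonly wrap the pmcmd command-line client. The sketch below substitutes echo for pmcmd so it can run without an Informatica installation; the service, domain, folder, and workflow names are placeholders:

```shell
#!/bin/sh
# Hedged sketch of a pmcmd wrapper for starting a workflow. PMCMD
# defaults to 'echo pmcmd' here so the script is runnable standalone;
# in production PMCMD would point at the real binary.

PMCMD=${PMCMD:-"echo pmcmd"}
INFA_USER=${INFA_USER:-etl_batch}
FOLDER=LOANS                       # placeholder folder name
WORKFLOW=wf_daily_loan_load        # placeholder workflow name

run_workflow() {
    # -wait blocks until the workflow finishes, so the exit status
    # reflects the workflow outcome, not just the submit.
    $PMCMD startworkflow -sv IS_PROD -d DOM_PROD \
        -u "$INFA_USER" -p '********' \
        -f "$FOLDER" -wait "$WORKFLOW"
}

if run_workflow; then
    echo "workflow $WORKFLOW completed"
else
    echo "workflow $WORKFLOW failed" >&2
    exit 1
fi
```

A scheduler such as AutoSys would invoke this wrapper and key downstream jobs off its exit status.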

Environment: Informatica PowerCenter 8.0/7.1, Oracle 10g/9i, SQL Server 2005, DB2, flat files, TOAD, Erwin 4.1, UNIX AIX, Cognos 7.1, Mainframes.
Confidential, Pleasanton, CA 02/05 to 11/06
Sr. Data Warehouse Analyst/Developer
Providian Financial Corporation is a company specializing in financial services, credit cards, and related products. A third-party vendor called Total Access maintains Providian's data. Efforts are underway on the Providian side to build its own data warehouse from the DB2 mainframe database (12 terabytes) maintained by Total Access. The plan is to build two different types of data warehouses, EDW and OIS: the latter maintains data for 6 months, while the former maintains all the history spanning several years. The ETL process involved extracting the data from the DB2 mainframe as the source, with the target data warehouse being in Oracle.

Responsibilities:

  • Participated in the entire requirements engineering process right from requirements elicitation phase (where we gather the requirements from the retail business owners and marketing people) to the phase involving documenting of the requirements.
  • Involved in designing, developing, and maintaining the data in the OIS data warehouse and then building the EDW using the data from OIS.
  • Managed the entire "ETL process" involving the access, manipulation, analysis, interpretation and presentation of information from both internal and secondary data sources to customers in sales area.
  • Developed complex mappings using Informatica PowerCenter Designer to transform and load the data from various source systems like Oracle, Teradata, and Sybase into the Oracle target database.
  • Analyzed and understood all data in the source databases and designed the overall data architecture and all the individual data marts in the data warehouse for each of the areas Finance, Credit Cards, Brokerage.
  • Involved in the creation of oracle Tables, Table Partitions, and Indexes. Involved in writing PL/SQL Procedures, Functions and packages to synchronize the EDW data.
  • Implemented various integrity constraints for data integrity like Referential integrity using primary key and foreign keys relationships. Developed and tested stored procedures and functions, cursors, packages for data ETL.
  • Extensively used ETL Tool Informatica to load data from Flat Files, Mainframe Files, SQL Server, and DB2 and developed reusable Mapplets and Transformations and designed data Mappings between source systems and warehouse components using Mapping Designer.
  • Involved in designing and tuning most of the Informatica custom transformations.
  • Implemented delta load process.
  • Involved in writing DB2 load data commands.
  • Handled alerting mechanisms, system utilization issues, performance statistics, capacity planning, integrity monitoring, population, maintenance, reorganization, security, and recovery of databases.
  • Worked in an offshore-onshore coordination setting, delegating work to and managing a group at Accenture in India.
  • Involved in quality assurance of data, automation of processes.
  • Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements
  • Involved in the development and testing of individual data marts, Informatica mappings and update processes.
  • Used Task developer in the Workflow manager to define sessions. Supporting production jobs based on tickets and on-call support.
  • Handled change data capture (CDC).
  • Created application-specific Data Marts so that users can access personalized dashboards of information that is specific to their department and business unit.
  • Responsible for monitoring all the sessions that are running, scheduled, completed and failed.
  • Worked on database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Created repository, users, groups and their privileges using Informatica Repository Manager
  • Involved in writing UNIX shell scripts to automate the ETL workflows.
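Change data capture at the file level, as in the delta load process mentioned above, can be approximated by diffing successive extracts: rows present in today's file but not yesterday's are the inserts and updates to apply. A self-contained sketch with made-up sample rows:

```shell
#!/bin/sh
# Illustrative file-level CDC sketch: compare two daily extracts to
# find new or changed rows. The data and layout are invented for the
# demo; real extracts would come from the DB2 unload step.

WORK=$(mktemp -d)

cat > "$WORK/yesterday.dat" <<'EOF'
1001|SMITH|OPEN
1002|JONES|OPEN
EOF

cat > "$WORK/today.dat" <<'EOF'
1001|SMITH|CLOSED
1002|JONES|OPEN
1003|BROWN|OPEN
EOF

# comm requires sorted input; -13 keeps lines unique to the second
# file, i.e. inserts plus updated rows relative to yesterday.
sort "$WORK/yesterday.dat" > "$WORK/y.sorted"
sort "$WORK/today.dat"     > "$WORK/t.sorted"
comm -13 "$WORK/y.sorted" "$WORK/t.sorted" > "$WORK/delta.dat"

echo "delta rows:"
cat "$WORK/delta.dat"
```

The delta file (here the changed 1001 row and the new 1003 row) would then feed the incremental mapping instead of the full extract.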

Environment: Informatica Power Center 7.1, Business Objects, Oracle 9i, DB2, SQL/PLSQL, SQL*Loader, UNIX Shell programming, Erwin 4.0, UNIX (IBM - AIX) and Windows 2000.

Confidential, IN 10/03 to 02/05
ETL Informatica Designer /Developer

The Life Insurance system is designed and developed using PL/SQL and stored procedures for the design, development, and deployment of Customer & Services data marts enterprise-wide. This data warehouse is the data feed for the reporting system, which uses Business Objects as the reporting tool and Oracle as the back end. The system is designed to assist doctors in their diagnosis and analysis processes and requires a wide range of reports to be generated to support decision-making. Patient visit details, lab report details, group reports, historical details of patients, and analysis reports are generated using Business Objects.

Responsibilities:

  • Responsible for Design and Development of complete ETL application.
  • Responsible for the development and production support of a data warehouse by using Informatica Power center.
  • Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data.
  • Used most of the transformations, such as Source Qualifier, Aggregator, connected and unconnected Lookups, Filter, and Sequence Generator.
  • The project called for extensive work using the Informatica tool, involving the Source Analyzer, Warehouse Designer, Mapping Designer, mapplets, and transformations.
  • Analyze the source data structure and perform the Logical mappings.
  • Analyzed the systems, met with end users and business units in order to define the requirements.
  • Responsible for tuning ETL procedures and STAR schemas to optimize load and query performance.
  • Extensively worked in the performance tuning of the programs, ETL Procedures and processes.
  • Involved in the creation of oracle Tables, Table Partitions, Materialized views and Indexes and PL/SQL stored procedures, functions, triggers and packages. Developed triggers and stored procedures for data verification and processing.
  • Extensively used slowly changing dimensions type 1 & 2.
  • Involved in Debugging and performance tuning for Load Transformations.
  • Worked with Data Transformation Services (DTS) to extract, transform, and consolidate data from disparate sources into single or multiple destinations.
  • Built complex transformations such as Filter and Lookup transformations.
  • Setting up Batches and Sessions to Schedule the Loads at regular intervals.
  • Applied various Transformations on the source data to populate the data marts
  • Involved in creating target database for the Data marts.
  • Developed PL/SQL Packages to synchronize data in warehouse with various data marts.
  • Written documentation to describe program development, logic, coding, testing, changes and corrections. Developed triggers and stored procedures for data verification and processing.
  • Documented user requirements, Translated requirements into system solutions and develop implementation plan and schedule
  • Wrote shell scripts to run ETL workflows.
  • Provided production on-call support and changed existing code based on business needs.
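Production support of this kind usually relies on each workflow step writing an audit trail that on-call staff can inspect. A hypothetical sketch of such a step wrapper; the step names and commands are placeholders:

```shell
#!/bin/sh
# Hypothetical sketch of audit logging around ETL workflow steps:
# each step's name, status, and timestamp are appended to a run log.
# 'true'/'false' stand in for the real extract/load commands.

RUN_LOG=$(mktemp)

run_step() {
    step_name=$1; shift
    if "$@"; then
        status=SUCCESS
    else
        status=FAILURE
    fi
    echo "$(date '+%Y-%m-%d %H:%M:%S')|$step_name|$status" >> "$RUN_LOG"
    [ "$status" = SUCCESS ]          # propagate the step's outcome
}

run_step extract_patients true
run_step load_visits true
run_step bad_step false || echo "bad_step failed as expected"

cat "$RUN_LOG"
```

On call, a failed run is then a single grep for FAILURE in the run log rather than a trawl through session logs.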

Environment: Informatica PowerCenter 6.2/7.0, Business Objects, SQL Server 2000, Oracle 8i, TOAD 7.6, Windows NT/2000, and HP-UX.

Confidential, 09/01 to 10/03
Software Engineer

Project: District Management Information System
Client: National Association of Securities Dealers (NASD)

The District Management Information System (DMIS) is one of the main legacy applications used by National Association of Securities Dealers Regulation (NASDR). It helps in collecting, analyzing, and reporting information about the examinations conducted for member firms and new firms applying for NASD membership. This application forms the backbone of the District Offices since it relates to their regulatory activities. The system has six main subsystems: Membership Profile, Cycle, Cause, Actions, Time Tracking and Staffing, and Security.

Responsibilities:

  • Developed screen prototypes.
  • Coding and unit testing.

Environment: Pentium PC, Windows NT 4.0, Visual Basic, Oracle.

EDUCATION

Bachelor of Engineering in Electronics & Instrumentation

Professional Diploma in Advanced Software Technologies
