Lead ETL Developer / Production Support Analyst Resume
Dallas, TX
SUMMARY
- Over 14 years of IT experience in the design, development and deployment of data marts and data warehouses serving as decision support systems for the business, using the Informatica ETL tool, data modeling, data analysis and Business Intelligence tools.
- Worked as a senior-level data warehouse developer with strong experience across Data Warehouse project tasks such as analysis, design and the ETL (Extract, Transform and Load) process using Informatica, Oracle and UNIX, with extensive Production Support experience.
- Good knowledge of installation and configuration of Informatica server and client environments.
- Extensive software development life cycle experience on data warehousing projects, covering data analysis, extraction, transformation and loading using ETL tools on relational databases for development and data migration projects.
- Good knowledge of logical and physical data modeling and database design.
- Well versed in the Oracle development tool set (including SQL*Plus, PL/SQL, SQL*Loader, PL/SQL Developer and TOAD).
- Extensive experience in developing packages, stored procedures and database triggers.
- Extensive experience in designing and building universes and business reports using Business Objects.
- Extensive experience in performance tuning of Oracle stored procedures and of Informatica mappings and sessions.
- Well versed in the Teradata utilities BTEQ, BTEQWIN, FastLoad, FastExport, NCR's Queryman, MultiLoad and TPump.
- Experience integrating various data sources, such as Oracle, MS SQL Server, XML and flat files (COBOL), into the staging area.
- Experience in the Finance, Pharmaceutical, Mutual Fund, Airline, Insurance and Banking domains.
- Good understanding of data warehousing concepts, star schema, snowflake schema and Master Data Management, with strong analytical, interpersonal, programming and problem-solving skills.
TECHNICAL SKILLS
LANGUAGES: SQL, PL/SQL, Shell script, C, C++, Perl, AWK, SED, Java and COBOL
ORACLE TOOLS: SQL*Loader, SQL*Plus, Designer/2000
DATABASE: ORACLE 8i/9i, DB2, MS SQL Server, Teradata V2R6, Sybase and MS-Access
DATABASE TOOLS: Toad, SQL Navigator, Shareplex and DB2 Connect
DATA MODELING TOOLS: ERWIN 4.1
ETL TOOLS: Informatica PowerCenter/PowerMart 7.1/8.1/9.1, PowerExchange 7i/8.1/8.5/9.1, DataStage 7
GUI/TOOLS: HTML, XML, JavaScript, Visual Basic5.0, Trillium, Test Director and Remedy
OPERATING SYSTEMS: HP-UX, Linux, AIX, AS/400, Windows 95/NT/2000/2003/XP, MS-DOS and UNIX (Sun Solaris)
SCHEDULING TOOLS: Control-M, Autosys, AppWorx and crontab
VERSION CONTROL: Visual SourceSafe, PVCS, StarTeam, SVN
WEB TECHNOLOGIES: WebLogic, Web Services
BUSINESS INTELLIGENCE TOOLS: Business Objects, QlikView, Crystal Reports, Cognos (PowerPlay, Transformer, Impromptu) and OLAP cubes
PROFESSIONAL EXPERIENCE
Confidential, Dallas, TX
Lead ETL developer/Associate
Environment: Informatica PowerCenter 8.6/9.1, Informatica PowerExchange 8.6/9.1, Oracle 11g, Java/J2EE, Hibernate, Terracotta, MS Office tools, UNIX, Linux, Control-M/Autosys batch scheduling.
Responsibilities:
- Working on BAU defects and providing L3/L4 support for the Global Collateral Engine (GCE) Data Repository.
- Worked as an active participant with architecture team in defining ETL architecture and data mapping between the systems.
- Involved in requirement-gathering meetings with business analysts, providing input as needed and helping prepare the Requirement Specification Documents.
- Worked closely with the Informatica Administration team on the Informatica upgrade project.
- Developed mapping specification documents for various projects like EA 1.0, EA2.0, EA3.0, GCF Project, ACCE Project, Long Box Project, and EVARE Project.
- Developed real-time Informatica mappings as part of the LongBox project to bring Positions, Securities and Obligations data in real time from the Mainframe system into the Global Collateral Engine (GCE) Data Repository.
- Helped on the ACCE project, developing Informatica workflows with various transformations to bring custody and non-custody positions and obligations from GDW into the (GCE) Data Repository.
- Worked on the GCF project to bring FICC and BONY obligations and security data into the Global Collateral Engine (GCE) Data Repository, leading four ETL developers.
- Developed implementation documents for weekly releases and monthly PROD releases with step-by-step release instructions.
- Worked closely with the Release management teams in reviewing implementation plans and change requests for higher environments and Production releases.
- Developed MQ ETL workflows to get account and legal entity updates from IM to GCE data repository.
- Developed UNIX shell scripts for the ABC framework and for automation of ETL deployments.
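Deployment and workflow-automation scripts of the kind mentioned above typically wrap Informatica's pmcmd utility. A minimal sketch follows; the `$INFA_*` variables, folder and workflow names are illustrative assumptions, not details from the actual project:

```shell
#!/bin/sh
# Hypothetical sketch of a pmcmd wrapper; all names are illustrative.

build_pmcmd_cmd() {
    # $1 = repository folder, $2 = workflow name.
    # Builds the pmcmd startworkflow command line from environment settings.
    echo "pmcmd startworkflow -sv ${INFA_INT_SVC} -d ${INFA_DOMAIN}" \
         "-u ${INFA_USER} -p ${INFA_PWD} -f $1 -wait $2"
}

run_workflow() {
    # Invoke the workflow and translate the result into a 0/1 exit
    # convention that a batch scheduler can act on.
    cmd=$(build_pmcmd_cmd "$1" "$2")
    if $cmd; then
        echo "workflow $2 completed"
        return 0
    else
        echo "workflow $2 FAILED" >&2
        return 1
    fi
}
```

In practice the wrapper would also log to a run directory and be invoked by the scheduler rather than interactively.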
Confidential, Dallas, TX
Lead ETL developer/Production Support Analyst
Environment: Informatica PowerCenter 8.6, DataStage 7.5, Oracle 10g/11g, Teradata V2R6/Teradata 12, Cognos 8 (PowerPlay Transformer, PowerPlay, Query Studio, Report Studio), PL/SQL Developer, Perl, Sun Solaris, MS Office tools, UNIX, Linux, Control-M.
Responsibilities:
- Worked on a migration project to convert ETL code from DataStage to Informatica, leading three offshore ETL developers.
- Worked on a data integration project to integrate data from the Getronics Data Warehouse into the Confidential Data Warehouse.
- Worked in a 24/7 on-call support environment, on a weekly on-call rotation among the Production Support Engineers.
- Involved in planning the migration from UNIX to Linux and tested ETL jobs as part of that effort.
- Worked with the business users and data architects to define data mapping between the Getronics Data Warehouse and the Confidential Data Warehouse.
- Worked as an active participant with Data Warehouse architects in defining the ETL architecture and data mapping between systems.
- Worked on operational and non-operational tickets.
- Extensive experience with the Problem and Change Management processes.
- Provided LOE (Level of Effort) estimates to management for coding the Informatica mappings and Oracle stored procedures for the Getronics data migration project.
- Helped Reporting team by tuning SQL queries to improve the performance in Production.
- Involved in meeting with users for requirement gathering and prepared System Requirement Specification Document (SRS).
- Developed Informatica mapping to pull data from XML files to Oracle as part of US Bank Project.
- Developed Informatica mappings for the Getronics data migration project.
- Used transformations such as Filter, Expression, Aggregator, Lookup, Sequence Generator, Joiner, Router and Update Strategy to massage the data and migrate clean, consistent data.
- Created and debugged Oracle stored procedures.
- Developed SQL queries for data analysis and PL/SQL to implement business logic.
- Worked closely with the business users in resolving Production data issues.
- Developed MLOAD scripts and data validation scripts using BTEQ.
- Created Cognos cross-tab and simple reports.
- Developed UNIX shell scripts run against the databases according to the DWH standards.
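The BTEQ data validation scripts mentioned above commonly compare staging and target row counts. A minimal sketch of generating such a script from a shell function follows; the logon placeholders and table names are illustrative assumptions, not project details:

```shell
#!/bin/sh
# Hypothetical sketch: emit a BTEQ row-count validation script.
# Logon placeholders and table names are illustrative only.

make_bteq_validation() {
    # $1 = staging table, $2 = target table
    cat <<EOF
.LOGON \${TDP_ID}/\${TD_USER},\${TD_PWD};
SELECT CASE WHEN s.cnt = t.cnt THEN 'PASS' ELSE 'FAIL' END
FROM (SELECT COUNT(*) AS cnt FROM $1) s,
     (SELECT COUNT(*) AS cnt FROM $2) t;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
}
```

The generated script would then be fed to bteq, with the nonzero `.QUIT` code letting the calling shell script detect a failed validation.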
Confidential, Dallas, TX
Sr. ETL Developer
Environment: Informatica PowerCenter 8.1.1, Oracle 9i, QlikView, SQL Server, Mainframe, Erwin 4.1, Informatica PowerExchange 8.5, TOAD, Perl, Service Center, MS Office tools, UNIX Sun Solaris and crontab.
Responsibilities:
- Prepared technical documentation of Source and Target mappings.
- Prepared Technical Specification documents (TSD) for Mappings, Sessions and workflows for new requirement changes.
- Developed ETL mappings using transformations such as Source Qualifier, Joiner, Expression, Aggregator, Rank, Lookup, Filter, Router, Union, Normalizer and Sequence Generator.
- Developed Informatica mappings to pull claims data from Mainframe systems and load it into the EDW.
- Troubleshot and resolved ETL job failures as part of post-implementation monitoring.
- Prepared test data and executed unit and system test scripts, analyzed, captured and published test results.
- Worked on defects that were discovered in System and Integration testing.
- Extensive performance tuning of Informatica Mappings.
- Developed Visio diagrams to help management better understand the ETL architecture and Production environment.
- Developed data maps in PowerExchange from COBOL copybooks.
- Prepared the Production support operational manual for the Production support team.
- Worked in a 24/7 on-call support environment, on an on-call rotation among the Production Support Engineers.
- Worked with System owners to resolve source data issues and refine transformation rules.
Confidential, Dallas, TX.
ETL Developer
Environment: Teradata V2R6, Oracle 9i, Informatica PowerCenter 7.1/8.1, Business Objects, Erwin 4.1, SQL Navigator, Perl, StarTeam, SAP BW, Peregrine, MS Office tools and UNIX.
Responsibilities:
- The Confidential Enterprise Data Warehouse (PEDW) team is responsible for transferring data files created in SAP and ODS to ETL servers for processing.
- Involved in Phase1 Project Implementation in Production.
- Worked in a 24/7 on-call support environment as part of post-implementation monitoring.
- Supported data extraction, cleansing, aggregation, reorganization, transformation and load operations.
- Accessed data from multiple operational data sources, re-mapped source data into a common format, filtered data, calculated derived values and aggregates, and loaded the cleansed data into the central DW.
- Extensive use of Informatica 8.1 and ETL performance tuning.
- Error handling: implemented in Informatica mappings and sessions, and through BTEQ validation scripts in Teradata 6.2.
- Hands-on working experience with the StarTeam configuration management tool.
- Extensive experience working with Problem and Change Management tools.
- Used Control-M as the job scheduler.
- Developed wrapper scripts in Perl and Korn shell to invoke Informatica workflows and execute Teradata MLOAD scripts.
- Developed ETL mappings using transformations such as Source Qualifier, Joiner, Expression, Aggregator, Rank, Lookup, Filter, Router, Union, Normalizer and Sequence Generator; implemented error handling and automated batch ID generation.
- Performed Teradata performance tuning by collecting statistics and defining join indexes on tables; also improved query performance by creating primary indexes that ensure a more even distribution of data.
- Developed UNIX shell and Perl scripts run against the databases according to the DWH standards.
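The statistics-collection part of the Teradata tuning work above is often scripted. A minimal sketch follows, generating one COLLECT STATISTICS statement per "table column" pair read from a control file; all table and column names are illustrative assumptions, and the output would normally be piped into bteq:

```shell
#!/bin/sh
# Hypothetical sketch: generate COLLECT STATISTICS statements from a
# control list. Table/column names are illustrative only.

gen_collect_stats() {
    # Reads "table column" pairs on stdin; emits one statement per pair.
    while read tbl col; do
        [ -n "$tbl" ] || continue          # skip blank lines
        echo "COLLECT STATISTICS ON $tbl COLUMN ($col);"
    done
}
```

Usage would look like `gen_collect_stats < stats_list.txt | bteq`, so the DBA-maintained control list, not the script, decides which statistics are refreshed.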
Confidential, Denver, CO
Sr. Informatica Developer/Team Lead
Environment: Oracle 9i, Informatica PowerCenter 6.2/7.1, Erwin 4.1, SQL Navigator, SQL, PL/SQL, Perl, SQL*Loader, SAP R/3, ALE, SAP BW, SOA, Toad, MS Office tools and UNIX.
Responsibilities:
- Involved in preparing the project plan with Project Managers, providing LOE estimates for the Informatica coding and identifying the dependencies.
- Involved in meeting with users for requirement gathering and prepared requirement specification document.
- Coordinated meetings with the users to review prepared Requirement Specification Document and to get sign-off for further Informatica coding and for preparing TSD.
- Prepared Technical Specification documents (TSD) for Mappings, Sessions and workflows for new requirement changes.
- Developed Informatica mappings and workflows to validate data by pulling the required accounting, master and transactional data from three source systems (MLS, RES and PAM) and transform it into the specified sender structure formats required to load into SAP BW for users' reporting needs.
- Developed shell scripts for file watching and file validation, and to call Informatica workflows from the UNIX shell prompt.
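A file watcher of the kind described above can be sketched as a small polling loop plus a basic validation check before the workflow is kicked off. The trailer convention (a final "TRL|&lt;row count&gt;" line) and all paths are assumptions for illustration, not details from the actual project:

```shell
#!/bin/sh
# Hypothetical sketch of file-watcher and file-validation logic.
# Paths, polling limits and the trailer convention are assumptions.

wait_for_file() {
    # $1 = file path, $2 = max polls, $3 = seconds between polls.
    i=0
    while [ "$i" -lt "$2" ]; do
        [ -s "$1" ] && return 0          # file present and non-empty
        sleep "$3"
        i=$((i + 1))
    done
    return 1                              # timed out
}

validate_file() {
    # Assumed convention: last line is a trailer "TRL|<row count>".
    expected=$(tail -1 "$1" | cut -d'|' -f2)
    actual=$(( $(wc -l < "$1") - 1 ))     # data rows, excluding trailer
    [ "$actual" -eq "$expected" ]
}
```

On success the real script would go on to call the Informatica workflow (for example via a pmcmd wrapper); on failure it would alert the support team instead of loading a partial file.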
Confidential, Houston, TX
Sr. Informatica Developer/Team Lead
Environment: Oracle 9i, SQL Server, Teradata V2R5, Informatica PowerCenter 6.2/7.1, Erwin 4.1, SQL Navigator, SQL, PL/SQL, Perl, SQL*Loader, Sybase, Siebel, Toad, MS Office tools, Windows 2003, UNIX and Remedy Action Request System.
Responsibilities:
- Involved in preparing the project plan with Project Managers, identifying the dependencies and prerequisites for loading different entities into One3.
- Coordinated with the DBA team, source systems, BSA team, offshore team and Data Quality team for the different data loads (System Test, Integration System Test, UAT and pre-production deployment) and in identifying bugs, new requirement changes and data quality issues.
- Worked in an offshore/onshore model, leading five offshore developers for development, data loads and RCS Track tickets.
- Prepared Technical Specification documents (TSD) for Mappings, Sessions and workflows for new requirement changes.
- Developed Informatica mappings, workflows to pull the SMBU Issue data from Remedy21 and to load into the One3.
- Developed ETL (Extract, Transform, Load) processes on both Teradata and Oracle using Korn shell scripting and RDBMS utilities such as MultiLoad, FastLoad, FastExport and BTEQ (Teradata) and SQL*Plus and SQL*Loader (Oracle).
- Developed SQL queries for data analysis and PL/SQL to implement business logic.
- Created and debugged Oracle stored procedures.
- Worked on bugs, issues, requirement changes and DQ issues raised in the RCS Track tracking tool, assigning each to the corresponding developer or team.
Confidential, Thousand oaks, CA.
Sr. Informatica Developer
Environment: Oracle 9i, DB2, Informatica PowerCenter 7.1, PowerExchange, Erwin 4.1, SQL Navigator, SQL, PL/SQL, Business Objects 5.1, SQL*Loader, Trillium, UNIX shell script, Toad, MS Office tools, Test Director and the Control-M scheduling tool.
Responsibilities:
- Involved in preparing the CSD (Consolidated Summit Deliverable) from the SRS (System Requirement Specification) document provided by the client.
- Prepared Technical Specification documents (TSD) for Mappings, Sessions and workflows.
- Prepared the implementation plan and Production support manual for the GP Model-N project.
- Used transformations such as Filter, Expression, Aggregator, Lookup, Sequence Generator, Joiner, Router, Update Strategy and Stored Procedure to massage the data and migrate clean, consistent data.
- Created Mappings, sessions and workflows.
- Created data maps in PowerExchange from COBOL copybooks.
- Scheduled workflows to refresh the data from multiple sources to a single staging area using Informatica Workflow Manager.
- Used Informatica as the ETL tool, extracting data from flat files, transforming the data for the target database and performing the load.
- Performed ETL performance tuning by utilizing dynamic caches for lookups and partitioning the sessions.
- Tuned various SQL queries by identifying bottlenecks using Explain Plan and TKPROF.
- Involved in Unit testing, System testing and Integration testing.
Confidential, San Mateo, CA.
Production Support Engineer/ Sr. Informatica Developer/Informatica Administrator
Environment: Oracle 9i, Sybase, Informatica PowerCenter 6.2/7.1, Ab Initio, Erwin 4.1, SQL Navigator, SQL, PL/SQL, PowerExchange, VMS, star and snowflake schemas, Shareplex, Business Objects 5.1, Broadcast Agent 6.5, Perl, SQL*Loader, Trillium, UNIX shell script, Toad, MS Office tools, NDM (Network Data Mover), MQ Series messaging system, Test Director, Control-M scheduling tool and AppWorx.
Responsibilities:
- Worked on operational and non-operational tickets.
- Worked in a 24/7 on-call support environment, on a weekly on-call rotation among the team members.
- Worked in an offshore/onshore model, leading four offshore developers on enhancement projects.
- Extracted data from flat files and various RDBMSs such as Oracle and DB2, loading into Oracle and Teradata relational databases.
- Supported Production systems, on a rotation basis, for a week at a time.
- Troubleshot ETL production problems.
- Resolved tickets related to feeds from different application into the data warehouse.
- Reloaded input files into staging and production databases.
- Used MLOAD and scripts to correct data and fix production problems.
- Developed new ETL code for enhancements and bug fixes.
- Installed and configured Informatica PowerCenter 6.2/7.1 client tools and connected to each database in the data warehouse using the repository server.
- Extracted data from the legacy system and loaded it into the staging database.
- Used transformations such as Filter, Expression, Aggregator, Lookup, Sequence Generator, Joiner, Router, Update Strategy, Normalizer and Stored Procedure to massage the data and migrate clean, consistent data.
- Used the Debugger to test mappings and fix bugs.
- Developed mapplets and reusable transformations.
- Tuned Various SQL queries, by Identifying bottlenecks using Explain plan and TKPROF.
- Tuned Informatica sessions by Partitioning sessions and tuning session parameters.
- Involved in designing and building Business Objects universes for the ISR Data Mart.
- Included drill-down/drill-up capabilities in universes using the Aggregate Awareness function and hierarchies.
- Created Business Objects cross-tab and simple reports.
- Fine-tuned the universe for efficient queries by implementing shortcut joins and Aggregate Awareness functionality.
- Scheduled and distributed Business Objects reports through BCA (Broadcast Agent).
- Created Reporter and Explorer reports (cross-tab and graphical) covering different types of analysis, with drill-down and alert features.
- Created various mappings to load data into interface tables, from which it is later moved to dimension tables.
- Created mappings for loading data into fact tables.
- Developed Shell Scripts to integrate Informatica sessions with Control-M scheduler.
- Scheduled jobs in Control-M to refresh the data from multiple sources to a single staging area using Informatica Workflow Manager.
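The main job of the Control-M integration scripts described above is to run a step and propagate a clean exit code so the scheduler can mark the job green or red. A minimal sketch follows; the step labels and log format are illustrative assumptions:

```shell
#!/bin/sh
# Hypothetical sketch of a scheduler-friendly step runner.
# Step labels and the timestamped log format are assumptions.

run_step() {
    # $1 = step label; remaining args = command to run.
    label=$1; shift
    if "$@"; then
        echo "$(date '+%Y-%m-%d %H:%M:%S') $label OK"
        return 0
    else
        rc=$?
        echo "$(date '+%Y-%m-%d %H:%M:%S') $label FAILED rc=$rc" >&2
        return "$rc"
    fi
}
```

Because the failing command's return code is passed through unchanged, Control-M can distinguish job outcomes and trigger its reruns or alerts accordingly.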
Confidential
ETL Informatica/Report Developer
Environment: Oracle 9i, Informatica PowerCenter 4.7/5.1, Cognos 6, SQL Navigator, SQL, PL/SQL.
Responsibilities:
- Developed Informatica mappings to load British Airways customer data into the Data Warehouse, using transformations such as Source Qualifier, Joiner, Expression, Aggregator, Rank, Lookup, Filter, Router, Union, Normalizer and Sequence Generator.
- Developed Oracle Stored procedures for data validations.
- Worked on Problem and Change management tools.
- Coordinated code review sessions with co developers and worked closely with DBA teams for the implementation of projects in production.
- Developed cross-tab reports using Cognos.
- Helped Reporting team by tuning SQL queries to improve the performance in Production.
- Developed UNIX shell scripts run against the databases according to the DWH standards.