Data Warehouse ETL Developer Resume Profile
Professional Summary
- Six years of experience as an ETL Developer across IT domains including Pharmaceuticals, Education, Airlines, Healthcare, Consulting, Financial, and Retail.
- Expertise in data warehouse development using Oracle 10g/11g, SQL Server 2005/2008, DB2 UDB, SQL, PL/SQL, Informatica 9.1/8.6/7.1.4, and UNIX shell scripts at various stages of Data Warehouse, Data Conversion, Historical Data Migration, and ETL projects.
- Trained and mentored a team of 15 people on Informatica 9.1 in the DWH group at UCSD.
- Experienced in Dimensional Data Modeling: Star Schema and Snowflake modeling, Fact and Dimension tables, and Physical and Logical data modeling.
- Strong knowledge of writing complex PL/SQL packages, stored procedures, XML DOM parsers, and triggers, as well as performance tuning, application tuning, and query optimization using hints, EXPLAIN PLAN, and TKPROF.
- Worked with the Cognos 10.1 reporting tool to develop complex reports using Report Studio, Business Insight Advanced, and Query Studio.
- Actively participated in gathering business requirements for ETL needs.
- In-depth experience using Informatica transformations such as Router, Update Strategy, Connected/Unconnected Lookup, Aggregator, Stored Procedure, and User Defined Functions, along with techniques for pivoting rows to columns and columns to rows (see the SQL sketch after this list).
- Proficient in Normalization/Denormalization techniques in relational/dimensional database environments.
- Involved in the design, development and testing of PL/SQL database applications to support project delivery, production systems and business development activities.
- Proficient in designing physical database objects based on data volumes, CPU utilization, end-user data accessibility, and archival and load strategies for optimized database performance.
- Excellent proficiency in managing the entire testing process using the global test management tool HP Quality Center.
- Currently serving as a Senior ETL Developer at the University of California, San Diego (UCSD).
- Designed complex ETL mappings and reusable transformations using Regular Expressions, Dynamic Lookups, and Update Strategies for slowly changing dimensions; tuned performance-critical stages with partitioning and persistent cache; made extensive use of mapplets for common subroutines.
- Improved ETL performance by implementing an archival strategy, partitioning, and session-level parameter files, and fine-tuned SQL by eliminating unnecessary loops and fixing missing indexes, inefficient pagination, and recursive queries between parent and child tables.
- Excellent communication skills for conveying technical information to end users, with the ability to work independently on multiple projects.
- Experience with the Software Development Life Cycle (SDLC) process: detailed analysis, requirements gathering and documentation, coding, unit testing, and integration and system testing.
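A minimal SQL sketch of the rows-to-columns pivoting technique mentioned above, using conditional aggregation (which works on Oracle 10g; 11g+ could use the PIVOT clause instead). The cust_attrs table and its attribute codes are hypothetical, for illustration only.

```sql
-- Pivot one row per (cust_id, attr) into one row per cust_id.
SELECT cust_id,
       MAX(CASE WHEN attr = 'EMAIL' THEN val END) AS email,
       MAX(CASE WHEN attr = 'PHONE' THEN val END) AS phone,
       MAX(CASE WHEN attr = 'FAX'   THEN val END) AS fax
FROM   cust_attrs        -- hypothetical name/value attribute table
GROUP  BY cust_id;
```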
Skills
Operating Systems | Windows XP/2000/NT/98/95, MS-DOS, HP-UX, Linux |
Languages | SQL, PL/SQL, UNIX Shell Scripting, XML |
Database | Oracle 11g/10g/9.x/8.x/7.x, SQL Server 2005/2008, DB2 UDB, MS Access |
ETL Tools | Informatica PowerCenter 9.1/8.6/8.1/7.1.3/6.1, OBIEE BI Tool |
Reporting Tools | Cognos 10.1/10.2 Report Studio/Query Studio/Business Insight Advanced |
Tools | TOAD, SQL*Loader, PL/SQL Developer, Schema Builder, Procedure Builder, Graphics Builder, Erwin 7.1 |
Documentation Tools | MS Visio, MS Office. |
Project Experience
Confidential
External Affairs Data Warehouse (EADW): The EADW data warehouse is a critical business model that deals with customer information and donor capacity using statistical information stored in the warehouse. The dimensional model is a star schema with nearly 30 dimension tables and 4 fact tables. Informatica 9.1 is the main ETL tool used to pull data from various legacy systems into the Oracle database. Cognos was the reporting tool used to create reports for business needs from the EADW data.
Senior Informatica Developer
Responsibilities:
- Involved in end-to-end development of the implementation and rollout.
- Coordinated with the business to gather requirements and prepare the Functional Specification Document.
- Served as the senior developer building the Informatica mappings according to business needs.
- Developed over 90 mappings to support the business logic, including historical data for reporting needs.
- Used transformations like Router, Update Strategy, Lookups, Normalizer, Filter, Joiner, and Aggregator.
- Developed Type-1 and Type-2 mappings for current and historical data (see the SQL sketch of the Type-2 pattern after this list).
- Incorporated business logic for incremental data loads on a daily basis.
- Wrote complex PL/SQL procedures for specific requirements.
- Developed mapplets and worklets for organizing the data feeds.
- Used mapping parameters and mapping variables for incremental data feeds.
- Used shared folders for sources, targets, and lookups to make objects reusable.
- Scheduled the Informatica jobs from the third-party scheduling tool Autosys.
- Installed and migrated Informatica from version 8.6 to 9.1.
- Performed the administrator role in migrating objects between environments (DEV/QA/PROD).
- Provided on-call support for production maintenance.
- Platform: Informatica 9.1, DB2 UDB, UNIX, Autosys, SQL Server 2008, Cognos 10.1
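The Type-2 mappings above were built in Informatica with Update Strategy transformations; the following is a minimal SQL sketch of the equivalent close-and-insert logic. The dim_donor and stg_donor tables, the donor_level attribute, and the dim_donor_seq sequence are hypothetical names used only for illustration.

```sql
-- Close out the current row when a tracked attribute changes (Type-2 history).
UPDATE dim_donor d
SET    d.eff_end_dt = SYSDATE, d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1 FROM stg_donor s
               WHERE  s.donor_id = d.donor_id
               AND    s.donor_level <> d.donor_level);

-- Insert a new current version for changed rows, plus brand-new donors.
INSERT INTO dim_donor (donor_key, donor_id, donor_level,
                       eff_start_dt, eff_end_dt, current_flag)
SELECT dim_donor_seq.NEXTVAL, s.donor_id, s.donor_level,
       SYSDATE, DATE '9999-12-31', 'Y'
FROM   stg_donor s
WHERE  NOT EXISTS (SELECT 1 FROM dim_donor d
                   WHERE  d.donor_id = s.donor_id
                   AND    d.current_flag = 'Y'
                   AND    d.donor_level = s.donor_level);
```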
Confidential
The SCEPTRE application, which stores aircraft maintenance information, is built on an IMS database. IMS is a hierarchical database and is not set up so that users can access the data easily. The M&E DWH was developed to replicate the SCEPTRE IMS tables in an Oracle database in order to perform reporting and analysis on the maintenance data. Date/Time Stamp Project: a newly added segment in the IMS database carries the audit trails of all segments, keeping track of the changes made to the old segments. The DWH holds 18 months' worth of data used for reporting and analysis purposes.
Senior Data Warehouse ETL Developer
Responsibilities:
- Played the role of a Senior ETL Developer in the implementation and rollout plan.
- Served as point of contact for end users, gathering requirements and setting up process plans in coordination with business analysts.
- Contributed to developing business and functional requirements, including requirements traceability, categorization, and prioritization.
- Participated in requirements-gathering meetings and estimated project timelines and tasks.
- Developed Informatica mappings to load data from the SCEPTRE IMS database into Oracle tables.
- Developed sessions and workflows, including command tasks that invoke the ETL jobs from UNIX shell scripts.
- Involved in designing the Informatica mappings for full refreshes of the data from the IMS database.
- Also responsible for capturing changes on SCEPTRE by creating CDC mappings that run continuously, 24 hours a day.
- Designed various ETL jobs for pulling data from various legacy systems and loading it into the integrated database.
- Performed data profiling to provide high-quality data to business users and customers (see the profiling sketch after the platform line).
- Scheduled the workflows to run on a daily basis.
- Involved in Phase-2 development of the data marts used for reporting purposes.
- Developed complex Oracle PL/SQL procedures, packages, and functions to support business needs.
- Developed various reusable mapplets for common code lookups.
- Analyzed existing packages and shell scripts and converted them into Informatica with enhancements requested by business users.
- Used various transformations such as Expressions, Lookups, and Mapplets in the ETL mappings.
- Wrote manual SQL procedures for some of the ETL and data cleansing tasks in the ETL load to the ODS.
- Accountable for developing the core code of the ongoing interfaces in line with the PL/SQL coding standards.
- Accountable for production support activities and UAT.
Platform: Informatica 8.6, Oracle 10g, and UNIX Shell Programming
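A small sketch of the kind of data-profiling query used to check a staged feed before loading: row counts, key completeness, and the date range actually received. The stg_maintenance_events table and its columns are hypothetical, for illustration only.

```sql
SELECT COUNT(*)                       AS total_rows,      -- volume check
       COUNT(DISTINCT work_order_id)  AS distinct_keys,   -- key uniqueness
       COUNT(*) - COUNT(work_order_id) AS null_keys,      -- missing keys
       MIN(event_dt)                  AS earliest_event,  -- date range
       MAX(event_dt)                  AS latest_event
FROM   stg_maintenance_events;        -- hypothetical staging table
```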
Confidential
- EVEREST Cargo MDR is a data warehouse for the Booking, Revenue, and Billing systems of American Airlines Cargo. MDR uses replicated servers to extract data on a real-time basis. Data files provided by the source systems in the required format facilitate the periodic loads for MDR. Oracle Warehouse Builder (OWB), an ETL tool, together with PL/SQL, Oracle 10g, and shell scripts, facilitates both the real-time and periodic loads.
- The real-time feeds supply the data needed by an external application called JDA Forecasting, which produces a 14-day forecast on a daily basis.
Senior Data Warehouse ETL Developer/Data Cleansing Specialist
Responsibilities:
- Organized multiple requirements-discussion meetings with business users and source-system experts, and was involved in functional and technical spec design for the ETL part.
- End-to-end development of the ODS and OLTP (Online Transaction Processing) systems for Cargo.
- Migrated the ETL portion of the project from PL/SQL packages to Oracle Warehouse Builder (OWB).
- Worked on OWB for the ODS rebuild of the old system, which ran on Oracle 10g and UNIX shell scripts.
- Involved in the real-time implementation from the requirements analysis phase.
- Designed and developed the ETL (Extraction, Transformation, Loading) using Informatica and PL/SQL packages.
- Designed various ETL jobs for pulling data from various legacy systems and loading it into the integrated database.
- Performed integration testing and history data migration.
- Implemented range partitioning on major transactional tables with a quarterly limit on any single partition (see the DDL sketch after this list).
- Performed data profiling to provide high-quality data to business users and customers.
- Performed user acceptance/business process testing and suggested enhancements to improve end-user functionality.
- Accountable for developing the core code of the ongoing interfaces in line with the PL/SQL coding standards.
- Educated team members on the various features of Quality Center through demos, and migrated data from all previous releases into all modules: Requirements, Test Plan, Test Lab, and Defects.
- Migrated the whole testing lifecycle, including defect tracking, to Mercury Quality Center.
- Developed various reusable mapplets for common code lookups.
- Analyzed existing packages and shell scripts and converted them into Informatica with enhancements requested by business users.
- Used various transformations such as Expressions, Lookups, and Mapplets in the ETL mappings.
- Wrote manual SQL procedures for some of the ETL and data cleansing tasks in the ETL load to the ODS.
- Developed the corresponding ETL jobs and batches and automated the real-time data load.
- Developed SQL procedures for data profiling while extracting data from source systems.
- Created control files and data files to load flat-file data into Oracle 10g tables using SQL*Loader.
- Used EXPLAIN PLAN, the TKPROF utility, SQL Trace, and various hints to tune complex queries in reports, procedures, functions, and packages at both the server and client level.
- Conducted quality reviews of code developed by other development staff.
- Platform: Oracle Warehouse Builder (OWB), Oracle 10g, Erwin 7.1, Mercury Quality Center, and UNIX Shell Programming
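A minimal DDL sketch of the quarterly range partitioning described above; the cargo_bookings table, its columns, and the specific quarters are hypothetical, for illustration only.

```sql
-- One partition per quarter keeps any single partition bounded
-- and lets old quarters be archived or dropped independently.
CREATE TABLE cargo_bookings (
  booking_id  NUMBER       NOT NULL,
  booking_dt  DATE         NOT NULL,
  revenue_amt NUMBER(12,2)
)
PARTITION BY RANGE (booking_dt) (
  PARTITION q1_2009 VALUES LESS THAN (TO_DATE('2009-04-01','YYYY-MM-DD')),
  PARTITION q2_2009 VALUES LESS THAN (TO_DATE('2009-07-01','YYYY-MM-DD')),
  PARTITION q3_2009 VALUES LESS THAN (TO_DATE('2009-10-01','YYYY-MM-DD')),
  PARTITION q4_2009 VALUES LESS THAN (TO_DATE('2010-01-01','YYYY-MM-DD'))
);
```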
Confidential
- Model N Implementation: Model N is a revenue management suite that links people, processes, and information across departmental and corporate boundaries. This cross-functional capability enables Model N to deliver a breakthrough improvement in the revenue management process. Model N takes a unique approach to technology, offering a platform with shared tools and services that allow each application to function standalone or as part of an integrated suite when multiple applications are deployed.
- The Operational Data Store is maintained in Oracle 10g. Various applications consume the data output by Model N, and data is also fed into Model N. The data transfer and transformation are carried out by the ETL tool Informatica, and Oracle PL/SQL procedures perform the cleansing activities. The periodic loads are scheduled through ETL jobs and workflows.
Senior Data Warehouse ETL Developer/PL/SQL Developer/Data Conversion Specialist
Responsibilities:
- Played a crucial role in the end-to-end development of the implementation.
- Acted as a Senior Developer, organizing, architecting, developing, and assigning tasks to both the development and testing teams.
- End-to-end development of the ODS and OLTP (Online Transaction Processing) systems for Model N.
- Used Oracle 10g and Informatica for historical data migration and ODS real-time data.
- Involved in gathering functional and technical requirements from the business and documenting the details in the Design Specification Document.
- Used MS Visio to develop Physical Data Models and Business Models.
- Identified, analyzed, and translated business needs into system applications and improved business processes.
- Involved in data analysis, data profiling, and data conversion from various legacy systems such as SAP, Gentran, and I-Many.
- Involved in the three stages of development, Extraction, Transformation, and Loading, using complex stored procedures, packages, and functions.
- Used various extraction techniques to pull data from legacy systems, handling file formats such as XML, CSV, and fixed-length flat files.
- Developed complex XML DOM parsers to load data from XML files into relational tables and vice versa (see the PL/SQL sketch after the platform line).
- Developed various reusable mapplets for common code lookups.
- Incorporated various transformations in the ETL mappings: Lookups, Expressions, Routers, Sequence Generators, mapplets, dynamic lookups, and reusable transformations.
- Developed various stored procedures, packages, and views for ETL needs.
- Wrote manual SQL procedures for some of the ETL and data cleansing tasks in the ETL load to the ODS.
- Used various custom return types and functions to meet the application requirements.
- Developed database objects such as views and materialized views for the backend and as a mid-tier data access layer between Java and Oracle 10g.
- Conducted quality reviews of code developed by other development staff.
- Developed the corresponding ETL jobs and batches and automated the real-time data load.
- Extensively used built-in packages such as UTL_FILE and the DBMS packages to deal with the different data formats (flat file structures, CSV, DAT) according to the SAP file formats.
- Accountable for the historical data conversion of Direct Sales and Indirect Sales from the I-Many system to Model N.
- Used various lookups to pull data from Model N via database links referring to the tables/columns/views in the database schema.
- Involved in unit testing the interfaces and documenting test cases for various scenarios to check the functionality's compatibility with Model N.
- Involved in System Integration Testing (SIT), which interacts with the legacy systems to pull data and load it into Model N, ensuring that business needs are met.
- Involved in User Acceptance Testing (UAT), in which sample data is used to test the actual data flowing in and out of Model N.
Platform: Informatica 8.1, Oracle 10g, MS Visio, PL/SQL Developer, SAP
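A minimal PL/SQL sketch of loading XML into a relational table with Oracle's DBMS_XMLDOM package, in the spirit of the DOM parsers mentioned above. The stg_orders staging table, the inline XML payload, and its element names are hypothetical, for illustration only.

```sql
DECLARE
  doc   dbms_xmldom.DOMDocument;
  nodes dbms_xmldom.DOMNodeList;
  node  dbms_xmldom.DOMNode;
BEGIN
  -- Parse an XML payload (normally read from a file or queue) into a DOM tree.
  doc := dbms_xmldom.newDOMDocument(
           XMLType('<orders><order>100.50</order><order>200.00</order></orders>'));

  -- Walk every <order> element and insert its text value as a row.
  nodes := dbms_xmldom.getElementsByTagName(doc, 'order');
  FOR i IN 0 .. dbms_xmldom.getLength(nodes) - 1 LOOP
    node := dbms_xmldom.item(nodes, i);
    INSERT INTO stg_orders (order_amt)  -- stg_orders is a hypothetical staging table
    VALUES (TO_NUMBER(dbms_xmldom.getNodeValue(dbms_xmldom.getFirstChild(node))));
  END LOOP;

  dbms_xmldom.freeDocument(doc);  -- release the DOM to avoid leaking session memory
  COMMIT;
END;
/
```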
Confidential
Taco Bell has a chain of restaurants across the United States. The ODS stores their integrated real-time data on a centralized server, and reports generated from the ODS and data marts support management decision-making. The operational data store is intended to hold information at the lowest granular level. Given the history to be maintained, the ODS houses 13 months of data. The ODS is the golden source of data: a normalized structure carrying all the data tags requested by the business. Real-time reports are also served from the ODS.
ETL Developer/Tester/PL/SQL Developer
Responsibilities:
- Responsible for creating and maintaining the stored procedures, and for creating database objects such as tables, indexes, views, and sequences.
- Developed control files and data files with SQL*Loader and successfully loaded the data from the flat files into the main tables.
- Implemented an archival strategy to retain up to 13 months of data.
- Developed complex ETL mappings, including slowly changing dimensions, fact tables, and aggregate tables with dynamic lookups, using stages such as Expressions, Routers, Update Strategies, Sequence Generators, and mapping parameters/variables global to the mapping.
- Played a vital role in organizing team meetings and assigning tasks to developers as an ETL team developer.
- Played a vital role in Data analysis of the Legacy data structures and ODS data structures.
- Worked from the requirements phase onward and played a major role in gathering documents from the business users per the ETL needs.
- Created several complex PL/SQL packages, procedures, triggers, and functions to implement business logic at the front end and to validate data imported from the legacy system.
- Improved performance of database queries and ETL mappings using optimization techniques such as query tuning, avoiding unnecessary/repetitive database calls, persistent cache, and modifying buffer size at the session level.
- Responsible for backup and recovery of all mappings, sessions, and workflows on the repository server.
- Worked on PL/SQL code optimization techniques and error handling mechanisms.
- Worked on SQL tuning using EXPLAIN PLAN and TKPROF (see the sketch after this list).
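A minimal sketch of the EXPLAIN PLAN workflow referenced above; the ods_sales table, its columns, and the index named in the hint are hypothetical, for illustration only.

```sql
-- Capture the optimizer's plan for a candidate query,
-- here forcing a specific index with a hint for comparison.
EXPLAIN PLAN FOR
SELECT /*+ INDEX(o ods_sales_dt_ix) */
       store_id, SUM(net_amt)
FROM   ods_sales o
WHERE  sale_dt >= ADD_MONTHS(TRUNC(SYSDATE, 'MM'), -13)  -- 13-month window
GROUP  BY store_id;

-- Read the captured plan back from PLAN_TABLE.
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```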