ETL Resume
Monterey Park, CA
Professional Summary
Sr. ETL Professional with 6+ years of software development experience and strong hands-on expertise in the Design, Development, and Deployment of Data Warehousing applications based on the Kimball Data Warehousing Methodology.
- Deep understanding of the concepts and components used in implementing Enterprise Data Warehouses and Data Marts; experienced in providing ETL solutions in an OBIEE environment.
- Well versed in developing ETL solutions with Informatica PowerCenter 7.x/8.x.
- Extensively worked with Informatica PowerCenter transformations such as Source Qualifier, Lookup, Filter, Expression, Sequence Generator, Router, Update Strategy, Joiner, Rank, Aggregator, Sorter, Union, Normalizer, and Java.
- Proficient in using PowerCenter Transformation Language built-in functions to design data conversion routines for a wide variety of sources such as Flat Files & RDBMS.
- Skilled in developing complex, optimized SQL queries for extracting data from multiple relational data sources.
- Applied mapping tuning techniques such as Pipeline Partitioning to speed up data processing; conducted Session Thread Analysis to identify and fix performance bottlenecks; used Bulk Loading to reduce data write I/O cycles.
- Good understanding of Dimensional Modeling concepts such as Star Schema and Snowflake Schema, applied in the development of Data Marts & Enterprise Data Warehouses.
- Proficient in translating business processes and requirements into technical requirements.
- A Team Player with excellent analytical skills and strong verbal and written communication.
Technical Skills
ETL
Informatica PowerCenter 7.x/8.6.1, Toad, PL/SQL Developer, SQL*Loader
Data Sources
Oracle 9i, Oracle 10g, COBOL Flat Files, CSV Files
Languages
SQL, PL/SQL, Unix Shell Scripting, C
Operating Systems
Linux, HP-UX, Windows 2003 Server
CASE Tools
Mercury Quality Center 10, Rational ClearCase
Scheduling
DAC, Autosys, Control-M
Education
Specialization: Computer Science & Technology
Professional Experiences
Confidential, Monterey Park, CA Jan 2012 - Present
Responsibilities
- Interact with Data Analysts to review the business rules of the IDQ (Integrated Data Quality) specification document, specifically the tables, columns, schema, and data availability.
- Code and implement rule sets in the Informatica PowerCenter tool; capture data quality metrics in the reporting database/schema for the valid and error records of each rule.
- Develop new ETL mappings & mapplets or modify existing mappings to code the rules, and analyze the results against the data quality percentage set for each element by the Data Stewards.
- Develop Mappings and Mapplets that include transformations such as Source Qualifier, Expression, Lookup, Filter, Joiner, Union, and Aggregator, along with reusable transformations.
- Consolidate existing mappings to reduce the number of mappings in each workstream and ease code maintenance.
- Code complex SQL queries as part of the ETL transformation logic to implement the rules.
- Code Unix shell scripts for parameterization and to split, sort, concatenate, merge & SFTP data files.
- Unit Testing, Change Management (Migration), and Production Support.
Environment: Informatica 8.6.1, Oracle 10g, Unix, Business Objects, Toad for Oracle 10, Autosys, WinSCP, PuTTY
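A minimal sketch of the kind of split/sort/merge shell scripting described in this role; the directory, file names, and remote host are hypothetical placeholders, not taken from any actual project:

```shell
#!/bin/sh
# Sketch: split a large extract into chunks, sort the chunks,
# and merge the pre-sorted chunks back before hand-off.
# DATA_DIR and all file names are hypothetical.
DATA_DIR=/tmp/etl_file_ops_demo
mkdir -p "$DATA_DIR"
cd "$DATA_DIR" || exit 1

# Stand-in for a daily extract file
printf 'banana\napple\ncherry\ndate\n' > extract.dat

# Split the extract into fixed-size chunks (2 lines each here)
split -l 2 extract.dat chunk_

# Sort each chunk in place, then merge the pre-sorted chunks
for f in chunk_*; do sort "$f" -o "$f"; done
sort -m chunk_* > merged_sorted.dat

cat merged_sorted.dat   # apple, banana, cherry, date (one per line)

# An SFTP hand-off would typically follow, e.g.:
#   sftp batchuser@remotehost <<SFTP_EOF
#   put merged_sorted.dat /inbound/
#   SFTP_EOF
```

In a real batch cycle the chunk size and file paths would come from a parameter file rather than being hard-coded.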
Confidential, Newport Beach, CA Sep 2011 - Dec 2011
Responsibilities
- Participate in business process analysis and technical design sessions with business and technical staff to develop the requirements document and ETL source-to-target specifications.
- Develop source-to-target mappings for the ODS, ODS_CONS (Staging) & DW_CON (Namasco DWH) layers; build mappings, mapplets & reusable transformations for historical and incremental loads using transformations such as Source Qualifier, Expression, Lookup, Filter, Update Strategy, Joiner, Router, Java, Union, and Stored Procedure.
- Implement parameterization to support loads based on region and time window.
- Load data into Type-2 slowly changing dimensions for customer and product category.
- Work extensively on Oracle SQL query performance tuning; create DDL scripts and database objects such as Tables, Indexes, and Sequences; work closely with DBAs to create Physical Databases.
- Develop shell scripts for file operations such as split, sort, concatenate, and merge.
- Create ETL Test Case documents for Unit, Integration & User Acceptance Testing.
- Monitor the production cycles for Daily, Weekly & Monthly loads.
Environment: Informatica 8.6.1, Oracle 10g, Linux, DAC, OBIEE, Toad, FileZilla, PuTTY
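One common way to implement the region/time-window parameterization mentioned above is a shell wrapper that generates a PowerCenter parameter file before the workflow runs. This is only an illustrative sketch; the folder, workflow, session, and parameter names are hypothetical:

```shell
#!/bin/sh
# Sketch: generate a PowerCenter parameter file for a region-specific load.
# DW_FOLDER, wkf_load_dw, s_m_load_sales, and the $$ parameters are
# hypothetical names, not from any actual project.
REGION="${1:-WEST}"
LOAD_DATE=$(date '+%Y-%m-%d')

PARAM_FILE=/tmp/wkf_load_dw_${REGION}.par
cat > "$PARAM_FILE" <<EOF
[DW_FOLDER.WF:wkf_load_dw.ST:s_m_load_sales]
\$\$REGION=$REGION
\$\$LOAD_DATE=$LOAD_DATE
EOF

cat "$PARAM_FILE"
```

The scheduler (DAC, Autosys, or Control-M) would invoke this wrapper per region and pass the resulting file to the workflow via its parameter-file setting.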
Confidential, Moline, IL July 2010 - Aug 2011
Responsibilities
- Interact with users for requirements gathering, prototyping & publishing of report layouts.
- Perform impact analysis for change requests and coordinate change management with the deployment teams.
- Perform data analysis on the source system to provide inputs for building the logical data model for facts and dimensions; coordinate with the DBA for physical data model implementation.
- Develop mappings & mapplets for historical load & incremental load using various transformations like expression, aggregator, joiner, source qualifier, router, lookup, and filter.
- Implement parameterization to support loads based on region and time window. Load data into Type-2 slowly changing dimensions for new product categories, dealers, geographies, and customers.
- Design and develop PL/SQL procedures and call them through stored procedure transformations in the mapping.
- Develop Shell Scripts for slicing & dicing operations such as split, sort, concatenate, merge & FTP of data files.
- Create Unit Test Cases & support System, Integration & User Acceptance Testing.
- Monitor the production cycles for Daily, Weekly & Monthly loads.
Environment: Informatica 8.6.1, Oracle 10g, Linux, PL-SQL, Unix Shell Scripting, Business Objects
Confidential, Santa Clara, CA Mar 2009 - June 2010
Responsibilities
- Created source-to-target mappings for the Staging & ODS layers.
- Mapped the dimension sources to dimension hierarchy tables; built self-referencing Lookup transformations to convert serial dimension attributes into the hierarchical structure.
- Implemented ETL mappings loading Type-2 Slowly changing Dimensions.
- Implemented transformation logic for applying various business rules and data standardization rules for source data coming from multiple systems into the data warehouse.
- Documented the Unit Test Cases & Integration Testing Steps and captured the results. Prepared release notes for migrating the objects from DEV to INT to UAT environments.
- Member of the team performing production support, job monitoring, and troubleshooting under a Level-II SLA.
Environment: Informatica 8.1, Oracle 9i, HP-UX, PL/SQL, Unix Shell Scripting, Autosys
Confidential, Hyderabad Apr 2007 - Feb 2009
Responsibilities
- Involved in creating business rules, data cleansing rules & source-to-target mapping documents.
- Developed Mappings that includes transformations like Source Qualifier, Aggregator, Expression, Lookup, Filter, and Joiner.
- Developed Mapplets, Reusable Transformations to populate the Data into Data Warehouse.
- Defined the Target Load Order Plan and Constraint-Based Loading to load data correctly into the different Target Tables.
- Designed mappings with mapping parameters and mapping variables for incremental loading.
- Designed test mappings to identify performance bottlenecks.
- Involved in improving the performance for mappings and sessions.
- Involved in ETL testing.
- Interacted with dependent source system business users and technical team to understand the overall data flow cycle and develop various data audit measures to perform sanity checks on the incoming transactional data through FTP process.
- Participated in identifying the job flow and creating Autosys Job scripts based on dependencies between these jobs.
Environment: Informatica 7.1, Oracle, Autosys, Cognos Impromptu
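The data audit measures described in this role could be sketched as a pre-load sanity-check script on an incoming feed; the feed path, expected column count, and row threshold below are hypothetical illustrations:

```shell
#!/bin/sh
# Sketch: sanity checks on an incoming transactional feed before loading.
# The feed file, EXPECTED_COLS, and MIN_ROWS are hypothetical.
FEED=/tmp/audit_demo/trans_feed.csv
mkdir -p /tmp/audit_demo
# Stand-in for a file delivered via FTP (header + data rows)
printf 'id,amount,date\n1,10.50,2008-01-02\n2,7.25,2008-01-02\n' > "$FEED"

EXPECTED_COLS=3
MIN_ROWS=1

# Check 1: count records that do not have the expected number of columns
bad=$(awk -F',' -v n="$EXPECTED_COLS" 'NF != n' "$FEED" | wc -l)

# Check 2: require at least MIN_ROWS data rows beyond the header
rows=$(($(wc -l < "$FEED") - 1))

if [ "$bad" -eq 0 ] && [ "$rows" -ge "$MIN_ROWS" ]; then
  echo "AUDIT OK: $rows rows"
else
  echo "AUDIT FAILED: $bad malformed records, $rows rows"
fi
```

A scheduler job would run such a check after the FTP arrival and gate the downstream load on its exit status.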
Confidential, Hyderabad Feb 2006 - Mar 2007
Responsibilities
- Involved in all phases of the SDLC, from requirements, design, development, and testing through training, rollout to field users, and production support.
- Extensively interacted with users; involved in requirements gathering and prototyping, and prepared documents such as the Interface Requirement Document, Customer Requirement Document, Integration Test Plan, Unit Test Plan, and Release Notes.
- Analyzed data sources to identify the common attributes between multiple tables and finalize the single source of data for Master Data Attributes.
- Designed and developed historical load & incremental load mapping.
- Implemented transformation logic for applying various business rules and data standardization rules for source data coming from multiple systems into the data warehouse.
- Extensively worked on Oracle SQL query performance tuning; created DDL scripts and database objects such as Tables, Indexes, and Sequences; worked closely with DBAs to create Physical Databases.
- Involved in creating the schema from the data model on the target server using forward and reverse engineering; generated the script files using ERwin 4.0.
- Improved ETL process performance through Query Optimization, Pipeline Partitioning, and memory/CPU management; interfaced with DBAs to review volumetric requirements.
Environment: Informatica 7.1, Oracle, Linux, PL-SQL, Unix Shell Scripting, Control M