ETL Resume
TX
SUMMARY
- 7+ years of experience in Business Intelligence, with a keen ability to ascertain business needs and translate business goals into Data Warehouse and Dimensional Modeling designs.
- Informatica Designer Certified for versions 7.x and 8.x.
- Knowledge of other ETL tools such as SQL Server 2000 DTS, SQL Server Integration Services (SSIS 2005) and Ascential DataStage.
- Knowledge of Data Warehousing and Business Intelligence concepts, design principles and software architecture.
- Interacted with business counterparts to identify information needs and business requirements for reports.
- Mentored newly hired contractors on business knowledge and data flow concepts.
- Knowledge of designing dimensional models for Landing, Cleansing and Staging areas.
- Knowledge of Data Warehouse methodologies, including Bill Inmon (top-down), Ralph Kimball (bottom-up) and hybrid approaches.
- Extensive Knowledge in designing functional and detailed design documents for data warehouse development.
- Experience in writing UNIX Shell scripts.
- Experience in writing Stored Procedures and Functions (PL/SQL, T-SQL).
- Working experience in designing and programming for relational databases, including Oracle, SQL Server, Teradata, DB2 and Access.
- Knowledge of Informatica PowerConnect and PowerExchange, used to import sources from external systems such as Mainframe (IMS, DB2 and VSAM) or ERP.
- Knowledge on Informatica Data Explorer (IDE) and IDQ Workbench for Data Profiling.
- Experience defining XML source and target schemas and object definitions.
- Extensive knowledge in handling Slowly Changing Dimensions (SCD) Type2 / Type1.
- Knowledge of Change Data Capture (CDC) using Oracle 9i and Informatica Power Exchange.
- Skilled in developing Test Plans, Creating and Executing Test Cases.
- Effectively followed industry standards such as HIPAA and ANSI 837, along with PHI handling concepts.
- Involved with every phase of the SDLC, including feasibility studies, design, and coding for large and medium business intelligence projects and continually provided value-added services to the clients.
EDUCATIONAL BACKGROUND: MBA, Finance
TECHNICAL PROFICIENCIES
Informatica PowerCenter 5.x/6.x/7.x/8.1.1/8.5/8.6.1, Informatica PowerExchange, IDE, IDQ, SQL Server DTS, SSIS 2005, Oracle 8i/9i/10g, SQL Server 2000/2005, Teradata V2R4/R6, DB2, MS Access, TOAD 7.x/8.x, SQL Query Analyzer 8.0, Oracle SQL Developer 1.1.2, Teradata SQL Assistant 7.1.0, SQL*Plus, Windows 95/98/00/NT/XP, HP-UX, SunOS, Business Objects XI, MVS, Rational Rose, Eclipse, ERwin, AutoSys, Tivoli, Crontab, Microsoft Office (Access, Excel, Project, Visio, Word), XMLSpy, UltraEdit, MLoad, TPump, FastLoad, FastExport, BTEQ, PVCS, Perl, Shell, C/C++, Java, JavaScript, HTML, XML, VB, COBOL, SQL, PL/SQL, T-SQL.
CAREER PROGRESSION
Confidential, Michigan, MI Jul 2008 - present
Sr. Application Developer, Data Feeds, Business Intelligence Center (BIC)
- Studied the existing environment and gathered requirements by querying the clients on various aspects.
- Identified various Data Sources and Development Environment.
- Performed data modeling and design for the data warehouse and data marts using star schema methodology, with dimension and fact tables.
- Prepared user requirement documentation for mapping and additional functionality.
- Extensively used PowerCenter / PowerExchange ETL to load data from source systems such as flat files and Excel files into staging tables, and then into the target Oracle database.
- Analyzed the existing systems and made a Feasibility Study after performing technical and business Gap Analysis.
- Completed the data profiling of various subject areas attributes using IDE.
- Accomplished the proof of concept (POC) of deploying the Workbench Plans in IDQ Workbench environment.
- Executed the IDQ plans after selecting source, target and dictionary files.
- Applied IDQ for data masking of sensitive PHI data, using techniques such as key masking and random masking.
- Prepared technical specification to load data into various tables in DataMarts.
- Applied the concept of slowly changing dimensions Type2 / Type1.
- Designed and Developed pre-session, post-session routines and batch execution routines using Informatica Server to run Informatica sessions.
- Maintained the documents with various versions using PVCS.
- Worked extensively on Mappings, Mapplets, Sessions and Workflows.
- Scheduled and monitored the jobs using Tivoli Systems (TWS).
- Used PMCMD command to start, stop and ping server from UNIX and created Shell scripts to automate the process.
- Involved in PROD support for various SQL Server DTS jobs to load the SQL tables from one server to another.
- Worked with different Source Systems to perform transformations and load to different Target Tables.
- Corrected the returned data from the Health Care Association and performed different levels of certification.
- Completed the BCBS Industrial Certifications.
- Responsible for complying with industry standards such as HIPAA and ANSI 837, and with guidelines for developing and migrating code from one environment to another.
- Led the activities, growth and professional development of three staff members.
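The pmcmd automation described above can be sketched as a small wrapper like the following; the service, domain, folder and workflow names and the PM_USER/PM_PASS environment-variable names are illustrative placeholders (not values from this resume), and DRY_RUN only echoes the command instead of invoking pmcmd:

```shell
#!/bin/sh
# Hypothetical pmcmd wrapper: ping the Integration Service, or start/stop
# a workflow. All names below are placeholders.
run_wf() {
    svc="IS_BIC"; dom="Dom_BIC"; folder="BIC_FEEDS"
    action="$1"; wf="$2"
    case "$action" in
        ping)  set -- pmcmd pingservice -sv "$svc" -d "$dom" ;;
        start) set -- pmcmd startworkflow -sv "$svc" -d "$dom" \
                      -uv PM_USER -pv PM_PASS -f "$folder" "$wf" ;;
        stop)  set -- pmcmd stopworkflow -sv "$svc" -d "$dom" \
                      -uv PM_USER -pv PM_PASS -f "$folder" "$wf" ;;
        *)     echo "usage: run_wf {ping|start|stop} [workflow]" >&2; return 1 ;;
    esac
    if [ "${DRY_RUN:-1}" -eq 1 ]; then
        echo "WOULD RUN: $*"          # dry run: only show the command
    else
        "$@"                          # real run: invoke pmcmd
    fi
}

run_wf start wf_LOAD_DAILY_FEED
```

Such a wrapper is typically called from cron/Tivoli or from other shell scripts, with the -uv/-pv options reading credentials from environment variables rather than embedding them.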
Environment: Informatica PowerCenter 8.6.1/8.1.1/7.1.4 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Power Exchange Navigator 8.6.1/8.1.1, IDQ Workbench, SQL Enterprise Manager 8.0, SQL Query Analyzer 8.0, Oracle 9i/10g, TOAD 8.6.1, SQL Developer 1.1.2, DB2, Tivoli, AIX, Sun Solaris & Windows NT, Shell Scripting.
Confidential, TX Jan 2008 - May 2008
ETL Developer
- Worked on a highly visible, time-constrained project.
- Performed the migration of jobs / mappings from the Ascential DataStage ETL tool to Informatica.
- With limited guidelines and with no documentation available, studied the existing jobs in DataStage and created specifications for respective mappings.
- Developed the corresponding low-, medium- and high-complexity mappings, correcting defects found in the old processes along the way.
- Created Triggers and Stored Procedures using PL/SQL.
- Generated Parameter Files for all the mappings using KSH.
- Developed the nightly batch-run KSH script to update parameter files using the batch_id.
- Developed the nightly integrity run to email a succinct report to the Integrity group.
- Performed the Unit Testing and Integration testing with the help of Integrity queries.
- Validated the results using MicroStrategy reports with both ETL tools.
- Involved in migrating the newly developed Informatica process in QA and PROD environments.
- Created and maintained documentation related to production batch jobs.
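The nightly parameter-file generation described above can be sketched roughly as follows; the folder, workflow and session names and the $$BATCH_ID / $$RUN_DATE parameter names are illustrative assumptions, not the actual project values:

```shell
#!/bin/sh
# Hypothetical sketch of nightly parameter-file generation keyed on batch_id.
# Folder/workflow/session names below are placeholders.
make_param_file() {
    batch_id="$1"
    out_file="$2"
    run_date=$(date +%Y%m%d)
    # Informatica parameter files use [folder.WF:workflow.ST:session] headers
    cat > "$out_file" <<EOF
[GLOBAL]
[FOLDER_EDW.WF:wf_NIGHTLY_LOAD.ST:s_m_LOAD_STG]
\$\$BATCH_ID=$batch_id
\$\$RUN_DATE=$run_date
EOF
}

make_param_file 1042 nightly.par   # writes the .par file the workflow reads
cat nightly.par
```

A nightly driver script would typically look up the current batch_id from a control table, call a function of this shape, and then kick off the workflow pointing at the regenerated .par file.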
Programmer Analyst / ETL Developer, Business Information Application (BIA)
- Performed a detailed study and analysis of the health insurance data, determining warehouse storage requirements and capacities.
- Worked as a mentor in explaining the inbound and outbound processes.
- Responsible for generating XML source and target definitions.
- Responsible for tuning ETL procedures and 3NF structures to optimize load and query performance.
- Wrote UNIX scripts and SQL cards/scripts for implementing business rules.
- Performed code comparison between COBOL development and its corresponding Informatica development.
- Validated output against the production Mainframe files.
- Generated email notifications through scripts run as post-session tasks.
- Developed Database Triggers and Stored Procedures in Teradata.
- Responsible for submission of BTEQ queries through Mainframes.
- Generated minor reports and created universes using Business Objects Xi.
- Converted historical healthcare data into the new streamlined standards.
- Collaborated with Business Analysts to ascertain the issues with the existing data warehouse; modified mappings to conform to business rules.
- Developed complex mappings in Informatica to load the data from various sources into the Data Warehouse, using transformations such as Joiner, Aggregator, Update Strategy, Router, Lookup (Connected & Unconnected), Sequence Generator, Filter, Sorter, Source Qualifier and Stored Procedure.
- Worked on different data sources such as Facets, Medco and LabCorp; tasks included cleansing the data and providing the claims data coming from Facets in a flat file in the desired format.
- Worked heavily on reading data from VSAM files and transferring it into the staging area.
- Tuned performance of Informatica sessions for large data files by increasing block size, data cache size and sequence buffer length.
- Involved in fixing invalid Mappings, testing of Stored Procedures and Functions, and testing of Informatica Sessions, Worklets and Workflows.
- Worked with the DBA on SQL scripts to automate the process of populating various table columns with surrogate keys.
- Conceived unit, system, integration, functional, and performance test plans.
- Involved in preparing detailed ETL design documents
- Responsible for complying with industry standards such as HIPAA and ANSI 837, and with guidelines for developing and migrating code from one environment to another.
- Studied the existing environment and gathered requirements by querying the clients on various aspects.
- Identified various Data Sources and Development Environment.
- Performed data modeling and design for the data warehouse and data marts using star schema methodology, with conformed and granular dimension and fact tables.
- Prepared user requirement documentation for mapping and additional functionality.
- Created PL/SQL packages, Stored Procedures and Triggers for data transformation on the data warehouse.
- Designed and developed Data validation, load processes, test cases, error control routines, audit and log controls using PL/SQL, SQL.
- Used Update strategy and Target load plans to load data into Type2/Type1 Dimensions.
- Created and used reusable Mapplets / Transformations using Informatica PowerCenter.
- Copied flat file target definitions as source definitions using PowerExchange.
- Designed and developed ETL routines using Informatica PowerCenter; within the mappings, made extensive use of Lookups, Aggregator, Rank, Stored Procedures, functions, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.
- Prepared technical specification for the development of Informatica ETL mappings to load data into various tables in DataMarts and defining ETL standards.
- Designed and developed pre-session, post-session routines and batch execution routines using Informatica Server to run Informatica sessions.
- Documented the complete Mappings.
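The post-session email notifications mentioned above usually take the shape of a small script like this; the recipient address, session name and log path are placeholders, and DRY_RUN keeps the sketch from actually mailing:

```shell
#!/bin/sh
# Hypothetical post-session notification script. The address, session name
# and log path are illustrative; a real setup would register this as the
# session's post-session command.
notify() {
    session="$1"; status="$2"; log_file="$3"
    subject="Informatica session $session: $status"
    if [ "${DRY_RUN:-1}" -eq 1 ]; then
        echo "WOULD MAIL: $subject"   # dry run: show the subject only
    else
        # mailx is the classic UNIX mailer used for such notifications
        echo "Session $session finished with status $status. Log: $log_file" |
            mailx -s "$subject" integrity-group@example.com
    fi
}

notify s_m_LOAD_CLAIMS SUCCEEDED /infa/logs/s_m_LOAD_CLAIMS.log
```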
Intern / Temporary Staff, Education & Training Services
- Developed several forms and reports in the process; also converted several standalone PL/SQL procedures/functions into packaged procedures for code reusability, modularity and control.
- Designed tables, indexes and constraints using SQL*Plus and loaded data into the database using SQL*Loader.
- Developed Stored Procedures and functions to accomplish several computations.
- Prepared Low Level Design documents and Unit Test Plans for the ETL jobs.
- Involved in both Designing part and Construction phases of Mappings and Workflows based on user requirements.
- Extracted source data from Oracle and flat files (CSV), loading it into the targets: an Oracle database and fixed-length flat files.
- Coordinated, supervised and implemented technology-related adds, moves and changes.
- Configured and supported computer clients to gain remote, off-site access.
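The SQL*Loader loads mentioned above follow a pattern like the following sketch; the table, column and file names are illustrative assumptions, and the sqlldr invocation is shown only as a comment:

```shell
#!/bin/sh
# Hypothetical sketch: generate a SQL*Loader control file for a CSV feed.
# Table and column names are placeholders, not the actual schema.
make_ctl() {
    ctl_file="$1"
    cat > "$ctl_file" <<'EOF'
LOAD DATA
INFILE 'students.csv'
APPEND INTO TABLE TRAINING.STUDENT
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(STUDENT_ID, FIRST_NAME, LAST_NAME, ENROLL_DATE DATE "YYYY-MM-DD")
EOF
}

make_ctl student.ctl
# The load itself would then be run as, for example:
#   sqlldr userid=scott/tiger control=student.ctl log=student.log
cat student.ctl
```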