
Sr. ETL Developer Resume


Denver, CO

SUMMARY

  • 6+ years of experience in data warehouse (Ralph Kimball/Bill Inmon methodologies) and Business Intelligence (BI) projects involving Informatica and the Cognos 8 suite of products.
  • Extensive experience in working with data warehouses and data marts using Informatica PowerCenter 6.1, 7.1, 8.1, 8.6, and 9.1 (Designer, Repository Manager, Workflow Manager, and Workflow Monitor).
  • Experience with Teradata as the target for data marts, working with BTEQ, FastLoad, and MultiLoad; 2+ years of experience with Change Data Capture (CDC).
  • 3+ years of data modeling experience using ERwin 3.x.
  • Performance tuning of Oracle using TKPROF, SQL trace, execution plans, SQL hints, Oracle partitioning, various index and join types, and PL/SQL tuning.
  • Extensive knowledge with Dimensional Data modeling, Star Schema/Snowflake schema, Fact and Dimension tables.
  • Experience in creating folders, assigning securities to users and creating repositories in Informatica Repository Manager.
  • Experience in setting up groups/users/permissions for Informatica users in Repository Manager.
  • Experience in creating various transformations using Aggregator, Lookup, Update Strategy, Joiner, Filter, Sequence Generator, Normalizer, Sorter, Router, XML, and Stored Procedure in Informatica PowerCenter Designer.
  • Experience in creating workflows, sessions in Workflow Manager and running the workflows in Workflow Monitor and analyzing them.
  • Designed and developed several ETL scripts.
  • Experience with slowly changing dimension and slowly growing target methodologies.
  • Experience with Type 1, Type 2, Type 3 Dimensions.
  • Architected mappings and business processes related to data mapping, data migrations & testing.
  • Experience with writing and executing test cases for data transformations in Informatica.
  • Experience in resolving various configuration issues in Informatica.
  • Experience working with SDLC, SCRUM, RUP, Waterfall and Agile methodologies.
  • Experience with OLAP, ROLAP and other BI tools.
  • Proficient in network installations and troubleshooting.
  • Reliable, responsible, hardworking, and a good team player.

TECHNICAL SKILLS

Hardware: IBM PC compatibles and NT servers

OS: Windows, UNIX, Linux

OLAP Tools: Cognos 8 (Report Studio, Framework Manager, Analysis Studio), Transformer 8.3, Impromptu Administrator, PowerPlay Transformer, PowerPlay, ReportNet, Oracle

ETL Tools: Informatica 6.0, 7.1, 8.1, 8.6

RDBMS: Oracle 7.x/8.x/9.x/10g, SQL Server 2003/2005/2008, UDB/DB2, Teradata

Database Objects: Stored Procedures, Database Triggers, Packages

Languages: C, C++, Java, COBOL, VB.NET, C#.NET, ASP.NET 3.5, Assembly language

Tools: Make, RCS, Tivoli, AutoSys, ClearCase, Crontab, TOAD, BTEQ, Visual Studio .NET (2005 and 2008)

Miscellaneous: TCP/IP sockets, Sun RPC, XML

Version Control: Visual SourceSafe 6.0, CVS, Documentum SmartSpace

EDUCATION

Bachelor of Computer Engineering

PROFESSIONAL EXPERIENCE

Client: Confidential, Denver, CO    Feb ’11 – Present
Role: Sr. ETL Developer
Project Description:
Proclaim Data Integration (PDI)
Confidential is being built as part of expanding the scope of the Information Factory. The project acquires and integrates Member, Eligibility, Claims, Benefits, and Accumulations data from Proclaim into the Information Factory. The effort included staging, integrating, and positioning the data to make it available to downstream data consumers.

  • Created data maps in PowerExchange Navigator for legacy IMS parent sources to replicate them as relational sources.
  • Staged Data from legacy Proclaim IMS system into Oracle 11g Master Tables.
  • Performed CDC capture registrations.
  • Assisted in building the ETL source to Target specification documents by understanding the business requirements.
  • Prepared low-level design documents covering all application and technical details.
  • Developed mappings that perform Extraction, Transformation and load of source data into Derived Masters schema using various power center transformations like Source Qualifier, Aggregator, Filter, Router, Sequence Generator, look up, Rank, Joiner, Expression, Stored Procedure, SQL, Normalizer and update strategy to meet business logic in the mappings.
  • Built reusable transformations and mapplets wherever the same logic was needed in multiple places.
  • Created Teradata scripts for load operations such as FastLoad, MultiLoad, and TPump, run in conjunction with Informatica.
  • Developed Teradata loading and unloading utilities such as FastLoad, MultiLoad, and FastExport, and developed BTEQ scripts to load data from the Teradata staging area to the data warehouse.
  • Performed performance tuning at both the mapping level and the database level to increase data throughput.
  • Designed the Process Control Table that would maintain the status of all the CDC jobs and thereby drive the load of Derived Master Tables.
  • Created post-session UNIX scripts to perform operations such as gunzip, file removal, and touching marker files.
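The post-session cleanup step described above can be sketched as a small shell script. This is a minimal illustration only: the staging directory, file names, and marker-file convention are assumptions, not taken from the actual environment.

```shell
#!/bin/sh
# Hypothetical post-session script modeled on the "gunzip, remove and
# touch" bullet above. All paths are illustrative.
STAGE_DIR=${STAGE_DIR:-/tmp/pdi_stage_demo}
mkdir -p "$STAGE_DIR"

# Simulate a compressed extract landing in the staging directory.
echo "sample claim rows" > "$STAGE_DIR/claims_extract.dat"
gzip -f "$STAGE_DIR/claims_extract.dat"

# 1. Uncompress the landed file so the next load can read it.
gunzip -f "$STAGE_DIR/claims_extract.dat.gz"

# 2. Remove any leftover work files from prior runs.
rm -f "$STAGE_DIR"/*.tmp

# 3. Touch a marker file so the scheduler knows this step completed.
touch "$STAGE_DIR/claims_extract.done"
```

Such scripts are typically wired in as post-session commands on the Informatica session, so the cleanup runs automatically after each load.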

Environment: Informatica PowerCenter 9.1.0, PowerExchange 9.1.0, Oracle 11g, TOAD, RHEL 5.0, DB2 (AS/400, z/OS), Quality Center, Harvest (version control software).

Confidential, Downers Grove, IL    Jan ’10 – Jan ’11
Position: Sr. Informatica Developer
Description:
Confidential is a health insurance company which operates health plans, insurance companies, network rental, and workers’ compensation services companies. Coventry provides a full range of risk- and fee-based managed care products and services to a broad cross section of individuals, employer and government-funded groups, government agencies, and other insurance carriers and administrators.
This project was aimed at building a data warehousing system for customers holding different health insurance policies. The objective was to implement the ETL process using Informatica, with flat files and Oracle sources feeding three layers: ODS, data warehouse, and data mart. This application provides business intelligence analysis to decision-makers using Business Objects as an interactive OLAP tool with its Web Intelligence module.
Environment: Oracle 10g, SQL Developer, Erwin 4.1, UNIX, flat files, TIBCO sources, TOAD, PL/SQL, MS SQL Server 2005, SQL*Plus, SQL*Loader, Informatica Data Explorer, PowerExchange, Crystal Reports, AutoSys, Cognos 8.4.
Responsibilities:

  • Analyzed the Functional Specs provided by the Architect and created Technical Specs documents for all the mappings.
  • Worked as part of the development team, along with the Systems Analysts/Business Analysts.
  • Designed and developed mappings and mapplets using Informatica PowerCenter Designer to populate staging tables and the ODS.
  • Analyzed the source data coming from TIBCO, Oracle, flat files, and MS Excel, and coordinated with the data warehouse team in developing the dimensional model.
  • Created FTP and database (ODBC) connections for the sources and targets.
  • Developed complex mappings and SCD Type I, Type II, and Type III mappings in Informatica to load the data from various sources using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Router, XML, and SQL. Created complex reusable mapplets.
  • Created medium to complex PL/SQL stored procedures for integration with Informatica using Oracle 10g.
  • Used Debugger wizard to remove bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations and target loads.
  • Used Workflow Manager to manage workflows, sessions, database connections, and memory properties, and to perform pre- and post-session tasks.
  • Used ANALYZE, DBMS_STATS, EXPLAIN PLAN, SQL trace, and TKPROF to tune SQL queries.
  • Performance tuned workflows by identifying bottlenecks in sources, targets, mappings, and sessions, and identified read and write errors using workflow and session logs.
  • Used parameter files to initialize workflow variables, mapping parameters, and mapping variables, and used system variables in mappings to filter records.
  • Used Visual SourceSafe for version control of Informatica objects, Oracle views and packages, and test scripts.
  • Developed and modified UNIX shell scripts to meet requirements after system modifications, and was also involved in monitoring and maintenance of the batch jobs.
  • Used Cognos for generation of reports.
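A parameter file of the kind described above might be generated by a nightly wrapper script like the following sketch. The folder, workflow, session, and parameter names are illustrative assumptions, not objects from the actual repository.

```shell
#!/bin/sh
# Generate a sample Informatica parameter file before a nightly run.
# Section header format is [Folder.WF:workflow.ST:session]; $$-prefixed
# names are mapping parameters/variables. All names here are hypothetical.
PARAM_FILE=/tmp/wf_load_ods_demo.par
cat > "$PARAM_FILE" <<'EOF'
[FolderODS.WF:wf_load_ods.ST:s_m_load_claims]
$$LOAD_DATE=2010-06-15
$$SOURCE_SYSTEM=TIBCO
$DBConnection_Target=ODS_ORA
EOF
cat "$PARAM_FILE"
```

The session is then pointed at this file (or it is passed via pmcmd's `-paramfile` option), so run-specific values never have to be hard-coded in the mapping.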

Confidential, MA    Apr 2009 – Dec 2009
Position: Informatica Consultant

Involved in the OLTP-to-OLAP transformation of data sources. Created tables to support business reporting, and implemented dashboards and related functionality using Siebel Analytics, allowing users to view reports and analyze the effectiveness of customized sales support.

Environment: Informatica PowerCenter 8.3, OBIEE 10.1.3, DAC, Oracle 9i/10g, TOAD, MS SQL Server 2005, flat files, UNIX, Windows XP, Microsoft Word, Microsoft Excel, PowerPoint.

Responsibilities

  • Extensively involved in interacting with Project coordinators, Business Analysts and End-users to gather requirements.
  • Prepared the ETL spreadsheet after identifying the Facts and Dimensions.
  • Designed the Star Schema after identifying the Facts and Dimensions.
  • Prepared the fact-dimension matrix spreadsheet prior to developing the rpd for the required reports.
  • Used Informatica PowerCenter Designer to develop the required mappings based on the ETL spreadsheet.
  • Used Informatica PowerCenter Workflow Manager to develop the required sessions for the mappings built from the ETL spreadsheet.
  • Extensively used transformations such as Source Qualifier, Expression, Filter, Lookup, Update Strategy, Stored Procedure, Sequence Generator, Joiner, Router, and Aggregator, as well as SQL overrides.
  • Used the Workflow Manager to create tasks, sessions, worklets and batches.
  • Debugged the failed mappings.
  • Implemented the best practices for creating the mappings, sessions and workflows.
  • Identified bottlenecks in the ETL process and corrected the mappings and sessions to improve performance: the full ETL run previously took 18 hours, which I tuned down to 12 hours.
  • Performed performance tuning of Oracle 10g using the SQL optimizer and created several indexes to improve query performance.
  • Unit tested the mappings before including the developed sessions into the already existing batch.
  • Integration tested the entire batch after including the individual sessions into the existing batch.
  • Involved in configuring new jobs and importing the newly developed Informatica sessions into DAC.
  • Designed and developed the OBIEE/ Metadata Repository (.rpd) using Siebel Analytics Server Admin tool by importing the required objects.
  • Worked on Siebel Answers to build OBIEE Interactive Dashboards with drill-down capabilities.
  • Debugged report and OBIEE dashboard visibility with respect to users’ responsibilities and web groups in an integrated environment.

Confidential, Raleigh, NC    Oct ’08 – Apr ’09
Position: ETL/Informatica Developer

Confidential follows a bottom-up, company-by-company approach to investing. The financial data warehouse represents the balance and transaction data extracted and integrated from various source systems. The objective of the financial data warehouse is to provide a financial reporting infrastructure that centralizes access to information, data, applications, and reporting and analytical systems for the required financial content dealing with different products and services.
Environment: Informatica PowerCenter 8.6, Erwin 7.2, Oracle 10g, SQL Server, Oracle Forms 6i, DB2 UDB, PL/SQL, UNIX, TOAD, AutoSys, Microsoft VSS.
Responsibilities:

  • Analyzed business process workflows and developed ETL procedures to move data from various source systems to target systems.
  • Developed database schemas such as star schema and snowflake schema used in relational, dimensional, and multidimensional data modeling using ERwin.
  • Involved in design using ERwin Data Modeler and development of Informatica mappings using transformations such as Source Qualifier, Aggregator, connected and unconnected Lookups, Filter, Update Strategy, Rank, Stored Procedure, Expression, and Sequence Generator, along with reusable transformations.
  • Created PL/SQL procedures to transform data from staging to data warehouse fact and summary tables.
  • Extensively used stored procedures, functions, and packages written in PL/SQL for creating Stored Procedure transformations.
  • Developed, modified, and tested UNIX shell scripts and the necessary test plans to ensure the successful execution of the data loading process.
  • Created various UNIX shell scripts for automation of events, file validation, and file archiving.
  • Developed scripts using the pmcmd command to run workflows and sessions from the UNIX environment, called from AutoSys jobs.
  • Developed a batch file to automate the task of executing the different workflows and sessions associated with the mappings on the development server.
  • Involved in testing, debugging, data validation, and performance tuning of the ETL process, and helped develop optimum solutions for data warehouse deliverables.
  • Performance tuned Informatica mappings for large data files by managing block sizes, data cache sizes, sequence buffer lengths, and commit intervals.
  • Created unit test plans, test cases, and reports on various test cases for testing the Informatica mappings.
  • Worked on integration testing and regression testing to verify load order, time windows, and lookups with full loads.
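An AutoSys-invoked pmcmd wrapper of the kind mentioned in the responsibilities above might look like this sketch. The domain, integration service, folder, workflow, and user names are illustrative assumptions; the script only runs pmcmd when it is actually on the PATH.

```shell
#!/bin/sh
# Hedged sketch of a wrapper that starts an Informatica workflow with
# pmcmd. All object names below are hypothetical.
INT_SERVICE=IS_FDW
DOMAIN=Domain_FDW
FOLDER=FDW_LOADS
WORKFLOW=wf_load_fact_balances

# -pv reads the password from the INFA_PASSWD environment variable, so the
# secret never appears on the command line or in the process list.
CMD="pmcmd startworkflow -sv $INT_SERVICE -d $DOMAIN -u etl_batch \
-pv INFA_PASSWD -f $FOLDER -wait $WORKFLOW"

# Record the exact command for the scheduler's audit trail.
echo "$CMD" > /tmp/pmcmd_demo.cmd

if command -v pmcmd >/dev/null 2>&1; then
    eval "$CMD"            # on an Informatica host: start and wait
else
    echo "pmcmd not found; would run: $CMD"
fi
```

With `-wait`, pmcmd's exit code reflects the workflow result, which is what lets AutoSys mark the job success or failure.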

Confidential, Chicago, IL    May ’07 – Aug ’08
Position: Informatica Developer
Confidential is a full-service insurance, financial, and auto registration agency that has been in business since 1988. This is a Claims Data Warehouse (CDW) providing policy transactions by credit card, and premium and claims transaction data, to the business users. Users want to measure profit over time by coverage, coverage item type, and geographic, demographic, and sales distribution channels. The system is fully integrated, enabling the entry and sharing of policy and claims data among all components of the system and capturing data to complete policy administration, claims administration, policy issuance, billing, reinsurance accounting, bureau reporting, and management reports.
Environment: Informatica PowerCenter 8.6/8.1, Business Objects, Business Process Modeling (BPM), Control-M, SQL Server, Oracle 10g/9i, Mainframe, DB2, Sybase, Teradata, flat files, SSIS, TOAD, PL/SQL, SQL, UNIX shell scripting, Windows XP.
Responsibilities:

  • Met with business stakeholders and other technical team members to gather and analyze application requirements.
  • Designed database solutions to satisfy application (business and technical) requirements.
  • Implemented database solutions using available database development tools such as Informatica PowerCenter.
  • Developed complex Informatica mappings using transformations such as Union, connected and unconnected Lookup, Router, Filter, Aggregator, Expression, and Update Strategy for large volumes of data.
  • Involved in performance tuning of mappings and sessions.
  • Scheduled Informatica workflows using Control-M.
  • Created PL/SQL procedures to transform data from staging to the data warehouse.
  • Developed Business Objects in accordance with the client’s needs and requirements, and carried out Business Objects development and testing.
  • Used the Business Objects universe, which acts as an interface between the database and end users.
  • Performed unit and regression testing of the Informatica mappings created according to the business requirements.
  • Created workflows and sessions using Informatica Workflow Manager, and monitored workflow runs and statistics in Informatica Workflow Monitor.
  • Defined mapping parameters, variables, and session parameters according to the requirements and performance-related issues.
  • Performance tuned Informatica code for better performance.
  • Used the version control provided by Informatica on every object, so that each version of the process is available for recovery or research purposes.
  • Developed UNIX scripts for updating the control table.
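The control-table update script in the last bullet might be sketched as follows. The table and column names (`etl_control`, `load_status`, `run_id`) are assumptions chosen for illustration, not the actual schema.

```shell
#!/bin/sh
# Sketch of a UNIX script that records a load's completion in a control
# table. Schema names are hypothetical.
RUN_ID=${1:-1001}
SQL_FILE=/tmp/update_control_demo.sql

# Build the update statement for this run.
cat > "$SQL_FILE" <<EOF
UPDATE etl_control
   SET load_status = 'COMPLETE',
       end_time    = SYSDATE
 WHERE run_id = $RUN_ID;
COMMIT;
EOF

# On the real server this file would be piped to sqlplus, e.g.:
#   sqlplus -s "$DB_USER/$DB_PASS" @"$SQL_FILE"
echo "prepared control-table update for run $RUN_ID"
```

Keeping the SQL in a generated file makes the step easy to rerun by hand when a Control-M job needs manual recovery.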

Confidential, Dublin, CA    Sep 2006 – May 2007
Position: ETL Developer/Analyst
Franklin Templeton Investments is a global investment management organization with offices in 30 countries around the world, offering investment solutions under the Franklin, Templeton, Mutual Series, Bissett, Fiduciary Trust, and Darby Overseas names in more than 100 countries. The Corporate Financials project team is building a data warehouse for the purpose of measuring fund profitability. Data is pulled from mainframes in the form of flat files and loaded into the ODS using Informatica 5.1. From the ODS, which is maintained in Oracle, the data moves to the ODE and then to the data mart using Ascential DataStage. Reporting was done using Cognos.
Environment: Informatica PowerCenter 7.1.x, 8.1.1 SP4, Oracle 10g, MS SQL Server 2005, PL/SQL, SQL*Plus, SQL*Loader, TOAD, UNIX, Linux, Windows XP.
Responsibilities:

  • Assisted with data modeling using ERwin 4.x.
  • Involved in areas like project planning, risk management and the proper tracking of the developed output.
  • Designed and developed several ETL scripts using Informatica, Unix shell scripts.
  • Extensively used all the features of Informatica Versions 6.x and 7.x including Designer, Workflow manager and Repository Manager, Workflow monitor.
  • Developed and modified Unix shell scripts to reset and run Informatica workflows using pmcmd on Unix Environment. Conversant with the Informatica API calls.
  • Interpreted logical and physical data models for Business users to determine common data definitions and establish referential integrity of the system.
  • Worked with mappings using expressions, aggregators, filters, lookup, update strategy and stored procedures transformations.
  • Partitioned sources to improve session performance.
  • Created users in Informatica and assigned them various permissions.
  • Performed backup and restoration of repositories.
  • Upgraded repositories when moving to new versions of Informatica.
  • Created various mappings using Update Strategy, Lookup, Sequence Generator, Expression, Aggregator, XML, and Stored Procedure transformations, and tested the mappings for data verification in Designer.
  • Created workflows and tasks in Workflow Manager and linked the database through server setup and various other connections.
  • Completed various data load simulations to stress test the mappings.
  • Involved in monitoring Informatica jobs/processes using Workflow Monitor.
  • Tuned performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length, and target based commit interval.
  • Extensively worked on UNIX shell scripting, using pmcmd to automate processes via AutoSys in the UNIX environment.
  • Created flexible mappings/sessions using parameters, variables and heavily using parameter files.
  • Improved session run times by partitioning the sessions; was also heavily involved in database fine-tuning (creating indexes, stored procedures, etc.) and partitioning Oracle databases.
  • Extensively used the Informatica PowerCenter client tools Source Analyzer, Warehouse Designer, Mapping Designer, and Mapplet Designer.
  • Extracted data from different source databases; created a staging area to cleanse and validate the data.
  • Designed and developed complex Aggregator, Expression, Filter, Joiner, Router, Lookup, and Update Strategy transformation rules.
  • Developed schedules to automate the update processes and Informatica PowerCenter sessions and batches.
  • Analyzed, designed, constructed, and implemented ETL jobs using Informatica PowerCenter.
  • Developed mappings, transformations, and mapplets using the Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica PowerCenter 6.1.
  • Developed shell scripts to set up the runtime environment and to run stored procedures and packages that populate the staging tables.
  • Created users, user groups, and database connections, and managed user privileges using the supervisor tool.
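The repository backup task mentioned above is normally driven by pmrep. The sketch below shows the shape of such a wrapper; the repository, domain, and user names are hypothetical, and the pmrep commands only run when the tool is actually installed.

```shell
#!/bin/sh
# Illustrative repository backup wrapper using pmrep. Object names are
# made up for this sketch.
BACKUP_FILE=/tmp/REP_DEV_backup.rep
# -X reads the connect password from the PMREP_PASSWD environment variable.
CONNECT="pmrep connect -r REP_DEV -d Domain_Dev -n admin -X PMREP_PASSWD"
# -f overwrites an existing backup file of the same name.
BACKUP="pmrep backup -o $BACKUP_FILE -f"

# Record the planned commands for the run log.
{
  echo "$CONNECT"
  echo "$BACKUP"
} > /tmp/pmrep_demo.log

if command -v pmrep >/dev/null 2>&1; then
    $CONNECT && $BACKUP    # real run on an Informatica client host
else
    echo "pmrep not found; see /tmp/pmrep_demo.log for the planned commands"
fi
```

Scheduling this nightly (cron or AutoSys) gives a restorable snapshot before any repository upgrade of the kind listed above.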
