Lead ETL Developer Resume
Minneapolis, MN
SUMMARY:
- Over 7 years of experience in the analysis, design, development, testing, and maintenance of software applications in client-server environments, delivering Business Intelligence solutions in data warehousing for Decision Support Systems, OLAP, and OLTP, along with database application development.
- Expertise in Data Warehousing, Data Migration, and Data Integration using Business Intelligence (BI) tools such as Informatica Power Exchange (Power Connect), Metadata Manager, Informatica Data Quality, Informatica Data Explorer, Data Transformation, MDM, DataStage, etc.
- Extensive experience in ETL design, development, and maintenance using SQL, PL/SQL, SQL*Loader, and Informatica Power Center 9.x/8.x/7.x.
- Experience in designing, developing, and implementing Extraction, Transformation, and Loading (ETL) processes on multiple database platforms and operating system environments.
- Basic Informatica administration, such as creating users and folders, assigning privileges, optimizing server settings, and managing deployment groups.
- Experience in Data Warehousing, Data Migration, Data Quality, and Data Cleansing.
- Strong work experience in data mart life-cycle development, including tuning ETL procedures that load data from different sources into data marts and the data warehouse.
- Involved in data migration and upgrades to Informatica 9.0/9.1 from 8.6.1 and to Informatica 8.6.1 from 8.1.1.
- Worked with various source data such as relational sources, flat files, XML, Netezza, COBOL files, and SAP sources/targets.
- Worked on data modeling, E-R diagrams, logical and physical design, and Star/Snowflake schemas using Erwin.
- Experience in troubleshooting and performance tuning at the source, target, mapping, session, and system levels of the ETL process.
- Created complex mappings/mapplets, shortcuts, reusable transformations, and partitioned sessions.
- Strong database experience and back-end procedure development in Oracle 11g/10g/9i, MS SQL Server 2005/2008, Sybase, Teradata, DB2, and Netezza.
- Experience in UNIX shell scripting for automating batch and ETL jobs (see the sketch after this list).
- Expertise in OLAP and reporting tools such as QlikView, Business Objects, and Cognos.
- Mentored the team on technical issues; managed a team of four.
- An excellent team member with the ability to work independently, good interpersonal skills, strong communication skills, a solid work ethic, and a high level of motivation.
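The shell automation mentioned above typically wraps Informatica's pmcmd command-line client. Below is a minimal sketch of such a batch wrapper; the domain, service, folder, workflow, connection, and parameter names are all hypothetical, and credentials are assumed to arrive through environment variables.

#!/bin/ksh
# Minimal batch wrapper for an Informatica workflow (hypothetical names).
PARAM_FILE=/tmp/wf_daily_load.param

# Write a session parameter file; $$LAST_RUN_DATE drives incremental loads.
cat > "$PARAM_FILE" <<EOF
[FINANCE.WF:wf_daily_load.ST:s_m_load_fact_sales]
\$DBConnection_Source=ORA_SRC
\$DBConnection_Target=ORA_DWH
\$\$LAST_RUN_DATE=$(date +%Y-%m-%d)
EOF

# Start the workflow and wait; pmcmd returns non-zero if the run fails.
pmcmd startworkflow -sv INT_SVC_DEV -d DOM_DEV -u "$INFA_USER" -p "$INFA_PWD" \
    -f FINANCE -paramfile "$PARAM_FILE" -wait wf_daily_load
if [ $? -ne 0 ]; then
    echo "wf_daily_load failed" >&2
    exit 1
fi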
TECHNICAL SKILLS:
ETL Tools: Informatica Power Center 9.1/8.6/8.1/7.1, Power Exchange 9.1/8.6/8.1, Metadata Manager, Informatica Data Quality, etc.
OLAP/BI: Cognos Impromptu, Cognos PowerPlay, Cognos IWR, Business Objects 5.0/6.5, QlikView
Data Modeling: Erwin 4.0, Star-Schema Modeling, Fact and Dimension Tables
DBMS: Oracle 11g/10g/9i, Microsoft Access, SQL Server 2005/2008, MS Excel, Flat Files, Teradata, Sybase, Netezza, DB2, etc.
Languages: C, C++, Java, SQL, PL/SQL, T-SQL, HTML, DHTML, XML, Visual Basic, ASP, JSP, Macromedia software, JCL
Scripting Languages: JavaScript, VBScript, UNIX Shell Scripting
Operating Systems: Windows 2008/2003/NT/XP, HP-UX, Linux
Design Skills: Object-Oriented Analysis and Design using UML
Others: MS Project, MS Office Suite, Toad
EDUCATION:
Bachelor of Technology, Jawaharlal Technological University
PROFESSIONAL EXPERIENCE:
Confidential, Minneapolis, MN
Role: Lead ETL Developer
Duration: Jan 2011 – Present
Confidential acquired the long-term asset management business of Columbia Management from Bank of America.
The project integrates the Mutual Fund division of Columbia (BoA) with the Ameriprise infrastructure. It involves analyzing and migrating mutual funds to the existing Ameriprise system and then sending feeds back to Bank of America Corporation (BAC) and other Ameriprise business partners.
Responsibilities:
- Working with Data Architects and Business Analysts from both Ameriprise and Bank of America to understand the data flow and existing logic, and creating data mapping sheets for Informatica mapping development so the existing logic integrates with Ameriprise infrastructure and technology.
- Creating the Business Requirement Document covering the business logic behind feed creation.
- Analyzing business requirements and preparing the design documents.
- Reviewing the design and requirements documents with Architects and Business Analysts to finalize the design.
- Performance tuning Informatica mappings, sessions, and the database.
- Reconciling the feed generated on the mainframe with the feed generated on the distributed side through automated testing (see the sketch after this list).
- Basic Informatica administration such as creating folders, users, privileges, deployment groups, etc.
- Extracting data from mainframes using Power Exchange and COBOL files.
- Preparing test cases and test scripts.
- Handling User Acceptance Testing (UAT) with downstream feed users to validate the feeds.
- Performing System Integration Testing to verify that all applications (Mainframe, TWS, ETL, and databases) work together as an end-to-end process.
- Monitoring jobs in TWS and the Informatica Monitor to ensure successful completion.
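A sketch of the automated feed reconciliation described above, assuming both feeds land as plain-text files with one record per line; the paths and file names are hypothetical.

#!/bin/ksh
# Reconcile the mainframe feed against the distributed feed (hypothetical paths).
MF_FEED=/data/feeds/mainframe/fund_positions.dat
DS_FEED=/data/feeds/distributed/fund_positions.dat

# Compare record counts first as a cheap early check.
MF_CNT=$(wc -l < "$MF_FEED")
DS_CNT=$(wc -l < "$DS_FEED")
if [ "$MF_CNT" -ne "$DS_CNT" ]; then
    echo "Record count mismatch: mainframe=$MF_CNT distributed=$DS_CNT" >&2
    exit 1
fi

# Sort both feeds so record order cannot cause false mismatches, then diff.
sort "$MF_FEED" > /tmp/mf.sorted
sort "$DS_FEED" > /tmp/ds.sorted
if ! diff /tmp/mf.sorted /tmp/ds.sorted > /tmp/feed_recon.diff; then
    echo "Feed contents differ; see /tmp/feed_recon.diff" >&2
    exit 1
fi
echo "Feeds reconciled: $MF_CNT records match."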
Environment: IBM Mainframe, IBM z/OS, UNIX, Windows, Natural, JCL, COBOL files, Informatica Power Center 9.0/9.1, Informatica Power Exchange 9.0/9.1, SQL Server 2008, Oracle 11g, TWS, ADABAS, TSO/ISPF & Smart IS.
Confidential, Fort Myers, FL, USA
Role: ETL Developer (Healthcare Domain)
Duration: Oct 2009 – Dec 2010
This project included the creation of a data warehouse on an Oracle platform: extracting and transforming data from sources such as flat files, MS SQL Server, and DB2, and loading it into a star schema with fact and dimension tables.
Responsibilities:
- Configured Informatica Repository Manager to manage all the repositories, create folders, deployment groups, etc. Used Informatica Admin console to create users, user groups, security access controls, etc.
- Created High-level as well as Low-level design documents.
- Involved in fixing mapping issues, testing stored procedures and functions, and testing Informatica loads, batches, and the target data.
- Worked extensively on performance tuning SQL.
- Performance tuned the Mappings, Tasks, Sessions and Scripts to optimize Data-Load performance.
- Utilized VISIO to document and present complicated data mappings, from source to target.
- Scheduled jobs in Workflow Manager to trigger ETL tasks on a daily, weekly, and monthly basis (a cron-based equivalent is sketched after this list).
- Performed Data validations, Unit Testing and preparation of Unit test cases.
- Resolved issues arising from User Acceptance Testing (UAT).
- Monitored the ETL production loads and fixed failures as needed.
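Workflow Manager schedules are configured in the client GUI; an equivalent cron-driven setup, sketched below with hypothetical script paths, workflow names, and mail address, gives the same daily/weekly/monthly cadence and adds failure alerting.

#!/bin/ksh
# /opt/etl/bin/run_wf.sh -- start one workflow, alert operations on failure.
# Hypothetical crontab entries providing the daily/weekly/monthly cadence:
#   0 2 * * *   /opt/etl/bin/run_wf.sh wf_daily_load
#   0 3 * * 0   /opt/etl/bin/run_wf.sh wf_weekly_summary
#   0 4 1 * *   /opt/etl/bin/run_wf.sh wf_monthly_close
WF_NAME=$1

pmcmd startworkflow -sv INT_SVC -d DOMAIN -u "$INFA_USER" -p "$INFA_PWD" \
    -f HEALTHCARE -wait "$WF_NAME"
if [ $? -ne 0 ]; then
    echo "$WF_NAME failed at $(date)" | mailx -s "ETL FAILURE: $WF_NAME" ops@example.com
    exit 1
fi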
Environment: Informatica 8.6/8.1, Oracle 11g, PL/SQL, SQL Server 2008, T-SQL, MS Excel, TOAD
Confidential, NY
Role: Informatica Developer
Duration: Nov 2008 - Sept 2009
The objective of this project was to extract data from various OLTP systems on the mainframe (and other sources), convert it to flat files, and load the data into the data mart through ETL processes per the business requirements.
Responsibilities:
- Involved in dimensional modeling (star schema) of the data warehouse; used Erwin 3.5 to design the business process, grain, dimensions, and fact measures.
- Extracted data from flat files and other RDBMS databases into the staging area and populated the data warehouse.
- Extensively migrated data from different source systems such as flat files, relational sources, and XML sources to the ODS, data marts, and data warehouse, and worked on data partitioning.
- Standardized parameter files to define session parameters such as database connections for sources and targets, last-updated dates for incremental loads, and default values for fact tables.
- Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables (a set-based Type 2 sketch appears after this list).
- Monitored and troubleshot batches and sessions for weekly and monthly extracts from various data sources across all platforms to the target database.
- Resolved production issues.
- Used MultiLoad, FastLoad, and TPump utilities to load data into the Teradata database.
- Read from SAP sources using Power Exchange and loaded data into SAP targets.
- Tested data and data integrity across various sources and targets; assisted the production support team with performance-related issues.
- Extensively used SQL*Loader to load data from flat files into the relational database.
- Involved in unit, integration, system, and performance testing.
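In Informatica, the Type 2 updates are built as Lookup/Update Strategy mappings; the equivalent set-based SQL, shown here inside a shell-driven SQL*Plus call, illustrates the pattern (expire the changed current rows, then insert new versions). The connection string and the stg_customer/dim_customer tables and columns are hypothetical.

#!/bin/ksh
# Type 2 SCD refresh of dim_customer from stg_customer (hypothetical schema).
sqlplus -s "$ORA_CONN" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
-- Step 1: expire current dimension rows whose tracked attributes changed.
UPDATE dim_customer d
   SET d.end_date = SYSDATE, d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.city <> d.city OR s.status <> d.status));
-- Step 2: insert new versions for changed customers and brand-new customers
-- (after step 1, neither group has a current row).
INSERT INTO dim_customer (dim_key, customer_id, city, status,
                          eff_date, end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.city, s.status,
       SYSDATE, NULL, 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');
COMMIT;
EOF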
Environment: Informatica Power Center 8.6/8.1, Oracle 11g, Teradata, SQL Server 2008, SAP, PL/SQL, SQL*Loader, Erwin 3.5, TOAD 7.0, MicroStrategy, PERL, UNIX shell scripts, Crystal Reports XI R2, flat files, MS Office, Windows XP, IBM AIX.
Confidential, Rocky Mount, NC
Role: Informatica developer
Duration: Jan 2008 - Oct 2008
Confidential is a multinational corporation with many retail stores across the United States. The project deals with building a real-time view of enterprise-wide data. A decision support system was built to compare and analyze product prices and quantities. QVC product data is combined with data from other sources and made available for reporting. This Enterprise Data Warehouse (EDW) is used to deliver reports and information to sales and marketing management.
Environment: Informatica Power Center 8.1, Business Objects XI R2, Oracle 10g, DB2 UDB, Sybase, Netezza, Erwin 4.1.4, Windows NT, Tidal, TOAD
Confidential
Role: Oracle Developer
Duration: Jun 2006 – Dec 2007
VS Technologies is an IT services company that has provided consulting services to many sectors of the industry for the past ten years. This project involved designing a system to monitor the inventory of the client organization.
Environment: Informatica Power Center 7.1, Oracle 10g, SQL*Plus, PL/SQL, TOAD
Confidential
Role: Jr. Oracle Developer
Duration: Mar 2005 – May 2006
Worked on this project as part of the offshore team in India. The project involved designing a system to monitor the organization's inventory. The system records all store transactions, such as material receipts against purchase orders, material issues, stock adjustment vouchers, material returned to stores, and material receipts/issues from/to the production unit.