Data Warehousing Consultant Resume
Kansas
SUMMARY
- Over 8 years of experience in Information Technology including Data Warehouse/Data Mart development using ETL/Informatica Power Center.
- Worked on various domains including Insurance, Finance, Banking, Healthcare and Retail.
- Good exposure to the overall SDLC including requirement gathering, development, testing, debugging, deployment, documentation and production support.
- Worked in various Heterogeneous Source Systems like DB2 Mainframes, Oracle, Teradata, MS SQL Server, Flat files and Legacy systems.
- Expert in Data Warehousing concepts like the Ralph Kimball and Bill Inmon methodologies.
- Practical understanding of Data Modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
- Good understanding of views, Synonyms, Indexes, Joins and Sub-Queries. Extensively used Cursors and Ref Cursors.
- Involved in creating detailed design documents and performing Proof of Concepts (POC).
- Created mapplets, common functions, reusable transformations, look-ups for better usability.
- Experience in Performance tuning of ETL processes. Reduced the execution time for huge volumes of data for company merger projects.
- Experience in tuning and scaling procedures for better performance by running explain plans and using different approaches like hints and bulk loads.
- Experience with Teradata utilities FastLoad, MultiLoad, BTEQ scripting, FastExport and SQL Assistant.
- Used SQL, PL/SQL in writing Stored Procedures, Functions, Packages and Triggers.
- Expertise in building/enhancing/migrating Universes and retrieving data using Universes and Personal data files.
- Experience in Exception Handling strategies to capture errors and referential integrity constraint violations during loading processes, and to notify the source team of exception records.
- Experience in UNIX shell scripting, job scheduling and communicating with the server using pmcmd.
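The pmcmd-driven job control mentioned above can be sketched as a small wrapper script. All names (domain, integration service, folder, workflow) are hypothetical and credentials are omitted; DRY_RUN=1 prints the command instead of executing it, since pmcmd only exists on an Informatica host.

```shell
#!/bin/sh
# Hedged sketch: wrapper around pmcmd to start an Informatica workflow.
# Domain/service/folder/workflow names are hypothetical examples;
# a real invocation would also pass credentials (-u/-p or a connect file).
DOMAIN="Domain_ETL"
INT_SERVICE="IS_PROD"
FOLDER="SALES_DM"
WORKFLOW="wf_load_sales_fact"
DRY_RUN="${DRY_RUN:-1}"   # default to dry-run so the sketch is runnable anywhere

run_workflow() {
    cmd="pmcmd startworkflow -sv $INT_SERVICE -d $DOMAIN -f $FOLDER -wait $WORKFLOW"
    if [ "$DRY_RUN" = "1" ]; then
        echo "$cmd"       # show what would be executed
    else
        $cmd              # actually call pmcmd on an Informatica host
    fi
}

run_workflow
```

A scheduler (Autosys, Control-M, cron) would call such a wrapper and use its exit code to drive downstream dependencies.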
TECHNICAL SKILLS
ETL Tools: Informatica Power Center 9.1.0/9.0.1/8.6.1/8.1/7.1.2/6.2, Power Exchange
Databases: Oracle 10g/9i/8i/7.x, IBM DB2 UDB, Sybase SQL Server 11.0, MS SQL Server 6.5/7.0/9.0, MS Access 2000, Teradata
Programming Languages: C, C++, SQL, PL/SQL, UNIX, XML
BI Tools: Business Objects XI/6/5, MicroStrategy 9, Cognos
Web Technologies: JavaScript 1.2, HTML 4.0
Others: Erwin 4.1.2/3.5.2, TOAD, SQL Loader, MS Office, Smart FTP, Ultra Edit, Autosys, Unicenter, Control-M, Quality Center, MS Visio
Operating Systems: Sun Solaris, Windows NT 4.0, Windows 95/98/2000/XP, HP-UX, MS DOS 6.22, IBM-PC Compatibles
PROFESSIONAL EXPERIENCE
Confidential - Orlando, FL
Sr. ETL Developer
Responsibilities:
- Interacted with business users in the analysis and development phases; involved in requirement gathering and prepared documents such as Technical design documents, integration test plans and unit test plans; resolved business-related issues.
- Converted the data mart from logical design to physical design; defined data types, constraints and indexes; generated the schema in the database; created automated scripts and defined storage parameters for the objects in the database.
- Parsed high-level design specifications into simple ETL coding and mapping standards.
- Defined various Facts and Dimensions in the data mart including Factless Facts, Aggregate and Summary Facts.
- Created various UNIX Shell Scripts for scheduling various data cleansing scripts and loading process. Maintained the batch processes using Unix Shell Scripts.
- Worked extensively on transformations like Lookups, Aggregator, Update Strategy, Stored Procedure, Sequence generator, Joiner transformations
- Hands-on experience in using the Erwin tool for forward and reverse engineering, following corporate standards in naming conventions and using Conformed Dimensions whenever possible.
- Hands on experience in writing, testing and implementation of the PL/SQL Triggers, Stored procedures, Functions and Packages.
- Performance tuning of the Informatica mappings using various components like Parameter files, Variables and Dynamic Cache.
- Maintained Development, Test and Production mapping migration using Repository Manager; also used Repository Manager to maintain metadata, security and reporting.
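The parameter-file-based tuning described above typically relies on a per-run .parm file generated by a scheduler script. A minimal sketch, with hypothetical folder, workflow, session and connection names:

```shell
#!/bin/sh
# Hedged sketch: generate an Informatica parameter file for a daily load.
# Folder (SALES_DM), workflow, session and connection names are hypothetical.
PARM_FILE="wf_daily_load.parm"
RUN_DATE=$(date +%Y%m%d)

# Section header follows the [folder.WF:workflow.ST:session] convention;
# \$\$ escapes produce the literal $$ prefix of mapping parameters.
cat > "$PARM_FILE" <<EOF
[SALES_DM.WF:wf_daily_load.ST:s_m_load_fact]
\$\$RUN_DATE=$RUN_DATE
\$\$COMMIT_INTERVAL=10000
\$DBConnection_TGT=ORA_DW_PROD
EOF

echo "Wrote $PARM_FILE"
```

The workflow would then be started with `pmcmd ... -paramfile wf_daily_load.parm`, so commit intervals and run dates can change per run without touching the mapping.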
Environment: UNIX, Informatica Power Center 9.1 (Workflow Manager, Workflow Monitor, Warehouse Designer, Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager), Power Mart, Power Connect, Power Analyzer, PL/SQL, XML, SQL*Loader, PostgreSQL, SQL Server, Erwin 4.1
Confidential - Littleton, CO
Sr. ETL Developer
Responsibilities:
- Responsible for design and development of Salesforce Data Warehouse migration project leveraging Informatica Power Center ETL tool.
- Designed and developed complex ETL mappings making use of transformations like Source Qualifier, Joiner, Update Strategy, Lookup, Sorter, Expression, Router, Filter, Aggregator and Sequence Generator transformations.
- Used SQL tools like TOAD to run SQL queries to view and validate the data loaded into the warehouse.
- Performed data integration and lead generation from Informatica cloud into Salesforce cloud.
- Extracted the Salesforce CRM information into BI Data Warehouse using Force.com API/Informatica on Demand to provide integration with oracle financial information to perform advanced reporting and analysis.
- Created Stored Procedures to transform the Data and worked extensively in Confidential -SQL, PL/SQL for various needs of the transformations while loading the data into Data warehouse.
- Developed transformation logic as per the requirement, created mappings and loaded data into respective targets.
- Responsible for the data management and data cleansing activities using Informatica data quality (IDQ).
- Worked with Informatica Cloud Data Loader for Salesforce to reduce the time taken to import or export critical business information between Salesforce CRM and Force.com.
- Performed data quality analysis to validate the input data based on the cleansing rules.
- Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
- Extensively worked on Unit testing for the Informatica code using SQL Queries and Debugger.
- Used the sandbox for testing to ensure minimum code coverage for the application to be migrated to production.
- Improved performance at both the mapping and the session level.
- Worked with UNIX shell scripts extensively for job execution and automation.
- Coordinated with Autosys team to run Informatica jobs for loading historical data in production.
- Documented Data Mappings/ Transformations as per the business requirement.
- Created XML and Autosys JIL files for the developed workflows.
- Extensively involved in code deployment from Dev to Testing.
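The Autosys JIL files mentioned above can be generated from a shell template. A hedged sketch with hypothetical job, machine, owner and path names:

```shell
#!/bin/sh
# Hedged sketch: emit an Autosys JIL definition wrapping an Informatica
# workflow launcher. Job, machine, owner and path names are hypothetical.
JIL_FILE="jil_wf_sf_load.jil"

cat > "$JIL_FILE" <<'EOF'
insert_job: WF_SF_LOAD   job_type: c
command: /opt/infa/scripts/run_wf.sh wf_sf_load
machine: etlprod01
owner: etladm
start_times: "02:00"
std_out_file: /var/log/autosys/WF_SF_LOAD.out
std_err_file: /var/log/autosys/WF_SF_LOAD.err
alarm_if_fail: 1
EOF

echo "Wrote $JIL_FILE"
# In a real environment the definition is loaded with: jil < jil_wf_sf_load.jil
```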
Environment: Informatica Power Center 9.1.0/8.6.1, Power Exchange 9.1.0, Salesforce.com platform, SQL Server 2008, Shell Scripts, Oracle 11g, SQL, PL/SQL, UNIX, SalesForce.com sandbox, Toad, SQL Developer, XML Files, Apex Data Loader 29.0.0, Quality Center.
Confidential, Greenwood Village, CO
Sr. ETL Developer
Responsibilities:
- Responsible for design and development of Clinical Data Warehouse project leveraging Informatica Power Center ETL tool
- Responsible for Impact Analysis, upstream/downstream impacts.
- Analyze business requirements, technical specification, source repositories and physical data models for ETL mapping and process flow.
- Created detailed Technical specifications for Data Warehouse and ETL processes.
- Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
- Extracted data from DB2, XML, Flat files, Sequential files and IMS, and loaded it into the Enterprise Data Warehouse on Oracle.
- Used most of the transformations such as the Source Qualifier, Expression, Aggregator, Filter, Connected and Unconnected Lookups, Joiner, update strategy and stored procedure.
- Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading and unit tested the mappings.
- Extensively worked in the performance tuning of SQL, ETL and other processes to optimize session performance.
- Worked extensively with different Caches such as Index cache, Data cache, Lookup cache (Static, Dynamic and Persistence) and Join cache while developing the Mappings.
- Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer.
- Worked with Pre and Post Session SQL commands to drop and recreate the indexes on data warehouse using Source Qualifier Transformation of Informatica Power center.
- Developed Unit test cases and Unit test plans to check if the data is correctly loading.
- Data quality checking and interacting with the business analysts
- Developing control files, Stored Procedures to manipulate and load the data into Oracle database
- Created shell script for archiving files and trigger files
- Responsible for Unit Testing and Day 2 testing, and helped with User Acceptance Testing.
- Scheduling Informatica jobs and implementing dependencies if necessary
- Managed post production issues and delivered all assignments/projects within specified time lines.
- Responsible for Hypercare Support for Interfaces
- Responsible for creating Design document and AOG document for Level 2 support.
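The archiving and trigger-file script mentioned above might look like the following sketch; directory and file names are hypothetical, and a sample input file is created only to keep the script self-contained:

```shell
#!/bin/sh
# Hedged sketch: archive processed source files and drop a trigger file.
# Directory names and the orders.dat feed are hypothetical examples.
SRC_DIR="./landing"
ARC_DIR="./archive"
STAMP=$(date +%Y%m%d%H%M%S)

mkdir -p "$SRC_DIR" "$ARC_DIR"
echo "sample row" > "$SRC_DIR/orders.dat"    # stand-in for a real source feed

# Compress each processed file into the archive, then remove the original.
for f in "$SRC_DIR"/*.dat; do
    [ -e "$f" ] || continue                  # nothing to archive
    gzip -c "$f" > "$ARC_DIR/$(basename "$f").$STAMP.gz" && rm -f "$f"
done

touch "$ARC_DIR/load_complete.trg"           # trigger file for downstream jobs
```

Downstream jobs poll for the .trg file rather than for the data files themselves, which avoids picking up partially written feeds.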
Environment: Informatica Power Center 9.1.0/8.6.1, Power Exchange 9.1.0, SQL Server 2008, DB2, Shell Scripts, Oracle 10g, SQL, PL/SQL, Ultra Edit, Quality Center.
Confidential, Bloomington IL
Sr. ETL Developer
Responsibilities:
- Responsible for design and development of Sales Data Warehouse
- Worked with Business Analyst and application users to finalize Data Model, functional and detailed technical requirements.
- Extracted data from DB2 Mainframes, COBOL copybooks, DL1, XML and Sequential files into the Oracle Data Warehouse.
- Created detailed Technical specifications for Data Warehouse and ETL processes.
- Conducted a series of discussions with team members to convert Business rules into Informatica mappings.
- Extracted Erwin physical models into repository manager
- Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
- Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
- Created Mapplets and used them in different Mappings.
- Developed stored procedure to check source data with warehouse data and if not present, write the records to spool table and used spool table as lookup in transformation.
- Involved in doing error handling, debugging and troubleshooting Sessions using the Session logs, Debugger and Workflow Monitor.
- Monitored Data Quality and generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance; enhanced existing production ETL scripts.
- Managed Scheduling of Tasks to run any time without any operator intervention.
- Leveraged workflow manager for session management, database connection management and scheduling of jobs.
- Worked with Pre and Post Session SQL commands to drop and recreate the indexes on data warehouse using Source Qualifier Transformation of Informatica Power center.
- Involved in testing universes and reports for correct mapping of the objects and data correctness
- Created Unix Shell Scripts to automate sessions and cleansing the source data.
- Implemented pipeline partitioning concepts like Hash-key, Round-Robin, Key-Range, Pass Through techniques in mapping transformations
- Generated List reports, Cross-tab reports, Drill through reports using Cognos Impromptu tools querying Database
- Involved in Debugging and Performance tuning of targets, sources, mappings and sessions.
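The pre-/post-session index handling described above usually amounts to a pair of small SQL scripts attached to the session. A sketch with hypothetical table and index names:

```shell
#!/bin/sh
# Hedged sketch: emit the pre-/post-session SQL that drops an index before
# a bulk load and recreates it afterwards. Table/index names are hypothetical.

cat > pre_session.sql <<'EOF'
-- Pre-session: drop the index so the bulk insert avoids index maintenance
DROP INDEX idx_sales_fact_dt;
EOF

cat > post_session.sql <<'EOF'
-- Post-session: rebuild the index once the load has committed
CREATE INDEX idx_sales_fact_dt ON sales_fact (sale_date);
EOF

echo "Wrote pre_session.sql and post_session.sql"
```

Dropping and rebuilding once per load is typically far cheaper than maintaining the index row by row during a large insert.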
Environment: Informatica Power Center 8.6.1, Power Exchange 8.6.1, Oracle 10g, SQL Server 2008, DB2, Cognos, Aqua Data Studio, Erwin 4.1.2, Toad, Winscp, Autosys, Rational Clear Case, Rational Req.Pro, Rational Clear Quest, UNIX
Confidential, St. Louis, MO
Programmer Analyst
Responsibilities:
- Responsible for design and development of multiple Brokerage Data Warehouse projects leveraging the Informatica Power Center ETL tool, Oracle and DB2 databases, and Business Objects reporting tools.
- Actively participated in a team in the logical and physical design of the data warehouse.
- Closely associated with data architect in resolving the data issues.
- Developed the Informatica mappings using various transformations, Sessions and Workflows.
- Teradata was the target database; the sources were a combination of Flat files, Oracle tables, Excel files and a Teradata database.
- Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and unit tested the mappings.
- Extracted data from Flat files, DB2 UDB, Oracle and XML, and loaded it into the Teradata Data Warehouse.
- Creation of customized Mload scripts on UNIX platform for Teradata loads
- Wrote several Teradata BTEQ scripts to implement the business logic.
- Created Test tables and worktables on development and production on Teradata.
- Worked extensively with Teradata Queryman to interface with Teradata.
- Developed views on departmental and claims engine database to get the required data.
- Developed application views in Teradata and using expression and router implemented the Change Data Capture (CDC) process.
- Used Teradata Utilities (FastLoad, MultiLoad, FastExport). Queried the Target database using Teradata SQL and BTEQ for validation.
- Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
- Involved in doing error handling, debugging and troubleshooting Sessions using the Session logs, Debugger and Workflow Monitor.
- Used Pre-SQL and Post-SQL scripts for loading the data into targets according to the requirement.
- Migrated mappings, sessions, source/target definitions from the development repository to the production environment.
- Involved in performance tuning of the Informatica sessions and workflows.
- Created the reusable transformations for better performance.
- Created and reviewed the Design and Code review Templates.
- As a part of the testing team, involved in conducting the Unit tests and System tests.
- Scheduling jobs using Unicenter to automate the Informatica Sessions.
- Assisted in developing different kinds of Grid reports using SAP Business Objects.
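A Teradata BTEQ validation script of the kind referenced above might be generated like this; the logon string, database and table names are hypothetical, and a real logon would come from a secured file rather than an inline password:

```shell
#!/bin/sh
# Hedged sketch: generate a BTEQ script that validates a Teradata load.
# Host, user, database and table names are hypothetical; the placeholder
# password stands in for a secured logon mechanism.
cat > load_check.btq <<'EOF'
.LOGON tdprod/etl_user,placeholder_pw;

SELECT COUNT(*) FROM dw.claims_fact
WHERE load_dt = CURRENT_DATE;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

echo "Wrote load_check.btq"
# In a real environment: bteq < load_check.btq
```

The nonzero `.QUIT 8` exit code lets the calling shell script (or scheduler) treat a failed validation as a job failure.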
Environment: Informatica Power Center 8.1, Teradata V2R6, Oracle 9i, Flat files, MS SQL Server 2005, SSIS, DB2 UDB, Erwin 4.1.2, Business Objects XI r2, Winscp, Control-M, MS Visio, Harvest, Mercury Quality Center, Shell Script, UNIX.
Confidential, Kansas
Data Warehousing Consultant
Responsibilities:
- Actively participated in a team in the logical and physical design of the data warehouse.
- Closely associated with data architect in resolving the data issues.
- Developed the Informatica mappings using various transformations, Sessions and Workflows.
- SQL Server was the target database; the sources were a combination of Flat files, Oracle tables, PeopleSoft, Excel files, CSV files etc.
- Involved in creating stored procedure to support recovery.
- Worked with the Informatica administrator to migrate Source and Target definitions, Mappings, Workflows and Flat Files from the development environment to the production environment.
- Extensively used the Lookup and Update Strategy Transformations for implementing the Slowly Changing Dimensions.
- Responsible for migrating the mappings, sessions, source/target definitions from the development repository to the production environment.
- Involved with the DBA in performance tuning of the Informatica sessions and workflows. Created the reusable transformations for better performance.
- Developing Informatica mappings and shell scripts for loading trading and clearing data from various clients.
- Involved in the mirroring of the staging environment to production.
- Created and reviewed the Design and Code review Templates.
- As a part of the testing team, involved in conducting the Unit tests and System tests.
- Scheduling jobs using Autosys to automate the Informatica Sessions.
- Optimizing the Autosys batch flow.
- Developing control files, Stored Procedures to manipulate and load the data into Oracle database
- Optimizing queries using SQL Navigator
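A SQL*Loader control file of the kind used for the Oracle loads above could be generated like this sketch; the staging table, columns and input file are hypothetical:

```shell
#!/bin/sh
# Hedged sketch: emit a SQL*Loader control file for a delimited feed.
# Table, column and file names are hypothetical examples.
cat > trades.ctl <<'EOF'
LOAD DATA
INFILE 'trades.csv'
APPEND INTO TABLE stg_trades
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(trade_id, account_no, trade_dt DATE 'YYYY-MM-DD', amount)
EOF

echo "Wrote trades.ctl"
# In a real environment: sqlldr control=trades.ctl log=trades.log
```

Loading into a staging table first keeps bad records out of the warehouse proper; the stored procedures mentioned above would then transform and move the staged rows.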
Environment: Informatica Power Center 6.1, Windows NT, Sun Solaris, Shell Scripts, Oracle 8i, PL/SQL, Pro*C, SQL Loader, Discoverer 2000, SQL Server, Autosys, DB2, Erwin 3.5.2.