
ETL Developer Resume


San Diego, CA

SUMMARY

  • 8+ years of software life cycle experience in Data Warehouse/Data Mart design and development and data analysis for Online Transaction Processing (OLTP) and Data Warehousing (OLAP)/Business Intelligence (BI) applications in domains such as Insurance, Healthcare, Retail, and Banking.
  • 6+ years of Data Warehouse experience using Informatica Power Center 9.1/9.0.1/8.6/8.1/8.0 and its client tools (Mapping Designer, Workflow Manager/Monitor, and Repository Manager), along with IDE, IDQ, and Power Exchange.
  • 6+ years of experience with relational databases such as Oracle, SQL Server, and Teradata, and with SQL.
  • Well versed in Teradata and its utilities such as FastExport (FEXP), FastLoad, MultiLoad, TPT, and BTEQ, and in writing analytical queries.
  • Good knowledge of writing BTEQ scripts.
  • Well experienced with Teradata SQL and the Teradata architecture.
  • Extensively used Informatica Repository Manager and Workflow Monitor.
  • Experienced in integrating heterogeneous sources such as flat files, Excel spreadsheets, XML, Oracle, and Teradata using Informatica.
  • Experienced in working with Expression, Aggregator, Normalizer, Joiner, Union, Lookup, Router, Update Strategy, Filter, and Stored Procedure transformations to extract and integrate data from different databases and files.
  • Worked on Slowly Changing Dimensions (SCDs) and implemented Type 1 and Type 2 (flag and timestamp) logic (see the sketch after this summary).
  • Expert in troubleshooting, debugging, and improving performance at the database, workflow, and mapping levels.
  • Exposure to the overall SDLC, including requirement gathering, development, testing, debugging, deployment, documentation, and production support.
  • Interacted with stakeholders to gather requirements and completed the BRD (Business Requirement Document) and RTM (Requirement Traceability Matrix).
  • Created HLD (High Level Design), LLD (Low Level Design), STM (Source to Target Mapping), UTR (Unit Test Results), and Deployment and Production Transition documents.
  • Excellent communication and interpersonal skills, ability to learn quickly, good analytical reasoning and high adaptability to new technologies and tools.
  • Well versed with data load strategies, unit testing, and integration testing in development.
  • Excellent data analysis skills and ability to translate business logic into mappings using complex transformation logic for ETL processes.
  • Hands-on experience writing automation scripts using shell scripting.
  • Experienced in partitioning and performance tuning of databases such as Oracle and Teradata by creating indexes, collecting statistics, and creating partitions.
  • Practical understanding of dimensional and relational data modeling concepts such as star schema modeling, snowflake schema modeling, and fact and dimension tables.
  • Experienced in working with database clients such as Toad, Oracle SQL Developer, SQL*Loader, and Teradata SQL Assistant.
  • Skilled in writing stored procedures in Oracle, Teradata, and MySQL.
  • Worked on Cognos 8.x and Cognos 10 components: Report Studio, Query Studio, and Analysis Studio.
  • Created and modified reporting models and published packages to Cognos Connection using Framework Manager.
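
A minimal sketch of the Type 2 flag-and-timestamp pattern referenced above, written as a BTEQ step run from a shell wrapper; the table names (DW.CUSTOMER_DIM, STG.CUSTOMER_STG), columns, and logon values are hypothetical placeholders, not actual project objects.

#!/bin/ksh
# Minimal SCD Type 2 sketch (flag + timestamp) run through BTEQ.
# DW.CUSTOMER_DIM, STG.CUSTOMER_STG, and the logon string are placeholders.
bteq <<'EOF'
.LOGON tdpid/etl_user,password

/* Close out the current row when a tracked attribute has changed */
UPDATE DW.CUSTOMER_DIM
SET    current_flag = 'N',
       effective_end_ts = CURRENT_TIMESTAMP
WHERE  current_flag = 'Y'
AND    EXISTS ( SELECT 1
                FROM   STG.CUSTOMER_STG s
                WHERE  s.customer_id = DW.CUSTOMER_DIM.customer_id
                AND    s.address <> DW.CUSTOMER_DIM.address );

/* Insert the changed or brand-new customer as the current version */
INSERT INTO DW.CUSTOMER_DIM
       (customer_id, address, current_flag, effective_start_ts, effective_end_ts)
SELECT s.customer_id, s.address, 'Y', CURRENT_TIMESTAMP, NULL
FROM   STG.CUSTOMER_STG s
LEFT JOIN DW.CUSTOMER_DIM d
       ON d.customer_id = s.customer_id
      AND d.current_flag = 'Y'
WHERE  d.customer_id IS NULL;

.LOGOFF
.QUIT
EOF

In the Informatica mappings themselves, the same pattern is typically built with Lookup and Update Strategy transformations; the SQL above only illustrates the flag-and-timestamp logic.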

TECHNICAL SKILLS

Programming Languages: C, SQL, Teradata SQL, PL/SQL, Shell Scripting

Databases: Teradata, SQL Server, Oracle 10g/9i/8i

Operating Systems: Windows XP/2000/2003, UNIX, RHEL 6.0

Utilities: FEXP, FLOAD, MLOAD, TPUMP, BTEQ.

Tools: SQL Assistant 12, SQL Developer, SQL*Loader, Toad.

ETL Tools: Informatica Power Center (9.5/9.1/9.0.1/8.6.1/8.x), IDE, IDQ, BODS.

OLAP Tools: Cognos 7.x/8.5.1/10.1.1/10.2.1, Framework Manager, and Business Objects.

Other Tools: Erwin, Crontab, Control-M, Autosys

PROFESSIONAL EXPERIENCE

Confidential, OH

Sr. Informatica Developer

Responsibilities:

  • Gathered requirements from end users and created Business Requirement Documents.
  • Implemented Informatica design standards and processes for loading data from the sources into the target warehouse.
  • Created mappings per the business requirement documents and performed performance tuning of existing mappings where required.
  • Worked on the design and development of Informatica mappings and workflows to load data into the staging area, data warehouse, and data marts in Oracle, DB2, SQL Server, and Teradata.
  • Loaded and extracted data from various data sources and legacy systems into the Teradata production and development warehouses using BTEQ, FastExport, FastLoad, and MultiLoad (see the FastLoad sketch after this list).
  • Used Informatica Power Center to create mappings, sessions, and workflows for populating data into dimension, fact, and lookup tables simultaneously from different source systems (SQL Server, Oracle, flat files).
  • Created mappings with transformations such as Source Qualifier, Aggregator, Expression, Filter, Router, Joiner, Stored Procedure, Lookup, Update Strategy, Sequence Generator, and Normalizer.
  • Deployed Reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
  • Addressed production support issues and resolved tickets using session logs and workflow logs for error handling and troubleshooting in the DEV environment.
  • Responsible for identifying bottlenecks and resolving them through performance tuning.
  • Tuned Informatica session performance for large data files by increasing block size, data cache size, sequence buffer length, and target-based commit interval.
  • Collected statistics, analyzed explain plans, and determined which tables needed statistics, which improved performance by 35-40% in some situations.
  • Wrote SQL overrides in Source Qualifier transformations according to business requirements.
  • Wrote pre-session and post-session scripts for mappings.
  • Implemented partitioning and pushdown optimization for incremental and bulk data loads.
  • Used Workflow Manager for creating, validating, testing, and running sequential and concurrent batches and sessions, and for scheduling them to run at specified times with the required frequency.
  • Responsible for Unit Testing and Integration testing of mappings and workflows.
  • Scheduled jobs in ESP for all the workflows created in Informatica.
  • Worked on migration of objects during the Informatica upgrades from 8.6 to 9.0.1 and from 9.0.1 to 9.5.
  • Created Informatica source instances and maintained shared folders so that shortcuts could be used across the project.
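
A minimal FastLoad sketch, in the same shell-wrapper style as the other utility loads referenced above, for loading a pipe-delimited flat file into an empty staging table; the database, table, error tables, file path, and logon values are hypothetical placeholders.

#!/bin/ksh
# Minimal FastLoad sketch: load a pipe-delimited flat file into an empty
# staging table. Table, file, and logon values are placeholders.
fastload <<'EOF'
LOGON tdpid/etl_user,password;
DATABASE STG;

SET RECORD VARTEXT "|";

DEFINE claim_id  (VARCHAR(20)),
       member_id (VARCHAR(20)),
       claim_amt (VARCHAR(20))
       FILE = /data/incoming/claims.dat;

BEGIN LOADING STG.CLAIMS_STG
      ERRORFILES STG.CLAIMS_ERR1, STG.CLAIMS_ERR2
      CHECKPOINT 100000;

INSERT INTO STG.CLAIMS_STG (claim_id, member_id, claim_amt)
VALUES (:claim_id, :member_id, :claim_amt);

END LOADING;
LOGOFF;
EOF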

Environment: Informatica Power Center 9.0.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer), Oracle 10g/9i, Teradata v13, Teradata SQL, BTEQ, TPT, FEXP, SQL Server, Flat Files, UNIX, Windows 7, CA Workstation Scheduler (ESP)

Confidential, Norcross, GA

Informatica Developer

Responsibilities:

  • Responsible for requirement gathering, analysis, and end-user meetings.
  • Responsible for creating Business Requirement Documents and converting Functional Requirements into Technical Specifications
  • Responsible for Code Review of Mappings developed by other developers
  • Provided ETL integration services and design solutions.
  • Extracted data from various heterogeneous sources like Oracle, SQL Server, DB2, Teradata, MS Access, and Flat Files
  • Responsible for Production Support and Ticket Resolutions.
  • Extensively used the Source Qualifier transformation and most of its features, such as filter, sorter, and SQL override.
  • Extensively used various active and passive transformations such as Filter, Router, Expression, Source Qualifier, Joiner, Lookup, Update Strategy, Sequence Generator, Rank, and Aggregator.
  • Extensively used LTRIM, RTRIM, ISNULL, IS_DATE, and TO_DATE functions in Expression transformations for data cleansing in the staging area.
  • Scheduled workflows from the back end using pmcmd (see the sketch after this list).
  • Extensively worked on production support issues and resolved them using session logs and workflow logs; used the Email task to capture issues via e-mail along with the session logs.
  • Responsible for best practices such as naming conventions, performance tuning, and error handling processes.
  • Responsible for performance tuning at the source, target, mapping, and session levels.
  • Implemented both connected and unconnected Lookup transformations according to user requirements.
  • Extensively worked with various lookup caches like Static, Dynamic, and Persistent Cache
  • Developed reusable transformations and reusable mapplets.
  • Developed Slowly Changing Dimension Mappings for Type 1 SCD and Type 2 SCD
  • Responsible for implementing Incremental Loading mappings using Mapping Variables and Parameter Files
  • Responsible for identifying bottlenecks and resolving them through performance tuning.
  • Implemented various loads like Daily Loads, Weekly Loads, and Quarterly Loads using Incremental Loading Strategy
  • Responsible for Unit Testing and Integration testing of mappings and workflows.
  • Interacted with the customer frequently as an onshore coordinator to understand business needs and explained the technical details to the offshore team.
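
A minimal sketch of the back-end scheduling referenced above: starting a workflow with pmcmd and a parameter file from a shell script. The integration service, domain, folder, workflow, and path names are hypothetical placeholders.

#!/bin/ksh
# Start an Informatica workflow from the command line with a parameter file.
# Service, domain, folder, workflow, and path names are placeholders.
INFA_USER=etl_user
INFA_PWD=password                 # in practice, read from a secured source
PARAM_FILE=/opt/infa/param/wf_daily_load.parm

pmcmd startworkflow \
     -sv INT_SVC_DEV -d DOM_DEV \
     -u "$INFA_USER" -p "$INFA_PWD" \
     -f FLD_SALES \
     -paramfile "$PARAM_FILE" \
     -wait wf_daily_load

# A non-zero return code means the workflow failed; surface it to the scheduler.
RC=$?
if [ $RC -ne 0 ]; then
    echo "wf_daily_load failed with return code $RC" >&2
    exit $RC
fi

For the incremental loads, the parameter file would typically hold mapping variables (for example, a $$LAST_EXTRACT_DATE value under a [FolderName.WF:workflow.ST:session] header) so each run picks up where the previous one finished.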

Environment: Informatica Power Center 8.6/7.1.4 (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer), Business Objects, Oracle 10g, SQL Server, Flat Files, UNIX, Windows XP, Informatica Scheduler

Confidential, San Diego, CA

ETL Developer

Responsibilities:

  • Extracted, transformed, and loaded data from source to staging and from staging to target according to the business requirements.
  • Developed Informatica mappings to load data from various sources using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, and Router.
  • Involved in capturing the Minutes of Meeting (MOM).
  • Created Informatica ETL code from functional and technical specifications.
  • Documented all ETL code with proper changes.
  • Tuned SQL statements for end users creating custom reports.
  • Reviewed and tested existing Informatica code.
  • Understood business requirements in terms of both functional and technical aspects.
  • Responsible for project deliverables, issue identification and resolution.
  • Involved in error handling and debugging using Informatica Power Center.
  • Prepared Low Level Design documents for the mappings created.
  • Prepared Unit test cases for the mappings.
  • Worked extensively on Oracle database 10g and flat file as data sources.
  • Involved in performance tuning of mappings.

Environment: TOAD, UNIX, Windows 2000, Oracle 10g, Informatica Power Center 8.6.

Confidential, Denver, CO

ETL Informatica Developer/ DWH Developer

Responsibilities:

  • Interacted with end-users and functional analysts to identify and develop business requirements and transform them into technical requirements.
  • Involved in the SDLC process, beginning with the dimensional modeling design for building the data warehouse.
  • Worked with Data Architects during Design Stages.
  • Well experienced with data warehouse architecture.
  • Created design specification documents and developed functional and technical specification documents.
  • Extensively involved in developing the ETL logic to do the Auditing and Error Handling for the loads happening across different projects using Informatica.
  • Extracted data from Oracle, flat file, and XML sources.
  • Used Informatica Power Center to create mappings, sessions, and workflows for populating data into dimension, fact, and lookup tables simultaneously from different source systems.
  • Created scripts for automating table partitions.
  • Created a global repository, groups, users and assigned privileges using repository manager
  • Handled slowly changing dimensions of Type 1/ Type 2 to populate current and historical data to Dimensions and Fact tables in the data warehouse.
  • Involved in performance tuning of the mappings, sessions and workflows.
  • Extensively used PL/SQL for database related functionality.
  • Used crontab for scheduling.
  • Created UNIX shell scripts for data cleaning (see the sketch after this list).
  • Involved in unit testing and documentation of the ETL process.
  • Involved in creating the run book and migration document from development to production.
  • Involved in Data warehouse month end loads and support activities.
  • Generated various business reports using Cognos BI, Report Studio, and Query Studio
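
A minimal sketch of the kind of data-cleaning shell script and crontab scheduling mentioned above; the directories, file pattern, and schedule are hypothetical placeholders.

#!/bin/ksh
# clean_source_files.ksh - strip carriage returns and blank lines from
# incoming flat files and archive the originals before the nightly load.
# Directory and file names are placeholders.
SRC_DIR=/data/incoming
ARC_DIR=/data/archive

for f in "$SRC_DIR"/*.dat; do
    [ -f "$f" ] || continue                      # nothing to do if no files matched
    tr -d '\r' < "$f" | grep -v '^[[:space:]]*$' > "$f.clean"
    mv "$f" "$ARC_DIR/$(basename "$f").$(date +%Y%m%d)"
    mv "$f.clean" "$f"
done

The script would then be scheduled with a crontab entry along the lines of: 30 2 * * * /opt/etl/scripts/clean_source_files.ksh >> /opt/etl/logs/clean_source_files.log 2>&1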

Environment: Informatica Power Center 8.x, Oracle 9i, PL/SQL, TOAD 8.0, Windows NT, UNIX Shell Scripting, SQL*Loader, Cognos 8.5.1

Confidential

ETL/ Informatica Developer

Responsibilities:

  • Designed and developed Informatica mappings.
  • Performed performance optimization of mappings by creating indexes, identifying bottlenecks, etc.
  • Maintained the known error database for troubleshooting.
  • Debugged queries and PL/SQL blocks for activities related to Inventory and Order Management.
  • Used Informatica Power Center for extraction, transformation, and loading (ETL) of data into the data warehouse.
  • Involved in writing BTEQ, FastLoad, and MultiLoad scripts for loading data into the target data warehouse (see the MultiLoad sketch after this list).
  • Extensively worked on data extraction, transformation, and loading from source to target systems using BTEQ, FastLoad, and MultiLoad.
  • Interacted with the customer frequently as an onshore coordinator to understand business needs and explained the technical details to the offshore team.
  • Used most of the transformations, such as Source Qualifier, Aggregator, Lookup, Filter, Joiner, Sequence Generator, Router, Update Strategy, etc.
  • Used session parameters and mapping variables/parameters, and created parameter files for flexible workflow runs based on changing variable values.
  • Involved in optimizing and performance-tuning logic on targets, sources, mappings, and sessions to increase session efficiency.
  • Reloaded data for failed Informatica workflows and performed root cause analysis (RCA).
  • Performed developer requests such as importing/exporting sources and targets in the Development region.
  • Migrated sources, targets, mappings, mapplets, sessions, worklets, and workflows to the QA region, and moved the ETL objects onward after successful completion of integration testing.
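
A minimal MultiLoad sketch, in the same shell-wrapper style as the other utility examples above, illustrating an upsert-style load; the log table, work and error tables, target table, layout, and file names are hypothetical placeholders.

#!/bin/ksh
# Minimal MultiLoad upsert sketch: update existing rows, insert missing ones.
# Tables, layout, and file names are placeholders.
mload <<'EOF'
.LOGTABLE STG.ORDERS_LOGTBL;
.LOGON tdpid/etl_user,password;

.BEGIN IMPORT MLOAD
       TABLES DW.ORDERS_FCT
       WORKTABLES STG.ORDERS_WT
       ERRORTABLES STG.ORDERS_ET STG.ORDERS_UV;

.LAYOUT order_layout;
.FIELD order_id  * VARCHAR(20);
.FIELD order_amt * VARCHAR(20);

.DML LABEL upsert_order
     DO INSERT FOR MISSING UPDATE ROWS;
UPDATE DW.ORDERS_FCT
SET    order_amt = :order_amt
WHERE  order_id  = :order_id;
INSERT INTO DW.ORDERS_FCT (order_id, order_amt)
VALUES (:order_id, :order_amt);

.IMPORT INFILE /data/incoming/orders.dat
        FORMAT VARTEXT '|'
        LAYOUT order_layout
        APPLY upsert_order;

.END MLOAD;
.LOGOFF;
EOF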

Environment: Windows XP, Vista, Informatica 7.1, Teradata SQL Assistant, Teradata SQL, BTEQ, MLOAD, FEXP, SQL, PL/SQL, UNIX

Tools: Microsoft Visual Source Safe (for source code control)

Confidential

R&D Business Intelligence Analyst

Responsibilities:

  • Understanding the business logic and technical specification documents.
  • Created list, grouped list, crosstab, and chart reports using simple drag-and-drop functionality.
  • Created Model for reporting and published packages in Cognos Connection.
  • Created related models and schemas, joins, namespaces, query subjects, filters and calculated fields in Framework Manager
  • Created ad-hoc reports using Query Studio.
  • Extensively worked with filters and prompts on report data in Report Studio
  • Implemented conditional formatting in the reports to highlight the exceptional data.
  • Worked with complex reports like master-detail and drill through reports.
  • Migrated reports to different environments and scheduled reports from Report Studio.
  • Created and managed sections in the reports and formatted reports as needed.
  • Involved in Unit Testing of Data.

Environment: Cognos 8.x BI, TM1 Exp Accelerator, PowerPlay, Express Reporter, Oracle 8i, ETL tools
