
EIM/ETL Developer Resume


Plano, Texas

SUMMARY

  • 7 years of experience in the development, testing, and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, BI, and client/server applications.
  • Hands-on data warehousing ETL experience with Informatica PowerCenter 9.5.1/9.1/8.6.1 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
  • Extensive knowledge on Master Data Management (MDM) concepts.
  • Knowledge of landing, staging, loading, and matching of data using Informatica MDM (Master Data Management).
  • Expertise in Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, spanning project scoping, data modeling, ETL development, system testing, implementation, and production support.
  • Used transformations such as Joiner, Expression, Connected and Unconnected Lookup, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Router, and Sequence Generator.
  • Experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin.
  • Good working knowledge of the data quality tool Informatica Data Quality (IDQ).
  • Expertise in working with relational databases such as Oracle 11g/10g, SQL Server 2008/2005 and Teradata.
  • Hands-on experience in developing stored procedures, functions, views, triggers, and complex SQL queries using SQL Server, Oracle SQL, and Oracle PL/SQL.
  • Experience in analyzing and resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning of mappings and sessions.
  • Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
  • Proficient in integrating various data sources, including multiple relational databases (Oracle 11g/10g, MS SQL Server, DB2, Teradata) and flat files, into the staging area, ODS, data warehouse, and data marts.
  • Experience in using Automation Scheduling tools like Autosys and Control-M.
  • Worked extensively with slowly changing dimensions.
  • Experience working in small teams (Agile environment).
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, system integration and user acceptance testing.

TECHNICAL SKILLS

Operating Systems: Windows 2008/2007/2005/NT/XP, UNIX, MS-DOS

ETL Tools: Informatica PowerCenter 9.5.1/9.1/8.6.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager and Informatica Server), IDQ, MDM, Ab Initio.

Databases: Oracle 11g/10g, MS SQL Server 2008/2005, DB2 v8.1, Teradata.

Data Modeling tools: Erwin, MS Visio

Languages & Tools: SQL, JDBC, PL/SQL, COBOL, UNIX/Linux shell scripts, SOAP UI, Web Services, JavaScript, HTML, XML/XSD, Eclipse

Scheduling Tools: Autosys, Control-M, Maestro

Testing Tools: Quality Center, Test Director, Clear Test, ClearCase.

PROFESSIONAL EXPERIENCE

Confidential, Plano, Texas

EIM/ETL Developer

Responsibilities:

  • Handled all aspects of delivering key enterprise release initiatives for Enterprise Contact Management (ECM), including gathering requirements from the business, impact analysis, estimation, technical solution and design, build, test, and deployment of data loads from external systems using SQL*Loader (SQLLDR).
  • Involved in preparing the mapping documents.
  • Wrote .IFB files and SQL*Loader control (.CTL) files to load data from the legacy system into the Siebel system.
  • Extensively used PL/SQL to write stored procedures and packages for validating and manipulating data (a brief illustrative sketch follows this section).
  • Used TOAD 12.1 to develop Oracle PL/SQL, DDL, and stored procedures; performed fine-tuning of SQL and PL/SQL stored procedures.
  • Maintained and supported the production environment for the ECM application, addressing production issues within Service Level Agreements (SLAs), resolving production defects, performing ongoing maintenance activities, and keeping all system documents up to date.
  • Addressed performance issues by analyzing queries from the backend to evaluate options such as creating indexes or stored outlines, while checking Siebel Tools configuration options in parallel, and responded to related business queries.
  • Used existing ETL standards to develop these mappings.
  • Developed mappings, transformations, and mapplets using Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica PowerCenter.
  • Imported source and target tables from the respective databases, created reusable transformations (Joiner, Router, Lookup, Rank, Filter, Expression, Aggregator) inside a mapplet, and created new mappings using the Designer module of Informatica.
  • Extensively worked with join types such as normal join, full outer join, master outer join, and detail outer join in the Joiner transformation.
  • Debugged and troubleshot sessions using the Debugger and Workflow Monitor.
  • Implemented various loads, such as daily, weekly, and quarterly loads, using an incremental loading strategy.
  • Performed unit testing before loading the data into the target.
  • Used UNIX scripts to access Oracle Data.
  • Apart from the key enterprise releases, responsible for handling special projects critical to the application, including impact analysis, estimation, design, build, testing, and deployment of RFCs.

Environment: Siebel Tools, Siebel Client 8.1.1, Informatica 9.6, Siebel EIM, eScript, Workflow Manager, Siebel EAI, SQL Developer, Oracle 11g, TOAD 12.2, Maestro, UNIX Shell Scripting.
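The PL/SQL validation work above was packaged in stored procedures called from the load process. As a minimal sketch only (stg_contact, contact_errors, and the rules shown are hypothetical stand-ins for the actual ECM packages), such a procedure might look like this:

CREATE OR REPLACE PROCEDURE validate_stg_contact AS
BEGIN
  -- Flag staged rows that fail basic completeness checks before the EIM load.
  INSERT INTO contact_errors (row_id, error_msg, logged_at)
  SELECT s.row_id, 'Missing last name or email', SYSDATE
    FROM stg_contact s
   WHERE s.last_name IS NULL
      OR s.email IS NULL;

  -- Normalize phone numbers so downstream matching behaves consistently.
  UPDATE stg_contact
     SET phone = REGEXP_REPLACE(phone, '[^0-9]', '')
   WHERE phone IS NOT NULL;

  COMMIT;
END validate_stg_contact;
/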

Confidential, Ontario, CA

ETL Developer/ Production support

Responsibilities:

  • Developed new and modified existing complex Informatica PowerCenter Mappings to extract the data according to the guidelines provided by the business users and populate the data into Target Systems.
  • Developed CDC mappings using Informatica Power Exchange 9.5.1
  • Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables and Session Parameters.
  • Worked on different tasks in Workflows, such as Session, Event Raise, Event Wait, Decision, E-mail, Command, and Worklets.
  • Used Workflow Manager for creating, validating, testing, and running sequential and parallel workflows for initial and incremental loads.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, grain, dimensions and measured facts.
  • Responsible for analyzing the system to identify the root cause for the issue/bug raised by the Application owner or Business user.
  • Developed complex Informatica mappings to implement a Change Data Capture mechanism using Type-2 effective date and time logic to perform upsert operations at the database level (a simplified sketch of the equivalent SQL follows this section).
  • Extensively used Teradata TPump and MultiLoad to load data.
  • Created shortcuts to Sources and Targets in Shared folders to maintain uniformity across the repositories and allow other users to reuse the same shortcut in their mapping.
  • Developed Unix Shell scripts and used them in Informatica Pre- & Post- Session command tasks and Standalone Command tasks.
  • Responsible for pulling data from XML files, flat files (fixed-width and delimited), and COBOL files using transformations such as Normalizer and XML Source Qualifier.
  • Applied Master Data Management (MDM) concepts, with extensive experience in designing, managing, and administering MDM.
  • Shared knowledge with business users on working with the MDM interface: keying in master data, authorizing it, and publishing it.
  • Provided data security for the MDM categories so that only users with the assigned role can enter master data in the MDM interface and authorize or approve it.
  • Performed landing, staging, loading, and matching of data using MDM.
  • Extracted data from various sources such as MS SQL Server and Oracle and supported the loads in production.
  • Involved in Performance Tuning of SQL Queries, Sources, Targets and sessions by identifying and rectifying performance bottlenecks.
  • Performed unit and integration testing of various Informatica mappings and was involved in ETL testing.
  • Worked on advanced Informatica concepts, including implementation of Informatica Pushdown Optimization.
  • Scheduled Informatica Jobs through Autosys scheduling tool.
  • Created and maintained all the project documentation using VSS (Visual Source Safe).

Environment: Informatica PowerCenter 9.5.1, Informatica PowerExchange 9.5.1, Oracle 11g, Teradata, Erwin, PL/SQL, UNIX Shell Scripting, Autosys, MS SQL Server 2005, TOAD, MS Visio, Windows XP
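The Type-2 change-data-capture logic referenced above reduces to expiring the current dimension row and inserting a new version. The work itself was implemented in Informatica mappings; the SQL below is only an illustrative equivalent, with hypothetical stg_customer/dim_customer tables and with NULL handling omitted.

-- Step 1: close out the current version of any customer whose attributes changed.
UPDATE dim_customer d
   SET d.eff_end_ts   = SYSTIMESTAMP,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.name <> d.name OR s.address <> d.address));

-- Step 2: insert a new current version for changed and brand-new customers.
INSERT INTO dim_customer (customer_key, customer_id, name, address,
                          eff_start_ts, eff_end_ts, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.name, s.address,
       SYSTIMESTAMP, NULL, 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y'
                      AND d.name = s.name
                      AND d.address = s.address);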

Confidential, Irvine, CA

ETL Developer

Responsibilities:

  • Involved in the ETL architectural design and the preparation of the technical specs based on the business requirements and high interaction with the business users.
  • Guided the team in following the Agile software development methodology, maintaining coding standards and delivering work on time in a systematic manner.
  • Created complex mappings and mapplets using advanced transformations such as Filter, Router, connected/unconnected Lookup, Aggregator, Joiner, Sequence Generator, Update Strategy, and Expression.
  • Used Informatica PowerExchange 9.5.1 extensively for Change Data Capture (CDC) to capture daily changes on the AS/400 system.
  • Created and modified stored procedures and functions in Oracle to be called from Informatica.
  • Configured sessions for pushdown optimization to improve performance.
  • Ran sessions with error-handling strategies and used the Collect Performance Data and verbose data options to identify errors in the mappings.
  • Performed unit testing and was closely involved in integration testing.
  • Created deployment groups and migrated code across repositories.
  • Prepared documentation templates for the team and documented my own code.
  • Wrote SQL and PL/SQL queries to understand the data and complete the required development work.
  • Followed the Ralph Kimball warehouse methodology with Star and Snowflake schemas.
  • Used Informatica Command, E-mail, Event Wait, and Event Raise tasks to schedule the Informatica workflows and to send e-mails to the support groups on successful or failed completion of the workflows.
  • Used Informatica Data Quality (IDQ) for data standardization, validation, enrichment, de-duplication, and consolidation, working with its data cleansing, matching, conversion, exception handling, reporting, and monitoring capabilities (an illustrative de-duplication query follows this section).
  • Designed, developed, implemented, and maintained Informatica PowerCenter and IDQ applications for the matching and merging process.
  • Used IDQ for initial data profiling and for matching and removing duplicate data; exported IDQ mappings and mapplets to PowerCenter and automated the scheduling process.
  • Experienced with DVO and IDQ for data analysis, data profiling, and data governance.

Environment: Informatica PowerCenter 9.5.1, Oracle 11g, DB2, SQL Server 2005, Teradata, flat files, SQL*Loader, Informatica IDQ, Ralph Kimball warehouse methodology, HTML 4.0, CSV files, SQL, PL/SQL, MS SQL Server, TOAD 9.1.2, Erwin 4.0/3.5.2, BTEQ, Windows NT, MS Office, XML sources, UNIX Shell Scripting.
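De-duplication in IDQ is configured through its own match rules and mapplets rather than hand-written SQL; purely to illustrate the underlying idea (the stg_customer table and grouping columns are hypothetical), a simple analytic duplicate check looks like this:

-- Rank records within each candidate-duplicate group; dup_rank > 1 marks likely duplicates.
SELECT customer_id,
       name,
       birth_date,
       postal_code,
       ROW_NUMBER() OVER (PARTITION BY UPPER(TRIM(name)), birth_date, postal_code
                          ORDER BY load_ts DESC) AS dup_rank
  FROM stg_customer;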

Confidential

ETL Informatica Developer

Responsibilities:

  • Gathered requirements from business users and created low-level technical design documents from the high-level design document.
  • Participated in the design meetings, and prepared technical and mapping documentation.
  • Designed ETL specification documents for all the projects.
  • Created tables, keys (unique and primary), and indexes in SQL Server (a brief DDL sketch follows this section).
  • Extracted data from flat files, DB2, SQL Server, and Oracle to build an Operational Data Store (ODS), and applied business logic to load the data into the global data warehouse.
  • Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.
  • Maintained source and target mappings, transformation logic and processes to reflect the changing business environment over time.
  • Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop robust mappings in the Informatica Designer.
  • Extensively used the Add Currently Processed Flat File Name port to load the flat file name, and the contract number derived from the file name, into the target.
  • Worked on complex Source Qualifier queries, Pre and Post SQL queries in the Target.
  • Worked on different tasks in Workflow Manager, such as Session, Event Raise, Event Wait, Decision, E-mail, Command, Worklet, Assignment, Timer, and scheduling of the workflow.
  • Extensively used workflow variables, mapping parameters and mapping variables.
  • Created sessions, batches for incremental load into staging tables and scheduled them to run daily.
  • Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
  • Implemented Informatica recommendations, methodologies and best practices.
  • Implemented performance tuning logic on Targets, Sources, Mappings and Sessions to provide maximum efficiency and performance.
  • Involved in Unit, Integration, System, and Performance testing levels.
  • Written documentation to describe program development, logic, coding, testing, changes and corrections.

Environment: Informatica PowerCenter 8.6.1, Oracle 10g, SQL Server 2008, IBM iSeries (DB2), MS Access, Windows XP, TOAD, Cognos 8.4.1, SQL Developer.
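As a small illustration of the DDL work mentioned above (table and column names are hypothetical), a SQL Server table with a primary key, a unique key, and a supporting index can be created as follows:

-- Hypothetical staging table with a surrogate primary key and a unique business key.
CREATE TABLE dbo.stg_policy (
    policy_key     INT IDENTITY(1,1) NOT NULL,
    policy_number  VARCHAR(20)       NOT NULL,
    customer_id    INT               NOT NULL,
    effective_date DATE              NOT NULL,
    CONSTRAINT pk_stg_policy PRIMARY KEY (policy_key),
    CONSTRAINT uq_stg_policy_number UNIQUE (policy_number)
);

-- Non-clustered index to speed up lookups by customer during the load.
CREATE NONCLUSTERED INDEX ix_stg_policy_customer
    ON dbo.stg_policy (customer_id, effective_date);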

Confidential

ETL Developer

Responsibilities:

  • Designed the ETL architecture for the process flow from source to target, including automation through wrapper shell scripts.
  • Worked as team lead in gathering requirements, analysis and coordinating with development team on the efforts and timelines.
  • Involved in team meetings with business users & analysts.
  • Scheduled Informatica Workflows using AutoSys and AppWorks.
  • Used Informatica PowerCenter to extract data from externally received files and load it into the data mart.
  • Design, develop, implement, and assist in validating ETL processes.
  • Wrote UNIX shell scripts to automate daily jobs on the Informatica server.
  • Involved in performance tuning at various levels, including target, source, mapping, and session, for large data files (a brief source-level tuning sketch follows this section).
  • Prepared Informatica mapping specification documents for the mappings developed, following business documentation standards.
  • Handled data using Slowly Changing Dimension Type I and Type II logic.
  • Used PuTTY to transfer, copy, and move files on UNIX and to assign file privileges such as read, write, and execute.
  • Scheduled and ran extraction and load processes, and monitored tasks and workflows using the Workflow Monitor.
  • Created and executed unit test plans based on system and validation requirements.
  • Worked with Oracle GoldenGate to handle daily data updates from the source system.

Environment: Informatica Power Center, Oracle 10g, PL/SQL, SQL*Plus, SQL Loader, Flat files, FCD files, Windows NT, HP-UX, Shell Scripting.
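Source-level tuning of the kind mentioned above usually starts with the execution plan of the extraction query. As a sketch only (src_orders, its columns, and the index are hypothetical), the check in Oracle looks like this:

-- Inspect the plan of a hypothetical incremental-extract query.
EXPLAIN PLAN FOR
SELECT order_id, customer_id, order_amount
  FROM src_orders
 WHERE last_update_ts >= TO_DATE('2010-01-01', 'YYYY-MM-DD');

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- If the plan shows a full scan on src_orders, an index on the filter column may help.
CREATE INDEX ix_src_orders_upd_ts ON src_orders (last_update_ts);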
