Data Warehouse ETL Developer Resume Profile

Summary:

  • IT professional with 9 years of experience in the design, development, and implementation of ETL and RDBMS projects for the Financial, Banking, Pharmacy, Insurance, and Utilities industries.
  • Experience in data warehousing, ETL architecture, and data profiling using Informatica PowerCenter 9.1/8.6/8.5/8.1/7.1/6.2 client and server tools, and in designing and building Enterprise Data Warehouses and Data Marts.
  • Adept at understanding business processes / requirements and implementing them through mappings and transformations.
  • Involved in database design, entity-relationship modeling, and dimensional modeling using Star and Snowflake schemas.
  • Extensively worked with mappings using transformations such as Filter, Joiner, Router, Source Qualifier, Expression, Union, Update Strategy, Unconnected/Connected Lookup, Aggregator, and SCD Type 2.
  • Experience in tuning Mappings and Sessions for better Performance.
  • Experience in loading various data sources, including Oracle, SQL Server, Teradata, DB2, and flat files, into data marts. Experience in requirements gathering and documentation.
  • Worked in Production support team for maintaining the mappings, sessions and workflows to load the data in Data warehouse.
  • Experience in performance tuning and debugging of existing ETL processes.
  • Experience in preparing, scheduling, and running sessions/tasks, workflows, and batch processes using Workflow Manager or the pmcmd command-line utility (see the sketch after this list).
  • Experience in Oracle 10g/9i/8i.
  • Experience in writing triggers, stored procedures, functions, and packages using PL/SQL.
  • Experience in OLAP tools such as Business Objects 6/5.1 and Web Intelligence 6/2.6.
  • Experience in the Electronics, Financial, Health Care, Insurance, Wireless, Agricultural, and Mortgage industries.
  • Experience in UNIX Shell Scripting.
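
A minimal sketch of launching a workflow with pmcmd from a shell script, as mentioned above; the service, domain, folder, and workflow names are hypothetical placeholders:

    #!/bin/sh
    # Start an Informatica workflow from the command line using pmcmd.
    # Service, domain, folder, and workflow names are illustrative only.
    pmcmd startworkflow \
        -sv Int_Svc_DW -d Domain_DW \
        -u etl_user -p "$INFA_PWD" \
        -f DW_LOADS \
        -wait wf_load_customer_dim

    # pmcmd returns a non-zero exit status if the workflow fails to run.
    if [ $? -ne 0 ]; then
        echo "wf_load_customer_dim failed" >&2
        exit 1
    fi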

PROFESSIONAL EXPERIENCE

Confidential

Informatica/ETL Developer

  • ACS operates Federal Government education projects, and the Direct Loan Servicing System is one of ACS Education Services, LLC's major Department of Education projects. The system maintains student Direct Loans throughout the United States and provides high-level customer service for more than 16 million borrowers. It undergoes continuous improvement as the department's requirements change to implement new schemes and rebates for students.
  • The Debt Collection Management System (DCMS) is the name used to refer to what was formerly known as the Federal Family Education Loan (FFEL) Program, operated on behalf of the Department of Education. The system ran on an IBM legacy platform with 6 million borrower accounts. To modernize and reduce the cost of maintaining legacy systems, the DCMS project is being migrated to Windows SQL Server 2008 R2.

Responsibilities:

  • Involved in design, planning, and implementation at the highest level of performance; functionally analyzed the application domains, participated in knowledge transfers from dependent teams to understand the business activities and application programs, and documented the findings for internal team reference.
  • Interacted with functional and end users to gather requirements for the core reporting system, understand the features users expected from the ETL and reporting system, and implement the business logic.
  • Studied the detailed requirements of the system's end users and their expectations of the applications.
  • Involved in data analysis for source and target systems; good understanding of data warehousing concepts, including staging tables, dimensions, facts, and Star and Snowflake schemas.
  • Business process re-engineering to optimize the IT resource utilization.
  • Integrated various data sources, including Oracle 10g, SQL Server 2005/2008, fixed-width and delimited flat files, DB2, COBOL files, and XML files.
  • Transformed data from sources such as Excel and text files into the reporting database to support the analytical reporting system.
  • Initiated data modeling sessions to design, build, and extend data mart models supporting the reporting needs of applications.
  • Involved in data extraction from Oracle and flat files using SQL*Loader; designed and developed mappings using Informatica (see the SQL*Loader sketch after this list).
  • Extensively used different lookup types, such as cached (in-core) and flat-file lookups, and used the Debugger to test mappings and fix bugs.
  • Involved in fixing of invalid Mappings, Performance tuning, testing of Stored Procedures and Functions, Batches and the Target Data.
  • Developed Slowly Changing Dimension (SCD) Type 2 logic for loading data into dimensions and facts.
  • Involved in data extraction from Oracle, flat files, and XML files using SQL*Loader and freehand SQL.
  • Used TOAD to increase developer productivity and application code quality.
  • Developed and tested all the Informatica mappings, Processes and workflows - involving several Tasks.
  • Imported metadata from different sources, such as relational databases, XML sources, and Impromptu catalogues, into Framework Manager.
  • Conducted and participated in process-improvement discussions, recommended possible outcomes, and focused on production application stability and enhancements.
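
A minimal sketch of the SQL*Loader step referenced above, wrapped in a shell script; the control file, table, and file names are hypothetical:

    #!/bin/sh
    # Load a pipe-delimited flat-file extract into an Oracle staging table.
    # Connection details come from the environment; all names are illustrative.
    cat > stg_borrower.ctl <<'EOF'
    LOAD DATA
    INFILE 'borrower_extract.dat'
    APPEND
    INTO TABLE stg_borrower
    FIELDS TERMINATED BY '|' TRAILING NULLCOLS
    (borrower_id, loan_id, loan_amount, last_updt_dt DATE 'YYYY-MM-DD')
    EOF

    sqlldr userid=$ORA_USER/$ORA_PWD@$ORA_SID \
           control=stg_borrower.ctl \
           log=stg_borrower.log bad=stg_borrower.bad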

Environment: Informatica PowerCenter 9, Oracle 10g/9i, SQL Server 2005/2008, PL/SQL, T-SQL, Erwin, TOAD, SQL*Plus, SQL*Loader, Windows, UNIX

Confidential

Sr. ETL Developer

  • Avnet, one of the world's largest transnational distributors of electronic parts, enterprise computing and storage products, and embedded subsystems, provides a vital link in the technology supply chain.
  • I worked on QRT, a quoting-system project for determining the current availability of parts with the best possible quote.

Responsibilities:

  • Gathered requirement changes from the functional team and incorporated them into Informatica and Business Objects.
  • Interacted with business users and the Data Architect on ongoing changes to the data warehouse design.
  • Designed the ETL processes using the Informatica tool to load data from Oracle and flat files into the target Oracle database.
  • Followed and implemented BI/DW standards at various levels of the SDLC.
  • Developed complex mappings in Informatica to load data from various sources using transformations such as Source Qualifier, Expression, Lookup (connected and unconnected), Aggregator, Update Strategy, Filter, Router, and Transaction Control.
  • Used Informatica Workflow Manager to create, schedule, and monitor workflows, and to send messages in case of process failures.
  • Designed SQL queries with multiple joins to pull related data during the import stage.
  • Designed and modified PL/SQL Stored Procedures to modify data flow.
  • Used triggers to enforce business rules.
  • Developed FTP scripts to send data extracts to various downstream applications using Informatica (see the sketch after this list).
  • Provided support for user BO report issues and Informatica loading issues.
  • Tuned and improved the performance of Informatica jobs. Translated business requirements into Informatica mappings. Involved in unit testing of mappings.
  • Delegated and tracked change requests in Informatica.
  • Created transformation routines to transform and load the data. Developed processes to automate data loading using parameter-driven sessions for batch schedules, and for verification and reconciliation of data stored in several different source systems.
  • Worked with analysts and data source systems experts to map requirements to ETL code.
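
A minimal sketch of the kind of post-load FTP delivery script mentioned above; the host, account, and paths are hypothetical placeholders:

    #!/bin/sh
    # Ship the day's quote extract to a downstream application via FTP.
    # Host name, login, and directories are illustrative only.
    EXTRACT=/data/out/qrt_quotes_$(date +%Y%m%d).csv

    ftp -n downstream.host.example <<EOF
    user ftpuser $FTP_PWD
    ascii
    cd /inbound/qrt
    put $EXTRACT
    bye
    EOF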

Environment: Informatica PowerCenter 9.1/8.6, TOAD, PL/SQL Developer, Data Mining, Oracle, DB2, Teradata, Erwin 4.0, Windows 2000, XML, SQL, PL/SQL, UNIX/Perl/shell scripting.

Confidential

Sr. Informatica Developer

Responsibilities:

  • Worked with the business users, business analyst and end clients in gathering requirements.
  • Created/modified business requirement documentation.
  • Created detailed ETL specifications for design.
  • Extracted Data from EDW/EDM as per specification for End Client requests.
  • Sent eligible subscriber data to a third-party vendor to receive additional data for marketing research.
  • Used complex SQL logic to extract data from EDW.
  • Created complex views to pull data from source database.
  • Tuned SQL by running parallel queries after analyzing execution plans.
  • Used inline queries to pull current data for a subscriber/member.
  • FTP'd files to the Informatica server using the FTP Connection session property.
  • Pulled data from the mainframe systems and imported copybooks to the Informatica staging area.
  • Created mappings by using Joiner, Lookups, Expressions, Router, Normalizer and other Transformations.
  • Created stored procedures and functions and called them through Informatica.
  • Extensive use of the Informatica Debugger to identify and fix issues.
  • Performed Unit testing before sending code to test environment.
  • Created unit test cases and uploaded on Intranet.
  • Automated sessions and workflows using JCL.
  • Used pre-session commands to create empty files on UNIX with a timestamp appended to the file name (see the sketch after this list).
  • Used Command tasks to call UNIX scripts.
  • Created targets as delimited flat files, ASCII files, CSV files, and Oracle tables.
  • Supervised interns to assist their understanding of data models, Oracle, Informatica, and UNIX.
  • Worked extensively with the Teradata database and Teradata utilities, including MLOAD, FLOAD, BTEQ scripts, and TPUMP.
  • Analyzed tables on a regular basis for the Oracle optimizer.
  • Identified long-running database queries, refined them for better performance, and worked with end users to implement the changes.
  • Analyzed SQL queries causing performance problems.
  • Created and rebuilt indexes and optimized code to improve performance after analyzing trace files.
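
A minimal sketch of the pre-session command mentioned above, which creates an empty, timestamped trigger file on UNIX; the path and file name are hypothetical:

    #!/bin/sh
    # Create an empty control/trigger file whose name carries a timestamp,
    # signalling downstream jobs that the session has started.
    STAMP=$(date +%Y%m%d_%H%M%S)
    touch /data/ctl/member_extract_$STAMP.trg

In practice a one-liner like this can also be entered directly in the session's pre-session command property rather than kept as a separate script.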

Environment: Informatica PowerCenter 8.6.1/8.1.1, Oracle 10g, SQL*Plus, PL/SQL, SQL Developer, UNIX shell scripting, Mainframes, JCL, AIX 5.x

Confidential

Sr. ETL Analyst/Programmer

Sales and marketing Data Warehouse

The Sales and Marketing Data Warehouse is an extraction of data from the Siebel sales force application and legacy systems. The other sources integrated are DDD and Xponent data from IMS, loaded into an ODS (Operational Data Store). This central location of data makes it easier for Roche to acquire tools to create and manage common reports, so that each group can benefit from or leverage work done by other groups. The existing mainframe system was replaced by the data warehouse, which saves the company millions of dollars every year.

Responsibilities:

  • Involved in data modeling and design of the data warehouse using a star schema methodology with conformed, granular dimensions and fact tables.
  • Using Informatica Repository Manager, maintained the repositories of various applications and created users, user groups, and security access controls.
  • Analyzed existing transactional database schemas and designed star schema models to support users' reporting needs and requirements.
  • Created Informatica Mappings to load data using transformations like Source Qualifier, Sorter, Aggregator, Expression, Joiner, Connected and Unconnected lookups, Filters, Sequence, Router and Update Strategy.
  • Worked on Xponent Data provided by IMS.
  • Extensively used SQL*Loader and Informatica to extract, transform, and load data from MS SQL Server, flat files, and Oracle into Oracle.
  • Extracted data from the mainframe system using PowerExchange, and extracted DB2 tables using the Shadow Direct tool.
  • Performed security and user management and repository backups using the same tool.
  • Implemented Slowly Changing Dimensions (SCDs), both Type 1 and Type 2.
  • Cleansed the source data, extracted and transformed data with business rules, and built reusable mappings, known as 'Mapplets' using Informatica Designer.
  • Involved in the development of Informatica mappings and also performed tuning for better performance.
  • Used parallel processing capabilities, session partitioning, and target table partitioning utilities.
  • Wrote a UNIX process for loading the zipped version of files, improving load time and saving disk space (see the sketch after this list).
  • Extensively worked on tuning on both the database and Informatica sides, improving load time.
  • Extensively used PL/SQL for creating packages, procedures and functions.
  • Automated the entire process using UNIX shell scripts.
  • Used Autosys to schedule UNIX shell scripts, PL/SQL scripts, and Informatica jobs.
  • Wrote UNIX shell scripts for moving data from all source systems to the data warehousing system. The data was standardized to store various business units in tables.
  • Tested all the applications, transported the data to the target warehouse Oracle tables, and used the Test Director tool to report bugs and track their fixes.
  • Tested the target data against source system tables by writing QA procedures.
  • Created Migration Documentation and Process Flow for mappings and sessions.
  • Used Autosys as the job-scheduling tool.
  • Presented the advanced features of Business Objects to users and developers to enable them to develop queries and reports easily.
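
A minimal sketch of the zipped-file load mentioned above: the gzipped extract is streamed through a named pipe into SQL*Loader so it never has to be unzipped to disk. File and control-file names are hypothetical:

    #!/bin/sh
    # Stream a gzipped extract into SQL*Loader via a named pipe,
    # saving the disk space an unzipped copy would need. Names are illustrative.
    PIPE=/tmp/sales_load.pipe
    mkfifo $PIPE

    gunzip -c /data/in/sales_extract.dat.gz > $PIPE &

    sqlldr userid=$ORA_USER/$ORA_PWD@$ORA_SID \
           control=sales_load.ctl data=$PIPE \
           log=sales_load.log

    rm -f $PIPE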

Environment: Informatica PowerCenter 5.1/6.2, HP-UX, Windows NT, DB2, AS/400, Oracle 8i, SQL, PL/SQL, SQL*Loader, TOAD, SQL Navigator.

Confidential

Data Warehouse ETL Developer

Description:

ECTL of Oracle tables to Teradata using Informatica and BTEQ scripts. Migrated SAS code to Teradata BTEQ scripts. The goal was to predict current high-value sellers who are likely to churn based on their similarities to past high-value sellers, and to identify the dollar opportunity in reducing churn of high-value sellers.

Responsibilities:

  • Responsible for preparing the technical specifications from the business requirements.
  • Analyzed the requirements and worked out solutions. Developed and maintained detailed project documentation.
  • Used Informatica to generate flat files for loading data from Oracle to Teradata, and BTEQ/FastLoad scripts for incremental loads (see the BTEQ sketch after this list). Used the stage/work/DW table concept to load data and applied star schema concepts. Created UDFs in the Java transformation to complete some tasks.
  • Designed, developed, and implemented the ECTL process for the Marketing team for existing tables in Oracle. Wrote BTEQ scripts to support the project.
  • Wrote stored procedures in PL/SQL and UNIX shell scripts for automated execution of jobs.
  • Used a version control system (ClearCase) to manage code in different code streams.
  • Performed data-oriented tasks on Master Data projects, especially Customer/Party: standardizing, cleansing, merging, de-duplicating, and determining survivorship rules.
  • Responsible for the design, development, testing, and documentation of Informatica mappings, PL/SQL, and transformation jobs based on PayPal standards.
  • Initiated, defined, managed, and enforced DW data QA processes; interacted with the QA and data quality teams.
  • Identified opportunities for process optimization, process redesign, and development of new processes.
  • Anticipated and resolved data integration issues across applications, and analyzed data sources to highlight data quality issues. Performed performance analysis for Teradata scripts.
  • Migrated SAS code to Teradata BTEQ scripts to compute scores, taking into account parameters such as login details and transaction amounts. Worked with marketing data for various reports.
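
A minimal sketch of a shell-driven BTEQ incremental load of the kind described above; the logon file, database, table, and column names are hypothetical:

    #!/bin/sh
    # Append only rows newer than the last load to the DW table.
    # The .tdlogon file holds the .LOGON tdpid/user,password line.
    bteq <<'EOF'
    .RUN FILE = /home/etl/.tdlogon;

    INSERT INTO dw_db.seller_fact
    SELECT s.*
    FROM   stg_db.seller_stg s
    WHERE  s.load_dt > (SELECT COALESCE(MAX(load_dt), DATE '1900-01-01')
                        FROM   dw_db.seller_fact);

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .QUIT 0;
    EOF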

Environment: Oracle 9i, Informatica PowerCenter 8.1, PL/SQL, Teradata V2R6, Teradata SQL Assistant, FastLoad, BTEQ scripts, SAS code, ClearCase, Perl scripts, XML sources.

Confidential

ETL Developer

The customer has announced its plan to acquire two natural gas companies, which will be integrated into the operation of the customer's utilities. Among the more complex elements of this acquisition will be the transition into the customer's IT infrastructure: the customer wants to transfer data from the current systems of these companies to its CIS system. Informatica 8.1 is used as the ETL tool for the integration process, and Mercury Quality Center is used for test management.

Responsibilities:

  • Understood the functional requirements.
  • Created a user ID and folder for each new developer account, along with the necessary privileges.
  • Developed mappings in Informatica to load data from various sources into the data warehouse using transformations such as Source Qualifier, Expression, Filter, Aggregator, Update Strategy, Lookup, Sequence Generator, Joiner, and Normalizer.
  • Assisted in Informatica administration.
  • Managed domains, nodes, grids, the Integration Service, the Repository Service, and folders through the Admin Console.
  • Took backups of the Informatica repository database (see the pmrep sketch after this list).
  • Managed the Integration Service on grid and on node.
  • Used Informatica Workflow Manager to create, schedule, execute, and monitor sessions and workflows.
  • Developed UNIX shell scripts as part of the ETL process to schedule tasks/sessions.
  • Migrated data mappings to production and monitored, troubleshot, and restarted batch processes using Informatica. Handled migration from dev to test and from test to production.
  • Performed end-to-end integration testing of the ETL process from source system to target data mart.
  • Worked with memory caches, both static and dynamic, for better throughput of sessions containing Rank, Lookup, Joiner, Sorter, and Aggregator transformations.
  • Scheduled sessions and batches on the Informatica server using Workflow Manager.
  • Involved in unit testing of mappings and mapplets, as well as integration testing and user acceptance testing.
  • Used the Debugger to run debug sessions, setting breakpoints across instances to verify the accuracy of data.
  • Experience in coding using SQL, SQL*Plus, and PL/SQL procedures/functions.
  • Extensively used ETL to load data with PowerCenter/PowerConnect from source systems such as flat files and Excel files into staging tables, and then loaded the data into the target database.
  • Created mappings based on procedure logic to replace procedures and functions.
  • Involved in fixing invalid mappings and in testing stored procedures and functions, Informatica sessions, and the target data.
  • Backed up Informatica and UNIX systems.
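
A minimal sketch of a scripted repository backup with pmrep, as referenced above; the repository, domain, account, and path names are hypothetical:

    #!/bin/sh
    # Connect to the repository and write a dated backup file.
    # Repository, domain, account, and paths are illustrative only.
    pmrep connect -r REP_DW -d Domain_DW -n admin_user -x "$REP_PWD"

    pmrep backup -o /infa/backup/REP_DW_$(date +%Y%m%d).rep

    if [ $? -ne 0 ]; then
        echo "Repository backup failed" >&2
        exit 1
    fi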

Confidential

ETL Consultant

Description: Philips' concentration in the United States mirrors its global focus on lifestyle, health care, and technology. The main objective of this project is to analyze the sales data for Philips' products and customers. The POS sales data contains sales, product, inventory balance, and demographic information for Philips' U.S. operations.

Responsibilities:

  • Analyzed requirements and functional documents, explained them to the team, and imparted business knowledge to the team.
  • Worked on dimension as well as fact tables; developed mappings and loaded data onto the relational database.
  • Augmented mappings to take care of newly emerging requirements.
  • Developed mapping interfaces and a data transfer strategy from data sources to target systems.
  • Developed mappings in multiple schema databases to load incremental data into dimensions (see the incremental-extract sketch after this list).
  • Extensively used filter expressions on the source database to filter out invalid data.
  • Extensively used ETL to load data from flat files, both fixed-width and delimited, and from the relational database, Oracle 9i. Involved in data analysis for source and target systems, with a good understanding of data warehousing concepts: staging tables, dimensions, facts, and Star and Snowflake schemas.
  • Good exposure to normalized and denormalized data.
  • Experience in integrating various data sources such as Oracle, SQL Server, fixed-width and delimited flat files, COBOL files, and XML files.
  • Developed and tested all the Informatica mappings, sessions, and workflows, involving several tasks.
  • Worked extensively on different types of transformations, such as Source Qualifier, Expression, Filter, Aggregator, Update Strategy, Lookup, Sequence Generator, and Joiner.
  • Analyzed the session and error logs for troubleshooting mappings and sessions.
  • Designed universes with data modeling techniques: implementing contexts, aggregate awareness, hierarchies, predefined conditions, and linking; joining tables and indicating cardinalities; creating aliases to resolve loops; subdividing into contexts; and creating objects grouped into classes.
  • Generated reports using Desk-I and InfoView with analytical ability and creativity, using multi-data providers, synchronization, BO formulas, variables, sections, breaks, formatting, drill-downs, hyperlinks, etc.
  • Used Informatica Version Control for checking in all versions of the objects used in creating the mappings, workflows to keep track of the changes in the development, test and production environment.
  • Responsible for providing development support to the offshore team; also involved in changing the business logic (the Ralph Kimball approach was preferred) depending on client requirements.
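
A minimal sketch of the incremental extract referenced above: only rows changed since the last successful load are pulled. The last-run date file, table, and column names are hypothetical:

    #!/bin/sh
    # Spool only dimension rows changed since the last run to a delimited file.
    # The last-run date file, table, and columns are illustrative only.
    LAST_RUN=$(cat /data/ctl/last_run_date.txt)   # e.g. 2006-03-01

    sqlplus -s $ORA_USER/$ORA_PWD@$ORA_SID <<EOF
    SET PAGESIZE 0 FEEDBACK OFF HEADING OFF TRIMSPOOL ON
    SPOOL /data/out/product_dim_delta.dat
    SELECT product_id || '|' || product_name || '|' || last_updt_dt
    FROM   src.product
    WHERE  last_updt_dt > TO_DATE('$LAST_RUN', 'YYYY-MM-DD');
    SPOOL OFF
    EXIT
    EOF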

Environment: Informatica 7.1.2, BO 6.5, Oracle 9i, UNIX

Confidential

ETL Informatica Developer

This project helps the Government of India increase crop productivity, improve the quality of agricultural produce, conserve water through sustainable use, and achieve higher fertilizer-use efficiency, saving fertilizer and labour expenses. It was an offshore project.

Responsibilities:

  • Contributed in the development of system requirements and design specifications
  • Participated in the design and development of dimensional modeling.
  • Developed complex mappings in Informatica to load data from various sources using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Joiner, Filter, and Router.
  • Developed Mapplets to implement business rules using complex logic
  • Converted the PL/SQL Procedures and SQL Loader scripts to Informatica mappings
  • Tuned the Sessions for better performance by eliminating various performance bottlenecks
  • Created and scheduled sessions and batches through the Informatica Server Manager. Wrote UNIX shell scripts to automate the data transfer (FTP) process to and from the source systems and to schedule weekly and monthly loads/jobs (see the cron sketch below).
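
Illustrative crontab entries for the weekly and monthly load schedules mentioned above; the script and log paths are hypothetical:

    # m h dom mon dow  command
    # Weekly load: 2:00 AM every Sunday
    0 2 * * 0 /home/etl/bin/run_weekly_load.sh >> /home/etl/logs/weekly.log 2>&1
    # Monthly load: 3:00 AM on the 1st of each month
    0 3 1 * * /home/etl/bin/run_monthly_load.sh >> /home/etl/logs/monthly.log 2>&1
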
Environment: Informatica PowerCenter 6.2, Business Objects, Oracle Applications 11i, Oracle 9i, SQL Server, SQL*Loader, HP-UX, ERwin 4.0, Test Director, WinRunner.

Confidential

Software Programmer

Confidential

Software Tester

Confidential

Software Programmer
