
ETL Developer Resume


Ann Arbor, MI

SUMMARY

  • Over 8 years of IT experience in requirements gathering, data analysis, design, development, testing, and implementation of data warehouses as an ETL Developer using Informatica, for the insurance, financial, telecommunications, pharmaceutical, and technology industries.
  • Complete exposure to all stages of SDLC.
  • Skilled in creating entity-relationship and dimensional data models using the Ralph Kimball and Bill Inmon methodologies.
  • Data modeling experience with dimensional data modeling, star schema modeling, fact and dimension tables, and physical and logical data modeling using ERWIN 4.1/3.5 and TOAD 7.6/8.6.1.
  • Strong ETL skills using Informatica as well as DTS and SQL Server 2005 SSIS, SSAS, and SSRS in complex, high-volume data warehousing projects in both Windows and UNIX environments.
  • Extensive experience in coding using SQL, PL/SQL Procedures/Functions, Triggers, Cursors and Packages.
  • Extensive experience in developing strategies for Extraction, Transformation, and Loading (ETL) using Informatica Power Center 8.x/7.x/6.x/5.x in Windows and UNIX environments.
  • Developed complex mappings using Informatica Power Center transformations - Unconnected Lookup, Filter, Expression, Router, Joiner, Update Strategy, Aggregator, Stored Procedure, Sorter, and Sequence Generator.
  • Experience in developing complex Mapplets, re-usable Tasks, re-usable mappings and Slowly Changing Dimensions technique.
  • Experienced in troubleshooting and performance tuning at various levels such as source, mapping, target, and session.
  • Used UNIX shell scripting, cron, FTP, file-management commands, and the pmcmd command-line utility to schedule and control sessions and batches of ETL processes.
  • Experienced in Data Analysis, Dimensional Data Modeling, Data Extraction, Transformation & Loading (ETL), Building Cubes, Creating Meta layer, Data Mining and End User Reporting.
  • Day-to-day responsibilities include reviewing system documentation, identifying development objectives, developing/updating change requests, fixes, defects, and executing functional and regression Informatica batch testing.
  • Experienced in creating and developing Informatica ILM deliverables based on design documents created by the Data Analyst/Architect, using the Data Masking tool.
  • Worked on databases using Oracle 10g/9i/8/7.3, MS SQL Server 2005/2000/7.0/6.5, IBM DB2 7.1/8.2.
  • Experienced in generating OLAP reports using Cognos and Business Objects.
  • Knowledge of Teradata utilities: BTEQ, FastLoad, MultiLoad, FastExport, and TPump.
  • Experience in Business Objects 5.x/6.1b/6.5, XI 6.5.1, XI R2, XI R3.1, Crystal Reports XI-2008, and Crystal Xcelsius 4.5-2008.
  • Experience in designing, building, and maintaining Universes, resolving issues such as loops and traps using Aliases and Contexts, and designing complex objects using the @Prompt and @Aggregate_Aware functions.
  • Expertise with Business Objects products (Supervisor, Designer, Reporter) and Business Objects Server Suite, (CMC, Info View, WEBI, ZABO, BCA).
  • Created dashboards in Xcelsius for the Executive users.
  • A good team player with excellent written and verbal communication skills.
  • Excellent problem-solving skills with a strong technical background and good interpersonal skills.

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 8.x/7.x/6.x/5.x (Power Center/Power Mart: Designer, Workflow Manager, Workflow Monitor, Server Manager, Power Connect, Power Exchange), Data Masking.

OLAP Tools: Cognos Impromptu, Impromptu Web Reports 6.0 (IWR), PowerPlay 6.6, Framework Manager, Business Objects 6.x (Supervisor, CMC, Designer, Desktop Intelligence, WEBI, Crystal Reports XI/9.0/7.0/7.1) through Business Objects XI R3.1, Crystal Reports 2008, and Crystal Xcelsius 2008.

Data Modeling: Erwin, Microsoft Visio, Rational Rose (UML), ER Studio

Operating Systems: Windows NT/2000/2003/XP, UNIX, Solaris, and MS-DOS

Databases: Oracle 11g/10g/9i/8i, SQL Server 2008/2005, DB2, MS Access, Teradata V2R5, SAP R/3.

Tools: TOAD, XML Spy, SQL Developer, MS SQL Server Management Studio, ClearQuest, HP Quality Center 9.0/9.2/10.0, Test Director, $U, Autosys.

Languages: SQL, PL/SQL, C, C++, VB

PROFESSIONAL EXPERIENCE

Confidential, Kew Gardens, NY

Senior ETL Developer

Responsibilities:

  • Involved in reviewing and analyzing Business Requirement Documents and Functional specifications.
  • Worked with Business Analysts and developers while reviewing the Business Requirement Documents and when there were enhancements to the applications.
  • Worked on cross-functional team projects following Agile and Kanban methodologies, with code reviews with other teams for sign-offs and daily morning stand-up meetings.
  • Used FTP/SFTP for data transmission between multiple servers and to local machines.
  • Created Informatica Mappings to load data using transformations like Source Qualifier, Sorter, Aggregator, Expression, Joiner, and Connected and Unconnected lookups, Filters, Sequence, Router and Update Strategy.
  • Used Informatica as an ETL tool to create source/target definitions, mappings, and sessions to extract, transform, and load data into staging tables from various sources.
  • Implemented Slowly Changing Dimensions (SCDs) for implementing Incremental and Full Load Techniques.
  • Used SSIS as an ETL Tool to build packages to process the data from the Source to Target into Data warehouse.
  • Used the SQL Server Management Studio to develop Stored Procedures and Functions to implement Encryption technique.
  • Performed unit testing to ensure successful execution of the logic and correctness of the results.
  • Worked with the ETL Testing Team to perform the Functional, Regression and Integration Testing.
  • Worked with the DBAs on successful deployment to production and on granting security and access permissions to the Data Stewards.
  • Interacted with the Data Stewards regarding the Encryption-Decryption Procedures to educate them about accessing the Data.
  • Involved in setting up the Business Objects Development, Stage and Production Environments.
  • Developed Universes from the marts and cubes to feed data into canned reports, ad-hoc reports, and jump-off reports.
  • Used prompts in the reports as well as in the Universe to create more interactive reports for Data Stewards to see live data.
  • Created Joins, Contexts, and Aliases for resolving Loops and checked the Integrity of the Universes.
  • Customized the dashboards and KPIs for better manageability of overall performance.
  • Used different analytics like Interactive Metric Trends (IMT), Speedometers, Metric Trees, Maps, Pareto Charts and Traffic Lights across the sliced metrics for the dashboards.
  • Rapidly created and published complex WEBi reports (ranking, hierarchical with multiple blocks) to enable the client to meet their deadline for ETL reports.
  • Helped the client tweak and manage their Dashboard Universes.

Environment: Informatica Power Center 8.6 (Repository Manager, Designer, Server Manager), Data Quality, Data Masking, Business Objects XI R3.1, Xcelsius, Infoview, Oracle 11g, Microsoft SQL Server 2008 R2, Microsoft SQL Server Management Studio 2008 R2, SSIS (BIDS), Microsoft Visual Studio 2008, SQL Developer.

Confidential, Plano, TX

Senior Informatica Developer

Responsibilities:

  • Involved in design, development and implementation of RDM (Report Data Mart).
  • Involved in interacting with the business community to understand their requirements.
  • Extensively worked with Informatica Power Center client tools.
  • Used Relational Sources like Oracle, SQL Server, DB2 and flat files to populate the Data mart.
  • Translated the business process into Informatica Mappings for building Data mart.
  • Developed transformations and rules and enabled them as Web-Services.
  • Used the Informatica web-services connector (provider/consumer).
  • Extensively worked on performance tuning to increase the throughput of the data load (e.g., reading data from flat files and writing data to target flat files to identify bottlenecks).
  • Used Teradata FastLoad for truncate-and-load tables and MultiLoad for insert, update, and upsert operations.
  • Worked on SQL Tools like TOAD and SQL Server Management Studio to run the SQL Queries to validate the data.
  • Extensively used Re-usable Objects like Sources, Targets in all mappings to handle metadata changes efficiently.
  • Developed PL/SQL procedures and functions for calculating supply quantity and stock in hand, as derived during the business requirements study.
  • Used UNIX shell scripting to automate the execution of the workflows. Used the pmcmd command to schedule the workflows.
  • Used the debugger in Informatica to test the mapping and fix the mappings.
  • Optimized the mappings using various optimization techniques and debugged existing mappings using the Debugger to test and fix bugs.
  • Performed unit testing and system testing for troubleshooting the mappings and workflows.
  • Migrated from the test folder to UAT (User Acceptance Testing) and Production.
  • Worked on various Cognos drill through reports, conditional formatting, toggle reports and master detail relationships.
  • Maintenance/Support of the application, trouble shooting and Enhancements.
  • Provided flexible, high-quality support for wider testing strategies on key regular deliverables and ad-hoc reporting issues.
  • Involved in writing project documentation using Microsoft VISIO for different diagrams.
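
The pmcmd-based workflow scheduling mentioned above can be sketched as a small automation wrapper. This is only an illustration of the invocation pattern, not the actual scripts used on the project; the service, domain, folder, and workflow names are hypothetical placeholders, and the dry-run flag lets the sketch be exercised without an Informatica installation.

```python
import subprocess

def build_pmcmd_start(service, domain, user, password, folder, workflow):
    """Assemble a pmcmd startworkflow command (Informatica's command-line
    utility for controlling workflows). All connection values are placeholders."""
    return [
        "pmcmd", "startworkflow",
        "-sv", service,          # Integration Service name
        "-d", domain,            # Informatica domain
        "-u", user, "-p", password,
        "-f", folder,            # repository folder
        "-wait",                 # block until the workflow finishes
        workflow,
    ]

def run_workflow(cmd, dry_run=True):
    """Execute the assembled command; dry_run skips the real call so the
    sketch runs anywhere."""
    if dry_run:
        return " ".join(cmd)
    return subprocess.run(cmd, check=True)

cmd = build_pmcmd_start("IS_DEV", "Domain_Dev", "etl_user", "secret",
                        "RDM_FOLDER", "wf_load_report_mart")
print(run_workflow(cmd))
```

In practice a cron entry or scheduler job would call such a wrapper, check the pmcmd return code, and alert on failure.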

Environment: Informatica Power Center 8.5.1/7.1, Power Mart, Informatica Power Connect for DB2, Informatica Power Exchange, IDQ, Teradata, Cognos Series 8.3/7.0, Erwin, Oracle 10g, SQL, PL/SQL, DB2, MS SQL Server 2005/2000, Flat Files, Windows XP, UNIX Shell Scripting, TOAD.

Confidential, Danbury, CT

Data warehouse Consultant

Responsibilities:

  • Involved in Data modeling and design of data warehouse in star schema methodology with conformed and granular dimensions and FACT tables.
  • Analyzed existing transactional database schemas and designed star schema models to support the users' reporting needs and requirements.
  • Created Informatica Mappings to load data using transformations like Source Qualifier, Sorter, Aggregator, Expression, Joiner, and Connected and Unconnected lookups, Filters, Sequence, Router and Update Strategy.
  • Extensively used SQL*Loader and Informatica to extract, transform, and load data from MS SQL Server, flat files, and Oracle into Oracle.
  • Implemented Slowly Changing Dimensions (SCDs, Both Type 1 & 2).
  • Cleansed the source data, extracted and transformed data with business rules, and built reusable mappings using Mapplets using Informatica Designer.
  • Involved in the development of Informatica mappings and also performed tuning for better performance.
  • Extensively worked on Performance Tuning (Both Database and Informatica side) and thereby improving the load time.
  • Extensively used PL/SQL for creating packages, procedures and functions.
  • Automated the entire processes using UNIX shell scripts.
  • Using Autosys to schedule UNIX shell scripts, PL/SQL scripts and Informatica jobs.
  • Wrote UNIX shell scripts for getting the data from all systems into the data warehousing system. The data was standardized to store various business units in tables.
  • Tested all the applications and transported the data to the target Warehouse Oracle tables and used the Test Director tool to report bugs and fix them in order.
  • Tested the target data against source system tables by writing some QA Procedures.
  • Created Migration Documentation and Process Flow for mappings and sessions.
  • Using Autosys as the job-scheduling tool.
  • Creating Universes for specific reporting requirements which run against Oracle 11g Database.
  • Designing and Developing Standard and ad-hoc WEBi Reports to serve the business user requirements.
  • Resolving system/run-time errors in existing WEBi reports caused by DB changes, and enhancing the Universes to stay in sync with the ongoing DB changes.
  • Building Corporate Dash Boards & Analytics (with Pie & Bar Charts) to meet specific Dashboard requirements of Business Users.
  • Resolving issues like Loops, Fan Traps & Chasm Traps by creating Aliases and Contexts as required in removing the cyclic dependencies and testing them to ensure that correct results are retrieved.
  • Implemented Row-Level Security in Universe to logically restrict data access to report user groups.
  • Created complex yearly/quarterly/monthly reports using cascading and user objects such as measure objects, using the @Aggregate_Aware function to create summarized reports.
  • Created universe user documentation guide for the end users reference and trained end users for generating their own ad-hoc reports including Complex reports.
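
The Type 2 slowly changing dimension technique used in this role can be illustrated with a minimal sketch. The table and column names here are hypothetical, and SQLite stands in for the actual Oracle target; the point is only the expire-and-insert pattern that preserves history.

```python
import sqlite3

def apply_scd2(conn, cust_id, name, load_date):
    """Type 2 SCD: when a tracked attribute changes, expire the current row
    and insert a new current row, so history is preserved."""
    cur = conn.execute(
        "SELECT name FROM customer_dim WHERE cust_id=? AND current_flag=1",
        (cust_id,))
    row = cur.fetchone()
    if row and row[0] == name:
        return  # no change: nothing to do
    if row:  # attribute changed: close out the old version
        conn.execute(
            "UPDATE customer_dim SET end_date=?, current_flag=0 "
            "WHERE cust_id=? AND current_flag=1", (load_date, cust_id))
    conn.execute(
        "INSERT INTO customer_dim (cust_id, name, eff_date, end_date, current_flag) "
        "VALUES (?, ?, ?, '9999-12-31', 1)", (cust_id, name, load_date))

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE customer_dim (
    cust_id INTEGER, name TEXT, eff_date TEXT, end_date TEXT, current_flag INTEGER)""")
apply_scd2(conn, 101, "Acme Corp", "2010-01-01")
apply_scd2(conn, 101, "Acme Corporation", "2010-06-01")  # name change -> new version
rows = conn.execute(
    "SELECT name, current_flag FROM customer_dim ORDER BY eff_date").fetchall()
print(rows)  # [('Acme Corp', 0), ('Acme Corporation', 1)]
```

A Type 1 SCD would instead overwrite the name in place, keeping no history; in Informatica both variants are typically built with a Lookup plus an Update Strategy transformation.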

Environment: Informatica Power Center 8.1, HP UNIX, Autosys, Windows NT, Oracle 10g, DB2 UDB, Erwin, SQL, PL/SQL, SQL*Loader, TOAD 9.5, Siebel, Business Objects XIR2/XI R3, Deski XIR2/XIR3, Crystal Xcelsius, SQL Server 2003.

Confidential, Ann Arbor, MI

ETL Developer

Responsibilities:

  • Involved in reviewing and analyzing Business Requirement Documents and Functional specifications/use cases.
  • Made changes to the existing code to load data into various tables in Data Marts.
  • Used Informatica as an ETL tool to create source/target definitions, mappings, and sessions to extract, transform, and load data into staging tables from various sources.
  • Identified source-system connectivity, tables, and fields, and ensured data suitability for mapping.
  • Developed complex mappings with various Informatica transformations like Aggregator, Lookup (Connected, Unconnected), Source Qualifier, Update Strategy, Router, Joiner, Filter and Expression.
  • Used Change Data Capture (CDC) to implement Incremental Data Extraction so that only the modified and new records will be extracted to the destination rather than full extraction.
  • Worked in Agile methodology during application development and participated in daily scrum meetings.
  • Developed user-defined functions, created objects where required, and used descriptive programming to automate tasks.
  • Controlled data masking processes from a central environment with a multithreaded engine designed to handle data masking for large data volumes.
  • Created standardized data masking rules that can be reused throughout the organization.
  • Tested data masking rules before implementation to validate privacy policies.
  • Monitored procedures for data masking of sensitive information with thorough auditing and reporting capabilities.
  • Created PL/SQL stored procedures to extract data from source tables and load the target tables of the Data Mart, as part of developing the audit tables.
  • Created different kinds of objects like Dimension, Detail and measure objects.
  • Validated the data through the various stages of data movement from Staging to Data Store to Data Warehouse tables.
  • Created stored procedures to validate the data before loading data into data marts.
  • Created Joins, Cardinalities, Aliases and Contexts for resolving Loops and checked the Integrity by developing the Universes using Business Objects.
  • Responsible for documentation of test results and the test matrix, and participated in daily/weekly QA meetings.
  • Conducted down test and up test for release builds pushed to UAT and to Production.
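
The reusable data-masking rules described above can be sketched as a deterministic tokenization pass. This is only an illustration of the reusable-rule idea, not Informatica's Data Masking tool itself (which has its own rule types); the column names and the SHA-256 scheme are assumptions for the sketch.

```python
import hashlib

def mask_value(value, salt="demo-salt"):
    """Deterministically mask a sensitive value: the same input always maps
    to the same token, preserving referential integrity across tables."""
    digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
    return "MASK_" + digest[:12]

def mask_rows(rows, sensitive_cols):
    """Apply one reusable masking rule to the named columns of each row dict."""
    return [
        {k: (mask_value(v) if k in sensitive_cols else v) for k, v in row.items()}
        for row in rows
    ]

rows = [{"cust_id": 1, "ssn": "123-45-6789", "state": "MI"},
        {"cust_id": 2, "ssn": "123-45-6789", "state": "TX"}]
masked = mask_rows(rows, {"ssn"})
# Identical source SSNs mask to the same token; non-sensitive columns pass through.
print(masked)
```

Testing such a rule before rollout (as described above) amounts to asserting that masked values are consistent, irreversible, and that no sensitive value survives in the output.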

Environment: Informatica Power Center 8.6 (Repository Manager, Designer, Server Manager), Power Exchange, Data Quality, Data Masking, Business Objects XI R3.1, Oracle 11g, MS SQL 2008, HP Quality Center 10.0/9.2, SSIS, MS Excel.

Confidential, San Francisco, CA

ETL / Informatica Developer

Responsibilities:

  • Responsible in gathering requirements and developing the technical specifications.
  • Involved in analyzing the data model by interacting with the data modeler.
  • Used Visio to model the batch processing and information flow between various batch programs.
  • Designed and developed complex mappings using different Informatica transformations.
  • Interacted with Data Modeler to design data model using Erwin.
  • Created and scheduled sessions and batch processes with on-demand, run-on-time, and run-only-once schedules using the Informatica Server Manager.
  • Developed test procedures to compare the aggregates of source and target databases.
  • Wrote Teradata SQL queries according to Process need.
  • Used Teradata SQL Assistant and Teradata Manager for database work.
  • Exported data using BTEQ and FastExport.
  • Created SSIS packages to migrate slowly changing dimensions, using Script Tasks (C#, VB) and components such as OLE DB Source/Destination, Lookup, and Derived Column.
  • Implemented Slowly Changing Dimension methodology.
  • Created Several Informatica Mappings to populate the data into dimensions and fact tables.
  • Created effective Test Cases and did Unit and Integration Testing to ensure the successful execution of data loading process.
  • Used FTP for data transmission between multiple servers and to local machines.
  • Extensively used Informatica Debugger for testing the mapping logic during Unit Testing. Performed unit tests, integrated testing and validated results.
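
The source-versus-target aggregate comparison described above can be sketched as follows. The table and measure names are hypothetical, and SQLite stands in for the actual source and target databases; the idea is simply that matching row counts and measure totals give a quick post-load sanity check.

```python
import sqlite3

def table_aggregates(conn, table, measure):
    """Row count and summed measure for one table. Table/measure names are
    interpolated directly, so this sketch assumes trusted identifiers."""
    return conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM({measure}), 0) FROM {table}").fetchone()

def aggregates_match(conn, src, tgt, measure):
    """Compare source vs. target aggregates: did the ETL move every row and
    preserve the measure total?"""
    return table_aggregates(conn, src, measure) == table_aggregates(conn, tgt, measure)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_sales (amt REAL)")
conn.execute("CREATE TABLE tgt_sales (amt REAL)")
conn.executemany("INSERT INTO src_sales VALUES (?)", [(10.0,), (20.0,)])
conn.executemany("INSERT INTO tgt_sales VALUES (?)", [(10.0,), (20.0,)])
print(aggregates_match(conn, "src_sales", "tgt_sales", "amt"))  # True
```

Real test procedures would typically extend this with per-key checksums or group-by comparisons to localize any mismatch.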

Environment: Informatica 7.1, Erwin, UNIX, Test Director, MS SQL Server, Teradata V2R5/V2R6, Oracle 10g, Mercury Quality Center 9.0, SSIS, Cognos Series 8.3.

Confidential, Flower Mound, TX

ETL Informatica Developer

Responsibilities:

  • Assisted in gathering business requirements and worked closely with various application and business teams to develop the data model and ETL procedures to design the data warehouse.
  • Designed and developed a star schema model for the target database using Erwin data modeling.
  • Extensively used ETL Informatica tool to extract data stored in MS SQL 2000, Excel, and Flat files and finally loaded into a single Data Warehouse.
  • Used various active and passive transformations such as Aggregator, Expression, Sorter, Router, Joiner, connected/unconnected Lookup, and Update Strategy transformations for data control, cleansing, and data movement.
  • Designed and developed Mapplets for faster development, standardization and reusability purposes.
  • Implemented Slowly Changing Dimension Type 1 and Type 2 for inserting and updating Target tables for maintaining the history.
  • Used the Debugger to validate transformations by creating breakpoints to analyze and monitor data flow.
  • Tuned Informatica session performance by increasing block size, data cache size, sequence buffer length, and target-based commit interval, and tuned mappings by dropping and recreating indexes.
  • Worked along with the QA Team and provided production support by monitoring the processes running daily.
  • Involved in pre and post session migration planning for optimizing data load performance.
  • Interfaced with the Portfolio Management and Global Asset Management Groups to define reporting requirements and project plan for intranet applications for Fixed Income and Equities.
  • Performed Unit testing during the mapping phase to ensure proper and efficient implementation of the transformations.
  • Wrote UNIX shell scripts and used the pmcmd command-line utility to interact with the Informatica Server from command mode.
  • Controlled resources such as Universes and Documents as a Supervisor-Designer, and applied command-level restrictions to different users and groups.
  • Helped transitioning from XIR1 to XIR2 version providing all groundwork including installation and upgrade. Extensively tested and migrated the reports and universes.
  • Created Users/Groups/Sub-Groups and assigned profiles to restrict unauthenticated access to resources.
  • Created initial Classes & Objects for the Universes in both on/off line modes.
  • Defined Aliases and Contexts to resolve the join issues in the Universe.
  • Solved the issues during designing such as loops, Cartesian product due to fan and chasm traps.
  • Used index awareness to take advantage of the indexes on key columns to speed data retrieval.
  • Created the reports using Business Objects functionalities like Queries, Slice & Dice, Drill down, @Functions, Cross Tab, Master/Detail and Formulae etc.
  • Built full client and Webi reports, according to end user requirements.
  • Created user Prompts, conditions, and filters for various reports to specify the reporting parameters as per business requirements.
  • Automated the reports to refresh the data with monthly and quarterly details.
  • Responsible for conducting Business Objects training classes for all staff who used the reports.
  • Scheduled Business Objects Reports using Broadcast Agent.
  • Trained user community in basic and advanced Business Objects reporting skills.

Environment: Informatica Power Center 6.2, MS SQL 2000, Oracle 9i, SQL, PL/SQL, SQL Navigator, Erwin, UNIX Shell Scripting, Windows XP and 2000, Business Objects 6.5/XI/XI R2, Desktop Intelligence, Web Intelligence, CMC, Designer, Info View.

Confidential, CT

Sr. ETL Programmer/Analyst

Responsibilities:

  • Extensively used Informatica to load data from different data sources into the Oracle data warehouse.
  • Involved in Data Cleansing and Extraction.
  • Involved in data modeling and design of the data warehouse in star schema methodology with conformed dimensions and FACT tables.
  • Extracted data from sources such as DB2, Oracle, Sybase, and fixed-width and delimited flat files, transformed the data according to the business requirements, and then loaded it into the Oracle database.
  • Modified several of the existing mappings and created several new mappings based on the user requirement.
  • Maintained existing mappings by resolving performance issues.
  • Developed the Design Document for each ETL mapping, defining the source and target tables and all the fields, transformations and the join condition, which helped the users to better understand the type of data.
  • Created mappings using the Mapping Designer to load data from various sources using transformations like Aggregator, Expression, Stored Procedure, External Procedure, Filter, Joiner, Lookup, Router, Sequence Generator, Source Qualifier, and Update Strategy.
  • Implemented Slowly Changing Dimensions (SCDs, Both Type 1 & 2).
  • Handled operating system tasks by generating Pre and Post-Session UNIX Shell Scripts.
  • Created and Scheduled Sessions and Batch Processes based on demand using Informatica Server Manager.
  • Developed UNIX shell scripts for ETL implementation.
  • Involved in Writing Shell scripts to automate Pre-Session and Post-Sessions Processes.
  • Used Scheduling Tool Autosys to schedule UNIX shell scripts, PL/SQL scripts and Informatica jobs.
  • Developed reusable Mapplets and Transformations.
  • Used pmcmd command to schedule sessions and batches.

Environment: Informatica Power Center 6.2 (Workflow Manager, Workflow Monitor, Repository Manager), UNIX, Sybase, Windows XP/2000, TOAD, SQL Server, DB2, Oracle 9i, SQL, PL/SQL.

Confidential

Informatica Developer

Responsibilities:

  • Designed and developed business rules to generate consolidated (fact/summary) data identified by dimensions using Informatica ETL (Power Center) tool.
  • Maintained stored definitions, transformation rules and targets definitions using Informatica repository manager.
  • Created and scheduled sessions and batch processes with on-demand, run-on-time, and run-only-once schedules using the Informatica Server Manager.
  • Involved in the performance tuning of mappings and sessions.
  • Developed Oracle Stored Procedures, Functions, Packages and Triggers using PL/SQL.
  • Developed deployment instructions to support the deployment of data marts and reports.
  • Performed manual testing prior to automated testing on the application.
  • Scheduled and monitored transformation processes using Informatica Server Manager.

Environment: Informatica Power Center 6.1, Oracle 9i, PL/SQL, TOAD, SQL* Plus, SQL*Loader, UNIX.
