Sr. ETL Developer Resume
Edison, NJ
Sr. ETL Developer (Teradata/Informatica/BO)
Summary:
- Over seven years of IT experience in data warehousing, business intelligence, and OLAP, with emphasis on Informatica, SSIS, and SAP Business Objects.
- Extensive experience with Teradata V2R6.x/7.x/8.x, IBM DB2, Oracle 8i/9i, and SQL Server 2000/2005.
- Extensive experience working with the Teradata database and its utilities (TTU), including TPump, BTEQ, FastLoad, MultiLoad, and FastExport.
- Worked extensively in domains such as financial services (mutual funds, savings and CD management, IRAs, insurance), telecom, and power systems.
- Extensive experience with Informatica (7.1.3, 8.1, 8.6) in designing workflows, worklets, mappings, and sessions.
- Extensive experience in building reports using BO Designer and BO Desktop Intelligence.
- Experience in designing and developing complex mappings with varied transformation logic such as connected and unconnected Lookups, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy.
- Experience in debugging and performance tuning of targets, sources, mappings, and sessions.
- Experience in optimizing mappings and implementing complex business rules by creating reusable transformations, mapplets, and PL/SQL stored procedures.
- Worked extensively in configuration, data flow management, and logical data modeling.
- Strong SQL development on the Teradata platform.
- Experience in Business Objects universe design and report development.
- Created charts in reports and worked with functionality such as breaks, filters, and prompts.
- Extensively used the Slowly Changing Dimension (SCD) technique (a minimal SQL sketch follows this list).
- Experience with the AutoSys framework and job scheduling.
- Excellent communication skills and the ability to communicate effectively with all levels of management; strong analytical and problem-solving skills and a sound understanding of business rules.
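A minimal sketch of the SCD Type 2 pattern referenced above, in Teradata-style SQL. The table and column names (stg_customer, dim_customer, eff_end_dt, and so on) are hypothetical placeholders, not from an actual engagement:

    -- Close out current dimension rows whose tracked attribute changed
    -- in the staged feed (hypothetical tables and columns).
    UPDATE dim_customer
    SET eff_end_dt = CURRENT_DATE - 1,
        current_flag = 'N'
    WHERE current_flag = 'Y'
      AND EXISTS (
          SELECT 1
          FROM stg_customer s
          WHERE s.customer_id = dim_customer.customer_id
            AND s.address <> dim_customer.address);

    -- Insert a new current version for changed and brand-new customers.
    INSERT INTO dim_customer
        (customer_id, address, eff_start_dt, eff_end_dt, current_flag)
    SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.current_flag = 'Y'
    WHERE d.customer_id IS NULL;

The expire-then-insert order matters: closing the old version first means the insert step only finds customers without a current row, covering both changed and new keys in one pass.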
TECHNICAL SKILLS:
ETL Tools: Informatica (PowerCenter 8.1/8.6/7.1.3/6.x), SSIS
BI Tools: Business Objects XI R2/6.5, SSRS
RDBMS: Teradata V2R6.x/7.x/8.x, Oracle 9i/8i, MS SQL Server 2005/2000, IBM DB2
Database Tools and Utilities: MultiLoad, FastLoad, BTEQ, TPump, FastExport, DAC (Data Warehouse Application Console), TOAD, SQL*Loader, Query Analyzer
Sources and Targets: Relational databases (Teradata, SQL Server, Oracle, IBM DB2), flat files, XML, VSAM, SAP, Siebel, PeopleSoft
Operating Systems: Windows 2000/NT/98/95, HP-UX, MS-DOS, Sun Solaris
Data Modeling: Visio, ERwin
Scripting: UNIX shell scripting, Perl, VBScript
Web Technology: HTML, XML
Languages and GUI: C, HTML, SQL, SQL*Plus, PL/SQL, Visual Basic
Automation: AutoSys, TestDirector (Mercury Quality Center), QTP 9.0, WinRunner
MS Office Products: Microsoft Excel, Word, Access, PowerPoint, Outlook, Visio, Project
Professional Experience:
Client : Confidential, Edison, NJ
Datamart : BankUSA – IRA Datamart
Period : Jan 2008 – Present
Role : Sr. ETL Developer (Teradata)
Tools and technologies used: Informatica PowerCenter 8.6 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), FastExport, BTEQ, BO XI R2, UNIX, shell scripting, TOAD, SQL*Loader, SQL, and PL/SQL
Project Description: The deposit account IRA program is one of the many benefits of the UBS Wealth Management Account (WMA). Any invested cash that enters the account, whether deposited into a Traditional or Roth IRA or received as proceeds from an investment, is automatically swept on a daily basis into an account at UBS Bank USA.
Roles & Responsibilities:
- Working with different Teradata objects such as triggers, stored procedures, views, and temporary tables.
- Developing BTEQ scripts that implement the logic to retrieve related data from the source database (see the BTEQ sketch after this list).
- Writing FastExport scripts to export data.
- Using Slowly Changing Dimensions (SCDs) to handle incremental loading of dimension and fact tables.
- Creating mappings using Source Qualifier, Normalizer, Filter, connected and unconnected Lookup, Update Strategy, Router, Aggregator, and reusable Sequence Generator transformations.
- Developed mappings to handle exceptions and discarded data.
- Extracting data from source systems such as Oracle, SQL Server, and flat files, as required, using FastLoad, MultiLoad, and TPump.
- Creating different types of reports (cross-tab, charts, etc.) with the Business Objects reporting tool and Crystal Reports.
- Testing data and report generation for business-user review of special accounts.
- Training new employees in the ETL process.
- Developing UNIX shell scripts that invoke pmcmd to execute workflows (a wrapper sketch follows the BTEQ example below).
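A minimal BTEQ sketch of the extract logic described above. The logon alias, table, and file names (tdprod, ira_txn, and so on) are hypothetical placeholders:

    .LOGON tdprod/etl_user,password;

    /* Export yesterday's IRA transactions (hypothetical table). */
    .EXPORT REPORT FILE = /data/out/ira_txn.dat;
    SELECT t.account_id, t.txn_dt, t.txn_amt
    FROM   ira_txn t
    WHERE  t.txn_dt >= CURRENT_DATE - 1;
    .EXPORT RESET;

    /* Abort the job with a non-zero return code if the query failed. */
    .IF ERRORCODE <> 0 THEN .QUIT 1;
    .LOGOFF;
    .QUIT 0;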
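And a minimal UNIX wrapper around pmcmd for the last item in the list; the integration service, domain, folder, and workflow names are hypothetical placeholders:

    #!/bin/ksh
    # Start an Informatica workflow and propagate its exit status
    # (hypothetical service/folder/workflow names).
    INFA_USER=etl_user
    INFA_PWD=$PASSWORD      # sourced from a secured environment file

    pmcmd startworkflow -sv INT_SVC -d DOM_PROD \
          -u "$INFA_USER" -p "$INFA_PWD" \
          -f IRA_MART -wait wf_load_ira_daily
    RC=$?
    if [ $RC -ne 0 ]; then
        echo "wf_load_ira_daily failed with return code $RC" >&2
        exit $RC
    fi
    exit 0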
Client : Confidential, Overland Park, KS
Datamart : Insurance
Period : Dec 2006 – Dec 2007
Role : Teradata Developer / ETL
Tools and technologies used: Informatica PowerCenter 8.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), PL/SQL, SQL Server 2000/2005, BO XI R2, UNIX, shell scripting, TOAD, TPump, MultiLoad
Project Description: The Insurance Datamart (Teradata) helps customers understand the opportunities, risks, and benefits of purpose and non-purpose life, disability, and long-term insurance. The work involved gathering customer data, converting it through business logic in Informatica, and generating reports with BO XI R2.
Roles & Responsibilities:
- Involved in requirement-gathering discussions with business analysts, understanding the requirements, and explaining technical possibilities and constraints to business users.
- Architected the data flow in the datamart, extensively customized the traditional UBS methodology, designed data flow diagrams, and designed the best solution for the data flow.
- Analyzed new and existing workloads using EXPLAIN plans, the query log, and timings (see the sketch after this list).
- Created a Time Estimate Proposal document estimating the hours required to complete each ETL task.
- Converted business requirements into technical documents (Business Requirements Document) and explained the requirements in technical terms to the developers.
- Worked with data modelers and business users on the design of tables in the Teradata warehouse.
- Performance monitoring of the Teradata database.
- Produced data flow diagrams (DFDs) with Visio and ERwin.
- Created data maps using Informatica PowerExchange to access mainframe source files.
- Verified mainframe file data by checking it in the mainframe environment.
- Analyzed the source data with business users and developed critical mappings using Informatica PowerCenter to load data from mainframe files (accessed through PowerExchange) into Oracle tables.
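A minimal sketch of the workload analysis mentioned above. The first statement shows the optimizer plan for a candidate query; the second reads the DBQL query log, which assumes query logging has been enabled. The table names (policy, claim) and the user filter are hypothetical:

    -- Inspect the optimizer's plan for a candidate query.
    EXPLAIN
    SELECT p.policy_id, SUM(c.claim_amt)
    FROM   policy p
    JOIN   claim  c ON c.policy_id = p.policy_id
    GROUP  BY p.policy_id;

    -- Review logged I/O and CPU for the ETL user's queries today
    -- (DBC.QryLog is the standard DBQL view; requires logging to be on).
    SELECT QueryText, StartTime, TotalIOCount, AMPCPUTime
    FROM   DBC.QryLog
    WHERE  UserName = 'ETL_USER'
      AND  CAST(StartTime AS DATE) = CURRENT_DATE
    ORDER  BY AMPCPUTime DESC;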
Client : Confidential, Charlotte, NC
Datamart : Mutual Funds, Savings and CD Management
Period : Jan 2006 – Nov 2006
Role : Teradata ETL Developer
Tools and technologies used: Informatica PowerCenter 8.1, Oracle 9i, Teradata V2R6.2, HP-UX, Java, shell scripting, Sybase, TOAD, SQL Server 2000, SQL*Loader, SQL, Repository Manager, PL/SQL, HTML, Excel, Visio
Project Description:
Worked extensively on the ‘Mutual Funds’ and ‘Savings and CD Management’ datamarts, designing the data loads for complex and critical tables and providing Tier-1 support for production issues. This datamart was critical to the bank’s campaigning process: it was designed to accommodate several new offers and updates to existing offers, and it helped business users analyze revenues and build new strategies.
Roles & Responsibilities:
- Created BRDs (Business Requirements Documents) and converted the business requirements into technical specifications.
- Architected the data loads into different databases and identified tables that could be loaded with external loaders (MultiLoad, FastLoad) rather than through Informatica (see the FastLoad sketch after this list).
- Worked with the data modeler and business users on table design.
- Worked with the scheduling and production support teams to design a Visio diagram of the existing and new Informatica/UNIX jobs.
- Good exposure to the onsite/offshore model.
- Provided on-call support for existing datamarts.
- Served as the first point of contact for the offshore team and business analysts on all issues.
- Developed complex Informatica mappings using various transformations: Source Qualifier, Normalizer, Filter, connected and unconnected Lookup, Update Strategy, Router, Aggregator, and reusable Sequence Generator.
- Created several UNIX scripts to run Informatica workflows.
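A minimal FastLoad sketch of the external-loader approach noted above. The logon alias, staging table, error tables, and file layout are all hypothetical; the feed lands in a staging table because FastLoad requires an empty target:

    LOGON tdprod/etl_user,password;

    /* Drop leftover error tables from any previous run
       (hypothetical names). */
    DROP TABLE stg_offer_err1;
    DROP TABLE stg_offer_err2;

    /* Pipe-delimited input file; fields land as VARCHAR and are
       typed downstream. */
    SET RECORD VARTEXT "|";
    DEFINE offer_id   (VARCHAR(10)),
           offer_desc (VARCHAR(100)),
           start_dt   (VARCHAR(10))
    FILE = /data/in/offer_feed.dat;

    BEGIN LOADING stg_offer ERRORFILES stg_offer_err1, stg_offer_err2;
    INSERT INTO stg_offer (offer_id, offer_desc, start_dt)
    VALUES (:offer_id, :offer_desc, :start_dt);
    END LOADING;
    LOGOFF;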
Client : Confidential, Tamarac, FL
Datamart : Copper Datamart
Period : Dec 2004- Dec 2005
Role : ETL Developer/Report Developer
Tools and technologies used: Informatica 7.1.3, BO 6.5, Oracle 9i, IBM DB2, PL/SQL, mainframes, MS Access 2000, MS SQL Server 2000, Windows NT 4.0, Windows 2000, HP-UX, Sun Solaris 2.6, UNIX shell scripts
Project Description:
Confidential runs its base application on SAP R/3, and this project developed a data warehousing solution for the analysis team in the Software Systems Division business area. ADC data comes from different operational sources, is stored in a datamart, and is then transformed and loaded into a centralized data warehouse for various strategic business reports. This datamart housed four fact tables accessed via 13 dimension tables, with over one million rows in the largest fact table.
Roles & Responsibilities:
- Interacted with end users and functional analysts to identify and develop the BRD and transform it into technical requirements.
- Extensively used ETL to load data from Oracle databases, XML files, and flat files; also used PowerConnect to import data from IBM mainframes.
- Imported sources and targets to create mappings based on business logic and developed transformations using PowerCenter Designer. Used Informatica Workflow Manager and Workflow Monitor to create sessions and batches.
- Extensively used Transformation Language functions in the mappings to produce the desired results.
- Worked with all the core transformations: Lookup, Aggregator, Expression, Filter, Router, Update Strategy, Stored Procedure, and Sequence Generator.
- Created and ran pre-existing and debug sessions in the Debugger to monitor and test sessions prior to their normal run via Workflow Manager and Workflow Monitor.
- Used Workflow Manager and Workflow Monitor to schedule and monitor the workflows.
- Made extensive use of TOAD features to keep track of the various source, staging, and target tables.
- Developed shell scripts to automate file-manipulation and data-loading procedures (see the sketch after this list).
- Successfully moved sessions and batches from the development environment to the production environment.
- Completed documentation covering detailed work plans, mapping documents, and high-level data models.
- Used BO 6.5 for report generation.
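A minimal shell sketch of the file-manipulation and load automation described above; the directory layout, control file, and connect string are hypothetical:

    #!/bin/sh
    # Archive processed extracts and load new files with SQL*Loader
    # (hypothetical paths and connect string).
    IN_DIR=/data/in
    ARC_DIR=/data/archive
    CTL=/etl/ctl/adc_load.ctl

    # Move previously processed files out of the landing area.
    mv "$IN_DIR"/adc_*.dat.done "$ARC_DIR" 2>/dev/null

    for f in "$IN_DIR"/adc_*.dat; do
        [ -f "$f" ] || continue
        sqlldr userid=etl_user/"$DB_PWD"@ORCL control="$CTL" data="$f" \
               log="$f.log" bad="$f.bad"
        if [ $? -eq 0 ]; then
            mv "$f" "$f.done"      # mark as processed
        else
            echo "SQL*Loader failed for $f" >&2
            exit 1
        fi
    done
    exit 0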
Client : Confidential
Datamart : GEPS EDW
Period : Nov 2002 – Nov 2004
Role : Junior Data Warehouse Tester
Tools and technologies used: BTEQ, Teradata, Informatica, Business Objects, Oracle 8i, Desktop Intelligence, Web Intelligence
Project Description: This data mart is aimed at providing information for managing the quality of material received from suppliers, produced on GE shop floors, or farmed out to vendors. It helps staff produce reports that not only track deviations on the shop floor and at supplier sites, but also find the root causes and costs of deviations so they can be controlled or eliminated in the future.
Roles & Responsibilities:
- Worked as a data warehouse tester responsible for requirements/ETL analysis, ETL testing, and designing the flow and logic for the data warehouse project.
- Created and executed SQL queries to perform data-integrity testing on a Teradata database to validate and test data (see the sketch after this list).
- Wrote BTEQ scripts.
- Ran workflows created by developers in Informatica, then compared data before and after transformation to verify that the transformations were successful.
- Performed QA on ETL mappings and on SSRS and BO reports.
- Created reports with charts using the Desktop Intelligence and Web Intelligence tools.
- Extensively tested different BO functionalities such as breaks, filters, sorts, query prompts, and drill filters.
- Wrote detailed documentation for the reports.
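A minimal sketch of the data-integrity checks described above, in Teradata SQL; the staging and warehouse table names and the defect_id key are hypothetical:

    -- Row-count reconciliation between the stage feed and the warehouse
    -- (hypothetical tables).
    SELECT 'stage'  AS tbl, COUNT(*) AS row_cnt FROM stg_supplier_defect
    UNION ALL
    SELECT 'target' AS tbl, COUNT(*) AS row_cnt FROM dw_supplier_defect;

    -- Keys present in the stage feed but missing from the warehouse.
    SELECT s.defect_id
    FROM   stg_supplier_defect s
    LEFT JOIN dw_supplier_defect d
      ON   d.defect_id = s.defect_id
    WHERE  d.defect_id IS NULL;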
EDUCATION:
Bachelor's in Electrical Engineering
Master's in Computer Science