
ETL Lead Developer Resume


Summary

  • ETL Lead Developer with 9 years of proven expertise providing analysis, design, development, and implementation for client/server and data warehousing projects.
  • Created scripts to execute DataStage jobs through scheduling tools.
  • Created scripts for scheduling ETL sequencers using scheduling tools such as Crontab and Autosys.
  • Worked with IBM Information Analyzer 8.5 for data profiling, including column analysis and primary key analysis.
  • Expertise in DataMirror with DataStage 8.5/8.1/7.5 using the Change Data Capture stage.
  • Very strong knowledge of Informatica 8.x (SOA) and Informatica 7.x (client/server architecture).
  • Worked on dimensional database design using the Kimball methodology.
  • Hands-on experience with QualityStage, including the Investigate stage.
  • Knowledge of IBM Master Data Management Server.
  • Experience using IBM DataStage (8.x/7.5.x/7.x) for extraction, transformation, and loading.
  • Hands-on experience in data modeling, data warehousing, and data migration, with significant expertise in designing conceptual, logical, and physical data models using relational data modeling tools, including Erwin.
  • Scripting and coding experience including ksh, SQL, and PL/SQL procedures/functions, triggers, and exceptions.
  • Extensively used DataStage components such as Designer, Director, and Administrator.
  • Instant troubleshooting of DataStage jobs during quick fixes.
  • Experience in addressing production issues such as performance tuning and enhancement.
  • Experience in Parallel Extender (DS-PX), developing parallel jobs using various stages.
  • Experience in Database Management, Data Mining, Software Development Fundamentals, Strategic Planning, Operating Systems, Requirements Analysis, Data warehousing, Data Modeling and Data Marts.
  • Proven track record in troubleshooting DataStage jobs.
  • Proven track record in addressing issues such as performance tuning and enhancement.
  • Area of expertise encompasses database design and the ETL phases of data warehousing, with an emphasis on relational and dimensional data modeling for OLTP and OLAP systems.
  • Executed test plans for loading data successfully.
  • Strong database skills with Oracle and Teradata; worked with Teradata utilities TPump and MultiLoad.
  • Very good knowledge of connecting to XML sources using DataStage.
  • Strong knowledge of Star Schema and Snowflake models and E-R modeling.
  • Strong experience with C++ routines, Java, and OOP concepts, including copy constructors, inheritance, virtual functions, and object-oriented analysis and design (advanced methods).
  • Worked with UNIX (AIX) shell scripts and Perl scripting.
  • Strong interpersonal, written and verbal communication skills.
  • Highly organized professional with strong technical skills.
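The job-scheduling bullets above can be sketched as a small shell wrapper around IBM's dsjob command-line client plus a crontab entry. This is a minimal sketch: the project name, job name, and paths below are illustrative assumptions, not taken from the resume.

```shell
#!/bin/ksh
# Sketch: run a DataStage job from cron via the dsjob CLI.
# Project/job names and all paths are illustrative assumptions.

# Build the dsjob invocation; with -jobstatus -wait, dsjob blocks
# until the job finishes and returns a non-zero exit code on failure.
build_dsjob_cmd() {
  project="$1"; job="$2"
  echo "dsjob -run -jobstatus -wait $project $job"
}

# Illustrative crontab entry launching the wrapper nightly at 2 AM:
# 0 2 * * * /opt/etl/bin/run_job.sh ITDH LoadFactInvoice >> /var/log/etl/run.log 2>&1

build_dsjob_cmd "ITDH" "LoadFactInvoice"
```

In practice the wrapper would also capture the exit code and alert on failure; the crontab line redirects both stdout and stderr into a run log for troubleshooting.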

Technical Skills:
ETL : DataStage 8.x/7.5.1/7.0/6.0 Parallel/Server, Teradata Utilities, Informatica 8.x/7.x
Data profiling : IBM Information Analyzer 8.1
Databases : Oracle 10g/9i/8i/8/7.x, DB2, SQL Server 2000, Teradata
Data Modeling : Erwin 4.5/4.0/3.5/3.0, Star schema, Snowflake schema
Programming : UNIX Shell Scripting, AIX, C, C++, SQL, PL/SQL, HTML, VB, Perl Scripting
Database Tools : SQL*Loader, SQL*Plus, TOAD
Environment : Windows NT/95/98/2000/XP/Vista, UNIX, MS-DOS

Professional Exp:

Confidential, CA June '11 - present
DataStage Lead Developer

ITDH Data Warehouse
The project extracts data from multiple source systems into a staging area, moves data from staging to dimension and fact tables, and loads ECF and IVOS data into ITDH.

Responsibilities:

  • Developed ETL jobs using DataStage, supporting all ETL processes such as implementing source-to-target mappings.
  • Involved in preparing system testing strategies; designed test cases.
  • Expertise in resolving performance tuning issues. Expertise in UAT, IT test cases, and UT results.
  • Worked extensively on different types of stages such as Aggregator, Merge, Join, Lookup, Remove Duplicates, Transformer, Copy, Filter, Modify, Sort, Investigate, Standardize, Match Frequency, and Survive.
  • Developed various shared containers and stages; validated and fine-tuned ETL logic coded into jobs.
  • Responsible for validating DataStage jobs and sequences against pre-defined ETL design standards.
  • Tuned sources, targets, jobs, stages, and sequences for better performance.

Environment: DataStage 8.5, Information Analyzer, Oracle 10g, flat files, XML files and datasets, UNIX shell scripting (ksh), Windows XP, Crontab for job scheduling

Confidential, Columbus, Ohio Jan '10 - June '11
DataStage Lead Developer
The project covers GPO reporting, developing new jobs and enhancing existing ones. The source feed is Teradata, lookup tables are in Teradata and Oracle, and the target is a sequential file. The model follows starflake schema modeling: the fact table is Invoice, and dimension tables include Product, Time, Short Reason, and Flag. Loading is done on a daily and monthly basis.

Responsibilities:

  • Worked on the Teradata connector to connect and extract data.
  • Worked on Lookup, Join, and Transformer stages and sequential files.
  • Worked on the Oracle connector stage as a reference table for lookups.
  • Wrote transformations using stage variables for jobs, and code for batch jobs to run jobs in parallel.
  • Performed export and import of DataStage components, table definitions, and routines.
  • Developed job sequences with job activities, execute commands, and user variable activities, and implemented logic for job sequence loops, recovery, and restart.
  • Used DataStage Director to clear job logs, job resources, and status files.
  • Experience in the complete life cycle of design, development, maintenance, and documentation of the conversion process. Worked with Dataset, Lookup, Merge, Join, Funnel, and Transformer stages.
  • Prepared technical design and tests (unit/system), and supported user acceptance testing.
  • Involved in setting up ETL standards and helped the client adopt ETL best practices.
  • Extensively used shell scripting for FTP transfers to the K-drive and GPO server.
  • Identified performance bottlenecks and suggested improvements.
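The FTP scripting mentioned above might look like the following minimal sketch, which builds the command batch for an unattended transfer. The host, directory, and file names are hypothetical placeholders, not details from the project.

```shell
# Sketch: generate the command batch for an unattended ftp transfer
# of a GPO extract. Remote directory and file names are placeholders.
build_ftp_batch() {
  localfile="$1"; remotedir="$2"
  printf 'cd %s\nput %s\nbye\n' "$remotedir" "$localfile"
}

# Typical use (server name and paths are assumptions):
# build_ftp_batch gpo_extract.dat /inbound/gpo | ftp -inv gpo-server
```

Separating batch generation from the ftp invocation keeps the script testable and makes it easy to log exactly which commands were sent.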

Environment: DataStage 8.1 Parallel, Oracle 9i/10g, Teradata 12, flat files, UNIX (AIX), Windows XP, and Autosys for job scheduling

Confidential, TX June '08 - Dec '09
DataStage Developer
The creation of LBG by the merger of LTSB and HBOS initiated the need to unify the mortgage data of both organizations. In the current state, the FDM system addresses the reporting/MI requirements of the organization. Going forward, this will be achieved by establishing a Mortgage Data Warehouse (MDW) with industry-standard data architecture (Teradata FS-LDM), strong data governance, a new platform, and standard tools for data exploration, under the Mortgage Integration Business Intelligence Program. Source feeds are Oracle and flat files; the target is a Teradata database.

Responsibilities:

  • Designed jobs in DataStage 8.1 Parallel Extender. Provided hands-on participation, technical guidance, and leadership to support existing data warehouse applications involving DataStage and QualityStage. Performed column analysis using IBM Information Analyzer.
  • Used the MultiLoad stage with Teradata utilities (TPump, MultiLoad) to load into the Teradata database.
  • Worked on Change Data Capture (CDC) for updating targets. Developed reusable jobs with dynamic metadata using runtime column propagation.
  • Wrote transforms using stage variables for jobs, and code for batch jobs to run jobs in parallel.
  • Performed export and import of DataStage components, table definitions, and routines.
  • Developed job sequences and implemented logic for job sequence loops, recovery, and restart.
  • Used DataStage Director to clear job logs, job resources, and status files.
  • Experience in the complete life cycle of design, development, maintenance, and documentation of the conversion process. Worked with Dataset, Lookup, Merge, Join, Funnel, and Transformer stages.
  • Profiled vendor-supplied raw data using Information Analyzer.
  • Prepared technical design and tests (unit/system), and supported user acceptance testing.
  • Involved in setting up ETL standards and helped the client adopt ETL best practices.
  • Extensively used Perl scripting to transform raw data into a standard format.
  • Identified performance bottlenecks and suggested improvements. Used the third-party Maestro tool for scheduling ETL jobs and notifications.
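A MultiLoad control script of the kind referenced above might be generated from a shell wrapper as in this sketch. The log table, logon string, table, and field names are all illustrative assumptions, not objects from the MDW project.

```shell
# Sketch: write a Teradata MultiLoad control script from a shell wrapper.
# All object names and the logon string are illustrative placeholders.
cat > load_accounts.mload <<'EOF'
.LOGTABLE mdw_work.ld_accounts_log;
.LOGON tdpid/etl_user,password;
.BEGIN MLOAD TABLES mdw.accounts;
.LAYOUT acct_layout;
  .FIELD account_id * VARCHAR(18);
  .FIELD balance    * VARCHAR(20);
.DML LABEL ins_acct;
  INSERT INTO mdw.accounts (account_id, balance)
  VALUES (:account_id, :balance);
.IMPORT INFILE accounts.dat LAYOUT acct_layout APPLY ins_acct;
.END MLOAD;
.LOGOFF;
EOF

# The control script would normally then be executed with:
# mload < load_accounts.mload
```

MultiLoad acquires and applies data in bulk phases, which is why it suits the high-volume warehouse loads described here; TPump would be the choice for continuous trickle loads.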

Environment: DataStage 8.1 Parallel Extender with IBM Information Analyzer and QualityStage, Erwin 4.2, Oracle 9i/10g, Teradata 12, flat files, UNIX (AIX), Windows XP, MS Word, Excel, Maestro for job scheduling

Confidential, India Jan '07 - May '08
DataStage Developer
GSK currently has three source systems: PeopleSoft, flat files, and an SAP HR system. The aim is to connect to the three source systems via DataStage, extract data, apply business rules, and load the data into SAP BW (the target system). Data is profiled using IBM Information Analyzer; before data is extracted into the staging area, it must be profiled for various countries such as India, Brazil, and GPS. The jobs connect to two different source systems (DB2, XML files), extract data, apply business logic based on client requirements using stages such as Transformer, Copy, Modify, Aggregator, and Dataset, and finally load into Oracle.

Responsibilities:

  • Involved in reviewing business/system requirements and other deliverables.
  • Designed jobs in DataStage 8.1 Parallel Extender.
  • Developed ETL jobs using DataStage, supporting all ETL processes such as implementing source-to-target mappings.
  • Involved in preparing system testing strategies; designed test cases.
  • Expertise in resolving performance tuning issues. Expertise in UAT, IT test cases, and UT results.
  • Used Autosys for scheduling ETL jobs.
  • Created jobs to extract credit rating data from the enterprise data warehouse.
  • Worked extensively on different types of stages such as Aggregator, Merge, Join, Lookup, Remove Duplicates, Transformer, Copy, Filter, Modify, Sort, Investigate, Standardize, Match Frequency, and Survive.
  • Created stages to look up data from source and target tables. Developed various shared containers and stages; validated and fine-tuned ETL logic coded into jobs.
  • Responsible for validating DataStage jobs and sequences against pre-defined ETL design standards.
  • Tuned sources, targets, jobs, stages, and sequences for better performance. Developed SQL procedures for synchronizing data.
  • Used DataStage Parallel Extender to split data into subsets and load it, utilizing available processors to improve job performance, with configuration management of system resources in the Orchestrate environment.
  • Created user-defined transforms and routines for handling special cases.
  • Extracted data from various sources, transformed it according to requirements, and loaded it into the data warehouse schema using stages such as Sequential File, Aggregator, and Transformer.
  • Worked on the XML Input stage to read XML files and the DB2 Enterprise stage to read tables in the DB2 RDBMS.
  • Extensively used SQL and PL/SQL in DataStage jobs for processing data. Developed UNIX shell scripts and scheduled jobs.
  • Developed batches and sequencers in Designer to run and control sets of jobs.
  • Used DataStage Director and its run-time engine to schedule job runs.
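The SQL/PL/SQL synchronization work described above could be driven from a scheduled shell script along these lines. The schema name, procedure name, and connect string are hypothetical, introduced only for illustration.

```shell
# Sketch: emit a PL/SQL block that calls a (hypothetical) procedure
# synchronizing staging data into the warehouse, for piping into sqlplus.
build_sync_sql() {
  cat <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
BEGIN
  stg_to_dwh.sync_dim_employee;  -- hypothetical sync procedure
  COMMIT;
END;
/
EXIT
EOF
}

# Typical invocation from a scheduled job (connect string is a placeholder):
# build_sync_sql | sqlplus -s etl_user/password@GSKDB
```

The `WHENEVER SQLERROR EXIT FAILURE` directive makes sqlplus return a non-zero exit code on error, so the scheduler can detect and alert on a failed synchronization.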

Environment: DataStage 8.1/7.5, Parallel Extender, DB2, Oracle, XML files, UNIX, shell scripting, and Autosys

Confidential, India May '04 - Dec '06
DataStage Developer
Lloyds TSB is a leading UK-based financial services group whose businesses provide a comprehensive range of banking and financial services in the UK and overseas. The Wholesale Banking project's vision is to build a holistic view of each wholesale customer's relationship with the bank, as a way of improving its ability to manage the needs of the customer and the bank, whether this relates to service, credit, marketing, relationship management, or product design.

Responsibilities:

  • Analyzed high-level requirements for the CVC project.
  • Identified high-level components for the ETL program. Designed ETL components for change requests.
  • Created work and time estimations for the CUT phase.
  • Involved in writing functions and procedures to load data from staging to the warehouse and to the mart.
  • Developed UNIX shell scripts and scheduled jobs.
  • Extensively used SQL and PL/SQL in DataStage jobs for processing data.
  • Wrote scripts to automate DataStage jobs on a daily basis.
  • Knowledge of dimensional modeling techniques to create dimension and fact tables using Erwin.
  • Used Parallel Extender for distributing load among different processors by implementing pipelining and partitioning of data.
  • Involved in creating and administering repositories, folders, and permissions.
  • Used ETL to extract and load data from Oracle and flat files to Oracle.
  • Involved in writing many functions and stored procedures.
  • Created various transformations.
  • Developed mappings to load data into slowly changing dimensions.
  • Created relational connections; migrated mappings from Dev to Test and from Test to Production.
  • Worked with Sequential File, Hashed File, and Transformer stages in designing ETL jobs.

Environment: DataStage Server, PL/SQL, Queryman, Erwin, TOAD 7.0, UNIX (AIX), Oracle

Confidential, India Oct '01 - April '04
ETL Developer
Scottish Widows Investment Partnership is an asset management firm. Assets are invested in major asset classes in domestic and overseas equities, property, bonds, and cash. SWIP gets data from different trading companies such as S&P and Bloomberg, and uses this data to analyze the performance of its investments.

Responsibilities:

  • Worked extensively with flat files containing various data coming from various sources.
  • Worked with stages such as Transformer, Hashed File, Aggregator, Sort, etc.
  • Used DataStage Designer to create complex mappings.
  • Involved in the design phase of the logical and physical data model using Erwin 4.0.
  • Involved in designing the Star Schema for business processes.
  • Developed UNIX shell scripts and scheduled jobs.
  • Wrote scripts to automate DataStage jobs on a daily basis.
  • Extracted data from Oracle and flat files and loaded it into the data warehouse.
  • Developed complex mappings using multiple sources and targets in different databases.
  • Developed transformation logic to load data into the data warehouse.
  • Used Oracle, Sequential File, Hashed File, and Transformer stages in designing ETL jobs.
  • Used DataStage for migrating data from various OLTP servers/databases.
  • Involved in understanding the business requirements and mapping them to technical requirements.

Environment: DataStage Server, flat files, Windows Server 2000

Education
Master of Computer Applications (MCA)
