
Sr. ETL DataStage Developer Resume


Hartford, CT

SUMMARY

  • Highly competent DataStage Developer/Consultant with over 7 years of experience in information technology in the design, development, administration and implementation of various database and data warehouse technologies (IBM DataStage v8.x/7.x and Informatica) using client components such as Administrator, Manager, Designer and Director.
  • Over 5 years’ experience writing complex SQL queries and PL/SQL, including stored procedures, functions and triggers to implement business rules and validations in Microsoft SQL Server and Oracle 11g/10g/9i.
  • Involved in complete Software Development life - cycle (SDLC) of various projects, including requirements gathering, system designing, data modeling, ETL development, production enhancement, support and maintenance.
  • Extensive experience in analysis and design of database including ER Diagrams and Normalization techniques.
  • Strong knowledge of UNIX shell scripting (C, Bourne, Korn and Bash shells) on AIX, HP-UX, SunOS and RHEL, as well as C, C++, Perl and Python programming.
  • Experience with Star and Snowflake Schema, Data Modeling, Fact and Dimensional Tables and Slowly Changing Dimensions.
  • Experience includes working in various industries like Financial, Supply chain management, Healthcare, Insurance and Retail.
  • Experienced in scheduling sequence, parallel and server jobs using DataStage Director, UNIX scripts and scheduling tools.
  • Extensively worked on performance tuning and backup on databases like Oracle 10g/9i/8i.
  • Developed parallel jobs using different processing stages like Transformer, Aggregator, Lookup, Join, Sort, Copy, Merge, Funnel, CDC, Change Apply and Filter.
  • Experience in integration of various data sources like Teradata, Oracle, DB2, SQL Server, MS Access, Sybase, Informix, MS Excel and Flat files into the Staging Area. Extensively worked with materialized views and TOAD.
  • Imported the required Metadata from heterogeneous sources at the project level.
  • Good experience in scheduling jobs using AutoSys, Tivoli, Zeke and crontab.
  • Knowledge in using PL/SQL to write stored procedures, functions, and triggers.
  • Generated Surrogate IDs for the dimensions in the fact table for indexed and faster access of data in server jobs.
  • Created local and shared containers to facilitate ease and reuse of jobs.
  • Experience in resolving Data Transformation, Cleansing and Capturing rejects and Exception and Error reporting.
  • Extensively worked on the SAP MDM Pack to deliver data for online analytical processing (OLAP).
  • Good knowledge of reporting tools such as Business Objects, Cognos 7 and MicroStrategy.
  • Good experience in Transforming Business specific rules into functional Specs.
  • Experience in production support: extensively worked on production support issues and resolved them using session logs and workflow logs, and used e-mail tasks to capture issues via e-mail along with the session logs.
  • Working experience in interacting with business analysts and developers to analyze the user requirements, functional specifications and system specifications.
  • Ability to work autonomously and as part of a team under tight deadlines to meet project expectations.
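The Type II slowly changing dimension loads mentioned above can be sketched in Python; this is a minimal teaching sketch of the expire-and-version pattern, with dimension rows held as dicts and field names (`effective_from`, `effective_to`) that are illustrative, not from any specific project.

```python
from datetime import date

def apply_scd_type2(dimension, incoming, key, tracked, today=None):
    """Type II SCD logic: when a tracked attribute changes, expire the
    current row and append a new version; otherwise leave it alone.
    Field names here are illustrative placeholders."""
    today = today or date.today().isoformat()
    # index the currently-active row for each business key
    current = {row[key]: row for row in dimension if row["effective_to"] is None}
    for rec in incoming:
        cur = current.get(rec[key])
        if cur is None:
            # brand-new business key: insert the first version
            dimension.append({**rec, "effective_from": today, "effective_to": None})
        elif any(cur[f] != rec[f] for f in tracked):
            # a tracked attribute changed: expire the old row, add a new version
            cur["effective_to"] = today
            dimension.append({**rec, "effective_from": today, "effective_to": None})
    return dimension
```

A Type I change would instead overwrite the attribute in place, losing history; the Type II version above keeps every expired row queryable by its effective-date range.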

TECHNICAL SKILLS

Data Warehouse: IBM WebSphere DataStage 8.5/8.0.1/7.5/7.1 (Designer, Director, Manager, and Administrator), MetaStage 6.0/QualityStage, Parallel Extender 6.0, Orchestrate, Information Analyzer/ProfileStage, and Informatica 7.x/6.x.

Dimensional Modeling: IBM Rational Rose, ERwin R8.1 and Visio Diagram.

Reporting Tools: Crystal Reports 6.x/5.x, Cognos 8.x/7.x, MicroStrategy and MS Access Reports.

Databases: Oracle 11g/10g/9i, DB2, Teradata R12, MS SQL Server 2005/2000/7.0, Informix, MS Access 97/2000/2007.

Languages: SQL, PL/SQL, C, C++, VC++, Perl, Java, .Net, JDBC, VB, and XML.

Unix Tools: Shell scripts (C Shell, K Shell, and Bourne Shell), Perl, Python, AWK, VI, and SED

Operating Systems: HP-UX, IBM AIX, Sun Solaris, Red Hat Linux, Red Hat Enterprise Linux 4 AS/3, SUSE Linux, Windows 2000/XP/2003/Vista

Other Tools and Applications: SQL*Plus, SQL*Loader, SQL Developer, MS Word, MS Excel, MS PowerPoint, MS Project, Minitab, HTML, PuTTY, WinSCP.

PROFESSIONAL EXPERIENCE

Confidential, Hartford, CT

Sr. ETL Data Stage Developer

Responsibilities:

  • Analyzed, designed, developed, implemented and maintained parallel jobs using IBM InfoSphere DataStage.
  • Involved in the design of dimensional data models - Star schema and Snowflake schema.
  • Generated DB scripts from the data modeling tool and created physical tables in the database.
  • Worked with SCDs to populate Type I and Type II slowly changing dimension tables from several operational source files.
  • Created some routines (Before-After, Transform function) used across the project.
  • Experienced in PX file stages that include Complex Flat File stage, DataSet stage, LookUp File Stage, Sequential file stage.
  • Implemented Shared container for multiple jobs and Local containers for same job as per requirements.
  • Adept knowledge and experience in mapping source to target data using IBM Data Stage 8.x
  • Implemented multi-node declaration using configuration files (APT Config file) for performance enhancement.
  • Experienced in developing parallel jobs using various Development/debug stages (Peek stage, Head & Tail Stage, Row generator stage, Column generator stage, Sample Stage) and processing stages (Aggregator, Change Capture, Change Apply, Filter, Sort & Merge, Funnel, Remove Duplicate Stage)
  • Debug, test and fix the transformation logic applied in the parallel jobs
  • Involved in creating UNIX shell scripts for database connectivity and executing queries in parallel job execution.
  • Used the ETL DataStage Director to schedule and run jobs, test and debug their components, and monitor performance statistics.
  • Experienced in using SQL *Loader and import utility in TOAD to populate tables in the data warehouse.
  • Successfully implemented pipeline and partitioning parallelism techniques and ensured load balancing of data.
  • Deployed different partitioning methods like Hash by column, Round Robin, Entire, Modulus, and Range for bulk data loading and for performance boost.
  • Repartitioned job flow by determining DataStage PX best available resource consumption.
  • Created Universes and reports in Business Objects Designer.
  • Created, implemented, modified and maintained simple to complex business reports using the Business Objects reporting module.
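The partitioning methods deployed above (hash, round robin, entire, modulus) can be illustrated with a small Python sketch of how a parallel engine distributes rows across nodes; this is a teaching model of the concepts, not DataStage's actual implementation.

```python
def partition(rows, nodes, method="roundrobin", key=None):
    """Distribute rows across `nodes` partitions the way a parallel
    ETL engine might; method names mirror the DataStage options,
    but the logic is a simplified illustration."""
    parts = [[] for _ in range(nodes)]
    for i, row in enumerate(rows):
        if method == "roundrobin":      # even spread, ignores key values
            parts[i % nodes].append(row)
        elif method == "hash":          # same key value -> same node
            parts[hash(row[key]) % nodes].append(row)
        elif method == "modulus":       # numeric key modulo node count
            parts[row[key] % nodes].append(row)
        elif method == "entire":        # full copy of the data on every node
            for p in parts:
                p.append(row)
        else:
            raise ValueError(f"unknown method: {method}")
    return parts
```

Round robin balances load best for independent rows, while hash and modulus keep all rows for one key on one node (needed before joins or aggregations); entire is what a lookup reference link typically needs.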

Environment: IBM InfoSphere DataStage 8.5, Oracle 11g, Flat files, AutoSys, UNIX, Erwin, TOAD, MS SQL Server database, XML files, MS Access database.

Confidential, Charlotte, NC

ETL Datastage Developer

Responsibilities:

  • Designed the ETL jobs using IBM WebSphere DataStage 8.0.1 to extract, transform and load the data into Staging and then into the target database.
  • Extensively used the designer to develop various parallel jobs to extract, transform, integrate and load the data into Corporate Data warehouse (CDW).
  • Designed and developed the ETL jobs using Parallel edition which distributed the incoming data concurrently across all the processors, to achieve the best performance.
  • Handled Performance Tuning of Jobs to ensure faster Data Loads
  • Designed parallel jobs using stages such as Join, Merge, Lookup, Remove Duplicates, Copy, Filter, Funnel, Dataset, Pivot, Sort, Surrogate Key Generator, Change Data Capture (CDC), Modify, Row Generator and Aggregator.
  • Created Master controlling sequencer jobs using DataStage Job Sequencer.
  • Extensively worked with Shared Containers for Re-using the Business functionality.
  • Extensively developed and deployed UNIX Shell scripts as wrappers that provide values to DataStage jobs during runtime.
  • Created Job Parameters and Environment variables to run the same job for different sources and targets.
  • Used the Director to monitor jobs and to run and validate their components.
  • Involved in the Performance Tuning of the DataStage jobs using different partition methodologies and node configurations of the environment variable file, designing and editing configurations, increasing the reading as well as the writing speed while fetching or loading data to files or databases.
  • Extensively worked with job export, job import and multi-job compilation.
  • Provided data models and dataflow (extract, transform and load analysis) of the data marts and feeder/target systems in the aggregation effort.
  • Migrated projects from development to QA to Production environments
  • Performed the Integration and System testing on the ETL processes.
  • Took regular backups of the jobs using the DataStage Export/Import utility.
  • Worked with the BI team to apply business rules for OLAP and to design the Framework models.
  • Assisted the operations support team with transactional data loads by developing SQL and UNIX scripts.
  • Scheduled jobs using Autosys job scheduler utility based on the requirements and monitored the production closely for any possible errors.
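The wrapper scripts above, which pass values to DataStage jobs at runtime, typically boil down to assembling a `dsjob -run` command line with `-param` flags. A Python sketch of that argument construction follows; the `dsjob` flags shown are the standard CLI options, but the project and job names are placeholders.

```python
def build_dsjob_command(project, job, params, wait=True):
    """Assemble a `dsjob -run` command line that passes job parameters
    at runtime; `project` and `job` here are illustrative names."""
    cmd = ["dsjob", "-run"]
    for name, value in sorted(params.items()):
        cmd += ["-param", f"{name}={value}"]   # one -param flag per value
    if wait:
        cmd.append("-wait")                    # block until the job finishes
    cmd += [project, job]
    return cmd
```

A real wrapper would hand this list to the shell (e.g. via `subprocess.run`) and inspect the exit status; building the arguments as a list rather than a string avoids quoting problems with parameter values.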

Environment: IBM WebSphere DataStage 8.0.1, IBM AIX 5.2, Oracle 10g, XML files, AutoSys, MS SQL Server database, sequential flat files, TOAD.

Confidential, Canton, OH

ETL Datastage Developer

Responsibilities:

  • Involved in analyzing the scope of the application and defining relationships within and between groups of data, star schema, etc.
  • Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.
  • Used Ardent DataStage Designer/Manager/Director to design and run jobs to implement the Import and Export Interfaces.
  • Extensively used DataStage Designer components to design various parallel jobs in accordance with business specs.
  • Monitored the performance of Parallel jobs by turning on various Environment Variables.
  • Worked with various databases of the Teradata warehouse in the project, such as Stage, Target, Work and Error databases.
  • Created testbeds for Datastage plug-ins for DB2, Oracle and Teradata to test extracting data from these databases.
  • Extensive experience in creating and loading data warehouse tables like dimensional, fact and aggregate tables using Ardent Data Stage.
  • Used Partition methods and collecting methods for implementing parallel processing.
  • Used Job Control routines and Transform functions in the process of designing the job.
  • Developed jobs in Parallel Extender using different stages like Transformer, Aggregator, Lookup, Source Dataset, External Filter, Row Generator, and Column Generator.
  • Involved in Designing and developing universes for reporting generation from warehouse databases.
  • Made user resources more flexible per business requirements by exporting Universes and BO reports.
  • Developed scrubbing routines to clean address information for Vendors and Customers in QualityStage.
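The address scrubbing routines mentioned above follow a common standardization pattern: uppercase, strip punctuation, collapse whitespace, and expand street-type abbreviations. A minimal Python sketch of that pattern is below; the abbreviation map is illustrative, not the project's actual QualityStage rule set.

```python
import re

# illustrative abbreviation map, not a project's actual rule set
STREET_ABBREVIATIONS = {"ST": "STREET", "AVE": "AVENUE",
                        "RD": "ROAD", "BLVD": "BOULEVARD"}

def scrub_address(raw):
    """Standardize a free-form address line the way a scrubbing
    routine might: uppercase, drop punctuation, collapse runs of
    whitespace, and expand common street abbreviations."""
    cleaned = re.sub(r"[^\w\s]", " ", raw.upper())   # punctuation -> spaces
    tokens = cleaned.split()                          # also collapses whitespace
    tokens = [STREET_ABBREVIATIONS.get(t, t) for t in tokens]
    return " ".join(tokens)
```

Standardizing both the vendor and customer files through the same routine before matching is what makes duplicate detection across the two sources reliable.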

Environment: Ascential DataStage 7.5 (DataStage Manager, DataStage Designer, DataStage Administrator, DataStage Director, Orchestrate, Integrity), Parallel Extender, ETL, PL/SQL, UNIX shell programming, Erwin, Windows NT 4.0 Server, DB2 UDB, Sybase and UNIX.

Confidential, Richmond, VA

Data Stage Developer

Responsibilities:

  • Involved in creating table definitions, indexes, views, sequences and materialized views.
  • Prepared documentation for addressing the referential integrity relations in between the tables at ETL level
  • Redesigned the existing server jobs with a different logical approach to improve the performance
  • Extensively used Ascential DataStage Designer for creating DataStage Jobs and created Shared Containers for reusability.
  • Extensively used all the stages in Server Jobs like OCI, Hash File, Transformer, Sequential File, Link Partitioner, Link Collector and IPC.
  • Used the DataStage Director and its run-time engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions (on an ad hoc or scheduled basis)
  • Worked with DataStage Manager for importing metadata from repository, new job categories and creating new data elements
  • Involved in designing the procedures for getting the data from all systems to Operational Data Store.
  • Extensively used DataStage Designer components to design various parallel jobs in accordance with business specs.
  • Created DataStage jobs, batches and job sequences and tuned them for better performance.
  • Used Job Control routines and Transform functions in the process of designing the job.
  • Worked on programs for scheduling data loading and transformations using DataStage from DB2 to Oracle 9i using SQL*Loader and PL/SQL.
  • Extensively worked with various stages in parallel Extender like Sequential file, Dataset, lookup, peek, transformer, Merge, Aggregator, row generator and many more to design jobs and load into Dimension and Fact tables.
  • Involved in performance tuning of the ETL process and performed the data warehouse testing
  • Prepared documentation including requirement specification.
  • Designed XML stages for reading XML log files to capture DataStage job audit data.
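Capturing job audit data from XML logs, as in the last bullet, amounts to walking the log tree and pulling out run status and row counts. A Python sketch with the standard-library XML parser follows; the element and attribute names (`jobrun`, `link`, `rows`) are illustrative, since real DataStage log layouts vary by site.

```python
import xml.etree.ElementTree as ET

def summarize_job_audit(xml_text):
    """Extract job name, run status and total row count from an XML
    audit log; element/attribute names are illustrative placeholders."""
    root = ET.fromstring(xml_text)
    return [
        {
            "job": run.get("name"),
            "status": run.get("status"),
            # total rows moved across all links in this run
            "rows": sum(int(link.get("rows", 0)) for link in run.iter("link")),
        }
        for run in root.iter("jobrun")
    ]
```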

Environment: DataStage 7.5/EE/PX, MetaStage, DB2, Oracle 9i/8i, SQL, PL/SQL, SQL*Loader, Erwin 3.5, SQL Server, Sybase, Windows NT.

Confidential, Jersey City, New Jersey

ETL Developer

Responsibilities:

  • Developed and designed various new processes and fixed existing processes to meet new business requirements gathered in meetings with users.
  • Designed and developed jobs for extracting, transforming, integrating, and loading data into data mart using DataStage Designer
  • Developed, executed, monitored and validated the ETL DataStage jobs in the DataStage designer and Director Components.
  • Worked with DataStage Director to schedule, monitor, analyze performance of individual stages and run DataStage jobs.
  • Extensively used change capture, Transformer, Modify, copy, Join, Funnel, Aggregator, Lookup Stages and development stages to develop the parallel jobs.
  • Generated Surrogate Keys for composite attributes while loading the data into Data Warehouse using Key Management functions.
  • Developed user defined Routines and Transformations for implementing Complex business logic.
  • Developed job sequencers and batches and edited job control to run jobs in sequence.
  • Imported Metadata from Oracle database. Imported metadata definitions into the repository. Exported and imported DataStage components using DataStage Manager.
  • Involved in the preparation of ETL documentation by following the business rule, procedures and naming conventions
  • Performed Troubleshooting and Tuning of DataStage Jobs for better query performance.
  • Reviewing the developed jobs based on the build review checklists.
  • Responsible for UNIT, System and Integration testing. Developed Test scripts, Test plan and Test Data.
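The surrogate keys for composite attributes mentioned above are, conceptually, stable integer IDs assigned per distinct natural-key tuple. A minimal Python sketch of that idea follows; it illustrates the key-management concept, not DataStage's own key management functions, and the field names are placeholders.

```python
import itertools

def surrogate_keys(rows, natural_key_fields, start=1):
    """Assign stable integer surrogate keys per distinct composite
    natural key: the same tuple of natural-key values always maps
    to the same surrogate, and new tuples get the next integer."""
    assigned, counter = {}, itertools.count(start)
    out = []
    for row in rows:
        natural = tuple(row[f] for f in natural_key_fields)
        if natural not in assigned:
            assigned[natural] = next(counter)   # new composite key -> next id
        out.append({**row, "sk": assigned[natural]})
    return out
```

In a warehouse load the `assigned` map would be persisted (the key-management state), so surrogate keys stay stable across runs and fact rows can join to dimensions on a single integer column instead of the composite natural key.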

Environment: Ascential DataStage 7.X/6.X, PVCS Version Controller, Citrix, Mercury Test Director, Oracle, Teradata and UNIX.
