Sr. Datastage Developer Resume
VA
SUMMARY
- 9 years of experience in Information Technology with a wide-ranging skill set in system analysis, design, development, testing, and implementation of ETL methodologies across all phases of the Software Development Life Cycle (SDLC).
- As a certified ETL consultant, held responsibilities of Sr. DataStage Developer/Tester for all ETL development projects.
- Certified in ETL InfoSphere DataStage v8.5, DB2 Fundamentals
- Expertise in data analysis, design, development, implementation, and testing of data warehousing, covering data extraction, data transformation, and data loading using DataStage.
- Good hands-on experience with RDBMSs including Oracle, Netezza, SQL Server, Teradata, and DB2.
- Working experience in data modeling and implementing stored procedures using PL/SQL.
- Extensive knowledge of writing complex queries using SQL.
- Worked extensively on different types of stages like Filter, Join, Merge, Lookup, Funnel, Aggregator, Transformer, Sort, Change Capture, Slowly Changing Dimension, Surrogate Key Generator, Remove Duplicates, Sequential File, Dataset, Copy, Enterprise/Connector stages for Databases (DB2, Oracle), ODBC Connector, XML Parser (Hierarchy stage), Shared Containers, Pivot Enterprise, Modify for developing jobs.
- Experience in Quality Assurance (QA) testing Data Warehouse, Database (ETL & BI).
- Sound knowledge of Netezza SQL, IBM QualityStage 8.1, and Information Analyzer 8.1.
- Extensive exposure to Star, Snowflake Schema and Multidimensional Data Models.
- Experience in extraction, cleansing, integration, and loading of data from/to disparate data sources.
- Strong knowledge of Extraction, Transformation and Loading (ETL) processes using Ascential DataStage, UNIX shell scripting, and SQL*Loader.
- Expertise in working with various operational sources such as DB2, DB2 UDB, Oracle, Teradata, and flat files, loading them into a staging area.
- Strong experience in loading and maintaining data warehouses and data marts using DataStage ETL processes.
- Experience using XML in DataStage integration.
- Hands-on experience in writing, testing, and implementing triggers, procedures, and functions at the database level using PL/SQL.
- Involved in requirements gathering and logical/physical design model meetings to understand data flow and the frequency of data loads into the EDW; created S2T mapping documents and Visio diagrams.
- Excellent experience in extracting data from Teradata, DB2, and Oracle and loading into an EDW.
- Extensive experience in loading high volume data, and performance tuning.
- Involved in various phases of project life cycle, such as requirements definition, functional & technical design, testing, Production Support and implementation.
- Excellent team member with problem-solving and troubleshooting capabilities; a quick learner, highly motivated, result-oriented, and an enthusiastic team player.
TECHNICAL SKILLS
RDBMS: Netezza, Oracle, IBM DB2, MS SQL Server, Sybase
Programming languages: SQL, PL/SQL, UNIX Shell scripts.
Business Analysis: Functional Requirements Gathering, Business Requirements Gathering, Process Flow Diagrams, Data Flow Diagrams, Data Analysis, Requirements Analysis, Design, Data Modeling, Data Architecture, Data Flow, Data Cleansing
Defect Tracking & Scheduling Tools: Rational Quality Manager (RQM)/CQ, HP Quality Center/ALM, Tivoli Workload Scheduler (TWS), AutoSys, Control-M
Reporting Tools: Cognos, Crystal Reports.
Operating Systems: Sun Solaris, HP-UX, IBM AIX, Red Hat Linux
Dimensional Data Modeling: Dimensional Data Modeling, Data Modeling, Star Join Schema Modeling, Snowflake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling, Visio
ETL: IBM DataStage 11.5/9.1/8.5/8.1/7.5.2
PROFESSIONAL EXPERIENCE
Confidential - VA
Sr. DataStage Developer
Responsibilities:
- Involved in data modeling session, developed technical design documents.
- Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.
- Created mappings using pushdown optimization to achieve good performance in loading data into Netezza.
- Created the scheduling plan and job execution timings and shared them with the scheduling team (Control-M).
- Migrated existing Teradata BTEQ scripts to Netezza NZSQL, keeping the business logic the same and validating results across both systems.
- Used the Administrator client to create, delete, and configure projects.
- Developed parallel jobs using various Development/Debug stages (Peek, Head and Tail, Row Generator, Column Generator, Sample) and Processing stages (Aggregator, Change Capture, Change Apply, Filter, Sort, Merge, Funnel, Remove Duplicates).
- Defined the process flow using DataStage job sequences and scheduled the job sequences using Tivoli Workload Scheduler (TWS).
- Prepared the TWS job stream and job info files which are required to upload in TWS database.
- Created VAR tables in TWS and executed and monitored the jobs through TWS.
- Used DataStage Designer and Director, based on business requirements and business rules, to load data from source to target tables.
- Modified the existing job with new functionality in the code. Prepared the test cases for system test.
- Worked extensively with PL/SQL, T-SQL, stored procedures, database triggers, and SQL*Loader; resolved defects raised by QA.
- Established best practices for DataStage jobs to ensure optimal performance, reusability, and restartability.
- Involved in data discovery and profiling of different source data.
- Created unit test case documents and coordinated with other teams on integration testing, along with volume testing and quality testing.
- Involved in performance tuning of long running jobs.
- Reviewed code developed by subordinates for naming standards and best practices.
- Worked on different stages for creating the jobs based upon business application.
- Created a new database to monitor the growth and performance of different instances.
- Developed UNIX shell scripts and updated the log for the backups.
- Involved in unit testing of the jobs and the data loaded into the target database.
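The backup-logging shell scripts mentioned above can be sketched in portable shell. Everything here (directory names, file names, log format) is illustrative rather than taken from the actual project:

```shell
#!/bin/sh
# Minimal sketch of a backup wrapper that appends each result to a run log.
# BACKUP_DIR, LOG_FILE, and dw_export.dat are hypothetical names.
BACKUP_DIR=${BACKUP_DIR:-/tmp/dw_backup}
LOG_FILE=${LOG_FILE:-$BACKUP_DIR/backup.log}
mkdir -p "$BACKUP_DIR"

backup_file() {
    src=$1
    stamp=$(date '+%Y%m%d_%H%M%S')
    dest="$BACKUP_DIR/$(basename "$src").$stamp"
    if cp "$src" "$dest"; then
        echo "$(date '+%Y-%m-%d %H:%M:%S') OK  $src -> $dest" >> "$LOG_FILE"
    else
        echo "$(date '+%Y-%m-%d %H:%M:%S') ERR $src" >> "$LOG_FILE"
        return 1
    fi
}

# Example: back up one extract file.
echo "sample extract" > /tmp/dw_export.dat
backup_file /tmp/dw_export.dat
```

Appending a timestamped status line per file keeps the log usable both for audits and for restartability checks.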
Environment: IBM WebSphere DataStage 11.3 (Designer, Director, Administrator), QualityStage, Oracle 9i, TWS, DB2 UDB 8, Teradata V13, Linux, SQL, PL/SQL, UNIX shell scripting, DataStage Version Control, MS SQL Server, Netezza 4.x, Mainframe, AutoSys, Information Analyzer.
Confidential, Minnetonka- MN
Sr. DataStage Developer
Responsibilities:
- Understood business requirements and designed the ETL flow in DataStage per the mapping sheet; performed unit testing and review activities.
- Handled change requests: understood the application workflow in existing DataStage jobs, applied new changes to them, and performed testing and review activities.
- Adhered to process by creating and posting all required documents and deliverables, such as the UTC document and review checklist.
- Defined the process flow using DataStage job sequences and scheduled the job sequences using Tivoli Workload Scheduler (TWS).
- Executed jobs through TWS and monitored them.
- Worked with business analysts to identify and develop business requirements, transformed them into technical requirements, and was responsible for deliverables.
- Provided staging solutions for data validation and cleansing with DataStage ETL jobs.
- Used the DataStage Designer to develop processes for extracting, transforming, integrating, and loading data into the Enterprise Data Warehouse.
- Used Parallel Extender for Parallel Processing for improving performance when extracting the data from the sources.
- Used various Parallel Extender partitioning and collecting methods.
- Extensively worked with job sequences using Job Activity, Email Notification, Sequencer, and Wait For File activities to control and execute DataStage parallel jobs.
- Created reusable components using parallel shared containers.
- Defined Stage variables for data validations and data filtering process.
- Tuned DataStage jobs for better performance by creating DataStage hashed files for staging data and lookups.
- Used DataStage Director for running the jobs.
- Wrote DataStage routines to implement business logic.
- Wrote shell scripts extensively for a variety of scenarios.
- Implemented Debugging Methodologies with Break Point Options.
- Designed and implemented slowly changing dimensions and methodologies.
- Transferred data between various systems via FTP.
- Wrote DataStage routines for data validations.
- Wrote batch job controls to automate the execution of DataStage jobs.
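Batch job controls of the kind described above are typically thin shell wrappers around the DataStage `dsjob` command-line client. A minimal sketch, with hypothetical project and job names, that only builds and prints the command rather than executing it:

```shell
#!/bin/sh
# Sketch of a batch-control wrapper around the DataStage `dsjob` CLI.
# Project and job names are placeholders; paths vary by installation.
PROJECT=DWH_PROJ
JOB=seq_load_customer

build_dsjob_cmd() {
    # -run launches the job; -jobstatus makes the exit code reflect the
    # job's finishing status; -param passes a runtime parameter.
    echo "dsjob -run -jobstatus -param RunDate=$(date +%Y-%m-%d) $1 $2"
}

CMD=$(build_dsjob_cmd "$PROJECT" "$JOB")
echo "$CMD"
# In production the command would be executed and its status checked, e.g.:
# eval "$CMD"; rc=$?; [ "$rc" -le 2 ] || echo "job $JOB failed (rc=$rc)"
```

Keeping the command construction in one function makes it easy to add standard parameters (run date, environment) consistently across every scheduled job.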
Environment: DataStage 8.0.1, Oracle 10g, DB2 UDB 9.0, Teradata, TWS, AIX 5.1, UNIX, XML, Mainframe system.
Confidential -Bloomington, IL
Sr. Data Stage Developer
Responsibilities:
- Used DataStage 8.5 to transform a variety of financial transaction files from different product platforms into standardized data.
- Designed ETL jobs incorporating complex transformation methodologies using DataStage, resulting in efficient interfaces between source and target systems.
- Developed ETL jobs to load data from VSAM, GDG, IMS, DB2 databases, Flat files, CSV files to Target and experience with high volume databases on Mainframes.
- Worked with stages like Complex Flat File, Transformer, Aggregator, Sort, Join, Lookup, and Data masking pack.
- Coordinated with client managers, business architects, and data architects on sign-offs for data models, ETL design docs, testing docs, migrations, and end-user review specs.
- Primarily involved in Job Design, Technical Reviews and Troubleshooting of jobs.
- Extensively involved in different Team review meetings and conferences with remote team.
- Participated in requirements gathering and created Source to Target mappings for development.
- Extensively designed, developed and implemented Parallel Extender jobs using Parallel Processing (Pipeline and partition) techniques to improve job performance while working with bulk data sources.
- Created and used Data Stage Shared Containers, Local Containers for DS jobs.
- Extensively worked on job sequences to control execution of the job flow using various triggers (conditional and unconditional) and activities such as Job Activity, Email Notification, Sequencer, Routine Activity, and Exec Command.
- Tuning the jobs for optimum performance.
- Used DataStage Director to validate, run, and monitor DataStage jobs.
- Experienced in generating and interpreting mapping documentation and translating it into detailed design specifications for ETL code.
- Resolved QA and UAT issues for DataStage jobs.
- Performed unit testing of developed jobs to ensure they met requirements.
- Extensively involved with business team for analyzing the source systems data and building the design documents.
- Extensively worked with architects and proposed solutions in building common design approach for building the Job control and error recording tables.
- Prepared the naming standards document and low-level design documents, which were used across projects.
- Prepared the ETL job run dependency list in discussion with the scheduling team and the Java extracts team, considering the load and availability of various systems.
- Prepared mapping documents, technical design document and process flow documents using Visio.
- Prepared integration test case plans and test scenarios along with testing.
- Extensively used advanced DataStage data warehousing capabilities across almost all processing stages, including Change Capture, Lookup, Join, Filter, Funnel, and Surrogate Key stages.
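A job run dependency list like the one described above can be enforced by a small shell helper when sequences shell out via Exec Command activities. This is a sketch under stated assumptions: job names, the marker directory, and the done-marker convention are all hypothetical, and `echo "running ..."` stands in for a real `dsjob` call:

```shell
#!/bin/sh
# Each job runs only after its prerequisite has written a done-marker.
MARKER_DIR=${MARKER_DIR:-/tmp/run_markers}
mkdir -p "$MARKER_DIR"
rm -f "$MARKER_DIR"/*.done

run_job() {
    job=$1; prereq=$2
    if [ -n "$prereq" ] && [ ! -f "$MARKER_DIR/$prereq.done" ]; then
        echo "skip $job: prerequisite $prereq not complete"
        return 1
    fi
    echo "running $job"                 # placeholder for the dsjob invocation
    touch "$MARKER_DIR/$job.done"       # mark completion for downstream jobs
}

run_job stage_load ""                   # no prerequisite
run_job dim_load   stage_load           # runs once stage_load is done
run_job fact_load  dim_load
```

The marker-file approach also gives a natural restart point: rerunning the script skips nothing that already completed only if the markers are preserved between runs.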
Environment: IBM DataStage 8.7 (Director, Designer, Administrator), IBM DB2, UNIX, Oracle, Teradata, Control-M, AutoSys, SQL Server, Mainframes.
Confidential -Deerfield, IL
Sr. Datastage Developer
Responsibilities:
- Gathered requirements for the new billing and payment system, developed to support the overall goal of improved SLA performance within the Claim Solutions organization by reducing server load from billing and payment transactions.
- Design and develop queries to extract historical data.
- Low level design of transformation logic for claims processing.
- Analyzing metadata for EDI transactional data used in B2B communication and designing ETL interface for processing EDI data.
- Used CVS as the version control system.
- Designed, developed, and quality-tested DataStage code to perform the validations necessary to meet business requirements.
- Supported UAT, integration testing, and performance testing.
- Worked on batch architecture to convert a DataStage job into a multi-instance job, parameterized and ran the processes on a daily basis, and created/submitted runsheets for ESP scheduling and automation.
- Wrote shell scripts to trigger multi-instance DataStage jobs in parallel and to archive the files after processing.
- Handled ad hoc requests from different teams and data extraction by modifying parameters.
- Designed batch architecture to define DataStage job parameters in database tables and execute DataStage jobs on UNIX by calling the process IDs assigned in the database.
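The parallel multi-instance trigger-and-archive pattern described above can be sketched as a fan-out loop in shell. Directory names are hypothetical, and `process_file` is a stub standing in for a `dsjob` call with a distinct invocation id per instance:

```shell
#!/bin/sh
# Launch one worker per input file in parallel, then archive processed files.
IN_DIR=${IN_DIR:-/tmp/inbound}
ARCHIVE_DIR=${ARCHIVE_DIR:-/tmp/archive}
mkdir -p "$IN_DIR" "$ARCHIVE_DIR"

process_file() {
    # Placeholder for something like:
    #   dsjob -run -jobstatus $PROJECT $JOB.$(basename "$1")
    wc -l < "$1" > /dev/null
}

printf 'a\n' > "$IN_DIR/claims_1.dat"       # sample inbound files
printf 'b\n' > "$IN_DIR/claims_2.dat"

for f in "$IN_DIR"/*.dat; do
    process_file "$f" &                      # one background instance per file
done
wait                                         # block until all instances finish

for f in "$IN_DIR"/*.dat; do                 # archive only after all succeed
    mv "$f" "$ARCHIVE_DIR/$(basename "$f").$(date +%Y%m%d)"
done
```

Archiving only after `wait` returns keeps a failed run's inputs in place for a clean rerun.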
Environment: IBM InfoSphere/WebSphere DataStage Enterprise Edition 8.0.1, Oracle 10g, AIX 5.3, Erwin, PuTTY, ESP Scheduler, Windows 2000/XP, Toad for Oracle.
Confidential, Littleton, MA
DataStage Developer
Responsibilities:
- Understood and gathered business requirements and mappings, and prepared design flow and technical specification documents.
- Used PL/SQL to write stored procedures, functions, and triggers; involved in updating stored procedures and in performance testing of stored-procedure-based DataStage jobs.
- Defined and implemented approaches to load and extract data from database usingDataStage.
- Migrated data by calling web service or API URLs and loading it into databases through the Hierarchical stage and extract steps.
- Connected to SAP and Salesforce APIs by invoking the login operation with a username and password to obtain the server URL and session ID used for subsequent API calls.
- Used the queryMore call to retrieve records when a result set exceeded 2,000 records.
- Worked closely with data warehouse architect and business intelligence analyst in developing solutions.
- Used Aggregator, Lookup, Join, Merge, Data Set, Transformer, Sequencer, Sequential File, DB2 Bulk Load, Hashed File, and Surrogate Key Generator stages.
- Involved in developing Cognos reports by writing complex SQL queries; executed jobs through sequencers for better performance and easier maintenance.
- Involved in unit, performance and integration testing of Data Stage jobs.
- Used DataStage Director to run and monitor the jobs and review performance statistics.
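The queryMore-style retrieval mentioned above follows a simple pagination pattern: keep fetching batches until the server reports the result set is exhausted. The sketch below simulates that loop in shell; `fetch_batch` is a stub standing in for the real Salesforce query/queryMore calls (which require the session ID from the login step), and the row counts are invented for illustration:

```shell
#!/bin/sh
# Simulate paginated retrieval: a 4500-row result set fetched in
# batches capped at 2000 rows, the way queryMore pages through results.
TOTAL=4500
BATCH=2000
fetched=0

fetch_batch() {
    # Stub: return how many rows the next "page" would hold.
    remaining=$((TOTAL - fetched))
    if [ "$remaining" -gt "$BATCH" ]; then echo "$BATCH"; else echo "$remaining"; fi
}

while [ "$fetched" -lt "$TOTAL" ]; do
    n=$(fetch_batch)
    fetched=$((fetched + n))
    echo "retrieved $n rows (total $fetched)"
done
```

The same loop shape applies to any cursor-based API: the stop condition comes from the server's "done" signal, not from a client-side guess about total size.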
Environment: DataStage 8.5/11.3, Oracle, Netezza, DB2, APIs, CRM application, Ramp servers, Dashboard, RTC tool, Agile.
Confidential
DataStage Developer
Responsibilities:
- Developed DataStage parallel jobs, sequence jobs, and routines per requirements.
- Interacted with the client to understand business requirements, developed jobs accordingly, and modified existing jobs as requirements changed.
- Standardized job parameters, job flows and audit process to meet the Design Standards.
- Involved in Post Go-Live Support activities.
- Prepared Design Documents for project Go-Live and shared Knowledge Transfer to Production support team.
- Performed root cause analysis and ETL code/data fixes for data issues raised by business analysts.
- Participated in reviews of data modeling and business requirement analysis and assisted with defining requirements.
- Successfully implemented pipeline and partitioning parallelism techniques and ensured load balancing of data.
- Used DataStage Director to schedule and run jobs, test and debug their components, and monitor the resulting executable versions.
- Involved in Importing Metadata from Oracle, DB2 Databases.
- Used the Change Capture (CDC) and Change Apply stages to load Type 2 dimension tables.
- Used DataStage Designer to develop DataStage jobs and scheduled the jobs through DataStage Director.
- Tuned DataStage jobs for better query performance.
Environment: DataStage 9.1, MicroStrategy 9.1.3, Netezza, UNIX.