Senior DataStage Developer and Team Lead Resume
Chicago, IL
SUMMARY
- 8+ years of extensive and diversified IT experience in the analysis, design, development, testing, implementation and support life cycle for data warehouses using the ETL tool IBM Information Server (IBM WebSphere DataStage 9.1/8.5/8.1/8.0.1/7.x/7.5, DataStage Enterprise Edition (Parallel Extender)).
- 1.5 years of experience working with Data Services (SAP BODS).
- 2 years of experience working with the ClearCase version control tool.
- 12 months of experience with the Subversion version control tool.
- Extensive experience using the Oracle development tool set (including PL/SQL, SQL*Plus, SQL*Loader, PL/SQL Developer and Teradata SQL Developer).
- Experience in integration of various data sources like Teradata 12.0, Oracle 11g/10g/9i/8i, MS SQL Server, DB2, Sybase and Flat Files (like CFF) into the staging area.
- Expertise in writing UNIX shell scripts, and hands-on experience scheduling shell scripts and DataStage jobs under various conditions on required days using scheduling tools such as Autosys, Zena, Control-M and Tivoli.
- Designed a framework in Perl that automates the deployment process using cleartool and istool commands.
- Migrated jobs from DataStage 7.5.1 to DataStage 8.1 Enterprise Edition.
- Experienced in SDLC, Agile, Waterfall and Hybrid methodologies.
- Participated actively in server upgrades and DataStage version upgrades.
- Hands-on experience with a server upgrade from GFS to GPFS (migrating BAU processes from one server to another), which involved copying persistent datasets between servers and using FDNS technology in Autosys.
- Hands-on experience with Metadata Workbench and FastTrack. Designed parallel jobs using various stages such as Join, Merge, Lookup, Remove Duplicates, Filter, Data Set, Sequential File, Modify, Aggregator, CFF, Transformer, Sort, Generic, Web Services and Java.
- Excellent communication, interpersonal and analytical skills, with strong problem-solving ability and a strong ability to perform as part of a team; willingness and ability to quickly adapt to new environments and learn new technologies.
- Experience in HP Quality Center and RTC (Rational Team Concert) to manage manual and automated test cases and to track defects.
- A good team player who always encourages team work and motivates team members.
- Can lead and motivate geographically dispersed online teams.
- Good at visualizing, envisioning, directing and decision making.
TECHNICAL SKILLS
ETL: WebSphere DataStage 9.1/8.5/8.1/7.5 (Designer, Director, Manager, Administrator), Parallel Extender, SAP BODS
Languages: SQL, PL/SQL, Shell Script, Perl
Databases: DB2, Oracle 8i/9i/10g/11g, SQL Server 2005, Teradata 12.0, MS Access
Package: MS Office (Visio, Word, Excel, PowerPoint)
Operating Systems: UNIX, AIX, LINUX, MS-DOS, Windows.
Oracle Tools: PL/SQL Developer, SQL*Plus, SQL*Loader, TOAD, SQL Workbench
Version Control Tools: ClearCase, Subversion, Serena
PROFESSIONAL EXPERIENCE
Confidential, Chicago IL
Senior DataStage Developer and Team Lead
Responsibilities:
- Led the team while also acting as a Senior DataStage Developer, participating both directly and indirectly in designing DataStage jobs and in design decisions.
- Involved in design discussions; played a key role in gathering all requirements and closing the gaps before designing DataStage jobs.
- Redesigned parts of existing processes, designed new processes and installed new DDLs as per the requirements.
- Performed data validation on the older process to check that the logic met all the requirements.
- Implemented SCDs using the CDC stage. Worked closely with the mapper and obtained all the information required for designing the new process.
- The project involved extracting data from the ODS (DB2) and other web sources (SharePoint, websites), transforming it, and loading it into the ADW (Teradata).
- Involved in writing SQL scripts for validation and writing user-defined SQL for DataStage jobs.
- Developed BTEQ scripts for better performance in Teradata.
- Also involved in creating new SQL scripts and debugging them during unit testing.
- Used the Web Services Transformer and Web Services Client stages to make web service calls; all web service calls used SOAP 1.1 binding.
- Imported WSDL file definitions using the "Import WebService File Definitions" utility in DataStage Designer.
- Experienced in using the SoapUI tool to make web service calls.
- Captured all the rejected data from web service calls and shared it with customers, who used it to debug data issues.
- Tracked all defects in Rational Team Concert.
- Used ZENA scheduling tool to schedule the jobs as per the requirement.
- Used Serena tool to version control the code.
- Participated in release activities and dry runs.
Environment: DataStage 8.5, Serena, AIX 6, Windows 7, Zena, DB2, Teradata, RTC Defect Tracker.
Confidential, Charlotte, NC
Senior DataStage Developer
Responsibilities:
- Worked with SMEs to gather all requirements and collect all the information required for development.
- Analyzed the business requirements and prepared high-level and low-level system integration designs. Prepared test scripts, test scenarios, and functional and technical specification documents based on the requirements.
- Worked with tools such as FastTrack for mapping source fields to target fields, Business Glossary for recording business terms, technical terms and term linkage, and Metadata Workbench to check the lineage of data flow from source to target.
- Received XML files from the upstream system (PIM), parsed them, generated flat files and XML outputs, and sent them to the downstream BOSS and XAD systems.
- Used the WebSphere MQ stage to read and write messages to queues.
- Worked with both server and client connections in the MQ stage.
- The source system dropped messages into MQ, so we used the MQ stage in DataStage to retrieve and process them.
- Used generic shell scripts (wrapper scripts), common to all DataStage projects, for running DataStage jobs and for archive cleanup of the data mount points on UNIX.
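A minimal sketch of such a wrapper script is below; the paths, project/job names and retention period are illustrative assumptions, and the dsjob call follows the standard DataStage command-line interface.

```shell
#!/bin/sh
# Sketch of a generic DataStage wrapper script (names and paths are illustrative).

ARCHIVE_DIR="${ARCHIVE_DIR:-/data/archive}"    # mount point to clean up (assumption)
RETENTION_DAYS="${RETENTION_DAYS:-7}"          # keep files newer than this

# Run a DataStage job and wait for its completion status (standard dsjob CLI).
run_job() {
    project="$1"
    job="$2"
    "$DSHOME/bin/dsjob" -run -jobstatus "$project" "$job"
}

# Archive cleanup: remove files older than the retention period.
cleanup_archives() {
    if [ -d "$ARCHIVE_DIR" ]; then
        find "$ARCHIVE_DIR" -type f -mtime +"$RETENTION_DAYS" -exec rm -f {} +
    fi
}

cleanup_archives
```

Keeping the job run and the cleanup parameterized in one script is what lets a single wrapper serve every DataStage project.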
- Used the ClearCase version control mechanism (branching and labeling) and designed a process to maintain version control of the code, creating an environment for parallel development.
- Designed a framework in Perl, with the help of a Perl developer, that automates the deployment process using cleartool and istool commands.
- Worked on designing job batches and job sequences for scheduling parallel jobs using UNIX scripts, Autosys jobs (JILs) and DataStage Director.
- Used the Java stage while converting mainframe code to ETL; since the existing Java objects could not be changed, the Java stage was used to call the JAR files directly.
- Used the Java Transformer stage to load BLOB data into the database.
- Used the Java stage to run APIs at times.
- Used Autosys to design JILs and schedule all the DataStage jobs and scripts based on conditions such as success of the previous job and file watchers.
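A JIL definition for this kind of dependency chain looks roughly like the fragment below; the job, machine and file names are hypothetical, while the `condition` and file-watcher attributes are standard Autosys JIL.

```
/* File watcher: wait for the source extract to land (hypothetical names) */
insert_job: FW_SRC_EXTRACT
job_type: FW
machine: etl_host
watch_file: /data/incoming/src_extract.dat
watch_interval: 60

/* Command job: run the DataStage job once the upstream job succeeds
   and the file has arrived */
insert_job: RUN_DS_LOAD
job_type: CMD
machine: etl_host
command: /apps/scripts/run_ds_job.sh PROJ seq_load_target
condition: s(UPSTREAM_JOB) & s(FW_SRC_EXTRACT)
```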
- Worked in the deployment support team, the ABD (Automated Build and Deployment) support team, and the operations team, where I supported production.
- Worked in the platform support team, where we built new servers and migrated all BAU processes without any failure or delay.
- Hands-on experience with a server upgrade from GFS to GPFS (migrating BAU processes from one server to another), which involved copying persistent datasets between servers and using FDNS technology in Autosys.
- Copied datasets from one server to another during the GPFS upgrade.
- Worked on the DataStage upgrade from 8.5 to 9.1.
- Worked on Subversion as part of migrating the ABD framework from ClearCase to Subversion.
- Working closely with the offshore team as onsite coordinator.
Environment: DataStage 9.1/8.5, Perl v5.8.8, ClearCase, Subversion, Red Hat Linux 4.1, shell scripts, Windows XP/7, Autosys, Metadata Workbench, Oracle 11g, DB2.
Confidential, Alpharetta, GA
ETL DataStage Developer
Responsibilities:
- Developed functional and technical specification documents for the requirements. Worked with business users of the FICO, SD, PS and MM modules to complete requirements gathering.
- Worked closely with business analysts in requirements gathering, reviewing business rules and identifying data sources (SAP, PeopleSoft and Oracle).
- Used DataStage Designer to import and create metadata definitions from the repository and to view and edit repository contents.
- Cleansed the extracts using the QualityStage stages in DataStage.
- Tuned sources, targets, jobs, stages and sequences for better performance. Developed SQL procedures for synchronizing data.
- Used the Copy stage in most scenarios, as it moves data at the operating-system level. Used appropriate partitioning techniques to improve data-processing performance.
- Queried the data from the Oracle database and created indexes on the tables for faster retrieval.
- Performed DataStage admin work: killing PIDs (process IDs), removing locks, installing patches on the DataStage server, and assigning user credentials for the different environments.
- Involved in analyzing the quality of the jobs developed by team members, providing suggestions to improve performance, and performing performance tuning.
- Involved in discussions with the business team to scrutinize the requirements for extracting data from the source systems.
- Worked on the SAP SD, FICO, PS and MM modules and helped the SAP functional team analyze data.
- Hands-on experience designing jobs in Data Services.
Environment: DataStage 8.1, SAP BODS, Oracle 11g, Teradata, SQL, UNIX AIX 5.3, shell scripts, Windows XP, SAP ECC 6.0.
Confidential, New York, NY
ETL DataStage Developer
Responsibilities:
- Worked closely with business analysts on requirements gathering, reviewing business rules, identifying data sources, and analyzing the systems.
- Work with the project and business teams to understand the business processes involved in the solution.
- Involved in designing and development of data warehouse environment.
- Extensively interacted with business analysts on the design of dimension and fact tables for the data warehouse using a star schema.
- Extensively used DataStage Designer, Administrator and Director for creating and implementing jobs.
- Used DataStage Designer to extract, cleanse, transform, integrate and load data into the data mart.
- Extracted data from Oracle and flat-file sources; cleansed, transformed and loaded data into the target database using DataStage Designer.
- Used DataStage Designer to import and create metadata definitions from the repository and to view and edit repository contents.
- Used DataStage Director and its run-time engine to schedule, run, test and debug job components, and to monitor the resulting executable versions (on an ad hoc or scheduled basis).
- Used UNIX shell scripts to automate the data load processes to the target data warehouse and to concatenate files before loading data into the target tables.
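The concatenation step can be sketched as a small shell fragment like the one below; the directories and the `.dat` pattern are illustrative assumptions.

```shell
#!/bin/sh
# Concatenate the daily extract files into a single load file before the
# target-table load. Directory names and the .dat pattern are illustrative.

SRC_DIR="${SRC_DIR:-/data/extracts}"
LOAD_FILE="${LOAD_FILE:-/data/load/combined.dat}"

concat_extracts() {
    : > "$LOAD_FILE"                  # start with an empty load file
    for f in "$SRC_DIR"/*.dat; do     # the glob expands in lexical order
        if [ -f "$f" ]; then
            cat "$f" >> "$LOAD_FILE"
        fi
    done
}

if [ -d "$SRC_DIR" ]; then
    concat_extracts
fi
```

Appending the pieces in lexical order keeps the load deterministic across runs.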
- Fine tuned DataStage jobs and routines for optimal performance.
- Extensively tested DataStage jobs for data integrity using debugger.
- Developed DataStage jobs extensively using Aggregator, Sort, Merge, and Data Set in Parallel Extender to achieve better job performance.
- Used lookup stage with reference to Oracle table for insert/update strategy and for updating slowly changing dimensions.
- Used DataStage sequencer jobs extensively to handle interdependencies and to run DataStage server/parallel jobs in order.
- Used Parallel Extender development/debugging stages such as Row Generator, Column Generator, Head, Tail and Peek.
- Implemented some of the transformation logic from view reports into the ETL.
Environment: DataStage 8.1, Oracle 11g, Teradata, SQL, UNIX AIX 5.0, shell scripts, Windows XP.
Confidential, Foothill Ranch, CA
ETL Designer/ DataStage Developer
Responsibilities:
- Worked at Confidential on the Presto EDW Project designed to cater to the Decision Support Systems of the various business segments.
- Primarily worked on converting the existing jobs from DataStage 7.5.1 to DataStage 8.0 and on incorporating data from new business acquisitions into the DWH.
- Executed proofs of concept to understand the impact on older jobs in DataStage 8.0.
- Migrated existing jobs to DataStage 8.0 and scheduled multiple runs of sequences to verify the ETL processes.
- Understood functional documents for new business source systems and applications and their integration into the existing DWH.
- Wrote the design documents and specifications in accordance with Confidential's design frameworks and best practices. Defined the mapping and test case documents.
- Identified source system connectivity and the related tables and fields, and ensured data suitability for mapping.
- Extensively analyzed the data sources to identify data anomalies, patterns and value ranges, and wrote SQL scripts to accomplish this. Loaded data using both DB2 API and EE stages.
- Used DataStage Enterprise Edition/Parallel Extender stages namely Datasets, Sort, Lookup, Peek, Standardization, Row Generator stages, Remove Duplicates, Filter, External Filter, Aggregator, Funnel, Modify, Column Export in accomplishing the ETL Coding.
- Developed DataStage job sequences using the User Variables, Job Activity, Wait For File, Execute Command, Loop and Terminator activities.
- Wrote extensive UNIX scripts for running the DataStage jobs.
- Developed Error Logging and Auditing strategies for the ETL jobs.
- Coordinated code deployments across the test and production environments with the deployment teams. Undertook change and enhancement requests.
- Tuned DataStage jobs to enhance their performance.
- Used the DataStage Director and its run-time engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions.
- Wrote Release notes, Deployment documents and scheduled the jobs via the Tivoli Scheduler.
- Wrote DDL scripts for creating, altering tables.
- Extensive usage of AQT for analyzing data, writing SQL scripts/functions, performing DDL operations.
Environment: DataStage 7.5.1/8.0, DB2 Version 8.4, AQT, AIX, SQL/PL/SQL, Unix Shell Scripting, Ultra Edit, Tivoli.
Confidential, Bloomington, IL
DataStage ETL Developer
Responsibilities:
- Involved in understanding the Requirements of the end Users/Business Analysts and Developed Strategies for ETL processes.
- Analyzed the various subject areas in the ODS database along with the business analyst to determine the exact business functionality required for delivery to the downstream systems.
- Design, Develop and Unit Test Complex ETL jobs. Performance tuning of ETL jobs.
- Responsible for the detailed design documentation. Responsible for the mini-specification documents that describe the functionality of each subject area.
- Gathered data from operational and external environments and other sources into the business intelligence environment using the data mart.
- Analyzed the existing informational sources and methods to identify problem areas and make recommendations for improvement. This required a detailed understanding of the data sources and researching possible solutions. Implemented migration process from different systems.
- Used DataStage as an ETL tool to extract data from source systems, aggregate the data and load it into the DB2 database; implemented the bulk loading method for loading data.
- Used DataStage Manager for importing metadata from the repository, creating new job categories and creating new data elements.
- Restructured the DataStage jobs to enhance the performance and clarity of the jobs, and sent the exact data to the downstream systems.
- Developed UNIX shell scripts to automate the Data Load processes to the target.
- Involved in defining technical and functional specifications for ETL process
- Involved in performance tuning of the ETL process and performed the data warehouse testing.
Environment: DataStage 7.5.1a, DB2 UDB, Oracle 10g, SQL Server 2005, UNIX, Windows 2000.
Confidential, Lexington, KY
DataStage Developer
Responsibilities:
- Extensively used DataStage Manager to develop jobs for extracting, transforming, integrating and loading data into data warehouse tables.
- Extensively used the Hashed File, Aggregator, Sequential File and Oracle OCI stages.
- Involved in gathering requirements from the business users and preparing the appropriate ETL design specifications and test plans.
- Provided production and QA support to the project.
- Participated in reviews of business requirement analysis and assisted with defining requirements.
- Used the job sequencer to set up the job execution sequence.
- Wrote user-defined DataStage routines and transform functions to carry out the complex transformations in the Transformer stage.
- Performed unit and integration testing of the ETL process.
- Involved in migrating the ETL process from development to QA and from QA to production using the DataStage Manager export/import utility.
- Wrote SQL scripts to load the manually maintained tables in the staging area.
- Involved in writing UNIX scripts to automate the ETL process.
- Involved in performance tuning of the ETL process and in updating the ETL best-practices document.
- Wrote user-defined SQL queries to extract the appropriate data from Oracle sources.
Environment: DataStage 7.5.1, Oracle 9i, Oracle 10g, TOAD, Windows XP, UNIX Sun Solaris and Microsoft Visio.
Confidential
DataStage Developer
Responsibilities:
- Responsible for the design, development, testing and deployment of custom data warehouse systems and applications.
- Developed ETL jobs based on the mapping document.
- Participated in system analysis and data modelling, which included creating tables, views and indexes.
- Involved in achieving maximum performance of existing jobs.
- Tuned the SQL code for performance enhancement.
- Introduced to the design and development of DataStage Server jobs.
- Used the Transforms and built-in functions available in DataStage.
Environment: IBM DataStage 7.5.1, Oracle 9i, Windows NT, UNIX, PL/SQL, SQL Server 2005.