Sr Data Stage Developer Resume
Kansas
SUMMARY
- Around 8 years of experience as an IT professional with expertise in Requirement Analysis, Development, and Implementation of Data Warehouse and Data Integration projects.
- Experience in design, development, testing, and implementation of Data Warehouse and Data Integration projects using DataStage, Oracle, Sybase, SQL Server, DB2, Teradata, and UNIX.
- Worked extensively on DataStage 11.7, DataStage 11.5, and DataStage 8.5, and experienced in creating High-Level and Low-Level Designs.
- Experienced in Designing, Developing, Documenting, Testing ETL Jobs and mappings in both Sequence jobs and Parallel jobs using Data Stage to populate tables in Data Warehouse and Data marts.
- Expertise in developing strategies for Extraction, Transformation, and Loading (ETL) mechanisms.
- Proven track record in troubleshooting Data Stage jobs and addressing production issues like performance tuning and enhancement.
- Extensively used Teradata utilities like FastExport and MLoad.
- Extracted data from various sources like Oracle and DB2 and loaded into Teradata.
- Implemented slowly changing dimensions (SCD) Type 1, Type 2, and Type 3.
- Implemented incremental extraction and incremental load.
- Worked Extensively on Parallel and Sequence Jobs.
- Expertise in creating Shared Containers.
- Designed and Implemented DataStage Framework which was used across many Projects.
- Experienced in the integration of various data sources (Oracle, DB2, Teradata, SQL Server, and XML) into the data staging area.
- Developed Control M jobs to schedule DataStage jobs.
- Experienced in Scheduling jobs through Director.
- Experienced in scheduling DataStage jobs (UNIX scripts triggering DataStage jobs) through crontab (see the crontab sketch after this list).
- Established “Best Practices” and planned for continuous improvement of processes.
- Experienced in creating projects, users and assigning roles to them.
- Converted Complex job designs to different job segments and executed through job sequencer for better performance and easy maintenance.
- Worked Extensively on Server Routines.
- Provided training to other team members to implement best practices in Data Stage.
- Experienced in creating UNIX shell scripts.
- Expertise in Using HP Quality Center to upload Test cases and maintain Test Results.
- Proficient in writing, implementation, and testing of triggers, procedures, and functions in PL/SQL and Oracle.
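For context on the crontab-based scheduling mentioned above, a minimal sketch; the script path, schedule, and log location are hypothetical and not taken from any specific project:

    # Illustrative crontab entry: invoke a wrapper script that triggers a DataStage job
    # every day at 02:00 (script and log paths are placeholder names)
    0 2 * * * /opt/etl/scripts/run_datastage_job.sh >> /opt/etl/logs/run_datastage_job.log 2>&1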
TECHNICAL SKILLS
ETL Tools: DataStage 11.7, 11.5, 11.3
Relational Databases: Oracle 8i/9i/10g/11g, DB2 9, Teradata, Sybase
Development/Productivity Tools: Azure DevOps, ServiceNow, ASG-Zena, Control-M, HP Quality Center, UNIX shell scripts, Robot, Data Studio, DBeaver, IBM Box
Languages: SQL*Plus 8.x/9.x, PL/SQL 8.x/9.x
Editing Tools: Adobe Photoshop CS6, Notepad++
Operating Systems: Linux/UNIX, Windows 95/98/NT/2000
PROFESSIONAL EXPERIENCE
Confidential, Kansas
Sr Data Stage Developer
Responsibilities:
- As an ETL developer, delivered many enhancements using the DataStage 11.7 tool. The source is a stored procedure, transformations are applied, and the data is loaded back through a stored procedure, across a total of 20 PX jobs: source to raw, raw to stage, stage to master stage, and master stage to fact. Used intermediate stages such as Join, Lookup, Sequential File, Transformer, ODBC, Filter, Funnel, and Routines; handled many null-to-not-null conversions and used base-36 logic to generate unique keys for dimension and fact tables.
- Used the Teradata Load stage and Teradata commands to load the database with data from input links, including support for loading rows of data from a stream input link into the target table.
- Most of the logic is handled in the source query; created many complex queries with 20-30 joins to retrieve the expected output.
- Used UNIX commands to read the record count of a file and move the file from one location to another; after a PX job completes, UNIX commands move the file to the processed folder (see the shell sketch after this job entry).
- Used Git Bash to migrate code to higher environments; once the code is promoted, validated the job run and checked the data in the target tables.
- For unit testing, used Excel and the WinMerge tool to compare SQL query results and validated the output against business expectations.
- Followed Agile methodology with a Kanban board and task tab to capture all work to be completed; before stand-up, moved any blockers to the impeded state, then discussed them and planned according to priority.
- Created Low-Level Design documents by understanding the requirements and thereby developed jobs.
- Worked on a Data Acquisition project, extracting data from different sources, processing the data, generating files, and transferring these files to target systems.
- Worked on several change requests created because of production incidents that required changes to the code in the production environment.
- Responsible for using different types of Stages such as ODBC Connector, Oracle Connector, DB2 Connector, Teradata Connector, Transformer, Join, Sequential File to develop different jobs.
- Developing DataStage Parallel and Sequence Jobs.
- Developed common jobs, Shared Containers, and Server Routines that are used across the project in most of the interfaces.
- Created UNIX shell scripts that take care of end-to-end automation: triggering DataStage jobs, transferring the output files, and performing basic validations on the files.
- Prepared unit test cases and executed them.
- Fixed the defects raised by the testing team and maintained their status in HP Quality Center.
- Extensively used SQL tuning techniques to improve the performance of DataStage jobs.
- Tuned DataStage transformations and jobs to enhance their performance.
- Deployed the developed code to Git and production environments and validated the code.
- Worked closely with software developers, project owners, and BAs to develop and execute thorough test suites in all phases of the software development cycle.
- Developed the test strategy and test plan/design, executed test cases, and handled defect management for the ETL & BI systems.
- Developed and executed detailed ETL-related functional, performance, integration, and regression test cases and documentation.
- Analyzed and understood the ETL workflows developed.
- Quality Management - Knowledge of quality management methods, tools and techniques
- Created UNIX shell scripts to access and move data from the production to the development environment.
- Communicated with and managed the expectations of the senior management team and affected stakeholders during the planning and rollout of project releases.
- Exposure to DB tools: Toad, PL/SQL Developer, SQL*Plus.
- Strong in SQL preparation in Oracle/SQL Server/DB2/Teradata RDBMS
- Strong knowledge of ETL and BI processes.
- Extensive working experience applying relational database concepts, entity-relationship diagrams, and normalization concepts.
- Excellent knowledge of MS Office (Word, Excel, PowerPoint), MS Access, and MS Visio.
- A very good team player with excellent Communication (Verbal and Technical), Presentation and reporting skills.
Environment: IBM InfoSphere DataStage 11.7, DataStage 11.5, WinSCP, SQL, DB2, Netezza, UNIX shell scripts, AQT, Microsoft SQL Server Management Studio, ASG-Zena, Azure DevOps
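A minimal shell sketch of the record-count and file-move handling referenced above; the directory and file names are illustrative placeholders:

    #!/bin/sh
    # Count data records in the inbound file and move it to the processed
    # folder once the PX job has completed (paths below are hypothetical).
    IN_DIR=/data/inbound
    PROC_DIR=/data/processed
    FILE=$IN_DIR/customer_feed.dat

    # wc -l counts lines; subtract 1 to exclude the header row
    REC_COUNT=$(( $(wc -l < "$FILE") - 1 ))
    echo "Record count for $FILE: $REC_COUNT"

    # Archive the file with a timestamp suffix
    mv "$FILE" "$PROC_DIR/customer_feed.dat.$(date +%Y%m%d%H%M%S)"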
Confidential, Missouri
Sr Data Stage Developer
Responsibilities:
- Actively participated in decision making and QA meetings, and regularly interacted with the Business Analysts and development team to gain a better understanding of the Business Process, Requirements & Design, as well as PI Planning, Refinement, and Iteration Planning sessions.
- Used DataStage as an ETL tool to extract data from source systems and load the data into the IBM DB2 database, flat files, and Excel.
- Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data, and loaded it into the Data Warehouse.
- Our team consists of Data Analysts, Business Analysts, Developers, Testers, and a Scrum Master.
- We follow a set of data governance and IT governance guidelines to accomplish business needs.
- The business interacts with our team, and the BAs and DAs gather the business requirements for each sprint.
- As a developer, estimated the requirements, including how and by when I could complete them.
- The majority of my work focused on the Provider Exchange (PEX) project: the source is a JSON file, the jobs check whether the member and provider exist, and a response is sent back to the association.
- Used different types of stages like Transformer, CDC, Remove Duplicate, Aggregator, ODBC, Join, Funnel, dataset and Merge for developing different jobs.
- Extensively used parallel stages like Row Generator, Column Generator, and Peek for debugging purposes.
- Responsible for using different types of Stages such as ODBC Connector, DB2 Connector, Transformer, Join, Sequential File to develop different jobs.
- For the scheduling process, we use ASG-Zena; created event-triggered and request-triggered jobs.
- Deployed code through Azure DevOps with the Git Bash tool (see the Git sketch after this job entry). Our process covers code review, unit testing, and deployment documentation; finally, an RFC is created with the dependent teams to go to production within the production window.
Environment: IBM InfoSphere DataStage 11.5, WinSCP, Microsoft SQL Server Management Studio, DBeaver, Azure DevOps, Git Bash, AQT for DB2 6.5, SQL Server, Sybase, UNIX, Git version control, ASG-Zena scheduler, Visual Studio, PuTTY, CDMA Mapping tool.
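A minimal sketch of the Git Bash promotion flow described above; the repository path, branch, and file names are assumptions for illustration only:

    # Commit the exported DataStage artifact and push it for review and deployment
    cd /c/repos/datastage-etl                     # hypothetical repository path
    git checkout -b feature/pex-response-fix      # hypothetical branch name
    git add jobs/pex_provider_response.isx        # hypothetical export file
    git commit -m "PEX: update provider response mapping"
    git push origin feature/pex-response-fix
    # A pull request and RFC are then raised in Azure DevOps before the production window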
Confidential, Boston, MA
Sr Data Stage Developer/QA Tester
Responsibilities:
- Attended meetings with the business to gather requirements and to develop the code.
- Analyzed whether a new requirement impacts the existing system and how it fits into the system.
- Prepared mapping documents for designing and developing the DataStage jobs.
- Worked on the General Ledger team for commissions and medical value payment process data coming from various vendors such as the BCBS Association and BCBSSC.
- Worked with the CAQH vendor on member data for the coordination of benefits program.
- Jobs are designed from the LRD and mapping document. Communicated in parallel with the development team (offshore) and client (onshore) teams to meet the requirements of the logical design.
- With the help of the DataStage Designer tool, extracted data from various source systems, implemented the logic, and loaded it into the target database and outbound files to send to vendors.
- Worked on the automation process for Medical Value Payments and member point-touch measures.
- Worked on server and parallel jobs and created parallel jobs; converted server jobs to parallel jobs in a more efficient way.
- Worked with stored procedures and converted stored procedures into DataStage jobs.
- Extensively used ODBC connector stage, DB2 UDB stage, Complex flat file, Sequential file, dataset, Aggregator, Change capture, Copy, Filter, Link collector, hash file, FTP, Join, Lookup, Sort, transformer stages.
- Used DataStage Director for job monitoring, viewing logs, and testing and debugging job components.
- Worked on performance tuning of the SQL queries.
- Used various existing routines for date conversions and calculating amounts and age
- Worked on deploying the code to production and thereby verified the post-run data.
- Worked with different teams, such as the code review team, deployment team, scheduling team, and production team, in different situations to get the work done.
- Prepared DDLs and DML and provided them to the DBA as part of table deployment changes.
- Created complex sequence jobs and control jobs and added them to the Zena scheduler.
- Ran the scheduled jobs in request-time trigger mode using the Zena scheduler in the Development, Test, and Stage environments.
- Worked on UNIX shell scripts based on the provided business requirements, running them in debug mode to identify warnings and errors and fixing them (see the debug-mode sketch after this job entry).
- Used an IBM Box folder for daily status updates in the project tracker and issue tracker.
- While testing jobs, found a few issues, so turned back to development and performed data analysis to update the job design to match the logic.
Environment: IBM InfoSphere DataStage 11.5, DBeaver, WinSCP, SQL Developer, DB2, Mainframe DB, Sybase.
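A minimal sketch of running a shell script in debug mode, as referenced above; the script name is a hypothetical placeholder:

    # Run the script with execution tracing so each command is echoed before it runs,
    # which makes warnings and failing steps easy to locate.
    sh -x load_member_extract.sh

    # Alternatively, enable tracing and strict error handling inside the script itself:
    #   set -x    # print each command as it executes
    #   set -e    # stop at the first failing command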
Confidential, Greensboro, NC
Sr Data Stage Developer
Responsibilities:
- Worked on a data separation project (a company within a company), where we needed to extract data from different sources, process the data, generate files, and transfer these files to the target systems of newly created schemas.
- Actively participated in decision making and QA meetings, and regularly interacted with the Business Analysts and development team to gain a better understanding of the Business Process, Requirements & Design.
- Used DataStage as an ETL tool to extract data from source systems and load the data into the IBM DB2 database.
- Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data, and loaded it into the Data Warehouse.
- Created DataStage 11.5 jobs using different stages like Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Surrogate Key, Column Generator, Difference, Row Generator, Sequencer, Email Communication activity, Command activity.
- Used DataStage Director and its run-time engine for job monitoring, testing, and debugging job components, and monitoring the resulting executable versions on an ad hoc or scheduled basis.
- Converted complex job designs to different job segments and executed through job sequencer for better performance and easy maintenance.
- Created master jobs sequencers.
- Also, worked on different enhancements in FACT tables.
- Responsible for using different types of Stages such as ODBC Connector, Oracle Connector, DB2 Connector, Teradata Connector, Transformer, Join, Sequential File to develop different jobs.
- Identified source systems, their connectivity, related tables, and fields and ensured data suitability for mapping.
- Created UNIX shell scripts that take care of end-to-end automation: triggering DataStage jobs, transferring the output files, and performing basic validations on the files (see the wrapper-script sketch after this job entry).
- Prepared unit test cases and executed them. Fixed the defects raised by the testing team and maintained their status in HP Quality Center.
Environment: IBM InfoSphere DataStage 11.5, UNIX shell scripts, Data Studio 4.1.3, WinSCP, SQL Developer, DB2, IDA (IBM InfoSphere Data Architect), Office, Visio, HP ALM.
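A minimal sketch of the end-to-end wrapper script referenced above (trigger the DataStage job, validate, transfer); the project, job, host, and path names are hypothetical:

    #!/bin/sh
    # Trigger a DataStage job via the dsjob CLI, check its status, then
    # validate and transfer the output file (all names are placeholders).
    PROJECT=DWH_PROJECT
    JOB=EXTRACT_PROVIDER_FILE
    OUT_FILE=/data/outbound/provider_extract.dat

    dsjob -run -jobstatus "$PROJECT" "$JOB"
    STATUS=$?
    # With -jobstatus, dsjob's exit code reflects the job status
    # (commonly 1 = finished OK, 2 = finished with warnings).
    if [ "$STATUS" -ne 1 ] && [ "$STATUS" -ne 2 ]; then
        echo "Job $JOB failed with status $STATUS" >&2
        exit 1
    fi

    # Basic validation: the output file must exist and be non-empty
    if [ ! -s "$OUT_FILE" ]; then
        echo "Output file $OUT_FILE is missing or empty" >&2
        exit 1
    fi

    # Transfer the file to the target system
    scp "$OUT_FILE" etluser@target-host:/incoming/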
Confidential, Los Angeles, CA
Sr DataStage Consultant
Responsibilities:
- Created Low-Level Design documents by understanding the requirements and thereby developed jobs.
- Worked on a Data Acquisition project, where we needed to extract data from different sources, process the data, generate files, and transfer these files to target systems.
- Worked on several change requests created because of production incidents that required changes to the code in the production environment.
- Responsible for using different types of Stages such as ODBC Connector, Oracle Connector, DB2 Connector, Teradata Connector, Transformer, Join, Sequential File to develop different jobs.
- Developing DataStage Parallel and Sequence Jobs.
- Developed common jobs, Shared Containers, and Server Routines that are used across the project in most of the interfaces.
- Created UNIX shell scripts that take care of end-to-end automation: triggering DataStage jobs, transferring the output files, and performing basic validations on the files.
- Prepared unit test cases and executed them.
- Fixed the defects raised by the testing team and maintained their status in HP Quality Center.
- Extensively used SQL tuning techniques to improve the performance of DataStage jobs.
- Tuned DataStage transformations and jobs to enhance their performance.
- Deployed the developed code to SIT and production environments and validated the code.
- Used Robot Scheduling Tool to schedule DataStage jobs.
- Provide Post Implementation Support.
- Worked in support of the development of company products, tools, platforms, and services.
- Worked in an offshore/onsite model, managing offshore resources.
- Provided status metrics on testing.
- Strong in ETL data validation for processes developed using Informatica, Ab Initio, DataStage, and SSIS.
- Provided demos/walkthroughs of testing results.
- Worked on existing, in-flight, and future projects.
- Designed and developed ETL test cases, scenarios, and scripts to ensure quality Data Warehouse/BI applications.
Environment: IBM InfoSphere DataStage 11.5, UNIX shell scripts, Robot, Oracle 10g, SQL Developer, DB2, Teradata, AQT (for accessing SQL Server and DB2), Office, Visio.
Confidential, Buffalo, NY
DataStage Consultant
Responsibilities:
- Worked on an Ecommerce-Dropship project which enables Michael’s Vendors to sell products on Michael’s website.
- Worked on the JDA Allocation integration project, which takes care of the seasonal allocation of items from warehouses to stores.
- Understood the requirements, prepared Low-Level Design documents, and thereby developed parallel and sequence jobs.
- Developed Shared Containers and Server Routines that are used across the project in most of the interfaces.
- Worked on a Data Migration project.
- Involved in each phase of the project, end to end.
- Wrote complex queries to facilitate the supply of data to other teams.
- Responsible for using different types of stages such as FTP, Hashed File, Sequential File, Sort, Aggregator, Transformer, and ODBC to develop different jobs.
- Designed jobs using different parallel job stages such as Join, Merge, Lookup, Remove Duplicates, Copy, Filter, Funnel, Dataset, Lookup File Set, Change Data Capture, Modify, and Aggregator.
- Used DataStage Director to schedule jobs in production for some of the projects.
- Used DataStage Enterprise Edition for parallel processing as part of the ETL conversion process, integrating multiple source systems.
- Prepared unit test cases and executed them.
- Provide support to Testing Team and fix defects raised in different phases of testing.
- Developing DataStage Parallel and Sequence Jobs.
- Developed UNIX shell scripts used to validate the files.
- Trained team members in learning DataStage.
- Understood the existing PL/SQL procedures to extract data loaded by PeopleSoft.
- Have extensive knowledge of creating infotypes and sending them to ADP.
- Created error files and log tables containing data with discrepancies to analyze and re-process the data.
- Tuned DataStage transformations and jobs to enhance their performance.
- Led a team of 4-6 members.
- Used Control-M for scheduling jobs and monitoring them.
- Provide Post Implementation Support.
Environment: IBM InfoSphere DataStage 8.1, UNIX shell scripts, Oracle 10g, Oracle 11g, SQL Developer client for Oracle 10g, Office, Visio.
Confidential
Software Developer
Responsibilities:
- Studied the business requirements and prepared the impact analysis document.
- Prepared the technical specification document and, upon review of the solution, developed it using DataStage jobs and sequencers.
- Used the Sequential File stage as the source for most of the source systems.
- Developed a file check process that checks the format, volume, and date in the file and decides whether the right file is being sent by the source and whether the right file is being loaded into the database (see the file-check sketch after this job entry).
- Used Aggregator, Lookup, Join, Merge, Dataset, Transformer, Sequencer, Sequential File, DB2 Bulk Load, Hashed File, and Surrogate Key Generator stages.
- Created DDL statements for new tables, changes to the table structure, index changes, and the creation of triggers and stored procedures.
- Prepared unit test cases and test plans.
- Executed the test cases and captured the results.
- Supported SIT testing and UAT testing.
- Worked on packaging the code with the help of the TortoiseSVN version control tool and worked with the respective teams to deploy the code.
- Supported the system post-production and worked in coordination with the production support teams to resolve any issues.
Environment: DataStage 8.1 (Designer, Director, Manager, Administrator) Enterprise Edition, SQL Server 2005, IBM DB2, AS/400, ERwin 4.0, MS Visio 2000.
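A minimal sketch of the file check process described above (format, date, and volume checks); the file naming convention and threshold are illustrative assumptions, not the original implementation:

    #!/bin/sh
    # Validate an inbound file before loading: name/format, embedded date, and record volume.
    FILE=$1                          # e.g. /data/inbound/CLAIMS_20240101.dat (hypothetical)
    MIN_RECORDS=100                  # hypothetical minimum expected volume
    TODAY=$(date +%Y%m%d)
    BASENAME=$(basename "$FILE")

    # Format check: name must match the expected pattern CLAIMS_YYYYMMDD.dat
    echo "$BASENAME" | grep -Eq '^CLAIMS_[0-9]{8}\.dat$' || { echo "Bad file name: $BASENAME" >&2; exit 1; }

    # Date check: the date embedded in the file name must be today's date
    FILE_DATE=$(echo "$BASENAME" | sed 's/^CLAIMS_\([0-9]\{8\}\)\.dat$/\1/')
    [ "$FILE_DATE" = "$TODAY" ] || { echo "Unexpected file date: $FILE_DATE" >&2; exit 1; }

    # Volume check: record count (excluding header) must meet the minimum
    REC_COUNT=$(( $(wc -l < "$FILE") - 1 ))
    [ "$REC_COUNT" -ge "$MIN_RECORDS" ] || { echo "Too few records: $REC_COUNT" >&2; exit 1; }

    echo "File $BASENAME passed format, date, and volume checks"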