Sr. DataStage Developer Resume
NJ
SUMMARY
- 11 years of experience as a Software Developer.
- 6+ years of experience in IBM DataStage 7.5.2/8.0/8.1.1/8.5/8.7/9.1/11.3.
- 3+ years of experience in IBM WebSphere Transformation Extender (WTX) 8.0.
- Expert in IBM Transformation Extender (ITX).
- Good knowledge of Oracle 9i/11g/12c database administration, RAC, and GoldenGate.
- Good knowledge of Apache Hadoop components such as MapReduce, HDFS, Hive, HBase, Pig, Sqoop, Oozie, and Flume.
- Hands-on experience across all stages of the Software Development Life Cycle (SDLC), including business requirement analysis, data mapping, build, unit testing, and systems integration testing.
- Implemented the Kimball Lifecycle approach in data warehouse projects to develop dimensional data marts rather than a large, complex centralized model.
- Implemented the bottom-up approach in the healthcare domain.
- Experience in data warehousing applications, directly responsible for the extraction, staging, transformation, preloading, and loading of data from multiple sources into the data warehouse.
- Expert in Oracle Data Integrator (ODI).
- Expert in designing Parallel jobs using various stages like Complex Flat File, Join, Merge, Lookup, Remove duplicates, Filter, Change Capture, Change Apply, Sequential File, Modify, Aggregator, XML, Surrogate Key stages.
- Extensive involvement in different phases of the project like Requirement Analysis, Systems Analysis, Data extraction, Cleansing, Transformation and Loading into Data marts.
- Implemented Slowly Changing Dimensions Type 1 and Type 2.
- Implementation of Healthcare applications using FACETS.
- Excellent experience developing and interpreting HIPAA EDI transaction sets such as 837P, 837I, 837D, 270/271, and 276/277, and the ANSI X12 EDI standards.
- Experience with EDI formats such as X12, EDIFACT, and XML.
- Strong understanding of the principles of Data Warehousing using fact tables, dimension tables and star/snowflake schema modeling.
- Worked extensively with Dimensional modeling, Data migration, Data cleansing, ETL processes for data warehouses.
- Experience with UNIX shell scripting for Data validation and scheduling the DataStage jobs.
- Experience using Rally to manage project development in Agile environments.
- Experience in performing detailed design, development, unit testing and deployment of DWH projects.
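The shell-based data validation mentioned above can be sketched as follows; the file layout and function names here are hypothetical illustrations, not taken from any specific project:

```shell
#!/bin/sh
# Minimal sketch of a post-load validation step (hypothetical layout:
# comma-separated extract file with the business key in column 1).

# validate_load <extract_file> <loaded_count>
# succeeds (exit 0) when the extract row count matches the loaded count
validate_load() {
    src_count=$(wc -l < "$1")
    [ "$src_count" -eq "$2" ]
}

# count_dup_keys <extract_file>
# prints the number of key values that occur more than once
count_dup_keys() {
    cut -d',' -f1 "$1" | sort | uniq -d | wc -l
}
```

Checks like these typically run after the load completes and before downstream jobs are released by the scheduler.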
TECHNICAL SKILLS
ETL: IBM Information Server DataStage 7.5/8.1.1/8.5/8.7/9.1/11.3
EAI Tools: DataStage TX 8.0
Databases: Oracle 9i/10g/11g, Netezza 7.0, Teradata, DB2, SQL Server 2005
Languages: C, SQL, UNIX Shell Scripting
Operating Systems: Windows NT/2000/XP, Solaris 5.8/5.10, Red Hat Linux 5.1/6.0
Standards and Protocols: XML, ANSI X12, EDI
Scheduling Tools: Control-M, Autosys
PROFESSIONAL EXPERIENCE
Confidential, NJ
Sr. Datastage Developer
Responsibilities:
- Building DataStage jobs per business requirements.
- Handling day-to-day production support activities.
- Worked on various stages like Transformer, Join, Lookup, Sort, Filter, Change Capture, XML and Hash file.
- Worked on Renewal Rate Change Enhancements for Composite Rating and Multiple Location match.
- Extracted data from source systems, transformed it, and loaded it into the target database.
- Building DataStage ETL interfaces to aggregate, cleanse, and migrate data across enterprise-wide MDM, ODS, and data warehousing systems using staged data processing techniques, patterns, and best practices.
- Loading XML Insurance Claims and Policy data into Data warehouse.
- Developed jobs for the CAU, Glatfelter, Poulton and Binding Authority interfaces as per the business requirement.
- Working within the team to make appropriate, effective decisions related to project responsibilities and to initiate and follow up on assigned tasks.
- Participating in unit testing, SIT, and UAT; working with users on data validation.
Environment: IBM InfoSphere DataStage 8.7 (Designer, Director, Administrator, Parallel Extender), Oracle 11g, XML, CSV, Visual Studio 2015, Red Hat Linux, Control-M.
Confidential, MI
Sr. Datastage Consultant
Responsibilities:
- Building Data Stage Jobs as per the requirement.
- Handling day-to-day production support activities.
- Worked on various stages like Transformer, Join, Lookup, Sort, Filter, Change Capture and Hash file.
- Using the Big Data File stage (BDFS) to pull data from the big data environment.
- Participating in unit testing, SIT, and UAT; working with users on data validation.
- Optimizing and tuning DataStage jobs for better performance and efficiency.
- Extracted data from source systems, transformed it, and loaded it into the target database.
- Worked on ODI studio for mapping.
- Developed interfaces for loading the lookup and transactional data.
- Created ODI design documents from the existing Datastage mappings. Used these design documents in development of ODI interfaces/packages.
- Created ODI packages and scenarios using interfaces, variables, and procedures.
- Building DataStage ETL interfaces to aggregate, cleanse, and migrate data across enterprise-wide MDM, ODS, and data warehousing systems using staged data processing techniques, patterns, and best practices.
- Working with various application development teams to provide data integration architecture services and data modeling services.
- Conducting code review sessions with developers.
- Working within the team to make appropriate, effective decisions related to project responsibilities and to initiate and follow up on assigned tasks without supervision.
Environment: IBM InfoSphere DataStage 11.3 (Designer, Director, Administrator, Parallel Extender), Teradata, DB2, ODI, UNIX, Autosys.
Confidential, Oaks, PA
Sr. Datastage Consultant
Responsibilities:
- Worked closely with Business analysts and Business users to understand the requirements and to build the technical specifications.
- Extracted data from source systems, transformed it, and loaded it into the Oracle target database.
- Created jobs using stages such as Join, Lookup, Transformer, Remove Duplicates, Funnel, Aggregator, Filter, Sort, and Modify.
- Performed code reviews of DataStage jobs developed by team members.
- Involved in production support activities during weekends.
- Reran and killed jobs in production and monitored disk space in the production environment.
- Removed unwanted datasets using session management activities.
- Added new DataStage users, assigned roles, and granted privileges and roles to existing DataStage users.
- Monitored the scratch disk memory during execution of Datastage jobs, clearing the space in folders on ETL server.
- Performed session management activities in Datastage.
- Involved in unit testing, SIT, and UAT; worked with users on data validation.
- Developed Type Trees for defining the source and target systems
- Generated type trees using queries and extracted data from database.
- Developed maps using map designer to transform the data.
- Implemented XSLT transformation logic in WTX.
- Debugged Mercator maps using TRACE and AUDIT files.
- Worked on custom built type trees for flat files, modified and generated type trees.
- Worked with the BCP utility to bulk-load data into DB2.
- Deployed the maps using Integration Flow Designer.
Environment: IBM InfoSphere DataStage 9.1/8.5 (Designer, Director, Administrator, Parallel Extender), IBM WTX 8.3, Oracle 11g, SQL, PL/SQL, Netezza, UNIX.
Confidential, CA
Sr. DataStage Consultant
Responsibilities:
- Used Enterprise Edition/Parallel stages like Datasets, Change Data Capture, Row Generator, and many other stages in accomplishing the ETL coding.
- Designed the source-to-target mappings from sources to operational staging areas and then to the target data warehouse.
- Developed parallel jobs using Parallel Extender to achieve maximum efficiency.
- Deployed different partitioning methods like Hash by column, Round Robin, Entire, Modulus, and Range for bulk data loading and for performance boost.
- Created some routines (Before-After, Transform function) used across the project.
- Navigated quickly between multiple projects and products using Rally.
- Implemented multi-node declaration using configuration files (APT Config file) for performance enhancement.
- Developed UNIX shell script to run jobs in multiple instances by using parameter file.
- Developed DataStage Parallel Jobs, Job Sequencing, and Creating Parameter Sets, creating routines, Data Cleansing and Writing transformation expressions.
- Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating and loading data into data warehouse.
- Importing metadata from repository, importing/exporting jobs and creating new data elements.
- Used XML stage to read XML files and write to target.
- Used Kimball bottom-up approach to implement the Datastage jobs.
- Created re-usable components using shared containers for local usage/shared usage.
- Used SQL*Loader and the TOAD import utility to populate tables in the data warehouse.
- Imported and exported repositories across projects; used DataStage Version Control to promote jobs from Development to Test and then to Production.
- Involved in Unit, Integration, System and User Acceptance Testing (UAT).
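A multi-instance launch script of the kind described above can be sketched as follows. The parameter-file layout, project, and job names are hypothetical; `DSCMD` defaults to `echo dsjob` so this runs as a dry run rather than against a real engine tier:

```shell
#!/bin/sh
# Sketch: launch one instance of a multi-instance DataStage job per line
# of a parameter file (hypothetical layout: invocation_id,source_dir).
# Set DSCMD=dsjob on a real engine tier; the default only echoes.

DSCMD="${DSCMD:-echo dsjob}"

# run_instances <project> <job> <param_file>
run_instances() {
    while IFS=',' read -r inst src_dir; do
        # job.invocation_id is dsjob's multi-instance naming convention
        $DSCMD -run -param SourceDir="$src_dir" -jobstatus "$1" "$2.$inst"
    done < "$3"
}
```

With `-jobstatus`, `dsjob` waits for each instance and returns the job's finishing status, which lets the wrapper script fail fast on an aborted instance.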
Environment: IBM InfoSphere DataStage 8.5, Oracle 11g, DB2, Teradata, COBOL, CSV Files, Flat Files, TOAD, Rally, Autosys, Cognos, Shell, UNIX (AIX 6.1/Solaris 10) and Linux.
Confidential, Columbus, GA
DataStage Consultant
Responsibilities:
- Prepared Source to Target mapping documents.
- Developed parallel jobs in DataStage Designer to extract data from Oracle and complex flat file sources, cleanse it, transform it by applying business rules, stage it in data marts, and load it (initial/incremental) into the target Teradata data warehouse.
- Migrated all the jobs from 7.5.2 to 8.1.1.
- Used the Slowly Changing Dimension stage to implement SCD Types 1 and 2, instead of implementing them with the Change Data Capture stage as in 7.5.2.
- Used the Job Compare option to compare jobs developed in 8.1.1 against their 7.5.2 versions to ensure efficient job design.
- Tuned DataStage transformations and jobs to enhance their performance.
- For Parallel jobs, configured the multiple nodes and used parallel engine capacity efficiently as well as designed a master sequence to run multiple jobs in parallel.
- Unit tested the jobs in Test environment by running them in DataStage Director and verifying the job logs for warnings & errors.
- Exported projects from Development to Test, and from Test to Production, using DataStage Manager.
- Used the DataStage stages Oracle Enterprise, CFF, Copy, Filter, Lookup, Transformer, Sort, Funnel, Shared Containers, Join, Dataset, Aggregator, Sequential file, Remove Duplicates, Peek.
- Involved in Unit, Integration, System and User Acceptance Testing (UAT).
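The SCD Type 2 behavior implemented above can be illustrated on flat files. The three-column layout (key, attribute, current flag) and the function name are hypothetical simplifications of what the stage performs against a real dimension table:

```shell
#!/bin/sh
# Flat-file sketch of SCD Type 2: a changed attribute expires the current
# dimension row (flag N) and appends a new current row (flag Y).
# Hypothetical layout: dim file = key,attr,flag ; update file = key,attr.

# scd2_merge <dim_file> <upd_file>  -- prints the merged dimension
scd2_merge() {
    awk -F',' -v OFS=',' '
        NR == FNR { upd[$1] = $2; next }   # first pass: load updates
        $3 == "Y" && ($1 in upd) && upd[$1] != $2 {
            print $1, $2, "N"              # expire the old version
            print $1, upd[$1], "Y"         # insert the new version
            next
        }
        { print }                          # unchanged / historical rows
    ' "$2" "$1"
}
```

A Type 1 change, by contrast, would simply overwrite the attribute in place without preserving the expired row.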
Environment: IBM InfoSphere DataStage 8.1.1/7.5.2 EE (Parallel Extender), DB2, Oracle 9i, CFF files, CSV files, SQL, PL/SQL, and Red Hat Linux 5.1.
Confidential, Portland, OR
Sr. Analyst Programmer
Responsibilities:
- Developed Type Trees for defining the source and target systems.
- Developed maps using map designer to transform the data.
- Worked on 837 Daisy Premier Claim, 270/271, and 276/277 maps, implementing the HIPAA National Provider Identifier (NPI) regulation changes mandated on May 23, 2007.
- Developed maps to take a COBOL copy book as input and load the subscriber detail data into Sybase database upon validation.
- Developed map, which uses XML file as input and generate EDI 271 and 277 as output format.
- Developed HIPAA EDI transaction sets such as 837P, 837I, 837D, 270/271, and 276/277 per the ANSI X12 EDI standards.
- Developed maps for HIPAA transaction sets 835 and 834.
- Developed a common message-logging map to send email notifications and write error log messages to database tables.
- Tuned mappings to enhance performance.
- Troubleshot maps using AUDIT and TRACE files.
- Worked with EDI standards, XML, FTP, and Java methods.
- Developed Technical Specifications as per Business Requirements.
- Deployed the maps using Integration Flow Designer.
- Tested the mappings to meet the desired results.
- Worked with the BCP utility to bulk-load data into Sybase.
- Involved in Unit testing and prepared test cases for all jobs.
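The 837 and 270/271 work above operates on X12 interchange files. A minimal shell sketch of inspecting such a file looks like this; the delimiters are assumed and the function names are hypothetical, since a production map reads the actual delimiters from the ISA envelope:

```shell
#!/bin/sh
# Sketch of basic X12 claim-file inspection (assumes the common '~'
# segment terminator and '*' element separator).

# count_claims <x12_file>  -- number of CLM (claim) segments
count_claims() {
    tr '~' '\n' < "$1" | grep -c '^CLM\*'
}

# transaction_type <x12_file>  -- ST01 code of the first transaction set
transaction_type() {
    tr '~' '\n' < "$1" | awk -F'*' '$1 == "ST" { print $2; exit }'
}
```

Quick checks like these are useful when reconciling a translated batch against the claim counts reported by the map's audit log.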
Environment: IBM Websphere TX 8.0 (Mercator), Sybase, UNIX, Crystal Reports, FACTS, and FACETS Systems.