
Datastage Developer Resume


Texas, TX

Summary:

  • Over 7 years of IT experience in Data Analysis, Design, Development, Implementation and Testing of Database/Data warehousing/Client/Server/Legacy applications using Data Modeling, Data Extraction, Data Transformation, Data Loading and Data Analysis.
  • Over 6 years of hands-on experience with ETL Tools DataStage 5.x/6.x/7.x/8 (Parallel Extender).
  • Broad experience with different parallelism and partitioning techniques.
  • Extensive experience as Production Support Personnel on multiple data warehouse projects; worked extensively with offshore teams.
  • Experience designing and populating dimensional models (star and snowflake schemas) for data warehouses, data marts and reporting data sources.
  • Extensive Experience with dimensional and relational DB design and development.
  • Expertise in handling complex flat files and loading them into Oracle using SQL*Loader; used COBOL copybooks to import table definitions from mainframes.
  • Strong knowledge of Extraction, Transformation and Loading (ETL) processes using UNIX shell scripting and SQL*Loader; handled metadata using MetaStage for data analysis.
  • Experience in developing custom subroutines/scripts to simplify the determination of data output paths for complex business logic and expertise working with both OLTP and OLAP architectures.
  • Experienced in handling packed-decimal fields in mainframe datasets and tape drives, loading them into DB2 and Teradata using DataStage.
  • Extensive experience in handling and creating inbound and outbound files from legacy systems.
  • Additional background in web development. Extensive experience in backend programming using Transact-SQL and PL/SQL, including stored procedures, triggers, stored functions and cursors. Working knowledge of databases such as Teradata, SQL Server 2000 and Oracle 11g/10g/9i/8i, as well as MS Access 2000.
  • Involved in various phases of software development lifecycle (SDLC) using C/C++ and Java GUI as front end and Oracle 9i as back end in UNIX and Windows NT Platform.
  • Good knowledge of Fast Track, the source-to-target mapping tool in the Information Server.
  • Strong analytic, problem solving, collaborative and interpersonal communication skills.

Education:
Bachelor of Engineering (BE)

Technical Skills:

ETL Tools: IBM InfoSphere DataStage 8.1/7.x/6.x, Parallel Extender (Manager, Designer, Director, Administrator), QualityStage V7.5, Information Server, Information Analyzer, Information Stage, IBM WebSphere Application Server
Databases: Oracle 11g/10g/9i/8i, DB2 UDB, Teradata V2R6, SQL Server 2000/2005, MS Access, Sybase, SQL Plus, Informix, Netezza
Languages: SQL, PL/SQL, Perl, Java, UNIX shell scripting, Pro*C, C/C++.
Reporting Tools: Oracle Reports 6.0, Crystal Reports XI, MicroStrategy, Cognos Report Net, Cognos8, Business Objects XIR2
Data Modeling: Designer 2000, ERwin, Sybase PowerDesigner, Fast Track, MS Visio
Operating System: UNIX, Solaris, SunOS 5.8, AIX 6.1, Windows NT/95/98/2000/XP, MS DOS
Other Tools: SQL*Loader, TOAD, SQL Developer, SQL*Plus, PVCS, PuTTY, Serena Dimensions, Harvest, Merant Dimensions, Test Director, Autosys, Tivoli and SAP Plug-in (IDoc, ABAP, BAPI stages)

Professional Experience:

Confidential, Texas, TX Jan 2011 - Till Date
DW Analyst / DataStage Developer

Confidential is one of the leading health care providers and provides a wide range of services. The project involved development and implementation of an ODS (Operational Data Store), used to adjudicate claims, and MDM (Master Data Management), a convergence of common enterprise data and functionality. The multiple input sources are driven by Data Mirror, and the control database allows archiving, reporting and updating the target database based on mapping-level rules.

Responsibilities:

  • Involved in gathering end-user and business-owner requirements, conducting user interviews, and drafting end-user specifications and BI needs.
  • Analyzed the Business requirement specs and prepared ETL specs for development purposes.
  • Demonstrated the ETL process design to the Business Analyst and Data Warehousing Architect.
  • Worked in DataStage to develop/modify existing mappings and sessions and coordinated with the ETL team to verify that source data conforms to the guidelines specified in the dimensional model.
  • Developed code fixes for the production and pre-prod environments based on the ClearQuest (CQ) issues raised. One main focus was the data quality checks, which were developed and enhanced for various target tables.
  • Main responsibility involved versioning and updating the ETL code based on environments (test, production and performance).
  • Performed Repository Administration tasks (Creating Repositories, Users, Assigning privileges, creating backups and recovery).
  • Performed analysis of the source and target elements and performed database loads based on the comparison scripts. Also used Shared Containers to differentiate and categorize the different code bases.
  • Worked with Teradata SQL Assistant and associated tools like BTEQ, FastLoad, FastExport, MultiLoad and Queryman for data extracts and data loading from the EDW (a BTEQ-based row-count check is sketched after this list).
  • Developed Mapping document and prepared Test plans for Unit testing.
  • Extensively used UNIX shell scripts for DQ checks.
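
As context for the Teradata-utility and DQ-check bullets above, a minimal sketch of the kind of shell-plus-BTEQ script such checks typically amount to; the TDPID, credentials, table and file names here are illustrative assumptions, not the project's actual objects:

    #!/bin/ksh
    # dq_rowcount_check.sh -- compare the inbound extract row count to the
    # count loaded into the Teradata target (names are illustrative).
    SRC_COUNT=$(wc -l < /data/inbound/claims_extract.dat | tr -d ' ')

    # Export a bare COUNT(*) to a file; (TITLE '') suppresses the heading.
    bteq > /dev/null 2>&1 <<EOF
    .logon tdprod/etl_user,etl_password
    .set format off
    .set titledashes off
    .export report file=/tmp/tgt_count.out
    SELECT TRIM(COUNT(*)) (TITLE '') FROM EDW.CLAIMS_TGT;
    .export reset
    .logoff
    .quit
    EOF

    TGT_COUNT=$(tr -cd '0-9' < /tmp/tgt_count.out)

    if [ "$SRC_COUNT" -ne "$TGT_COUNT" ]; then
        echo "DQ check FAILED: source=$SRC_COUNT target=$TGT_COUNT" >&2
        exit 1
    fi
    echo "DQ check passed: $SRC_COUNT rows"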

Environment:
IBM InfoSphere DataStage 8.1, Teradata SQL Assistant, ERwin Data Modeler 4.1, PL/SQL, UNIX Shell Scripts, DB2, IBM Workbench and ClearQuest

Confidential, Concord, CA Sep 2008 - Dec 2010
Sr. DataStage Developer
Confidential is a diversified financial services company providing banking, insurance, investments, mortgage and consumer finance through almost 6,000 stores, the internet and other distribution channels across North America and internationally. The METIS (Market Event Triggering Information Systems) objective in evaluating internal controls over financial reporting and disclosure is to assure management that those activities which have a significant impact on financial statements and disclosures are controlled.
Responsibilities:

  • The data warehouse was implemented using sequential files from various source systems.
  • Met with source-system users and business users to create data-share agreements and BI requirements.
  • Involved in defining best-practice and development-standards documents for DataStage jobs.
  • Developed Mapping for Data Warehouse and Data Mart objects.
  • Worked extensively with parallel stages like Copy, Join, Merge, Lookup, SAS, Row Generator, Column Generator, Modify, Funnel, Filter, Switch, Aggregator, Change Capture, Remove Duplicates and Transformer.
  • Design and Develop ETL jobs using DataStage tool to load data warehouse and Data Mart.
  • Performance tuning of ETL jobs.
  • Used the QualityStage plug-in within DataStage jobs; created and verified QualityStage jobs for match and de-duplication.
  • DataStage Jobs version control and Migration using Version management & DS-Manager tools.
  • Extensive experience in handling inbound and outbound files from outside vendors.
  • Perform data manipulation using BASIC functions and DataStage transforms.
  • Define reference lookups and aggregations.
  • Developed ETL processes using Teradata utilities (MultiLoad, FastLoad, BTEQ) and Oracle utilities (SQL*Plus, SQL*Loader).
  • Import relational metadata information for project.
  • Wrote JCL scripts in Mainframe to FTP files to different databases.
  • Used Fast Track to create logical and physical mapping models.
  • Developed DS routines to extract job parameters from files and DS routines for job auditing (see the launcher sketch after this list).
  • Create master controlling sequencer jobs using the DataStage Job Sequencer.
  • Create and use DataStage Shared Containers, Local Containers for DS jobs and retrieving Error log information.
  • Design, build, and manage complex data integration and load process.
  • Automated DataStage jobs in UNIX environment.
  • Developed PL/SQL packages to perform activities at the database level (a sample shell invocation appears after the Environment line below).
  • Developed UNIX shell scripts to automate the data load processes to the target data warehouse.
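
For the routine and automation bullets above, a minimal sketch of a generic launcher that reads name=value job parameters from a file and hands them to dsjob; the project, job and path names are assumptions:

    #!/bin/ksh
    # run_ds_job.sh -- generic DataStage launcher: reads name=value pairs
    # from a parameter file and passes each to dsjob (names illustrative).
    PROJECT=DWH_PROD
    JOB=$1
    PARAM_FILE=/etl/params/${JOB}.param

    PARAMS=""
    while IFS='=' read NAME VALUE; do
        PARAMS="$PARAMS -param $NAME=$VALUE"
    done < "$PARAM_FILE"

    # -jobstatus waits for completion and reflects the finishing status in
    # the exit code (conventions vary slightly by release: 0/1 = finished
    # OK, 2 = finished with warnings, higher values = failure).
    dsjob -run -jobstatus $PARAMS "$PROJECT" "$JOB"
    RC=$?
    if [ "$RC" -gt 2 ]; then
        echo "$JOB failed (dsjob rc=$RC); check the Director log" >&2
        exit "$RC"
    fi

In practice such wrappers also pulled log summaries with dsjob -logsum for the auditing routines; that step is omitted here for brevity.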

Environment: IBM DataStage 8.1 (IBM Information Server), Oracle 9i/10g, DB2 UDB, Mainframe, Sybase IQ,
Dimensions, SQL, PL/SQL, TOAD, SQL Server, Fast Track, Autosys, Shell Scripts, Universe Basic, KSH, Windows 2000.
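
For the PL/SQL-packages bullet above, a sketch of how such a package might be invoked from the shell automation layer; the schema, package and procedure names are hypothetical:

    #!/bin/ksh
    # call_db_maintenance.sh -- invoke a PL/SQL packaged procedure from the
    # shell layer; DW_MAINT.REFRESH_AGGREGATES is a hypothetical name.
    sqlplus -s etl_user/etl_password@DWPROD <<EOF
    WHENEVER SQLERROR EXIT FAILURE
    BEGIN
        dw_maint.refresh_aggregates(p_load_date => SYSDATE);
    END;
    /
    EXIT
    EOF

    if [ $? -ne 0 ]; then
        echo "PL/SQL maintenance step failed" >&2
        exit 1
    fi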

Confidential, Huntsville, AL Mar 2008 - Aug 2008
Production Support
Telecom
Developed a Business Intelligence system using DataStage to quickly identify customer needs and better target services. A large amount of customer-related data from diverse sources is consolidated, including customer billing, ordering, support and service usage.
Responsibilities:

  • Involved in source-system analysis.
  • Met with the Data Architect and gathered requirements.
  • Designed and developed DataStage ETL Jobs to extract data from Oracle, DB2 & Sybase.
  • Worked on tuning the SQL and DataStage jobs to improve performance of various reports.
  • Worked extensively with the offshore team and analyzed job performance.
  • Worked with Parallel Extender throughout the project.
  • Worked on scheduling jobs using Autosys and wrote scripts to insert/append jobs and conditions and to analyze dependencies.
  • Played a major role in improving the performance of complex jobs.
  • Involved in designing, developing, implementing and loading inbound files into the RIMDER database from major vendors.
  • Used XML Output stage to convert tabular data, such as relational tables and sequential files, to XML hierarchical structures using a subset of XPath expressions.
  • Used the QualityStage plug-in within DataStage jobs.
  • Worked with the QualityStage Designer client for mapping documents.
  • Wrote UNIX shell scripts to securely transfer and validate the inbound files from outside vendors, call the DataStage sequences to load the data into the RIMDER database, and send success/failure reports to the end users via email (the pattern is sketched after this list).
  • Good working knowledge in installing plug-ins like SAP Packs.
  • Worked with QualityStage capabilities such as Investigate, Match, profiling and survivorship to integrate with DataStage.
  • Extracted data from SAP tables such as MARA (Material Data), MARC (Plant Data) and LFA1 (Vendor Data).
  • Created Autosys Scripts to Schedule reports on Business Objects XI server.
  • Enabled employees to access the inventory and payroll data by developing Web interface for the database. HTML, ASP and DHTML were used for web design.
  • Created Hierarchical reports in WEBI and Created Parameterized Drill Down Reports.
  • Created complex and challenging parallel jobs utilizing different partitioning techniques, running on multiple nodes.
  • Extensively used Teradata utilities like MultiLoad, FastLoad, FastExport, BTEQ and TPump.
  • Automated the file-reading process by creating shell scripts.
  • Used SQL*Loader to load database tables from flat files.
  • Worked on web analytics applications in extracting and transforming between various databases.
  • Created before/after job subroutines and various batch jobs using the Job Sequencer.
  • Maintained and supported Existing PL/SQL ETL Processes.
  • Maintained schedules for warehouse storage. Read and interpreted UNIX logs.
  • Reviewed ETL jobs developed by other teams and made sure they followed the standards set by RIMDER team.
  • Documented Production support documents for support team.
  • Tools: MS Word, MS Excel, MS Visio, MS Access, Business Objects, SQL*Loader, TOAD, SQL Developer, SQL*Plus, PVCS, Serena Dimensions, Harvest, Merant Dimensions, DBLink, Autosys.
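
For the inbound-file bullet above (secure transfer, validation, sequence run, email notification), a condensed sketch of the pattern; the host, sequence, parameter and recipient names are illustrative assumptions:

    #!/bin/ksh
    # load_inbound.sh -- fetch a vendor file, validate it, run the DataStage
    # load sequence, and mail the outcome (names are illustrative).
    FILE=vendor_feed_$(date +%Y%m%d).dat
    TO=dw-support@example.com

    # Secure transfer via sftp in batch mode.
    sftp -b - vendor@sftp.example.com <<EOF || exit 1
    get /outbound/$FILE /data/inbound/$FILE
    EOF

    # Basic validation: present and non-empty (the real checks also
    # verified record counts against header/trailer totals).
    if [ ! -s /data/inbound/$FILE ]; then
        echo "Inbound file $FILE missing or empty" | mailx -s "RIMDER load FAILED" $TO
        exit 1
    fi

    # Run the load sequence; exit codes 0-2 are treated as a clean finish.
    dsjob -run -jobstatus -param SourceFile=/data/inbound/$FILE RIMDER Seq_Load_Vendor
    if [ $? -le 2 ]; then STATUS=SUCCESS; else STATUS=FAILURE; fi
    echo "RIMDER vendor load finished: $STATUS" | mailx -s "RIMDER load $STATUS" $TO
    [ "$STATUS" = SUCCESS ] || exit 1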

Environment: WebSphere DataStage 8.0/7.5.2/7.5 (Parallel Extender), IBM WebSphere Application Server, Teradata 5.0/6.0, Oracle 9i/10g, QualityStage V7.5, SAP Plug-in (BAPI, ABAP and IDoc stages), DB2, Sybase, PowerDesigner, Cognos Report Net, Serena Dimensions, Cognos8, SunOS UNIX 5.8, Borland StarTeam, Merant Dimensions, Lotus Notes.

Confidential,Herndon, VA Jan 2007 - Feb 2008
ETL Ascential DataStage Developer
Mortgage (Finance)
Risk Control Self-Assessment (RCSA): Freddie Mac's objective in evaluating internal controls over financial reporting and disclosure is to assure management that those activities which have a significant impact on financial statements and disclosures are controlled. RCSA is the central repository data source built for reporting purposes; various reports are built off this reporting data store in areas such as sales, marketing and call center operations. Freddie Mac implemented an MS Access database to support SOX activity-level risk and controls documentation. The new database provided a better means for the company to track and understand changes to the key control population over time and a better audit trail. Updates to controls can be made on a real-time basis as changes occur, with the attestation performed based on a snapshot of controls as of a particular date. The new database will allow greater capability to use pick lists and other methods of enhancing data quality.
Responsibilities:

  • Involved in Source systems Analysis.
  • Discussed with Data Architects and gathered requirements.
  • Designed and developed DataStage ETL Jobs to extract data from Siebel to RCSA database.
  • Worked on tuning the SQL and DataStage jobs to improve performance of various reports.
  • Worked on conversion of DataStage server jobs to Parallel extender thereby improving the performance of the batch cycle.
  • Worked on scheduling jobs using Autosys and wrote scripts to insert/append jobs and conditions and to analyze dependencies (a JIL sketch follows this list).
  • Involved in defining ETL standards and best practices for DataStage.
  • Played a major role in improving the performance of complex jobs.
  • Involved in designing, developing, implementing and loading inbound files into the RCSA database from major vendors.
  • Used SQL*Loader to load database tables from flat files.
  • Wrote Unix Shell Scripts to run DataStage jobs.
  • Reviewed ETL jobs developed by other teams and made sure they followed the standards set by RCSA team.
  • Wrote shell scripts to securely transfer and validate the inbound files from outside vendors, call the Ascential DataStage sequences to load the data into the RCSA database, and send success/failure reports to the end users via email.
  • Involved in the whole life cycle of the design and build, from writing and reviewing specifications and unit test cases within and outside the group through to the actual build.
  • Tools: MS Word, MS Excel, MS Visio, MS Access, SQL*Loader, TOAD, SQL Developer, SQL*Plus, PVCS, Serena Dimensions, Harvest, Merant Dimensions, DB Link, Autosys.
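
For the Autosys scheduling bullet above, a minimal sketch of inserting a job definition by piping JIL to the jil utility; the job name, machine, command and schedule are illustrative assumptions:

    #!/bin/ksh
    # define_rcsa_job.sh -- insert (or update) an Autosys job definition by
    # piping JIL to the jil utility (names and schedule are illustrative).
    jil <<EOF
    insert_job: RCSA_DAILY_LOAD   job_type: CMD
    machine: etlhost01
    owner: dsadm
    command: /etl/scripts/run_rcsa_load.sh
    date_conditions: 1
    days_of_week: mo,tu,we,th,fr
    start_times: "02:00"
    std_out_file: /etl/logs/RCSA_DAILY_LOAD.out
    std_err_file: /etl/logs/RCSA_DAILY_LOAD.err
    EOF

Dependency conditions (e.g. condition: s(UPSTREAM_JOB)) follow the same pattern and were appended to existing definitions with update_job.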

Environment: ETL Ascential DataStage 7.0/7.1 (EE), Oracle 9i, Oracle Designer 2000, Cognos Report Net, Cognos8, SunOS, UNIX, AutoSys, Borland StarTeam, Harvest, Lotus Notes.

Confidential,
ETL/DataStage Consultant
Circuit City, USA (Client)
A Retek application was implemented to help the Circuit City retailer efficiently create, manage and fulfill consumer demand. Circuit City Merchandise System Transformation was aimed at designing and building a "top-down" planning solution, working in Business Releases 1 and 3. Interacted with business managers, the data architect and the team lead to understand the business requirements and the design of the data warehouse. Implemented a detailed error-handling strategy to capture all error records. Data from different sources was brought into Retek using ETL (Ascential DataStage).
Responsibilities:

  • Worked with business customers on requirements gathering and logical and physical design model meetings to understand the data flow and the frequency of data loads.
  • Worked with architects and subject-matter experts to design comprehensive solutions.
  • Designed and coded the ETL logic using ETL Ascential DataStage to enable initial load and incremental processing and exception handling, restorability and recovery, data cleanup, validation and monitoring.
  • Used Complex Flat File, DataSet Stages depending on the requirement and layout for parallelism of data.
  • Used Join/Merge/Lookup Stages to replicate integration logic based on the volumes of the data.
  • Used most of the other parallel stages, such as Row Generator, Column Generator, Modify, Funnel, Filter, Switch, Aggregator, Remove Duplicates and Transformer, extensively.
  • Used DB2 UDB's Export/Import/Load utilities for fast loading or unloading of data into DB2 tables.
  • Used SQL*Loader to load database tables from flat files (a control-file sketch follows this list).
  • Built new universes per user requirements by identifying the required tables from the data mart and defining the universe connections in Business Objects XI.
  • Created parameterized Crystal reports and customized existing reports for presentations using Cross-Tab reports, Sub-reports.
  • Extensively used the DataStage Director utility to monitor and debug DataStage code.
  • Involved in DataStage Administration tasks such as creation of Projects, Environment variables and configuration files.
  • Prepared unit test, integration test, and performance test plan documents. Performed unit test, integration test, and performance test and documented test results.
  • Involved in creating Autosys for Scheduling the Job dependencies and Timings.
  • Tools: MS Word, MS Excel, MS Visio, MS Access, SQL*Loader, TOAD, SQL Developer, SQL*Plus, PVCS, Serena Dimensions, Harvest, Merant Dimensions, DB Link, Autosys.
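
For the SQL*Loader bullet above, a minimal control-file-driven sketch; the staging table, columns and paths are illustrative assumptions:

    #!/bin/ksh
    # load_stg_item.sh -- SQL*Loader load of a pipe-delimited flat file into
    # a staging table (table, columns and paths are illustrative).
    cat > /tmp/stg_item.ctl <<'EOF'
    LOAD DATA
    INFILE '/data/inbound/item_feed.dat'
    BADFILE '/data/reject/item_feed.bad'
    APPEND
    INTO TABLE STG_ITEM
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (item_id, item_desc, dept_no, unit_cost, load_dt SYSDATE)
    EOF

    sqlldr userid=etl_user/etl_password@RETEKDB control=/tmp/stg_item.ctl \
           log=/etl/logs/stg_item.log errors=50
    # sqlldr exit codes: 0 = success, 2 = warnings (some rows rejected),
    # 1/3 = failure; anything non-zero is flagged for review here.
    if [ $? -ne 0 ]; then
        echo "SQL*Loader load needs review; see /etl/logs/stg_item.log" >&2
        exit 1
    fi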

Environment: ETL Ascential DataStage 7.0/7.1 (EE), Oracle 9i, Oracle Designer 2000, Cognos Report Net, Cognos8, SunOS, UNIX, AutoSys, Borland StarTeam, Harvest, Lotus Notes.

Confidential, Hyderabad, India Jun 2004 - Jul 2005
Database Analyst
Confidential has been a leader in healthcare, specializing in disease management since 1994. The flow of data was from different sources and types, encompassing premiums, claims, expenses and contracts. The primary objective of this project was to get data from different sources (DB2, Sybase, flat files) and perform the necessary operations on the data per user requirements.
Responsibilities:

  • Extracted data from sources like DB2, Sybase, and fixed-width and delimited flat files, transformed the data according to the business requirements, and then loaded it into the Oracle database.
  • Modified several of the existing mappings and created several new mappings based on the user requirement.
  • Maintained existing mappings by resolving performance issues.
  • Used Informatica Repository Manager to maintain all the repositories of various applications.
  • Imported and Created Source Definitions using Source Analyzer.
  • Imported and Created Target Definitions using Warehouse Designer.
  • Created reusable transformations and Mapplets and used them in mappings.
  • Fine-tuned Transformations and mappings for better performance.
  • Worked extensively on different types of transformations like source qualifier, expression, filter, aggregator, rank, lookup, stored procedure, sequence generator and joiner.
  • Created, launched and scheduled tasks/sessions and configured email notification. Set up tasks to schedule the loads at the required frequency using the PowerCenter Server Manager, and generated completion messages and status reports with it.
  • Administered the Informatica server and ran sessions and batches.
  • Developed shell scripts to automate Informatica session loads (a pmcmd sketch follows this list).
  • Involved in the performance tuning of Informatica servers.
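
For the session-load automation bullet above, a sketch of a pmcmd-based launcher. The flags shown follow the later PowerCenter startworkflow syntax (the Informatica 5.1-era command line differed), and the service, folder and workflow names are assumptions:

    #!/bin/ksh
    # run_infa_load.sh -- launch an Informatica load from the shell; flags
    # follow later-version pmcmd syntax, names are assumptions.
    pmcmd startworkflow -sv INT_SVC -d DOM_PROD -u etl_user -p etl_password \
          -f CLAIMS -wait wf_claims_daily_load
    RC=$?
    if [ "$RC" -ne 0 ]; then
        echo "Informatica load failed (pmcmd rc=$RC)" >&2
        exit "$RC"
    fi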

Environment: Windows NT, Oracle 8.0, Informix, SQL, PLSQL, MS Access, SQL Server, ERwin, Informatica 5.1, MS Excel, COGNOS.
