PeopleSoft EPM Resume
New Jersey, NJ
SUMMARY
Technical Skills:
ETL: IBM InfoSphere DataStage 8.1/7.5.x2 (Server/PX)
IBM QualityStage 8.1, IBM Information Analyzer 8.1
Informatica 8.5/8.1/7.1
OLAP: Cognos 8.1, Business Objects XIR2, SAS 9, OBIEE
Data Modeling: Erwin 7/4.5, MS Visio, Embarcadero ER/Studio
Databases: Oracle 10g, DB2/UDB, SQL Server
Languages: C, C++, Java, SQL, PL/SQL, HTML
Operating Systems: AIX 5.3, Solaris 10, Linux, Windows XP
Scheduling: Autosys, UC4
Additional Knowledge: Siebel 8.0 (SFA), SAP CRM 5.0 (Service Agreements)
Oracle EBS 11.5.1 TCA (Order Management, Finance)
SAP R/3 ABAP 4.7 (IDOC, BAPI), BW 3.5
SAP NetWeaver XI/PI 7.0
Professional Summary:
IT Experience: Over 9 years of experience in Information Management covering end-to-end Data Warehousing, Business Intelligence, Data Integration, and Data Migration; developed data warehouse roadmaps, strategy and architecture, enterprise data warehouses, ODSs, dimensional data marts, and end-user reports for the Healthcare, Manufacturing, Retail, Pharmaceutical, Financial, and Insurance industries.
Data Transformation: Over 5 years with IBM InfoSphere DataStage 8.0.1/7.5.2/6.0 (Administrator, Designer, Director, and Manager), covering both Server and Parallel Extender/Orchestrate editions and Multi-Client Manager (7.5.2 to 8.0.1); Informatica PowerCenter 7.2/8.1/8.5.
Data Cleansing & Standardization: Over a year of data cleansing experience (de-duplication, relationships, address validation, identification, standardization, matching, reconciliation) using IBM DataStage and QualityStage.
Data Profiling: One year of data profiling experience with IBM Information Analyzer 8.0.1/ProfileStage (validating data values, column/table relationships, and source-to-target field mappings; source-system profiling and analysis), working jointly with SMEs and data modelers.
SAP Knowledge: SAP ECC & R/3 ABAP 4.7 (LSMW, BAPI, IDOC), SAP BW 3.5, SAP NetWeaver XI 3.0/7.0, SAP CRM 5.0.
Siebel Knowledge: Siebel SFA, Supply Chain, Service & Call Center (EIM & Base Tables).
Data Integration Knowledge: IBM WBI Message Broker (MQ, JMS, AS1, AS2, XML, SOAP, WSDL), IBM WebSphere TX, B2B Gateway, Agent to Agent.
Education:
- MS in Business Information Systems from FH Wismar, Germany.
Professional Experience:
Confidential, Oct’2010 - Present
PeopleSoft EPM (ETL/BI)
Confidential. The project is actively collaborating with state agencies to replace the current Minnesota Accounting and Procurement System with a PeopleSoft Enterprise Resource Planning system. SWIFT will integrate all of the administrative functions across state agencies, including financial, procurement, reporting, and the current SEMA4 (human resources/payroll) system.
Responsibilities:
• Developed DataStage ETL jobs and Data Loader definitions in the Enterprise Warehouse based on the client's requirements.
• Applied database programming for data warehouse schemas, with proficiency in dimensional modeling (star schema and snowflake modeling).
• Followed ETL standards and naming conventions, especially for DataStage project categories, stage names, and links, and maintained best practices in line with the existing EDW.
• Reconfigured and set up the DataStage ETL jobs to populate the ODS layer of the Enterprise Warehouse.
• Created multiple documents, including requirements, scope, and customization analysis.
• Customized the DataStage jobs according to client business needs.
• Identified bottlenecks in the DataStage ETL and Data Loader processes and tuned them as necessary, resulting in decreased processing time.
• Set up and configured the Enterprise Warehouse, including PF Business Units and SETIDs.
• Created a reconciliation process (SQL scripts, Word documents, etc.) to verify the accuracy of the results (a brief sketch follows this list).
• Created many deliverables, including the Technical Project Plan, Customization Effort Level document, ETL Process Definitions, Enterprise Warehouse Outline, End-to-End Documentation, Source Environment Setup, New EPM Environment Setup, Test Scripts, and SQL Scripts.
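The reconciliation SQL itself is not reproduced here; the Korn shell sketch below only illustrates the pattern used, comparing source and target row counts through SQL*Plus. The connection strings and the PS_VOUCHER / PS_VCHR_F00 table names are placeholders rather than the actual project objects.
#!/bin/ksh
# Reconciliation sketch: compare row counts for one subject area.
# Connection strings and table names below are placeholders.
SRC_CONN="fs_user/****@FSCM"      # hypothetical PeopleSoft source
TGT_CONN="epm_user/****@EPMDW"    # hypothetical EPM warehouse target

count_rows () {
  # Return a single COUNT(*) value for the given connection and table.
  sqlplus -s "$1" <<EOF
set heading off feedback off pagesize 0
select count(*) from $2;
exit;
EOF
}

SRC_CNT=$(count_rows "$SRC_CONN" PS_VOUCHER)
TGT_CNT=$(count_rows "$TGT_CONN" PS_VCHR_F00)

if [ "$SRC_CNT" -eq "$TGT_CNT" ]; then
  echo "Reconciliation OK: $SRC_CNT rows in source and target"
else
  echo "Reconciliation MISMATCH: source=$SRC_CNT target=$TGT_CNT" >&2
  exit 1
fi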
Environment: IBM Information Server 8.1 (DataStage, Metadata Workbench, Business Glossary), PeopleSoft EPM 9.1, PeopleTools 8.5.11, Oracle 11g, OBIEE 10.3
Confidential, Westborough, MA Feb’2010 – Oct’2010
Sr. ETL Consultant (IBM InfoSphere DataStage)
Responsibilities:
• Performed requirements analysis and documentation; developed functional and technical specifications, DW and ETL designs, detailed mapping specifications, DFDs, and scheduling charts.
• Used Information Analyzer for column analysis, primary key analysis, and foreign key analysis.
• Worked with QualityStage for data cleansing, standardization, matching, and survivorship.
• Wrote technical design specifications during the design phase and developed ETL process flow diagrams.
• Developed scheduling charts and scheduled shell scripts, Datastage ETL jobs and reports
using ESP.
• Moved the developed components into AllFusion Harvest.
• Worked with data modeler and database administrator to implement database changes.
• Introduced restartable DataStage jobs to address batch cycle failures by redesigning non-restartable DataStage jobs that are critical to the batch cycle.
• Mentored developers by introducing best practices to reduce design complexity and implement effective parallelism in DataStage PX jobs.
• Used DataStage Director to debug, run, and monitor the jobs, and DataStage Designer to import source/target metadata definitions and export/import DataStage jobs and components.
• Designed and developed new ETL jobs and modified existing jobs per the new process.
• Designed DataStage PX jobs that extract, integrate, aggregate, transform, and load data into the data warehouse or data marts.
• Created and reused metadata and job components.
• Designed the SCM data mart dimension and fact tables for data coming from Manugistics.
• Designed the jobs using the OCI/Oracle Enterprise, ODBC Enterprise, Lookup, Change Capture, Sort, Funnel, Transformer, Peek, Head, and Tail stages.
• Used the NZ_LOAD utility to load daily, weekly, and monthly data into the UNIT_CONTOL data mart.
• Created tables and indexes and modified the aggregate tables per the requirements.
• Prepared the unit and SIT test cases based on the designed and modified jobs.
• Modified the incremental sequencer to support the modified jobs.
• Worked with metadata definitions and the import and export of DataStage jobs using DataStage Designer.
• Retrieved mainframe data and placed it on the Linux box using an FTP script (an FTP and nzload sketch follows this list).
• Used the FTP plug-in to get mainframe data and load it into DB2 tables.
• Worked with the TJX Canadian team as part of production support.
• Defined the backup and recovery process for DataStage projects.
• Extensively developed UNIX shell scripts for data manipulation.
• Defined and implemented DataStage job process monitoring.
• Worked effectively in cross-functional and global environments, managing multiple tasks and assignments concurrently with strong communication skills.
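A minimal Korn shell sketch of the mainframe pull and Netezza load pattern described above. The host names, dataset name, credentials, target table, and nzload options are placeholders and would need to be checked against the actual environment.
#!/bin/ksh
# Daily pull of a mainframe extract followed by a Netezza load (sketch).
MF_HOST=mvs.prod.example.com          # placeholder mainframe host
WORK_DIR=/data/landing
DATAFILE=$WORK_DIR/unit_control_daily.dat

# Scripted FTP session to retrieve the extract from the mainframe.
ftp -n $MF_HOST <<EOF
user mfuser mfpass
ascii
get 'PROD.UNITCTL.DAILY' $DATAFILE
bye
EOF

[ -s $DATAFILE ] || { echo "FTP pull failed or file is empty" >&2; exit 1; }

# Load the pipe-delimited extract into the data mart with nzload.
nzload -host nzhost -db UNITDM -u etl_user -pw '****' \
       -t UNIT_CONTROL_DAILY -df $DATAFILE -delim '|'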
Environment: IBM InfoSphere Information Server 8.1, DB2/UDB 9.1, Netezza 7.2, SQL Server, Sybase, Oracle EBS, JDA 7, Cognos 8.3, AIX 5.3, Hummingbird, CA ESP, Harvest, Linux, Windows XP.
Confidential, Boston, MA Dec 2009 – Feb 2010
ETL Analyst
Responsibilities:
• Analyzed the existing Informatica mappings, sessions, and workflows.
• Prepared the mapping documents for new development of DataStage jobs.
• Prepared the technical analysis document based on the existing Informatica workflows and mappings and the new job development in IBM DataStage.
• Worked with the offshore team as the on-site coordinator.
• Designed the DataStage jobs and provided technical support to the offshore team.
• Prepared the unit test cases after finishing development.
• Prepared the Autosys scheduling scripts for DataStage jobs (a wrapper-script sketch follows this list).
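A minimal sketch of the kind of wrapper script an Autosys command job can call to run a DataStage job through the dsjob command line. The dsenv path, project, and job names are placeholders, and the return-code handling should be verified against the dsjob documentation for the installed release.
#!/bin/ksh
# Autosys wrapper sketch: run one DataStage job and map its status to an
# exit code Autosys can act on. Usage: run_dsjob.ksh <project> <job>
PROJECT=$1
JOB=$2

. /path/to/DSEngine/dsenv     # source the DataStage environment (placeholder path)

$DSHOME/bin/dsjob -run -jobstatus $PROJECT $JOB
RC=$?

# With -jobstatus, dsjob exits with the job status: 1 = finished OK,
# 2 = finished with warnings; treat anything else as a failure.
if [ $RC -eq 1 ] || [ $RC -eq 2 ]; then
  exit 0
else
  echo "DataStage job $PROJECT.$JOB failed (dsjob rc=$RC)" >&2
  exit 1
fi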
Environment: IBM DataStage 7.5.2 (PX), Informatica 7.2, Oracle 9.2.4, Autosys, Argent, Sun Solaris 10.
Confidential, Westborough, MA July’2009 – Dec 2009
Sr. ETL Consultant (IBM InfoSphere DataStage)
Responsibilities:
• Designed and developed new ETL jobs and modified existing jobs.
• Followed ETL standards and naming conventions, especially for DataStage project categories, stage names, and links, and maintained best practices in line with the existing EDW.
• Designed DataStage PX jobs that extract, integrate, aggregate, transform, and load data into the data warehouse or data marts.
• Created and reused metadata and job components.
• Designed the SCM data mart dimension and fact tables for data coming from Manugistics.
• Designed the jobs using the OCI/Oracle Enterprise, ODBC Enterprise, Lookup, Change Capture, Sort, Funnel, Transformer, Peek, Head, and Tail stages.
• Created tables and indexes and modified the aggregate tables per the requirements.
• Prepared the unit test cases based on the designed and modified jobs.
• Modified the incremental sequencer to support the modified jobs.
• Retrieved mainframe data and placed it on the Linux box using an FTP script.
• Used the FTP plug-in to get mainframe data and load it into DB2 tables (a DB2 load sketch follows this list).
• Defined the backup and recovery process for DataStage projects.
• Defined and implemented DataStage job process monitoring.
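A brief Korn shell sketch of loading a landed mainframe extract into DB2 with the command-line processor. The database, schema, table, and file names are placeholders, and the LOAD options shown are illustrative rather than the project's actual settings.
#!/bin/ksh
# Load a pipe-delimited mainframe extract into a DB2 staging table (sketch).
DATAFILE=/data/landing/store_sales.dat

db2 connect to EDWDB user etl_user using '****' || exit 1

# coldel| identifies the pipe as the column delimiter.
db2 "LOAD FROM $DATAFILE OF DEL MODIFIED BY coldel| INSERT INTO STAGE.STORE_SALES_STG NONRECOVERABLE"
RC=$?

db2 connect reset
exit $RC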
Environment: IBM DataStage 7.5.2/8.x, DB2/UDB 9.1, Netezza 7.2, SQL Server, Sybase, Oracle EBS, Manugistics SCM, Cognos 8.0, MQ Series, AIX 5.3, Sun Solaris, Windows XP.
Confidential, Durham, NC Feb’2009 – July’2009
Sr. ETL Consultant (IBM DataStage)
Responsibilities:
• Analyzed the existing EDW and prepared the mapping documents.
• Designed and developed new ETL jobs and modified existing jobs.
• Followed ETL standards and naming conventions, especially for DataStage project categories, stage names, and links, and maintained best practices in line with the existing EDW.
• Designed DataStage PX jobs that extract, integrate, aggregate, transform, and load data into the data warehouse or data marts.
• Developed PL/SQL packages to create daily reports in CSV format and email them to business users, using the built-in Oracle UTL_FILE and UTL_SMTP packages.
• Developed PL/SQL packages for daily summarization of sales and customer data, implemented in UNIX and PL/SQL.
• Developed PL/SQL packages to automate many of the manual queries used by business users.
• Created and reused metadata and job components.
• Designed the jobs using the OCI/Oracle Enterprise, Lookup, CDC, Sort, Funnel, Transformer, Peek, Head, and Tail stages.
• Implemented SCD Type 2 using the SCD and Change Capture stages.
• Worked with metadata definitions and the import and export of DataStage jobs using DataStage Manager.
• Implemented security for DataStage users and projects.
• Created crosscheck UNIX shell scripts for interface files and audit reports on data extracted and loaded, and implemented post-execution scripts to reconcile the data (a crosscheck sketch follows this list).
• Set up UNIX groups, defined UNIX user profiles, and assigned privileges.
• Defined the backup and recovery process for DataStage projects.
• Defined and implemented DataStage job process monitoring.
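A small Korn shell sketch of the crosscheck idea: compare the interface-file record count with the number of rows loaded into the staging table. The connection string, table, and file names are placeholders.
#!/bin/ksh
# Crosscheck sketch: file record count vs. rows loaded today.
IFACE_FILE=/data/interface/customer_feed.dat
FILE_CNT=$(wc -l < $IFACE_FILE)

DB_CNT=$(sqlplus -s etl_user/'****'@EDW <<EOF
set heading off feedback off pagesize 0
select count(*) from stg_customer_feed where load_date = trunc(sysdate);
exit;
EOF
)

echo "$(date '+%Y-%m-%d %H:%M') file=$FILE_CNT table=$DB_CNT" >> /data/audit/crosscheck.log

if [ "$FILE_CNT" -ne "$DB_CNT" ]; then
  echo "Crosscheck mismatch for $IFACE_FILE" >&2
  exit 1
fi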
Environment: IBM DataStage 7.5.2/7.5.3 (PX/MVS), Sun Solaris, Cognos 8.1, Oracle 10g, SQL Server 2005, TOAD, TortoiseCVS 1.8.3, Erwin.
Confidential, Sunnyvale, CA Mar’2008 – Jan’2009
Sr. ETL Consultant (IBM InfoSphere DataStage)
Responsibilities:
• Analyzed the existing EDW environment and identified the gaps.
• Performed impact analysis and identified cardinality changes.
• Converted business logic into technical specifications.
• Prepared high-level and low-level design documents.
• Scheduled meetings with upstream and downstream teams.
• Designed and developed new ETL jobs and modified existing jobs.
• Followed ETL standards and naming conventions, especially for DataStage project categories, stage names, and links, and maintained best practices in line with the existing EDW.
• Tuned the DataStage job designs and custom SQL scripts.
• Used the Investigate, Standardize, Match, and Survive stages in QualityStage to harmonize and align the data and create a single view of customers.
• Used the domain pre-processor and domain-specific rule sets such as USPREP, USNAME, USADDR, and USAREA to standardize the Customer Master and Vendor Master; standardized, matched, de-duplicated, and survived records using custom rule sets in QualityStage.
• Used QualityStage to develop jobs that converted variable-length records to fixed-length records, parsed fields into single-domain data fields, identified the most commonly used pattern for each field, selected subsets of records, and standardized the data by converting each field into its most commonly used format.
• Used the SAP BW Pack (BW Load and BW Open Hub Extract stages) to push and pull data to and from SAP BW InfoPackages and process chains.
• Extensively used SAP R/3 stages such as IDoc Load, IDoc Extract, ABAP, and BAPI.
• Customized the PL/SQL code per the rules engine.
• Used PL/SQL to create packages, functions, and procedures (an invocation sketch follows this list).
• Worked with various internal teams as well as the offshore team.
• Created tables and indexes and modified the aggregate tables per the requirements.
• Prepared the unit test cases based on the designed and modified jobs.
• Modified the incremental sequencer to support the modified jobs.
• Maintained and assigned defects using HP Quality Center.
• Implemented security for DataStage users and projects.
• Set up development, QA & Production environments.
• Migrated jobs from development to QA to Production environments.
• Involved in preparing FSD documentation.
• Defined production support methodologies and strategies.
• Defined the backup and recovery process for DataStage projects.
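The packages themselves are not shown here; the sketch below only illustrates how such a PL/SQL package procedure might be driven from the batch stream through SQL*Plus. The rules_engine_pkg package, its parameter, and the connection string are hypothetical names used for illustration.
#!/bin/ksh
# Sketch of invoking a custom PL/SQL package procedure from a batch script.
sqlplus -s etl_user/'****'@CDH <<EOF
whenever sqlerror exit failure
begin
  -- hypothetical rules-engine package call
  rules_engine_pkg.apply_customer_rules(p_run_date => trunc(sysdate));
end;
/
exit;
EOF

if [ $? -ne 0 ]; then
  echo "Rules-engine package call failed" >&2
  exit 1
fi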
Environment: Windows XP/Sun Solaris, IBM DataStage 7.5.x (Server), IBM DataStage, QualityStage & Information Analyzer 8.0.1, Cognos 8.1, OBIEE, Siebel 8.0, Oracle 11i, Oracle CDH, SAP ECC 6.0, mySAP CRM 5.0, SAP NetWeaver PI 7.0, TIBCO, Oracle 10g, TOAD 8.0, Erwin
Confidential, Roanoke, VA Aug’2007 – Feb’2008
Sr. ETL Consultant (IBM InfoSphere DataStage)
Responsibilities:
• Designed the ETL jobs based on the DMD with the required tables in the dev environment.
• Designed and developed the star schema dimensional model.
• Developed various jobs using DataStage PX stages: DB2 API/DB2 EE, Lookup, Data Set, Funnel, Remove Duplicates, Change Capture, Change Apply, and ODBC.
• Provided production and customer support for the newly developed data marts and subject areas such as Replenishment Stock and Inventory Reduction.
• Applied rule sets using QualityStage to maintain customer information.
• Provided staging solutions for data validation and cleansing with QualityStage and DataStage ETL jobs.
• Loaded data into the Financial data mart, sourcing data from the PeopleSoft GL tables.
• Worked with PeopleSoft EPM as a DataStage consultant.
• Read the supply chain data from the Salesforce application.
• Tuned the DataStage jobs at the source, transformation, and target load levels.
• Supported the existing jobs in DataStage 7.5.2 using Multi-Client Manager.
• Extracted data from the iSeries DB2 database, Oracle, and flat files.
• Implemented Slowly Changing Dimension Type 2 concepts.
• Performed performance tuning of the DB2 target database using the explain plan (access plan).
• Performed validation and unit testing using the required existing AS/400 data.
• Validated and compared the source flat-file data using a Perl script on the UNIX box (a comparable shell sketch follows this list).
• Scheduled the DataStage batch jobs using UC4.
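The production validation was a Perl script; the Korn shell/awk sketch below shows the same kind of flat-file check (field-count validation and comparison against the prior day's extract) with placeholder file names and field counts.
#!/bin/ksh
# Flat-file validation sketch (ksh/awk stand-in for the Perl check).
TODAY=/data/feeds/inventory_today.dat
PRIOR=/data/feeds/inventory_prior.dat
EXPECTED_FIELDS=12

# Flag any record whose pipe-delimited field count is wrong.
awk -F'|' -v n=$EXPECTED_FIELDS 'NF != n {print FILENAME": line "NR" has "NF" fields"}' $TODAY \
  > /data/feeds/inventory_rejects.txt

# Report keys present in the prior extract but missing today (field 1 is the key).
awk -F'|' '{print $1}' $PRIOR | sort > /tmp/prior_keys.$$
awk -F'|' '{print $1}' $TODAY | sort > /tmp/today_keys.$$
comm -23 /tmp/prior_keys.$$ /tmp/today_keys.$$ > /data/feeds/missing_keys.txt
rm -f /tmp/prior_keys.$$ /tmp/today_keys.$$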
Environment: IBM InfoSphere Information Server 8.0, Crystal Reports, SPSS, Business Objects XIR2, OBIEE, PeopleSoft, JDA, Oracle 10g, SQL Server, DB2 UDB 8/9.1, IBM BCU, Toad, AIX 5.3, Win XP Pro, UC4.
Confidential, Lexington, KY Sept’2006 – July’2007
Sr. ETL Consultant (IBM DataStage)
Responsibilities:
• Responsible for gathering business requirements from end users.
• Prepared the data mapping documents and pseudocode.
• Designed the ETL jobs based on the DMD with the required tables in the dev environment.
• Read data from the Siebel SFA, Supply Chain, and Service modules.
• Supported global regions including North/South America, EMEA, and Asia Pacific.
• Read data from Siebel base tables using the Siebel Direct plug-in.
• Wrote data to Siebel base tables through the EIM tables using the EIM plug-in.
• Designed and developed the star schema dimensional model.
• Designed and developed various jobs using DataStage Parallel Extender stages: OCI, Hashed File, Sequential File, Aggregator, Pivot, and Sort.
• Implemented Slowly Changing Dimension concepts.
• Worked with metadata definitions and the import/export of .dsx files using DataStage Manager.
• Set up UNIX groups, defined UNIX user profiles, and assigned privileges.
• Defined the backup and recovery process for DataStage projects.
• Defined and implemented DataStage job process monitoring.
• Defined Korn shell scripts for the file-watcher and file-archiving processes (a file-watcher sketch follows this list).
• Installed packages and managed patches.
• Performed validation and unit testing using the required Siebel data.
• Served as the primary contact for business users during UAT testing.
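A minimal Korn shell file-watcher sketch along the lines of the scripts described above; the directories, file pattern, polling window, and archive naming are placeholders.
#!/bin/ksh
# File-watcher sketch: poll for an inbound extract, then archive it.
WATCH_DIR=/data/inbound
PATTERN="siebel_sfa_*.dat"
ARCHIVE_DIR=/data/archive
MAX_WAIT_MIN=120

i=0
while [ $i -lt $MAX_WAIT_MIN ]; do
  FILE=$(ls $WATCH_DIR/$PATTERN 2>/dev/null | head -1)
  if [ -n "$FILE" ]; then
    echo "Found $FILE; releasing downstream jobs"
    # ... trigger the DataStage sequence here ...
    mv $FILE $ARCHIVE_DIR/$(basename $FILE).$(date +%Y%m%d%H%M%S)
    exit 0
  fi
  sleep 60
  i=$((i + 1))
done

echo "Expected file not received within $MAX_WAIT_MIN minutes" >&2
exit 1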
Environment: Ascential DataStage 7.5 (Server/PX/MVS), Crystal Reports, MicroStrategy, BO XI R2, Oracle 10g, SQL, PL/SQL, Siebel 7.3, JD Edwards, AS/400, Toad, UNIX shell scripts, Sun Solaris 8.0, Win XP, VM, ClearCase
Confidential, Atlanta, GA Jun’2005 – Mar’2006
Sr. ETL Consultant (IBM Data Stage)
Responsibilities:
• Identified the various DataStage jobs, PL/SQL scripts, and UNIX scripts that were impacted and had to be designed or created, and produced the technical specifications and mapping documents for the various tasks.
• Developed and loaded data warehouse tables such as dimension, fact and aggregate tables
using IBM DataStage.
• Extensively used IBM DataStage designer to perform complex mappings based on user
specifications.
• Used the Aggregator and Transformer stages to calculate fields based on the business requirements and to perform date transformations.
• Developed Triggers and Views for data auditing and security purposes.
• Developed UNIX scripts (Korn shell) to communicate production error messages to
appropriate support personnel and developed routines to automate the copying of files to
remote servers.
• Developed UNIX scripts for data validation, threshold checks and email reject records to
the business and ETL primaries.
• Loaded data into Teradata using DataStage, FastLoad, BTEQ, FastExport, MultiLoad, and Korn shell scripts (a BTEQ sketch follows this list).
• Analyzed business requirements, transformed data, and mapped source data using the
Teradata Financial Services Logical Data Model tool, from the source system to the
Teradata Physical Data Model
• Worked closely with the source team and users to validate the accuracy of the mapped
attributes
• Troubleshot issues and created automatic script/SQL generators.
• Maintained versions of DataStage Code and Unix Scripts using Rational ClearCase.
• Used MS Visio to illustrate process flows for documentation purposes.
• Performance tuning of the complex queries using the explain plans.
• Creation of documents for test plans and technical guides.
• Involved in SIT and UAT test case generation and support.
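A brief sketch of driving a Teradata step from Korn shell with BTEQ; the TDPID, credentials, databases, and table names are placeholders, and (as noted above) high-volume tables were loaded with FastLoad/MultiLoad rather than plain SQL.
#!/bin/ksh
# BTEQ step sketch: move today's staged rows into a target table.
bteq <<EOF
.LOGON tdprod/etl_user,****;
DATABASE EDW_STG;

INSERT INTO EDW.F_ACCT_BAL
SELECT * FROM EDW_STG.STG_ACCT_BAL
WHERE load_dt = CURRENT_DATE;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

[ $? -eq 0 ] || { echo "BTEQ load step failed" >&2; exit 1; }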
Environment: Ascential DataStage 7.5/7.0/EE, QualityStage, Oracle 9i, PL/SQL, Toad, SQL Server, PeopleSoft, DB2 UDB, Teradata, AIX.
Confidential, Omaha, NE Oct’2004 – Mar’2005
Sr. ETL Consultant (IBM DataStage)
Responsibilities:
• Involved in migration process from DEV to Test and then to PRD.
• Obtained detailed understanding of data sources, Flat files and Complex Data Schemas.
• Used IBM DataStage as the ETL tool to extract data from source systems such as Sybase, DB2, VSAM files, and flat files, aggregate it, and load it into the target DB2 database.
• Read and wrote data using the Sybase OC stage.
• Created a reusable repository using DataStage Manager.
• Involved in multiple subject areas like Providers, Claims.
• Designed XML stages to read XML log files and capture DataStage job audit data.
• Installed and configured the MQ Series plug-in and captured online messages.
• Developed jobs in Parallel Extender using stages such as Transformer, Aggregator, Data Set, External Filter, Row Generator, Column Generator, and Vector.
• Created crosscheck UNIX shell scripts for interface files and audit reports on data extracted and loaded, and implemented post-execution scripts to reconcile the data.
Environment: Ascential DataStage 6.0/7.5.1, DB2, AS/400, Sybase, AIX, WebFOCUS, ClearCase, ClearQuest, and Cybermation.
Confidential, PA Nov’2003 – Sep’2004
Sr. ETL Consultant (IBM Data Stage)
Responsibilities:
• Created a prototype for PSL to ease the quarterly submissions.
• Used Ascential DataStage as the ETL tool to extract data from sources such as Sybase, DB2, VSAM files, and flat files, and loaded the data into the target Oracle database.
• Implemented Oracle Warehouse Builder and the Oracle bulk loader for the staging area.
• Used the Lookup stage with an Oracle reference table for the insert/update strategy and for updating slowly changing dimensions.
• Used DataStage Parallel Extender to split the data into subsets and load it, utilizing the available processors for job performance and managing system resources through the configuration file in the Orchestrate environment.
• Worked with Metadata Definitions, Import and Export of Datastage components using Datastage Manager.
• Integrated with SAP using the DataStage SAP R/3 Load and Extract Pack (ABAP, IDoc, and BAPI).
• Customized the ABAP Programs while using the ABAP Stages.
• Developed SQL scripts for data validation and testing.
• Created Korn shell scripts for extracting and cleaning up the data before loading it into the target CDW, for scheduling the jobs, and for email notifications capturing the status of completed jobs.
• Created jobs in DataStage to transfer data from heterogeneous sources such as COBOL files, fixed-record flat files, CSV files, DB2, Oracle, and text files to Oracle 9i.
• Converted data from EBCDIC to ASCII using the CFF stage (an equivalent command-line sketch follows this list).
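In the project this conversion was handled by the CFF stage; the short dd sketch below only illustrates the equivalent command-line conversion, with placeholder file names and record length.
#!/bin/ksh
# EBCDIC-to-ASCII conversion sketch; 300 is a placeholder record length.
IN=/data/mainframe/claims.ebcdic
OUT=/data/landing/claims.ascii

# With cbs set, conv=ascii converts each fixed-length EBCDIC record to an
# ASCII line (trailing blanks trimmed, newline appended).
dd if=$IN of=$OUT cbs=300 conv=ascii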
Environment: Ascential DataStage 6.0/7.x (Server/PX/MVS), WebSphere, QualityStage, ProfileStage, MetaStage, DB2 UDB 7.0/8.0, Oracle 9.2, PL/SQL, SQL Server 2000, SAP R/3, ShowCase, Erwin 4.0, Cognos, IBM AIX 4.2, Rational ClearCase, and Rational ClearQuest.