Technical Lead Resume
Michigan, USA
SUMMARY
- Over 5 years of experience in the Business Intelligence domain; well versed in various business processes and applications.
- Core technology strengths are IBM WebSphere DataStage and Business Objects with DB2 databases in a UNIX environment.
- Expert in creating server jobs and parallel jobs using IBM WebSphere DataStage v7.5.x/6.x, following best practices for ETL projects and code management.
- Hands-on training in IBM WebSphere DataStage v8.
- Expert knowledge of the IBM WebSphere DataStage client components: DataStage Designer, DataStage Director, DataStage Manager, and DataStage Administrator.
- Implemented reusability in DataStage using server routines and shared containers in both server and parallel jobs.
- Strong skills in coding business logic (stage variables, derivations, and constraints) in the Transformer stage.
- Experienced in developing job automation, job scheduling, and running DataStage jobs using DataStage job control and a global parameter file (see the sketch after this list).
- Used import/export of DataStage projects and individual components.
- Proficient in debugging and testing DataStage jobs (UAT and SIT).
- Able to integrate stored procedures and user-defined SQL into DataStage.
- Proficient in Parallel Extender stages and parallel processing.
- Developed and tuned both SQL queries and DataStage jobs for better performance.
- Well versed in Korn shell scripting.
- Extensively used SQL Analyzer for query optimization and for creating test SQL.
- Good experience working with RDBMSs such as DB2 UDB; well versed in Oracle 10g.
- Experience generating reports per business requirements using Business Objects.
- Involved in all phases of software development, including requirements gathering, design, coding, testing, and implementation, in roles requiring client interaction and delivery management.
- Excellent ability to learn and implement new tools and technologies.
- Excellent analytical and logical programming skills with a good understanding at the conceptual level.
- Excellent presentation and interpersonal skills with a strong desire to achieve specified goals.
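A minimal Korn shell sketch of the job-control pattern referenced above: read a global parameter file of KEY=VALUE pairs and run a DataStage job through the dsjob command-line interface. The parameter file path, project name, and job name are hypothetical; the dsjob options shown (-run, -param, -jobstatus) are standard DataStage CLI options.

    #!/bin/ksh
    # Hypothetical names: global KEY=VALUE parameter file, project, and job.
    PARAMFILE=/etl/config/global.param
    PROJECT=DWPROJ
    JOB=LoadSalesFact

    # Build one -param option per entry in the global parameter file,
    # skipping blank lines and comments.
    PARAMS=""
    while read line; do
        [[ -z "$line" || "$line" = \#* ]] && continue
        PARAMS="$PARAMS -param $line"
    done < $PARAMFILE

    # -jobstatus waits for the job to finish and exits with its status code.
    dsjob -run $PARAMS -jobstatus $PROJECT $JOB
    rc=$?

    # DataStage job status codes: 1 = finished OK, 2 = finished with warnings.
    if [[ $rc -ne 1 && $rc -ne 2 ]]; then
        print -u2 "Job $JOB failed with status $rc"
        exit 1
    fi

A scheduler (cron or an enterprise scheduler) can then invoke this wrapper, so job parameters are maintained in one global file rather than hard-coded per job.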
SOFTWARE SKILLS:
Languages : PL/SQL, SQL, HTML, XML
Operating Systems : UNIX (AIX 5.x, SunOS 5.x, Solaris 2.6), Windows
Databases : DB2, Oracle, SQL Server, MS Access
ETL Tools : DataStage v7.5.x/6.x/5.x/4.x
Data Warehouse Tools : Erwin, Visio
BI Tools : Business Objects XI R2/6.5
EDUCATION & TRAINING
- Bachelor of Engineering, Electronics & Instrumentation
- IBM Certified Database Associate
- Conducted training on IBM WebSphere DataStage version 7.2
- Attended training on IBM WebSphere DataStage version 8
- Attended training on IBM WebSphere DataStage version 7.5.x
- Attended training on IBM DB2 UDB
- Attended training on Business Objects versions XI R2 and 6.5
PROFESSIONAL EXPERIENCE
Client: Confidential, Michigan, USA Jul 2006 to date
Project Title: Data Management, Jul 2008 to date
Designation: Technical Lead
Project Details:
The scope of this project includes providing production support for applications such as BCFR (Business Critical and Financial Reporting), SHAIR (Shared Applications and Infrastructure Repository), MyPersonnel, and FLEET within Chrysler, and making enhancements per user requirements.
Responsibilities:
- Requirement Gathering – Interacting with the client to gather requirements; analyzing the business requirements stated by the client and documenting them.
- Coordination and Delivery Management – Coordinating with the offshore team to ensure timely deliveries and knowledge transition across team members; coordinating with the reporting team to ensure the reports meet client requirements.
- Establishing Load Strategies and Error-Correction Methodologies – Carrying out data quality analysis and highlighting business/IT process improvements.
- Production Support – Providing post-production support for all Chrysler applications; monitoring and scheduling jobs.
- Facilitating Enhancements – Designing jobs and supporting the UAT process.
Project Title: SHAIR, May 2008 to Jun 2008
Designation: Technical Lead
Project Details:
One of Chrysler's applications is SHAIR (Shared Application and Infrastructure Repository), a data hub and data warehouse for IT support systems.
The hub functionality enables the various IT support systems to exchange data using existing processes. For example, the application ID (i.e., the AMS ID) is defined and maintained by the application owner in AMS, and all related information is made available to consuming systems such as IPR, ECC, DCCS, TTTS, etc.; MSTS uses the IPR reference number, which provides the association between the host and the AMS ID; infrastructure discovery associates a host with the products installed and the instances configured on it; and so on. SHAIR consumes data using two primary tools:
- Custom Lotus Notes agents for Domino-based applications
- ETL for all data sources supported by ETL; currently ODBC to UDB, DB2, and Sybase, plus CSV files, are being accessed, with an in-process request for XML
The data warehouse functionality in SHAIR includes relevant data from several systems and provides custom reports and an ad hoc query framework. Integration is ongoing; the IT support systems currently contributing to SHAIR are AMS, IPR, ECC, Einfo, Auto Discovery, MSTS, CSS, DNS, Tech Portal, and SeRV.
Responsibilities:
- Requirement Gathering – Gathering and analyzing the business requirements stated by the client and documenting them.
- Designing and Developing Jobs – Designing and developing jobs per client requirements.
- Preparing Unit Test Plans – Preparing test cases and providing support for UAT.
- Establishing Load Strategies and Error-Correction Methodologies – Carrying out data quality analysis and highlighting business/IT process improvements; finalizing the data architecture and determining the required physical tables and their relationships.
- Documentation & Backup – Publishing processes on knowledge-sharing platforms and taking weekly backups.
- Production Support – Providing post-production support; monitoring and scheduling jobs.
Project Title: Confidential, Jul 2007 to Jan 2008
Designation: Technical Lead
Project Details:
IPATS (Investment Planning and Tracking System) is an application used for tracking activities such as capital forecasting, authorizing capital expenditure, and restricting capital expenditure to authorized levels. A key feature of IPATS is that it enables better communication between the different groups. IPATS has three business tiers driving the application – Platform Group, Operating Group, and Spending Location. The application provides a mechanism for transferring funds and tracking expenditures between these groups.
Analysis reports form an important part of IPATS. Reporting access is required for all of the above groups of IPATS users, i.e., Platform Group, Operating Group, and Spending Location. These reports can be broadly categorized into absolute reporting and variance reporting.
The project work included understanding the information required in the analysis reports, designing the data warehouse / data mart to facilitate these reports, developing ETL components to load these data stores, designing a Business Objects universe, and developing reports that run on this universe. As a special requirement, the Business Objects universe had to be robust enough to also let users perform specialized ad hoc queries not covered by the above reports.
Responsibilities:
- Analyzing Client Requirements and Creating Technical Specification Documents – Analyzing the requirements stated by the client and documenting them.
- Establishing Load Strategies and Error-Correction Methodologies – Carrying out data quality analysis and highlighting business/IT process improvements.
- Preparing Unit Test Plans – Assessing the information needs of the users and developing test cases for the data migration jobs.
- Designing and Developing Jobs – Designing and developing data migration jobs using the ETL tool WebSphere DataStage.
- Job Optimization – Optimizing job designs.
- Unit Testing – Testing jobs for quality assurance.
- Defect Analysis – Analyzing and rectifying code defects found during unit testing.
- Providing Support for User Acceptance Testing (UAT) – Rectifying detected errors and incorporating changes or suggestions from the client during UAT.
- Documentation & Backup – Publishing processes on knowledge-sharing platforms (iQMS) and taking weekly backups.
Project Title: VMDM, Feb 2007 to Jun 2007
Designation: Sr. Data Stage Developer
Project Details:
VMDM stands for Vehicle Mix and Demand Management. It is used to produce various reports comparing forecast data with actual data; based on these, demand is calculated and production, sales, etc. are managed accordingly.
Various reports are generated from VMDM data:
- Mix Consensus Report (daily): Dealer Bank Order (11 pm)
- New Day Celebration Report (daily)
- PCBS (Price Class Body Style) Report (monthly)
- Polk Report (monthly – 2nd week of every month)
For the Polk Report, the actual data comes from an external agency (registration office: sales & lease). The data arrives at 9 am, and the report is generated at 5 pm the same day.
Polk sends the file, VMDM FTPs the data file to its own production server, and loading of the MART tables then begins.
Once loading completes, a manual validation is performed: a SELECT query counts the number of loaded records, and the result is compared with the actual number of records present in the file (sketched below).
Every year the Polk team sends a document listing the scheduled delivery time of the Polk data for each month; they send their data accordingly, though the timing may vary.
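A minimal Korn shell sketch of the post-load validation just described, assuming hypothetical file, database, and table names: count the records in the Polk source file with wc and compare the result against a COUNT(*) from the loaded MART table via the DB2 command-line processor.

    #!/bin/ksh
    # Hypothetical names for the Polk source file, database, and MART table.
    DATAFILE=/data/polk/polk_monthly.dat
    DATABASE=VMDMDB
    TABLE=MART.POLK_REGISTRATIONS

    # Number of records in the file (assumes one record per line).
    file_count=$(wc -l < $DATAFILE)

    # Number of records loaded; -x suppresses headers so only the value prints.
    db2 connect to $DATABASE > /dev/null
    db_count=$(db2 -x "SELECT COUNT(*) FROM $TABLE")
    db2 connect reset > /dev/null

    if [[ $file_count -eq $db_count ]]; then
        print "Validation passed: $db_count records loaded"
    else
        print -u2 "Validation FAILED: file=$file_count table=$db_count"
        exit 1
    fi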
Responsibilities:
- Analyzing Client Requirements and Creating Technical Specification Documents – Analyzing the requirements stated by the client and documenting them.
- Establishing Load Strategies and Error-Correction Methodologies – Carrying out data quality analysis and highlighting business/IT process improvements; finalizing the data architecture and determining the required physical tables and their relationships.
- Preparing Unit Test Plans – Assessing the information needs of the users and developing test cases for the data migration jobs.
- Designing and Developing Jobs – Designing and developing data migration jobs using the ETL tool WebSphere DataStage.
- Job Optimization – Optimizing job designs.
- Unit Testing – Testing jobs for quality assurance.
- Defect Analysis – Analyzing and rectifying code defects found during unit testing.
- Providing Support for User Acceptance Testing (UAT) – Rectifying detected errors and incorporating changes or suggestions from the client during UAT.
Project Title: Mypersonnel, Jul 2006 to Jan 2007
Designation: Data Stage Developer
Project Details:
This project was an outsourcing venture by the Chrysler Group. The project scope runs from ETL construction through UAT for the various layers of the data warehouse.
The MyPersonnel data mart is used by the Dealer Connect team in the Sales and Marketing division. It holds data about the users of the Dealer Connect portal and is used for administering the portal. It contains information about corporate individuals (represented by TID), dealer individuals (represented by SID), and dealer service providers, including HR and security-related information.
The system has around 28 Business Objects reports that users refresh themselves whenever needed.
Responsibilities:
- Analyzing Client Requirements – Analyzing the requirements stated by the client and documenting them.
- Establishing Load Strategies and Error-Correction Methodologies – Carrying out data quality analysis and highlighting business/IT process improvements; finalizing the data architecture and determining the required physical tables and their relationships.
- Preparing Unit Test Plans – Assessing the information needs of the users and developing test cases for the data migration jobs.
- Designing and Developing Jobs – Designing and developing data migration jobs using the ETL tool WebSphere DataStage.
- Job Optimization – Optimizing job designs.
- Unit Testing – Testing jobs for quality assurance.
- Defect Analysis – Analyzing and rectifying code defects found during unit testing.
- Providing Support for User Acceptance Testing (UAT) – Rectifying detected errors and incorporating changes or suggestions from the client during UAT.
- Documentation & Backup – Publishing processes on knowledge-sharing platforms (iQMS) and taking weekly backups.
- Providing support for reports using Business Objects.
Environment:
IBM WebSphere DataStage 7.5.1, Business Objects 6.5/XI R2, IBM DB2 UDB, UNIX AIX
Client: Confidential, U.K. Jul 2005 to Jun 2006
Project Title: Serco Data Migration Project
Designation: Data Stage Developer
Project Details:
The SERCO data migration is part of the Formula100 project undertaken by Serco, a UK-based financial services company. The objective of this project was to migrate data from various legacy systems into SAP. DataStage was used as the ETL tool for the data migration.
Responsibilities:
- Analyzing Client Requirements – Analyzing the requirements stated by the client and documenting them.
- Establishing Load Strategies and Error-Correction Methodologies – Carrying out data quality analysis and highlighting business/IT process improvements.
- Managing Team Members – Managing the project plan to ensure timely deliveries and knowledge transition across team members.
- Preparing Unit Test Plans – Assessing the information needs of the users and developing test cases for the data migration jobs.
- Designing and Developing Jobs – Designing and developing data migration jobs using the ETL tool Ascential DataStage.
- Job Optimization – Optimizing job designs.
- Unit Testing – Testing jobs for quality assurance.
- Defect Analysis – Analyzing and rectifying code defects found during unit testing.
- Data Cleansing – Performing data cleansing using the QualityStage tool.
- Providing Support for User Acceptance Testing (UAT) – Rectifying detected errors and incorporating changes or suggestions from the client during UAT.
- Documentation & Backup – Publishing processes on knowledge-sharing platforms (iQMS) and taking weekly backups.
Environment:
IBM WebSphere DataStage 7.5.1, Windows
Client: Confidential, U.S.A. Jul 2004 to Jun 2005
Project Title: Sony – Horizon EDW
Designation: Data Stage Developer
Project Details:
SPE's global operations encompass motion picture production and distribution, television programming and syndication, home video acquisition and distribution, operation of studio facilities, development of new entertainment technologies, and distribution of filmed entertainment in 67 countries worldwide. SPE is following a bottom-up, iterative approach to building its Enterprise Data Warehouse (EDW). As part of this initiative, approximately three phases/tracks have already been completed, and two subject areas are currently being integrated into the EDW. SPE now plans to integrate four more subject areas into the EDW during the next year.
Responsibilities:
- Analyzing Client Requirements – Analyzing the requirements stated by the client and documenting them.
- Establishing Load Strategies and Error-Correction Methodologies – Carrying out data quality analysis and highlighting business/IT process improvements.
- Preparing Unit Test Plans – Assessing the information needs of the users and developing test cases for the data migration jobs.
- Designing and Developing Jobs – Designing and developing data migration jobs using the ETL tool Ascential DataStage.
- Job Optimization – Optimizing job designs.
- Unit Testing – Testing jobs for quality assurance.
- Defect Analysis – Analyzing and rectifying code defects found during unit testing.
- Providing Support for User Acceptance Testing (UAT) – Rectifying detected errors and incorporating changes or suggestions from the client during UAT.
- Documentation & Backup – Publishing processes on knowledge-sharing platforms (iQMS) and taking weekly backups.
Environment:
IBM WebSphere DataStage 7.5.1, IBM DB2 UDB, UNIX AIX