Ecolab Data Warehouse Resume Profile
PROFESSIONAL SUMMARY: |
- 8 years of IT experience in Informatica PowerCenter development, ETL, and Data Integration/Data Warehousing techniques.
- Currently working as a full-time Senior ETL Developer/Data Integration Specialist with IBM for the client Ecolab.
- Major competencies in Informatica PowerCenter development, Informatica administration activities, Informatica Cloud Services, dimensional data modelling, data integration, data warehouse maintenance, support and development, PL/SQL, and data profiling using the Informatica Analyst tool.
- Good understanding of ETL techniques and dimensional data models, gained from working on various end-to-end ETL projects.
- Certified Scrum Developer.
- Experience on data profiling using Informatica Analyst tool.
- Good knowledge and experience of Informatica Cloud Services, including data integration tasks for Salesforce.com.
- Extensively used PowerExchange for SAP NetWeaver.
- Performed ETL testing using Informatica Data Validation Option (DVO).
- Worked on job scheduling tools, e.g. UC4.
- Good knowledge and experience of UNIX commands and scripts.
- Worked extensively on heterogeneous source and target systems for ETL, e.g. Oracle, DB2, SQL Server, flat files, Mainframes, Salesforce, and SAP.
- Extensively worked in both development and 24x7 production support environments.
- Worked on the migration of Informatica 8.6 to 9.1 for Ecolab.
- Extensive knowledge of Supply chain and Finance domains and implementation of IT processes in these domains.
- Managed and led teams across multiple geographies and time zones; coordinated with third-party vendors and handled effort estimation, capacity planning, and task assignment and scheduling.
- Extensively worked in onshore-offshore delivery models.
- Good organizational, interpersonal and communication skills.
- Strong problem solving, troubleshooting and analytical skills.
- Ability to quickly learn new technologies in a dynamic environment.
- Implemented CMMI Level 5 standards, IT best practices, and processes.
EXPERIENCE DETAILS: |
- 4 years of experience at IBM (Aug 2010 - till date)
- 2 years of experience at Wipro Technologies (Jul 2008 - Aug 2010)
CURRENT ROLES AND RESPONSIBILITIES: |
- Develop and implement ETL procedures for all new projects and provide support to all existing applications.
- Prepare ETL processes according to business requirements, perform tests, and validate all data flows.
- Document all technical and system specifications for all ETL processes, perform unit tests on all processes, and prepare required programs and scripts.
- Schedule jobs, monitor production data loads, and provide support in case of job failures and data issues.
- Analyze and interpret all complex data and coordinate with data analysts to validate all requirements.
- Design and implement all publication activities such as dashboards and maintain required system documentation.
- Collaborate with all business users to gather requirements and work with them to resolve day-to-day data-related issues and bug fixes.
- Implement bug fixes, provide resolutions, and perform root cause analysis for all production issues. Validate data, perform routine tests on databases, and provide support to all ETL applications.
- Release management of all ETL code.
- Creation of Informatica Cloud data synchronization, data replication, and contact validation tasks for Salesforce.com.
- Migration of all ETL objects across environments in accordance with the change management process.
- Document all test procedures, coordinate with business users to resolve all requirement issues, and maintain quality for the same.
- Work with multiple stakeholders: Application Owners, Architects, QA, Business Analysts, and external vendors.
- Perform the system and unit testing for all the ETL code and work with business users during UAT to resolve any issues.
Infrastructure Support -
- Responsible for managing the Development, QA, and Production environments.
- Ensuring code is in sync across all three environments, and migrating code between repositories using Deployment Groups.
User Creation and Security -
- Creation of various types of connections, e.g. Application, Relational, FTP, etc.
- Responsible for adding/deleting users in the repositories and defining roles and privileges for different users.
- Defining permissions on various connections.
- Creation and maintenance of various services, such as SAP and BW, in Informatica.
- Creation of Custom Properties as per the requirements from developers.
Load Monitoring -
- Monitoring load workflows to verify that loads run at their scheduled times.
- Recovering or restarting loads in case of failures.
- Identifying delays in the loading process when sessions are in a hung state.
- Most of the above monitoring activities have been automated, with some manual re-checks remaining.
Other Activities:
- Install, upgrade, configure, and maintain Informatica components, including PowerCenter, and coordinate with Informatica support for patches and bug fixes.
- Understand complex Informatica maps, workflows, and session logs, and participate in configuring and tuning the respective objects to meet performance and recovery objectives.
- Work with LDAP configuration and user management.
- Coordinate with development and infrastructure groups to prepare and implement planned maintenance outages.
- Communicate effectively with management, providing a heads-up on issues, business impact, mitigation and resolution plans, and escalation procedures.
- Manage and lead offsite and onsite team members.
TECHNICAL SKILLS: |
ETL Tools | Informatica 9.x, Informatica Cloud Services, Informatica Data Validation Option (DVO), Informatica Analyst |
Administration | Informatica Administration (Informatica 9.1 HF4) |
Databases | Oracle 9i/10g, SQL Server, IBM DB2. |
Tools and Utilities | Toad 9.0, Harvest, Remedy, PuTTY, ServiceNow, UC4 |
WORK EXPERIENCE DETAILS: |
Project | Ecolab Data Warehouse |
Client | |
Duration | |
Technology | Informatica Cloud Services |
Project Abstract | Confidential is the leading global developer of premium cleaning, sanitizing, pest elimination, maintenance and repair products and services for the world's hospitality, institutional, and industrial markets, headquartered in St. Paul, Minnesota. Customers include hotels and restaurants, foodservice, healthcare and educational facilities, quick-service (fast food) units, commercial laundries, light industry, dairy plants and farms, and food and beverage processors. Products and services are marketed by the industry's largest and best-trained direct sales-and-service force, numbering more than 10,000 associates who advise and assist customers in meeting a full range of cleaning, sanitation, and service needs. The current project involves the development and maintenance of the data warehouse and ETL mappings for Ecolab, covering all sales, performance, global customer, product, employee, and business-related data. |
Roles Responsibilities |
|
Project | |
Client | Confidential |
Duration | |
Technology | |
Project Abstract | Genpact is one of the top IT/ITES outsourcing firms, with around 30 different businesses across global locations; each business consists of many sub-businesses, and every sub-business has multiple processes. In this project, a financial data warehouse was built for the organization, from which monthly and quarterly financial reports are published as required for business analysis. |
Responsibilities |
|
Project | |
Client | Confidential |
Duration | |
Technology and Tools | |
Project Abstract | Seagate HRIT was a development, maintenance, and enhancement project involving the warehousing of HR data for Seagate employees. It involved the creation of mappings for generating payrolls and other employee-related information for Seagate employees worldwide, as well as periodic changes to existing mappings in line with revisions to Seagate's HR payroll policies. The data warehouse also contained sales data for Seagate products across the world, which was used to create region-wise reports for the analysis of sales and profit and loss. |
Responsibilities |
|
Project | |
Client | |
Duration | |
Technology and Tools | Informatica, Oracle 9i,Toad,PL/SQL Developer |
Project Abstract | The GAPC project stores the performance data of various GE businesses in the data warehouse. The businesses include GECIS, GEINFRA, GEMONEY, GE HEALTHCARE, and GE COMMERCIAL FINANCE. The data warehouse stores data collected from the BAC database (a performance testing tool); this data is used by various GE customers for the analysis of their applications and portals, helping them analyse past trends in application performance (downtime, maintenance time, response time, etc.). |
Responsibilities |
|