Informatica Lead Developer Resume
Summary:
- Working as an Onsite Coordinator.
- 5 years of IT experience at Cognizant Technology Solutions in the analysis, design, development, and implementation of data warehouse systems, with expertise in Teradata, Informatica, Oracle, and Unix/Linux.
- Extensively worked in the design, coding, and testing phases of the software development life cycle, including implementing and supporting Teradata and batch-processing jobs.
- Worked on Teradata performance tuning activities.
- Developed conceptual, logical, and physical data models using ERwin.
- Worked with SQL and PL/SQL in Teradata and Oracle.
- Worked with Teradata client utilities: BTEQ, FastLoad, MultiLoad, TPump, and FastExport.
- Prepared design documents, unit test cases, QA test cases, QA test plans, and test strategy.
- Performed basic administrative activities in Teradata.
- Developed various test cases for testing Informatica code.
- Worked with major pharma clients such as Pfizer, Bayer, and Abbott.
- Worked on data modeling.
- Experience in Life Sciences Domain.
- Excellent communication, interpersonal, analytical, and problem-solving skills.
EDUCATION:
Bachelor of Technology, Electrical
WORK EXPERIENCE:
Confidential, USA
Role: Informatica Lead
PROJECT DESCRIPTION:
This project deals with the trend of TRx and NRx market share for Abbott products over a period of 24 months. It gives business users a report of how Abbott products are performing in the market, provides the share of Abbott products within the total market as defined by therapeutic class, and gives insight into how competitor products are performing in the market.
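Illustrative only: a minimal Teradata SQL sketch of how such a market-share trend could be computed; the table and column names here are hypothetical, not taken from the project.

-- Hypothetical sketch: monthly TRx/NRx market share of Abbott products
-- within each therapeutic class over a trailing 24-month window.
SELECT
    rx.month_id,
    rx.therapeutic_class,
    SUM(CASE WHEN rx.manufacturer = 'ABBOTT' THEN rx.trx_count ELSE 0 END) * 1.0000
        / NULLIF(SUM(rx.trx_count), 0) AS trx_market_share,
    SUM(CASE WHEN rx.manufacturer = 'ABBOTT' THEN rx.nrx_count ELSE 0 END) * 1.0000
        / NULLIF(SUM(rx.nrx_count), 0) AS nrx_market_share
FROM rx_fact rx
WHERE rx.month_id >= ADD_MONTHS(CURRENT_DATE, -24)
GROUP BY rx.month_id, rx.therapeutic_class
ORDER BY rx.month_id, rx.therapeutic_class;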
RESPONSIBILITIES:
- Interacted with Functional teams to understand the requirements.
- Performed data profiling activities.
- Analyzed existing SAS code for conversion into Informatica ETL code.
- Handled design, development, and testing, and coordinated with the offshore team.
- Compared data between SAS files and ETL output files.
- Involved in writing SQL scripts for unit and functional testing.
- Provided recommendations at the query level and the table level to improve performance.
Environment: Informatica, Teradata, UNIX
Confidential, USA
Role: Informatica Lead
PROJECT DESCRIPTION:
This project is a conversion of a legacy SAS system into Informatica and Teradata. It has five deliverables: Xponent PlanTrak, Mail Order, Encumbered Mail Order, Specialty with no plan, and Specialty with plan. Data is rolled up from strength level to brand level using business rules and handed off to BOT, a downstream system process that aggregates it to another level maintained by BOT. The outbound files, tagged with BOT entity ID and BOT drug ID, come back as inbound files that are used by Abbott internal applications for their analysis.
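Illustrative only: a minimal SQL sketch of the strength-to-brand rollup described above; the object names are hypothetical, and the actual mapping rules are an assumption.

-- Hypothetical sketch: roll strength-level prescription data up to brand level
-- before producing the outbound file for the downstream BOT process.
INSERT INTO brand_level_sales (brand_id, month_id, trx_count, nrx_count)
SELECT
    p.brand_id,          -- assumed business rule: each strength maps to one brand
    s.month_id,
    SUM(s.trx_count),
    SUM(s.nrx_count)
FROM strength_level_sales s
JOIN product_dim p
  ON s.strength_id = p.strength_id
GROUP BY p.brand_id, s.month_id;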
RESPONSIBILITIES:
- Interacted with Functional teams to understand the requirements.
- Performed data profiling activities.
- Analyzed existing SAS code for conversion into Informatica ETL code.
- Handled design, development, and testing, and coordinated with the offshore team.
- Compared data between SAS files and ETL output files.
- Involved in writing SQL scripts for unit and functional testing.
- Provided recommendations at the query level and the table level to improve performance.
Confidential, USA
Role: Informatica Lead
PROJECT DESCRIPTION:
At present, multiple consumer groups load different sets of Backbone data into their applications themselves. This creates data redundancy and risks data loss, and performing the same jobs multiple times causes unnecessary delays. Hence the Backbone files are made available in a single unified location, eliminating the need for separate applications to load the data themselves. The ODS Backbone tables will be the single location for BOT Lives, Formulary, and Plan data for all Abbott applications, and the automation of the load will make the process seamless and efficient.
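Illustrative only: a minimal SQL sketch of the kind of automated merge into a unified ODS Backbone table; the table and column names are hypothetical, not taken from the project.

-- Hypothetical sketch: merge a staged Backbone extract into the single ODS table
-- so all consuming applications read from one unified location.
MERGE INTO ods_backbone_formulary AS tgt
USING stg_backbone_formulary AS src
   ON tgt.formulary_id = src.formulary_id
WHEN MATCHED THEN UPDATE SET
    plan_id = src.plan_id,
    status_cd = src.status_cd,
    load_dt = CURRENT_DATE
WHEN NOT MATCHED THEN INSERT
    (formulary_id, plan_id, status_cd, load_dt)
VALUES
    (src.formulary_id, src.plan_id, src.status_cd, CURRENT_DATE);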
RESPONSIBILITIES:
- Interacted with Functional teams to understand the requirements.
- Performed data profiling activities.
- Handled design, development, and testing, and coordinated with the offshore team.
- Involved in writing SQL scripts for unit and functional testing.
- Provided recommendations at the query level and the table level to improve performance.