Informatica Consultant Resume
NJ
SUMMARY
- Over seven years in Information Technology, with Data Warehousing experience combined with business requirements analysis, application design, data modeling, development, testing, and documentation.
- Implementation of Warehousing and Database business systems for Pharmaceutical, Insurance, Financial and Telecommunications industries.
- Five years of strong Data Warehousing experience using Informatica Power Center (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation developer), Informatica Power Mart, Power Connect, Power Plug, Power Analyzer, Data Mart, ETL, OLAP, ROLAP, MOLAP, OLTP, Autosys, Control M, Visual Source Safe, Maestro, Oracle 9.x/8.x/7.x, UNIX Shell Scripting.
- Proficient in data warehousing concepts, including the Ralph Kimball and Bill Inmon methodologies, with six-plus (6+) years of dimensional data modeling experience covering Star Schema/Snowflake modeling, fact and dimension tables, and physical and logical data modeling using ERwin 4.5/4.0/3.x and Oracle Designer.
- Extensive experience in ETL processes for extracting data from operational and legacy systems into data warehouses using Informatica; understands complex mappings, transformation rules, and data formats, and is efficient in performance tuning of sources, mappings, targets, and sessions.
- Experienced in estimation, planning, risk management, finalization of technical/functional specifications, communication management, and product quality management. Sound knowledge of tuning Informatica mappings, identifying bottlenecks, and resolving issues to improve the performance of data loads and extracts.
- Six-plus (6+) years of experience using Oracle 10g/9i/8.x/7.x, DB2 8.0/7.0/6.0, Teradata 2.x, MS SQL Server 2000/7.0/6.5, MS Access 7.0/97/2000, Oracle Report Writer, M-Load, TPUMP, BTEQ, Fast Load, SAS, SQR 3.0, Erwin 4.x/3.5/3.x, SQL, XML, XSL, PL/SQL, SQL*Plus, SQL*Loader and Developer 2000, Win 3.x/95/98/2000, Win NT 4.0 and Sun Solaris 2.x.
- Excellent Client Interaction, Communication and Presentation Skills.
Technical Skills
Data Warehouse: Informatica PowerCenter 8.6.1/8.1/7.1/7.0/6.2/6.1 (Repository Manager, Designer, Server Manager, Workflow Monitor, Workflow Manager)
BI & Reporting: Business Objects XI/6.5/6.0/5.1/5.0, Business Objects SDK, MicroStrategy
Data Modeling: Physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snow-Flake, FACT, Dimensions), Entities, Attributes, Cardinality, ER Diagrams, Erwin 4.5/4.0/3.5.2/2.x, Oracle Designer 2000
Databases: Oracle 10g/9i/8i/8.0/7.3, MS SQL Server 2000/7.0/6.5, IBM DB2, IBM UDB 8.0/7.0/6.0/5.0, DB2/400, Teradata V2R5/V2R4/V2R3 with M-Load, TPUMP & BTEQ, PL/SQL.
Job Scheduling & Other Tools: CA Autosys, BMC Control M, Maestro, Visual Source Safe, Quest TOAD 7.6, Quest Central for DB2
Environment: UNIX (Sun Solaris, HP-UX, AIX), Windows 2003/2000/XP/98, Sun Ultra, Sun SPARC, Sun Classic, RS/6000, HP 9000, SCO UNIX, Mainframes
Others: COBOL, Java, XML, JavaScript, XHTML, HTML, DHTML, C++, Visual C++, VBScript, CSS, SQL, Dreamweaver, LAN, WAN, TCP/IP, D-AMPS, GSM, W-CDMA, Networking
Functional Knowledge: Insurance, Health Care, Telecom and Finance
Education: Bachelor's in Computer Science and Engineering, JNTU, Hyderabad.
Professional Experience
Confidential, Princeton, NJ July 2010 – Present
DW/Informatica Consultant
NovoNordisk is one of the world’s leading pharmaceutical companies and also a provider of investment products. NovoNordisk empowers local business units to identify and provide products and services that meet the evolving needs of their customers, using the distribution channels best suited to their local markets.
The project is to develop a new Investment Data Warehouse (IDW) for NovoNordisk to replace the existing data warehouse. The approach to redesigning the warehouse and developing the new IDW is to sequentially build the business intelligence applications required to manage the business.
Responsibilities:
- Responsible for requirements gathering, business analysis, and user meetings; discussed the issues to be resolved and translated user inputs into ETL design documents.
- Extracted data from various heterogeneous sources like Oracle, SQL Server, MS Access and Flat files.
- Worked with power center tools like Designer, Workflow Manager, Task Developer, Workflow Monitor, and Repository Manager.
- Worked on Designer tools like Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer.
- Extensively used the Source Qualifier Transformation and most of its features, such as filter, sorted ports and SQL override.
- Extensively used various Active and Passive transformations like Filter Transformation, Router Transformation, Expression Transformation, Source Qualifier Transformation, Joiner Transformation, Lookup Transformation, Update Strategy Transformation, Sequence Generator Transformation, Rank Transformation and Aggregator Transformation
- Solid Expertise in using both connected and unconnected Lookup Transformations.
- Extensively worked with various Lookup caches like Static cache, Dynamic cache and Persistent cache.
- Extensively worked with Joiner functions like normal join, full outer join, master outer join and detail outer join in the Joiner transformation.
- Used Update Strategy constants DD_INSERT, DD_UPDATE, DD_DELETE and DD_REJECT to insert, update, delete and reject rows based on the requirement (see the expression sketch after this list).
- Worked with session logs and workflow logs for error handling and troubleshooting in the Dev environment.
- Used Debugger wizard to troubleshoot data and error conditions.
- Used Informatica row error logging feature to capture the logs in a relational table.
- Worked with Index cache and Data cache in transformations like Rank, Lookup, Joiner and Aggregator Transformations.
- Developed Reusable Transformations and Reusable Mapplets.
- Fine-tuned the mappings and sessions to make them more efficient in terms of performance.
- Worked extensively with Mapping Parameters, Mapping Variables and Parameter files for incremental loading (an example SQL override follows this list).
- Worked with system variables like $$$SessStartTime, SESSSTARTTIME and SYSDATE.
- Extensively used various Data Cleansing and Data Conversion functions like RTRIM, LTRIM, ISNULL, IS_DATE, TO_DATE, DECODE, SUBSTR, INSTR, and IIF in Expression Transformations (also illustrated in the expression sketch after this list).
- Responsible for Best Practices like naming conventions, Performance Tuning and Error Handling.
- Worked with Shortcuts across shared and non-shared folders.
- Migrated code using the Export and Import utilities across various instances.
- Optimized SQL queries for better performance.
- Expertise in using TOAD and SQL Developer for accessing the Oracle database.
- Used Oracle SQL*Loader to load data into RDBMS tables from Excel files.
- Created pre-SQL and post-SQL scripts which need to be run at the Informatica session level.
- Worked with various Oracle analytical functions like RANK and ROW_NUMBER (see the query sketch after this list).
- Created Oracle stored procedures and triggers to automate time-consuming tasks that were too complicated for standard SQL statements (a PL/SQL sketch follows this list).
- Responsible for Unit Testing of Mappings and Workflows.
- Created UNIX Shell scripts to start and stop sessions.
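The Update Strategy and data-cleansing bullets above describe logic that lives in port expressions. A minimal sketch in Informatica expression syntax, using hypothetical port names (lkp_CUSTOMER_KEY, src_UPDATE_DT, lkp_UPDATE_DT, in_CUST_NAME, in_ORDER_DT), not the project's actual ports:

    -- Update Strategy expression: insert when the lookup finds no match,
    -- update when the source row is newer, otherwise reject the row
    IIF(ISNULL(lkp_CUSTOMER_KEY), DD_INSERT,
        IIF(src_UPDATE_DT > lkp_UPDATE_DT, DD_UPDATE, DD_REJECT))

    -- Expression Transformation ports: trim and default a name, validate a date string
    IIF(ISNULL(LTRIM(RTRIM(in_CUST_NAME))), 'UNKNOWN', LTRIM(RTRIM(in_CUST_NAME)))
    IIF(IS_DATE(in_ORDER_DT, 'MM/DD/YYYY'), TO_DATE(in_ORDER_DT, 'MM/DD/YYYY'), NULL)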
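A hedged sketch of the incremental-loading pattern referenced above: a Source Qualifier SQL override driven by a mapping variable. The variable name $$LAST_EXTRACT_DATE and the table and column names are illustrative, not taken from the project:

    -- Source Qualifier SQL override; $$LAST_EXTRACT_DATE comes from the parameter file
    -- and is typically advanced with SETMAXVARIABLE after each successful run
    SELECT ord.ORDER_ID,
           ord.CUSTOMER_ID,
           ord.ORDER_AMT,
           ord.LAST_UPDATE_DT
    FROM   ORDERS ord
    WHERE  ord.LAST_UPDATE_DT > TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS')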
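An illustrative Oracle query for the analytical functions mentioned above; the table and column names are assumptions for the example:

    -- rank orders by amount within each region and number each customer's orders by recency
    SELECT customer_id,
           region_cd,
           order_amt,
           RANK()       OVER (PARTITION BY region_cd   ORDER BY order_amt DESC) AS amt_rank,
           ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_dt  DESC) AS latest_seq
    FROM   orders;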
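A minimal PL/SQL sketch of the kind of automation described above; the procedure and table names (refresh_order_summary, order_summary, orders) are assumptions, not the project's actual objects:

    -- rebuild a summary table that was too involved for a single SQL statement
    CREATE OR REPLACE PROCEDURE refresh_order_summary AS
    BEGIN
      DELETE FROM order_summary;
      INSERT INTO order_summary (customer_id, total_amt, order_cnt)
      SELECT customer_id, SUM(order_amt), COUNT(*)
      FROM   orders
      GROUP  BY customer_id;
      COMMIT;
    END refresh_order_summary;
    /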
Environment:
Informatica Power Center 8.6.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer, Worklet Designer), Power Connect, PeopleSoft, Oracle 9i/10g, SQL Server, MS Access, Flat files, SQL/PL-SQL, T-SQL, UNIX shell scripting, Windows XP, TOAD, SQL Developer.
Confidential, IL February 2009 – July 2010
DW/Informatica Consultant
The Agribusiness Loss Analysis Enhancements project dealt with creating a rich business intelligence platform that supports all reporting and analytical needs for analyzing the losses incurred by clients at different locations, helping clients overcome those losses by providing accurate reports on the losses incurred at each location. The project created an enterprise data warehouse in which data is extracted from multiple databases and flat files and loaded across multiple layers: Staging, Foundation Layer (Operational Data Store), and Information Layer (Data Marts).
Responsibilities:
- Worked with various active transformations in Informatica PowerCenter like Filter Transformation, Aggregator Transformation, Joiner Transformation, Rank Transformation, Router Transformation, Sorter Transformation, Source Qualifier, and Update Strategy Transformation
- Extensively worked with various Passive transformations in Informatica PowerCenter like Expression Transformation, Sequence Generator, and Lookup Transformation
- Extensively worked with Slowly Changing Dimensions Type 1, Type 2, and Type 3 for data loads (a Type 2 SQL sketch follows this list)
- Responsible for Performance Tuning at the Mapping Level, Session Level, Source Level, and the Target Level
- Responsible for the Data Cleansing of Source Data using LTRIM and RTRIM operations of the Expression Transformation
- Responsible for Performance in Informatica PowerCenter at the Target Level, Source level, Mapping Level, Session Level, and System Level
- Extensively worked with both Connected and Un-Connected Lookups
- Extensively worked with Look up Caches like Persistent Cache, Static Cache, and Dynamic Cache to improve the performance of the lookup transformations
- Worked with reusable objects like Reusable Transformations and Mapplets
- Extensively worked with aggregate functions like Avg, Min, Max, First, Last, Std Deviation in the Aggregator Transformation
- Extensively made use of sorted input option for the performance tuning of aggregator transformation
- Extensively used SQL Override function in Source Qualifier Transformation
- Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner Transformation
- Responsible for migrating the workflows from development to production environment
- Extensively worked with Incremental Loading using Parameter Files, Mapping Variables, and Mapping Parameters
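A hedged SQL sketch of the Type 2 slowly-changing-dimension behavior noted in this list (expire the current row, insert the new version). Table and column names are illustrative; in the mappings themselves this logic was driven by Lookup and Update Strategy transformations rather than hand-written SQL:

    -- close out the current dimension row when a tracked attribute changes
    UPDATE customer_dim
    SET    current_flag = 'N',
           expiry_dt    = SYSDATE
    WHERE  customer_id  = :in_customer_id
    AND    current_flag = 'Y';

    -- insert the new version of the row as the current record
    INSERT INTO customer_dim
           (customer_id, customer_name, address, effective_dt, expiry_dt, current_flag)
    VALUES (:in_customer_id, :in_customer_name, :in_address, SYSDATE, NULL, 'Y');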
Environment:
Informatica PowerCenter 8.5/8.0 (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer), MicroStrategy, Teradata, DB2 UDB 8.1, Oracle 9i, Flat Files, UNIX, Windows XP.
Confidential, KS November 2007 – December 2008
DW / Informatica Developer
Sprint PCS is a leading telecommunications company and one of the nation’s leading wireless service providers, with a major role in the local telephone industry. The primary objective of the Order Status project is to provide data to track status-related information during the order life cycle, send notifications to customers to provide real-time order updates, and generate informational and analytic reports for Sprint’s internal and external customers. The project also involved product development and research covering regulatory affairs, biostatistics and data management, surveillance, and quality assurance, and dealt with the sales and marketing data collected from different tests along with IMS data, which consisted of all the sales, marketing, and subscriber information for Sprint products in the market. The data was then consolidated into the data mart that was used for report generation.
Responsibilities
- Interacted with the business community and database administrators to identify the business requirements and data realities.
- Responsible for dimensional modeling of the data warehouse to design the business process.
- Translated high-level design specifications into simple ETL coding and mapping standards.
- Designed new database tables to meet business information needs. Designed Mapping document, which is a guideline to ETL Coding.
- Documented standards handbook for Informatica code development.
- Developed a number of Informatica Mappings, Mapplets and Transformations to load data from relational and flat file sources into the data warehouse.
- Used SQL queries and database programming using PL/SQL (writing Packages, Stored Procedures/Functions, and Database Triggers).
- Used various transformations like Source Qualifier, Lookup, Update Strategy, Router, Filter, Sequence Generator, and Joiner on the extracted source data according to the business rules and technical specifications.
- Loaded the data from IMS files into Oracle tables using Informatica, performing any data cleansing required.
- Worked on SQL tools like TOAD to run SQL queries and validate the data.
- Worked on database connections, SQL Joins, views in Database level.
- Extensively used SQL*Loader to load data from flat files into Oracle database tables (a control-file example follows this list).
- Used Power Center server manager/Workflow manager for session management, database connection management and scheduling of jobs to be run in the batch process.
- Used UNIX shell scripting for Scheduling Informatica Workflows.
- Actively interacted with business analysts and assisted in creating Classes and Objects using Business Objects.
- Optimized the mappings using various optimization techniques and also debugged some existing mappings using the Debugger to test and fix the mappings.
- Implemented Slowly Changing Dimensions to update the dimensional schema.
- Implemented procedures/functions in PL/SQL for Stored Procedure Transformations.
- Monitored workflows and collected performance data to maximize the session performance.
- Resolved memory related issues like DTM buffer size, cache size to optimize session runs.
- Documented the mapping process and methodology used to facilitate future development.
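An example SQL*Loader control file of the kind referenced above; the file, table, and column names are assumptions for illustration:

    -- customers.ctl: load a comma-delimited flat file into a staging table
    LOAD DATA
    INFILE 'customers.dat'
    APPEND
    INTO TABLE stg_customer
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (customer_id,
     customer_name,
     created_dt DATE "YYYY-MM-DD")

It would typically be invoked from the command line with sqlldr control=customers.ctl log=customers.log.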
Environment:
Informatica Power Center 7.1, Informatica Power Connect (Repository Manager, Designer, Server Manager, Workflow Monitor, Workflow Manager), Erwin 4.5, Flat files, Oracle 9i, MS SQL Server 2000, PL/SQL, Business Objects 6.5, Shell Programming, SQL*Loader, IBM DB2 8.0, TOAD, Excel, UNIX scripting, Sun Solaris, Windows NT
Confidential, GA July 2006 – September 2007
DW/ Informatica Developer
Emory is one of the nation’s leading risk and insurance services firms. The project was to provide profitability reports to the ESM (European Senior Manager) and CSM (Country Senior Manager). It dealt with the revenue generated per month and the costs incurred at different offices in different countries of Europe, in order to analyze profitability by segment, office, company, client, and country.
Responsibilities
- Responsible for development, support and maintenance of the ETL (Extract, Transform and Load) processes using Informatica Power Center 6.2.
- Participated in all phases including Requirement Analysis, Design, Coding, Testing and Documentation.
- Created Workflows, Tasks, database connections, FTP connections using Workflow Manager.
- Designed new database tables to meet business information needs; designed the mapping document, which is a guideline for ETL coding.
- Extensively used Informatica to load data from Flat Files to Oracle & Oracle to Oracle. Created mappings using the transformations like Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, and Stored Procedure transformations.
- Worked on Informatica Power Center 6.2 tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformations, and Workflow Manager with Task Developer, Worklets, and Workflow Designer) and monitored the sessions using Workflow Monitor.
- Designed the process monitor for automation of the system, which keeps the load history, load summary, and load status of the system.
- Added enhancements such as mail functionality, success flags, and pre- and post-session functions.
- Extensively migrated data from different sources such as flat files to the ODS, data marts and the data warehouse.
- Performed tuning of Informatica Mappings for optimum performance.
- Standardized parameter files to define session parameters such as database connections for sources and targets, last-updated dates for incremental loads, and default values for fact tables (a sample parameter file follows this list).
- Generated reports using Business Objects Report Designer.
- Wrote PL/SQL procedures for processing business logic in the database and tuned SQL queries for better performance.
- Designed and developed UNIX scripts to create and drop tables which are used for scheduling the jobs.
- Performed database filtering using UNIX shell scripts (awk, sed); shell scripts were run through UNIX cron to schedule batch sessions.
- Analyzed the data loads and supported production.
- Handled daily load issues and standardized the system to a business-as-usual state.
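A sample PowerCenter parameter file of the kind standardized above; the folder, workflow, session, connection, and parameter names are illustrative only:

    [ProjectFolder.WF:wf_load_fact_revenue.ST:s_m_load_fact_revenue]
    $DBConnection_Source=ORA_SRC_CONN
    $DBConnection_Target=ORA_DW_CONN
    $$LAST_LOAD_DATE=01/01/2006 00:00:00
    $$DEFAULT_REGION_KEY=-1

Each section names a folder, workflow, and session; $DBConnection parameters override relational connections, and $$ parameters feed mapping parameters and variables.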
Environment:
Informatica Power Center 6.2, HP-UX, Oracle 9i, Teradata V2R5/V2R4, DB2 7.0, Erwin 4.0, Unix Shell Scripting, Cognos Impromptu, Autosys, MS PowerPoint, Visual Basic 6.0, SQL Navigator 4.0, TOAD, PL/SQL Developer, MS SQL Server 2000, Unix, Win NT 4.0