Sr. MDM/ETL Developer Resume
Melville, NY
SUMMARY
- 7+ years of total IT experience in the analysis, design, development, implementation, testing, and maintenance of Business Intelligence solutions using data warehouse/data mart design, ETL, OLAP, OLTP, and client/server applications.
- Strong data warehousing and ETL experience using Informatica PowerCenter 9.5.1/9.1.x/9.0.x/8.6.x/8.1.x/7.1.x, Informatica MDM 9.6.1, and IDQ Developer/Analyst 9.5.x/8.6.x, including the Warehouse Designer, Source Analyzer, Mapping Designer, Transformation Developer, and Mapplet Designer.
- Excellent knowledge of RDBMS, OLAP, OLTP, data marts, and ODS systems.
- Mapping experience using transformations such as Filter, Joiner, Router, Source Qualifier, Expression, Sequence Generator, Unconnected/Connected Lookup, Update Strategy, Aggregator, Sorter, Union, XML Generator, XML Parser, Transaction Control, Match, Key Generator, Association, Consolidation, Labeler, Parser, Address Validator, Classifier, Decision, Standardizer, Case Converter, and Merge.
- Efficiently handled granularity, indexing, and partitioning in data warehouses.
- Mapping experience with SCD types, master and transactional subject areas, and other dimensional models in ETL data warehouses.
- Maintained warehouse metadata, naming standards, and warehouse standards for future application development.
- Performed profiling and standardization, identified anomalies, created and applied rules, and built mappings/mapplets.
- Good experience building Business Intelligence reports using Business Objects, Cognos, OBIEE, and Jaspersoft.
- Performed end-to-end data lineage with IDQ while maintaining an excellent relationship with the end client.
- Worked closely with users to understand the current state of information availability in the enterprise, then identified future needs based on analysis of business requirements, current-state environments, gap analysis, and future-state warehousing implementation.
- Expertise in Master Data Management concepts and methodologies, and the ability to apply that knowledge in building MDM solutions.
- Expertise in the design and configuration of landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, and packages.
- Expertise in creating mappings, trust and validation rules, match paths, match columns, match rules, merge properties, and batch groups.
- Performed gap analysis between the current state and the future data warehouse environment, identifying data gaps and quality issues and recommending potential solutions.
- Worked closely with the data warehouse development team to ensure user requirements and issues were addressed while controlling scope.
- Data modeling experience using Star Schema/Snowflake modeling and FACT and dimension tables, covering relational and dimensional models for both master and transactional data.
- Good knowledge of databases and tools including Teradata, Oracle 11g/10g/9.x/8.x/7.x, MS SQL Server 2000/2005, SQL, PL/SQL, SQL*Plus, SQL*Loader, and Toad 7.3/8.x/9.1/10.3.3.0/11.5.
- 2+ years of Informatica PowerCenter administration on 8.x/9.x, including server setup, configuration, client installations, deployments, backups, repository management, server health and maintenance, performance monitoring and improvement, patching, connectivity to other databases, and ODBC setup.
- Experience in performance tuning in Informatica PowerCenter.
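The SCD handling mentioned above is normally configured inside PowerCenter mappings (Lookup plus Update Strategy). As a rough, file-based sketch of the Type 2 idea only — the column layout, filenames, and sample rows are purely illustrative — a change-detection pass might look like:

```shell
#!/bin/sh
# Hypothetical sketch of the SCD Type 2 idea: compare an incoming extract
# against the current dimension file and flag keys whose attributes changed,
# so the old row can be end-dated and a new version inserted.
# Layout assumed for illustration: key|name|state, pipe-delimited.
set -eu

printf '101|Acme|NY\n102|Beta|CA\n' > current_dim.psv   # current dimension
printf '101|Acme|NJ\n103|Gamma|TX\n' > incoming.psv     # today's extract

today=$(date +%Y-%m-%d)

# First pass loads the current dimension; second pass classifies each
# incoming row as CHANGED (key exists, attributes differ) or NEW.
awk -F'|' -v d="$today" '
  NR==FNR { cur[$1] = $0; next }
  {
    if ($1 in cur) {
      if (cur[$1] != $0) print $1 "|CHANGED|" d
    } else {
      print $1 "|NEW|" d
    }
  }
' current_dim.psv incoming.psv > scd2_actions.txt

cat scd2_actions.txt
```

Unchanged keys (102 here) produce no action row; in the real mapping the CHANGED/NEW flags drive the Update Strategy's end-date and insert branches.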
TECHNICAL SKILLS
Data Warehousing: Informatica PowerCenter 9.5.1/9.1.x/9.0.x/8.6.x/8.1.x/7.1.x (Repository Manager, Source Analyzer, Designer, Server Manager, Workflow Monitor, Warehouse Designer, Mapplet Designer, Mapping Designer, Workflow Manager), IDQ Developer/Analyst 9.5.1/8.6.1, Informatica MDM 9.6.1, OLAP, OLTP, IDQ/IDE, SQL*Plus, Informatica Data Quality, Informatica Data Profiler.
BI & Reporting: Business Objects 6.5/6.0/5.1/5.0, SQL Server Reporting Services 2005, Siebel Analytics 7.8, OBIEE 10.x/11.x, Cognos 8.1/8.2/8.3/8.4.
Data Modeling: Physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snowflake, FACT, Dimensions), Entities, Attributes, ER Diagrams, Erwin, ER/Studio 7.1/6.1.
Other Tools: HP Service Manager, IBM Rational ClearQuest 7.1.1.3.
Databases: Oracle 11g/10g/9i/8i, Teradata 13.0, MS SQL Server 2008/2005/2000, Oracle SQL Developer 3.0.04, SQL*Loader, IBM Data Studio 2.2.1.0, EDW DB2.
Programming: SQL, UNIX shell scripting.
Job Scheduling: IBM Tivoli Manager 5.0, Autosys, Control-M, Mainframes.
Environment: UNIX, Red Hat Linux 5.x/4.x/3.x, Linux 2.6.32, Solaris, Windows 2008/2003.
Other Tools: TOAD, RapidSQL, SQL*Plus, Cygwin (X Windows server), WinSCP, Core FTP LE 2.2, PuTTY, AIX servers, SONAS, EDGE servers.
PROFESSIONAL EXPERIENCE
Confidential, Melville, NY
Sr. MDM/ETL Developer
Responsibilities:
- Design, Development and Documentation of the ETL (Extract, Transformation & Load) strategy to populate the Data Warehouse from the various source systems.
- Worked with various sources such as flat files, relational tables, XML, and web services as part of Informatica mappings.
- Designed ETL processes using Informatica to load data from Oracle, Flat Files (Excel/Access) and XML Files to target Oracle Data Warehouse database.
- Extensively worked with XML files as the Source and Target, used transformations like XML Generator and XML Parser to transform XML files, used Oracle NCLOB data type to store XML files.
- Developed XML/XSD for source XML files in Informatica as well as input XML for web service calls.
- Loaded XML messages received from various source systems into the database using JMS queues and Informatica.
- Performed data integration from Informatica PowerCenter into the Salesforce cloud and connected to SAP to pull data from SAP R/3.
- Created ETL mappings and mapplets to extract data from ERPs such as SAP and Oracle and load it into the EDW (Oracle 10g), Salesforce, flat files, etc.
- Involved in implementing the land process of loading data sets into Informatica MDM from various source systems.
- Defined the Base objects, Staging tables, foreign key relationships, static lookups, dynamic lookups, queries, packages and query groups.
- Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
- Developed cleanse functions in IDQ and integrated them into the MDM cleanse library.
- Configured match rule set property by enabling search by rules in MDM according to Business Rules.
- Performed match/merge and ran match rules to check the effectiveness of MDM process on data.
- Developed the automated and scheduled load processes using Autosys scheduler. Involved in migration of mappings and sessions from development repository to production repository.
- Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data conversion, exception handling, cleansing mapplets, exposing PDOs and mapplets as web services, and the monitoring capabilities of IDQ.
- Utilized Informatica IDQ 9.5.1 to complete initial data profiling and to match and remove duplicate data.
- Developed cleanse mapplets in IDQ and exposed them as a service to other systems such as Salesforce.
- Developed shell scripts for daily and weekly loads and scheduled them using Autosys.
- Responsible for Unit testing and Integration testing of mappings and workflows.
- Analyzed, documented and maintained Test Results and Test Logs.
- Actively participated in Agile development, including daily scrum (stand-up) meetings.
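A minimal sketch of the kind of Autosys-scheduled load wrapper described above. The integration service, domain, folder, and workflow names, the environment-variable names, and the log path are all placeholders, and `pmcmd` is only invoked if it is actually on the PATH:

```shell
#!/bin/sh
# Hypothetical daily-load wrapper (the sort of script an Autosys job would
# call). All names below are placeholders for illustration.
set -eu

LOG_DIR=${LOG_DIR:-./logs}
mkdir -p "$LOG_DIR"
RUN_LOG="$LOG_DIR/daily_load_$(date +%Y%m%d).log"

# -uv/-pv tell pmcmd to read the user and password from environment
# variables, avoiding plaintext credentials in the script.
WF_CMD="pmcmd startworkflow -sv INT_SVC -d DOM_DEV -uv PMUSER -pv PMPASS -f DWH_LOADS -wait wf_daily_load"

if command -v pmcmd >/dev/null 2>&1; then
    $WF_CMD >> "$RUN_LOG" 2>&1 || echo "pmcmd failed with status $?" >> "$RUN_LOG"
else
    echo "pmcmd not on PATH; would run: $WF_CMD" >> "$RUN_LOG"
fi

tail -n 1 "$RUN_LOG"
```

The date-stamped log file gives each scheduled run its own audit trail, which Autosys operators can inspect on failure.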
Environment: Informatica PowerCenter 9.5.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), IDQ, MDM 9.6.1, Oracle 11g, SQL Developer, Data Loader, JMS Queue, TIBCO, Salesforce, SAP, TFS, UNIX shell scripting, PuTTY.
Confidential, Denver, CO
Sr. Informatica Developer
Responsibilities:
- Designing the source to target mappings that contain the Business rules and data cleansing during the extract, transform and load process.
- Responsible for converting Functional Requirements into Technical Specifications and production support.
- Worked on Designer tools including Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
- Worked with Informatica Data Quality (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ.
- Utilized IDQ for data profiling, standardization and structuring the data.
- Developed Oracle views to identify incremental changes for full extract data sources.
- Developed the automated and scheduled load processes using Tidal scheduler. Involved in migration of mappings and sessions from development repository to production repository.
- Responsible for Unit testing and Integration testing of mappings and workflows.
- Performed data integration from Informatica cloud into SAP & Salesforce cloud.
- Loaded batch-related data into the EDW based on business rules; this data was extracted from the SAP source system.
- Developed PL/SQL stored procedures, packages, and triggers to implement business logic.
- Analyzed, documented and maintained Test Results and Test Logs.
- Used partitioning for loading large volumes of data.
- Scheduled Informatica workflows using Informatica Scheduler to run at regular intervals.
- Developed Reports / Dashboards with different Analytics Views (Drill-Down, Pivot Table, Chart, Column Selector, and Tabular with global and local Filters) using OBIEE.
- Wrote shell scripts to convert incoming Excel flat files from .xls to .csv for import into Informatica.
- Wrote shell scripts to FTP files to the mainframe region.
- Actively participated in Agile development, including daily scrum (stand-up) meetings.
- Accepted inbound transactions from multiple sources using FACETS.
- Supported integrated EDI batch processing and real-time EDI using FACETS.
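The incremental-change detection for full-extract sources was implemented as Oracle views; the same delta idea can be sketched file-side with sorted snapshots. The filenames and row layout below are illustrative only:

```shell
#!/bin/sh
# File-based sketch of incremental-change detection for a full-extract
# source: diff today's full extract against yesterday's snapshot and keep
# only rows that are new or changed (the Oracle-view equivalent would be a
# MINUS between the two tables). Filenames/layout are placeholders.
set -eu

printf 'A|1\nB|2\nC|3\n' | sort > extract_prev.psv   # yesterday's snapshot
printf 'A|1\nB|9\nD|4\n' | sort > extract_curr.psv   # today's full extract

# comm -13 suppresses lines unique to file 1 and lines common to both,
# leaving only lines unique to today's extract: inserts plus updates.
comm -13 extract_prev.psv extract_curr.psv > delta.psv

cat delta.psv
```

Only the delta file then needs to flow into the downstream load, which is the point of the incremental approach for large full extracts.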
Environment: Informatica PowerCenter 9.5/9.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Informatica IDQ, SQL Query Analyzer 8.0, Oracle 10g/11g, SQL Developer, Data Loader, Salesforce, SAP, OBIEE, BI Publisher, Erwin, UNIX shell scripting, PuTTY, FACETS, and Business Objects.
Confidential, Columbus, Ohio
Informatica Developer/IDQ Consultant
Responsibilities:
- Design, Development and Documentation of the ETL (Extract, Transformation & Load) strategy to populate the Data Warehouse from the various source systems.
- Worked extensively with parameters and variables.
- Identified and tracked slowly changing dimensions and created complex mappings using SCD concepts.
- Acted as a liaison between various teams to connect with business users.
- Implemented indirect file loading for source files with the same structure.
- Worked with the Informatica Data Quality 8.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
- Used IDQ for data profiling, standardization, and structuring of data.
- Proficient in using SQL and PL/SQL to extract, transform, and load data into the data warehouse.
- Supported data quality monitoring and profiling for enterprise-wide data elements.
- Classified and enriched metadata requirements to ensure performance and functionality of the metadata management system for future releases.
- Utilized Informatica IDQ 8.6.1 to complete initial data profiling and to match and remove duplicate data.
- Used Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ 8.6.1): IDE for data profiling over metadata and IDQ for data quality measurement.
- Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to/from different source systems, including flat files.
- Expertise in Teradata SQL and Teradata utilities: MultiLoad, FastExport, BTEQ (Batch Teradata Query), FastLoad, Teradata Parallel Transporter, and TPump.
- Proficient in performance analysis, monitoring, and SQL query tuning using EXPLAIN PLAN, COLLECT STATISTICS, hints, and SQL trace in both Teradata and Oracle.
- Prepared HLD, LLD, BRD, and FRD documents and gathered complete requirements.
- Translated Business processes into Informatica Mappings for building Data marts.
- Developed reports using complex formulas and queried the database to generate various ad-hoc reports using SSRS.
- Wrote UNIX shell scripts for Informatica pre-session and post-session tasks and to run workflows.
- Wrote shell scripts to FTP files to the mainframe region.
- Used heterogeneous files from different sources and imported stored procedures from Oracle for transformations.
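A hypothetical pre-session script of the kind referenced above: it verifies that the source flat file exists, is non-empty, and carries the expected header before the session is allowed to load it. The path, header, and demo row are placeholders (the demo file is created inline so the sketch is self-contained):

```shell
#!/bin/sh
# Hypothetical Informatica pre-session check. A non-zero exit here would
# fail the session before any bad data is loaded. All names are placeholders.
set -eu

SRC_FILE=${1:-./incoming/customers.csv}
EXPECTED_HEADER='CUST_ID,CUST_NAME,STATE'

mkdir -p ./incoming
printf '%s\n101,Acme,NY\n' "$EXPECTED_HEADER" > "$SRC_FILE"   # demo file only

# File must exist and be non-empty.
[ -s "$SRC_FILE" ] || { echo "FAIL: $SRC_FILE missing or empty"; exit 1; }

# First line must match the agreed header exactly.
head -n 1 "$SRC_FILE" | grep -qx "$EXPECTED_HEADER" \
    || { echo "FAIL: unexpected header in $SRC_FILE"; exit 1; }

rows=$(( $(wc -l < "$SRC_FILE") - 1 ))
echo "OK: $SRC_FILE validated, $rows data row(s)"
```

In the session properties such a script would be wired in as the pre-session command, with the corresponding post-session script archiving the processed file.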
Environment: Informatica PowerCenter 9.1/8.6 (Repository Manager, PowerCenter Designer, Workflow Manager, Workflow Monitor), Informatica IDQ/IDE, Informatica Metadata Manager, Teradata, Oracle 10g, Reporting Services (SSRS), SQL Server 2008, MS Access, UNIX shell scripting, Windows 7, SQL, PL/SQL, MS Visio, Microsoft tools, PuTTY.
Confidential, Parsippany, NJ
Informatica Developer
Responsibilities:
- Designing the source to target mappings that contain the Business rules and data cleansing during the extract, transform and load process.
- Ensuring Conformance to Informatica coding standards.
- Ensuring Conformance to Cognizant’s Quality Process.
- Developed logical and physical data models that capture current-state and future-state data elements and data flows using Erwin.
- Converted the data mart from Logical design to Physical design, defined data types, Constraints, Indexes, generated Schema in the Database, created Automated scripts, defined storage parameters for the objects in the Database.
- Designed and developed Informatica Mappings to load data from Source systems to ODS and then to Data Mart.
- Extensively used Power Center/Mart to design multiple mappings with embedded business logic.
- Used different transformations like Filter, Joiner, Router, Source Qualifier, Expression, Sequence Generator, Unconnected / Connected Lookup, Update Strategy, Aggregator, Sorter, Union in the Informatica Designer.
- Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup, and Router transformations to populate target tables efficiently.
- Created mapplets and used them in different mappings.
- Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
- Performance tuning of the Informatica mappings using various components like Parameter files, Variables and Dynamic Cache.
- Expertise in Teradata SQL and Teradata utilities: MultiLoad, FastExport, BTEQ (Batch Teradata Query), FastLoad, Teradata Parallel Transporter, and TPump.
- Created scripts using FastLoad and MultiLoad to load data into the Teradata data warehouse.
- Prepared BTEQ scripts to load data from Preserve area to Staging area.
- Created various UNIX Shell Scripts for scheduling various data cleansing scripts and loading process. Maintained the batch processes using Unix Shell Scripts.
- Prepared HLD, LLD, BRD, and FRD documents and gathered complete requirements.
- Tuning Informatica Mappings and Sessions for optimum performance.
- Analyzed, documented and maintained Test Results and Test Logs.
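A sketch of the preserve-to-staging BTEQ scripts described above, generated from shell. The TDPID, credentials, database, and table names are placeholders, and the script is written out rather than submitted here:

```shell
#!/bin/sh
# Hypothetical generator for a preserve-to-staging BTEQ script. All Teradata
# object names and the logon string are placeholders. The quoted heredoc
# prevents any shell expansion inside the BTEQ text.
set -eu

cat > load_stg.bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pass;

INSERT INTO STG_DB.CUSTOMER_STG
SELECT * FROM PRSV_DB.CUSTOMER_PRSV
WHERE LOAD_DT = CURRENT_DATE;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

echo "generated load_stg.bteq"
# In the real batch this file would be submitted with: bteq < load_stg.bteq
```

The `.IF ERRORCODE` guard makes the batch exit non-zero on a failed insert, so the scheduler can detect and alert on the failure.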
Environment: Informatica PowerCenter 9.1/8.6, Teradata 13.0, Oracle 10g/9i, PL/SQL, Erwin Data Modeler r8, Visio 12.0, MicroStrategy 8, SQL, shell scripting, Windows XP.
Confidential, CO
Informatica Developer
Responsibilities:
- Analyzing the specifications and identifying the source data that needs to be moved to the data warehouse.
- Created different source definitions to extract data from flat files, relational tables and external sources.
- Worked extensively with master and transactional data in the OLAP systems.
- Hands-on experience with surrogate keys and natural keys.
- Strong understanding of the managed metadata environment (MME) and its architectural components.
- Ability to work at multiple levels of a metadata management project, including defining the metadata management architecture, managing the metadata project, mentoring metadata developers, etc.
- Created and maintained Data Models working with Data Architect.
- Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
- Developed stored procedures to support front-end applications.
- Involved in creating Naming Standards, Best Practices for ETL development.
- Used all types of transformations like Expression, Router, Lookup, Filter, Aggregate, Sorter etc.
- Expertise in Teradata SQL and Teradata utilities: MultiLoad, FastExport, BTEQ (Batch Teradata Query), FastLoad, Teradata Parallel Transporter, TPump, Viewpoint, and Data Mover.
- Designed dimension and fact tables for Star Schema and Snowflake Schema to develop the Data warehouse.
- Participated in the integration of the Siebel database.
- Involved in the integration of Siebel UI screens.
- Actively interacted with business analysts.
- Worked extensively with Salesforce.com (SFDC).
- Worked efficiently with connected and unconnected Lookups.
- Created tables and constraints for improved database performance.
- Created tables and views for reporting purposes.
- Created and used different tasks like Decision, Event Wait, Event Raise, Timer and E-mail etc.
Environment: Informatica PowerCenter 8.1.1, MS SQL Server, Oracle 9i/8i, SQL*Loader, Erwin, SQL, PL/SQL, Salesforce.com (SFDC), Siebel CRM, XML, Windows 2000/NT.
Confidential
ETL Developer
Responsibilities:
- Development of Informatica Mappings, Sessions, and Workflows of varying complexity.
- Migrated ETL code.
- Used various transformations such as Lookup, Filter, Joiner, Aggregator, Expression, Router, Sequence Generator, and Normalizer in the mappings.
- Worked with VSAM and COBOL sources.
- Gathered business analysis and requirements by working with business end users.
- Analyzed existing Informatica mappings to understand and resolve issues.
- Created test cases for unit testing, system integration testing, and UAT to check data quality.
- Converted requirements into business rules for ETL transformations.
- Prepared use cases and activity diagrams, identifying users and actors.
- Prepared ER diagram models.
- Understood the complete business requirements and the purpose of each report.
Environment: Informatica PowerCenter 7.4.x/7.1, Erwin, MS SQL Server, Oracle 8i, IMS, Windows NT, flat files, SQL, PL/SQL, SQL*Loader, XML, Cognos 5, Autosys, UNIX shell scripting.