Senior Informatica / Data Quality (IDQ) Developer Resume
Mt Laurel, NJ
SUMMARY:
- Over 9 years of experience in the analysis, design, development, testing, implementation, enhancement, and support of BI applications, including strong experience in Data Warehousing (ETL & OLAP) environments as a Data Warehouse Consultant.
- Experience in using ETL methodologies for supporting data extraction, data migration, data transformation, and loading using Informatica PowerCenter 9.6.1/9.1/8.6.1/7.x/6.2, IDQ, and Trillium.
- Experience in Informatica Data Quality (IDQ, Informatica Developer 9.6.1/9.1) for cleansing and formatting customer master data.
- Experience in Informatica Analyst for analyzing customer master data and performing profiling.
- Created business rules and populated their pass/fail percentages and failed records (rule breaks) in the DQP dashboard using QlikView.
- Created logical data object models and customized data objects in Informatica Developer.
- Created reference tables for standardization.
- Experience in Trillium to scrub, standardize, and match customer addresses.
- Used BCI to extract data from SAP R/3.
- Good understanding of Ralph Kimball dimensional modeling using the star schema methodology and the Bill Inmon snowflake schema methodology.
- Strong understanding of OLAP and OLTP Concepts
- Excellent in designing ETL procedures and strategies to extract data from heterogeneous source systems such as Oracle 11g/10g, SQL Server 2008/2005, DB2 10, flat files, XML, SAP R/3, etc.
- Experience in SQL, PL/SQL, T-SQL and UNIX shell scripting.
- Performed data modeling using Teradata.
- Documented the processes for rule analysis, design, mapping, the UNIX SCP process, the DQP dashboard, IBM Rational Synergy, and IBM Rational Change.
- Migrated code using IBM CM Synergy process.
- Created change requests in IBM Rational Change and attached the assigned tasks and solution records created in IBM Rational Synergy for code migration.
- Experience with Erwin as a data-modeling tool.
- Worked with Full Software Development Life Cycle (SDLC) involving Application Development and ETL/OLAP Processes.
- Used third-generation programming languages such as C++, COBOL, and Java.
- Used Informatica Metadata Manager and Metadata Exchange exhaustively to maintain and document metadata.
- Dealt with Master data, Transaction data, Control data etc.
- Extensively involved in creating Oracle PL/SQL Stored Procedures, Functions, Packages, Triggers, Cursors, and Indexes with Query optimizations as part of ETL Development process.
- Adept in UNIX operating system commands & Shell Scripting.
- Experienced in Quality Assurance, Manual and Automated Testing Procedures with active involvement in Database/ Session/ Mapping Level Performance Tuning and Debugging.
- Worked with Business Managers, Analysts, Development, and end users to correlate Business Logic and Specifications for ETL Development.
- Strong decision-making and interpersonal skills with a results-oriented dedication to goals.
- Worked in an onsite-offshore model.
- Worked extensively in Data analysis and quality assurance.
- Participated in requirements gathering with business and functional teams and worked independently on given objects.
- Performed performance tuning so that loads run efficiently.
- Knowledge of the B2B Data Exchange and Data Transformation tools.
- Experienced in the chemical, finance, materials, sales and distribution, manufacturing, entertainment, and healthcare domains.
- Worked on Big Data Edition.
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 9.6.1/9.1/8.x/7.x/6.x, DataStage 7.5
Cleansing Tools: Informatica Data Quality (IDQ 9.6.1/9.5), Trillium
Reporting Tools: Crystal Reports XI/10/9, Business Objects XI, Cognos 8
Operating Systems: UNIX, Windows 98/NT/2000/XP, Solaris
Database: Oracle 10g/9i, SQL Server, DB2, Teradata 14.10
Database GUI Tools: TOAD
Web Development: HTML, XML
Programming Languages: Oracle SQL, PL/SQL, UNIX Shell Script, C, C++.
Supporting Tools: Microsoft Visio
Software: DB Artisan, Rapid SQL, IBM Rational Synergy, TWS
PROFESSIONAL EXPERIENCE:
Confidential, Mt Laurel, NJ
Senior Informatica / Data Quality (IDQ) Developer
Responsibilities:
- Worked with business and product development teams to gather the functional requirements and spec out the Technical Requirement Documents for the data conversion.
- Performed profiling using Informatica Analyst and Informatica Developer.
- Designed data modeling with team as per the requirement.
- Captured DQ metrics (such as total record count, distinct record count, count of a distinct field, count of nulls, referential checks, etc.) on the inbound/raw data (i.e., Prep or Stage), and then again on the processed/output data (i.e., Base or Pub).
- Created scorecards to review data quality.
- For DQ enhancement, added local time zone support to the DQ Framework.
- For the Metric Promotion Automation Process, masked/encrypted passwords and enabled running multiple batches.
- For the Metric Promotion Automation Process, created a parameterized shell script that avoids manually editing the shell scripts for environment-specific changes as code is migrated.
- For the Metric Promotion Automation Process, developed a batch file that prevents a given batch from running again while one instance is already in progress, while allowing different batches to run at the same time and in parallel.
- Created a logical data object model in Informatica Developer.
- Used the Informatica Data Quality tool (Informatica Developer) to scrub, standardize, and match customer addresses against the USPS database.
- Designed, developed, and implemented Informatica Developer mappings for data cleansing using the Address Validator, Labeler, Association, Parser, Expression, Filter, Router, and Lookup transformations, etc.
- Used Informatica AddressDoctor for global address verification across an organization.
- Designed, developed, and implemented PowerCenter mappings for the DQ Framework using the SQL, Normalizer, Expression, Filter, Router, and Lookup transformations, etc.
- Imported mapplets and mappings from Informatica Developer (IDQ) into PowerCenter.
- Used forward engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model
- Developed mapping spreadsheets for (ETL) team with source to target data mapping with physical naming standards, data types, volumetric, domain definitions, and corporate meta- data definitions.
- Worked extensively with the Informatica PowerCenter tools - Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer, and Transformation Developer. Developed Informatica mappings and tuned them for better performance.
- Responsibilities included designing documents based on the requirement specifications.
- Used Teradata for data modeling.
- Worked on the code migration process using the TortoiseSVN browser.
- Used UC4 scheduler for scheduling.
- Debugged the Informatica mappings for DQ Metrics by utilizing the session logs in Informatica.
- Practical understanding of the Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
- Involved in design, development and testing of the DQ Metrics system.
- Performed extensive testing and wrote SQL queries for pushdown optimization.
- Worked with static and dynamic memory caches for better throughput of sessions containing Rank, Lookup, Joiner, Sorter, SQL, and Aggregator transformations.
- Fixed invalid mappings and troubleshot technical problems with the database.
- Debugged sessions by utilizing the session logs.
- Developed and implemented UNIX shell scripts to start and stop sessions.
- Modified stored procedures, triggers, and other backend objects for performance.
- Extensive experience with the Agile methodology.
- Performed documentation for stored procedures and exception detail reports.
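The single-instance batch guard described above can be sketched roughly as follows; the batch names and lock location are illustrative placeholders, not the actual framework paths:

```shell
#!/bin/sh
# Sketch of a concurrency guard: the same batch cannot run twice at once,
# but different batches may run in parallel. Paths/names are hypothetical.
LOCK_ROOT="${TMPDIR:-/tmp}/dq_locks"
mkdir -p "$LOCK_ROOT"

run_batch() {
    batch_name="$1"
    lock_dir="$LOCK_ROOT/$batch_name.lock"
    # mkdir is atomic: it fails if another instance of the SAME batch
    # already holds the lock; other batches use other lock dirs.
    if mkdir "$lock_dir" 2>/dev/null; then
        echo "running batch: $batch_name"
        # ... invoke the environment-specific workload here ...
        rmdir "$lock_dir"
        return 0
    else
        echo "batch $batch_name is already running; skipping" >&2
        return 1
    fi
}

run_batch metrics_daily
```

The lock-per-batch-name layout is what lets "different batches at the same time" coexist with the no-second-instance rule.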
Environment: Informatica PowerCenter 9.6.1, Informatica Data Quality 9.6.1, Teradata, XML, TortoiseSVN, UC4 scheduler
Confidential, Manhattan, NY
Senior Informatica / Data Quality (IDQ) Developer
Responsibilities:
- Worked with business and product development teams to gather the functional requirements and spec out the Technical Requirement Documents for the data conversion.
- Performed profiling using Informatica Analyst and Informatica Developer.
- Designed business rules (Cycle-1 to Cycle-5) with team as per the requirement.
- Created business rules and populated their pass/fail percentages and failed records (rule breaks) in the DQP dashboard using QlikView.
- Created generic rules in Informatica Developer per the business requirements.
- Extracted data from flat files and Sybase tables and created custom data objects in Informatica Developer for profiling and rule breaks.
- Created scorecards to review data quality.
- Created a logical data object model in Informatica Developer.
- Used the Informatica Data Quality tool (Informatica Developer) to scrub, standardize, and match customer addresses against the USPS database.
- Designed, developed, and implemented Informatica Developer mappings for data cleansing using the Address Validator, Labeler, Association, Parser, Expression, Filter, Router, and Lookup transformations, etc.
- Used forward engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model
- Developed mapping spreadsheets for (ETL) team with source to target data mapping with physical naming standards, data types, volumetric, domain definitions, and corporate meta-data definitions.
- Worked with the Informatica PowerCenter tools - Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer, and Transformation Developer. Developed Informatica mappings and tuned them for better performance.
- Responsibilities included designing documents based on the requirement specifications.
- Used Teradata for data modeling.
- Worked on the code migration process using IBM Rational Synergy.
- Created tasks and change requests for code migration in IBM Rational Change.
- Used TWS scheduler for scheduling.
- Debugged the profiles and rule breaks by utilizing the logs in UNIX and Teradata.
- Practical understanding of the Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
- Involved in design, development and testing of the MDS system.
- Extracted data from flat files and Teradata to load into the MDS system.
- Performed extensive testing and wrote SQL queries to verify the loading of the data.
- Worked with static and dynamic memory caches for better throughput of sessions containing Rank, Lookup, Joiner, Sorter, and Aggregator transformations.
- Fixed invalid mappings and troubleshot technical problems with the database.
- Debugged sessions by utilizing the session logs.
- Developed and implemented UNIX shell scripts for the start and stop procedures of the sessions.
- Modified stored procedures, triggers, and other backend objects for performance.
- Documented the processes for rule analysis, design, mapping, the UNIX SCP process, the DQP dashboard, IBM Rational Synergy, and IBM Rational Change.
- Performed documentation for stored procedures and exception detail reports.
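The start/stop shell scripts mentioned above typically wrap Informatica's pmcmd CLI. A minimal sketch, assuming hypothetical service, domain, and folder names (real scripts would also source credentials from an environment-specific parameter file):

```shell
#!/bin/sh
# Sketch of a pmcmd start/stop wrapper. Service/domain/folder names are
# placeholders; the script dry-runs where the Informatica client is absent.
INFA_SERVICE="IS_DEV"
INFA_DOMAIN="Domain_Dev"
INFA_FOLDER="DQ_RULES"

pmcmd_wf() {
    action="$1"     # startworkflow | stopworkflow
    workflow="$2"
    cmd="pmcmd $action -sv $INFA_SERVICE -d $INFA_DOMAIN -f $INFA_FOLDER $workflow"
    if command -v pmcmd >/dev/null 2>&1; then
        $cmd
    else
        echo "DRY-RUN: $cmd"
    fi
}

pmcmd_wf startworkflow wf_load_rule_breaks
pmcmd_wf stopworkflow  wf_load_rule_breaks
```

Wrapping pmcmd this way keeps the workflow names in one place, so schedulers such as TWS only ever call the wrapper.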
Environment: Informatica, Informatica Data Quality, Sybase, Rapid SQL, DB Artisan, Teradata, XML, IBM CM Synergy, QlikView, TWS scheduler
Confidential, Berkeley Heights, NJ
Informatica Developer/Informatica Data Quality (IDQ) Developer/Analyst
Responsibilities:
- Retrieved legacy data, cleansed it in Informatica Developer, and loaded it into SAP R/3.
- Retrieved legacy data and converted it into Customer Hierarchy Structure with linkage, Sales Force Structure with linkage, Credit Rep, Credit Manager, Customer Service Rep, Freight Terms, general and sales-area views, Payer (bank, contacts), Ship-To if different from Sold-To (planning in process), Sold-To, Contacts, Customer (Loyalty) Cards, Anti-Diversion, and License records.
- Performed profiling using Informatica Analyst and Informatica Developer.
- Created Customer master generic data profiles.
- Created scorecards to review data quality.
- Used the Informatica Data Quality tool (Informatica Developer) to scrub, standardize, and match customer addresses against the USPS database.
- Designed, developed, and implemented Informatica Developer mappings for data cleansing using the Address Validator to standardize addresses, along with the Standardizer, Labeler, Association, Parser, Expression, Filter, Router, and Lookup transformations, etc.
- Designed, developed, and implemented data load and data validation processes using MS Access and SQL Server.
- Reported to the higher management on a bi-weekly basis via status reports on the number of issues (Bug Status) resolved and outstanding and project health report.
- Designed, developed, and implemented ETL Informatica mappings, as well as data load and data validation processes using PL/SQL and SQL.
- Performed performance tuning and monitoring of SQL; partitioned tables for better performance; tuned Informatica mappings using SQL overrides and source filters while managing cache file allocations and sizes; tuned user queries for better performance.
- Provided production and operations support for data warehouse processing; was involved in the upgrade, maintenance, and administration of the Tidal scheduler and in monitoring the load on servers.
- Developed and implemented strict access controls and governance.
- Performed data modeling and design for data warehouses and data marts in the snowflake and star schema methodologies with conformed and granular dimensions and fact tables.
- Used Erwin to create Logical/Physical model for new tables to be added for new data to be pulled from third party sources as part of Change Request effort.
- Conducted business data and process gap analysis, Designed to-be Processes and developed performance metrics, monitoring, and maintenance plans.
- Ensured projects met their schedules, milestones, and deliverables, and that those deliverables met quality requirements.
- Provided project estimates and communicated efficiently with the business to bring critical issues to the table and resolve them.
- Involved in upgrade of Informatica 8.6.1 to Informatica 9.1.
- Added new user /folder to Informatica Development repository during upgrade.
- Created setup files for application in UNIX to be run through job scheduler (Tidal).
- Used Business Content Integration (BCI) to extract data from SAP R/3 Data Sources.
Environment: Windows NT, Oracle 9i/8i, SQL Server 2008/2005, SAP R/3, SAP BW, Informatica PowerCenter 8.6.1/9.1, Informatica Data Quality (Informatica Developer 9.1), Informatica Analyst 9.1, MS Access, Business Content Integration, Tidal.
Confidential, Falls Church, VA
Informatica Applications Engineer
Responsibilities:
- Responsible for the extract transfer load (ETL) of Business Planning data per month in support of Military Health Services (MHS) Reconciliation Tool (RT) using Informatica 8 and MS SQL Server.
- Involved in database design and developed Transact-SQL for the RT system in MS SQL Server.
- Created technical design documents from business requirements for development of the mapping and to implement business logic through transformations in the mapping.
- Created mappings with the help of source definitions (flat file and relational), target definitions (relational), and different transformations such as Normalizer, Router, Joiner, and Aggregator to load data into the SQL Server tables.
- Created sessions and workflows and monitored the jobs through workflow monitor.
- Tuned the Informatica maps and SQL for optimal load performance.
- Worked on the test and the validation document.
- Developed the logical and physical models using the Erwin designer.
- Created dimensional model for the reporting system by identifying required dimensions and facts using Erwin r7.1
- Translation of Business processes into Informatica mappings for the data warehouse.
- Involved in the Migration process from Development, Test and Production Environments.
- Used stored procedures, views and triggers in PL/SQL for managing consistency and referential integrity across data mart.
- Performed analysis and implemented changes to Informatica in response to the testing Issues.
Environment: Informatica PowerCenter 8.6.1, Business Objects XI R2, SQL Server 2005/2008, SSIS, SQL, T-SQL, Windows 2008 and Windows XP.
Confidential, DC
ETL Developer
Responsibilities:
- As the data integration specialist at XM Satellite Radio in Washington, D.C., I was primarily responsible for managing and mentoring a team of 3 developers engaged in fixing defects as part of the Systems Integration Testing phase of the DW/BI implementation.
- This entailed identification and implementation of modifications/tweaks in the existing Informatica mappings/sessions/workflows or adding new code to augment and enhance the functionality of the code depending on the nature of defects detected by the testing team. This assignment was especially challenging as the ETL design and code had already been developed by other developers who were no longer a part of the project. I had to analyze all the existing logic and then mentor the developers so that all were on the same page as some of the logic was highly complicated and business critical.
- Used Erwin to create Logical/Physical model for new tables to be added for new data to be pulled from third party sources as part of Change Request effort.
- Modified the existing CDC logic (Informatica mapplets), which used triggers at the source to detect and capture DML changes at the table level, to leverage PowerExchange to distill new and updated data from unchanged background data.
- Developed Informatica mappings and unit tested them as part of Change requests new development effort.
- Created PL/SQL Stored procedure construct to automate the checking of status of target database before loading data into it.
- Created PL/SQL stored procedure to drop and re-create indexes during pre and post loading of database tables respectively.
- Modified and executed UNIX shell scripts to run jobs from the OS command line.
- Used Trillium to scrub, standardize and match customer address against the USPS database.
- Analyzed and understood the code and started fixing bugs with little or no turnaround.
- Identified and optimized ETL mappings and SQL code wherever opportunities for tuning existed.
- Reported to the higher management on a bi-weekly basis via status reports on the number of issues (Bug Status) resolved and outstanding and project health report.
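The drop-and-recreate-index pattern above can be sketched as a script that emits the pre- and post-load SQL a DBA session (e.g. sqlplus) would execute; the index and table names are illustrative, not the project's actual schema:

```shell
#!/bin/sh
# Sketch: generate DROP INDEX statements before a bulk load and the matching
# CREATE INDEX statements after it. Names are hypothetical placeholders.
# Each entry is index_name:table(columns).
INDEXES="idx_cust_id:customer(cust_id) idx_ord_dt:orders(order_dt)"

emit_preload_sql() {
    for spec in $INDEXES; do
        echo "DROP INDEX ${spec%%:*};"        # everything before the colon
    done
}

emit_postload_sql() {
    for spec in $INDEXES; do
        echo "CREATE INDEX ${spec%%:*} ON ${spec#*:};"   # rebuild after load
    done
}

emit_preload_sql     # run before loading the target tables
emit_postload_sql    # run once the load completes
```

Dropping indexes before a bulk load and rebuilding them afterwards avoids per-row index maintenance, which is the usual motivation for this pre/post step.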
Environment: Informatica PowerCenter 8.1, Oracle 10g, SQL Server 2005, Quality Center 9.0, Trillium 7.6, Erwin 7.3, TOAD.
Confidential, Sunnyvale, CA
Informatica Developer
Responsibilities:
- Studied and analyzed the business requirements.
- Interacted with different group of users for analysis of the system.
- Worked with Informatica Designer to create complex mappings and mapplets using Expression, Aggregator, Routers, Lookup and Stored Procedure Transformations to load data from different data sources to Data Marts.
- Involved in data extraction from the SAP system using the PowerExchange connector.
- Used PL/SQL procedures in mappings to do complex database level operations.
- Created UNIX shell scripts to load high-volume flat file data into staging tables using Oracle SQL*Loader.
- Used PowerConnect to connect to the SAP system for source data.
- Involved in creating staging Tables, Indexes, Sequences, Views and performance tuning to load into tables, analyzing tables, proper indexing and dropping.
- Effectively used different Workflow Manager tasks, viz. Worklet, Command, Decision, Email, Event Wait, and Event Raise.
- Performed Performance Tuning of sources, targets, mappings, transformations and sessions.
- Involved in creating, scheduling, and running batches and sessions using Control-M (scheduling tool).
- Involved in upgrade of Informatica 5.1 to Informatica 7.1.
- Added new users/folders to the Informatica Development repository during the upgrade.
- Created setup files for application in UNIX to be run through job scheduler (Control-M).
- Monitored the scheduled jobs and responded to any failure alerts received from the server.
- Prepared documentation giving a clear picture of the tables, views, script files, log files, source flat file information, FTP process, control files, and production migration information.
- The project covered development as well as production support: analyzing, designing, building, testing, and implementing change requests for certain areas of the production data warehouse.
- Provided 24x7 on-call production support on a rotation basis.
- Identified bugs in existing mappings by analyzing the data flow and evaluating transformations, and fixed them so that they conform to the business needs.
- Coordinated with offshore team members and users.
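A SQL*Loader staging load like the one described above is usually a shell script that writes a control file and invokes sqlldr. A minimal sketch, with hypothetical table, file, and connection names (the sqlldr call only fires where the Oracle client exists):

```shell
#!/bin/sh
# Sketch: generate a SQL*Loader control file for a pipe-delimited flat file
# and invoke sqlldr. All names here are illustrative placeholders.
CTL_FILE="${TMPDIR:-/tmp}/stg_orders.ctl"
DATA_FILE="/data/in/orders.dat"

cat > "$CTL_FILE" <<EOF
LOAD DATA
INFILE '$DATA_FILE'
APPEND INTO TABLE stg_orders
FIELDS TERMINATED BY '|' TRAILING NULLCOLS
(order_id, cust_id, order_dt DATE 'YYYY-MM-DD', amount)
EOF

# Only call sqlldr where the Oracle client is installed; otherwise dry-run.
if command -v sqlldr >/dev/null 2>&1; then
    sqlldr userid=stg_user@DEVDB control="$CTL_FILE" log="$CTL_FILE.log" direct=true
else
    echo "DRY-RUN: sqlldr control=$CTL_FILE direct=true"
fi
```

`direct=true` uses SQL*Loader's direct path, which bypasses conventional inserts and is the usual choice for high-volume staging loads.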
Environment: Informatica PowerCenter 5.1/7.1, XML, PL/SQL, Oracle 8i/9i, SQL Server, TOAD, SQL*Loader, DB2, SAP, Perl scripting, Control-M, and Windows 2000.
Confidential, Duluth, GA
Informatica Consultant
Responsibilities:
- Interacted with business users and team leads in correlating business specification and identifying data sources and development strategies.
- The source data was extracted from Oracle and flat file sources, transformed according to business rules, and loaded to a star schema on Oracle RDBMS involving Type 2 dimension tables designed to preserve history.
- Used Informatica 7.1.2 to develop complex mappings extensively involving Expression, Lookup, Aggregator, and Stored Procedure transformations.
- Identified and fixed errors in the mapping logic using the Debugger.
- Unit tested developed Informatica code as well as participated in SIT.
- Identified and eliminated bottlenecks at database/mapping/session levels.
- Extensively implemented various types of partitions to improve session performance.
- Configured SQL*Loader to load data into the Oracle target databases.
- Enhanced and modified the existing UNIX shell scripts to schedule tasks/sessions and to execute Informatica workflows.
- Interacted with the reporting team to correlate end-user requirements.
Environment: Informatica PowerCenter 7.1.2, Oracle 9i, TOAD, MS SQL Server 2000, SQL*Loader, UNIX.
Confidential, Wilmington, DE
Informatica Developer
Responsibilities:
- Analyzed the Informatica mappings and made enhancements and corrections.
- Extensively used Informatica PowerCenter to create mappings, sessions, and workflows for populating the data.
- Created transformations such as Joiners, Lookups, Filters, Aggregators, and Sequence Generators according to the business rules.
- Worked on the SAP integration with Informatica.
- Used Informatica PowerConnect to establish the connection between the Informatica repository and SAP BW; this connection was used to invoke the workflows from SAP BW.
- Created transformations such as Joiners, Lookups, Routers, Filters, Update Strategies, Aggregators, and Sequence Generators to define the transformation logic according to the business rules for the data loads.
- Developed PowerConnect jobs and was involved in the scheduling and maintenance of PowerConnect jobs in SAP BW.
Environment: Informatica PowerCenter 7.1.2, Informatica PowerConnect for SAP R/3, DB2 8.1.7, SQL, PL/SQL, TOAD, Windows XP and UNIX.