Informatica IDQ Developer Resume
NJ
SUMMARY
- Around 8 years of experience in Information Technology with a strong background in database development, data warehousing and ETL processes using Informatica PowerCenter 9.x/8.x and Informatica IDQ 9.x, including Repository Manager, Repository Server, Workflow Manager and Workflow Monitor.
- Experience working with PowerCenter Repository Manager, Designer, Workflow Manager and Workflow Monitor.
- Proficiency in developing SQL with various relational databases like Oracle, SQL Server.
- Knowledge in Full Life Cycle development of Data Warehousing.
- Strong understanding of OLAP and OLTP Concepts.
- Understand business rules completely based on high-level document specifications and implement the data transformation methodologies.
- Solid experience with Informatica and Teradata in an enterprise data warehouse environment, including extensive use of Teradata utilities like TPT, FASTLOAD, MULTILOAD and BTEQ scripts.
- Developed mappings in Informatica to load data from various sources into the Data Warehouse, using different transformations like Source Qualifier, Java, Expression, Lookup, Aggregator, Update Strategy and Joiner.
- Used Informatica IDQ to complete initial data profiling and to match and remove duplicate data.
- Built logical data objects (LDO) and developed various mappings, Mapplet/rules using Informatica data quality (IDQ) based on requirements to profile, validate and cleanse the data.
- Experience resolving ongoing maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
- Extensive experience writing UNIX shell scripts and automating ETL processes using UNIX shell scripting.
- Expertise in implementing performance tuning techniques at both the ETL and database levels.
- Experience using automation and scheduling tools like Autosys, Control-M and Maestro.
- Experience working in both Waterfall and Agile methodologies. Good communication skills with a strong ability to interact with end users, clients, and colleagues. Responsible for cleansing source data using LTRIM and RTRIM operations in the Expression transformation.
- Expertise in RDBMS concepts, with hands on exposure in the development of relational database environment using SQL, PL/SQL, Cursors, Stored Procedures, Functions and Triggers.
- Experience in working with Perl Scripts for handling data coming in Flat files.
- Strong with relational database design concepts.
- Expertise in the health care domain, including Medicare, Medicaid and insurance compliance with HIPAA regulations and requirements.
- Expertise in Change Data Capture (CDC) using Informatica Power Exchange.
- Performed data validation by Unit testing, integration testing and System Testing.
- Knowledge on Informatica Data Explorer (IDE) and IDQ Workbench for Data Profiling.
- Extensive experience in managing teams, onshore-offshore coordination, requirement analysis, code reviews and implementing standards.
- Good knowledge of data modeling techniques like Dimensional/Star Schema, Snowflake modeling, and Slowly Changing Dimensions using Erwin.
- Flexible, enthusiastic and project oriented team player with solid communication and leadership skills to develop creative solution for challenging client needs.
- Excellent interpersonal and communication skills, and is experienced in working with senior level managers, business people and developers across multiple disciplines.
- Able to work independently and collaborate proactively & cross functionally within a team.
- Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
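As an illustration of the LTRIM/RTRIM-style cleansing mentioned above, the following Python sketch mirrors the trimming logic an Expression transformation would apply (the column names are hypothetical, purely for illustration):

```python
def cleanse(value):
    """Mirror LTRIM(RTRIM(value)): strip leading and trailing whitespace."""
    return value.lstrip().rstrip() if isinstance(value, str) else value

# Hypothetical source row; real column names come from the mapping spec.
row = {"customer_name": "  Jane Doe  ", "city": " Austin "}
cleansed = {col: cleanse(val) for col, val in row.items()}
```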
TECHNICAL SKILLS
Tools: Informatica PowerCenter 9.x/8.x, Power Exchange 9.x/8.x, Informatica IDQ 9.x, Informatica MDM, Informatica MDM Hub Console, Informatica PowerCenter Data Replication Console, Talend.
Databases: Oracle 11g/10g/9i, MS SQL Server 2008/2012, MS Access, DB2, Teradata 14/13/12/V2R6/V2R5, Sybase & Greenplum 4.0/4.1.
Reporting Tools: Business Objects XIR2, SAP BI 7.0, OBIEE 11g/10g, Cognos & Tableau.
Languages: C, C++, SQL, PL/SQL, HTML, JAVA, UNIX Scripting & Python Scripting
Other Tools: Toad, SharePoint, SCM, Putty, GIT, MATT, Autosys, ESP & Control-M.
Operating Systems: LINUX, UNIX, Sun Solaris, Windows 7/XP/2000/98
PROFESSIONAL EXPERIENCE
Confidential, Austin, Texas
Senior ETL/Informatica IDQ Developer
Responsibilities:
- Worked with ETL Architects and Senior developers for understanding the requirements.
- Worked on Designer tools like Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer.
- Worked on creating Unit testing documents.
- Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
- Applied business rules using Informatica Data Quality (IDQ) tool to cleanse data.
- Created complex mappings in PowerCenter Designer using Aggregate, Expression, Filter, Sequence Generator, Update Strategy, Rank, Joiner and Stored procedure transformations.
- Worked on unit testing for the loaded data into the target.
- Optimized SQL queries for better performance.
- Created HLDs (high-level design documents) for different requirements.
- Used Informatica Data Quality (IDQ) for data quality, integration and profiling.
- Used IDQ and Informatica PowerCenter to cleanse the data and load it into the target database.
- Identified and fixed bottlenecks and tuned the mappings and sessions to improve performance. Tuned both the ETL processes and the databases.
- Created LDO's (Logical Data Objects), CDO's (customized data objects) and physical data objects in IDQ.
- Deployed mappings and mapplets from IDQ to PowerCenter for scheduling.
- Used various infacmd commands for automating and scheduling the IDQ jobs.
- Involved in creation of Data Warehouse database (Physical Model, Logical Model) using Erwin data modeling tool.
- Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for pre- and post-session management.
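The column profiling done with the IDQ Analyst tool can be approximated with a short Python sketch (illustrative only; the Analyst tool computes far richer statistics, and the sample values here are invented):

```python
from collections import Counter

def profile_column(values):
    """Compute a basic column profile: null percentage, distinct count,
    and most frequent values, roughly what a column profile reports."""
    nulls = sum(1 for v in values if v in (None, ""))
    non_null = [v for v in values if v not in (None, "")]
    return {
        "null_pct": round(100.0 * nulls / len(values), 1) if values else 0.0,
        "distinct": len(set(non_null)),
        "top_values": Counter(non_null).most_common(3),
    }

stats = profile_column(["TX", "TX", None, "NJ", ""])
```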
Environment: Informatica PowerCenter 10, Informatica IDQ 9.6, Oracle 11g, UNIX, Data Marts, FTP, MS-Excel, UNIX Shell Scripting, WinSCP
Confidential, NJ
Informatica IDQ Developer
Responsibilities:
- Worked with the Informatica Data Quality 9.6 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, porting and monitoring capabilities of IDQ 9.6.
- Involved in Debugging and Performance tuning of targets, sources, mappings and sessions.
- Used Informatica Data Quality (IDQ) for data quality, integration and profiling.
- Optimized SQL queries for better performance.
- Profiled the data using Informatica Data Explorer (IDE) and performed Proof of Concept for Informatica Data Quality (IDQ).
- Deployed mappings and mapplets from IDQ to PowerCenter for scheduling.
- Identified and fixed bottlenecks and tuned the mappings and sessions to improve performance.
- Created LDO's (Logical Data Objects), CDO's (customized data objects) and physical data objects in IDQ.
- Tuned both ETL process as well as Databases.
- Developed various Mappings, Mapplets, Workflows and Transformations for flat files and XML.
- Identified and eliminated duplicates in datasets through IDQ components.
- Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
- Utilized the Informatica Data quality management suite (IDQ and IDE) to identify and merge customers and addresses.
- Cleansed, standardized, labeled and fixed data gaps in IDQ, checking against reference tables to resolve major business issues.
- Exposed IDQ mapping/mapplets as web service.
- Used Informatica IDQ 9.6 to complete initial data profiling and to match and remove duplicate data.
- Worked on Designer tools like Source Analyzer, Target Designer, Transformation Developer, Mapping Designer and Workflow Designer.
- Created various rules in IDQ to satisfy the completeness, conformity, integrity and timeliness data quality dimensions.
- Tuned performance of Informatica PowerCenter session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
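The duplicate identification and removal done with IDQ match components can be illustrated with a simplified Python sketch; the normalized match key below is a hypothetical stand-in for a real IDQ match strategy:

```python
import re

def match_key(record):
    """Build a simplified match key: lowercased name with punctuation and
    spaces removed, paired with the ZIP code."""
    name = re.sub(r"[^a-z0-9]", "", record["name"].lower())
    return (name, record["zip"])

def dedupe(records):
    """Keep the first record for each match key, dropping later duplicates."""
    seen, unique = set(), []
    for rec in records:
        key = match_key(rec)
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique
```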
Environment: Informatica PowerCenter 9.6.2, Informatica IDQ 9.6, Informatica IDE, Oracle 11g, UNIX, SQL Developer, FTP, Toad.
Confidential, Los Angeles, CA
ETL/Informatica Developer
Responsibilities:
- Designed ETL's to extract data from COBOL Files and update the EDW with the Patient related Clinical information. This includes a one-time history load and subsequent daily loads.
- Extracted data from various heterogeneous sources like Oracle, SQL Server, Flat files and COBOL files.
- Created SSIS package which will download data from the respective source and will push into loan data hub.
- Experience working with the complete Software Development Life Cycle of the application.
- Involved in monitoring and maintenance of the Unix Server performance.
- Ran the same shell scripts across the different environments.
- Worked on Designer tools like Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer.
- Worked on Power Exchange for importing source data to the Power center.
- Created mappings to read COBOL source files and write them in ASCII file format; the target file structure matches the source.
- Created mapping documents with business rules using MATT Tool.
- Generated complex Transact-SQL ( Confidential -SQL) queries, sub-queries, correlated sub-queries, dynamic SQL queries, etc.
- Created sprint plans for developer responsibilities and scheduled sprints in VersionOne.
- Fixed defects in different environments for platform code.
- Extracted and reviewed data from heterogeneous sources from OLTP to OLAP using MS SQL Server Integration Services (SSIS).
- Worked with Session logs and Workflow logs for Error handling and troubleshooting in Dev environment.
- Worked on GIT Bash for code check in into the different environments using GIT commands.
- Used Spash to verify checked-in code in different environments.
- Performance Tuning of Stored Procedures and SQL queries using SQL Profiler and Index Tuning Wizard in SSIS.
- Worked with different methods of logging in SSIS.
- Worked on unit testing for the loaded data into the target.
- Optimized SQL queries for better performance.
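The COBOL-file mappings above involve converting mainframe EBCDIC data to ASCII; the idea can be sketched in Python using one of the standard EBCDIC code pages (cp037 is an assumption here, since the actual code page depends on the mainframe):

```python
# Simulate a mainframe field by encoding text to EBCDIC (code page cp037),
# then decode it back to text, which is the conversion a COBOL-source
# mapping performs before writing the ASCII target file.
ebcdic_bytes = "PATIENT01".encode("cp037")
ascii_text = ebcdic_bytes.decode("cp037")

# The byte representations differ even though the text is the same.
assert ebcdic_bytes != "PATIENT01".encode("ascii")
```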
Environment: Informatica PowerCenter 9.6.1/9.6.0, Oracle 11g, Power Exchange 9.6, MS SQL Server 2008, SSMS, SSIS, UNIX, MATT, Data Marts, Spash, VersionOne, UNIX Shell Scripting, Data Modeling, GIT.
Confidential, Dallas, Texas
ETL Developer/ Informatica Developer
Responsibilities:
- Created complex mappings in PowerCenter Designer using Aggregate, Expression, Filter, Sequence Generator, Update Strategy, Rank, Joiner and Stored procedure transformations.
- Developed various Mappings, Mapplets, and Transformations for data marts and Data warehouse.
- Involved in creation of Data Warehouse database (Physical Model, Logical Model) using Erwin data modeling tool.
- Worked on loading data from several flat file sources into Teradata using Teradata MLOAD & FLOAD.
- Set up batches and sessions to schedule the loads at the required frequency using PowerCenter Workflow Manager.
- Used Teradata manager, Index Wizard and PMON utilities to improve performance.
- Extensively worked on Autosys to schedule the jobs for loading data.
- Used MultiLoad and BTEQ; created and modified databases, performed capacity planning, allocated space, and granted rights for all objects within databases.
- Worked on Power Exchange for change data capture (CDC).
- Worked on the Teradata RDBMS using the FASTLOAD, MULTILOAD, TPUMP, FASTEXPORT, Teradata SQL and BTEQ utilities.
- Developed Fast Load jobs to load data from various data sources and legacy systems to Teradata Staging.
- Created and modified MultiLoad scripts for Informatica using UNIX and loaded data into the IDW.
- Designed and Developed Oracle PL/SQL and UNIX Shell Scripts, Data Import/Export.
- Involved in Informatica Data Masking & Data Subset Data Mapping.
- Executed conversion maintenance on existing legacy system.
- Loaded data from various data sources and legacy systems into Teradata production and development warehouse using BTEQ, FASTEXPORT, MULTI LOAD, FASTLOAD and Informatica.
- Wrote Teradata Macros and used various Teradata analytic functions.
- Identified and fixed bottlenecks and tuned the mappings and sessions to improve performance. Tuned both the ETL processes and the databases.
- Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.
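The FastLoad staging jobs above follow a repeatable pattern, so a control script can be generated per table. A rough Python sketch follows; the table, columns and file name are hypothetical, the logon line is a placeholder, and exact FastLoad syntax may vary by Teradata version:

```python
def fastload_script(table, columns, datafile, delimiter="|"):
    """Generate a minimal Teradata FastLoad control script for loading a
    delimited flat file into a staging table. Illustrative only."""
    defs = ", ".join(f"{c} (VARCHAR(100))" for c in columns)
    vals = ", ".join(f":{c}" for c in columns)
    return "\n".join([
        "LOGON tdpid/user,password;",            # placeholder credentials
        f'SET RECORD VARTEXT "{delimiter}";',    # delimited input format
        f"DEFINE {defs} FILE={datafile};",
        f"BEGIN LOADING {table} ERRORFILES {table}_e1, {table}_e2;",
        f"INSERT INTO {table} VALUES ({vals});",
        "END LOADING;",
        "LOGOFF;",
    ])

script = fastload_script("stg_customers", ["cust_id", "cust_name"],
                         "customers.dat")
```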
Environment: Informatica PowerCenter 9.5/9.1, Oracle 11g, Power Exchange 9.1, Teradata 13.10, Data Marts, Erwin Data Modeler 4.1, UNIX Shell Scripting, Data Modeling, PL/SQL, Tableau, Autosys & UNIX (Sun Solaris 5.8/AIX)
Confidential, Columbus, OH
ETL /Informatica PowerCenter Developer
Responsibilities:
- Coordinated with Business Users for requirement gathering, business analysis to understand the business requirement and to prepare Technical Specification documents (TSD) to code ETL Mappings for new requirement changes.
- Involved in Data Analysis.
- Estimation, Requirement Analysis and Design of mapping document and Planning for Informatica PowerCenter ETL.
- Involved in designing STAR Schema for the business processes.
- Analyzed sources, requirements and the existing OLTP system, and identified the required dimensions and facts from the database.
- Created complex mappings in PowerCenter Designer using Aggregate, Expression, Filter, Sequence Generator, Update Strategy, Rank, Joiner and Stored procedure transformations.
- Developed various Mappings, Mapplets, and Transformations for data marts and Data warehouse.
- Involved in creation of Data Warehouse database (Physical Model, Logical Model) using Erwin data modeling tool.
- Set up batches and sessions to schedule the loads at the required frequency using PowerCenter Workflow Manager, accessing mainframe DB2 and AS/400 systems.
- In-depth knowledge of Teradata Explain and Visual Explain to analyze and improve query performance.
- Worked with complex Cognos reports in Report Studio using master-detail relationship, drill through, drill up and drill down, burst options, and Prompts.
- Extensively worked on Autosys to schedule the jobs for loading data.
- Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for pre- and post-session management.
- Worked on Power Exchange for change data capture (CDC).
- Designed and Developed Oracle PL/SQL and UNIX Shell Scripts, Data Import/Export.
- Involved in Informatica PowerCenter Data Masking & Data Subset Data Mapping.
- Executed conversion maintenance on existing legacy system.
- Identified and fixed bottlenecks and tuned the mappings and sessions to improve performance. Tuned both the ETL processes and the databases.
- Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.
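Power Exchange captures changes from database logs; purely to illustrate the CDC idea referenced above, here is a hypothetical snapshot-diff sketch in Python that classifies rows as inserts, updates or deletes:

```python
def detect_changes(before, after, key="id"):
    """Compare two snapshots keyed on `key` and classify each row.
    Log-based CDC (as in Power Exchange) avoids full-snapshot scans;
    this diff only demonstrates the classification."""
    b = {r[key]: r for r in before}
    a = {r[key]: r for r in after}
    inserts = [a[k] for k in a.keys() - b.keys()]
    deletes = [b[k] for k in b.keys() - a.keys()]
    updates = [a[k] for k in a.keys() & b.keys() if a[k] != b[k]]
    return inserts, updates, deletes
```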
Environment: Informatica PowerCenter 9.1/8.6.1, Oracle 11g, Power Exchange 8.6, MS Access, DB2, FTP, Cognos 8.3, UNIX Shell Scripting, Data Modeling, PL/SQL, Autosys and LINUX.
Confidential
ETL /Informatica Developer
Responsibilities:
- Responsible for design and development of Sales Data Warehouse.
- Worked with Business Analyst and application users to finalize Data Model, functional and detailed technical requirements.
- Extracted data from heterogeneous sources like Oracle and SQL Server.
- Created detailed Technical specifications for Data Warehouse and ETL processes.
- Conducted a series of discussions with team members to convert Business rules into Informatica PowerCenter mappings.
- Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
- Tuned performance of Informatica PowerCenter session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
- Created Mapplets and used them in different Mappings.
- Involved in doing error handling, debugging and troubleshooting Sessions using the Session logs, Debugger and Workflow Monitor.
- Monitored data quality, generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL scripts.
- Worked with SAP and Oracle sources to process the data.
- Worked on SAP data migration for Human Resources and Finance, converting various objects covering organizational structure, addresses, time, basic pay, bank details, recurring payments, tax assignment, insurance plans, payroll etc., to generate reports from the SAP BI system.
- Worked with pre- and post-session SQL commands to drop and recreate the indexes on the data warehouse using the Source Qualifier transformation of Informatica PowerCenter.
- Created UNIX shell scripts to automate sessions and cleanse the source data.
- Implemented pipeline partitioning concepts like Round-Robin, Key-Range and Pass Through techniques in mapping transformations.
- Involved in Debugging and Performance tuning of targets, sources, mappings and sessions.
- Worked on data masking activities for cloning the GDW for several buyers.
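The round-robin and key-range pipeline partitioning schemes mentioned above can be sketched in Python (the partition counts and range bounds are illustrative, not taken from any actual mapping):

```python
def round_robin(rows, n):
    """Distribute rows evenly across n partitions in arrival order."""
    parts = [[] for _ in range(n)]
    for i, row in enumerate(rows):
        parts[i % n].append(row)
    return parts

def key_range(rows, key, bounds):
    """Route each row to the first partition whose upper bound exceeds
    its key value; rows at or above the last bound go to the final one."""
    parts = [[] for _ in range(len(bounds) + 1)]
    for row in rows:
        for i, bound in enumerate(bounds):
            if row[key] < bound:
                parts[i].append(row)
                break
        else:
            parts[-1].append(row)
    return parts
```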
Environment: Informatica PowerCenter 8.6, Oracle 9i, Teradata V2R6, SAP, SAP BI 7.0, SQL Server, Sun Solaris.