
Senior Informatica PowerCenter & IDQ Developer Resume


Lake Mary, FL

SUMMARY:

  • 8.5 years of experience in Business Intelligence as an ETL Informatica PowerCenter / Informatica IDQ analyst, lead developer and database developer.
  • Extensively used Informatica PowerCenter 9.6/9.1/8 and Informatica Data Quality (IDQ) 9.6/9.1 as ETL tools for extracting, transforming, loading and cleansing data from various sources to various targets, in both batch and real time.
  • Knowledge of data warehouse design approaches - top-down (Inmon) and bottom-up (Kimball) - and dimensional modeling methodologies - star schema and snowflake.
  • Followed the Agile methodology and Scrum process.
  • Experience in relational databases like Oracle, DB2, Teradata, Netezza and SQL Server.
  • Experience in integrating data from fixed-width and delimited flat files, XML, WSDL and web services using transformations such as Source Qualifier, XML Parser and Web Services Consumer.
  • Extensively used the data masking transformation to mask NPI data (SSN, birth date, account number, etc.) in Dev and QA environments.
  • Used Informatica ILM & TDM (Test Data Management) to create various applications for data masking of sensitive data.
  • Used various Informatica PowerCenter and Data Quality transformations such as - source qualifier, aggregator, update strategy, expression, joiner, lookup, router, sorter, filter, web services consumer transformation, XML Parser, labeler, parser, address validator, match, comparison, consolidation, standardizer and merge to perform various data loading and cleansing activities.
  • Complete understanding of exact matching, fuzzy-logic matching and the deduplication limitations of the IDQ suite.
  • Complete knowledge of using XML and MQ Series as sources and targets.
  • Extensively used the data masking transformation for masking/scrubbing sensitive fields such as social security numbers, credit card numbers and agreement numbers.
  • Integrated Informatica Data Quality mappings with Informatica PowerCenter.
  • Worked closely with MDM teams to understand their needs in terms of data for their landing tables.
  • Knowledge of Informatica MDM (formerly Siperian MDM), though without hands-on product experience yet.
  • Created various profiles from existing sources using Informatica Data Explorer (IDE) & IDQ and shared them with business analysts to support business strategy decisions such as assigning match scores to different criteria.
  • Used various performance techniques in Informatica such as - partitioning, tuning at the source/target/transformation level, usage of persistent cache, and replacing cache-based transformations wherever possible.
  • Created complex mapplets to be shared among team members.
  • Extensive experience in developing various tasks, workflows, worklets, mapplets and mappings.
  • Designed and enabled Informatica workflows as web services using the Web Service Consumer transformation, and tested the web services with the Try-It option as well as with Java programs.
  • Experience using SAP as a source and target through PowerExchange for SAP NetWeaver.
  • Extensive knowledge on WSDL, XML, SOAP messages and web services.
  • Extensive knowledge on debugging of Informatica mapping/workflows, unit testing of the mapping/workflows.
  • Extensively used Informatica Repository Manager for exporting/importing workflows and Workflow Monitor for checking workflow status.
  • Extensive knowledge in database external loaders - SQL loader (Oracle), LOAD (DB2), TPT, Fastload, Multiload, Tpump (Teradata), Bulk writer (Netezza).
  • Extensive experience with Teradata modeling of various entities, considering factors including PI selection, NPI and AMP load distribution.
  • Extensively used data modeling tools such as ERWIN to design Teradata tables and the relationships among them.
  • Worked extensively with Teradata utilities - Fastload, Multiload, Tpump and Teradata Parallel Transporter (TPT) to load huge amounts of data from flat files into Teradata database, Fastexport to export data out of Teradata tables, created BTEQ scripts to invoke various load utilities, transform the data and query against Teradata database.
  • Created proper PIs taking into consideration both planned data access and even distribution of data across all available AMPs; expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, join indexes and hash indexes in the Teradata database.
  • Hands-on experience using query tools like TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant and Queryman.
  • Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN PLAN, COLLECT STATISTICS, hints and SQL Trace in both Teradata and Oracle (see the tuning sketch after this list).
  • Excellent experience with different indexes (PI, SI, JI, AJIs, PPI (MLPPI, SLPPI)) and COLLECT STATISTICS; wrote Teradata macros and used various Teradata analytic functions.
  • Good knowledge of Teradata Manager, TDWM, PMON and DBQL; extensively used SQL Analyzer and wrote complex SQL queries using joins, subqueries and correlated subqueries.
  • Extensive experience using the Netezza Bulk Writer (external loader) to load flat files as well as data via Informatica.
  • Generated explain plans to identify bottlenecks, the query path, the query cost, broadcasting in a partitioned database, and which indexes get picked.
  • Created complex SQL queries using common table expressions (CTEs) and analytical functions such as LEAD, LAG, FIRST_VALUE and LAST_VALUE (see the CTE sketch after this list).
  • Extensive knowledge of different types of dimension tables - type 1, type 2, junk, conformed, degenerate, role-playing and static dimensions.
  • Created complex PL/SQL programs, stored procedures, functions and triggers.
  • Declared cursors in stored procedures to move data between tables and to stage data temporarily for various DML operations (see the PL/SQL sketch after this list).
  • Extensive experience in writing UNIX scripts to invoke Informatica workflows via the pmcmd command and to perform source-to-target data validations in terms of row counts.
  • Created complex UNIX scripts to cleanse and purge source staging data affected by data anomalies.
  • Created complex UNIX scripts using awk, sed and arrays to update Informatica parameter files with the correct process dates and connection string details in parallel.
  • Extensive knowledge of Business Objects; updated an existing universe with new data objects and classes using Universe Designer, built joins between new tables, and used contexts to avoid loops and Cartesian products.
  • Created complex Business Objects reports from scratch.
  • Extensively used Jenkins for packaging and deployment of various components.
  • Used Rally for creating stories and tracking task status for Agile.
  • Extensive knowledge of scheduling tools - Control-M, Autosys, Tivoli (TWS) and cron.
  • Extensively used Control-M Enterprise Manager to schedule jobs, perform initial data loads, and copy data from one environment to another during initial environment setup.
  • Extensive experience in writing Autosys JIL scripts.
  • Experience in production support of existing applications and new applications for the SLA period.
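
A minimal Teradata tuning sketch along the lines of the bullets above; the database, table and column names are hypothetical and shown only to illustrate the COLLECT STATISTICS / EXPLAIN workflow:

    -- Collect statistics so the optimizer has accurate demographics
    -- (sales_db.daily_txn is a made-up table).
    COLLECT STATISTICS ON sales_db.daily_txn COLUMN (txn_date);

    -- EXPLAIN reports the retrieval path, estimated row counts and
    -- spool usage without executing the query.
    EXPLAIN
    SELECT acct_id, SUM(txn_amt)
    FROM sales_db.daily_txn
    WHERE txn_date BETWEEN DATE '2016-01-01' AND DATE '2016-01-31'
    GROUP BY acct_id;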
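
A hedged sketch of a CTE with analytic functions, as referenced in the list above; acct_snapshot and its columns are placeholder names and the SQL is generic ANSI:

    -- Flag balance changes per account using LAG inside a CTE.
    WITH ranked AS (
        SELECT acct_id,
               snap_date,
               balance,
               LAG(balance) OVER (PARTITION BY acct_id
                                  ORDER BY snap_date) AS prev_balance
        FROM acct_snapshot
    )
    SELECT acct_id,
           snap_date,
           balance,
           CASE
               WHEN prev_balance IS NULL
                 OR balance <> prev_balance THEN 'CHANGED'
               ELSE 'UNCHANGED'
           END AS change_flag
    FROM ranked;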
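
And a minimal PL/SQL sketch of the cursor-driven table copy described above; stg_customer and tgt_customer are placeholder tables:

    -- Copy rows through an explicit cursor loop, committing in
    -- batches to keep undo usage small.
    DECLARE
        CURSOR c_src IS
            SELECT cust_id, cust_name FROM stg_customer;
        v_count PLS_INTEGER := 0;
    BEGIN
        FOR r IN c_src LOOP
            INSERT INTO tgt_customer (cust_id, cust_name)
            VALUES (r.cust_id, r.cust_name);
            v_count := v_count + 1;
            IF MOD(v_count, 1000) = 0 THEN
                COMMIT;
            END IF;
        END LOOP;
        COMMIT;
    END;
    /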

TECHNICAL SKILLS:

Architecture: EDW (Enterprise Data Warehousing), BIRA (Business Intelligence Reference Architecture), FDW (Federated Data Warehousing)

Business Systems: Investment Products, Insurance, Enterprise Claims Management Systems, Banking products.

Methodologies/Data Modeling Tools: Data Mart, Dimensional, Snowflake, Star Schema, ERWIN

Databases: Teradata 15/14/13, Oracle 11g/11i/10g, DB2 10.1/9.7/9.5/9.1, Netezza, SQL Server 2012/2008.

Data Warehousing: Informatica PowerCenter 10/9.6.1/9.5/9.1/8.6, Business Objects XIR3, Qlikview 11.2

Scripting Languages: Unix Shell Script

PROFESSIONAL EXPERIENCE:

Confidential, Lake Mary, FL

Senior Informatica PowerCenter & IDQ Developer

Responsibilities:

  • Performed the role of Senior ETL Informatica and Data Quality (IDQ) developer on a data warehouse initiative; responsible for requirements gathering, preparing the mapping document, architecting the end-to-end ETL flow, building complex ETL procedures, developing the strategy to move existing data feeds into the Data Warehouse (DW), and performing data cleansing activities using various IDQ transformations.
  • Collaborated with data architects, BI architects and data modeling teams during data modeling sessions.
  • Built high-level design documents depicting the various sources, transformations and targets.
  • Extensively used Informatica transformations - Source qualifier, expression, joiner, filter, router, update strategy, union, sorter, aggregator and normalizer transformations to extract, transform and load the data from different sources into DB2, Oracle, Teradata, Netezza and SQL Server targets.
  • Extensively used Informatica Data Explorer (IDE) & Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate scorecards, create and validate rules, and provide data to the business analysts creating the rules.
  • Extensively used ETL Informatica to integrate data feeds from different 3rd-party source systems - Salesforce and TouchPoint.
  • Used Informatica Data Quality transformations to parse “Financial Advisor” and “Financial Institution” information from the Salesforce and TouchPoint systems, performing standardization, labeling, parsing, address validation, address suggestion, matching and consolidation to identify redundant and duplicate information and arrive at a MASTER record.
  • Extensively used Standardizer, Labeler, Parser, Address Validator, Match, Merge, Consolidation transformations.
  • Extensively worked on performance tuning of Informatica and IDQ mappings.
  • Created Informatica workflows and IDQ mappings for both batch and real-time processing.
  • Converted and published Informatica workflows as Web Services using Web Service Consumer transformation as source and target.
  • Created reusable components, reusable transformations and mapplets to be shared among the project team.
  • Used ILM & TDM to mask sensitive data in Dev, QA environments.
  • Used XML & MQ series as the source and target.
  • Used built-in reference data such as token sets, reference tables and regular expressions, and built new reference data objects for various parsing/cleansing/purging needs.
  • Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter.
  • Worked closely with MDM team to identify the data requirements for their landing tables and designed IDQ process accordingly.
  • Created Informatica mappings with Informatica MDM requirements in mind.
  • Extensively used XML, XSD / schema files as source files, parsed incoming SOAP messages using XML parser transformation, created XML files using XML generator transformation.
  • Worked extensively with Oracle external loader - SQL loader - to move the data from flat files into Oracle tables.
  • Worked extensively with Teradata utilities - Fastload, Multiload, Tpump and Teradata Parallel Transporter (TPT) to load huge amounts of data from flat files into Teradata database.
  • Created BTEQ scripts to invoke various load utilities, transform the data and query against the Teradata database (see the BTEQ sketch after this list).
  • Hands-on experience using query tools like TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant and Queryman.
  • Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN PLAN, COLLECT STATISTICS, hints and SQL Trace in both Teradata and Oracle.
  • Excellent experience with different indexes (PI, SI, JI, AJIs, PPI (MLPPI, SLPPI)) and COLLECT STATISTICS; wrote Teradata macros and used various Teradata analytic functions.
  • Knowledge of Teradata Manager, TDWM, PMON and DBQL; extensively used SQL Analyzer and wrote complex SQL queries using joins, subqueries and correlated subqueries.
  • Generated explain plans to identify bottlenecks, the query path, the query cost, broadcasting in a partitioned database, and which indexes get picked.
  • Used cursors to copy data between tables whenever new formal environments were created.
  • Extensively used OLAP functions - LEAD, LAG, FIRST_VALUE and LAST_VALUE - to analyze and tag rows for type-2 processing (see the type-2 sketch after this list).
  • Generated explain plans for performance tuning of queries and identifying the bottlenecks for long running queries, worked with DBA to fix the issues.
  • Extensively used PowerExchange for Mainframe to read data from mainframe VSAM/COBOL files and load it into Oracle tables.
  • Extensively used PowerExchange for Salesforce to read data from relational sources (Oracle) and load into Salesforce objects.
  • Used Netezza Bulk writer to load huge amounts of data into Netezza database.
  • Extensive experience in querying Salesforce objects using workbench.
  • Extensively used JIRA & ServiceNow for creating requests for access, production migrations, component migrations & production related service requests.
  • Used Jenkins to automate packaging and deployment of various ETL, Unix components.
  • Extensively used Enterprise Manager tool in Control-M to load the charts and run the jobs for initial load of the tables whenever a new environment is created.
  • Owned the defects from production as well as from system testing and worked on the solutions.
  • Coordinated with QA team during QA environment build, reviewing test cases, test execution and defect assignments.
  • Extensive knowledge of the defect tracking tool TRAC and of RMS as a version control management tool.
  • Assisted in preparing implementation documents for every release, worked on initial loads and data catchup process during implementations and provided on-call support for first few days of execution.
  • Extensive experience in PL/SQL programming, stored procedures, functions and triggers.
  • Built UNIX/Linux shell scripts for running Informatica workflows, data cleansing, purging, deletion and data loading, and for the ELT process.
  • Worked extensively on the Business Objects reporting tool; created universes from scratch, accounting for all possible scenarios, contexts, loops and derived tables.
  • Created complex Business Objects reports from scratch using Infoview.
  • Created the procedure document of implementation steps for every release, covering all the UNIX and Informatica objects and any catch-up process that needed to be done.
  • Provided on-call support for newly implemented components and the existing production environment and made sure the SLA was met.
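
A minimal BTEQ sketch of the load-and-query pattern referenced above; the logon string, databases and tables are placeholders rather than actual system names:

    .LOGON tdprod/etl_user,password
    .SET ERROROUT STDOUT

    -- Transform staged rows and load them into a target table.
    INSERT INTO edw.customer_dim (cust_id, cust_name, load_dt)
    SELECT cust_id, TRIM(cust_name), CURRENT_DATE
    FROM stg.customer;

    -- Abort with a non-zero return code if the insert failed.
    .IF ERRORCODE <> 0 THEN .QUIT 8

    .LOGOFF
    .QUIT 0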
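
A hedged sketch of the type-2 tagging technique mentioned above; stg.customer_changes and its columns are hypothetical, the date arithmetic is Oracle/Teradata style, and on older Teradata releases LEAD would need a MIN(...) OVER (ROWS BETWEEN 1 FOLLOWING AND 1 FOLLOWING) rewrite:

    -- Derive effective start/end dates for each version of a row: a
    -- version ends the day before the next change, or stays open-ended.
    SELECT cust_id,
           change_dt AS eff_start_dt,
           COALESCE(LEAD(change_dt)
                        OVER (PARTITION BY cust_id
                              ORDER BY change_dt) - 1,
                    DATE '9999-12-31') AS eff_end_dt
    FROM stg.customer_changes;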

Environment: Informatica Power Center 10/9.6, Informatica Data Quality (IDQ) 9.6, Informatica Data Explorer (IDE) 9.6, Salesforce, TDM, Teradata, Oracle 11i, DB2 10.1, SQL Server 2012, Fastload, Multiload, Tpump, Fastexport, Teradata Parallel Transporter (TPT), Teradata SQL assistant, BTEQ, SQL Developer, SQL Loader, Netezza, Bulk writer, MQ series, Load, Ingest, T-SQL, PL/SQL, RMS, Linux, AIX, ERWIN, Teradata modeling, Toad, Winsql, Putty, UltraEdit, PowerExchange for mainframes, PowerExchange for Salesforce, XML, Rally, UC4, JIRA, Jenkins, ServiceNow, Control-M, Enterprise Manager, Autosys, JIL Scripts, Lotus Notes, Unix shell scripting, Microsoft Visio, XML Spy, Business Objects XI R3.

Confidential, Austin, TX

Informatica PowerCenter & IDQ Lead Developer

Responsibilities:
  • Performed the role of Senior ETL Informatica and IDQ developer on a data warehouse initiative; responsible for requirements gathering, preparing the mapping document, architecting the end-to-end ETL flow, building complex ETL procedures, and developing the strategy to move existing data feeds into the Data Warehouse (DW) and additional target data warehouses.
  • Collaborated with data architects, BI architects and data modeling teams during data modeling sessions.
  • Involved in functional design reviews and led technical design reviews.
  • Extensively used Informatica transformations – Source qualifier, expression, joiner, filter, router, update strategy, union, sorter, aggregator and normalizer transformations to extract, transform and load the data from different sources into Teradata, Oracle, DB2 and SQL Server targets.
  • Extensively used ETL Informatica to integrate data feed from different 3rd party source systems – Claims, billing, payments.
  • Extensively used Informatica Data Explorer (IDE) & Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate score cards, create and validate rules and provided data for business analysts for creating the rules.
  • Used Informatica Data Quality transformations to parse individual consumer information from vendors such as Acxiom and Alliant in order to identify the true source of a policy holder's origination and assign credit accordingly, performing standardization, labeling, parsing, address validation, address suggestion, matching and consolidation to identify redundant and duplicate information and arrive at a MASTER record.
  • Extensively worked on performance tuning of Informatica and IDQ mappings.
  • Created reusable components, reusable transformations and mapplets to be shared among the project team.
  • Extensively used Informatica Data Quality transformations – Labeler, Parser, Standardizer, Match, Association, Consolidation, Merge, Address Validator, Case Converter, and Classifier.
  • Extensively used built-in reference data such as token sets, reference tables and regular expressions, and created new reference tables to identify noise data and standardize input data.
  • Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter.
  • Created various data quality mappings in the Informatica Data Quality tool and imported them into Informatica PowerCenter as mappings and mapplets.
  • Used TDM to mask sensitive data in Dev, QA environments.
  • Extensively used XML, XSD / schema files as source files, parsed incoming SOAP messages using XML parser transformation, created XML files using XML generator transformation.
  • Worked extensively with the Oracle external loader - SQL Loader - to move data from flat files into Oracle tables.
  • Coordinated with QA team during QA environment build, reviewing test cases, test execution and defect assignments.
  • Extensive knowledge of the defect tracking tool TRAC and of RMS as a version control management tool.
  • Assisted in preparing implementation documents for every release, worked on initial loads and data catchup process during implementations and provided on-call support for first few days of execution.
  • Extensive experience in PL/SQL programming, stored procedures, functions and triggers.
  • Built UNIX/Linux shell scripts for running Informatica workflows, data cleansing, purging, deletion and data loading, and for the ELT process.
  • Used Informatica PowerExchange to connect to Netezza and loaded huge amounts of data using the Bulk Writer external loader.
  • Knowledge of Business Objects; updated an existing universe with new data objects and classes using Universe Designer, built joins between new tables, and used contexts to avoid loops and Cartesian products.
  • Used Jenkins as a packaging and deployment tool for migrating ETL & Unix components.
  • Created the procedure document of implementation steps for every release, covering all the UNIX and Informatica objects and any catch-up process that needed to be done.
  • Provided on-call support for newly implemented components and the existing production environment and made sure the SLA was met.

Environment: Informatica Power Center 9.6, Informatica Data Quality (IDQ) 9.6, Informatica Data Explorer (IDE) 9.6, TDM, Netezza, Bulk Writer, Teradata 13, Oracle 11i, DB2 10.1, SQL Server 2012, SQL Developer, SQL Loader, Fastload, Multiload, Tpump, Fastexport, Teradata Parallel Transporter (TPT), Teradata SQL assistant, BTEQ, T-SQL, PL/SQL, RMS, Linux, AIX, Toad, Winsql, Putty, UltraEdit, PowerExchange for mainframes, MQ Series, PowerExchange for Salesforce, XML, UC4, Control-M, Rally, Enterprise Manager, Autosys, JIL Scripts, Jenkins, Lotus Notes, JIRA, Unix shell scripting, Microsoft Visio, XML Spy, Business Objects XI R3.

Confidential, Minneapolis, MN

Senior Informatica & IDQ developer

Responsibilities:
  • Created the design for the extraction process from legacy systems using combined techniques of data replication and change data capture.
  • Completed the gap analysis, which included identifying gaps between the downstream partner requests and the source system files, and filling those gaps either by rejecting the downstream partner's requests or by requesting additional files from the source system.
  • Worked on EDW integration, creating transactional and periodic/accumulating fact table grains, and on the development of the star-schema base data model and dimension change strategies.
  • Worked extensively with Teradata utilities - Fastload, Multiload and Tpump to load huge amounts of data from flat files into Teradata database, Oracle external loader – SQL Loader to load huge amounts of data into Oracle database, DB2 external loader – LOAD to load huge amounts of data into DB2 database.
  • Created explain plans for long-running queries and worked with DBAs to identify and resolve bottlenecks by adding appropriate indexes (see the explain-plan sketch after this list).
  • Extensively used Informatica web services Consumer transformation to invoke 3rd party web services on Billing, Payments.
  • Enabled Informatica workflows as web service to be invoked and used by different client systems.
  • Created test Java programs to test the Informatica web services, and also tested them with the Try-It option on the Web Services Hub.
  • Extensively used Informatica Data Quality transformations – Labeler, Parser, Standardizer, Match, Association, Consolidation, Merge, Address Validator, Case Converter, and Classifier.
  • Extensively used in-built reference data such as – token sets, reference tables and regular expressions and created new set of reference tables to identify the noise data and standardization of input data.
  • Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter.
  • Extensively used XML parser transformations to parse results from Web services output i.e. SOAP messages.
  • Created highly optimized ETL processes to move the data from legacy systems, DB2 and flat files into Oracle database.
  • Extensively used Informatica DVO (Data Validation Option) for data comparison between source and target tables, data validation after workflow completion.
  • Used Informatica DVO to create table pairs, test rules and SQL view options for various data validation scenarios, and used the RunTests command to execute the table pairs and send out emails with the data validation status.
  • Extensively used Informatica DVO for unit testing in Dev environment.
  • Involved in performance tuning of existing SQL queries.
  • Involved in performance tuning of Informatica mappings.
  • Extensive knowledge of banking products, equities and bonds.
  • Created complex ETL Informatica procedures to load data from legacy systems.
  • Prepared implementation document for moving the code from development to QA to production environments.
  • Worked with QA team and implementation team during different phases of the project.
  • Provided on-call support for newly implemented components and the existing production environment and made sure the SLA was met.
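
A small Oracle sketch of the explain-plan workflow referenced above; the orders and customers tables are placeholders:

    -- Write the execution plan for a candidate query into PLAN_TABLE.
    EXPLAIN PLAN FOR
    SELECT o.order_id, c.cust_name
    FROM orders o
    JOIN customers c ON c.cust_id = o.cust_id
    WHERE o.order_dt >= DATE '2015-01-01';

    -- Display the plan that was just generated.
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);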

Environment: Informatica PowerCenter 9.6, Informatica Data Quality (IDQ) 9.6, Informatica Data Explorer (IDE) 9.6, Informatica PowerExchange for Mainframes, Informatica DVO, SQL Server 2008, Oracle 11i, Teradata 12, Fastload, MQ series, Multiload, Tpump, Teradata Parallel Transporter (TPT), Netezza, Bulk writer, AIX-UNIX, T-SQL, COBOL, KORN, Shell script, Autosys, JIL Scripts, Rapid-SQL, Toad, SQL Loader, Load, PL/SQL, PVCS, ServiceNow, Visio, Rumba (Mainframes).

Confidential, Morris Plains, NJ

ETL Informatica & IDQ Developer

Responsibilities:
  • Created technical design specifications documents based on the functional design documents.
  • Developed standards and procedures for transformation of data as it moves from source system to data warehouse, shared it among the other team members.
  • Involved in source data profiling and data analysis.
  • Implemented HIPAA (Health Insurance Portability and Accountability Act) standards.
  • Used Informatica PowerCenter to scrub sensitive information related to 3rd-party entities such as other medical providers and patients, including social security numbers and account numbers.
  • Worked extensively with the DB2 LOAD utility to load huge amounts of data from flat files into the DB2 database.
  • Extensively used the DB2 INGEST command to load real-time data into DB2 tables without causing interruptions to business users (see the INGEST sketch after this list).
  • Used Oracle SQL Loader to load flat files into Oracle database.
  • Extensively used Informatica to load data into SQL Server database.
  • Created explain plans for long-running queries and worked with DBAs to identify and resolve bottlenecks by adding appropriate indexes.
  • Created highly optimized ETL processes to move the data from legacy systems, DB2 and flat files into SQL Server database.
  • Involved in performance tuning of existing SQL queries.
  • Involved in performance tuning of Informatica mappings.
  • Extensive knowledge of banking products, equities and bonds.
  • Created complex ETL Informatica procedures to load data from legacy systems.
  • Prepared implementation document for moving the code from development to QA to production environments.
  • Worked with QA team and implementation team during different phases of the project.
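
A hedged DB2 sketch of the INGEST pattern mentioned above, as it would run from the DB2 command line processor; the file path, field names and target table are hypothetical:

    -- Row-level SQL inserts keep the target table available to
    -- readers while the file is being loaded.
    INGEST FROM FILE /data/in/claims.del
        FORMAT DELIMITED BY ','
        (
            $claim_id   INTEGER EXTERNAL,
            $member_id  INTEGER EXTERNAL,
            $claim_stat CHAR(8)
        )
        INSERT INTO claims.claim_fact (claim_id, member_id, claim_stat)
        VALUES ($claim_id, $member_id, $claim_stat);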

Environment: Informatica Power Center 9.6.1/9.5, Informatica Data Quality (IDQ) 9.6, Informatica Data Explorer (IDE) 9.6, DB2 9.7, Oracle 10g, SQL Server 2008, Business Objects XI R2, Linux, Windows XP, SQL, PL/SQL, XML, T-SQL, Toad 9.5, Autosys, Korn shell, Erwin.

Confidential, Owings Mills, MD

Informatica Consultant

Responsibilities:
  • Created technical design specifications documents based on the functional design documents.
  • Developed standards and procedures for transformation of data as it moves from source system to data warehouse.
  • Involved in source profiling and data analysis.
  • Worked extensively with the DB2 LOAD utility to load huge amounts of data from flat files into the DB2 database.
  • Created explain plans for long-running queries and worked with DBAs to identify and resolve bottlenecks by adding appropriate indexes.
  • Created highly optimized ETL processes to move data from legacy systems, DB2 and flat files into the Oracle database.
  • Used SQL Loader to load huge volumes of data into Oracle tables.
  • Involved in performance tuning of existing SQL queries.
  • Involved in performance tuning of Informatica mappings.
  • Prepared implementation document for moving the code from development to QA to production environments.
  • Worked with QA team and implementation team during different phases of the project.

Environment: Informatica PowerCenter 9.1, Informatica PowerExchange, Oracle 10g, DB2 8.6, AIX-UNIX, Shell script, Rapid-SQL, Toad, SQL Loader, Load, PL/SQL, PVCS, Visio, Autosys.
