
Informatica MDM Product 360 PIM Consultant Resume


PROFESSIONAL SUMMARY:

  • 20+ years of professional IT consulting experience.
  • Extensive client- and business-facing, functional, and technical experience delivering projects end to end.
  • Expert in Data Integration, Data Lakes, Data Warehousing, Data Governance, Business Intelligence (BI), Data Modelling, OLAP, ETL tools, centralized data and log processing using the ELK stack, and OLTP client-server product development projects.
  • Worked on Informatica PowerCenter, Informatica IDQ, Informatica Analyst, Informatica Administrator, Informatica DIH (Data Integration Hub), Informatica Advanced Data Transformation (ADT), Informatica AXON, Informatica EDC, Confidential S3, Confidential Glue, Confidential Athena, Databricks Spark, Hadoop, ELK stack, IBM InfoSphere Information Server, Information Analyzer, IBM Information Server FastTrack, Cognos, Business Objects.
  • Strong analytical and problem-solving skills with effective verbal and written communication skills.
  • Great planning and organizational skills with a proven track record of multi-tasking while meeting targets and exceeding expectations.

TECHNICAL SKILLS:

Informatica ETL Products: Informatica PowerCenter 10.2.x/10.1/9.x/8.x/7.x/6.x/5.x, Informatica Data Integration Hub (DIH), Informatica Data Quality (IDQ) 10.2/10.1, Informatica Advanced Data Transformation (ADT) 10.2/10.1, INFA MDM Product 360, Informatica MDM

IBM ETL Products: IBM Infosphere Information Server (DataStage) 11.5/9.1/8.5/8.1/7.5.1 (Enterprise Edition- PX)

Databricks ETL & Other: Databricks Spark big data ETL, Scala 2.11

Cloud Platforms: Amazon AWS, GCP and Azure

Hadoop Administration: Hadoop Administration, HortonWorks Administration

Apache Products: Apache Spark, PIG, HIVE, SQOOP, HBASE, Cassandra, Oozie, Zookeeper, Ambari, Flume, Impala, Kafka

ELK/Elastic Stack: ElasticSearch, Logstash, Kibana, Filebeat

Data Governance: Informatica EDC/EIC, Informatica AXON

Data Profiling: Informatica Data profiling, Informatica Data Analyst, IBM Information Analyzer (IA)

Data Archive/Purge: Informatica Information Lifecycle Management (ILM)

Other ETL tools: Oracle Warehouse Builder (OWB), Oracle Data Integrator (ODI), Hummingbird Genio ETL, Microsoft SSIS, Hitachi Vantara - Pentaho Data Integration, SAP Data Services (BODS), SAP SDI, Talend ETL

Tools: Information Server Manager, Information Services Director, Fast Track, Business Glossary, IBM Metadata Workbench, IBM Master Data Management (MDM)

Business Intelligence: IBM Cognos 10.x/8.x, COGNOS suite (Impromptu 7.x/6.0, PowerPlay Transformer 6.6/7.x, PowerPlay 6.x/7.x, Access Manager, Report Administration, Server Administration, Cognos Upfront), Cognos ReportNet 1.1 (Cognos Framework Manager, Report Studio, Query Studio, Cognos Connection), Business Objects 5.1.6/XI, Crystal Reports, SAS Base, SQL Server Analysis Services (SSAS), Tableau, QlikView, Tibco Spotfire, Hitachi Vantara Business Analytics - Pentaho Report Designer, SSRS, Microsoft Power BI

RDBMS: Oracle 12C/11G/10G/9i/8.x/7.x, MS SQL Server 2012/2008/2005/2000, Sybase 11.9.2/12.5/15.2/15.5, DB2 v10/v9.5, Teradata, MySQL

Non-RDBMS: MongoDB, Cassandra, HBase

Data Model tool: ERWIN

Languages: PL/SQL, SQL*Loader, C, C++, COBOL

Scripting Languages: UNIX Shell Script, Python, Perl, MS-DOS Script

Incident Management: Remedy, HP SM7, HP SM 9

GUI: Visual Basic 6.0, Visual InterDev and VBA

Internet Technologies: ASP, COM, DCOM, MTS, VB script, Java, Java Script, XML

Tools: TOAD, SQL Navigator, DB-Artisan, XML Spy, Ultra Edit, DB Visualizer, Rapid SQL Developer, SOAPUI

Document Management: Microsoft SharePoint

Defect Tracking: HPQC 10, JIRA

Version Management Tools: GitHub, SVN, Visual Source Safe, PVCS, CVS, MKS, ClearCase

DevOps: Jenkins, IBM UrbanCode

ERP: SAP-ABAP 4.6B

Schedule Management: JAMS, TIDAL V5/V6, AutoSys, CRONTAB

MS Office: MS-Access, MS-Excel, MS-PowerPoint

WORK EXPERIENCE:

Confidential, Joliet, IL

Informatica MDM Product 360 PIM Consultant

Responsibilities:

  • Attended meetings with stakeholders, business users, and the client technical team to gather requirements for Informatica PIM to receive data from SAP MDM, SAP ECC, and SAP EHSM.
  • Analyzed the client's current SAP MDM system to plan the data migration to Informatica PIM.
  • During PIM demos to the client, answered questions on the Informatica PIM Rich Client, the Informatica Web Client, and parts of Media Manager.
  • Transformed source Excel files to load data into Informatica PIM structure preset values (a minimal sketch follows this list).
  • Transformed source PDF data into CSV and loaded it into Informatica PIM structure features.
  • Prepared source files for structure groups and structure group features to be loaded into Informatica PIM.
  • Used the Informatica Developer tool to read the source files and transform them into the PIM-required format.
  • Used Excel Power Query to transform some of the data sent to PIM.
  • Coordinated between the onsite and offshore teams to get clarifications from the client.
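
A minimal Python sketch of the kind of Excel-to-CSV preparation described above, assuming pandas; the sheet name, column names, and semicolon delimiter are illustrative placeholders, not the actual project mapping.

```python
# Hedged sketch: convert a source Excel workbook into a flat, delimited file
# that can be mapped to a PIM structure/feature import. Sheet name, column
# names, and delimiter are assumptions for illustration only.
import pandas as pd

SOURCE_XLSX = "source_attributes.xlsx"      # hypothetical input workbook
TARGET_CSV = "pim_structure_features.csv"   # hypothetical PIM-ready output


def excel_to_pim_csv(source_path: str, target_path: str) -> None:
    # Read the sheet that holds structure group feature values (assumed name).
    df = pd.read_excel(source_path, sheet_name="Features", dtype=str)

    # Trim whitespace and drop rows without an item identifier.
    df = df.apply(lambda col: col.str.strip())
    df = df.dropna(subset=["ItemNo"])

    # Rename source columns to the attribute names expected by the import mapping.
    df = df.rename(columns={"ItemNo": "Item No.",
                            "GroupId": "Structure Group",
                            "FeatureName": "Feature",
                            "FeatureValue": "Value"})

    # The import file is assumed to be a semicolon-delimited UTF-8 CSV.
    df.to_csv(target_path, sep=";", index=False, encoding="utf-8")


if __name__ == "__main__":
    excel_to_pim_csv(SOURCE_XLSX, TARGET_CSV)
```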

Environment: Informatica Product 360 (PIM), Informatica Developer 10.2.1, SQL Server 2016, Windows 2016

Confidential

ETL Solution Architect/Specialist/Team lead

Responsibilities:

  • Owned the overall ETL design and architecture of the Bedrock project and the Data Quality process flow.
  • Interviewed senior Informatica ETL consultants to set up a team and interviewed INFA developers for the support team.
  • Based on the Bedrock data delivery requirements, designed logical and physical database objects in SQL Server 2012.
  • Designed and developed an XML and JSON file splitter using Informatica Advanced Data Transformation (ADT) to store XML and JSON source records as-is in the master tables.
  • Developed XML and JSON parsing using Advanced Data Transformation; used the User Defined Transformation (UDT) to invoke the ADT code in INFA PowerCenter.
  • Designed and developed custom DIH publication workflows and mappings that read data from XML, JSON, and CSV files to publish data.
  • Coordinated with the source data providers Murex and MDS and with other departments.
  • Developed custom DIH subscriptions to deliver XML, JSON and CSV data to consumers
  • Developed DIH auto publications and subscriptions to deliver data to DIH API adapter
  • Created DIH export and import specifications to move objects between environments.
  • Created XML & JSON parsers, PowerCenter mappings & workflows, DIH pubs & subs to deliver data for reconciliation between MarkIT data and MDS data.
  • Developed Python scripts to parse JSON documents and automate day-to-day support tasks.
  • Designed and developed reconciliation code in Python to download Confidential S3 files and strip the attached metadata before formatting them (a minimal sketch follows this list).
  • Created scripts to export and import INFA PowerCenter objects for use by the Jenkins build and UrbanCode deployments.
  • Provided a demo to directors and stakeholders on the initial deployment of Informatica DIH.
  • Coordinated with QA while testing was in progress.
  • Coordinated with various source teams to prepare the data for UAT, and with business users while they performed UAT.
  • Used Microsoft Power BI to generate DQ reports from the custom DQ repository.
  • Created deployment JIRAs to obtain approvals for deploying code to production.
  • Arranged multiple meetings to walk the supporting team through the overall ETL design and the developed DIH pubs and subs, and to obtain approval to deploy to production.
  • Created Jenkins jobs to commit the code to GitHub and used GitHub as the version management tool.
  • Deployed code to production using the IBM UrbanCode DevOps tool.
  • Supported INFA DIH and INFA PowerCenter during the initial deployment of the code to production.
  • Provided L4 support on a rotational basis.
  • Led the team and resolved any technical issues they came across.
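
A minimal Python sketch of the S3 reconciliation helper described above, using boto3; the bucket name, key prefix, and the assumption that metadata lines start with "#" are placeholders, not the actual Confidential conventions.

```python
# Hedged sketch: download files from an S3 bucket and drop the metadata lines
# prepended to each file before reconciliation. Bucket, prefix, and the "#"
# metadata-line convention are assumptions for illustration only.
import boto3

BUCKET = "recon-source-bucket"     # hypothetical bucket name
PREFIX = "markit/daily/"           # hypothetical key prefix


def download_and_clean(bucket: str, prefix: str, out_dir: str = ".") -> list:
    s3 = boto3.client("s3")
    cleaned_files = []
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            key = obj["Key"]
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
            # Keep only data rows; metadata lines are assumed to start with "#".
            data_lines = [line for line in body.splitlines() if not line.startswith("#")]
            local_path = f"{out_dir}/{key.split('/')[-1]}"
            with open(local_path, "w", encoding="utf-8") as fh:
                fh.write("\n".join(data_lines))
            cleaned_files.append(local_path)
    return cleaned_files


if __name__ == "__main__":
    print(download_and_clean(BUCKET, PREFIX))
```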

Environment: Informatica PowerCenter 10.1, Informatica Advanced Data Transformation (ADT) 10.1, Informatica Data Integration Hub (DIH) 10.1, Windows server 2012 R2, JAMS Scheduler, SQL Server 2012

Confidential

Sr. ETL Consultant/Lead Consultant

Responsibilities:

  • Went through the ECO business requirements for consuming Instrument Mapping data.
  • Designed and developed DataStage Jobs and sequences in IBM Infosphere DataStage
  • Unit tested the jobs and deployed them to QA and UAT.
  • Supported QA and UAT.

Environment: IBM InfoSphere Information Server DataStage 11.5, Oracle 11G, RedHat Linux

Confidential

Informatica and Data Governance Tools Administrator

Responsibilities:

  • Performed capacity planning and submitted a request to the infrastructure team to build the INFA servers for DEV, QA, UAT, and PROD.
  • Submitted a request for NAS storage and coordinated with the infrastructure team to get the task done.
  • Held conference calls with the INFA sales head, professional services directors, and other product R&D specialists on the product features.
  • Suggested new enhancements and features for the INFA DIH and INFA PowerCenter products.
  • Downloaded INFA software and installed INFA DIH, Informatica Developer, Informatica PowerCenter, and INFA Information Lifecycle Management (ILM) on the DEV, QA, UAT, and PROD environments.
  • Configured INFA DIH and INFA PowerCenter in the Informatica Administrator and configured the INFA ILM product within ILM itself; configured the domain, Analyst Service, Content Management Service, Data Integration Service, PowerCenter Integration Service, Repository Service, Web Services Hub, Catalog Service, and Cluster Service.
  • Created required DB connections in the Informatica administrator.
  • Downloaded INFA AXON, INFA IDQ and INFA EDC/EIC products and installed the products.
  • Configured INFA AXON, INFA IDQ and INFA EDC by creating various services in the Informatica Administrator.
  • Installed INFA DIH 10.2 and configured it to run workflows developed in Informatica Developer.
  • Configured LDAP (Microsoft Active Directory) in the Informatica Administrator for Windows authentication.
  • Set up IBM MQ to load messages to MQ and configured connection factories and connection destinations.
  • Configured JNDI and JMS in the Informatica Workflow Manager to send messages to IBM MQ via JMS.
  • Set up an SFTP server on Windows 2012 R2 to receive files from source systems into DIH.
  • Set up meetings with the DevOps team to implement the Jenkins build and IBM UrbanCode deployments to the target environments.
  • Developed INFA repository backup scripts to back up the repositories periodically (a minimal sketch follows this list).
  • Set up meetings with the Zabbix server monitoring team and provided them with commands to monitor the servers and auto-restart the services if any of the INFA services goes down.
  • Created server build runbooks for the supporting team.
  • As the overall ETL administrator, monitored the DEV, QA, UAT, and PROD servers daily to avoid interruptions and supported developers whenever they ran into INFA server issues.
  • Opened tickets with INFA support for product issues and coordinated with them to fix the issues or obtain EBF patches.
  • Coordinated with the DBAs on DB issues.
  • Created groups and users and granted privileges in the Informatica Administrator.
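
A minimal Python sketch of the periodic repository backup described above, wrapping the pmrep command line; the repository, domain, account names, and paths are placeholders, and the exact pmrep options should be verified against the installed PowerCenter version.

```python
# Hedged sketch: back up a PowerCenter repository with pmrep on a schedule.
# All names and paths below are hypothetical; credentials come from the
# environment rather than being hard-coded.
import datetime
import os
import subprocess

PMREP = "pmrep"                    # assumes pmrep is on the PATH
REPO = "PC_REPO_DEV"               # hypothetical repository name
DOMAIN = "Domain_INFA_DEV"         # hypothetical domain name
USER = "backup_svc"                # hypothetical service account
BACKUP_DIR = "/infa/backups"       # hypothetical backup location


def backup_repository() -> str:
    password = os.environ["INFA_REPO_PASSWORD"]
    stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
    backup_file = f"{BACKUP_DIR}/{REPO}_{stamp}.rep"

    # Connect to the repository, then back it up to a timestamped file.
    subprocess.run([PMREP, "connect", "-r", REPO, "-d", DOMAIN,
                    "-n", USER, "-x", password], check=True)
    subprocess.run([PMREP, "backup", "-o", backup_file], check=True)
    return backup_file


if __name__ == "__main__":
    print(backup_repository())
```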

Environment: Informatica PowerCenter 10.1 and 10.2, Informatica Advanced Data Transformation (ADT) 10.1 and 10.2, Informatica Data Integration Hub (DIH) 10.1 and 10.2, Windows server 2012 R2, Redhat Linux 7.3, SQL Server 2012 & SQL Server 2016

Confidential

Sr. ETL Consultant/Lead Consultant

Responsibilities:

  • Held initial discussions on validating the tools.
  • Provided a demo to project managers on how Confidential Glue works in the ETL space and how PySpark runs from Confidential Glue.
  • Performed analysis on Databricks Spark using Python (PySpark) and HQL (Hive).
  • Worked on Databricks Spark to process data and deliver it to downstream consumers.
  • Used Scala to read the RIMES files in the Databricks environment and converted them to Parquet format.
  • Developed Spark jobs on Databricks to read files from Confidential S3, identify broken links between transactional and reference data, and load the results back to S3 (a minimal sketch follows this list).
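
A minimal PySpark sketch of the broken-link check described above; the S3 paths, file format, and join key are illustrative assumptions, not the actual Confidential datasets.

```python
# Hedged sketch: flag transactional rows whose reference key has no match in
# the reference data and write the exceptions back to S3. Paths, formats, and
# the "instrument_id" key are assumptions for illustration only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("broken-link-check").getOrCreate()

TXN_PATH = "s3://data-bucket/transactions/"     # hypothetical input paths
REF_PATH = "s3://data-bucket/reference/"
OUT_PATH = "s3://data-bucket/recon/broken_links/"

# Read both datasets (assumed CSV with headers).
txn = spark.read.option("header", "true").csv(TXN_PATH)
ref = spark.read.option("header", "true").csv(REF_PATH)

# Transactions whose instrument_id has no matching reference row are broken links.
broken = txn.join(ref, on="instrument_id", how="left_anti")

# Persist the exceptions back to S3 in Parquet for downstream consumers.
broken.write.mode("overwrite").parquet(OUT_PATH)
```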

Environment: Confidential S3, Confidential Glue ETL, Confidential Athena, Confidential Redshift Spectrum, Confidential QuickSight, Databricks Spark 2.3.1, Scala 2.11

Confidential

Sr. ETL Consultant/Lead Consultant

Responsibilities:

  • Discussed the requirements for processing application logs with the business teams.
  • Architected the overall project, sized the servers, and submitted a request to the infrastructure team to build the ELK servers.
  • Set up the required folder structure on the Linux servers.
  • Installed the Elasticsearch, Logstash, and Kibana stack on the Linux servers.
  • Configured the ELK products to process and store the log data and display it in Kibana.
  • Installed Filebeat on the application servers.
  • Designed and developed ELK stack solutions.
  • Developed Logstash pipeline configurations (a minimal sketch follows this list).
  • Conducted a demo for senior portfolio managers and project managers to explain ELK.
  • Configured Filebeat to deliver the application logs to the ELK server.
  • Indexed the log data for delivery to business users.
  • Designed and created Kibana visualizations and dashboards.
  • Provided day-to-day support of the ELK stack server and applications.
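
A minimal sketch of a Logstash pipeline of the kind described above: receive application logs from Filebeat, parse them, and index them into Elasticsearch. The port, grok pattern, host, and index name are illustrative assumptions, not the actual project configuration.

```
input {
  beats {
    port => 5044
  }
}

filter {
  # Assumes log lines of the form "<ISO8601 timestamp> <level> <message>".
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:log_time} %{LOGLEVEL:level} %{GREEDYDATA:log_message}" }
  }
  date {
    match => ["log_time", "ISO8601"]
    target => "@timestamp"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "app-logs-%{+YYYY.MM.dd}"
  }
}
```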

Environment: Elasticsearch 6.3, Logstash 6.3, Kibana 6.3, Filebeat 6.2, RedHat Linux 7.3

Confidential

Sr. ETL Consultant/Lead Consultant

Responsibilities:

  • Worked on the SRDR Rating Service.
  • Migrated GSS DataStage ETL jobs from 8.5 to 11.5.
  • Profiled BDR file data using Information Analyzer.
  • Used Tableau for CDR and BDR reporting.
  • Worked on the Hadoop Data Lake project, developed to load data into HDFS.

Environment: IBM InfoSphere Information Server 11.5/8.5, IBM Information Analyzer 8.5, Tableau 9, Red Hat Linux, TIDAL scheduler, SQL Server 2008, Sybase 15.2.

Confidential

Sr. ETL Consultant

Responsibilities:

  • Designed and developed CASL enhancements and FATCA reconciliation projects.
  • Point of contact for any technical DataStage issues

Environment: IBM InfoSphere Information Server 8.5, DB2 10, IBM InfoSphere MDM Server 10, GNU Linux, ClearCase.

Confidential

Sr. ETL Consultant

Responsibilities:

  • Analyzed the client's current infrastructure.
  • Went through the client's requirements for low-latency alert messaging.
  • Designed the solution to load the low-latency alert messages into the database.
  • Designed the XML schema definition for the low-latency messages.
  • Developed DataStage jobs to demonstrate the design to the client and show how to process the real-time messages into the database.
  • Used the MQ Connector, XML Input, Difference, Join, Lookup, Data Set, and Oracle Enterprise stages to demonstrate the low-latency alert messaging.
  • Created sequences to run the jobs continuously.
  • Designed the invalid-message log file and the strategy to retrieve invalid XML messages.
  • Reviewed the jobs developed by the IBM India team and provided recommendations for improvement.
  • Reviewed the client's infrastructure and DataStage setup and suggested changes to the environment.
  • Provided the low-latency design document to the client and recommended following the designed approach for processing low-latency messages as well as error handling.

Environment: IBM WebSphere DataStage EE 8.1, Oracle 10G/11G, WebSphere MQ, Linux
