Informatica MDM Developer Resume
Mountain View, CA
SUMMARY
- 5+ years of experience in software development, maintenance, testing, and troubleshooting of Informatica Master Data Management (MDM), ETL applications, data quality, and data integration.
- Informatica Certified Developer.
- Involved in all phases of SDLC from analysis of business requirements and planning to development and deployment.
- Expertise in installing, configuring, and managing Informatica MDM Hub Server, Informatica MDM Hub Cleanse Server, cleanse adapters (Address Doctor), and the Informatica MDM Hub Resource Kit.
- Experienced in configuring landing tables, staging tables, base objects as per the data model, data sources and business rules.
- Skilled in configuring lookup-table base objects and creating relationships between base objects and lookup tables.
- Designed & developed multiple cleansing and standardization scenarios using Address Doctor.
- Analyzed business data, defined the Match and Merge rules, and helped the business develop trust scores for the MDM Hub.
- Analyzed source systems for erroneous data, duplicate records, and data-integrity issues.
- Identified the golden record (Best Version of the Truth, BVT) for customer data by cleansing the data and merging duplicate records coming from different source systems.
- Skilled in configuring batch groups to run stage, load, and match jobs in parallel or sequential order according to the dependencies between base objects.
- Performed all Informatica administration functions, such as managing user access accounts and roles and setting standards and associated privileges.
- Configured Hierarchy Manager for the MDM Hub implementation, including hierarchies, relationship types, packages, and profiles, using the Hierarchies tool in the Model workbench.
- Worked with the Java SIF API and implemented web services.
- Used the SIF API to integrate MDM data with external applications.
- Experienced in using the Java Message Service (JMS); a JMS consumer sketch follows this summary.
- Experience in developing Web Applications with various Open Source frameworks: Spring Framework, Spring MVC, and Hibernate.
- Used SQL*Loader to load data from files into Oracle tables.
- Created mappings in Mapping Designer to load data from various sources using complex transformations such as Transaction Control, Lookup (connected and unconnected), Joiner, Sorter, Aggregator, Update Strategy, Filter, and Router.
- More than two years of experience in Hadoop development and administration, built on six years of experience in Java application development.
- Good knowledge of Hadoop ecosystem, HDFS, Big Data, RDBMS.
- Experienced in working with big data and the Hadoop Distributed File System (HDFS).
- Hands-on experience with ecosystem tools such as Hive, Pig, Sqoop, MapReduce, Flume, and Oozie.
- Strong knowledge of Hadoop, Hive, and Hive's analytical functions.
- Captured data from existing databases that expose SQL interfaces using Sqoop.
- Proficient in building Hive, Pig, and MapReduce scripts.
- Implemented proofs of concept on the Hadoop stack and various big data analytics tools, including migrations from databases such as Teradata, Oracle, and MySQL to Hadoop.
- Successfully loaded files into Hive and HDFS from MongoDB, Cassandra, and HBase.
- Loaded datasets into Hive for ETL operations.
- Good knowledge on Hadoop Cluster architecture and monitoring the cluster.
- Experience using DbVisualizer, ZooKeeper, and Cloudera Manager.
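The JMS experience above can be illustrated with a minimal consumer sketch; the JNDI names, queue name, and timeout are placeholder assumptions, not values from an actual project.

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageConsumer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.InitialContext;

public class JmsEventReader {
    public static void main(String[] args) throws Exception {
        // JNDI names are placeholders; real names depend on the application server setup.
        InitialContext ctx = new InitialContext();
        ConnectionFactory factory = (ConnectionFactory) ctx.lookup("jms/ConnectionFactory");
        Queue queue = (Queue) ctx.lookup("jms/mdmEventQueue");

        Connection connection = factory.createConnection();
        try {
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageConsumer consumer = session.createConsumer(queue);
            connection.start();

            // Wait up to five seconds for a single text message and print its body.
            TextMessage message = (TextMessage) consumer.receive(5000);
            if (message != null) {
                System.out.println("Received event: " + message.getText());
            }
        } finally {
            connection.close();
        }
    }
}
```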
TECHNICAL SKILLS
EIM Tools: Informatica MDM Multi-Domain Edition (9.1.x/9.5.x/9.7.x), IDD, Informatica PowerCenter
Cleanse Adapters: Address Doctor
Databases: Oracle 9.x/10g/11g.
Developer Tools: TOAD, SQL Developer, HiveQL
ETL Tools: Informatica Power Center 9.x, Datastage.
Data Modeling: MS Visio 2000, Erwin
Programming Skills: Java, XML, MapReduce
Application Servers: JBoss, WebLogic
Package: MS Office (MS Access, MS Excel, MS PowerPoint, MS Word).
Hadoop Ecosystem: MapReduce, Sqoop, Hive, Pig, HBase, Cassandra, HDFS, ZooKeeper
PROFESSIONAL EXPERIENCE
Confidential, Boise, ID
Application Support Lead
Responsibilities:
- Worked on an Agile SDLC model to deploy our product and Payer data.
- Payer data was extracted from Facets and maintained across the organization.
- Participated in review meetings with users to build the payer MDM and provide real-time integration between MDM and downstream systems.
- Developed both provider and payer MDM and maintained their relationships across the organization.
- Implemented data and services integration with the customer registry.
- Designed and created base objects, staging tables, mappings, and transformations per business requirements.
- Created mappings to perform tasks such as cleansing the data and populating it into staging tables.
- Used small sample data sets to complete DEV match iterations.
- Defined multiple conservative match rules for IDL.
- Used multiple server properties and cleanse properties for matching.
- Performed External match jobs.
- Used Metadata manager for validating, promoting, importing and exporting repositories from development environment to testing environment.
- Involved with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
- Responsible for creating user groups and privileges and assigning roles to users using the Security Access Manager (SAM).
- Deployed a new MDM Hub for portals in conjunction with the user interface of the IDD application.
- Worked on integrating Hierarchies created in MDM Hub Console and IDD.
- Conducted a high-level review of SAM, covering the use of roles, the creation of users, and the assignment of users to roles.
- Implemented the Search, Add/Update, and Task Assignment features in IDD for the Admin user.
- Implemented Post-Landing, Pre-Staging, and Save Handler user exits using Java (Eclipse Mars) and integrated those user exits with the MDM Hub Console using the User Object Registry module.
- Configured pre-staging, post-staging, and pre-matching user exits and worked on SIF integration.
- Worked with the Java SIF API and implemented web services.
- Experienced in using the Java Message Service (JMS).
- Wrote Hive UDFs to extract data from staging tables (see the UDF sketch after this section).
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Involved in the regular Hadoop Cluster maintenance such as patching security holes and updating system packages.
- Managed Hadoop log files.
- Analyzed web log data using HiveQL.
- Experience in developing Web Applications with various Open Source frameworks: Spring Framework, Spring MVC, and Hibernate.
- Worked on a BPM tool and integrated it with Informatica MDM.
- Configured custom Java cleanse functions and user exits.
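As referenced in the Hive UDF bullet above, here is a minimal sketch of a simple Hive UDF; the class name and cleansing logic are illustrative, not the project's actual code.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Simple Hive UDF that trims and upper-cases a string column from a staging table.
// Hive resolves evaluate() by reflection, so one overload is defined per input type.
public final class TrimUpperUDF extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null;
        }
        return new Text(input.toString().trim().toUpperCase());
    }
}
```

Packaged into a JAR, the function would be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION and then called like any built-in function in queries over the staging tables.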
Confidential, Dayton, OH
Sr. Application Developer
Responsibilities:
- Worked on an Agile SDLC model to deploy our product and Payer data.
- Payer data was extracted from Facets and maintained across the organization.
- Participated in review meetings with users to build the payer MDM and provide real-time integration between MDM and downstream systems.
- Developed both provider and payer MDM and maintained their relationships across the organization.
- Implemented data and services integration with the customer registry.
- Designed and created base objects, staging tables, mappings, and transformations per business requirements.
- Created mappings to perform tasks such as cleansing the data and populating it into staging tables.
- Used small sample data sets to complete DEV match iterations.
- Defined multiple conservative match rules for IDL.
- Used multiple server properties and cleanse properties for matching.
- Performed External match jobs.
- Used Metadata manager for validating, promoting, importing and exporting repositories from development environment to testing environment.
- Involved with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
- Responsible for creating user groups and privileges and assigning roles to users using the Security Access Manager (SAM).
- Deployed a new MDM Hub for portals in conjunction with the user interface of the IDD application.
- Worked on integrating Hierarchies created in MDM Hub Console and IDD.
- Conducted a high-level review of SAM, covering the use of roles, the creation of users, and the assignment of users to roles.
- Implemented the Search, Add/Update, and Task Assignment features in IDD for the Admin user.
- Configured pre-staging, post-staging, and pre-matching user exits and worked on SIF integration.
- Worked with the Java SIF API and implemented web services.
- Experienced in using the Java Message Service (JMS).
- Wrote Hive UDFs to extract data from staging tables.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Involved in the regular Hadoop Cluster maintenance such as patching security holes and updating system packages.
- Managed Hadoop log files.
- Analyzed web log data using HiveQL (see the Hive JDBC sketch after this section).
- Experience in developing Web Applications with various Open Source frameworks: Spring Framework, Spring MVC, and Hibernate.
- Worked on a BPM tool and integrated it with Informatica MDM.
- Configured custom Java cleanse functions and user exits.
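A minimal sketch of the web-log analysis referenced above, running a HiveQL aggregation from Java over the HiveServer2 JDBC driver; the host, credentials, table, and column names are assumptions.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class WebLogHiveQuery {
    public static void main(String[] args) throws Exception {
        // HiveServer2 JDBC driver; the connection URL and credentials are placeholders.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://hive-host:10000/default", "etl_user", "");
             Statement stmt = conn.createStatement()) {
            // Count hits per day from an assumed web_logs table.
            ResultSet rs = stmt.executeQuery(
                "SELECT log_date, COUNT(*) AS hits FROM web_logs GROUP BY log_date");
            while (rs.next()) {
                System.out.println(rs.getString("log_date") + "\t" + rs.getLong("hits"));
            }
        }
    }
}
```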
Confidential, Cincinnati, OH
MDM Developer
Responsibilities:
- Worked on an Agile SDLC model to deploy our product and customer data.
- Implemented data and services integration with the customer registry.
- Designed and created base objects, staging tables, mappings, and transformations per business requirements.
- Created mappings to perform tasks such as cleansing the data and populating it into staging tables.
- Used small sample data sets to complete DEV match iterations.
- Defined multiple conservative match rules for IDL.
- Used multiple server properties and cleanse properties for matching.
- Performed External match jobs.
- Used Metadata manager for validating, promoting, importing and exporting repositories from development environment to testing environment.
- Involved with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
- Responsible for creating user groups and privileges and assigning roles to users using the Security Access Manager (SAM).
- Deployed a new MDM Hub for portals in conjunction with the user interface of the IDD application.
- Worked on integrating Hierarchies created in MDM Hub Console and IDD.
- Conducted a high-level review of SAM, covering the use of roles, the creation of users, and the assignment of users to roles.
- Implemented the Search, Add/Update, and Task Assignment features in IDD for the Admin user.
- Configured pre-staging, post-staging, and pre-matching user exits and worked on SIF integration.
- Worked with the Java SIF API and implemented web services (see the SIF call sketch after this section).
- Experienced in using the Java Message Service (JMS).
- Wrote Hive UDFs to extract data from staging tables.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
Environment: Informatica Multi-domain MDM 9.7.1, Informatica PowerCenter 9.1, JBoss EAP 6.2, Oracle 11g, SQL Developer, Address Doctor, IDD, SIF, TOAD, Windows Server 2003
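A minimal sketch of the SIF/web-service integration referenced above, posting a SIF request as XML over HTTP; the endpoint path and the request payload are placeholders (the real XML depends on the SIF API being called and the ORS configuration), so this only shows the general call pattern.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class SifHttpCall {
    public static void main(String[] args) throws Exception {
        // Endpoint and payload are placeholders; real values depend on the hub installation.
        URL endpoint = new URL("http://mdm-host:8080/cmx/request");
        String requestXml = "<!-- SIF request XML for the chosen API goes here -->";

        HttpURLConnection conn = (HttpURLConnection) endpoint.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "text/xml; charset=UTF-8");

        // Send the request body.
        try (OutputStream out = conn.getOutputStream()) {
            out.write(requestXml.getBytes(StandardCharsets.UTF_8));
        }

        // Print the HTTP status and the raw XML response for inspection.
        System.out.println("HTTP status: " + conn.getResponseCode());
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```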
Confidential, Mountain View, CA
Informatica MDM Developer
Responsibilities:
- Worked on an Agile SDLC model to deploy our product and customer data.
- Implemented data and services integration with the customer registry.
- Designed and created base objects, staging tables, mappings, and transformations per business requirements.
- Created mappings to perform tasks such as cleansing the data and populating it into staging tables.
- Used small sample data sets to complete DEV match iterations.
- Defined multiple conservative match rules for IDL.
- Used multiple server properties and cleanse properties for matching.
- Performed External match jobs.
- Used Metadata manager for validating, promoting, importing and exporting repositories from development environment to testing environment.
- Involved with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
- Responsible for creating user groups and privileges and assigning roles to users using the Security Access Manager (SAM).
- Deployed a new MDM Hub for portals in conjunction with the user interface of the IDD application.
- Worked on integrating Hierarchies created in MDM Hub Console and IDD.
- Conducted a high-level review of SAM, covering the use of roles, the creation of users, and the assignment of users to roles.
- Implemented the Search, Add/Update, and Task Assignment features in IDD for the Admin user.
- Developed MapReduce programs to parse the raw data, populate staging tables, and store the refined data in partitioned tables in the EDW (see the MapReduce sketch after this section).
- Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
- Assisted with data capacity planning and node forecasting.
- Implemented Post-Landing, Pre-Staging, and Save Handler user exits using Java (Eclipse Mars) and integrated those user exits with the MDM Hub Console using the User Object Registry module.
- Configured pre-staging, post-staging, and pre-matching user exits and worked on SIF integration.
- Worked with the Java SIF API and implemented web services.
- Experienced in using the Java Message Service (JMS).
- Experience in developing Web Applications with various Open Source frameworks: Spring Framework, Spring MVC, and Hibernate.
- Worked on a BPM tool and integrated it with Informatica MDM.
- Configured custom Java cleanse functions and user exits.
Environment: Informatica Multi-domain MDM 9.7.1, Informatica PowerCenter 9.1, JBoss EAP 6.2, Oracle 11g, SQL Developer, Address Doctor, IDD, SIF, TOAD, Windows Server 2003
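A minimal sketch of a map-only MapReduce job of the kind referenced above for parsing raw records before the staging load; the pipe delimiter, field positions, and input/output paths are illustrative assumptions.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Map-only job that keeps well-formed pipe-delimited records and normalizes a few fields
// before they are loaded into staging tables.
public class RawRecordParseJob {

    public static class ParseMapper extends Mapper<LongWritable, Text, NullWritable, Text> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\\|", -1);
            if (fields.length < 3 || fields[0].isEmpty()) {
                return; // drop malformed records
            }
            String cleaned = String.join("|",
                    fields[0].trim(), fields[1].trim().toUpperCase(), fields[2].trim());
            context.write(NullWritable.get(), new Text(cleaned));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "raw-record-parse");
        job.setJarByClass(RawRecordParseJob.class);
        job.setMapperClass(ParseMapper.class);
        job.setNumReduceTasks(0); // map-only: mapper output is written directly
        job.setOutputKeyClass(NullWritable.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```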
Confidential
MDM Developer
Responsibilities:
- Participated in Design and Build for Supply Chain domains namely Supplier, Manufacture, Location, Product and Contract.
- Configured 7 source systems and prepared Source to Target Mapping document for ETL load in Landing Tables.
- Configured Landing Tables, Base Objects, Relationships, Staging Tables, Mappings, custom cleanse functions, Match and Merge settings, and Trust and Validation rules.
- Identified issues in the ETL load to reflect the current data model relationships.
- Configured Entity Objects, Entity Types, and Hierarchy and Relationship Types for the Contract, Product, and Party hierarchical view in IDD.
- Configured Supply Chain IDD Application for use by Data Stewards.
- Configured Queries and Packages for all the domains for use in IDD.
- Configured Subject Area Groups for Supplier, Manufacture, Product, and Contract.
- Configured Roles for Read Only, Approver and IDD specialist for respective domains.
- Created Task Assignment in IDD for Supplier and Manufacture domains.
- Worked with the ETL team to schedule batch groups for all the domains; these are invoked via ETL workflows that call the corresponding MDM SIF APIs.
- Responsible for creating Hive tables based on business requirements.
- Implemented partitioning, dynamic partitions, and buckets in Hive for efficient data access.
- Involved in NoSQL database design, integration, and implementation.
- Loaded data into the NoSQL database HBase (see the HBase sketch at the end of this section).
- Used Hive to transfer data between RDBMS and HDFS.
- Converted existing SQL queries into HiveQL queries.
- Created MDM Workflows to manage the Data flow process.
- Experience in cloud computing technologies such as Amazon Web Services (AWS).
- Configured pre-staging, post-staging, and pre-matching user exits and worked on SIF integration.
- Used SoapUI to perform SIF API calls such as cleaning tables.
- Installed and configured JBoss, the MDM Hub, the Cleanse Server, and Address Doctor in production.
- Helped familiarize the data stewards with IDD functionality.
- Cleansed US addresses using Address Doctor with a US license.
- Ability to coordinate across teams, working closely with peers to ensure the appropriate focus and sense of urgency is applied to all MDM issues.
- Worked with Informatica Support for MDM issues and upgrades.
- Configured External Authentication using LDAP configuration for Users Authentication.
- Documented the application and shared knowledge with others as necessary.
Environment: Informatica Multidomain MDM 9.6/9.7.1 HF1, JBoss EAP 6.2, SQL Server 2012, IDD, Erwin, Address Doctor 5, SoapUI 5.0.0, Hadoop, Hive, HBase, AWS.
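A minimal sketch of loading a record into HBase with the standard Java client API, as referenced in the HBase bullet above; the table name, column family, row key, and values are illustrative assumptions.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

// Writes a single supplier row into an assumed 'supplier' table with column family 'd'.
public class HBaseSupplierLoad {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create(); // reads hbase-site.xml from the classpath
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("supplier"))) {
            Put put = new Put(Bytes.toBytes("SUP-1001")); // row key
            put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("name"), Bytes.toBytes("Acme Components"));
            put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("country"), Bytes.toBytes("US"));
            table.put(put);
        }
    }
}
```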