
Sr ETL / Cloud Informatica MDM Developer Resume


New Orleans, LA

SUMMARY

  • Over 8 years of experience in Business Requirements Analysis, Design, Coding, and Testing of Data Warehousing implementations across the Financial, Insurance, Pharmaceutical, and Educational industries.
  • Experience in building and managing various data warehouses/data marts using Informatica products such as PowerCenter, PowerMart, and PowerExchange.
  • Strong experience in performing ETL operations like data extraction, data transformation, and data loading with Informatica PowerCenter and Informatica PowerMart (Repository Manager, Designer, Server Manager, Workflow Manager, and Workflow Monitor).
  • Involved in all aspects of the project and its technologies (Informatica, UNIX scripts, and Business Objects reports) and worked with a team of onsite and offshore build and test resources.
  • Strong ETL skills with multiple ETL tools: Informatica PowerCenter, Cloud, BDE, BDM, and MDM; Talend BDM, DI, and DP; SAP BODS; and Oracle Data Integrator, in data warehouse migration development.
  • Migrated data to the Microsoft Azure cloud platform and Azure SQL DB, and Hadoop data to the Azure HDInsight service, using Informatica.
  • Strong knowledge of Azure Storage Accounts, Containers, Blob Storage, Azure Data Lake, Azure Data Factory, Azure SQL Data Warehouse, Stretch Database, Machine Learning, and Virtual Machines.
  • Strong knowledge of IDQ Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager, and Repository.
  • Experience in designing and developing complex mappings using varied transformation logic such as Unconnected and Connected Lookups, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, etc.
  • Knowledge in OLTP/OLAP system study and E-R modelling, developing database schemas like Star schema and Snowflake schema used in relational, dimensional, and multidimensional modelling.
  • Enhanced performance of Informatica sessions by using physical pipeline partitions, DTM session tuning, and parameter files.
  • Used Informatica BDM 10.1.1 (Big Data Management) with IDQ to ingest data from the AWS S3 raw zone to the S3 refined zone, and from the refined zone to Redshift.
  • Experience with IDQ and MDM, with knowledge of Big Data Edition integration with Hadoop and HDFS.
  • Worked with networking teams in configuring AWS Direct Connect to establish dedicated connections to data centers and the AWS Cloud.
  • Extensive experience in using tools like SQL*Plus, TOAD, SQL Developer, and SQL*Loader.
  • Experience in data modelling using the design tool Erwin 4.1/4.0, and worked with Oracle Enterprise Manager and TOAD.
  • Experience in extracting data from the Facets claims application into the staging area.
  • Experience in using the Facets claims application to open, add generations, enter, and save information.
  • Knowledge in extracting data from various sources like Oracle, DB2, flat files, SQL Server, XML files, and Teradata, and loading into Teradata and Oracle databases.
  • Strong understanding of performance tuning in Informatica and databases with Oracle stored procedures, triggers, and indexes; experienced in loading data into Data Warehouses/Data Marts using Informatica.
  • Hands-on experience in optimizing mappings and implementing complex business rules by creating re-usable transformations, Mapplets, and PL/SQL stored procedures (see the sketch after this list).
  • Experience on data profiling & various data quality rules development using Informatica Data Quality (IDQ).
  • Experienced in developing applications in Oracle and writing Stored Procedures, Triggers, Functions, Views, and creating Partitions for better performance.
  • Extensive experience in Oracle SQL and PL/SQL programming; experienced in writing and designing SQL queries.
  • Experience with Facets batches, HIPAA Gateway, and EDI processing.
  • Worked on IDQ tools for data profiling, data enrichment, and standardization.
  • Experience in development of mappings in IDQ to load the cleansed data into the target table using various IDQ transformations. Experience in data profiling and analysing the scorecards to design the data model.
  • Worked with SQL*Loader and SQL*Plus; developed PL/SQL; performed tuning using explain plans, etc.
  • Set up ODBC connections, batches, and sessions to schedule loads at the required frequency using PowerCenter.
  • Experience in UNIX shell scripting, and in Autosys and Control-M for scheduling workflows.
  • Familiar with Agile and Waterfall development methodologies.
  • Ability to work in teams as well as individually; a quick learner, able to meet deadlines.
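
A minimal sketch of the kind of re-usable PL/SQL stored procedure mentioned above, of the sort invoked from an Informatica Stored Procedure transformation. The function name and the formatting rule are hypothetical, for illustration only.

  -- Hypothetical re-usable business rule: standardize a phone number.
  -- Invoked from an Informatica Stored Procedure transformation.
  CREATE OR REPLACE FUNCTION fn_standardize_phone (p_phone IN VARCHAR2)
    RETURN VARCHAR2
  IS
    v_digits VARCHAR2(32);
  BEGIN
    -- Keep digits only, then format 10-digit numbers as XXX-XXX-XXXX.
    v_digits := REGEXP_REPLACE(p_phone, '[^0-9]', '');
    IF LENGTH(v_digits) = 10 THEN
      RETURN SUBSTR(v_digits, 1, 3) || '-' ||
             SUBSTR(v_digits, 4, 3) || '-' ||
             SUBSTR(v_digits, 7, 4);
    END IF;
    RETURN NULL;  -- Non-conforming values surface as NULL for DQ review.
  END fn_standardize_phone;
  /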

TECHNICAL SKILLS

Operating Systems: Windows NT 4.0, Windows 2000 Server, Windows 2000 Advanced Server, Windows 2003 Server, Windows XP, Windows Vista, Windows 7 Professional, UNIX, and Mac OS

Software: C, Java, SQL, HTML, XML, Oracle 11g/10g, MS SQL 2008, Teradata 13, MS Access, MS Office 2010/2007

RDBMS: Oracle, MS SQL Server 7.0/2000/2008, DB2

ETL Tools: Informatica PowerCenter 9.5.1/9.0.1/8.6.1/8.0.1/7.5/7.1, Informatica Cloud, Control-M, IDQ, MDM, Autosys, SharePoint, Erwin

SAP Data Services BODS: Designer, Monitor, Admin Console & Repository.

Reporting Tools: SQL Server Reporting Services (SSRS), Tableau, Power View, SharePoint 2007

Data Modelling Tools: Erwin (Star schema/Snowflake)

Mark-up Languages: XML, HTML, DHTML.

Database Query Tools: SQL Server Execution Plan, MS SQL Server Query Analyzer, SQL Profiler, Red Gate SQL Data Compare, Red Gate SQL Data Generator, Red Gate SQL Search, Azure SQL Server, AWS S3, and AWS Redshift

Version Control Tools: SVN, Team Foundation Server, VSS

Atlassian Tools: JIRA, Confluence

PROFESSIONAL EXPERIENCE

Confidential, New Orleans, LA

Sr ETL / Cloud Informatica MDM Developer

Responsibilities:

  • Integrated ActiveVOS into the hub and ensured the default workflows were integrated; deployed the default workflows manually when required.
  • Created RESTful web services on IDD using the Provisioning tool, which are used by third-party applications to access the MDM Hub.
  • Design, development, testing, and implementation of ETL processes using Informatica Cloud.
  • Used 7+ Address Validators with different scopes in one mapping, exceeding the default limit of 3 Address Validators; changed the default behaviour of Address Doctor.
  • Used the ETL process for cleansing and delta detection to minimize the processing load on MDM before bringing the data into the landing tables (see the delta-detection sketch after this list).
  • Worked on profiling the data using the Developer/Analyst tools to verify data integrity across different sources.
  • Created Java user exits using SIF APIs to customize MDM Hub functionality.
  • Extracted data from various source systems like Oracle, SQL Server, and DB2 to load the data into the landing zone, and then loaded it into the AWS S3 raw bucket using a Java copy command.
  • Used validation rules extensively (50+), above and beyond Informatica's advised limit of around 27, so that MDM picks up the right column values.
  • Developed Cloud mappings to extract the data for different regions.
  • Created a centralized repository for all code artifacts using Azure Data Catalog.
  • Developed mappings for loading the data from landing to the stage tables.
  • Created the custom cleanse functions and configured custom as well as default cleanse functions into mappings.
  • Defined Match rules in Match and Merge settings of the base tables by creating Match Path Components, Match Columns, and Rulesets.
  • Used SIF APIs (Get, SearchMatch, Put, CleansePut, ExecuteBatchDelete, etc.) to test search, update, cleanse, insert, and delete operations from SoapUI.
  • Configured match rule set filters to meet different data scenarios using SQL filters, segment matching/segment-all matching, and non-equal matching (see the filter sketch after this list).
  • Performed match/merge, ran match rules to check the effectiveness of MDM on data, and fine-tuned the match rules.
  • Involved in analysing different modules of the Facets system and EDI interfaces to understand the source system and source data.
  • Accepted inbound transactions from multiple sources using FACETS.
  • Supported integrated EDI batch processing and real-time EDI using FACETS.
  • Created custom indexes using Register Custom Index SIF API for improving the performance of the load, match, and merge process.
  • Installed and configured the Informatica MDM Hub Console, Hub Store, Cleanse Match Server, and Address Doctor for Informatica PowerCenter applications.
  • Implemented the pre-land and land processes for loading datasets into the Informatica MDM Hub.
  • Configured and documented the Informatica MDM Hub to perform loading, cleansing, matching, merging, and publication of MDM data.
  • Worked on real-time integration between the MDM Hub and external applications using the SIF API over JMS.
  • Diligently worked with Data Steward Team for designing, documenting and configuring Informatica Data Director and worked on IDD user exits.
  • Worked independently on different claims systems - FACETS, NASCO, WGS.
  • Worked on scheduling the load, stage, match, merge jobs in an appropriate sequence.
  • Used ActiveVOS for configuring workflows like one-step approval, merge, and unmerge tasks.
  • Configured static lookups, dynamic lookups, bulk uploads, and Smart Search in IDD.
  • Worked with the Provisioning tool to custom-configure various views as part of Entity 360 views for different roles.
  • Configured JMS Message Queues and appropriate triggers for passing the data to the contributing systems/receiving downstream systems.
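
A minimal sketch of the delta-detection step mentioned above, assuming hypothetical STG_CUSTOMER (current extract) and LND_CUSTOMER (previously landed) tables; only new or changed rows are carried forward, so MDM does not reprocess unchanged records.

  -- Hypothetical delta detection: keep only rows that are new or changed
  -- since the last load before handing them to MDM.
  INSERT INTO lnd_customer_delta (src_customer_id, full_name, addr_line1, city)
  SELECT src_customer_id, full_name, addr_line1, city
    FROM stg_customer
  MINUS
  SELECT src_customer_id, full_name, addr_line1, city
    FROM lnd_customer;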
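
And a sketch of a match rule set filter of the kind referenced above: in the Hub, a rule set filter is a SQL condition on base object columns that restricts which records the rule set considers. The column names and values below are hypothetical.

  -- Hypothetical match rule set filter: apply this rule set only to
  -- active individual-party records sourced from the US.
  PARTY_TYPE = 'Individual' AND COUNTRY_CODE = 'US' AND HUB_STATE_IND = 1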

Environment: Multi-Domain MDM 10.1, IDD, Oracle 11g, Oracle PL/SQL, Windows Application Server, JBoss Application Server, ActiveVOS, SIF API, Informatica PowerCenter 10.1, Provisioning tool, Address Doctor 5.1, PowerShell/Bat scripts.

Confidential

ETL / Informatica MDM Developer

Responsibilities:

  • Involved in leading and monitoring the team, assigning tasks, reviewing development activity, and holding status calls.
  • Produced detailed MDM design specifications consistent with the high-level MDM design specifications.
  • Coordinated with ETL team for performing the batch process to populate data from an external source system to landing tables in the hub.
  • Analysed the source systems' data for SNO based on profiling results, which helped determine the trust scores and validation rules.
  • Converted specifications to programs and data mappings in an ETL Informatica Cloud environment.
  • Automated/Scheduled the cloud jobs to run daily with email notifications for any failures.
  • Configured landing tables, staging tables, and base object tables.
  • Configured trust scores and validation rules in the hub (see the validation-rule sketch after this list).
  • Worked on integration of the external application with MDM Hub using SIF APIs.
  • Designed, documented, and configured the Informatica MDM Hub to support initial data loads, incremental loads, and cleansing.
  • Worked on Address Doctor for cleansing addresses using the Developer tool before feeding into landing tables.
  • Analysis and implementation of the existing claim adjudication process in FACETS.
  • Used SoapUI to perform SIF API calls, such as cleaning tables.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Defined Match rules in Match and Merge settings of the base tables by creating Match Path Components, Match Columns, and Rulesets.
  • Used filters, segment matching/segment-all matching, and non-equal matching.
  • Performed match/merge and ran match rules to check the effectiveness of MDM on data, and fine-tuned the match rules.
  • Customized user exits for different scenarios.
  • Used the Hierarchy tool for configuring entity base objects, entity types, relationship base objects, relationship types, and profiles.
  • Data integration with claim processing engine (Facets).
  • Closely worked with Data Steward Team for designing, documenting and configuring Informatica Data Director.
  • Used Native BPM for configuring workflows like One-step approval, merge, and unmerge.
  • Used Repository Manager/Change List for migrating incremental as well as bulk meta-data.
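
A sketch of the validation-rule mechanism mentioned above: in the hub, a validation rule is roughly a SQL condition evaluated against staged records, and a matching record has the trust of selected columns downgraded by a configured percentage. The condition and column names below are hypothetical.

  -- Hypothetical validation rule condition: downgrade trust on LAST_NAME
  -- (by the percentage configured on the rule) when the value is missing
  -- or a known placeholder.
  WHERE S.LAST_NAME IS NULL
     OR UPPER(S.LAST_NAME) IN ('UNKNOWN', 'N/A', 'TEST')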

Environment: Multi-Domain MDM 9.7, IDD, Address Doctor, Oracle 11g, Oracle PL/SQL, SIF API, Windows Application Server, Native BPM.

Confidential

ETL / Informatica Developer

Responsibilities:

  • Worked closely with Development managers to evaluate the overall project timeline.
  • Interacted with the users and made changes to Informatica mappings according to the business requirements.
  • Developed Informatica mappings using Aggregator transformations, SQL overrides in Lookups, source filters in Source Qualifiers, and data-flow management into multiple targets using Router transformations.
  • Worked on the design, development, and performance tuning of Informatica mappings and workflow batch processes for the STG, DW, and DM layers, using worklets, pushdown optimization, drop/create index scripts, and parallelism to ensure the batch comprising 850 tables completed on time on delta runs; also tuned complex queries in Oracle.
  • Worked closely with Informatica admins and operations on code movement to SIT, UAT, and finally production, ensuring smooth migration of Informatica and DB objects so that initial and delta loads ran promptly; resolved all issues in both loads smoothly.
  • Worked on resolving production fixes and delta-run enhancements of the existing Informatica code.
  • Produced operations documentation covering deployment groups, import/export procedures, and a run book for production support; performed knowledge transfer to ensure smooth running of the batch.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Lookup, Router, and Aggregator to create robust mappings in the Informatica Power Center Designer.
  • Responsible for best practices like naming conventions, performance tuning, and error handling.
  • Responsible for Performance Tuning at the Source level, Target level, Mapping Level and Session Level.
  • Involved in standardization of data, such as changing a reference data set to a new standard.
  • Checked data received from third parties for accuracy (DQ) before providing it to the internal transformations.
  • Used Address validator transformation in IDQ.
  • Involved in massive data profiling using IDQ (Analyst tool) before data staging.
  • Created partitioned tables and partitioned indexes for manageability and scalability of the application (see the partitioning sketch after this list). Made use of post-session success and post-session failure commands in the session task to execute scripts needed for clean-up and update purposes.
  • Extensively worked in ETL and data integration, developing ETL mappings and scripts using SSIS; worked on data transfer from text files to SQL Server using the Bulk Insert task in SSIS.
  • Extensively used the Business Objects functionality such as Master-Detail, Slice and Dice, Drill Down and Hierarchies for creating reports.
  • Implemented Type 2 slowly changing dimensions using the Informatica ETL tool (see the SCD sketch after this list).
  • Designed best practices on Process Sequence, Dictionaries, Data Quality Lifecycles, Naming Convention, and Version Control.
  • Created Tableau worksheets, which involved schema import and implementing the business logic through customization.
  • Created Use-Case Documents to explain and outline data behaviour.
  • Worked with the Informatica Developer (IDQ) tool to ensure data quality for the consumers.
  • Used the Address Validator transformation for validating customer addresses from various countries using the SOAP interface.
  • Created PL/SQL programs like procedures, functions, packages, and cursors to extract data from the target system.
  • Involved in the deployment of IDQ mappings to application and different environments.
  • Logged defects and submitted change requests using the Defects module of Test Director.
  • Worked with different Informatica tuning issues and fine-tuned the transformations to make them more efficient in terms of performance.
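
A sketch of the partitioning approach mentioned above, with a local index so individual partitions can be added, dropped, or rebuilt independently; the table and column names are hypothetical.

  -- Hypothetical range-partitioned table with a local index: old
  -- partitions can be archived or dropped without touching the rest.
  CREATE TABLE fact_claims (
    claim_id    NUMBER       NOT NULL,
    member_id   NUMBER       NOT NULL,
    claim_date  DATE         NOT NULL,
    paid_amount NUMBER(12,2)
  )
  PARTITION BY RANGE (claim_date) (
    PARTITION p2013q1 VALUES LESS THAN (DATE '2013-04-01'),
    PARTITION p2013q2 VALUES LESS THAN (DATE '2013-07-01'),
    PARTITION pmax    VALUES LESS THAN (MAXVALUE)
  );

  CREATE INDEX ix_fact_claims_member ON fact_claims (member_id) LOCAL;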
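
And a minimal two-step Oracle sketch of the Type 2 SCD pattern the mapping implements: expire the current dimension row when a tracked attribute changes, then insert the new version. DIM_CUSTOMER, STG_CUSTOMER, and the sequence are hypothetical, and NULL-safe comparisons are omitted for brevity.

  -- Step 1: close out current rows whose tracked attributes changed.
  UPDATE dim_customer d
     SET d.eff_end_date = TRUNC(SYSDATE) - 1,
         d.current_flag = 'N'
   WHERE d.current_flag = 'Y'
     AND EXISTS (SELECT 1
                   FROM stg_customer s
                  WHERE s.customer_id = d.customer_id
                    AND (s.addr_line1 <> d.addr_line1 OR s.city <> d.city));

  -- Step 2: insert a new current version for new and changed customers
  -- (both now lack a current row).
  INSERT INTO dim_customer
    (customer_key, customer_id, addr_line1, city,
     eff_start_date, eff_end_date, current_flag)
  SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.addr_line1, s.city,
         TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM stg_customer s
   WHERE NOT EXISTS (SELECT 1
                       FROM dim_customer d
                      WHERE d.customer_id = s.customer_id
                        AND d.current_flag = 'Y');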

Environment: Informatica PowerCenter 9.5/9.1, IDQ, SAP Data Services, SAS, Business Objects 3.1, Oracle 11g, UNIX, PL/SQL, SQL*Plus, SQL Server 2008 R2, TOAD, MS Excel 2007.
