
Technical Lead, Informatica Cloud (IICS) Resume


Indianapolis, IN

SUMMARY

  • 9+ years of experience in Information Technology, Data Warehousing, and ETL processes using Informatica PowerCenter and Informatica Cloud.
  • Experience with the Software Development Life Cycle (SDLC) and Waterfall and Agile methodologies.
  • Experience in the Analysis, Design, Development, Testing, and Implementation phases of Business Intelligence solutions using the ETL tool Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, and Workflow Monitor).
  • Experience in data warehouse techniques and practices, including ETL processes, dimensional data modeling (Star Schema, Snowflake Schema, fact and dimension tables), OLTP, and OLAP.
  • Strong hands-on experience extracting data from various source systems such as Oracle, PostgreSQL, DB2, MySQL, SQL Server, SFDC, flat files, and XML.
  • Proficient in the development of Extract, Transform, and Load processes, with a good understanding of source-to-target data mapping and the ability to define and capture metadata and business rules.
  • Experience in data analysis, data mapping, data modeling, data profiling, and development of databases for business applications and data warehouse environments.
  • Experience in creating transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer, and Rank) and mappings using Informatica Designer, and processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Experience in writing Stored Procedures, Packages, Functions, Triggers, Views, and Materialized Views using SQL and PL/SQL.
  • Strong experience in Enterprise Data Warehouse environments with Informatica/Informatica Cloud.
  • Data integration with SFDC using Informatica Cloud.
  • Proficiency in data warehousing techniques such as data cleansing, Slowly Changing Dimensions (SCD), and surrogate key assignment (see the SCD sketch after this list).
  • Worked on implementing cloud projects to set up infrastructure for Informatica Cloud.
  • Experience in performing unit testing, system testing, integration testing, and UAT, and in providing production support for issues raised by application users.
  • Experience in performance tuning, identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings, and sessions.
  • Experienced in coordinating cross-functional teams.
  • Good knowledge of creating reports and dashboards using Power BI.
  • Excellent interpersonal and communication skills; experienced in working with senior-level managers, businesspeople, and developers across multiple disciplines. A good team player and self-starter with the ability to work independently and as part of a team.
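
As a brief, hedged illustration of the SCD handling mentioned above, the following Oracle SQL sketches a Slowly Changing Dimension Type 2 load; the table, column, and sequence names (stg_customer, dim_customer, customer_sk_seq) are hypothetical placeholders, not objects from any project below.

    -- Expire the current dimension row when a tracked attribute changed.
    UPDATE dim_customer d
       SET d.eff_end_date = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.city <> d.city OR s.status <> d.status));

    -- Insert a new current version for changed and brand-new customers,
    -- assigning a surrogate key from a sequence.
    INSERT INTO dim_customer
          (customer_sk, customer_id, city, status,
           eff_start_date, eff_end_date, current_flag)
    SELECT customer_sk_seq.NEXTVAL, s.customer_id, s.city, s.status,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');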

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 10.4/9.6.1, Informatica Intelligent Cloud Services (IICS)

Reporting Tools: Power BI

Scheduling Tools: CA Workload Automation, crontab, Informatica Scheduler

Databases: Oracle, PostgreSQL, DB2, SQL Server, Azure SQL Server, Azure SQL Data Warehouse

Modeling: Dimensional Modeling, ER Modeling

Programming Languages: SQL, PL/SQL, Unix Shell Scripting

Methodologies: Agile, Waterfall

Operating Systems: Unix, Windows

PROFESSIONAL EXPERIENCE

Confidential, Indianapolis, IN

Technical Lead, Informatica Cloud (IICS)

Responsibilities:

  • Analyzed the business requirements and functional specifications.
  • Worked closely with the business teams to understand the requirements, along with the Business Analyst and ETL Architect teams.
  • Developed integration processes using Informatica Intelligent Cloud Services (IICS) and Informatica PowerCenter, following best practices for data movement such as audit, balance and control, and error handling.
  • Developed ETL programs using Informatica Intelligent Cloud Services to implement the business requirements.
  • Created source-to-target mapping documents and technical design documents and uploaded them to SharePoint, where the respective teams reference them for further activities; participated in code reviews.
  • Analyzed source systems and created design specifications and technical specifications based on functional requirements and analysis.
  • Created mappings using Informatica Cloud transformations (Source, Target, Data Masking, Expression, Filter, Hierarchy Builder, Hierarchy Parser, Joiner, Lookup, Router, Sequence Generator, Sorter) to load data from PostgreSQL and Oracle to target databases and vice versa.
  • Created ETL jobs to provide data to subscribers by making use of APIs, using the RunAJob utility to run in a loop for multiple trails.
  • Built processes using Application Integration Service Connectors, Connections, process objects, and sub-processes.
  • Developed an audit, parameterization, and error-logging framework in Informatica Cloud for better tracking of the data (a sketch follows this list).
  • Developed Informatica Cloud mappings to load data from PostgreSQL into normalized and de-normalized XMLs.
  • Created databases and warehouses in Snowflake for different projects.
  • Involved in migrating a legacy data warehouse.
  • Extensively used the Toad utility to execute SQL scripts and worked on SQL to enhance the performance of the conversion mappings.
  • Created tasks and sequential and concurrent batches for proper execution of mappings using taskflows.
  • Documented program development, logic, coding, testing, changes, and corrections.
  • Created test cases for the mappings developed and then created the integration testing document.
  • Followed Informatica recommendations, methodologies, and best practices.
  • Worked with the Tier 1 team to resolve various ETL production tickets.
  • Resolved data issues in the DWH raised by end users via trouble tickets, prioritized by severity.
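
A minimal sketch of the kind of audit/error-logging framework described above, assuming an Oracle target; the table and procedure names (etl_audit_log, log_etl_run) are hypothetical, and in practice the insert would be driven from a post-load step or taskflow task.

    -- Hypothetical audit table: one row per job run.
    CREATE TABLE etl_audit_log (
        audit_id      NUMBER GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
        job_name      VARCHAR2(100) NOT NULL,
        run_ts        TIMESTAMP DEFAULT SYSTIMESTAMP,
        src_row_count NUMBER,
        tgt_row_count NUMBER,
        status        VARCHAR2(20),
        error_msg     VARCHAR2(4000)
    );

    -- Minimal logging procedure a mapping task could call on completion.
    CREATE OR REPLACE PROCEDURE log_etl_run (
        p_job_name  IN VARCHAR2,
        p_src_count IN NUMBER,
        p_tgt_count IN NUMBER,
        p_status    IN VARCHAR2,
        p_error_msg IN VARCHAR2 DEFAULT NULL
    ) AS
    BEGIN
        INSERT INTO etl_audit_log
              (job_name, src_row_count, tgt_row_count, status, error_msg)
        VALUES (p_job_name, p_src_count, p_tgt_count, p_status, p_error_msg);
        COMMIT;
    END;
    /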

Environment: Informatica Intelligent Cloud Services, Informatica PowerCenter 10.4/9.6.1, Oracle 12c/11g, Toad, Postman, PostgreSQL.

Confidential

Data Warehouse Specialist

Responsibilities:

  • Analyzed the business requirements and functional specifications.
  • Developed ETL programs using Informatica Intelligent Cloud Services to implement the business requirements.
  • Good knowledge of Hadoop architecture and components such as HDFS; experience analyzing data using HiveQL and exporting and importing data between relational database systems and HDFS using Sqoop.
  • Analyzed log files created by multiple applications and pulled log data into Hive tables using HiveQL, NiFi, and Sqoop pipelines built with shell scripts.
  • Created DDLs for Hive table creation and invoked them through shell scripts for automated processing (see the Hive sketch after this list).
  • Created table partitions and buckets for optimization.
  • Created mappings using Informatica Cloud transformations (Source, Target, Data Masking, Expression, Filter, Hierarchy Builder, Hierarchy Parser, Joiner, Lookup, Router, Sequence Generator, Sorter) to load data from Salesforce to databases.
  • Migrated PowerCenter 10.x mappings to Informatica Cloud.
  • Developed an audit, parameterization, and error-logging framework in Informatica Cloud for better tracking of the data.
  • Wrote shell scripts to process source system files and worked with enterprise teams on scripting to move business-requested target files to the desired location using routing IDs.
  • Scheduled workflows by designing applications in the CA Workload Automation tool, creating the necessary dependencies in each workflow as needed.
  • Developed Informatica Cloud mappings to load data from flat files (XML, CSV) and PostgreSQL into an Oracle database.
  • Stored developed code in GitHub and used Git for branching: check-in and check-out of files, commits, push/pull, and merging branches.
  • Implemented pass-through and auto hash partitioning for performance tuning in Informatica.
  • Debugged mappings with the Informatica Debugger using the Verbose Initialization and Verbose Data tracing levels.
  • Extensively used the Toad utility to execute SQL scripts and worked on SQL to enhance the performance of the conversion mappings.
  • Created tasks and sequential and concurrent batches for proper execution of mappings using taskflows.
  • Generated Power BI reports for decision makers for strategic planning.
  • Developed Power BI reports and dashboards from multiple data sources.
  • Documented program development, logic, coding, testing, changes, and corrections.
  • Created test cases for the mappings developed and then created the integration testing document.
  • Followed Informatica recommendations, methodologies, and best practices.
  • Worked with the Tier 1 team to resolve various ETL production tickets.
  • Resolved data issues in the DWH raised by end users via trouble tickets, prioritized by severity.
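
A hedged sketch of the Hive DDL with the partitioning and bucketing described above; the table and column names (app_logs, stg_app_logs) are hypothetical.

    -- Log table partitioned by load date and bucketed by application id
    -- so that date-bounded scans and joins on app_id stay cheap.
    CREATE TABLE IF NOT EXISTS app_logs (
        log_ts    TIMESTAMP,
        app_id    STRING,
        log_level STRING,
        message   STRING
    )
    PARTITIONED BY (load_date STRING)
    CLUSTERED BY (app_id) INTO 16 BUCKETS
    STORED AS ORC;

    -- Load one day's staged data into its partition.
    INSERT OVERWRITE TABLE app_logs PARTITION (load_date = '2020-01-15')
    SELECT log_ts, app_id, log_level, message
    FROM stg_app_logs;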

Environment: Informatica Intelligent Cloud Services, Informatica PowerCenter, Hive, HDFS, Sqoop, Oracle 12c/11g, Toad, PuTTY, HP Quality Center, GitHub, Jira, Unix, Windows, Postman, Power BI

Confidential

ETL Informatica Developer

Responsibilities:

  • Worked closely with the business teams to understand the requirements, along with the Business Analyst and ETL Architect teams.
  • Involved in discussions with data modelers to review the design of the data model and to create database tables.
  • Involved in creating the technical design documents along with the Business Analyst, and performed design walkthroughs with data architects.
  • Used Informatica PowerCenter 10.2/9.6.1 for extraction, transformation, and load (ETL) of data in the data warehouse.
  • Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, and Stored Procedure.
  • Developed complex mappings in Informatica to load the data from various sources.
  • Implemented performance tuning logic on targets, sources, mappings, and sessions to provide maximum efficiency and performance.
  • Parameterized the mappings and increased their re-usability.
  • Used Informatica PowerCenter Workflow Manager to create sessions, workflows, and batches to run with the logic embedded in the mappings.
  • Extensively used the Toad utility to execute SQL scripts and worked on SQL to enhance the performance of the conversion mappings.
  • Created sessions and sequential and concurrent batches for proper execution of mappings using Workflow Manager.
  • Wrote Unix scripts to create flat-file outputs, collect flat files from the server, and load them into a database.
  • Extensively used the Informatica Debugger to diagnose problems in mappings.
  • Involved in troubleshooting existing ETL bugs.
  • Worked with the business to formulate requirement documents and documented the source-to-target data flow.
  • Created ETL exception reports and validation reports after the data was loaded into the warehouse database (a validation-query sketch follows this list).
  • Documented program development, logic, coding, testing, changes, and corrections.
  • Created test cases for the mappings developed and then created the integration testing document.
  • Followed Informatica recommendations, methodologies, and best practices.
  • Worked with the Tier 1 team to resolve various ETL production tickets.
  • Resolved data issues in the DWH raised by end users via trouble tickets, prioritized by severity.
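
A hedged example of the kind of validation query behind the exception and validation reports above, assuming Oracle; the staging and fact table names (stg_orders, dw_fact_orders) are hypothetical.

    -- Reconcile staged source counts against warehouse counts per load
    -- date and flag any dates where rows went missing.
    SELECT s.load_date,
           s.src_count,
           NVL(t.tgt_count, 0) AS tgt_count,
           s.src_count - NVL(t.tgt_count, 0) AS missing_rows
      FROM (SELECT load_date, COUNT(*) AS src_count
              FROM stg_orders
             GROUP BY load_date) s
      LEFT JOIN (SELECT load_date, COUNT(*) AS tgt_count
                   FROM dw_fact_orders
                  GROUP BY load_date) t
        ON t.load_date = s.load_date
     WHERE s.src_count <> NVL(t.tgt_count, 0)
     ORDER BY s.load_date;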

Environment: Informatica PowerCenter 10.1/9.6.1, Oracle 12c/11g, PuTTY, WinSCP, Toad, HP Quality Center, Jira, Unix, Windows, ServiceNow

Confidential

ETL Informatica developer

Responsibilities:

  • Analyzed the business requirements and functional specifications.
  • Developed ETL programs using Informatica to implement the business requirements.
  • Communicated with business customers to discuss the issues and requirements.
  • Extracted data from different sources into a single staging area and applied business logic to load the data into the DWH (a staging-load sketch follows this list).
  • Used Informatica PowerCenter 9.1 for extraction, transformation, and load (ETL) of data in the data warehouse.
  • Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, and Stored Procedure.
  • Developed complex mappings in Informatica to load the data from various sources.
  • Implemented performance tuning logic on targets, sources, mappings, and sessions to provide maximum efficiency and performance.
  • Parameterized the mappings and increased their re-usability.
  • Used Informatica PowerCenter Workflow Manager to create sessions, workflows, and batches to run with the logic embedded in the mappings.
  • Extensively used the Toad utility to execute SQL scripts and worked on SQL to enhance the performance of the conversion mappings.
  • Extensively used the Informatica Debugger to diagnose problems in mappings.
  • Involved in troubleshooting existing ETL bugs.
  • Created a list of inconsistencies in the data load for the client to review and correct the issues on their side.
  • Created ETL exception reports and validation reports after the data was loaded into the warehouse database.
  • Documented program development, logic, coding, testing, changes, and corrections.
  • Created test cases for the mappings developed and then created the integration testing document.
  • Followed Informatica recommendations, methodologies, and best practices.
  • Worked with the Tier 1 team on various ETL failures in production.
  • Updated issue details in Remedy and followed the SLA process defined for each application.
  • Handled production support for the monitoring of daily and monthly jobs as part of the implementation team.
  • Proactively communicated status, issues, and risks to project stakeholders, including business stakeholders and IT leads.
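
A hedged sketch of the staging-to-warehouse pattern described above, with business logic applied during the load; all table, column, and rule names (stg_sales, dwh_sales_fact, the LEGACY cents rule, the status-code mapping) are hypothetical illustrations.

    -- Data from several sources lands in one staging table; business
    -- rules are applied on the way into the warehouse.
    INSERT INTO dwh_sales_fact
          (sale_id, product_id, sale_amount, sale_status, load_date)
    SELECT s.sale_id,
           s.product_id,
           -- Rule: normalize amounts reported in cents by a legacy feed.
           CASE WHEN s.source_system = 'LEGACY'
                THEN s.sale_amount / 100
                ELSE s.sale_amount END,
           -- Rule: map source-specific status codes to one standard set.
           DECODE(s.status_code, 'C', 'COMPLETE',
                                 'X', 'CANCELLED',
                                      'PENDING'),
           TRUNC(SYSDATE)
      FROM stg_sales s
     WHERE s.sale_amount IS NOT NULL;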

Environment: Informatica PowerCenter 9.1, Oracle 10g, PuTTY, WinSCP, SQL Assistant, Toad, HP Quality Center, Unix, Windows, Remedy
