
Sr. Informatica Developer Resume


Pittsburgh, PA

SUMMARY

  • 8+ years of IT experience in all phases of the Software Development Life Cycle (SDLC), including user interaction, business analysis/modeling, design, development, integration, planning, testing, and documentation in data warehouse applications, ETL processing, and distributed applications.
  • Expertise in using the ETL tool Informatica PowerCenter 10.2.1/9.6.5 (Mapping Designer, Workflow Manager, Repository Manager) and ETL concepts.
  • Expertise in working with various sources such as Oracle 12c/11g/10g/9i, SQL Server, DB2, UDB, Netezza, Teradata, flat files, XML, COBOL, Mainframe.
  • Experience creating multiple Data Synchronization tasks for incremental loads using Informatica Cloud, including Salesforce Service Cloud, Salesforce Sales Cloud, and Salesforce Financial Services Cloud.
  • 3+ years of experience with IICS concepts relating to Data Integration, Monitor, Administrator, deployments, permissions, and schedules.
  • Experience using Informatica IICS effectively for data integration and data migration from multiple source systems into Azure SQL Data Warehouse.
  • Data integration between Azure SQL Data Warehouse and on-premises data warehouses using Informatica Cloud.
  • Extensive knowledge in RDBMS, Business Intelligence and Data Warehousing Concepts with emphasis on ETL and System Development Life Cycle (SDLC).
  • Strong knowledge of the IDQ Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager, and Repository.
  • Good understanding of Hive SQL.
  • Experience in writing PL/SQL and T-SQL procedures for processing business logic in the database, and in tuning SQL queries for better performance.
  • Experience on data profiling & various data quality rules development using Informatica Data Quality (IDQ).
  • Worked on IDQ tools for data profiling, data enrichment and standardization.
  • Experience in developing mappings in IDQ to load cleansed data into target tables using various IDQ transformations, and in data profiling and analyzing scorecards to design the data model.
  • Strong experience implementing CDC using Informatica PowerExchange 10.2/9.x, including creating registration groups and extraction groups for CDC implementation.
  • Experience working with data lake implementations; involved in development using Informatica to load data into Hive and Impala systems.
  • Hands-on experience working with the Waterfall model as well as the Agile model, including implementation of various sprints.
  • Interacted with end users and functional analysts to identify and develop Business Requirement Documents (BRD) and transform them into technical requirements.
  • Strong experience with Informatica mapping i.e., Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformations Developer, Informatica Repository.
  • Designed and developed complex mappings to move data from multiple sources into a common target area such as Data Marts and Data Warehouse using lookups, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Normalizer, Sequence Generator, Update Strategy, and Stored Procedure from varied transformation logics in Informatica.
  • Worked with various Teradata utilities like FastLoad, MultiLoad, and Teradata Parallel Transporter; highly experienced in Teradata SQL programming.
  • Expertise in tuning the performance of Mappings and sessions in Informatica and determining the Performance bottlenecks.
  • Strong hands-on experience in extracting data from various source systems, ranging from mainframe sources like DB2 and VSAM files and flat files to RDBMSs like Oracle, SQL Server, and Teradata.
  • Extensively used Slowly Changing Dimension (SCD Type 1, 2, 3) techniques in ETL transformations.
  • Expertise in OLTP/OLAP System Study, Analysis, E-R diagram, developing Dimensional Models using Star schema and Snowflake schema techniques used in relational, dimensional and multidimensional modeling.
  • Optimized mappings by creating reusable transformations and Mapplets; performed debugging and performance tuning of sources, targets, mappings, transformations, and sessions.
  • Experienced in writing SQL, PL/SQL programming, Stored Procedures, Package, Functions, Triggers, Views, Materialized Views.
  • Experience in Task Automation using UNIX Scripts, Job scheduling and Communicating with Server using PMCMD command. Extensively used Autosys for Job monitoring and scheduling. Automated the ETL process using UNIX Shell scripting.
  • Proficient in converting logical data models to physical database designs in Data Warehouse Environment and in-depth understanding in Database Hierarchy, Data Integrity concepts and Data Analysis.
  • Experience in defining standards, methodologies and performing technical design reviews.
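The SCD techniques mentioned above can be illustrated in miniature. The following is a minimal, language-agnostic sketch in Python of SCD Type 2 versioning (expire the current row on change, insert a new version); the table shape and column names (`key`, `attr`, `eff_start`, `eff_end`, `current`) are hypothetical, not from any specific project:

```python
from datetime import date

# Sentinel "open-ended" effective date, a common SCD Type 2 convention.
HIGH_DATE = date(9999, 12, 31)

def apply_scd2(dimension, incoming, today):
    """dimension: list of dicts (key, attr, eff_start, eff_end, current).
    incoming: list of dicts (key, attr) from the source extract."""
    current = {r["key"]: r for r in dimension if r["current"]}
    for row in incoming:
        existing = current.get(row["key"])
        if existing is None:
            # New business key: insert as the current version.
            dimension.append({"key": row["key"], "attr": row["attr"],
                              "eff_start": today, "eff_end": HIGH_DATE,
                              "current": True})
        elif existing["attr"] != row["attr"]:
            # Changed attribute: expire the old version, add a new one.
            existing["eff_end"] = today
            existing["current"] = False
            dimension.append({"key": row["key"], "attr": row["attr"],
                              "eff_start": today, "eff_end": HIGH_DATE,
                              "current": True})
    return dimension
```

In an Informatica mapping the same decision is typically made with a Lookup on the dimension plus an Update Strategy transformation routing rows to insert or update.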

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter (10.2/9.6.5/8.1), PowerExchange (10.2/9.6), IICS R29, IDQ 10.1

BI Tools: Cognos 10/9, Tableau 10

Data Modeling: ERwin 9.5.2/7.3/4.1, MS Visio

Databases: Teradata 15/14, Oracle 12c/11g/10g/9i, SQL Server 2008, DB2, MySQL, PostgreSQL

Languages: Java, C, C++, SQL, PL/SQL, XML, HTML, UNIX Shell Scripting

Big Data: Hadoop Ecosystem (HDFS, Hive)

OS: MS-DOS, HP-UX, Windows, and Sun OS

Methodologies: Ralph Kimball’s Star Schema and Snowflake Schema.

SAP(ERP): SAP IDOCs, BAPI, RFC, SAP Scripts

Others: MS Word, MS Access, T-SQL, TOAD, SQL Developer, Microsoft Office, Teradata Viewpoint, Teradata SQL Assistant, Icescrum, Rally, JIRA, Control-M, Autosys, GitHub

PROFESSIONAL EXPERIENCE

Confidential, Pittsburgh, PA

Sr. Informatica Developer

Responsibilities:

  • Migrated Oracle databases to SQL Server, converting the database objects and loading them into SQL Server.
  • Prepared standard design and code documents, technical specifications, ETL specifications, and migration documents; worked with the business analysts on requirements gathering, analysis, testing, and project coordination.
  • Designed and reviewed the interface/conversion code before moving it to QA to ensure design standards were met.
  • Maintained daily tech tracker/issue logs for updates and issues related to objects and progress. Used JIRA to track tasks and manage the project with workflows and sprints.
  • Solved various performance and complex logic-related issues with effective and efficient solutions in the ETL process (performance tuning) and identified various bottlenecks and critical paths of data transformation in sessions for different interfaces.
  • Developed workflows, sessions, and mappings to load heterogeneous data into SAP using BAPI, RFC, and IDOCs with PowerConnect; also used PowerExchange for DB2/SQL Server and vice versa.
  • Design the validation process for conversion and migration for data accuracy and consistency. Created Reusable Mapplet for time saving development for interfaces.
  • Used different transformations like Connected and Unconnected Lookup, Stored Procedure, Router, Aggregator, Normalizer, Source Qualifier, Joiner, Expression, Update Strategy, Sequence Generator, and Transaction Control for data conversion before loading to the target.
  • Performed different types of testing proactively to identify data anomalies and inaccuracies, using unit, integration, and smoke testing.
  • Performed various performance tuning activities at the database level (source, target, lookup) and the Informatica level (workflows, mappings, and sessions) in Oracle, SQL Server, and MySQL using SQL Developer.
  • Used Tidal Automation Adapters for Informatica to schedule Informatica jobs through the Tidal platform, leveraging its capabilities to automate, simplify, and improve job scheduling and ETL performance.
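The validation process for a conversion/migration like the Oracle-to-SQL-Server work above usually reconciles row counts and row content between source and target extracts. A minimal, illustrative Python sketch of that idea (the row shapes and check names are hypothetical, not the project's actual validation suite):

```python
import hashlib

def row_checksum(row):
    # Order-independent fingerprint of one row's values.
    return hashlib.md5("|".join(str(v) for v in row).encode()).hexdigest()

def reconcile(source_rows, target_rows):
    """Return a list of discrepancies between two extracts (empty = clean)."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    # Sort checksums so row order in either extract does not matter.
    if sorted(map(row_checksum, source_rows)) != sorted(map(row_checksum, target_rows)):
        issues.append("row checksum mismatch")
    return issues
```

In practice the same checks are often pushed down as `COUNT(*)` and aggregate-checksum queries on both databases rather than pulling full extracts.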

Environment: Informatica PowerCenter 10.2, Informatica PowerExchange 10.2.1, Oracle 11g/12c, SAP R/3 7.1, SQL Server 2019, MySQL, DB2, PL/SQL, AS400, SQL Developer, T-SQL, AIX/UNIX, Notepad++, JIRA, Cloud, Flat files, Tidal.

Confidential, NJ

Sr. Informatica Developer

Responsibilities:

  • Used Informatica PowerCenter for (ETL) extraction, transformation and loading data from heterogeneous source systems into target database.
  • Built mappings using the Designer, extracted data from various sources, and transformed the data according to requirements.
  • Helped IT reduce the cost of maintaining the on-premises Informatica PowerCenter servers by migrating the code to Informatica Cloud Services.
  • Involved in extracting the data from the Flat Files and Relational databases into staging area.
  • Migrated mappings, sessions, and workflows from the Development environment to Test and then to UAT.
  • Developed Informatica mappings and reusable transformations to facilitate timely loading of data into a star schema.
  • Developed Informatica mappings using Aggregator transformations, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Router.
  • Performed data integration, warehousing, and migration using Informatica Cloud, PowerCenter, and SFDC.
  • Created sessions, extracted data from various sources, transformed the data according to requirements, and loaded it into the data warehouse.
  • Used the IICS Data Integration console effectively to create mapping templates to bring data into the staging layer from different source systems like SQL Server, Oracle, Teradata, Salesforce, flat files, and Excel files.
  • Worked with Informatica PowerExchange integration with IICS to read data from condense files and load it into the Azure SQL Data Warehouse environment.
  • Built human task workflow scenarios and IDQ scorecards to support data remediation.
  • Read and understood the API documentation.
  • Developed several reusable transformations and Mapplets that were used in other mappings.
  • Prepared Technical Design documents and Test cases.
  • Implemented various Performance Tuning techniques.
  • Used SQL tools like TOAD to run SQL queries to view and validate the data loaded into the warehouse.
  • Performed data integration and lead generation from Informatica cloud into Salesforce cloud.
  • Created summarized tables, control tables, staging tables to improve the system performance and as a source for immediate recovery of Teradata database.
  • Created Stored Procedures to transform the Data and worked extensively in T-SQL, PL/SQL for various needs of the transformations while loading the data into Data warehouse.
  • Developed transformation logic as per the requirement, created mappings and loaded data into respective targets.
  • Worked on profiling source data and determining all possible source data values and metadata characteristics.
  • Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
  • Wrote BTEQ scripts and used Teradata utilities like FASTLOAD, MLOAD, and FASTEXPORT.
  • Extensively worked on Unit testing for the Informatica code using SQL Queries and Debugger.
  • Used PMCMD command to run workflows from command line interface.
  • Improved performance through tuning at the mapping and session levels.
  • Worked with UNIX shell scripts extensively for job execution and automation.
  • Coordinated with Autosys team to run Informatica jobs for loading historical data in production.
  • Documented Data Mappings/ Transformations as per the business requirement.
  • Migrated code from Development to Test and, upon validation, to Pre-Production and Production environments.
  • Provided technical assistance to business program users and developed programs for business and technical applications.
  • Responsible for team members’ work assignment and tracking.
  • Written documentation to describe development, logic, testing, changes and corrections.
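The control tables and incremental loads described above typically follow a watermark pattern: a control table records the highest timestamp (or key) loaded per source, and each run extracts only newer rows. A minimal, illustrative Python sketch of that pattern (the control-table shape and names are hypothetical):

```python
def incremental_extract(control, source_name, rows):
    """Watermark-style incremental extract.

    control: dict acting as the control table (source name -> last watermark).
    rows: list of (watermark, payload) tuples from the source extract.
    Returns only rows newer than the stored watermark, then advances it.
    """
    last = control.get(source_name, 0)
    new_rows = [r for r in rows if r[0] > last]
    if new_rows:
        # Advance the watermark only after a successful extract.
        control[source_name] = max(r[0] for r in new_rows)
    return new_rows
```

In PowerCenter/IICS the watermark is usually held in a mapping parameter or parameter file and applied as a source filter in the Source Qualifier.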

Environment: Informatica PowerCenter 10.2, IICS R29, Oracle 12c, MySQL, Teradata 15.1, Teradata SQL Assistant, MSSQL Server 2012, DB2, Erwin 9.6, Control-M, Putty, Linux, Shell Scripting, ClearCase, WinSCP, Notepad++, JIRA, Cloud, Agile, IDQ, Tableau 10, SQL Developer, HP Quality Center, QlikView, Cognos 10, T-SQL.

Confidential, Branchburg, NJ

ETL Informatica Developer

Responsibilities:

  • Worked with business analysts to understand business requirements and transform them into effective technology solutions. Also collaborated closely with managers, architects, and data modelers to understand the business process and functional requirements.
  • Created high-level, low-level design documents, ETL specification documents, data model document and test document.
  • To implement business rules, created robust mappings, Mapplets, and reusable transformations using Informatica PowerCenter and its different transformations, including Joiner, Lookup, Rank, Filter, Router, Expression, Aggregator, Sequence Generator, Sorter, and Update Strategy.
  • Improved and enhanced various jobs in different cycles by creating reusable and common job techniques, extensively using parameters to pass in job-related values at run time.
  • Worked with connected and unconnected look-up and configured the same for implementing complex logic.
  • Worked with Informatica Data Quality (IDQ) 9.1 for data cleansing, data matching, and data conversion.
  • Designed/Developed IDQ reusable mappings to match accounting data based on demographic information.
  • Worked with IDQ on data quality for data cleansing, removing unwanted data, and ensuring the correctness of data.
  • Created tasks in the Workflow Manager, and exported and executed IDQ mappings.
  • Worked on IDQ parsing, standardization, matching, and web services.
  • Imported the mappings developed in data quality (IDQ) to Informatica designer.
  • Worked on Informatica Analyst Tool IDQ, to get score cards report for data issues. Extensively worked on Performance Tuning of ETL Procedures and processes.
  • Experience working with Custom Built Query to load dimensions and Facts in IICS.
  • Worked on various Salesforce.com objects like Accounts, Contacts, Cases, Opportunity.
  • Worked on performance tuning by identifying the bottlenecks in Sources, Targets, and Mapping. Enhanced Performance on Informatica sessions using large data files by using partitions.
  • Actively participated in providing a technical proposal for upgrading existing ETL and OBIEE code at client locations (in order to make use of advanced features of the newer Informatica version).
  • Involved in the design, development and testing of the PL/SQL stored procedures, packages for the ETL processes.
  • Created UNIX shell scripts for ETL jobs, session log cleanup, and dynamic parameters, and maintained shell scripts for data conversion.
  • Managed the integrity of jobs by checking code in and out of the StarTeam version control tool.
  • Maintained the daily batch cycle and provided 24-hour production support.
  • Developed BusinessObjects reports, such as universes and Crystal Reports, for business validation.
  • Worked with the offshore team to oversee development activity and reviewed code to make sure it conformed to the standard programming practices at USAA.
  • Good working experience with Tableau dashboards and extensive use of Tableau features like data blending, extracts, parameters, filters, calculations, context filters, hierarchies, actions, and maps.
  • Built a number of UNIX shell scripts for PL/SQL programs to schedule them on Control-M.
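The IDQ cleansing and standardization work described above boils down to applying rules such as trimming, case-folding, and normalizing formats field by field. A rough, illustrative Python sketch of two such rules (the field names `name` and `phone` and the rules themselves are hypothetical examples, not the project's actual IDQ rule set):

```python
import re

def standardize(record):
    """Apply simple standardization rules to one record (dict)."""
    out = dict(record)
    if out.get("name") is not None:
        # Collapse internal whitespace, trim, and title-case the name.
        out["name"] = " ".join(out["name"].split()).title()
    if out.get("phone") is not None:
        # Keep digits only; flag values that are not 10 digits as bad data.
        digits = re.sub(r"\D", "", out["phone"])
        out["phone"] = digits if len(digits) == 10 else None
    return out
```

In IDQ the equivalent logic lives in Standardizer, Parser, and Labeler transformations, with scorecards reporting how many records fail each rule.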

Environment: Informatica PowerCenter 10.1/9.6, PowerExchange 10.1/9.6, IICS, IDQ 10.1, Oracle 11g, Teradata 14, Netezza, SQL/PLSQL, OBIEE, Salesforce, Flat files, XML files, SQL 2014, UNIX/LINUX, Control-M, Putty, Rally.

Confidential - New York, NY

ETL Informatica Developer/Data Analyst

Responsibilities:

  • Discussing requirements with client.
  • Analyzing requirements, understanding the current system and then preparing the high level and low-level design.
  • Preparing clarification log to avoid misunderstanding between the onsite and offshore team.
  • Understanding the impact of changes to be made on the existing system/tool.
  • Identifying the correct files in the database.
  • Perform an impact analysis for changes on database. Time estimation for each requirement.
  • Generation of test cases for software testing. Ensure appropriate backups are taken to protect data integrity.
  • Extensive experience delivering data warehousing implementations, data migrations, and ETL processes to integrate data across multiple sources using Informatica PowerCenter and Informatica Cloud Services.
  • Experience working with Salesforce connector to read data from Salesforce objects into Cloud Warehouse.
  • Experience working with cloud monitoring, administrator concepts.
  • Developed and Tested Mappings and Workflows as per mapping specification.
  • Experienced in performing the analysis, design, and programming of ETL processes for Teradata.
  • Extensively worked in the Performance Tuning of the ETL mappings and workflows.
  • Involved in writing shell scripts and added these shell scripts in Autosys as scheduled daily, weekly, monthly.
  • Extensively created mapping/mapplets, reusable transformations using transformations like Lookup, Filter, Expression, Stored Procedure, Aggregator, Update Strategy, etc.
  • Strong expertise in using both Connected and Un-Connected Lookup transformations.
  • Extensively worked with various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache.
  • Developed Re-Usable Transformations, Mapplets and Worklets.
  • Used Informatica scheduler to run workflows on UNIX box and monitored the results.
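The lookup caches mentioned above differ mainly in when they are refreshed: a static cache is read-only for the run, while a dynamic cache is updated as rows pass through, so a duplicate key arriving later in the same run is recognized before it hits the target. A minimal, illustrative Python sketch of the dynamic-cache behavior (names are hypothetical):

```python
def route_with_dynamic_cache(rows, existing_keys):
    """Route rows to insert or update using a dynamic-cache-style lookup.

    rows: list of (key, payload) tuples from the source.
    existing_keys: keys already present in the target (seeds the cache).
    """
    cache = set(existing_keys)      # seeded from the target, like a lookup cache
    inserts, updates = [], []
    for key, payload in rows:
        if key in cache:
            updates.append((key, payload))
        else:
            inserts.append((key, payload))
            cache.add(key)          # dynamic: the new key is cached immediately
    return inserts, updates
```

With a static cache the second occurrence of a new key would also be routed to insert, typically causing a duplicate-key failure; that difference is why dynamic caches are used for same-run deduplication.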

Environment: Informatica PowerCenter 9.5.1, Informatica Cloud, Oracle 11g, MS Access, Data Quality 9.6, TOAD 9.0, SQL, T SQL, UNIX, Windows XP, Linux.

Confidential, Watertown, MA

Informatica Developer / Data Analyst

Responsibilities:

  • Using Informatica PowerCenter Designer, analyzed the source data to extract and transform it from various source systems (Oracle 10g, DB2, SQL Server, and flat files), incorporating business rules using different objects and functions that the tool supports.
  • Implemented slowly changing dimensions (SCD) for some of the Tables as per user requirement.
  • Documented Informatica mappings in Excel spreadsheets.
  • Tuned the Informatica mappings for optimal load performance.
  • Used BTEQ, FEXP, FLOAD, and MLOAD Teradata utilities to export and load data to/from flat files.
  • Created and Configured Workflows and Sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.
  • Generated reports using OBIEE 10.1.3 for future business utilities.

Environment: Informatica 9.5.0, Oracle 10g, SQL Server 2005, SQL, T-SQL, PL/SQL, Toad, Erwin 4.x, UNIX, Tortoise SVN, Flat files.
