
Sr. Informatica Developer Resume


Chandler, AZ

SUMMARY

  • Over 9 years of experience in system analysis, design, development, testing, and implementation of data warehouse applications using Informatica PowerCenter, Informatica Data Quality, Talend Data Integration, and Teradata utilities.
  • Experience working with MPP (massively parallel processing) and SMP (symmetric multiprocessing) databases such as Oracle, Netezza, Teradata, SQL Server, and MySQL.
  • Implemented data warehousing methodologies using Informatica Designer, Repository Manager, Workflow Manager, Workflow Monitor, and Talend Open Studio for Data Integration.
  • Profound knowledge of the Teradata architecture and experience with Teradata utilities such as MultiLoad, FastLoad, Teradata Parallel Transporter (TPT), and BTEQ.
  • Expertise in data modeling techniques such as dimension and fact tables, star schema, and snowflake schema, implemented in Oracle, Teradata, Netezza, and SQL Server.
  • Worked on homogeneous and heterogeneous source systems using databases such as Oracle, Netezza, MS SQL Server, DB2, Teradata, and flat files.
  • Extensively worked on Informatica transformations, including Source Qualifier, connected and unconnected Lookup, Filter, Expression, Router, Union, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Sorter, and Sequence Generator, and created complex mappings.
  • Experienced in validating, profiling, cleansing, standardizing, and enhancing address information, implementing data governance practices, and other data quality operations using IDQ.
  • Experience in creating Talend mappings using tDBConnection, tFileInputDelimited, tMap, tJoin, tReplicate, tConvertType, tSortRow, tFlowMeter, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tHashInput, tHashOutput, tJava, tJavaRow, tAggregateRow, tWarn, tFilter, etc.
  • Implemented Slowly Changing Dimension Type-1 and Type-2 methodologies for accessing the full history of account and transaction information.
  • Expertise in implementing complex business rules by creating reusable and non-reusable tasks.
  • Experienced in debugging Informatica mappings for error handling, bug fixes, and performance tuning, identifying and resolving bottlenecks at the source, target, mapping, and session levels.
  • Experienced in scheduling Talend and Informatica jobs using AutoSys and Automic.
  • Experienced in documenting functional specifications, ETL technical designs, mapping details, data validation test cases, and code migration guides.
  • Hands-on experience in developing and supporting metadata applications, Data Analyzer, IDQ, PowerExchange, etc.
  • Experience writing PL/SQL and T-SQL stored procedures, functions, triggers, cursors, and indexes in Oracle and SQL Server.
  • Experience using Informatica command-line utilities such as pmcmd to control workflows (see the sketch after this list).
  • Involved in migrating Informatica components (mappings, workflows) through environments such as Dev, SIT, UAT, and Prod.
  • Wrote Python, Unix shell, and BTEQ scripts to handle source files, drive Informatica loads, and notify business users.
  • Good working knowledge of the Big Data/Hadoop ecosystem architecture and applications, including Apache Hive, HBase, HDFS, Apache Spark, Sqoop, and Oozie.
  • Experience working in Agile (SAFe) and Waterfall methodologies and coordinating with onshore and offshore team members on project implementations.
  • Good communication, interpersonal, learning, and organizing skills, matched with the ability to manage stress, time, and people effectively.
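
Workflow runs like those above are typically triggered through pmcmd from a wrapper script. Below is a minimal Python sketch of that pattern; pmcmd startworkflow and its flags are standard Informatica CLI usage, but the service, domain, folder, and workflow names are hypothetical placeholders.

    import os
    import subprocess
    import sys

    def start_workflow(folder, workflow):
        """Start an Informatica workflow via pmcmd and wait for completion."""
        cmd = [
            "pmcmd", "startworkflow",
            "-sv", "INT_SVC_DEV",           # integration service name (placeholder)
            "-d", "Domain_Dev",             # Informatica domain (placeholder)
            "-u", os.environ["INFA_USER"],  # credentials kept out of the script
            "-p", os.environ["INFA_PASS"],
            "-f", folder,                   # repository folder
            "-wait",                        # block until the workflow finishes
            workflow,
        ]
        return subprocess.call(cmd)         # pmcmd exits 0 on success

    if __name__ == "__main__":
        sys.exit(start_workflow("FOLDER_DW", "wf_load_accounts"))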

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter, Informatica Data Quality (IDQ), Talend Data Integration, Test Data Manager (TDM), PowerExchange, and Informatica Analyst (IDE).

Cloud Services: Amazon Web Services (AWS), Salesforce (CRM)

BI & Reporting: Tableau Desktop and Tableau Data Prep

Data Modeling: Relational and Dimensional Modeling, Star and Snowflake schemas, Fact and Dimension tables, Entities, Attributes.

Databases: Oracle, MS SQL Server, Teradata, Hive, MS Access.

Other Tools: AutoSys, Automic, Toad, TortoiseSVN, GitHub, MS Office, MS Visio, JIRA, ALM, uDeploy, Jenkins

Environments: Windows, Linux

Languages: SQL, PL/SQL, T-SQL, VQL, HQL, OOP, Shell Script, Python

PROFESSIONAL EXPERIENCE

Confidential, Chandler, AZ

Sr. Informatica Developer

Responsibilities:

  • Working with the business team and Product Owner to gather requirements for report deliveries and to support the application.
  • Analyzing the Lending Investments and Commercial Real Estate systems to design, develop, test, and implement ETL mappings that deliver the business requirements.
  • Creating solution design, mapping flow, and test case documents for the Informatica mappings.
  • Designing and creating data models for metadata tables and target tables in the Oracle and SQL Server applications.
  • Creating Informatica ETL mappings and workflows using various transformations and tasks (transformations: Expression, Router, connected and unconnected Lookup, Joiner, Aggregator, Rank, HTTP, Web Service, XML Generator, XML Parser, etc.).
  • Creating ETL mappings using various Salesforce (SFDC) objects such as Property, Account, Reporting Deadline Items, Asset, and Investment Operating Management as both sources and targets.
  • Extracting and processing data to and from third-party vendors using XGI calls in the HTTP transformation.
  • Creating and implementing data quality rules to run critical checks before generating reports, updating Salesforce (SFDC), and notifying the PO and business users (a sketch of this style of check follows this list).
  • Enhancing Informatica ETL mappings to implement new business requirements.
  • Writing complex SQL scripts to perform unit and integration testing on source-to-target data and notifying the application team.
  • Deploying the ETL components from lower to higher environments.
  • Writing Unix shell scripts to process source files into Salesforce (SFDC) and generate reports for the PO and business team.
  • Coordinating with upstream and downstream teams to share functional and technical requirements for deployment in higher environments.
  • Creating and implementing AutoSys JIL code to automate the batch process in higher environments (SIT, UAT, and PROD/BCP).
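
A minimal sketch of the style of pre-report data quality check described above, in plain Python; the field names and rules (non-null account_id, non-negative amount) are hypothetical stand-ins for the actual business rules.

    def run_dq_checks(records):
        """Split records into clean and failed (record, errors) pairs."""
        clean, failed = [], []
        for rec in records:
            errors = []
            if not rec.get("account_id"):           # rule 1: key must be present
                errors.append("missing account_id")
            amount = rec.get("amount")
            if amount is None or amount < 0:        # rule 2: amount must be valid
                errors.append("invalid amount")
            (failed if errors else clean).append((rec, errors))
        return clean, failed

    if __name__ == "__main__":
        sample = [
            {"account_id": "A100", "amount": 2500.0},
            {"account_id": None,   "amount": 130.0},
            {"account_id": "A200", "amount": -5.0},
        ]
        clean, failed = run_dq_checks(sample)
        print("clean=%d failed=%d" % (len(clean), len(failed)))
        for rec, errors in failed:                  # feed these into the notification step
            print("FAILED:", rec, "->", ", ".join(errors))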

Environment: Informatica 10.4, AWS, Salesforce (SFDC), Oracle, SQL Server, SQL Developer, PL/SQL, Jira, AutoSys, uDeploy, Jenkins, GitHub, PuTTY.

Confidential, Phoenix, AZ

DW System Analyst | Lead Informatica Developer

Responsibilities:

  • Working with the Business Analytics team and architects on requirements gathering, business analysis, testing, and deliverables.
  • Analyzing the Global Rebates and Rewards data warehouse system for financial reporting.
  • Creating fact and dimension tables and working with the DBA team to create physical models in the Dev, SIT, UAT, and PROD environments.
  • Wrote masking logic to mask data in the lower environments (SIT and UAT) using Informatica Test Data Manager.
  • Designing and developing a reusable framework for financial data engineering (balancing and checks) between source and target.
  • Extracting data from RDBMS sources (Netezza and Oracle) and loading it into the Netezza database using Informatica PowerCenter and Talend Data Integration.
  • Created simple and complex Informatica mappings to process history data for the downstream system.
  • Implemented SCD Type-2 methodology for capturing reward account and card member information to maintain history data.
  • Used parallel processing capabilities, session partitioning, and target table partitioning techniques.
  • Created unit and integration test cases to ensure successful execution of the data loading process.
  • Involved in creating a balance and control framework in Python for data comparison between source and target systems (see the Pandas sketch after this list).
  • Performance tuning activities, including identifying bottlenecks and tuning the ETL code for best performance.
  • Working with session logs to identify bottlenecks, handle errors, and troubleshoot in the E1 and E2 environments.
  • Migrating the Informatica mappings and workflows from Dev to Test and Prod using LARA.
  • Preparing source-to-target mapping documents for the business team to explain the ETL logic.
  • Created ETL jobs for generating flat files using Talend Data Integration components (tFileOutputDelimited, tFileInputDelimited, tMap, tJoin, tDBConnection, tJavaRow, tUnite, tJavaFlex, etc.).
  • Created business intelligence dashboards and reports for analyzing the Global Rewards data for spend analysis.
  • Created data analytics views, dashboards, workbooks, and reports on COVID-19 impact and rewards transactions.
  • Validated the generated target data files using the NumPy and Pandas libraries in Python.
  • Wrote Unix shell scripts to automate the ETL process for parameter file updates and for moving files downstream.
  • Scheduled and monitored the ETL jobs in production after the post-install and fixed production issues.
  • Created change requests in ServiceNow for deploying the development code into production.
  • Working in the Agile SAFe methodology, coordinating with multiple teams on project implementations and deliverables.
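
A minimal Pandas sketch of the balance-and-control comparison described above; the extract file names and the key/measure column names are hypothetical assumptions.

    import pandas as pd

    def balance_check(src_path, tgt_path, key="account_id", measure="amount"):
        """Compare row counts, measure totals, and key coverage between extracts."""
        src = pd.read_csv(src_path)
        tgt = pd.read_csv(tgt_path)
        counts_match = len(src) == len(tgt)                        # row-count balancing
        totals_match = abs(src[measure].sum() - tgt[measure].sum()) < 0.01
        # Keys present on only one side indicate dropped or spurious rows.
        merged = src.merge(tgt, on=key, how="outer",
                           indicator=True, suffixes=("_src", "_tgt"))
        mismatches = merged[merged["_merge"] != "both"]
        return counts_match, totals_match, mismatches

    if __name__ == "__main__":
        ok_counts, ok_totals, bad = balance_check("source_extract.csv", "target_extract.csv")
        print("counts match:", ok_counts, "| totals match:", ok_totals)
        if not bad.empty:
            print(bad[["account_id", "_merge"]].to_string(index=False))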

Environment: Informatica PowerCenter 10.2, Informatica Test Data Management, Informatica Analyst, Talend Data Integration 7.1, Oracle 12c/11g, Netezza, Hive, Python 2.7, Automic, LARA, MS Visio, ServiceNow, PuTTY, WinSCP, Tableau, and GitHub.

Confidential, Fremont, CA

Sr. Informatica Developer

Responsibilities:

  • Working with business and production teams on requirements gathering and project coordination.
  • Analyzing business requirement documents and working on Jira and ALM tickets for Informatica code changes and bug fixes to load the correct business LOBs' data into STG tables.
  • Worked on integrating various heterogeneous source systems (Oracle, Teradata, XML files, and HDFS flat files) into the target databases (Oracle and Teradata).
  • Wrote a Python script to download the files from the partner system and check the volume counts between data files and control files (see the sketch after this list).
  • Wrote shell scripts to extract files from the Hadoop Distributed File System (HDFS) and handled delimiter changes to import the files into the Informatica source directory.
  • Extracting the source data, loading it into metadata tables, and generating XML files for importing the mappings into Informatica PowerCenter.
  • Designed and developed simple and complex Informatica Data Quality (IDQ) mappings using various transformations (Decision, Merge, Match, Address Validator, Parser, Labeler, Standardizer, Case Converter, Comparison, Duplicate Record Exception, Key Generator, etc.).
  • Deployed the data quality mappings into the PowerCenter Designer client.
  • Designed and developed simple and complex Informatica PowerCenter mappings and workflows using various transformations and tasks to load data into the STG and BASE tables.
  • Created Informatica mappings with SCD Type-1 and Type-2 methodologies to track the historical data.
  • Managed the metadata associated with the ETL processes used to populate the data warehouse.
  • Improving session performance by identifying bottlenecks at the Source Qualifier and transformation levels, using the Debugger to debug the Informatica mappings and fix them.
  • Using pmcmd commands to run Informatica jobs in the DEV, SIT, UAT, and PROD environments.
  • Creating parameter files and AutoSys jobs to run the pre-processing scripts and Informatica workflows dynamically for daily and monthly loads.
  • Working with the functional team to make sure the required data is extracted, transformed, and loaded into the target, performing unit tests and fixing errors to meet the business requirements.
  • Prepared unit test queries to test and validate the target data in the DEV, SIT, UAT, and PROD environments and documented them for business reference.
  • Created functional specification documents and mapping documents to help business users understand the customer data.
  • Worked on JIRA and ALM tickets to enhance the Informatica code and implement bug fixes.
  • Experience working with a Java framework for creating and generating the Informatica mappings and workflows.
  • Deployed the Informatica components, test queries, parameter files, metadata scripts, shell scripts, and AutoSys code to the higher environments (DEV, SIT, UAT, and PROD).
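
A minimal sketch of the data-file vs. control-file volume check described above; the file names, and the assumption that the control file carries the expected record count on a single line, are hypothetical.

    import sys

    def count_data_rows(data_path, has_header=True):
        """Count non-empty records in the data file, excluding an optional header."""
        with open(data_path) as f:
            rows = sum(1 for line in f if line.strip())
        return rows - 1 if has_header else rows

    def expected_rows(control_path):
        """Read the expected record count from the control file (assumed layout)."""
        with open(control_path) as f:
            return int(f.read().strip())

    if __name__ == "__main__":
        actual = count_data_rows("partner_feed.dat")      # placeholder file names
        expected = expected_rows("partner_feed.ctl")
        if actual != expected:
            print("COUNT MISMATCH: expected %d, got %d" % (expected, actual))
            sys.exit(1)   # non-zero exit lets the batch scheduler raise an alert
        print("Volume check passed: %d records" % actual)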

Environment: Informatica PowerCenter 9.6.1, Informatica Data Quality, Talend Data Integration 6.7, Java framework, Oracle 11g, Python 2.7, Teradata 15.0, XML files, SQL Server 2008, SSMS, flat files, Excel, COBOL files, Unix, Windows, AutoSys, Jira, and ALM.

Confidential

SQL/Informatica Developer

Responsibilities:

  • Gathered requirements from the business team and downstream applications for technical analysis.
  • Analyzed the source data from multiple source systems.
  • Wrote SQL queries to extract data from relational databases and flat files.
  • Designed logical and physical data models for Star and Snowflake schemas using Erwin.
  • Created packages and stored procedures to implement data quality checks.
  • Worked on the Informatica PowerCenter client tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Involved in creating logical and physical table structures for the data load.
  • Used various transformations such as Filter, Expression, Router, Normalizer, Sequence Generator, Update Strategy, Joiner, and Union to develop mappings in the Informatica Designer.
  • Implemented change data capture methodologies: Slowly Changing Dimension Type-1 and Type-2 (a minimal Type-2 sketch follows this list).
  • Worked on different workflow tasks such as Session, Event Raise, Event Wait, Decision, Email, Command, Worklet, Assignment, and Timer, and on scheduling the workflow.
  • Used the Debugger to test the mappings and fix the bugs.
  • Wrote Unix shell scripts and pmcmd commands for FTPing files from the remote server and backing up the repository and folders.
  • Created source and target queries to validate the data after the post-load.
  • Prepared a migration document to move the Informatica components from the lower to the higher environment.
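
A minimal Pandas sketch of the SCD Type-2 pattern referenced above; the dimension layout (start_date/end_date with a 9999-12-31 high date) and the column names are hypothetical assumptions, and in the projects themselves this logic lived in Informatica mappings rather than Python.

    import pandas as pd

    HIGH_DATE = "9999-12-31"

    def scd2_apply(dim, incoming, key, attr, load_date):
        """Expire the current row when `attr` changes, then insert the new version.
        Brand-new keys are ignored here to keep the sketch short."""
        current = dim["end_date"] == HIGH_DATE
        merged = dim.loc[current, [key, attr]].merge(incoming, on=key, suffixes=("_old", ""))
        changed_keys = merged.loc[merged[attr + "_old"] != merged[attr], key]
        # Close out the current version of every changed key.
        dim.loc[current & dim[key].isin(changed_keys), "end_date"] = load_date
        # Append the new versions with an open-ended end date.
        new_rows = incoming[incoming[key].isin(changed_keys)].copy()
        new_rows["start_date"] = load_date
        new_rows["end_date"] = HIGH_DATE
        return pd.concat([dim, new_rows], ignore_index=True)

    if __name__ == "__main__":
        dim = pd.DataFrame({"cust_id": [1, 2],
                            "address": ["12 Oak St", "9 Elm Ave"],
                            "start_date": ["2020-01-01", "2020-01-01"],
                            "end_date": [HIGH_DATE, HIGH_DATE]})
        incoming = pd.DataFrame({"cust_id": [2], "address": ["77 Pine Rd"]})
        print(scd2_apply(dim, incoming, "cust_id", "address", "2024-01-15"))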

Environment: Informatica PowerCenter 8.6/8.1, Oracle 10g, TOAD, PL/SQL, SQL Server, CSV files, Erwin, UNIX, Microsoft Office.
