
Sr. Informatica Developer Resume


Chandler, AZ

SUMMARY

  • Over 9 years of experience in System Analysis, Design, Development, Testing and Implementation of Data Warehouse applications using ETL/Informatica PowerCenter, Informatica Data Quality, Talend Data Integration and Teradata utilities.
  • Experience working with MPP (massively parallel processing) and SMP (symmetric multiprocessing) databases like Oracle, Netezza, Teradata, SQL Server and MySQL.
  • Implemented Data Warehousing Methodologies using ETL/Informatica Designer, Repository Manager, Workflow Manager, Workflow Monitor, and Talend Open Studio Data Integration.
  • Profound knowledge of the Teradata architecture and experience with Teradata utilities like MultiLoad, FastLoad, Teradata Parallel Transporter (TPT) and BTEQ.
  • Expertise in Data Modeling techniques like Dimension and Fact tables, Star Schema and Snowflake Schema, implemented in Oracle, Teradata, Netezza and SQL Server.
  • Worked on Homogeneous & Heterogeneous Source Systems using different databases like Oracle, Netezza, MS SQL Server, DB2, Teradata and Flat Files.
  • Extensively worked on Informatica transformations - Source Qualifier, Connected and Unconnected Lookup, Filter, Expression, Router, Union, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Sorter and Sequence Generator - and created complex mappings.
  • Experienced in Validating, Profiling, Cleansing, Standardizing and Enhancing Address Information, implementing Data Governance practices and other Data Quality operations using IDQ.
  • Experience in creating Talend mappings using tDBConnection, tFileInputDelimited, tMap, tJoin, tReplicate, tConvertType, tAggregate, tSortRow, tFlowMeter, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tHashInput, tHashOutput, tJava, tJavaRow, tAggregateRow, tWarn, tFilter, etc.
  • Implemented Slowly Changing Dimensions Type-1 & Type-2 methodology for preserving the full history of Account and Transaction information.
  • Expertise in implementing complex Business Rules by creating Reusable and Non-Reusable Tasks.
  • Experienced in debugging Informatica mappings for Error Handling, Bug Fixes and Performance Tuning by identifying and resolving bottlenecks at the Source, Target, Mapping and Session levels.
  • Experienced in scheduling the Talend and Informatica jobs using Autosys and Automic applications.
  • Experienced in documenting Functional Specifications, ETL Technical Designs, Mapping Details, Data Validation Test Cases and Code Migration Guides.
  • Hands-on experience in developing and supporting Metadata applications, Data Analyzer applications, IDQ, PowerExchange, etc.
  • Experience in writing PL/SQL and T-SQL Stored Procedures, Functions, Triggers, Cursors and Indexes in Oracle and SQL Server.
  • Experience in using Informatica command line utilities like pmcmd to control workflows (a minimal invocation sketch appears after this list).
  • Involved in migrating Informatica components (Mappings, Workflows) across environments - Dev, SIT, UAT and Prod.
  • Wrote Python scripts, Unix shell scripts and BTEQ scripts to handle source files, run Informatica loads and notify Business users.
  • Good working knowledge of Big Data/Hadoop ecosystem architecture and applications, including Apache Hive, HBase, HDFS, Apache Spark, Sqoop and Oozie.
  • Experience working in Agile (SAFe) and Waterfall methodologies and coordinating with onshore and offshore team members for Project Implementations.
  • Good communication, interpersonal, learning and organizing skills, matched with the ability to manage stress, time and people effectively.
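
A minimal Python 3 sketch of the kind of pmcmd wrapper referenced above. The Integration Service, domain, folder, credentials and workflow name are placeholders, not values from this resume:

```python
import subprocess

# Placeholder arguments for "pmcmd startworkflow" -- replace with real values.
PMCMD_ARGS = [
    "pmcmd", "startworkflow",
    "-sv", "INT_SVC_DEV",     # Integration Service name (placeholder)
    "-d", "Domain_Dev",       # Informatica domain (placeholder)
    "-u", "etl_user",         # repository user (placeholder)
    "-p", "etl_password",     # prefer pmpasswd / environment variables in practice
    "-f", "FOLDER_SALES",     # repository folder (placeholder)
    "-wait",                  # block until the workflow completes
    "wf_daily_load",          # workflow name (placeholder)
]

def run_workflow():
    """Start the workflow and raise if pmcmd exits non-zero."""
    result = subprocess.run(PMCMD_ARGS, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError("pmcmd failed: " + result.stderr)
    print(result.stdout)

if __name__ == "__main__":
    run_workflow()
```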

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter, Informatica Data Quality (IDQ), Talend Data Integration, Test Data Manager (TDM), PowerExchange, and Informatica Analyst (IDE).

Cloud Services: Amazon Web Services (AWS), Salesforce (CRM)

BI & Reporting: Tableau Desktop and Tableau Data Prep

Data Modeling: Relational and Dimensional Modeling, Star and Snowflake schemas, Fact and Dimension tables, Entities, Attributes.

Databases: Oracle, MS SQL Server, Teradata, Hive, MS Access.

Other tools: Autosys, Automic, Toad, Tortoise SVN, GitHub, MS Office, MS Visio, JIRA, ALM, uDeploy, Jenkins

Environments: Windows, Linux

Languages: SQL, PL/SQL, T-SQL, VQL, HQL, OOP, Shell Script, Python

PROFESSIONAL EXPERIENCE

Confidential, Chandler, AZ

Sr. Informatica Developer

Responsibilities:

  • Working with the Business team and Product Owner to gather requirements for report deliveries and providing support across the application.
  • Analyzing the Lending Investments and Commercial Real Estate Systems to Design, Develop, Test and Implement ETL mappings to deliver the Business Requirements.
  • Creating solution design, mapping flow and test case documents for building the Informatica mappings.
  • Designing and creating data models for Metadata tables and target tables in Oracle and SQL Server applications.
  • Creating ETL Informatica mappings and workflows using various Transformations and tasks. (Transformations - Expression, Router, Connected and Unconnected Look-up, Joiner, Aggregator, Rank, HTTP, Web Service, XML Generator, XML Parser etc.).
  • Working on creating ETL mappings using various Salesforce (SFDC) Objects like Property, Account, Reporting Deadline Items, Asset, Investment Operating Management, etc. as a Source and Target.
  • Extracting and processing data from/into third party vendors using XGI calls in HTTP transformation.
  • Created and implemented Data Quality rules to run critical checks before generating the reports, updating Salesforce (SFDC) and notifying the PO and Business users.
  • Enhance ETL Informatica mappings to implement new Business Requirements.
  • Wrote complex SQL scripts to perform unit and integration testing on source-to-target data and notify the application team (see the reconciliation sketch after this list).
  • Deploying the ETL components from Lower environments to higher environments.
  • Wrote Unix shell scripts to process source files into Salesforce (SFDC) and generate the reports for the PO and Business team.
  • Coordinating with upstream and downstream teams to share functional and technical requirements to deploy and implement in higher environments.
  • Writing and implementing Autosys JIL code to automate the batch process in higher environments (SIT, UAT and PROD/BCP).
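
A minimal sketch of the source-to-target reconciliation testing mentioned above, assuming the cx_Oracle driver; the DSNs and table names are placeholders, not values from this project:

```python
import cx_Oracle

# Placeholder connect strings -- replace with the real source/target databases.
SRC_DSN = "src_user/src_pass@src-host:1521/SRCPDB"
TGT_DSN = "tgt_user/tgt_pass@tgt-host:1521/TGTPDB"

def row_count(dsn, table):
    """Return a simple row count; a fuller test would also compare column checksums."""
    with cx_Oracle.connect(dsn) as conn:
        cur = conn.cursor()
        cur.execute("SELECT COUNT(*) FROM " + table)  # table names come from config, not user input
        return cur.fetchone()[0]

def reconcile(src_table, tgt_table):
    src, tgt = row_count(SRC_DSN, src_table), row_count(TGT_DSN, tgt_table)
    if src != tgt:
        # In practice, a mismatch would be reported to the application team.
        raise ValueError("MISMATCH: %s=%d vs %s=%d" % (src_table, src, tgt_table, tgt))
    print("OK: %d rows in both %s and %s" % (src, src_table, tgt_table))

if __name__ == "__main__":
    reconcile("STG_ACCOUNTS", "DW_ACCOUNTS")  # hypothetical table names
```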

Environment: Informatica 10.4, AWS, Salesforce (SFDC), Oracle, SQL Server, SQL Developer, PL/SQL, Jira, Autosys, uDeploy, Jenkins, GitHub, Putty.

Confidential, Phoenix AZ

DW System Analyst | Lead Informatica Developer

Responsibilities:

  • Working with Business Analytics Team and Architects for Requirements gathering, business analysis, Testing, and Deliverables.
  • Analyzing Global Rebates and Rewards Data warehouse system for Financial Reporting.
  • Creating Fact and Dimension tables and working with DBA team to create Physical models in Dev, SIT, UAT and PROD Environments.
  • Wrote masking logic to mask data in lower environments (SIT and UAT) using Informatica Test Data Manager.
  • Designed and developed a reusable framework for Financial Data Engineering (balancing & checks) between Source and Target.
  • Extracting data from RDBMS (Netezza and Oracle) and loading data into Netezza using Informatica PowerCenter and Talend Data Integration.
  • Created simple and complex Informatica mappings to process History data to downstream system.
  • Implemented SCD Type-2 methodology for capturing Reward Account and Card Member information to maintain history data (a sketch of the Type-2 pattern appears after this list).
  • Used parallel processing capabilities, Session-partitioning and Target table partitioning techniques.
  • Created Unit Test Cases and Integration Test cases to ensure the successful Execution of data loading process.
  • Involved in creating Balance and Control framework using Python for Data Comparison between source and target systems.
  • Performance tuning activities, which include identifying the bottlenecks and tuning the ETL code for best performance.
  • Working on Session logs for Identifying Bottlenecks, Error handling and troubleshooting in E1 and E2 Environments.
  • Migrating the Informatica Mappings, and Workflows from Dev to Test and Prod by using LARA.
  • Preparing source-to-target mapping documents for the Business team to aid ETL logic understanding.
  • Created ETL jobs for generating flat files using Talend Data Integration components (tFileOutputDelimited, tFileInputDelimited, tMap, tJoin, tDBConnection, tJavaRow, tUnite, tJavaFlex, etc.).
  • Created Business Intelligence Dashboards and Reports for analyzing the Global Rewards data for spend Analysis.
  • Created data analytical views, dashboards, workbooks and reports on COVID-19 impact and Rewards transactions.
  • Validated the generated target data files using the NumPy and pandas libraries in Python (see the validation sketch after this list).
  • Wrote Unix shell scripts to automate the ETL process: parameter file updates and moving files downstream.
  • Scheduled and monitored the ETL jobs in Production post-install and fixed production issues.
  • Creating Change Request for Deploying the Development code into Production using ServiceNow.
  • Working in Agile SAFe methodology, coordinating with multiple teams for Project Implementations and Deliverables.
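
A minimal pandas sketch of the SCD Type-2 pattern referenced above. The production logic lived in Informatica mappings; the column names (is_current, start_date, end_date) and key/attribute names are assumptions for illustration:

```python
import pandas as pd

HIGH_DATE = pd.Timestamp("9999-12-31")  # open-ended "current row" end date

def scd2_apply(dim, incoming, key, attrs, load_date):
    """Expire current rows whose tracked attributes changed, then append new versions.

    dim:      dimension frame with key, attrs, start_date, end_date, is_current
    incoming: latest snapshot with key and attrs
    """
    cur = dim[dim["is_current"]].merge(incoming, on=key, suffixes=("", "_new"))
    changed = (cur[list(attrs)].values !=
               cur[[a + "_new" for a in attrs]].values).any(axis=1)
    changed_keys = cur.loc[changed, key]
    # 1. Close out the old versions of the changed keys.
    expire = dim[key].isin(changed_keys) & dim["is_current"]
    dim.loc[expire, ["end_date", "is_current"]] = [load_date, False]
    # 2. Append the new versions (brand-new keys omitted for brevity).
    new_rows = incoming[incoming[key].isin(changed_keys)].copy()
    new_rows["start_date"] = load_date
    new_rows["end_date"] = HIGH_DATE
    new_rows["is_current"] = True
    return pd.concat([dim, new_rows], ignore_index=True)
```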
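
And a small sketch of the post-generation file validation with NumPy and pandas; the file name, delimiter and column names are hypothetical:

```python
import numpy as np
import pandas as pd

# Hypothetical target file and columns -- adjust for the real feed.
df = pd.read_csv("rewards_target.dat", sep="|")

checks = {
    "row count > 0": len(df) > 0,
    "no duplicate keys": not df["account_id"].duplicated().any(),
    "no null amounts": df["reward_amount"].notna().all(),
    "amounts are finite": bool(np.isfinite(df["reward_amount"]).all()),
}
failed = [name for name, ok in checks.items() if not ok]
if failed:
    raise ValueError("Validation failed: " + ", ".join(failed))
print("All %d checks passed on %d rows" % (len(checks), len(df)))
```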

Environment: Informatica Power Center 10.2, Informatica Test Data Management, Informatica Analyst, Talend Data Integration 7.1, Oracle 12c/11g, Netezza, Hive, Python 2.7, Automic, LARA, MS-Visio, ServiceNow, Putty, WinSCP, Tableau and GitHub.

Confidential, Fremont CA

Sr. Informatica Developer

Responsibilities:

  • Working with Business teams and Production teams for requirements gathering and project coordination.
  • Analyzing Business Requirement documents and working on Jira and ALM tickets for Informatica code changes and bug fixes to load the correct Business LOBs' data into STG tables.
  • Worked on integrating various heterogeneous source systems like Oracle, Teradata, XML files and HDFS flat files into target databases (Oracle and Teradata).
  • Wrote a Python script to download the files from the partner system and check the volume counts between data files and control files (see the sketch after this list).
  • Wrote shell scripts to extract the files from Hadoop Distributed File System (HDFS) and handle the delimiter changes to import files into the Informatica source directory.
  • Extracting the source data, loading it into Metadata tables, and generating XML files to import the mappings into Informatica Power Center.
  • Designed and developed simple and complex ETL Informatica Data Quality (IDQ) mappings using various transformations (Decision, Merge, Match, Address Validator, Parser, Labeler, Standardizer, Case Converter, Comparison, Duplicate Record Exception, Key Generator, etc.).
  • Deployed the Data Quality mappings into the PowerCenter Designer client.
  • Designed and developed simple and complex ETL Informatica PowerCenter mappings and workflows using various transformations and tasks, loading data into STG and BASE tables.
  • Created Informatica mappings with SCD Type-1 and Type-2 methodology to track the historical data.
  • Managed the Meta data associated with the ETL processes used to populate the Data Warehouse.
  • Improved session performance by identifying bottlenecks at the Source Qualifier and transformation levels, using the Debugger to debug the Informatica mappings and fix them.
  • Using PMCMD Commands to run Informatica Jobs In DEV, SIT, UAT and PROD Environments.
  • Creating parameter files and Autosys jobs to run the pre-processing scripts and Informatica workflows dynamically for daily and monthly loads.
  • Working with the Functional team to make sure the required data has been extracted, transformed and loaded into the target, performing the unit tests and fixing the errors to meet the Business Requirements.
  • Prepared unit test queries to test and validate the target data in DEV, SIT, UAT and PROD environments and documented them for the Business.
  • Created Functional Specification documents and Mapping documents to help Business users understand the customer data.
  • Worked on JIRA and ALM tickets for enhancements and Informatica code bug fixes.
  • Experience working with a Java framework for creating and generating the Informatica mappings and workflows.
  • Deployed the Informatica components, test queries, parameter files, metadata scripts, shell scripts and Autosys code to higher environments (DEV, SIT, UAT and PROD).
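
A minimal sketch of the data-file vs. control-file volume check described above, assuming a simple convention where the control file's first line carries the expected record count; file names and the header convention are placeholders:

```python
import os

def read_expected_count(ctl_path):
    """Assumes the control file's first line holds the expected record count."""
    with open(ctl_path) as f:
        return int(f.readline().strip())

def count_data_records(dat_path, header_lines=1):
    """Count records in the data file, skipping any header lines."""
    with open(dat_path) as f:
        return sum(1 for _ in f) - header_lines

def verify(dat_path, ctl_path):
    expected = read_expected_count(ctl_path)
    actual = count_data_records(dat_path)
    if expected != actual:
        raise ValueError("Volume mismatch for %s: control says %d, file has %d"
                         % (os.path.basename(dat_path), expected, actual))
    print("Volume check passed: %d records" % actual)

if __name__ == "__main__":
    verify("partner_feed.dat", "partner_feed.ctl")  # hypothetical file names
```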

Environment: Informatica Power Center 9.6.1, Data Quality, Talend Data Integration 6.7, Java Framework, Oracle 11g, Python 2.7, Teradata 15.0, XML files, SQL Server 2008, SSMS, Flat files, Excel, COBOL files, Unix, Windows, Autosys, Jira and ALM.

Confidential

SQL/Informatica Developer

Responsibilities:

  • Gathering requirements from the Business team and downstream applications for technical analysis.
  • Analyzed the Source data from multiple Source Systems.
  • Wrote SQL queries to extract data from relational databases and flat files.
  • Designed logical and physical data models for Star and Snowflake schemas using Erwin.
  • Created Packages and stored Procedures to Implement Data Quality checks.
  • Worked on Informatica Power Center Client tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Involved in creating logical and physical structure tables for data loads.
  • Used various Transformations like Filter, Expression, Router, Normalizer, Sequence Generator, Update Strategy, Joiner, and Union to develop Mappings in the Informatica Designer.
  • Implemented Change Data Capture methodologies: Slowly Changing Dimension Type-1 and Type-2.
  • Worked on different tasks in workflows like Sessions, Event Raise, Event Wait, Decision, E-mail, Command, Worklets, Assignment, Timer and scheduling of the workflow.
  • Used Debugger to test the mappings and fixed the bugs.
  • Wrote Unix shell scripts and pmcmd commands for FTP of files from the remote server and backup of the repository and folders (see the sketch after this list).
  • Created Source and target Queries to validate data after the Post load.
  • Prepared migration document to move the Informatica Components from lower to higher Environment.
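
A minimal sketch of pulling files from a remote server as described above, using Python 3's standard ftplib instead of a shell FTP client; the host, credentials and paths are placeholders:

```python
import ftplib
import os

# Placeholder connection details -- replace with the real host and credentials.
HOST, USER, PASSWORD = "ftp.example.com", "etl_user", "secret"
REMOTE_DIR, LOCAL_DIR = "/outbound", "/data/informatica/SrcFiles"

def fetch_files(suffix=".csv"):
    """Download every remote file whose name ends with `suffix`."""
    with ftplib.FTP(HOST, USER, PASSWORD) as ftp:
        ftp.cwd(REMOTE_DIR)
        for name in ftp.nlst():
            if name.endswith(suffix):
                local_path = os.path.join(LOCAL_DIR, name)
                with open(local_path, "wb") as out:
                    ftp.retrbinary("RETR " + name, out.write)
                print("Downloaded", name)

if __name__ == "__main__":
    fetch_files()
```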

Environment: Informatica Power Center 8.6/8.1, Oracle 10g, TOAD, PL/SQL, SQL Server, CSV Files, Erwin, UNIX, Microsoft Office.
