
ETL/Informatica Developer Resume


Dallas, TX

SUMMARY

  • ETL Developer with over 4 years of IT experience in the analysis, design, development, testing, and implementation of business application systems.
  • Good exposure to AWS Cloud services (IAM, S3, RDS, EC2, Redshift, Glue, Athena, etc.).
  • Experience in creating data lakes using AWS Glue with data sources such as S3, DynamoDB, and Redshift.
  • Experience in creating ETL jobs in AWS Glue, applying simple Python transformations to source data from data lakes and storing the results in S3.
  • Ran SQL queries in AWS Athena against databases cataloged by AWS Glue.
  • Experience in Data Warehousing and ETL application development, Data Modeling, Data Analysis, Reporting, ETL Administration, Support, Maintenance, Testing and Documentation.
  • Performed all dimensions of development including Extraction, Transformation and Loading (ETL) data from various sources into Data Warehouses and Data Marts using Power Center (Repository Manager, Designer, Server Manager, Workflow Manager and Workflow Monitor).
  • Experienced in developing complex business rules through Informatica transformations, Workflows/Worklets, Mappings/Mapplets, Test Plans, Test Strategies and Test Cases for Data Warehousing projects.
  • Strong SQL knowledge, developed complex queries for analysis and data extraction.
  • Experience developing PL/SQL stored procedures and enabling them to be called from Informatica PowerCenter.
  • Technical expertise in ETL includes development of complex requirement-based code in Informatica, performance tuning, and creation of reusable components to reduce cost.
  • Proficient in creating UNIX shell scripts; strong experience coding SQL and PL/SQL procedures, functions, triggers, and packages.
  • Experienced in job scheduling using Informatica Scheduler on Linux environments.
  • Worked as an Informatica admin; involved in upgrading Informatica PowerCenter from version 9.1.0 to 9.6.1 and coordinated with Informatica vendors during outages.
  • Loaded unstructured data into the Hadoop Distributed File System (HDFS) and performed data quality checks.
  • Experience in creating shell scripts to validate daily target table counts against source data counts.
  • Made ad hoc mapping changes during the UAT phase.
  • Developed Tableau visualizations and dashboards using Tableau Desktop during my master’s course.
  • Prepared Dashboards and worksheets in Tableau by creating action filters, parameters and calculated sets.
  • Strong understanding of advanced Tableau features including calculated fields, parameters, table calculations, row-level security, R integration, joins, data blending, and dashboard actions.
  • Excellent team player and self-starter with the ability to work independently; strong analytical, problem-solving, and logical skills; manages business expectations with a delivery-focused approach. Self-motivated and able to set effective priorities to achieve immediate and long-term goals and meet operational deadlines.
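The "simple transformations in Python" applied to data-lake sources before landing in S3 (mentioned above) can be illustrated with a minimal, plain-Python sketch. The field names and cleanup rules here are hypothetical, not taken from any specific project:

```python
# Sketch of the kind of record cleanup an ETL job might apply before
# writing to S3. Field names and rules are illustrative assumptions.

def clean_record(record):
    """Trim string fields and normalize empty strings to None."""
    cleaned = {}
    for key, value in record.items():
        if isinstance(value, str):
            value = value.strip() or None
        cleaned[key] = value
    return cleaned

def transform(records, key_field="customer_id"):
    """Drop records missing the key field, then clean each one."""
    return [clean_record(r) for r in records if r.get(key_field)]
```

In a real Glue job the same logic would typically run over a DynamicFrame or Spark DataFrame rather than plain dicts; the shape of the transformation is the same.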

TECHNICAL SKILLS

Languages: SQL, Java, C

Scripting: UNIX shell scripting

Databases: Oracle 10g, SQL Server, DB2, DynamoDB, Redshift

Cloud Technology: AWS Cloud

ETL Tools: Informatica PowerCenter 9.6.1, 10.1.1, BDE, AWS Glue

Other Tools/Skills: Tableau, HDFS, Hive, SQL Developer, Toad, PL/SQL, AWS Athena

Methodologies: Agile

Operating Systems: Windows 95/98/2000/NT/XP/7/8, MS-DOS, iOS

PROFESSIONAL EXPERIENCE

Confidential, Dallas, TX

ETL/Informatica Developer

Responsibilities:

  • Worked with Data Architects to understand and identify the source extracts.
  • Understood and interpreted requirements to accurately design the mapping data flow.
  • Designed and maintained the ETL strategies.
  • Developed mappings on Informatica Power Center for different layers like MF to Hadoop Data Lake and then Hadoop to Report Mart.
  • Loaded unstructured data into the Hadoop Distributed File System (HDFS).
  • Created separate mappings as part of incremental and full loads using SCD Type 1 logic.
  • Created unit test cases (UTCs) for the Data Lake and Report Mart table loads.
  • Monitored daily production jobs and validated them through production databases and ETL job control.
  • Created and scheduled workflows using Workflow Manager to load the data into the Target Database.
  • Tuned Informatica mappings and sessions for optimum performance.
  • Worked on PL/SQL stored procedures and enabled them to be called from Informatica Power Center.
  • Enhanced and fine-tuned complex SQL queries.
  • Performed data quality checks per Report Mart data standards and suggested necessary fixes to the business where needed.
  • Created shell scripts to validate daily target table counts against source data counts.
  • Performed Unit testing for the code developed and prepared test case documents for the same.
  • Made ad hoc mapping changes during the UAT phase and delivered the necessary validation reports to the business on time.
  • Tested the code against different databases, i.e., DB2 and big data sources.
  • Worked in Agile methodology; attended Scrum and stand-up meetings.
  • Prepared Project Overview documents detailing the structure and flow of the Auto Warranty project.
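The SCD Type 1 logic used for the incremental and full loads above amounts to an overwrite-on-match upsert keyed on a business key. A minimal Python sketch, with dicts standing in for the target dimension table and illustrative column names:

```python
# Sketch of SCD Type 1 (overwrite) upsert logic, keyed on a business key.
# Table shape and column names ("cust_key") are illustrative assumptions.

def scd_type1_upsert(target, incoming, key="cust_key"):
    """Insert new rows; overwrite non-key attributes on existing rows."""
    by_key = {row[key]: dict(row) for row in target}
    for row in incoming:
        by_key[row[key]] = dict(row)  # Type 1: latest values win, no history
    return list(by_key.values())
```

In PowerCenter the same effect is achieved with a Lookup on the target plus an Update Strategy transformation routing rows to insert or update.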

Environment: Informatica PowerCenter, SQL Server, Oracle, Workflow Manager, HDFS, Informatica Power Connect/Power Exchange, Agile, SQL, PL/SQL, UNIX Scripting and Windows.

Confidential, San Francisco, CA

ETL/Informatica Developer

Responsibilities:

  • Performed deployment activities: set up new environments (created PowerCenter folders, server directories, and connections), imported Informatica code across environments, created new parameter files, and executed SQL scripts.
  • Involved in upgrading Informatica Power Center from version 9.1.0 to 9.6.1.
  • Involved in installing hot fixes and utilities released from Informatica Corporation.
  • Assigned licenses to the Informatica admin server in all environments and configured Informatica PowerCenter clients over RDP.
  • Worked on ETL code elevation from Dev to QA/Test/Acceptance.
  • Created workflows in Workflow Manager for tasks such as sending email notifications, timers triggered when an event occurs, and sessions to run mappings.
  • Generated and used SQL queries to fetch the required statistics from repository database in different environments.
  • Provided on-call production support for almost 15K jobs running across environments and handled failures in each of them.
  • Created SCD Type 1 and Type 2 mappings by using transformations like source qualifier, expression, filter, lookups, router, update strategy, sequence generator, etc. as per the business logic.
  • Created mappings that extracted source flat files (transferred via SFTP using UNIX scripts) and loaded them into various relational databases.
  • Provided scheduling for jobs using Informatica Scheduler tool as per business requirements.
  • Extensively used workflow variables, mapping parameters and mapping variables.
  • Created sessions, batches for incremental load into staging tables and scheduled them to run daily.
  • Worked on object versioning and its features. Used pre-session and post-session commands and created tasks in Workflow Manager to send emails to various business owners.
  • Created shell scripts to kick off Informatica workflow/s and PL/SQL procedures.
  • Documented common issues and resolution procedures.
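The SCD Type 2 mappings above preserve history: on a change, the current dimension row is expired and a new current version is inserted. A minimal Python sketch of that logic, with hypothetical column names (`cust_key`, `city`, `start_date`, `end_date`, `is_current`):

```python
# Sketch of SCD Type 2 (history-preserving) logic. Column names and the
# date handling are illustrative assumptions, not a specific project's schema.

def scd_type2_apply(target, incoming, key="cust_key", attrs=("city",),
                    load_date="2024-01-01"):
    """Expire the current row on change and insert a new current version."""
    result = [dict(r) for r in target]
    current = {r[key]: r for r in result if r["is_current"]}
    for row in incoming:
        cur = current.get(row[key])
        if cur and all(cur[a] == row[a] for a in attrs):
            continue                      # no attribute changed; do nothing
        if cur:
            cur["is_current"] = False     # expire the old version
            cur["end_date"] = load_date
        result.append({key: row[key], **{a: row[a] for a in attrs},
                       "start_date": load_date, "end_date": None,
                       "is_current": True})
    return result
```

In PowerCenter this maps to a Lookup on the current row, an Expression comparing tracked attributes, and an Update Strategy splitting rows into expire-updates and version-inserts.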

Environment: Informatica PowerCenter, Oracle, Workflow Manager, Informatica Power Connect / Power Exchange, SQL, PL/SQL, Agile and Windows.

Confidential, Boca Raton, FL

ETL Developer

Responsibilities:

  • Analyzed business requirements and worked closely with various application and business teams to develop ETL procedures consistent across all applications and systems.
  • Wrote Informatica ETL design documents, established ETL coding standards, and performed Informatica mapping reviews.
  • Analyzed the source data coming from different sources (Oracle, DB2, XML, Flat files) and worked on developing ETL mappings.
  • Developed complex Informatica Mappings, reusable Mapplets and Transformations for different types of tests in research studies on daily and monthly basis.
  • Implemented mapping-level optimization along the best route possible without compromising business requirements.
  • Created sessions, reusable worklets, and workflows in Workflow Manager, and scheduled workflows and sessions at specified frequencies.
  • Worked on fixing invalid Mappings, testing of Stored Procedures and Functions, and Integration Testing of Informatica Sessions.
  • Responsible for the Performance tuning at the Source Level, Target Level, Mapping Level and Session Level.
  • Worked extensively on SQL, PL/SQL, and UNIX shell scripting.
  • Performed Data profiling for data quality purposes.
  • Maintained accountability through professional documentation and weekly status reports.
  • Performed Quantitative and Qualitative Data Testing.
  • Documented flowcharts for the ETL (Extract, Transform, and Load) data flow using Microsoft Visio, and created metadata documents for the reports and mappings developed, along with unit test scenario documentation for them.
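The data profiling done for data quality purposes above typically starts with a per-column summary of nulls and distinct values. A minimal sketch, with dicts standing in for source rows and illustrative column names:

```python
# Sketch of simple column-level data profiling: null and distinct counts.
# Input shape (list of dicts) and column names are illustrative assumptions.

def profile(rows):
    """Return, per column, the number of nulls and of distinct non-null values."""
    columns = rows[0].keys() if rows else []
    report = {}
    for col in columns:
        values = [r.get(col) for r in rows]
        report[col] = {
            "nulls": sum(v is None for v in values),
            "distinct": len({v for v in values if v is not None}),
        }
    return report
```

Columns with unexpectedly high null counts or too few distinct values are the usual starting point for data quality follow-up with the business.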

Environment: Informatica PowerCenter, Oracle, Informatica Power Connect / Power Exchange, UNIX Scripting.

Confidential

ETL Developer

Responsibilities:

  • Worked across the development life cycle, including design, ETL strategy, troubleshooting, and reporting; identified facts and dimensions.
  • Extensively used Informatica PowerMart and created mappings using transformations such as connected Lookup, flagging records with Update Strategy to populate the desired slowly changing dimension tables.
  • Involved in the development of Data Mart and created mappings using Informatica.
  • Utilized the best practices for the creation of mappings and used transformations like Filter, Expression, Sequence generator and Lookup.
  • Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions and data/index cache to manage very large volume of data.
  • Involved in production support, working on various mitigation tickets created while users were retrieving data from the database.
  • Created Stored Procedures to transform the Data and worked extensively in PL/SQL for various needs of the transformations while loading the data.
  • Tested data and data integrity among various sources and targets.
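Testing data integrity among sources and targets, as above, commonly reduces to reconciling per-table row counts after each load. A minimal sketch of that check (table names and counts are hypothetical):

```python
# Sketch of source-to-target row-count reconciliation after an ETL load.
# Table names are hypothetical; counts would come from SELECT COUNT(*) queries.

def reconcile_counts(source_counts, target_counts):
    """Return the tables whose target row count differs from the source."""
    mismatches = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table, 0)
        if src != tgt:
            mismatches[table] = {"source": src, "target": tgt}
    return mismatches
```

The shell scripts for daily count validation mentioned earlier in this resume would implement the same comparison, with the counts gathered via SQL and the mismatch report emailed to the support team.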

Environment: Informatica, Business Objects, UNIX, SQL, PL/SQL and MS VSS.
