
Sr. Informatica and Snowflake Developer Resume


SUMMARY

  • Over 12 years of IT experience spanning analysis, design, development, and maintenance, including 11 years of data warehousing experience with the Informatica ETL (Extraction, Transformation and Loading) tools PowerCenter, PowerMart, and PowerExchange.
  • Strong experience working with Informatica ETL (10.4/10.1/9.6/8.6/7.1.3), including PowerCenter Designer, Workflow Manager, Workflow Monitor, Informatica Server, and Repository Manager.
  • Solid experience in dimensional data modeling, star schema/snowflake modeling, fact and dimension tables, physical and logical data modeling, ERWIN 3.x, Oracle Designer, and Data Integrator.
  • Extensive experience with shell scripting in the UNIX environment; used UNIX scripts and scheduled pmcmd commands to interact with the Informatica server.
  • Experience in change implementation, monitoring, and troubleshooting of AWS Snowflake databases and cluster-related issues.
  • Good experience in unit testing, system integration testing, and user acceptance testing.
  • In-depth knowledge of loading Oracle data into the Salesforce cloud using the Informatica SFDC plugin.
  • Experience uploading data into AWS S3 buckets using the Informatica Amazon S3 plugin.
  • Hands-on experience in tuning mappings and identifying and resolving performance bottlenecks at various levels: sources, targets, mappings, and sessions.
  • Good knowledge of the Agile and Waterfall methodologies in the Software Development Life Cycle.
  • Facilitated meetings and project demos at the end of each sprint.
  • Actively contributed to agile ceremonies: retrospectives, IPMs, grooming, and estimation sessions.
  • Expertise in developing SQL and PL/SQL code through procedures, functions, packages, cursors, and triggers to implement database business logic.
  • Experience using Python for data transformation activities.
  • Real-time experience loading data into the AWS cloud (S3 buckets) through Informatica.
  • Experience using AWS services such as EC2, S3, DMS, Lambda, CloudFormation, and DynamoDB.
  • Created AWS data pipelines using Python, PySpark, EMR, and Step Functions.
  • Created Python code for Lambda functions to perform the necessary logic and derive values (a minimal sketch follows this list).
  • Exposure to creating and querying NoSQL tables in DynamoDB.
  • Designed and developed efficient reconciliation and error-handling methods and implemented them throughout the mappings.
  • 3 years of experience as an ETL Snowflake developer.
  • Experience loading data into Snowflake DB in the cloud from various sources.
  • Set up the Snowflake account, created users and roles, and granted users privileges on database objects.
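
A minimal sketch of the Lambda-to-DynamoDB pattern referenced above. The table name, attributes, and derivation logic are hypothetical placeholders, not project code:

    import json
    from decimal import Decimal

    import boto3

    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("orders")  # hypothetical table name

    def lambda_handler(event, context):
        # Derive a value from the incoming record (illustrative logic only).
        record = json.loads(event["body"])
        total = sum(item["qty"] * item["price"] for item in record["items"])
        # boto3 requires Decimal, not float, for DynamoDB number attributes.
        table.put_item(Item={
            "order_id": record["order_id"],  # hypothetical partition key
            "total": Decimal(str(total)),
        })
        return {"statusCode": 200, "body": json.dumps({"total": total})}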

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 10.4/10.1/9.6/8.6/7.1.3, MuleSoft, Informatica PowerExchange, Informatica Data Quality (IDQ), SFDC Data Loader

AWS Services: EC2, Lambda, DynamoDB, SNS, SQS, CloudWatch, CloudTrail, S3, CodeDeploy, CodePipeline, CodeCommit, EMR, PySpark

Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling.

Business Intelligence: Business Objects XIR2, OBIEE, Tableau, SAS

RDBMS: Oracle 19c/12c/11g/8i, SQL, SQL*Plus, MS SQL Server 2008/2005, DB2 UDB 7.1, Teradata, Netezza, MySQL, MS Access, DynamoDB

Programming Skills: UNIX Shell Scripting, SQL, Transact-SQL, C, JavaScript, Python 3.6, Visual Basic 6.0/5.0, XML, XSD, XBRL, Java.

Cloud Applications: AWS, Salesforce, Snowflake

PROFESSIONAL EXPERIENCE

Confidential

Sr. Informatica and Snowflake Developer

Responsibilities:

  • Created an enterprise data warehouse project (OVT) to provide standardized data definitions and values and to report customer and transaction data as building blocks of the Confidential business.
  • Designed and developed ETL processes using the Informatica 10.4 tool to load data from a wide range of sources, such as Oracle, flat files, Salesforce, and the AWS cloud.
  • Based on the business logic, developed various mappings and mapplets to load data from multiple sources using transformations such as Source Qualifier, Filter, Expression, Lookup, Router, Update Strategy, Sorter, Normalizer, Aggregator, Joiner, HTTP, and XML transformations.
  • Developed and supported Cox integration projects (OVC/OVT/SFDC, ISSE), making sure data flows across multiple systems per business needs.
  • Created data pipelines using Python, PySpark, and EMR services on AWS (see the PySpark sketch at the end of this section).
  • Created Glue jobs to pull dimension table and view data from the OVT Oracle database.
  • Worked closely with the ET-AWS Analytics team on the implementation of the ISSE project, uploading transaction and revenue data into the Salesforce cloud and AWS S3 buckets.
  • Extracted and uploaded data into AWS S3 buckets using the Informatica AWS plugin.
  • Created Salesforce runtime reports per business requirements.
  • Responsible for migrating folders, mappings, and sessions from the development to the test environment, and created migration documents to move code between environments.
  • Created Snowpipe for continuous data loading.
  • Bulk loaded and unloaded data into and out of Snowflake tables using the COPY command (see the sketch after this list).
  • Created the DWH, databases, schemas, and tables, and wrote SQL queries against Snowflake.
  • Validated data feeds from the source systems to the Snowflake DW cloud platform.
  • Integrated and automated data workloads to the Snowflake warehouse.
  • Ensured ETL/ELT jobs succeeded and loaded data successfully into the Snowflake DB.
  • Created Test cases for Unit Test, System Integration Test and UAT to check the data.
  • Used partitions in sessions to improve database load times.
  • Uploaded data from the Oracle Exadata database into the Salesforce cloud environment using the Informatica Salesforce plugin.
  • In-depth knowledge of using Salesforce Data Loader to import/export data into Salesforce.
  • Developed PL/SQL stored procedures, views, and triggers implementing complex business logic to extract, cleanse, transform, and load data into the Oracle database.
  • Followed agile methodology.
  • Created mappings in the designer to implement Type 2 SCD.
  • Provided daily status to the client on the assignments/user stories for the current sprint.
  • Created Informatica mappings and workflows to run Informatica MDM stage, Base Object, and Match/Merge jobs.
  • Automated and ran Informatica MDM jobs using PowerCenter.
  • Used Email, Control, Event Wait, and Command tasks in Informatica workflows.
  • Worked extensively on ETL performance tuning to tune data loads and worked with DBAs on SQL query tuning.
  • Extracted AMP report data from the Netezza database and loaded it into Salesforce as part of the Media integration development.
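
The Snowflake bulk and continuous loads above used the COPY command and Snowpipe; the sketch below shows the same pattern through the snowflake-connector-python driver. The warehouse, database, stage, table, and pipe names are hypothetical placeholders:

    import os

    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",  # hypothetical names throughout
        database="OVT_DB",
        schema="STAGING",
    )
    cur = conn.cursor()

    # Bulk load: COPY files from an external S3 stage into a table.
    cur.execute("""
        COPY INTO STAGING.TRANSACTIONS
        FROM @S3_TXN_STAGE
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'CONTINUE'
    """)

    # Continuous load: a pipe wraps the same COPY so Snowpipe can
    # auto-ingest new files as they land in the stage.
    cur.execute("""
        CREATE PIPE IF NOT EXISTS TXN_PIPE AUTO_INGEST = TRUE AS
        COPY INTO STAGING.TRANSACTIONS
        FROM @S3_TXN_STAGE
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    cur.close()
    conn.close()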

Environment: Informatica 10.4, PowerExchange/CDC, Oracle Exadata, Salesforce, flat files, PL/SQL, SQL Developer, AS400 (DB2), Netezza, Python 3.6, PySpark, EMR, S3, DynamoDB, EC2, SNS, Lambda, CloudWatch, Step Functions, Snowflake, Snowpipe.
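
An illustrative PySpark job of the kind that ran as an EMR step in the pipelines above; the bucket paths and column names are hypothetical:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("ovt-transactions").getOrCreate()

    # Read raw transaction extracts from S3, aggregate revenue per customer,
    # and write the result back to S3 as Parquet (paths are placeholders).
    txns = spark.read.option("header", True).csv("s3://example-raw-bucket/transactions/")
    revenue = (
        txns.withColumn("amount", F.col("amount").cast("double"))
            .groupBy("customer_id")
            .agg(F.sum("amount").alias("total_revenue"))
    )
    revenue.write.mode("overwrite").parquet("s3://example-curated-bucket/revenue/")
    spark.stop()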

Confidential, Atlanta, GA

Sr. ETL / Informatica Developer

Responsibilities:

  • Designed and developed ETL processes using the Informatica 9.1.1 tool to load data from a wide range of sources, such as Oracle, web services, flat files, and Netezza.
  • Based on the business logic, developed various mappings and mapplets to load data from multiple sources using transformations such as Source Qualifier, Filter, Expression, Lookup, Router, Update Strategy, Sorter, Normalizer, Aggregator, Joiner, HTTP, and XML transformations.
  • Responsible for migrating folders, mappings, and sessions from the development to the test environment, and created migration documents to move code between environments.
  • Created Test cases for Unit Test, System Integration Test and UAT to check the data.
  • Created technical specification documents, such as system design and detail design documents, for the development of Informatica Extraction, Transformation and Loading (ETL) mappings to load data into various tables.
  • Created UNIX shell scripts to pull data from vendors and drop it into the Informatica environment via an FTP process.
  • Worked extensively on ETL performance tuning to tune data loads and worked with DBAs on SQL query tuning.
  • Developed PL/SQL stored procedures, views, and triggers implementing complex business logic to extract, cleanse, transform, and load data into the Oracle database.
  • Followed agile methodology.
  • Provided daily status to the client on assignment progress.
  • Created an ETL swap process to load the same set of tables to avoid ETL load failure issues.
  • Documented guidelines for the support team.
  • Uncompressed source zip files by calling UNIX scripts through Informatica command tasks.
  • Pulled and processed web services data using Informatica, loading XML responses into Oracle staging tables with the Informatica HTTP and XML Parser transformations (the equivalent fetch-and-parse pattern is sketched after this list).
  • Used Email, Control, Event Wait, and Command tasks in Informatica workflows.
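
The project did this with Informatica HTTP and XML Parser transformations; the standalone Python sketch below shows only the equivalent fetch-and-parse pattern. The endpoint URL and XML element names are hypothetical:

    import xml.etree.ElementTree as ET

    import requests

    # Hypothetical endpoint; the real services sat behind vendor URLs.
    resp = requests.get("https://example.com/api/orders", timeout=30)
    resp.raise_for_status()

    root = ET.fromstring(resp.text)
    rows = []
    for order in root.iter("order"):  # hypothetical element name
        rows.append({
            "id": order.findtext("id"),
            "status": order.findtext("status"),
        })
    # rows would then be bulk-inserted into the Oracle staging tables.
    print(rows)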

Environment: Informatica 9.1.1 (PowerCenter, Designer, Workflow Manager, Monitor), AS400 (DB2), flat files, PL/SQL, SQL*Loader, TOAD, SQL Developer, Netezza 7.0.2, Aginity Workbench, UNIX Sun Solaris, Informatica Scheduler, web services

Confidential, Atlanta, GA

Sr. Informatica (ETL) Consultant

Responsibilities:

  • Designed and developed ETL processes using the Informatica tool to load data from a wide range of sources, such as SQL Server 2005, AS400 (DB2), flat files, and Oracle.
  • Actively participated in understanding business requirements, analysis, and the design process with the business analyst.
  • Based on the business logic, developed various mappings and mapplets to load data from multiple sources using transformations such as Source Qualifier, Filter, Expression, Lookup, Router, Update Strategy, Sorter, Normalizer, Aggregator, Joiner, and SQL transformations.
  • Used connected and unconnected lookups with static and dynamic caches to implement business logic and improve performance.
  • Created Test cases for Unit Test, System Integration Test and UAT to check the data.
  • Worked extensively on ETL performance tuning to tune data loads and worked with DBAs on SQL query tuning.
  • Used the Designer debugger along with session logs to monitor data flow and fix related bugs.
  • Worked with mapping parameters and variables to load data from different sources into the corresponding partitions of the database table.
  • Created mappings in the designer to implement Type 2 SCD.
  • Used partitions in sessions to improve database load times.
  • Registered and extracted Oracle CDC tables using the PowerExchange Navigator.
  • Imported PowerExchange registration tables to implement CDC in Informatica.
  • Created sessions and used pre- and post-session properties to execute scripts and handle errors.
  • Ran Informatica MDM batch group jobs and analyzed rejected records.
  • Created UNIX scripts to schedule Informatica workflows through the pmcmd command (see the sketch after this list).
  • Worked extensively in Workflow Manager and Workflow Monitor to create, schedule, and monitor workflows, worklets, sessions, and tasks.
  • Used Email, Control, Link, and Command tasks in Informatica workflows.
  • Automated Informatica MDM jobs to load data into staging, Base Objects, and the Match/Merge process.
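
The scheduling scripts invoked pmcmd from UNIX; a minimal Python wrapper showing the same startworkflow call is sketched below. The service, domain, user, folder, and workflow names are hypothetical placeholders:

    import subprocess

    # pmcmd startworkflow, as invoked from the scheduling scripts; every
    # name here is a placeholder, not the actual environment.
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "INT_SVC_DEV",       # integration service
        "-d", "Domain_Dev",         # domain
        "-u", "etl_user", "-p", "etl_password",
        "-f", "CDC_FOLDER",         # repository folder
        "-wait",                    # block until the workflow finishes
        "wf_load_base_objects",     # workflow name
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"pmcmd failed: {result.stderr}")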

Environment: Informatica 9.1.1 (PowerCenter, Designer, Workflow Manager, Repository Manager, Monitor), PowerExchange, Oracle 11g, AS400 (DB2), flat files, PL/SQL, SQL*Loader, TOAD, SQL Developer, UNIX Sun Solaris, Informatica MDM, Informatica Scheduler
