Snowflake Cloud & ETL Lead Resume

Hillsboro, OR

SUMMARY

  • 12 years of IT experience with expertise in the analysis, design, development, and implementation of Data Mart / Data Warehousing applications using Informatica PowerCenter 9.6/9.1/8.6 (Designer, Workflow Manager, Workflow Monitor), Informatica PowerExchange, and IDQ.
  • Experience in cloud data migration using AWS, Git, Jenkins, Airflow, and Snowflake.
  • Strong working experience with various clients across the Retail & Supply Chain and Telecom domains.
  • Experience in all phases of the Software Development Life Cycle (SDLC): analysis, design, coding, testing, deployment, and production support.
  • Experience in both process design and process development.
  • Extensive experience in end-to-end implementation of data warehouses and a strong understanding of data warehouse concepts and methodologies.
  • Functioned as Sr. Informatica & Cloud lead on the ETL development & support team; led both onsite and offshore teams, including allocating development and support tasks offshore to reduce cost.
  • Proven abilities in process improvements and best practices, database environment control and change management.
  • Experienced in using SQL for data analysis and data profiling. Well versed in writing SQL queries and performing end-to-end validations.
  • Involved in data profiling and analysis to identify data quality issues.
  • Working experience with the AS400 source system, analyzing data issues.
  • Experience in integrating data from various sources: Oracle, DB2, and Netezza.
  • Extensive experience with Teradata utilities (SQL, FastLoad, MultiLoad, FastExport, TPump, etc.).
  • Extensively used Informatica Client tools - Designer, Repository Manager, Workflow Manager and Workflow Monitor.
  • Experience in debugging and performance tuning of targets, sources, mappings, and sessions.
  • Excellent knowledge of identifying performance bottlenecks and tuning mappings and sessions using techniques such as partitioning.
  • Worked closely with business application teams, business analysts, data architects, database administrators, and BIS reporting teams to ensure the ETL solution meets business requirements. Made performance improvements to existing Informatica code and queries using explain plans. Well versed in database partitioning and in improving Informatica mapping performance through partitioning. Processed large volumes of data and large files.
  • Experience in understanding the source system data from legacy systems.
  • Experienced with the Informatica Data Quality (IDQ) 8.6.1/9.1 toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and its reporting and monitoring capabilities.
  • Experienced in generating and documenting metadata while designing OLTP and OLAP system environments.
  • Experience in UNIX shell scripting, FTP and file management in various UNIX environments.
  • Experience with job scheduling tools such as CA Workload Automation, Control-M, and IDR (Informatica Data Replicator).

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 8.6/9.1/9.6/10.2, IDQ, SSIS.

Cloud Tools: Snowflake Cloud Computing, AWS.

DevOps: GIT, Jenkins.

Programming Languages: C#.NET, VB.NET, PL/SQL, Python, UNIX Shell Scripting

Databases: Oracle, Teradata, Snowflake, SQL Server, DB2, Netezza.

Scheduling Tools: Airflow, Coach, CA Workload Automation, Control-M, Autosys, IDR (Informatica Data Replicator).

Development Tools: SQL*Plus, Toad, SAS, MS SQL Server, Aginity Workbench, System i Navigator, Microsoft Visual Studio.

Data Quality Tools: IDQ 8.6/9.1, MasterCraft Data Profile.

PROFESSIONAL EXPERIENCE

Confidential, Hillsboro, OR

Snowflake Cloud & ETL Lead

Responsibilities:

  • Involved in designing the ETL Strategy and Architecture of the Project.
  • Modifying existing Informatica mappings to load into the Snowflake DB.
  • Recreating existing Teradata objects in Snowflake.
  • Converting Informatica mapping logic into SnowSQL queries.
  • Developing the process design and process development artifacts for the project.
  • Replicating logic from Teradata utilities such as FastLoad and MultiLoad to load data into the target database.
  • Replicating the audit logic from Teradata to Snowflake.
  • Developing Python scripts to facilitate DB object creation in Snowflake (a sketch follows this list).
  • Involved in complete data migration testing from SAP BW to Snowflake on the AWS cloud using QuerySurge, Informatica, AWS (Big Data platform), the Airflow scheduler, Teradata SQL, and Python scripting.
  • Tracking and managing stories, tasks & defects using Version One.
  • Tuning Snowflake performance using various options.
  • Organizing daily technical discussions with the onsite team and the individual offshore workstream leads, and setting expectations for offshore delivery.
  • Providing UAT support and Deployment support.
  • Worked in a Scrum Agile process with two-week iterations, delivering new Snowflake objects and migrating data in each iteration.
  • Providing development and testing status to the client and other project stakeholders on a regular basis.
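
For illustration, a minimal Python sketch of the kind of script used to create Snowflake objects and replicate a FastLoad-style bulk load. All object names, credentials, and the stage path are hypothetical placeholders, not actual project values; it assumes the snowflake-connector-python package:

    # Sketch: recreate a Teradata-style table in Snowflake and bulk load it.
    # Every name below is a placeholder, not a real project object.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",   # hypothetical account identifier
        user="etl_user",
        password="...",         # in practice, read from a secrets store
        warehouse="ETL_WH",
        database="EDW",
        schema="STG",
    )

    ddl = """
    CREATE TABLE IF NOT EXISTS STG.SALES_FACT (
        SALE_ID   NUMBER(18,0),
        SALE_DATE DATE,
        AMOUNT    NUMBER(12,2)
    )
    """

    # COPY INTO plays the role Teradata FastLoad/MultiLoad played for bulk loads.
    copy_cmd = """
    COPY INTO STG.SALES_FACT
    FROM @STG.DAILY_STAGE/sales/
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """

    cur = conn.cursor()
    cur.execute(ddl)
    cur.execute(copy_cmd)
    cur.close()
    conn.close()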

Environment: Informatica PowerCenter 9.6.x, Teradata, Snowflake, Autosys, Version One, Git, AWS, Airflow, PL/SQL, Python, Jenkins, IntelliJ.

Confidential, El Segundo, CA

Informatica Sr. Designer

Responsibilities:

  • Meeting with business users to understand the requirements and identify the sources.
  • Gather requirements; analyze, design, code, and test highly efficient and highly scalable integration solutions using Informatica, Oracle, and SQL, with source systems including Oracle 11g, mainframe (DB2) systems, and flat files, plus UNIX shell scripting.
  • Responsible for Functional design, technical design, coding, testing, debugging, and documentation.
  • Was part of a data governance initiative directed toward arriving at a holistic view of the supply chain platform to facilitate global scalability.
  • Acted as an SME for a supply chain analytics project aimed at creating a central ODS to drive supply/demand planning across the firm.
  • Played a key role on the project as the bridge between the real data, business expectations, and the reporting results.
  • Act as onsite coordinator for the offshore team, handling Informatica mappings, UNIX shell scripts & issues.
  • Worked with the Data Profile quality toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and its reporting and monitoring capabilities.
  • Design and development of mappings, transformations, sessions, workflows, ETL batch jobs, and shell scripts to load data into the ODS and generate output extracts using Informatica, UNIX shell scripts, and the Informatica scheduler.
  • Data analysis on the AS400 source system across multiple regions for different types of data conversions.
  • Designing solutions for system enhancements/fixes raised through Change Requests and obtaining client concurrence on Functional Specifications & the approach for implementation.
  • Interacting with the client during the development stage and resolving post-development issues; also developing new functionality requested by the client.
  • Worked with Oracle, DB2, and CSV, Fixed Width Flat Files as Sources and Targets.
  • Documentation of the Source-to-Target matrix, mapping specifications, mapping inventory, unit test plans, database sizing, and extracts.
  • Configuring & scheduling jobs during maintenance and source system downtime in the Informatica scheduler to maintain the flow of data between upstream & downstream dependencies.
  • Worked as Release Coordinator during releases, which involved migrating code to the production servers, migrating new applications to production & implementing functional changes to existing ones.
  • Extensively used Router, Sorter, Aggregator, Expression, and Lookup transformations to implement complex logic and to generate the extracts.
  • Unit test processes to ensure proper functioning, and collaborate with other team members during system and integration testing phases to ensure proper functioning as part of the larger system.
  • Co-ordinate with all stakeholders until the completion of the SDLC cycle of every Change Request.
  • Created and loaded the necessary tables in the staging area from various sources.
  • Scheduled jobs using the Informatica scheduler to run the daily, weekly & monthly loads and their dependent workflows, estimating the appropriate run times.
  • Worked with session logs & workflow logs for troubleshooting.
  • Monitoring the scheduled job flow, analyzing predecessor and successor dependencies.
  • Worked with UNIX shell scripting to rename files, FTP files, and schedule and run jobs (an illustrative Python equivalent follows this list).
  • Established the production support model for the Control Tower extraction project and helped create the necessary run books for the support team.
  • Experience in analyzing data to identify testing scenarios; wrote queries to pull the data and was involved in testing along with Business Analysts and users.
  • Moved code from DEV to QA to PROD and performed complete Unit Testing, Integration Testing, Front-End Testing, and User Acceptance Testing across the three environments while moving the code.
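
The file handling above was done with UNIX shell scripts; purely as an illustrative equivalent, a short Python sketch of the rename-and-FTP step (the host, credentials, paths, and file names are hypothetical):

    # Sketch of the rename-and-FTP step; all names are placeholders.
    import os
    from datetime import date
    from ftplib import FTP

    src = "/data/extracts/orders.dat"
    dst = "/data/extracts/orders_{:%Y%m%d}.dat".format(date.today())
    os.rename(src, dst)  # timestamp the extract before transfer

    with FTP("ftp.example.com") as ftp:    # hypothetical host
        ftp.login("etl_user", "password")  # credentials would come from a vault
        ftp.cwd("/inbound")
        with open(dst, "rb") as fh:
            ftp.storbinary("STOR " + os.path.basename(dst), fh)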

Environment: Informatica 9.6, Oracle, DB2, Netezza, JDA, Shell Scripting.

Confidential, El Segundo, CA

Informatica Sr. Designer

Responsibilities:

  • Meeting with business users to understand the requirements and identify the sources.
  • Gather requirements; analyze, design, code, and test highly efficient and highly scalable integration solutions using Informatica, Oracle, and SQL, with source systems including Oracle 11g and mainframe (DB2) systems.
  • Responsible for Functional design, technical design, coding, testing, debugging, and documentation.
  • Act as onsite coordinator for the offshore team, handling Informatica mappings, the CA Workload Automation tool & issues.
  • Worked with the Data Profile quality toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and its reporting and monitoring capabilities.
  • Design and development of mappings, transformations, sessions, workflows, and ETL jobs to load data from source/s to stage & ODS using Informatica, PL/SQL, stored procedures, and CA Workload Automation scheduling.
  • Data analysis on the AS400 source system across multiple regions for different types of data conversions.
  • Designing solutions for system enhancements/fixes raised through Change Requests and obtaining client concurrence on Functional Specifications & the approach for implementation.
  • Interacting with the client during the development stage and resolving post-development issues; also developing new functionality requested by the client.
  • Worked with Oracle, DB2, and CSV, Fixed Width Flat Files as Sources and Targets.
  • Documentation of the Source-to-Target matrix, mapping specifications, mapping inventory, unit test plans, and database sizing.
  • Configuring & scheduling jobs during the maintenance window using CA Workload Automation to maintain the flow of data between upstream & downstream dependencies.
  • Worked as Release Coordinator during releases, which involved migrating code to the production servers, migrating new applications to production & implementing functional changes to existing ones.
  • Extensively used Router, Sorter, Aggregator, Expression, Lookup, Stored Procedure, and Update Strategy transformations to implement complex logic and to load fact tables.
  • Migration of Informatica workflows using the Informatica admin tool from Development to QA and Production.
  • Extensively used SQL overrides to pull data from multiple tables, and created tuned queries to avoid performance issues.
  • Worked with pre- and post-SQL commands to modify the source and target tables.
  • The Debugger was a vital tool in fixing issues during unit testing.
  • Unit test processes to ensure proper functioning, and collaborate with other team members during system and integration testing phases to ensure proper functioning as part of the larger system.
  • Worked closely with the QA team to develop test cases and to document the expected results with data from the databases.
  • Co-ordinate with all stakeholders until the completion of the SDLC cycle of every Change Request.
  • Created and loaded the necessary tables in the staging area from various databases.
  • Scheduled jobs using the Informatica scheduler & CA Workload Automation to run the daily, weekly & monthly loads and their dependent workflows, estimating the appropriate run times.
  • Worked with session logs & workflow logs for troubleshooting.
  • Monitoring the scheduled job flow, analyzing predecessor and successor dependencies.
  • Worked with UNIX shell scripting to rename files, FTP files, and schedule and run jobs.
  • Experience working with large amounts of data, data analysis, and data validation.
  • Worked on stored procedures, packages, and triggers in PL/SQL for deployment purposes (a brief sketch follows this list).
  • Experience in analyzing data to identify testing scenarios; wrote queries to pull the data and was involved in testing along with Business Analysts and users.
  • Moved code from DEV to QA to PROD and performed complete Unit Testing, Integration Testing, Front-End Testing, and User Acceptance Testing across the three environments while moving the code.
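
As a hedged illustration of the PL/SQL deployment work above, a minimal Python sketch that invokes a stored procedure; the package, procedure, parameters, and DSN are hypothetical, and it assumes the python-oracledb package (the actual work was done directly in PL/SQL):

    # Sketch only: calls a hypothetical PL/SQL procedure used during deployment.
    import oracledb

    conn = oracledb.connect(user="etl_user", password="...",
                            dsn="dbhost/ORCLPDB1")  # placeholder DSN
    cur = conn.cursor()

    # Equivalent to: BEGIN stg_pkg.load_stage_table(:tbl, :load_dt); END;
    cur.callproc("stg_pkg.load_stage_table", ["SALES_STG", "2015-01-31"])
    conn.commit()

    cur.close()
    conn.close()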

Environment: Informatica 9.1, Oracle, DB2, Netezza, Data Profile tool, Application Analyser, CA Workload Automation.

Confidential, Denver, CO

Data & Analytics ETL Lead

Responsibilities:

  • Meeting with business users to understand the requirements and identify the sources.
  • Played a key role on the project as the bridge between the real data, business expectations, and the reporting results.
  • Act as onsite coordinator for the offshore team, handling Informatica mappings, the Control-M tool, UNIX shell scripts & issues.
  • Requirements gathering and Analysis.
  • Worked with the Informatica Data Quality 8.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 8.6.1.
  • Informatica Data Quality (IDQ 8.6.1) was the tool used here for data quality measurement.
  • Designing solutions for system enhancements/fixes raised through Change Requests and obtaining client concurrence on Functional Specifications & the approach for implementation.
  • Interacting with the client during the development stage and resolving post-development issues; also developing new functionality requested by the client.
  • Worked with Oracle, CSV and Fixed Width Flat Files as Sources and Targets.
  • Configuring & scheduling jobs during the maintenance window using Control-M to maintain the flow of data between upstream & downstream dependencies.
  • Worked as Release Coordinator during releases, which involved migrating code to the production servers, migrating new applications to production & implementing functional changes to existing ones.
  • Setting up new jobs or modifying existing jobs in Control-M as and when required.
  • Attending Go/No-Go meetings and giving input from a support perspective on production migration of the application.
  • Worked as a coordinator for change coordination during maintenance periods and was responsible for approving change requests during those periods.
  • Proactively involved in mentoring new team members.
  • Co-ordinate with all stakeholders until the completion of the SDLC cycle of every Change Request.
  • Created and loaded the necessary tables in the staging area from various databases.
  • Scheduled jobs using the Informatica scheduler, Control-M & IDR to run the daily, weekly & monthly loads and their dependent workflows, estimating the appropriate run times.
  • Involved in meetings with source system developers to understand the data and its relationships, and to estimate the risks and extra work necessary on the BI end if the source structures change.
  • Worked with session logs & workflow logs for troubleshooting.
  • Used shell scripting to schedule the jobs.
  • Monitoring the scheduled job flow, analyzing predecessor and successor dependencies.
  • Monitoring the Informatica Data Replicator tool's scheduler execution flow.
  • Created and modified UNIX scripts to run the loads.
  • Worked with UNIX shell scripting to rename files, FTP files, and schedule and run jobs.
  • Experience working with large amounts of data, data analysis, and data validation.
  • Worked on stored procedures, packages, and triggers in PL/SQL for deployment purposes.
  • Played the AIP role: handling HD tickets, IR implementation, and outages, and engaging SWAT teams.
  • Monitored the PANS/SAS jobs running on the data warehouse and, in case of any errors, found and resolved them within the SLA.

Environment: Informatica 9.1, IDQ, Oracle, Control-M, IDR, SAS, Shell Scripting

Confidential

ETL Lead

Responsibilities:

  • Meeting with business users to understand the requirements and identify the sources.
  • Played a key role on the project as the bridge between the real data, business expectations, and the reporting results.
  • Act as onsite coordinator for the offshore team, handling Informatica mappings, the Control-M tool, UNIX shell scripts & issues.
  • Requirements gathering and Analysis.
  • Effort estimation & timelines.
  • Informatica Data Quality (IDQ 9.1) was the tool used here for data quality measurement.
  • Utilized Informatica IDQ 9.1 to complete initial data profiling and to match and remove duplicate data (a conceptual sketch follows this list).
  • Designing solutions for system enhancements/fixes raised through Change Requests and obtaining client concurrence on Functional Specifications & the approach for implementation.
  • Interacting with the client during the development stage and resolving post-development issues; also developing new functionality requested by the client.
  • Worked with Oracle, CSV and Fixed Width Flat Files as Sources and Targets.
  • Configuring & scheduling jobs during the maintenance window using Control-M to maintain the flow of data between upstream & downstream dependencies.
  • Worked as Release Coordinator during releases, which involved migrating code to the production servers, migrating new applications to production & implementing functional changes to existing ones.
  • Setting up new jobs or modifying existing jobs in Control-M as and when required.
  • Attending Go/No-Go meetings and giving input from a support perspective on production migration of the application.
  • Worked as a coordinator for change coordination during maintenance periods and was responsible for approving change requests during those periods.
  • Proactively involved in mentoring new team members.
  • Co-ordinate with all stakeholders until the completion of the SDLC cycle of every Change Request.
  • Worked with SQL Server as the source system and the target on which the mart resided.
  • Created and loaded the necessary tables in the staging area from various databases.
  • Scheduled jobs using the Informatica scheduler & Control-M to run the daily, weekly & monthly loads and their dependent workflows, estimating the appropriate run times.
  • Involved in meetings with source system developers to understand the data and its relationships, and to estimate the risks and extra work necessary on the BI end if the source structures change.
  • Worked with session logs & workflow logs for troubleshooting.
  • Used shell scripting to schedule the jobs.
  • Monitoring the scheduled job flow, analyzing predecessor and successor dependencies.
  • Created and modified UNIX scripts to run the loads.
  • Worked with UNIX shell scripting to rename files, FTP files, and schedule and run jobs.
  • Experience working with large amounts of data, data analysis, and data validation.
  • Worked on stored procedures, packages, and triggers in PL/SQL for deployment purposes.
  • Unit test processes to ensure proper functioning, and collaborate with other team members during system and integration testing phases to ensure proper functioning as part of the larger system.
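
The profiling and duplicate matching itself was done in IDQ; purely as a conceptual sketch of the idea in Python/pandas (the file and column names are hypothetical):

    # Conceptual sketch of profiling plus duplicate removal; IDQ did the real work.
    import pandas as pd

    df = pd.read_csv("customers.csv")  # hypothetical extract

    # Basic profiling: null counts and distinct counts per column.
    profile = pd.DataFrame({"nulls": df.isna().sum(),
                            "distinct": df.nunique()})
    print(profile)

    # Simple duplicate matching on a normalized name + postal code key.
    df["match_key"] = (df["name"].str.lower().str.strip()
                       + "|" + df["postal_code"].astype(str))
    deduped = df.drop_duplicates(subset="match_key", keep="first")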

Environment: Informatica 9.1, IDQ, Oracle, SQL Server, Control-M, Shell Scripting

Confidential

ETL Lead

Responsibilities:

  • Meeting with business users to understand the requirements and identify the sources.
  • Played a key role on the project as the bridge between the real data, business expectations, and the reporting results.
  • Act as onsite coordinator for the offshore team, handling SSIS packages, the Control-M tool & issues.
  • Effort estimation & timelines.
  • Designing solutions for system enhancements/fixes raised through Change Requests and obtaining client concurrence on Functional Specifications & the approach for implementation.
  • Interacting with the client during the development stage and resolving post-development issues; also developing new functionality requested by the client.
  • Worked with SQL Server, CSV, and Fixed Width Flat Files as Sources and Targets.
  • Configuring & scheduling jobs during the maintenance window using Control-M to maintain the flow of data between upstream & downstream dependencies.
  • Worked as Release Coordinator during releases, which involved migrating code to the production servers, migrating new applications to production & implementing functional changes to existing ones.
  • Setting up new jobs or modifying existing jobs in Control-M as and when required.
  • Attending Go/No-Go meetings and giving input from a support perspective on production migration of the application.
  • Worked as a coordinator for change coordination during maintenance periods and was responsible for approving change requests during those periods.
  • Proactively involved in mentoring new team members.
  • Co-ordinate with all stakeholders until the completion of the SDLC cycle of every Change Request.
  • Worked with SQL Server as the source system and the target on which the mart resided.
  • Created and loaded the necessary tables in the staging area from various databases.
  • Involved in meetings with source system developers to understand the data and its relationships, and to estimate the risks and extra work necessary on the BI end if the source structures change.
  • Monitoring the scheduled job flow, analyzing predecessor and successor dependencies.
  • Experience working with large amounts of data, data analysis, and data validation.
  • Played the AIP role: handling HD tickets, IR implementation, and outages, and engaging SWAT teams.

Environment: SSIS, SQL Server, Control-M.

Confidential

Responsibilities:

  • Configuring & scheduling jobs during the maintenance window using Control-M to maintain the flow of data between upstream & downstream dependencies.
  • Worked as Release Coordinator during releases, which involved migrating code to the production servers, migrating new applications to production & implementing functional changes to existing ones.
  • Setting up new jobs or modifying existing jobs in Control-M as and when required.
  • Attending Go/No-Go meetings and giving input from a support perspective on production migration of the application.
  • Created and maintained SSIS packages using SQL Server Business Intelligence Development Studio (BIDS).
  • Created jobs and scheduled SSIS packages.
  • Scheduled jobs using the Informatica scheduler, Control-M & IDR to run the daily, weekly & monthly loads and their dependent workflows, estimating the appropriate run times.
  • Played the AIP role: handling HD tickets, IR implementation, and outages, and engaging SWAT teams.

Environment: Informatica 9.1, Oracle, SSIS, SQL Server, Control-M, MAP Tool, Shell Scripting

Confidential

Senior Software Engineer

Responsibilities:

  • Involved in business requirement and design discussions with the clients and managers.
  • Requirement Analysis: Analyze the requirement document and check its feasibility.
  • Client Interaction: Interact with the client to understand the requirements and discuss the approach.
  • High-Level Design: Document the agreed-upon or proposed approach.
  • Low-Level Design: Document the impact analysis and component design and obtain sign-off on the same.
  • Development/Implementation: Develop the application/module as per the agreed-upon approach and incorporate the code changes.
  • Unit Testing and Defect Removal: Test the application using the test plans for correctness.
  • Experience in analyzing data to identify testing scenarios; wrote queries to pull the data and was involved in testing along with Business Analysts and users.

Environment: ASP.NET 3.5, C#.NET, SSIS, SSRS, Microsoft Visual Studio, SQL Server 2005, IIS

Confidential

Senior Software Engineer

Responsibilities:

  • Involved in the preparation of Design Specification documents.
  • Involved in Design and Coding in coordination with the development of the entire GUI.
  • Involved in Database designing, setting up database schemas, tables, indexes, views, and other database objects.
  • Developed database stored procedures to support the back end functionality.
  • Created and maintained SSIS packages using SQL Server Business Intelligence Development Studio (BIDS).
  • Created jobs and scheduled SSIS packages.
  • Responsible for coordinating team meetings, presentations, joint reviews and status reporting to the Client.
  • Involved in preparation of test cases, discussing the detailed functionality of the application related to the test plan and strategy with team members.
  • Prepared and executed unit test plans.

Environment: ASP.NET 3.5, C#.NET, SSIS, SSRS, Microsoft Visual Studio, SQL Server 2005, IIS

Confidential

Software Engineer

Responsibilities:

  • Involved in the preparation of Design Specification documents.
  • Involved in Design and Coding in coordination with the development of the entire GUI.
  • Involved in Database designing, setting up database schemas, tables, indexes, views, and other database objects.
  • Developed database stored procedures to support the back end functionality.
  • Responsible for coordinating team meetings, presentations, joint reviews and status reporting to the Client.
  • Involved in preparation of test cases, discussing the detailed functionality of the application related to the test plan and strategy with team members.
  • Prepared and executed unit test plans.

Environment: ASP.NET 3.5, VB.NET, Microsoft Visual Studio, SQL Server 2000, IIS

Confidential

Software Engineer

Responsibilities:

  • Involved in the preparation of Design Specification documents.
  • Involved in Design and Coding in coordination with the development of the entire GUI.
  • Involved in Database designing, setting up database schemas, tables, indexes, views, and other database objects.
  • Developed database stored procedures to support the back-end functionality.
  • Responsible for coordinating team meetings, presentations, joint reviews and status reporting to the Client.
  • Involved in preparation of test cases, discussing the detailed functionality of the application related to the test plan and strategy with team members.
  • Prepared and executed unit test plans.

Environment: ASP.NET 3.5, VB.NET, Microsoft Visual Studio, SQL Server 2000
