Senior Snowflake Developer Resume

Johnston, IA

SUMMARY

  • 9+ years of hands-on experience building productionized data ingestion and processing pipelines using Java, Spark, Scala, etc., as well as experience designing and implementing production-grade data warehousing solutions on large-scale data technologies.
  • Strong experience in migrating other databases to Snowflake.
  • Work with domain experts, engineers, and other data scientists to develop, implement, and improve upon existing systems.
  • Experience in analyzing data using HiveQL.
  • Participate in design meetings for creation of the Data Model and provide guidance on best data architecture practices
  • Experience with Snowflake Multi-Cluster Warehouses.
  • Experience with the Splunk reporting system.
  • Understanding of Snowflake cloud technology.
  • Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems, including loading nested JSON-formatted data into Snowflake tables (see the first sketch after this list).
  • Professional knowledge of AWS Redshift.
  • Experience in building Snowpipe (a Snowpipe sketch follows this list).
  • Experience in using Snowflake Clone and Time Travel, also sketched below.
  • Experience in various data ingestion patterns to Hadoop.
  • Participates in the development, improvement, and maintenance of Snowflake database applications.
  • Experience in various methodologies like Waterfall and Agile.
  • Extensive experience in developing complex stored procedures and BTEQ queries.
  • In-depth understanding of Data Warehouse/ODS, ETL concept and modeling structure principles
  • Build the logical and physical data model for Snowflake as per the changes required.
  • Define the roles and privileges required to access different database objects (see the grants sketch after this list).
  • In-depth knowledge of Snowflake database, schema, and table structures.
  • Define virtual warehouse sizing for Snowflake for different types of workloads, also covered in the sketch below.
  • Worked with cloud architect to set up the environment
  • Coding for stored procedures/triggers.
  • Designs batch cycle procedures on major projects using scripting and Control.
  • Develop SQL queries using SnowSQL.
  • Develop transformation logic using Snowpipe.
  • Optimize and fine-tune queries.
  • Performance tuning of Big Data workloads.
  • Good knowledge of ETL concepts and hands-on ETL experience.
  • Operationalize data ingestion, data transformation and data visualization for enterprise use.
  • Mentor and train junior team members and ensure coding standard is followed across the project.
  • Help talent acquisition team in hiring quality engineers.
  • Experience in real time streaming frameworks like Apache Storm.
  • Worked on Cloudera and Hortonworks distribution.
  • Progressive experience in the field of big data technologies and software development, including design, integration, and maintenance.
  • Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modeling techniques using Python/Java.
  • Built ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake's SnowSQL; wrote SQL queries against Snowflake.
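
For illustration, a minimal Snowflake SQL sketch of staging and loading nested JSON from S3, as referenced in the list above; the bucket, stage, table, and format names are hypothetical, and the AWS credentials are placeholders:

    -- Hypothetical file format and external stage over an S3 bucket
    CREATE OR REPLACE FILE FORMAT json_fmt TYPE = 'JSON' STRIP_OUTER_ARRAY = TRUE;

    CREATE OR REPLACE STAGE raw_stage
      URL = 's3://my-bucket/customers/'
      CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...')
      FILE_FORMAT = json_fmt;

    -- Land the nested JSON as VARIANT, then flatten on read
    CREATE OR REPLACE TABLE customer_json (v VARIANT);
    COPY INTO customer_json FROM @raw_stage;

    SELECT v:name::STRING       AS name,
           f.value:city::STRING AS city
    FROM customer_json,
         LATERAL FLATTEN(input => v:addresses) f;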
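
A companion sketch of Snowpipe, Time Travel, and zero-copy cloning against the same hypothetical table; AUTO_INGEST assumes S3 event notifications are already configured for the stage:

    -- Continuous loading via Snowpipe
    CREATE OR REPLACE PIPE customer_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO customer_json FROM @raw_stage;

    -- Time Travel: query the table as it was one hour ago
    SELECT COUNT(*) FROM customer_json AT (OFFSET => -3600);

    -- Zero-copy clone for development or testing
    CREATE TABLE customer_json_dev CLONE customer_json;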
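
And a sketch of virtual warehouse sizing plus role and privilege definitions, as referenced above; the warehouse, role, database, and schema names are hypothetical, and the multi-cluster settings assume an edition that supports them:

    CREATE WAREHOUSE IF NOT EXISTS etl_wh
      WAREHOUSE_SIZE    = 'MEDIUM'
      AUTO_SUSPEND      = 300      -- suspend after 5 idle minutes
      AUTO_RESUME       = TRUE
      MIN_CLUSTER_COUNT = 1
      MAX_CLUSTER_COUNT = 3;       -- multi-cluster auto-scale

    CREATE ROLE IF NOT EXISTS analyst_role;
    GRANT USAGE  ON WAREHOUSE etl_wh        TO ROLE analyst_role;
    GRANT USAGE  ON DATABASE analytics      TO ROLE analyst_role;
    GRANT USAGE  ON SCHEMA analytics.public TO ROLE analyst_role;
    GRANT SELECT ON ALL TABLES IN SCHEMA analytics.public TO ROLE analyst_role;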

TECHNICAL SKILLS

ETL Tools: Talend MDM 7.1/6.x/5.x, Informatica 7x/8x, SSIS, Lyftron

Big Data Technologies: Hadoop ecosystem, Spark, HDFS, MapReduce, Hive, Pig, Sqoop, NoSQL

Reporting Tools: Business Objects XI R2, Cognos 8.x/7.x, MicroStrategy, and MS Access reports

Database: Oracle 9i/10g/11g, SQL Server 2008/2012, DB2, Teradata, Netezza, AWS Redshift, Snowflake

Language: C, Java, SQL, PL/SQL, UNIX

Modeling Tools: Visio, Erwin 7.1.2

Cloud Technologies: Lyftron, AWS, Snowflake, Redshift

Spark/Hive: LLAP, Beeline, HDFS, MapReduce, Pig, Sqoop, HBase, Oozie, Flume

Programming Languages: Scala, Python, Perl, Shell scripting.

Hadoop Distributions: Cloudera, Hortonworks

PROFESSIONAL EXPERIENCE

Confidential, Johnston, IA

SENIOR SNOWFLAKE DEVELOPER

Responsibilities:

  • Developed Logical and Physical data models that capture current state/future state data elements and data flows using Erwin 4.5.
  • Responsible for designing and building data marts as per the requirements.
  • Extensively worked on views, stored procedures, triggers, and SQL queries for loading data into staging to enhance and maintain existing functionality.
  • Analyzed the source, requirements, and existing OLTP system, and identified the required dimensions and facts from the database.
  • Created Data acquisition and Interface System Design Document.
  • Designed the dimensional model of the data warehouse; confirmed source data layouts and needs.
  • Extensively used Oracle ETL process for address data cleansing.
  • Developed and tuned all the Affiliations received from data sources using Oracle and Informatica and tested with high volume of data.
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Oracle and Informatica PowerCenter.
  • Created common reusable objects for the ETL team and oversaw coding standards.
  • Reviewed high-level design specification, ETL coding and mapping standards.
  • Designed new database tables to meet business information needs. Designed Mapping document, which is a guideline to ETL Coding.
  • Used ETL to extract files for the external vendors and coordinated that effort.
  • Migrated mappings from Development to Testing and from Testing to Production.
  • Performed Unit Testing and tuned for better performance.
  • Created various documents, such as the Source-to-Target Data Mapping document and the Unit Test Cases document.

Confidential, New York City, NY

Sr. Snowflake Developer

Responsibilities:

  • Defines project scope and objectives.
  • Develops detailed work plans, schedules, project estimates, resource plans, and status reports; conducts project meetings and is responsible for project tracking and analysis.
  • Experience with end-to-end implementation of the Snowflake cloud data warehouse.
  • Created Snowpipe for continuous data loads.
  • Created internal and external stages and transformed data during load.
  • Worked with both Maximized and Auto-scale functionalities.
  • Used temporary and transient tables on different datasets (see the sketch after this list).
  • Experience in working with AWS, Azure, and Google Cloud data services.
  • Expertise in Snowflake data modeling, ELT using Snowflake SQL, implementing stored procedures, and standard DWH + ETL concepts.
  • Wrote SQL queries against Snowflake and developed Unix and Python scripts to extract, load, and transform data; 3+ years of experience in data management (e.g., DW/BI) solutions.
  • Served as the Snowflake database administrator, leading data model design and database migration deployments for production releases to ensure our database objects and corresponding metadata were successfully implemented across the platform environments (Dev, Qual, and Prod) on AWS Cloud (Snowflake).
  • Performed day-to-day integration with the DB2, SQL Server, Oracle, and AWS Cloud database administrator (DBA) teams to ensure database tables, columns, and their metadata were successfully implemented in the DEV, QUAL, and PROD environments in AWS Cloud (Aurora and Snowflake).
  • Good working knowledge of any ETL tool (Informatica or SSIS).
  • Ability to function effectively in a cross-team environment.
  • Worked on the Snowflake cloud data warehouse and integrated an automated, generic Python framework to process XML, CSV, JSON, TSV, and TXT files.
  • Manages the integration of vendor tasks and tracks and reviews vendor deliverables.
  • Works with the project team to analyze and resolve problems.
  • Reports project status to stakeholders on a regular basis.
  • Good understanding of the Agile process and Agile project execution.
  • Monitor data quality, identify data quality issues, oversee remediation plans, implementation of data controls, and manage data quality remediation strategies. Define data quality strategy and participate in a data quality working group.
  • Oversee and ensure that new systems implemented at the enterprise level follow data quality guidelines.
  • Proficiency in data modeling
  • Very good understanding of data management concepts such as 3NF and dimensional modeling, and their specific applications.
  • Very good knowledge of RDBMS topics; able to write complex SQL and PL/SQL.
  • Design, develop, test, implement, and support data warehousing ETL using Talend.
  • Develop stored procedures/views in Snowflake and use in Talend for loading Dimensions and Facts.
  • Implemented Change Data Capture technology in Talend in order to load deltas to a Data Warehouse.
  • Wrote ETL jobs to read from web APIs using REST and HTTP calls and loaded the data into HDFS using Java and Talend.
  • Created Talend Mappings to populate the data into dimensions and fact tables.
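
As referenced above, a minimal Snowflake SQL sketch of temporary versus transient tables; the source database and table names are hypothetical:

    -- Temporary table: session-scoped, dropped automatically at session end
    CREATE TEMPORARY TABLE stg_orders_tmp AS
      SELECT * FROM raw_db.public.orders
      WHERE load_date = CURRENT_DATE();

    -- Transient table: persists across sessions but carries no Fail-safe period,
    -- lowering storage cost for staging data that can be re-loaded
    CREATE TRANSIENT TABLE stg_orders LIKE raw_db.public.orders;
    INSERT INTO stg_orders SELECT * FROM stg_orders_tmp;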

Confidential, New York, NY

Snowflake Developer

Responsibilities:

  • Implement data transformations and data structures for data warehouse and lake/repository.
  • Manage cloud and/or on-premises solutions for data transfer and storage.
  • Establish data structures for all enterprise data stored in business intelligence systems.
  • Performed ETL data translation of functional requirements into Source-to-Target Data Mapping documents using Informatica, to support large datasets (big data) in the AWS Cloud databases, Snowflake and Aurora.
  • Performed logical and physical data structure designs and DDL generation to facilitate the implementation of database tables and columns out to DB2 and SQL Server.
  • Establish interfaces between the data warehouse and reporting tools, such as PowerBI.
  • Assist data analysts with connecting reporting and analytics software to data warehouses, lakes, and other data sources.
  • Manage access and permissions to data.
  • Expertise with SnowSQL and advanced concepts (query performance tuning, Time Travel, etc.) and features/tools (data sharing, events, Snowpipe, etc.); data sharing is sketched after this list.
  • Evaluate Snowflake design considerations for any change in the application.
  • Build the logical and physical data model for Snowflake as per the changes required.
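
As referenced above, a minimal sketch of Snowflake secure data sharing; the share, database, table, and consumer account names are hypothetical:

    CREATE SHARE sales_share;
    GRANT USAGE  ON DATABASE analytics                 TO SHARE sales_share;
    GRANT USAGE  ON SCHEMA analytics.public            TO SHARE sales_share;
    GRANT SELECT ON TABLE analytics.public.daily_sales TO SHARE sales_share;

    -- Make the share visible to a consumer account
    ALTER SHARE sales_share ADD ACCOUNTS = partner_account;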

Confidential

SQL Developer

Responsibilities:

  • Generated SQL and PL/SQL scripts to install, create, and drop database objects, including tables, views, primary keys, indexes, constraints, packages, sequences, grants, and synonyms.
  • Database design and development, including tables, primary and foreign keys, indexes, and stored procedures.
  • Involved in performance tuning using TKPROF.
  • Developed Shell scripts for job automation and daily backup.
  • Developed highly optimized stored procedures, functions, and database views to implement the business logic, and created clustered and non-clustered indexes (see the sketch after this list).
  • Used various SSIS tasks such as Conditional Split, Multicast, Fuzzy Lookup etc., which did Data Scrubbing, including data validation checks during Staging, before loading the data into the Data warehouse.
  • Created SSIS package to load data from Flat Files, Excel and XML Files to Data warehouse and Report-Data mart using Lookup, Derived Columns, Sort, Aggregate, Pivot Transformation, and Slowly Changing Dimension.
  • Designed, deployed, and maintained complex canned reports using SQL Server 2008 Reporting Services (SSRS).
  • Created and maintained data flow diagrams for existing and future ETL processes.
  • Involved in Data Extraction, Transformation and Loading (ETL) between Homogenous and Heterogeneous systems using SQL tools (SSIS, Bulk Insert).
  • Designed and deployed SSIS packages.
  • Created SSIS packages and configured them for different environments using DTS configurations.
  • Extracted, transformed, and loaded data into a normalized SQL Server 2008 database from various sources.
  • Involved in error handling of SSIS packages by evaluating error logs.
  • Extensively used high-level backend PL/SQL programming involving different databases.
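
To illustrate the indexing and stored-procedure work referenced above, a minimal T-SQL sketch; the dbo.Orders table and procedure name are hypothetical:

    -- Hypothetical orders table
    CREATE TABLE dbo.Orders (
      OrderID     INT IDENTITY(1,1) NOT NULL,
      CustomerID  INT NOT NULL,
      OrderDate   DATE NOT NULL,
      TotalAmount DECIMAL(12,2) NOT NULL
    );
    GO

    -- Clustered index on the key; nonclustered covering index for customer lookups
    CREATE CLUSTERED INDEX IX_Orders_OrderID ON dbo.Orders (OrderID);
    CREATE NONCLUSTERED INDEX IX_Orders_Cust_Date
      ON dbo.Orders (CustomerID, OrderDate) INCLUDE (TotalAmount);
    GO

    CREATE PROCEDURE dbo.usp_GetCustomerOrders
      @CustomerID INT
    AS
    BEGIN
      SET NOCOUNT ON;
      SELECT OrderID, OrderDate, TotalAmount
      FROM dbo.Orders
      WHERE CustomerID = @CustomerID
      ORDER BY OrderDate DESC;
    END;
    GO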

Confidential

SQL Developer

Responsibilities:

  • Involved in the requirement gathering, project plan, effort estimations leading to successful product delivery.
  • Created and maintained databases for server inventory and performance inventory.
  • Worked with SQL, T-SQL, and VBA.
  • Involved in creating tables, stored procedures, and indexes.
  • Created and ran jobs with packages.
  • Designed, developed, and deployed packages with WMI queries.
  • Imported data from various sources such as Excel, SQL Server, and FrontBase.
  • Created linked servers to other databases such as FrontBase and imported data.
  • Ensured data consistency and analyzed the data.
  • Generated dashboard reports for internal users using SQL Server 2005 Reporting Services.
  • Backed up the database.
  • Deployed various reports to SQL Server 2005 Reporting Server.
  • Installed and configured SQL Server 2005 on virtual machines.
  • Migrated hundreds of physical machines to virtual machines.
  • Conducted system and functionality testing after virtualization.
  • Extensively involved in new systems development with Oracle 6i.
  • Used SQLCODE to return the current error code from the error stack and SQLERRM to return the error message for the current error code (see the PL/SQL sketch after this list).
  • Used the Import/Export utilities of Oracle.
  • Read data from flat files and loaded it into the database using SQL*Loader.
  • Created external tables to load data from flat files, plus PL/SQL scripts for monitoring; external tables are also sketched after this list.
  • Wrote tuned SQL queries for data retrieval involving complex join conditions.
  • Utilized SQL*Loader to perform bulk data loads into database tables from external data files.
  • Used UNIX for automatic job scheduling. Involved in unit testing of newly created PL/SQL blocks of code.
  • Working experience with Kimball Methodology and Data Vault Modeling.
  • Define virtual warehouse sizing for Snowflake for different types of workloads.
  • Experience in Tableau Server administration tasks including Tableau server optimization and performance tuning.
  • Extensively used Autosys for scheduling the UNIX jobs.
  • Major challenges of the system were integrating and accessing many systems spread across South America, creating a process to involve third-party vendors and suppliers, and creating authorization for various department users with different roles.
  • Experienced in building Talend jobs outside of Talend Studio as well as on the TAC server.
  • CSM certified; worked in a Scaled Agile (SAFe) environment as System/DBT QA, with hands-on experience in Rally and JIRA.
  • Involved in the design and development of the GLS application, developed in C/C++ on HP UNIX.
  • Set up full CI/CD pipelines so that each developer commit goes through the standard software lifecycle process and is tested thoroughly before it can reach production.
  • Validated the Hadoop data load process using HiveQL queries.
  • Evaluate Snowflake design considerations for any change in the application.
  • Build the logical and physical data model for Snowflake as per the changes required.
  • Define the roles and privileges required to access different database objects.
  • Design and code required database structures and components.
  • Developed PySpark code for AWS Glue jobs and for EMR.
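
As referenced above, a minimal PL/SQL sketch of SQLCODE/SQLERRM error handling; the employees table is hypothetical:

    DECLARE
      v_code NUMBER;
      v_errm VARCHAR2(512);
    BEGIN
      UPDATE employees SET salary = salary * 1.05;
    EXCEPTION
      WHEN OTHERS THEN
        v_code := SQLCODE;                  -- current error code from the error stack
        v_errm := SUBSTR(SQLERRM, 1, 512);  -- message for the current error code
        DBMS_OUTPUT.PUT_LINE('Error ' || v_code || ': ' || v_errm);
        ROLLBACK;
    END;
    /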
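
And a minimal sketch of an Oracle external table for loading flat files, as referenced above; the directory object, file name, and table names are hypothetical:

    CREATE TABLE ext_customers (
      customer_id NUMBER,
      name        VARCHAR2(100),
      city        VARCHAR2(50)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir           -- a DIRECTORY object pointing at the flat files
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('customers.csv')
    )
    REJECT LIMIT UNLIMITED;

    -- Queried like a regular table; loads into a target with plain SQL
    INSERT INTO customers SELECT * FROM ext_customers;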
