Senior Snowflake Developer Resume
Houston, TX
SUMMARY
- Eleven years of professional IT experience in data warehousing (ETL/ELT) technologies, including requirements gathering, data analysis, design, development, system integration testing, deployment, and documentation.
- Over four years of designing and implementing fully operational, production-grade, large-scale data solutions on the Snowflake Data Warehouse.
- Solid experience in architecting, designing, and operationalizing large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse.
- Experience with Snowflake multi-cluster warehouses.
- Experience with Snowflake virtual warehouses.
- Experience migrating data from on-premises databases to the Snowflake data warehouse.
- In-depth knowledge of Data Sharing in Snowflake.
- Developed ETL pipelines into and out of the data warehouse using a combination of Python and Snowflake's SnowSQL (see the sketch after this summary).
- Excellent understanding of Snowflake internals and of integrating Snowflake with other data processing and reporting technologies.
- Experience building data warehouses and data marts: OLTP vs. OLAP, star vs. snowflake schema, and normalization vs. de-normalization methods.
- Supported various reporting teams; experienced with the data visualization tool Tableau.
- Very good at SQL, data analysis, unit testing, and debugging data quality issues.
- Azure knowledge: familiar with various Azure PaaS offerings, e.g. ADLS, Azure Blob Storage, Batch Service, Azure Data Factory V1, Key Vault, etc.
- Worked with data scientists to understand requirements and to propose and identify data sources and alternatives.
- Understand data classification and adhere to information protection, data governance, and privacy restrictions on data.
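A minimal sketch of the kind of Python/SnowSQL load referenced above; the warehouse, schema, file path, and table names are illustrative placeholders, not actual project objects. A Python wrapper would typically invoke a script like this via the SnowSQL CLI (e.g. snowsql -f load_customers.sql):

    -- load_customers.sql (run with the SnowSQL CLI)
    USE WAREHOUSE ETL_WH;
    USE SCHEMA STAGING_DB.PUBLIC;

    -- PUT is a client-side command: upload the extracted file to the table's internal stage
    PUT file:///tmp/customers.csv @%CUSTOMERS_STG AUTO_COMPRESS = TRUE;

    -- Bulk load the staged file into the staging table
    COPY INTO CUSTOMERS_STG
      FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1)
      ON_ERROR = 'ABORT_STATEMENT';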
TECHNICAL SKILLS
Languages: Shell Scripting (K-Shell, C-Shell), PL/SQL, Control-M Scheduling, Autosys, JavaScript, HTML, Python
Databases: Teradata, Oracle, PostgreSQL, MongoDB, Azure Cosmos DB, SQL Server, MySQL
Agile Methodologies/DevOps: Jenkins, GitHub, Jira, CI/CD using PowerShell, Confluence, Stash
Big Data/Hadoop/Cloud: HDFS, Hive, Sqoop, Pig, Kafka, MapReduce, Azure Data Services, Snowflake design and development
Web Services: REST API, SOAP
ETL/Snowflake: Talend Data Integration, Informatica suite (PowerCenter, IDQ, DIH, MDM, BDM), web services, SOAP
Cloud Computing: Azure Data Factory, Azure Stream Analytics, Azure Databricks, Azure Data Lake Storage
Reporting Tools: Power BI, SAS, and Tableau
PROFESSIONAL EXPERIENCE
Confidential, Houston, TX
Senior Snowflake Developer
Responsibilities:
- Gathered business requirements to determine feasibility and converted them into technical tasks in the design document.
- Ingested data from 13 APIs into Snowflake.
- Worked on Python scripts for the data ingestion process into staging.
- Worked on flattening JSON documents into relational tables (see the sketch after this list).
- Created Tasks to schedule the jobs.
- Involved in decommissioning the Hadoop database and ingesting its data into Snowflake.
- Created Snowflake warehouses (1X/2X-Large), databases, Snowpipes, and other utilities.
- Knowledge of multi-cluster warehouses, auto-scaling, and auto-suspend.
- Created SQL worksheets to stage and transform data.
- Developed scripts using PL/SQL and stored procedures.
- Involved in building a scalable, distributed data lake system for Confidential's real-time and batch analytical needs.
- Set up SQS notifications for new data files written by Kinesis to get data in near real time, and triggered a Lambda script to automate appending data to the target through Snowpipe.
- Performed application tuning by setting the right batch interval, the correct level of parallelism, and memory settings.
- Cloned databases, schemas, tables, etc.
- Created, altered, and updated tables, views, file formats, data, etc.
- Worked with different file formats such as CSV, JSON, and Parquet.
- Worked using multiple roles (ACCOUNTADMIN, SYSADMIN, etc.).
- Egressed data to vendors for business application reporting.
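A minimal sketch of the JSON flattening and Task scheduling described in this list; the table, column, and warehouse names are hypothetical:

    -- Hourly Task that flattens raw JSON (VARIANT column RAW) into a relational table
    CREATE OR REPLACE TASK REFRESH_API_ORDERS
      WAREHOUSE = ETL_WH
      SCHEDULE = '60 MINUTE'
    AS
      CREATE OR REPLACE TABLE API_ORDERS AS
      SELECT
          raw:order_id::NUMBER      AS order_id,
          raw:customer.name::STRING AS customer_name,
          item.value:sku::STRING    AS sku,
          item.value:qty::NUMBER    AS qty
      FROM RAW_API_JSON,
           LATERAL FLATTEN(INPUT => raw:items) item;

    -- Tasks are created suspended; resume to start the schedule
    ALTER TASK REFRESH_API_ORDERS RESUME;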
Environment: Snowflake, S3, AWS Redshift, Alteryx, Power BI
Confidential, Washington, DC
Snowflake Consultant
Responsibilities:
- Migrated Informatica ETLs from the on-premises platform to an Azure-based cloud platform.
- Designed the data flow from existing on-premises sources to Snowflake.
- Lift and shift: converted SQL code from Oracle to Snowflake-compatible syntax.
- Configured SnowSQL in Python for the audit mechanism.
- Worked on Python-based auto-ETL to load historical and incremental data from sources such as Hyperion and Salesforce to S3, and then from S3 to Snowflake.
- Migrated Tableau extracts to Snowflake reporting tables and made them compatible.
- Consolidated general ledger tables in Snowflake to improve performance and refresh times.
- Implemented SnowSQL configuration with vault integration to avoid password prompts.
- Applied SnowSQL hints to increase query performance.
- Used COPY to load data from external storage into Snowflake and MERGE statements to apply it to targets (see the sketch after this list).
- Worked on complex Informatica ETL migrations to Snowflake.
- Worked on Snowflake materialized views to publish data for consumption in live and extract modes.
- Used the FLATTEN table function to produce lateral views of VARIANT, OBJECT, and ARRAY columns.
- Migrated historical and incremental data from Oracle to Snowflake.
- Used Airflow to ingest data by creating and debugging DAGs.
- Migrated from Informatica on-premises to Informatica Cloud using PowerExchange.
- Heavily used Agile process management tools (e.g. GUS).
- Familiarity with middleware technologies and test automation (Selenium, SoapUI, etc.).
- Analyzed and managed third-party and other external data sets.
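A minimal sketch of the external-storage COPY plus MERGE pattern mentioned above; the stage, table, and column names are hypothetical:

    -- Land incremental files from the external stage into a staging table
    COPY INTO GL_STAGE
      FROM @EXT_AZURE_STAGE/gl/
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

    -- Apply the delta to the consolidated general ledger target
    MERGE INTO GL_TARGET t
    USING GL_STAGE s
      ON t.gl_account_id = s.gl_account_id
    WHEN MATCHED THEN
      UPDATE SET t.balance = s.balance, t.updated_at = CURRENT_TIMESTAMP()
    WHEN NOT MATCHED THEN
      INSERT (gl_account_id, balance, updated_at)
      VALUES (s.gl_account_id, s.balance, CURRENT_TIMESTAMP());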
Environment: Snowflake, Azure, Python, Informatica, Tableau, Airflow, Tidal.
Confidential, San Francisco, CA
Sr Data Engineer
Responsibilities:
- Gathered business requirements to determine feasibility and converted them into technical tasks in the design document.
- Created Snowflake warehouses (1X/2X-Large), databases, Snowpipes, and other utilities.
- Worked on multi-cluster warehouses, auto-scaling, and auto-suspend.
- Created SQL worksheets to stage and transform data.
- Cloned databases, schemas, tables, etc.
- Created, altered, and updated tables, views, file formats, data, etc.
- Worked with different file formats such as CSV, JSON, and Parquet.
- Worked using multiple roles (ACCOUNTADMIN, SYSADMIN, etc.).
- Worked on SnowSQL and Snowpipe.
- Converted Talend Joblets to support Snowflake functionality.
- Created Snowpipes for continuous data loading.
- Developed stored procedures and views in Snowflake and used them in Talend for loading dimensions and facts.
- Used COPY to bulk load data.
- Created data sharing between two Snowflake accounts (see the sketch after this list).
- Created internal and external stages and transformed data during load.
- Redesigned views in Snowflake to increase performance.
- Used Talend big data components for Hadoop, S3 buckets, and AWS services for Redshift.
- Validated data from SQL Server against Snowflake to ensure an accurate match.
- Involved in requirements gathering to conduct the POC on Snowflake.
- Built solutions once and for all, with no band-aid approaches.
- Implemented Change Data Capture technology in Talend to load deltas into the data warehouse.
- Designed, developed, tested, implemented, and supported data warehousing ETL using Talend.
- Created rich graphic visualizations and dashboards to enable a fast read on claims and key business drivers and to direct attention to key areas.
- Created a series of signboard and dashboard reports for a new process being built in Horizon.
- Mastered designing and deploying rich graphic visualizations with drill-down and drop-down menu options and parameterization using Tableau.
- Created complex data views manually using multiple measures; also used sort, filter, group, and create-set functionality.
- Published reports, workbooks, and data sources to the server and exported reports in different formats.
- Used URLs, hyperlinks, and filters to develop complex dashboards.
- Worked on creating aggregations, calculated fields, table calculations, totals, and percentages using key performance indicators (KPIs) and measures.
- Validated reports before publishing them to the server.
- Administered users, user groups, and scheduled instances for reports in Tableau.
- Developed custom SQL scripts to view required claims data in the dashboards.
- Optimized SQL scripts to improve efficiency on the database.
- Involved in performance tuning of Tableau dashboards.
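A minimal sketch of data sharing between two Snowflake accounts, as referenced above; the database, schema, table, and account identifiers are hypothetical:

    -- Provider account: create a share and grant read access to the reporting objects
    CREATE SHARE IF NOT EXISTS SALES_SHARE;
    GRANT USAGE ON DATABASE RPT_DB TO SHARE SALES_SHARE;
    GRANT USAGE ON SCHEMA RPT_DB.MARTS TO SHARE SALES_SHARE;
    GRANT SELECT ON TABLE RPT_DB.MARTS.DAILY_SALES TO SHARE SALES_SHARE;
    ALTER SHARE SALES_SHARE ADD ACCOUNTS = CONSUMER_ACCT;

    -- Consumer account: mount the share as a read-only database
    CREATE DATABASE SALES_FROM_PROVIDER FROM SHARE PROVIDER_ACCT.SALES_SHARE;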
Environment: Snowflake, Alteryx, Hive, Impala, Informatica IDQ, Tableau
Confidential, Durham, NC
Sr Data Engineer
Responsibilities:
- Served as SME and technical liaison between IT developers and business partners.
- Drafted support procedures to decrease turnaround on client-reported tickets and issues.
- Guided the team in integrating data from different source systems and loading it through the Axway server using Informatica.
- Coordinated setting up SFTP with SSH keys between servers to exchange source and target files.
- Loaded ODS data from source to stage and target.
- Reviewed and tested Oracle stored procedure code with plsqlut7 functionality.
- Expertise in managing a robust environment across Windows and UNIX.
- Scheduled and designed Control-M jobs and performed batch monitoring.
- Responsible for helping the development team understand the technical approach to executing the requirements, and for guiding and mentoring team members.
- Identified risks and issues and took the necessary steps to address them up front.
- Owned preparation of estimates down to the level of data elements and code, coordinating with the client to explain and justify the estimates while complying with the required deal review process and meeting the required standards.
- Responsible for understanding the key systems and environments to get a strong grasp of the purpose, objectives, and process flows of each system and how it plays its role in the project.
- Reviewed, developed, and implemented strategies that preserve data integrity and accessibility as in the current data source to meet requirements.
- Helped prepare the project plan, monitored progress on a daily basis, and reported progress to delivery management and the client on a regular basis.
Environment: Informatica PowerCenter 10.x, Toad 9.1, SQL Developer, UNIX, Oracle 12c, Windows, Stash, Jenkins, GitHub, Control-M, Hadoop.
Confidential, Reston, VA
DWH-BI Consultant
Responsibilities:
- Responsible for the requirement gathering and design.
- Worked as a MicroStrategy Architect responsible for schema design and development using MicroStrategy Desktop.
- Built mappings, mapplets, and workflows using Informatica 8.6.1.
- Created data marts for reporting in Teradata.
- Used Repository Manager to export and import XML from development folders to the central repository and on to other environments.
- Responsible for building new reports and dashboards, report troubleshooting, enhancements, and tuning MicroStrategy Intelligence Server for better performance.
- Created and designed documents with different controls such as HTML containers, grid/graphs, and panel stacks.
- Used different formatting and grouping techniques to support end-user requirements on documents.
- Designed dynamic dashboards with more interactivity using components such as selectors and widgets.
- Created documents with links on attributes, metrics, hierarchies, and object prompts on a grid/graph to open another document.
- Handled prompt answers in target documents in different ways, such as using existing prompt answers from the source or using the objects selected in the source.
Environment: MicroStrategy 9.0.1, Informatica 8.6.1, Teradata v13, Power Shell, IBM Data Architect
Confidential
Sr. Informatica Developer
Responsibilities:
- Reviewed ETL development, worked closely with the team to drive quality of implementation, and ensured unit testing was completed and quality audits were performed on the ETL work.
- Designed the ETL specification documents to gather workflow information from offshore and shared them with the integration and production maintenance teams.
- Created the ETL technical specification for the effort based on business requirements; the source system was mainly Oracle master data.
- Worked closely with business analysts and Shared Services technical leads in defining technical specifications and designs for a large Oracle-based data warehouse environment.
- Developed detailed ETL implementation designs based on the technical specification for the BI effort, within the ETL design standards and guidelines.
- Scheduled Control-M jobs, and designed and monitored jobs in Planning.
- Created unit testing scripts and representative unit testing data based on business requirements.
- Ensured testing data is available for unit and acceptance testing within development and QA environments.
- Unit tested ETL code/scripts to ensure correctness of functionality and compliance with business requirements.
- Refined ETL code/scripts as needed to enable migration of new code from development to QA and QA to production environments following the standard migration signoff procedure.
- Worked with the team Manager and Business Analysts of the team to understand prioritization of assigned trouble tickets and to understand business user needs/requirements driving the support requests.
- Performed problem assessment, resolution, and documentation, as well as performance and quality validation on new and existing BI environments.
Environment: Informatica PowerCenter 9.x, Toad 9.1, SQL Developer, UNIX, Oracle 11g, SQL Server, Erwin data model, Windows.