Snowflake Architect & Developer Resume
SUMMARY:
- Overall 12+ years of experience in ETL architecture, ETL development, data modelling, and database architecture with Talend Big Data, Lyftron, Informatica, Apache Spark, AWS, NoSQL, MongoDB, PostgreSQL, AWS Redshift, and Snowflake.
- Designed and developed a new ETL process to extract and load vendors from a legacy system to MDM using Talend jobs.
- Experience in a Snowflake shared-technology cloud data warehousing environment, providing stable infrastructure, architecture, best practices, a secured environment, reusable generic frameworks, robust design architecture, technology expertise, and automated SCBD (Secured Database Connections, Code Review, Build Process, Deployment Process) utilities.
- Developed Talend ETL jobs to push data into Talend MDM and jobs to extract data from MDM.
- Designed dimensional models, data lake architecture, and Data Vault 2.0 on Snowflake, and used the Snowflake logical data warehouse for compute.
- Well versed with Snowflake features such as clustering, Time Travel, cloning, logical data warehouses, and caching.
- Extensively used Talend Big Data components such as tRedshiftInput, tRedshiftOutput, tHDFSExist, tHiveCreateTable, tHiveRow, tHDFSInput, tHDFSOutput, tHiveLoad, tS3Put, and tS3Get.
- Extensive experience in migrating data from legacy platforms into the cloud with Lyftron, Talend, AWS and Snowflake.
- Experience in data modeling, data warehousing, and ETL design and development using the Ralph Kimball methodology with Star/Snowflake schema designs, covering analysis and definition, database design, testing, and implementation.
- Data analysis, database programming (stored procedures, triggers, views), table partitioning, and performance tuning.
- Strong knowledge of non-relational (NoSQL) databases, viz. document, column, key-value, and graph databases.
- Developed jobs in both Talend Platform for MDM with Big Data and Talend Data Fabric.
- Neo4j architecture, Cypher Query Language, graph data modelling, and indexing.
- Trained in the Anti-Money Laundering (AML) Actimize components: Analytics Intelligence Server (AIS), Risk Case Management (RCM), ERCM, and plug-in development.
- Very good experience in UNIX shell scripting.
- Strong experience in business analysis, data science, and data analysis.
- Strong knowledge of SDLC (viz. Waterfall, Agile, Scrum) and PMLC.
- Strong Knowledge of BFS Domain including Equities, Fixed Income, Derivatives, Alternative Investments, Benchmarking etc.
- Strong Accounting knowledge of Cash Flow, Income Statements, Balance Sheet and ratio analysis.
TECHNICAL SKILLS:
ETL Tools: Talend MDM 7.1/6.x/5.x, Informatica 7x/8x, SSIS, Lyftron
Big Data Technologies: Hadoop ecosystem, Spark, HDFS, Map Reduce, Hive, PIG, Sqoop, NOSQL
Reporting Tools: Business Objects XI R2, Cognos 8.x/7.x, MicroStrategy, and MS Access reports
Operating Systems: Windows NT/XP, UNIX, Sun Solaris 8/7.0, IBM AIX 4.3
Databases: Oracle 9i/10g/11g, SQL Server 2008/2012, DB2, Teradata, Netezza, AWS Redshift, Snowflake
Language: C, Java, SQL, PL/SQL, UNIX
Modelling Tools: Visio, Erwin 7.1.2
Cloud Technologies: Lyftron, AWS, Snowflake, Redshift
PROFESSIONAL EXPERIENCE:
Confidential
Snowflake Architect & Developer
Software Platform & Tools: Talend, MDM, AWS, Snowflake, Bigdata, MS SQL Server 2016, SSIS, C#, Python
Responsibilities:
- Creating new tables and audit process to load the new input files from CRD.
- Created complex mappings in Talend 7.1 using tMap, tJoin, tReplicate, tParallelize, tFixedFlowInput, tAggregateRow, tFilterRow, tIterateToFlow, tFlowToIterate, tDie, tWarn, tLogCatcher, tHiveInput, tHiveOutput, tMDMInput, tMDMOutput, etc.
- Working on Snowflake modeling; highly proficient in data warehousing techniques for data cleansing, the Slowly Changing Dimension phenomenon, surrogate key assignment, and change data capture.
- Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to the data marts, including data load.
- Designed the ETL process using Talend to load data from sources to targets through data transformations.
- Developed Talend Big Data jobs to load heavy volumes of data into the S3 data lake and then into the Snowflake data warehouse.
- Designed dimensional models, data lake architecture, and Data Vault 2.0 on Snowflake, and used the Snowflake logical data warehouse for compute.
- Well versed with Snowflake features such as clustering, Time Travel, cloning, logical data warehouses, and caching.
- Developed Talend MDM jobs to populate claims data into the data warehouse using star, snowflake, and hybrid schemas.
- Integrated Java code inside Talend Studio using components such as tJavaRow, tJava, tJavaFlex, and Routines.
- Worked on MDM modeling through the MDM perspective in the Talend 5.5.1 suite and developed jobs to push data to MDM.
- Developed complex ETL jobs from various sources such as SQL Server, PostgreSQL, and flat files, and loaded them into target databases using Talend.
- Database object design, including stored procedures, triggers, views, constraints, etc.
- Designing ETL jobs in SQL Server Integration Services 2015.
- Working with traders and business analysts to finalize requirements.
- Mapping of incoming CRD trade and security files to database tables.
- Building business logic in stored procedures to extract data in XML format to be fed to Murex systems.
- Creating conceptual, logical, and physical data models in Visio 2013.
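The Slowly Changing Dimension handling and change data capture mentioned in this role (detecting changed attributes, expiring the current record, and assigning a new surrogate key) can be sketched in plain Python. This is an illustrative sketch only; every function and field name below is hypothetical, not taken from any project code:

```python
from datetime import date

def apply_scd2(dim_rows, incoming, key, tracked, load_date):
    """Type 2 SCD: expire changed current rows and append new versions.

    dim_rows: list of dicts with 'sk' (surrogate key), the business key
    column, the tracked attribute columns, 'valid_from', and 'valid_to'
    (None marks the current version). incoming: list of source dicts.
    Returns the updated dimension table.
    """
    next_sk = max((r["sk"] for r in dim_rows), default=0) + 1
    current = {r[key]: r for r in dim_rows if r["valid_to"] is None}
    for src in incoming:
        cur = current.get(src[key])
        if cur is not None and all(cur[c] == src[c] for c in tracked):
            continue  # change data capture: attributes unchanged, skip
        if cur is not None:
            cur["valid_to"] = load_date  # expire the old version
        new_row = {"sk": next_sk, key: src[key],
                   "valid_from": load_date, "valid_to": None}
        new_row.update({c: src[c] for c in tracked})
        dim_rows.append(new_row)  # new version gets a fresh surrogate key
        next_sk += 1
    return dim_rows
```

In a real pipeline this comparison-and-expire step would run as a MERGE in the warehouse or inside a Talend tMap; the sketch just makes the Type 2 mechanics explicit.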
Confidential
Sr. ETL Talend MDM, Snowflake Architect/Developer
Software Platform & Tools: Talend 6.x, MDM, AWS, Snowflake, Big Data, Jasper, JRXML, Sybase 15.7, Sybase IQ 15.5
Responsibilities:
- Worked in an industry-standard Agile software development process: daily stand-ups, pre-iteration meetings, iteration planning, backlog refinement, demo calls, and retrospective calls.
- Developed Talend ETL jobs to push data into Talend MDM and jobs to extract data from MDM.
- Worked on the Snowflake shared technology environment, providing stable infrastructure, a secured environment, reusable generic frameworks, robust design architecture, technology expertise, best practices, and automated SCBD (Secured Database Connections, Code Review, Build Process, Deployment Process) utilities.
- Designed the ETL process using Talend to load data from sources to targets through data transformations.
- Worked on snowflake schemas and data warehousing.
- Developed Talend Big Data jobs to load heavy volumes of data into the S3 data lake and then into the Redshift data warehouse.
- Designed and developed the business rules and workflow system in Talend MDM.
- Migrated the data from Redshift data warehouse to Snowflake.
- Expertise in MDM, dimensional modelling, data architecture, data lakes, and data governance.
- Built dimensional models and Data Vault architecture on Snowflake.
- Developed Talend jobs to populate claims data into the data warehouse using star, snowflake, and hybrid schemas.
- Designed the high-level ETL/MDM/data lake architecture for overall data transfer from OLTP to OLAP using multiple ETL/MDM tools; prepared ETL mapping processes and maintained the mapping documents.
- Analysis of Test Track tickets and creation of JIRA stories.
- Support user and production queries.
- Performance tuning of slow-running queries and stored procedures in Sybase ASE.
- Replication testing and configuration for new tables in Sybase ASE.
- Stored procedure migration from ASE to Sybase IQ for performance enhancement.
- Maintenance and development of existing reports in Jasper.
- Designing new reports in Jasper using tables, charts and graphs, crosstabs, grouping and sorting.
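Populating a star schema, as in the claims warehouse work above, typically means swapping each fact row's natural keys for dimension surrogate keys during the load. A minimal illustrative sketch in Python follows; the function and column names are hypothetical, and real jobs would do this via lookup joins in Talend or SQL:

```python
def resolve_surrogate_keys(fact_rows, dim_index, unknown_sk=-1):
    """Replace natural keys in fact rows with dimension surrogate keys.

    dim_index: {dim_name: {natural_key: surrogate_key}}. Each fact row
    is expected to carry a natural-key column named after the dimension.
    Keys missing from a dimension (late-arriving members) map to
    unknown_sk so the fact load never drops rows silently.
    """
    out = []
    for row in fact_rows:
        resolved = dict(row)  # leave the input rows untouched
        for dim, index in dim_index.items():
            nk = resolved.pop(dim)
            resolved[dim + "_sk"] = index.get(nk, unknown_sk)
        out.append(resolved)
    return out
```

The unknown-member default (-1) is one common convention; another is inserting a placeholder dimension row first and re-keying when the member arrives.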
Confidential
Sr. Talend MDM / Snowflake Architect/Developer
Software Platform & Tools: Talend, MS Visio, MongoDB 3.2.1, ETL, Python, PyMongo, Python Bottle framework, JavaScript
Responsibilities:
- Analysing and documenting the existing CMDB database schema.
- Designed ETL process using Talend Tool to load from Sources to Targets through data Transformations.
- Created complex mappings in Talend 7.1 using tMap, tJoin, tReplicate, tParallelize, tFixedFlowInput, tAggregateRow, tFilterRow, tIterateToFlow, tFlowToIterate, tDie, tWarn, tLogCatcher, tHiveInput, tHiveOutput, tMDMInput, tMDMOutput, etc.
- Developed data validation rules in Talend MDM to confirm the golden record.
- Developed Talend Big Data jobs to load heavy volumes of data into the S3 data lake and then into the Redshift data warehouse.
- Developed Talend jobs to populate claims data into the data warehouse using star, snowflake, and hybrid schemas.
- Designing an application-driven architecture to establish the data models used in the MongoDB database.
- Maintain and support existing ETL/MDM jobs and resolve issues.
- Data modelling activities for document database and collection design using Visio.
- MongoDB installation and configuration of a three-node replica set, including one arbiter.
- Data extraction from the existing database into the desired format to be loaded into the MongoDB database.
- Assisting in web design to access the data via a web browser using Python, PyMongo, and the Bottle framework.
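The extraction step above (reshaping relational rows into the desired MongoDB format) usually means folding child rows into their parent as an embedded array, per the application-driven data model. A hedged sketch in plain Python, with hypothetical field names standing in for the real CMDB columns:

```python
def rows_to_documents(parents, children, fk):
    """Fold child rows into their parent row as an embedded 'items'
    array, producing MongoDB-style documents.

    parents: list of dicts with an '_id' column. children: list of
    dicts carrying a foreign-key column named by fk. The fk column is
    dropped from each embedded child, since nesting makes it redundant.
    """
    by_parent = {}
    for child in children:
        by_parent.setdefault(child[fk], []).append(
            {k: v for k, v in child.items() if k != fk})
    return [dict(p, items=by_parent.get(p["_id"], []))
            for p in parents]
```

The resulting documents can be handed straight to PyMongo's `insert_many`; the embed-vs-reference choice itself depends on how the application reads the data, which is the point of application-driven design.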
Confidential
Sr. Database Consultant
Software Platform & Tools: Sybase, Unix Shell scripting, ESP scheduler, Perl, SSIS, Microsoft SQL server 2014
Responsibilities:
- Stored procedure and database object (tables, views, triggers, etc.) development in Sybase 15.0 related to regulatory changes.
- Performance tuning of slow-running stored procedures and redesigning indexes and tables.
- Preparing data dictionary for the project, developing SSIS packages to load data in the risk database.
- UNIX shell scripting to automate manual work, viz. report validation and job re-runs.
- Enhance the system documentation.
- Resolving open issues and concerns as discussed and defined by BNYM management.
- Analysis, design, coding, unit/system testing, UAT support, implementation, and release management.
Confidential
Technology Lead
Software Platform & Tools: ETL, MFT, SQL Server 2012, MS Visio, Erwin
Responsibilities:
- Analysing the current data flow of the 8 Key Marketing Dashboards.
- Analysing the input data stream and mapping it with the desired output data stream.
- High level data design including the database size, data growth, data backup strategy, data security etc.
- Establishing the frequency of data, data granularity, and the data loading strategy (i.e. delta load vs. full load).
- Ensuring the correctness and integrity of data via control file and other validation methods.
- Data warehouse experience with star schemas, snowflake schemas, Slowly Changing Dimension (SCD) techniques, etc.
- Coordinating design and development activities for the project with various stakeholders, such as business users and DBAs.
- Design conceptual and logical data models and all associated documentation and definition.
- Produce and/or review the data mapping documents.
- Designing the database reporting for the next phase of the project.
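The control-file validation mentioned above (ensuring correctness and integrity of each feed before loading) commonly checks a row count and a hash total against the data file. A small illustrative sketch; the control-file fields and thresholds here are hypothetical, not the project's actual format:

```python
def validate_with_control_file(control, data_rows, amount_col="amount"):
    """Check a data feed against its control file before loading.

    control: {"row_count": int, "amount_total": float} as declared by
    the upstream system. Returns a list of discrepancy messages; an
    empty list means the feed is accepted for loading.
    """
    issues = []
    if len(data_rows) != control["row_count"]:
        issues.append("row count %d != declared %d"
                      % (len(data_rows), control["row_count"]))
    total = round(sum(r[amount_col] for r in data_rows), 2)
    if total != control["amount_total"]:
        issues.append("amount total %s != declared %s"
                      % (total, control["amount_total"]))
    return issues
```

A delta load would run this per increment, a full load once per extract; either way the feed is rejected (and re-requested) rather than partially loaded when any discrepancy is reported.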
Confidential
Data Architect
Software Platform & Tools: SQL Server 2007, SSRS, Perl, UNIX, ETL (Informatica), .NET (C#), Windows Services, Microsoft Visio
Responsibilities:
- Dataflow design for new feeds from Upstream.
- Data warehouse design and development.
- Using SQL Server Profiler to diagnose slow-running queries.
- Tuning slow-running stored procedures using effective indexes and logic.
- Writing stored procedures in SQL server to implement the business logic.
- ETL development using Informatica PowerCenter Designer.
- Scheduling the jobs using Control-M.
- Root cause analysis for any issues and Incidents in the application.
- Documenting guidelines for new table design and queries.
- Change Coordinator role for end-to-end delivery: taking change requirements from clients, providing initial timelines, analysing each change and its impact, passing the change to the respective module developer and following up to completion, tracking the change in the system, testing in UAT, deploying to production, and performing post-deployment checks and support.