Sr ETL/Snowflake Admin/Developer Resume
White Plains, NY
SUMMARY
- 11+ years of experience in full life cycle development involving analysis, design, development, deployment, testing, implementation, maintenance, and support of large data warehouse applications in web-based and client/server environments.
- Experience in building data warehouse and business intelligence systems, including Pentaho ETL and Cognos.
- Experience in Talend Data Integration.
- Hands-on experience with data warehouse star schema modeling, snowflake schema modeling, fact and dimension tables, and physical and logical data modeling.
- ETL and BI experience using Pentaho Studio.
- Developed mappings in the Pentaho Data Integration (PDI) ETL tool to load data using various steps, including Row Normaliser, Row Denormaliser, Database Lookup, Fuzzy Lookup, Database Join, Calculator, Add Sequence, Merge Join, and Insert/Update.
- Experience in integrating XML and JSON format files into the data warehouse.
- Experience in processing real-time data into the data warehouse database.
- Experience in loading high-volume data into destination tables using batch jobs.
- Upgraded the Pentaho ETL (PDI) tool from 3.2 to 7.0 and 8.1.
- Migrated reports from SSRS to Pentaho.
- In-depth understanding of Snowflake multi-cluster warehouse sizing and credit usage.
- Experience with Snowflake virtual warehouses, including maximized and multi-cluster warehouses.
- Experience in building Snowpipe pipelines for continuous data loading.
- In-depth knowledge of Data Sharing in Snowflake.
- In-depth knowledge of Snowflake database, schema, and table structures.
- Experience using Snowflake zero-copy cloning and Time Travel (illustrated in the sketch below).
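A minimal Snowflake SQL sketch of the zero-copy clone and Time Travel usage mentioned above (database, schema, and table names are hypothetical):

```sql
-- Zero-copy clone: creates a writable copy without duplicating storage
CREATE TABLE dev_db.public.orders_clone CLONE prod_db.public.orders;

-- Time Travel: query the table as it existed two hours ago
SELECT *
FROM prod_db.public.orders AT (OFFSET => -2*60*60);

-- Recover a table dropped within the retention period
UNDROP TABLE prod_db.public.orders;
```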
TECHNICAL SKILLS
Business Intelligence Tools: IBM Cognos Analytics 11, Framework Manager, Power BI, Pentaho Report Designer, SSRS.
ETL Tools: Pentaho (PDI), SSIS.
Data Warehouse: Snowflake
Databases: SQL Server, Oracle, DB2, Amazon Redshift
DB Tools: SQL*Plus, SQL*Loader
Web Servers: IIS, Tomcat, and Apache
Programming Skills: HTML, JavaScript, SQL, Python, Unix shell scripting
PROFESSIONAL EXPERIENCE
Confidential, White Plains, NY
Sr ETL/SnowFlake Admin/Developer
Responsibilities:
- Created data mappings for source data coming from APIs, table input, text files, and Excel input.
- Prepared interval data for the measurement table for legacy data.
- Generated canonical files using Gzip for newly added data and pushed them to C3 platforms for analysis.
- Used a range of steps in Pentaho transformations, including Database Lookup, Database Join, Calculator, Add Sequence, Add Constants, and various types of inputs and outputs.
- Analyzed data sources and identified and addressed issues that could impact the ETL process.
- Created users and roles in Snowflake and granted permissions based on roles (see the sketch after this list).
- Created virtual warehouses based on requirements (X-Small for ETL jobs and a Large warehouse for the reporting team).
- Involved in migrating objects from a SQL Server database to Snowflake.
- Created internal and external stages and transformed data during load.
- Used COPY INTO to bulk load data from an external stage (Amazon S3).
- Created Snowpipe for continuous data load.
- Used the FLATTEN table function to produce a lateral view of VARIANT, OBJECT, and ARRAY columns.
- Used temporary and transient tables on different datasets.
- Cloned Production data for code modifications and testing in lower environments (Dev/QA).
- Shared sample data with the customer for UAT by granting access.
- Set Time Travel retention to 60 days to recover lost data, per the given requirement.
- Performed code reviews of transformations delivered by the offshore team.
- Troubleshot issues with failed ETL processes in the production environment.
- Performed ETL deployments.
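A minimal Snowflake SQL sketch of the role/warehouse setup, external-stage bulk load, and Snowpipe described in the bullets above (all object names, bucket paths, and credentials are hypothetical placeholders):

```sql
-- Role and virtual warehouse setup, with grants driven by role
CREATE ROLE IF NOT EXISTS etl_role;
CREATE WAREHOUSE IF NOT EXISTS etl_wh
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND   = 300
  AUTO_RESUME    = TRUE;
GRANT USAGE ON WAREHOUSE etl_wh TO ROLE etl_role;
GRANT USAGE ON DATABASE analytics_db TO ROLE etl_role;
GRANT USAGE, CREATE TABLE ON SCHEMA analytics_db.staging TO ROLE etl_role;

-- External stage over Amazon S3 and a one-time bulk load with COPY INTO
CREATE STAGE IF NOT EXISTS analytics_db.staging.s3_stage
  URL = 's3://example-bucket/exports/'
  CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>')
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

COPY INTO analytics_db.staging.orders
  FROM @analytics_db.staging.s3_stage
  PATTERN = '.*orders.*[.]csv';

-- Snowpipe for continuous loading from the same stage
CREATE PIPE IF NOT EXISTS analytics_db.staging.orders_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO analytics_db.staging.orders
  FROM @analytics_db.staging.s3_stage;
```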
Environment: MS SQL Server 2016, SAP HANA, MongoDB, Windows 2016, Pentaho Kettle (PDI) 8.3, SVN, Snowflake Cloud Data Warehouse, Amazon S3.
Confidential, Charlotte, NC
Sr BI Developer/Snowflake Developer
Responsibilities:
- Created data flow mappings to extract data from source systems and load it into targets.
- Used Pentaho Data Integration/Kettle to design all ETL processes that extract data from various sources, including live systems and external files, cleanse it, and load it into the target data warehouse.
- Created transformations involving the following steps: Table Input, Table Output, Text File Output, CSV File Input, Insert/Update, Add Constants, Filter, Value Mapper, Stream Lookup, Join Rows, Merge Join, Sort Rows, Database Lookup, and Set Environment Variables.
- Created and saved Pentaho jobs in the enterprise repository and scheduled them to run in production on a daily/weekly basis.
- Used the Dimension Lookup/Update step to populate data into SCDs.
- Used the Pentaho Enterprise Repository to create folders, store transformations and jobs, move, lock, revise, delete, and restore artifacts.
- Scheduled meetings with Senior Data Leads and analyzed the data.
- Developed stored procedures to get data from various Facets and load it into temporary tables.
- Prepared the documents for all the modules developed.
- Used Pentaho Report Designer to create various reports with drill-down functionality by creating groups in the reports and drill-through functionality by creating sub-reports within the main reports.
- Created single-value and multi-value drop-down and list-type parameters with cascading prompts in the reports.
- Worked on migrating an on-premises SQL Server database to Snowflake.
- Created Snowpipe for continuous data load.
- Created streams and tasks for scheduling purposes (see the sketch after this list).
- Used the FLATTEN table function to produce a lateral view of VARIANT, OBJECT, and ARRAY columns.
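A minimal sketch of the streams/tasks scheduling and FLATTEN usage noted above (table, warehouse, and column names are hypothetical):

```sql
-- Stream to capture new rows landing in a raw table
CREATE OR REPLACE STREAM staging.orders_stream ON TABLE staging.orders_raw;

-- Task that runs every 15 minutes, but only when the stream has data
CREATE OR REPLACE TASK staging.load_orders_task
  WAREHOUSE = etl_wh
  SCHEDULE  = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('staging.orders_stream')
AS
  INSERT INTO dw.orders
  SELECT * FROM staging.orders_stream;

ALTER TASK staging.load_orders_task RESUME;

-- FLATTEN: lateral view over a VARIANT column that holds a JSON array
SELECT r.order_id,
       f.value:sku::STRING AS sku
FROM   staging.orders_raw r,
       LATERAL FLATTEN(INPUT => r.payload:line_items) f;
```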
Environment: MS SQL Server 2012, Windows Server 2012, Pentaho Kettle (PDI) 8.1, Snowflake, Pentaho Report Designer 7.1, Tidal, SVN, JIRA, Dart, Code Warden.
Confidential, Charlotte, NC
ETL Developer/Talend/Pentaho/Snowflake
Responsibilities:
- Interacted with the Business Analysts to understand the process flow and the business.
- Actively participated in gathering requirements for this BI project and in designing the physical and logical data warehouse models.
- Migrated Pentaho Jobs and Transformations into Talend Data Integration Jobs.
- Exported all data to Amazon S3 using Pentaho ETL.
- Uploaded CSV file data to a Snowflake internal stage and loaded it into tables (see the sketch after this list).
- Loaded data from Amazon S3 to Snowflake.
- Created data flow mappings to extract data from source systems and load it into targets.
- Loaded large volumes of data into warehouse tables using batch ETL processes.
- Used Pentaho Data Integration to design all ETL processes that extract data from various sources, including live systems and external files, cleanse it, and load it into the target data warehouse.
- Created transformations involving the following steps: Table Input, Table Output, Text File Output, CSV File Input, and Insert/Update.
- Created and saved Pentaho jobs in the enterprise repository and scheduled them to run in production on a weekly basis.
- Used the Dimension Lookup/Update step to populate data into SCDs.
- Experienced in performing Data Masking/Protection using Pentaho Data Integration (Kettle).
- Dealt with Slowly Changing Dimensions Type 1 and Type 2.
- Involved in production support to research and resolve daily load issues.
- Worked on ETL flow documentation.
- Performed ETL deployments.
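A minimal sketch of the internal-stage CSV load described above, using SnowSQL's PUT plus COPY INTO (the file path and table name are hypothetical):

```sql
-- Upload a local CSV to the table's internal stage (run from SnowSQL)
PUT file:///data/exports/customers.csv @%customers AUTO_COMPRESS = TRUE;

-- Load the staged file into the table
COPY INTO customers
  FROM @%customers
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
  ON_ERROR = 'CONTINUE';
```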
Environment: Pentaho (PDI ETL) 7.1, Talend DI v6.5.1, Snowflake, Amazon S3, SQL Server 2014, Oracle, TOAD, XML, SVN.
Confidential, Oshkosh, WI
ETL Developer/Pentaho
Responsibilities:
- Created user accounts in Pentaho Enterprise Console for end users/business analysts who viewed reports through the Pentaho User Console.
- Created mapping documents to define and document one-to-one mapping between source data attributes and entities in target database.
- Used Pentaho Data Integration to create all ETL transformations and jobs.
- Used different types of input and output steps for various data sources, including tables, Access, text files, Excel, and CSV files.
- Encrypted user information.
- Identified and analyzed data discrepancies and data quality issues and worked to ensure data consistency and integrity.
- Implemented Slowly Changing Dimension Type 1 and Type 2 in ETL jobs for certain dimensions (see the sketch after this list).
- Wrote UNIX shell scripts and PL/SQL scripts to automate routine daily jobs for production databases.
- Modified existing Oracle PL/SQL code of stored procedures, functions, and packages.
- Saved Pentaho jobs in the enterprise repository and scheduled them to run in production on a daily basis.
- Installed Cognos Analytics 11 on Linux machines.
- Configured the Cognos gateway with the Apache web server.
- Created user roles in Cognos.
- Performed Cognos admin activities such as daily backups, log checking/cleaning, and validating data source connections.
- Published Cognos packages from Framework Manager and deployed reports (Development to Test, Pre-production, and Production).
- Scheduled reports from the Cognos portal per requirements.
- Validated packages and all reports associated with them.
- Tuned queries to improve report performance.
- Created complex reports involving query prompts, layout calculations, conditions, and filters in Report Studio.
- Improved the performance and usability of the reports by creating appropriate prompts and filters.
- Developed multi-page reports with prompt pages and conditional variables according to end-user requirements.
- Integrated reports into third-party applications.
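A generic SQL sketch of the Slowly Changing Dimension Type 2 pattern referenced above (dimension and staging table names are hypothetical; in Pentaho the Dimension Lookup/Update step handles this internally):

```sql
-- Step 1: expire the current dimension row when a tracked attribute changed
UPDATE dim_customer d
SET    current_flag  = 'N',
       effective_end = CURRENT_DATE
WHERE  d.current_flag = 'Y'
AND EXISTS (
    SELECT 1
    FROM   stg_customer s
    WHERE  s.customer_id = d.customer_id
    AND    s.address    <> d.address
);

-- Step 2: insert a fresh current row for changed and brand-new customers
INSERT INTO dim_customer (customer_id, address, effective_start, effective_end, current_flag)
SELECT s.customer_id, s.address, CURRENT_DATE, NULL, 'Y'
FROM   stg_customer s
WHERE NOT EXISTS (
    SELECT 1
    FROM   dim_customer d
    WHERE  d.customer_id = s.customer_id
    AND    d.current_flag = 'Y'
);
```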
Environment: Cognos 10.2, Oracle 12c, PowerPoint, Microsoft Excel, Microsoft Word, Erwin, Cognos Framework Manager, Pentaho Kettle PDI 6.1, SVN.
Confidential, O'Fallon, MO
Senior BI Analyst
Responsibilities:
- Developed ETL processes using Pentaho PDI to extract data from the legacy system.
- Worked with DBAs to archive old data to improve load performance.
- Scheduled the jobs using Pentaho Scheduler.
- Provided technical support to other developers and coordinated escalation of issues.
- Built relational models for reporting and DMR models for OLAP-style analysis and reporting, using a multi-layer approach and modeling best practices to meet the reporting requirements.
- Involved in the design and development of security architecture to meet business needs.
- Ensured ambiguous relationships were resolved and implemented model enhancement techniques to improve model performance.
- Created model query subjects, data source query subjects, and standalone and embedded filters to implement complex business logic.
- Implemented internal Cognos security, including column-level and row-level security using security tables, to secure sensitive data with security filters, macros, session parameters, and parameter maps.
- Developed hierarchies and measures to support ad hoc and multidimensional reporting.
- Published packages to Cognos Connection and implemented package-level security.
- Managed user IDs and user groups, including granting privileges to users and user groups in Access Manager.
- Imported Transformer cubes as data sources into Framework Manager and created models and packages for Analysis Studio.
- Developed report templates for standard and ad hoc reporting to keep reports consistent.
Environment: MS SQL Server 2008, Oracle 11g, Cognos 8.4, Cognos 10, Cognos Framework Manager 8/10, Unix, Windows Server 2008, Pentaho Kettle (PDI) 3.2, 4.1.