
Sr. ETL/Pentaho Developer Resume


FL

PROFESSIONAL SUMMARY:

  • Skilled software professional with 7+ years of experience in creating OLTP and OLAP data warehouses using the Pentaho Suite (Pentaho Data Integration/Kettle, Pentaho BI Server, Pentaho Metadata, Pentaho Analysis Tool and Mondrian OLAP) versions 5.4/4.4/4.2 on Linux and Windows platforms.
  • Experienced in developing BI Reports and Dashboards using Pentaho Reports and Pentaho Dashboards.
  • Worked on the complete Software Development Life Cycle (SDLC).
  • Hands-on experience with various Informatica suites and the Informatica PowerCenter tool.
  • Excellent knowledge of RDBMS concepts and data modeling using Erwin 4.0, including Star and Snowflake schemas.
  • Experience with Pentaho on Amazon Web Services (AWS) Elastic MapReduce (EMR 3.4) that uses Hadoop.
  • Expertise in Informatica ETL and reporting tools. Deep understanding of the Data Warehousing SDLC and architecture of ETL, reporting and BI tools
  • Experience with Amazon Redshift.
  • Experience with data integration using Pig scripts, processing data over HDFS and storing it in S3 buckets with encryption.
  • Proficient in writing T-SQL statements, complex stored procedures, dynamic SQL queries, batches, scripts, functions, triggers, views, cursors and query optimization (a brief illustrative sketch follows this summary).
  • Hands on experience in creating and debugging Stored Procedures, Functions, Packages, Triggers, Cursors and Object Types in PL/SQL using TOAD and Oracle SQL Developer.
  • Experience with exception/error handling in PL/SQL.
  • Experience in converting stored procedures, functions and triggers written in PL/SQL code into T-SQL.
  • Responsible for performance tuning of stored procedures, Database Tables using Table Partitioning, SQL Profiler and Database tuning wizard.
  • Hands on experience on the whole ETL (Extract Transformation & Load) process.
  • Extremely motivated, diligent, conceptually strong team player with the ability to take on new roles and adapt quickly to new technologies.
  • Detail-oriented and results-driven, with excellent verbal and written communication, interpersonal, conflict-resolution and analytical skills.
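
A minimal T-SQL sketch of the stored-procedure and error-handling work summarized above; the schema, table and procedure names are hypothetical placeholders, not taken from any specific engagement.

```sql
-- Hypothetical illustration of a T-SQL stored procedure with error handling;
-- all object names are placeholders.
CREATE PROCEDURE dbo.usp_LoadDailySales
    @LoadDate DATE
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        BEGIN TRANSACTION;

        -- Stage-to-warehouse insert restricted to the requested load date
        INSERT INTO dbo.FactSales (SaleDate, ProductId, Quantity, Amount)
        SELECT s.SaleDate, s.ProductId, s.Quantity, s.Amount
        FROM   stg.Sales AS s
        WHERE  s.SaleDate = @LoadDate;

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;
        -- Re-raise the original error so the calling job sees the failure
        THROW;
    END CATCH
END;
```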

TECHNICAL SKILLS:

Database: SQL Server 2014/2008/2005/2000, Oracle 9i, MySQL, MSSQL, Microsoft Access

Tools: Toad, Oracle SQL Developer, SQL Plus, Oracle Enterprise Manager, SQL Server Management Studio, Business Intelligence Development Studio (BIDS), SQL Profiler, SSIS, Informatica PowerCenter 9.1/8.6/8.1/7.1

Pentaho Suite: Pentaho Data Integration (Kettle) 5.4, Pentaho BI Server, Pentaho Analysis Tool, Pentaho Report Designer

Reporting Tools: Tableau, Spotfire, SSRS, Crystal Reports, MS Access, MS Excel

Languages: T-SQL, PL/SQL, ASP.NET, C#, VB.NET, C, C++, Java, Visual Basic 6.0

Operating System: Windows 2000/ 2003 Server/ Win XP/Vista/7, Unix, Red Hat Linux

PROFESSIONAL EXPERIENCE:

Confidential, FL

Sr. ETL/ Pentaho Developer

Responsibilities:

  • Worked extensively with business users to gather requirements, actively cataloging issues and providing solutions.
  • Responsible for coding SSIS processes to import data into the Data Warehouse from Excel Spreadsheet, Flat Files and OLEDB Sources.
  • Used a wide range of steps in Pentaho transformations, including Row Normalizer, Row Denormalizer, Database Lookup, Database Join, Calculator, Add Sequence and Add Constants, along with various input and output steps for data sources such as tables, Access, text files, Excel and CSV files.
  • Participated in the design of staging databases and the Data Warehouse/Data Mart database using Star/Snowflake schemas in data modeling.
  • Troubleshoot BI tool problems and provide technical support as needed. Perform other tasks as assigned.
  • Worked very closely with Project Manager to understand the requirement of reporting solutions to be built.
  • Gathered business requirements by understanding business processes and needs.
  • Installed and Configured Pentaho BI Suite 4.2 & 4.4 along with Enterprise Repository in Pentaho BI server.
  • Installed and configured Pentaho Suite 5.4.0 and tested the transformations using the same.
  • Used the Pentaho Import/Export utility to migrate Pentaho transformations and jobs from one environment to another.
  • Prepared ETL (Extract, Transform and Load) standards and naming conventions, and wrote ETL flow documentation for Stage, ODS and Mart.
  • Configured Pentaho BI Server for report deployment by creating database connections in Pentaho enterprise console for central usage by the reports deployed in the repository.
  • Implemented logic with a database lookup table to maintain the parent-child relationship and hierarchy.
  • Used Pentaho Design Studio for creating custom parameters as well as generating reports.
  • Used Pentaho Report designer to create various reports having drill down functionality by creating Groups in the reports and drill through functionality by creating sub-reports within the main reports.
  • Automated file transfer processes and mail notifications by using FTP Task and Send Mail task in Transformations
  • Performed Data cleansing by creating tables to eliminate the dirty data using SSIS.
  • Involved in performing incremental loads while transferring data from OLTP to the data warehouse using different data flow and control flow tasks in SSIS.
  • Responsible for creating database objects such as tables, views, stored procedures, triggers and functions using T-SQL to provide structure for stored data and to maintain the database efficiently.
  • Extensively used joins and sub queries to simplify complex queries involving multiple tables.
  • Optimized query performance by modifying T-SQL queries, removing unnecessary columns, eliminating redundant and inconsistent data, normalizing tables, establishing joins and creating clustered and non-clustered indexes where necessary (see the index sketch following this list).
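
A hedged T-SQL sketch of the kind of index and query tuning described in the last item; the fact/dimension table, column and index names are illustrative assumptions.

```sql
-- Illustrative only: table, column and index names are hypothetical.
-- Clustered index on the fact table's date key to support range scans
CREATE CLUSTERED INDEX IX_FactOrders_OrderDateKey
    ON dbo.FactOrders (OrderDateKey);

-- Covering non-clustered index for a frequent customer-level report
CREATE NONCLUSTERED INDEX IX_FactOrders_CustomerKey
    ON dbo.FactOrders (CustomerKey)
    INCLUDE (OrderDateKey, SalesAmount);

-- Reporting query rewritten with a join instead of a correlated sub-query
SELECT c.CustomerName,
       SUM(f.SalesAmount) AS TotalSales
FROM   dbo.FactOrders  AS f
JOIN   dbo.DimCustomer AS c ON c.CustomerKey = f.CustomerKey
WHERE  f.OrderDateKey BETWEEN 20140101 AND 20141231
GROUP BY c.CustomerName;
```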

Environment: Oracle 11g/10g, Pentaho Data Integration Spoon 5.4.0/4.4.0/4.2.0/4.1.2, Oracle Toad 11.5/10.6, PL/SQL, Pentaho Enterprise Console, Linux, Windows.

Confidential, Pittsburgh, PA

ETL/Pentaho Developer

Responsibilities:

  • Worked with business users, the analytics team and data architects to identify business requirements; developed design documents for the ETL flow, analyzed the database architecture, and created various complex Pentaho transformations and jobs using PDI Spoon.
  • Involved in building a data warehouse whose primary data comes from external vendors and corporate sources, including data from various source systems such as Oracle 10g, flat files, Microsoft Excel files and XML files.
  • Involved in the ETL design and its documentation.
  • Designed the data warehouse, including star schema design, DW capacity planning, and MySQL performance tuning. Implemented the Orders and Points DW using a star schema and the Orders and Points business domain using Pentaho Metadata.
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Pentaho PDI.
  • Created a stage-based DW load that is implemented entirely in Pentaho Kettle.
  • Participated in the design of staging databases and Data Warehouse/Data Marts using Star/Snowflake schemas in data modeling.
  • Imported bulk data in Oracle tables from data files of fixed record, variable record and stream record format using SQL Loader.
  • Developed several Monthly Reports, YTD Reports and End-of-month Checks in PL/SQL.
  • Wrote Shell scripts in UNIX and PL/SQL scripts to automate daily routine jobs for production databases.
  • Modified existing Oracle PL/SQL code of stored procedures, functions and packages.
  • Created and executed SSIS packages to populate data from the various data sources, created packages for different data loading operations for many applications.
  • Created SSIS Packages using SSIS Designer to export heterogeneous data from OLE DB Source (Oracle), flat files, XML files, CSV files and Excel Spreadsheets to SQL Server 2005 and loaded the data into target data sources by performing different kinds of transformations using SSIS.
  • Migrated DTS 2000 packages to SQL Server Integration Services (SSIS) and modified the packages accordingly using the advanced features of SSIS.
  • Actively Participated in developing logical model and implementing requirements on SQL Server 2005.
  • Implemented complex conceptual database design into SQL Server 2005 using various constraint and triggers.
  • Implemented complex business requirement in Oracle Production databases using efficient PL/SQL stored procedures and flexible functions, and facilitated easy implementation to the front end application.
  • Used Oracle Enterprise Manager for troubleshooting, monitoring, and optimizing of Oracle Production and non-production database code as well as PL/SQL code from developers and QA.
  • Backed up and restored system and other databases as per requirements, and scheduled those backups.
  • Managed server security, created new logins and users, and changed user roles.
  • Involved in creating logical and physical models of the database using Erwin and Visio.
  • Developed stored procedures, triggers and views, and added/changed tables for data load, transformation and extraction.
  • Used the Oracle UTL_SMTP package to send email messages using the database as an SMTP mail server.
  • Used DBMS_SCHEDULER to schedule functions and procedures that are callable from any PL/SQL program (a sample scheduling block follows this list).
  • Involved in optimizing code and improving efficiency in databases by re-indexing, updating statistics, recompiling stored procedures and performing other maintenance tasks.
  • Interaction with business and technical teams across geographic boundaries, with clients, technical architects, project managers, requirements analysts and testing teams.
  • Developed mappings in Informatica to load data from source flat files and PDS (SQL Server) to EDW (SQL Server) using different transformations like Joiner, Aggregator, Update Strategy, Router, Normalizer, Active Lookup, Sequence Generator, Filter, Sorter and Source Qualifier.
  • Used Workflow Manager to create different tasks, set task properties, Workflows, Worklets to run the logic embedded in mappings.
  • Provided 24/7 production support for the production database and also to the code deployed into the production environment.
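
A minimal PL/SQL sketch of scheduling a nightly procedure with DBMS_SCHEDULER, as mentioned above; the job name and the package/procedure it calls are assumptions for illustration.

```sql
-- Hypothetical job and procedure names; shown only to illustrate DBMS_SCHEDULER usage.
BEGIN
  DBMS_SCHEDULER.CREATE_JOB (
    job_name        => 'NIGHTLY_DW_REFRESH',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'ETL_PKG.REFRESH_ORDERS_DW',        -- assumed package.procedure
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY; BYHOUR=2; BYMINUTE=0',  -- run every night at 02:00
    enabled         => TRUE,
    comments        => 'Nightly refresh of the Orders data warehouse');
END;
/
```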

Environment: Pentaho BI Server, Pentaho Data Integration (PDI/Kettle), Oracle 10g, PL/SQL, Toad, SQL*Loader, Oracle SQL Developer, SSIS, Oracle Enterprise Manager, SQL Server 2000/2005, SQL Agent, Visio, Erwin, HP-UX, Linux, Windows XP/Vista/2003 Server, Informatica PowerCenter 8.2

Confidential, Mclean, VA

ETL/Pentaho Developer

Responsibilities:

  • Worked with business users, the analytics team and data architects to identify business requirements; developed design documents for the ETL flow, analyzed the database architecture, and created various complex Pentaho transformations and jobs using PDI Spoon.
  • Involved in building a data warehouse whose primary data comes from external vendors and corporate sources, including data from various source systems such as Oracle 10g, flat files, Microsoft Excel files and XML files.
  • Participated in the design of staging databases and Data Warehouse/Data Marts using Star/Snowflake schemas in data modeling.
  • Loaded data into STAGE/ARCHDATA/DW/DM using transformations/jobs, which together consist of 150+ transformations and 30+ jobs orchestrated by a SMART master ETL job.
  • Used Pentaho's inbuilt Change Data Capture utility ( Confidential ) to capture data from the source system and load it into the Data Mart (DM).
  • Created database logging for each transformation/job into ETL PDI TRANS and set up email notification on failure at each component level.
  • Implemented Logic with Database lookup table to maintain Parent- Child relationship and maintain hierarchy.
  • Used transformation steps such as Merge Join, Database Join and XML components to load XML files.
  • Used the Pentaho Import/Export utility to migrate Pentaho transformations and jobs from one environment to another (DEV/QA/PREPROD/PROD).
  • Created transformations/jobs to take daily backups of the Enterprise Repository for DEV/QA/PREPROD/PROD.
  • Developed complex ETL procedures to migrate large volumes of data from the old system to the new one.
  • Applied configuration, logging and error reporting to all transformations and jobs to make deployment easy and to troubleshoot packages at run time.
  • Used Pentaho Enterprise Console (PEC) to monitor the ETL Jobs/Transformation on Production Database.
  • Created deployment builds (15+ builds) and documentation for DB objects and PDI jobs/transformations using a single build script.
  • Used Oracle Connection managers to configure database connections for Oracle 11g and Data Warehouse Databases.
  • Worked with team members to tune SQL queries and created the necessary indexes and partitions on large fact tables to improve ETL performance (an illustrative DDL sketch follows this list).
  • Automated file transfer processes and mail notifications by using FTP Task and Send Mail task in Transformations
  • Implemented and maintained database security (created and maintained users, roles and login access; assigned privileges) at the schema level.
  • Provided production support to the operations team for daily/weekly/monthly job failures in production environments.
  • Created deployment documentation with the help of other users and the operations team to deploy new releases in the production environment.
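
A hedged Oracle DDL sketch of the fact-table partitioning and indexing mentioned above; the schema, table, column and partition names are assumptions.

```sql
-- Illustrative only: schema, table, column and partition names are hypothetical.
CREATE TABLE dw.fact_transactions (
  txn_id       NUMBER         NOT NULL,
  txn_date     DATE           NOT NULL,
  account_key  NUMBER         NOT NULL,
  amount       NUMBER(14,2)
)
PARTITION BY RANGE (txn_date) (
  PARTITION p_2012 VALUES LESS THAN (TO_DATE('2013-01-01', 'YYYY-MM-DD')),
  PARTITION p_2013 VALUES LESS THAN (TO_DATE('2014-01-01', 'YYYY-MM-DD')),
  PARTITION p_max  VALUES LESS THAN (MAXVALUE)
);

-- Local index so lookups prune to a single partition and maintenance stays partition-wise
CREATE INDEX dw.ix_fact_txn_account
  ON dw.fact_transactions (account_key)
  LOCAL;
```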

Environment: Oracle 11g/10g, Pentaho Data Integration Spoon 4.4.0/4.2.0/4.1.2, Oracle Toad 11.5/10.6, PL/SQL, Pentaho Enterprise Console, SQL Server 2008, MySQL

Confidential, Owings Mills, MD

Pentaho/ETL Developer

Responsibilities:

  • Installed and configured MS SQL Server 2008 on Windows XP Professional and Windows Server 2003.
  • Actively participated in the team gathering requirements for the BI project and also participated in designing the physical and logical models of the data warehouse.
  • Participated in design of Staging Databases and Data Warehouse database
  • Implemented ETL to extract and transform data from text files and SQL Server 2005 instances and load it into the staging database. The main purpose of the staging load was to convert all string values to integers before loading into the Data Warehouse for further data analysis, using multiple transformations provided by SSIS such as Data Conversion, Lookup, Conditional Split, Slowly Changing Dimension, Merge and Union All.
  • Applied Configuration, Logging, Error reporting to all packages to make package deployment easy and troubleshoot package on run time.
  • Developed complex custom reports using Pentaho Report Designer which includes developing Business Views, Cascading pick-lists, Drill-through, Hyperlinks, sub-reports etc., functionality into these reports.
  • Implemented logic with a lookup table to maintain the parent-child relationship and hierarchy.
  • Used transformations such as Merge Join, Pivot, Lookup, Fuzzy Lookup and Slowly Changing Dimension (an illustrative Type-2 sketch follows this list).
  • Used package configurations to migrate SSIS packages from one environment to another.
  • Used event handling to send e-mail on events such as OnError.
  • Created ROLAP cubes using Pentaho Schema Workbench for group web meetings for demonstrating the powerful functionality of Pentaho Mondrian Server with Ad-Hoc reporting
  • Created different user-level groups and assigned appropriate permission levels to the database using the Credentials tool, ensuring that only users with sufficient authorization could see their corresponding information.
  • Used Data Viewers in SSIS packages to check the flow of data during package execution.
  • Generated reports using SSRS that could be used to send information to managers of different branches.
  • Standardized company reports by implementing SQL Server Reporting Services.
  • Deploying and scheduling Reports using SSRS to generate all daily, weekly, monthly and quarterly Reports including current status.
  • Deployed and Scheduled the Reports in Report Manager.
  • Scheduled and maintained routine jobs, tasks and alerts. For instance, an alert would be sent via SQL Server Agent to the department manager if required screening fields were not entered by the data entry person.
  • Applied Performance tuning techniques at source level, target level, mapping level and session level while dealing with bottlenecks.
  • Used the Informatica PowerCenter Mapping Designer to create mappings, Workflow Manager to create and run sessions and workflows that execute the mapping logic, and Workflow Monitor to review run statistics.
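
A hedged T-SQL sketch of the Type-2 slowly-changing-dimension logic that the SSIS SCD handling above implements; the dimension, staging table and column names are hypothetical.

```sql
-- Illustrative Type-2 SCD logic; all object names are placeholders.
-- Step 1: expire current dimension rows whose tracked attributes changed
UPDATE d
SET    d.IsCurrent = 0,
       d.EndDate   = GETDATE()
FROM   dbo.DimCustomer AS d
JOIN   stg.Customer    AS s ON s.CustomerId = d.CustomerId
WHERE  d.IsCurrent = 1
  AND (d.City <> s.City OR d.Segment <> s.Segment);

-- Step 2: insert a new current row for any customer without one
-- (covers both changed customers expired above and brand-new customers)
INSERT INTO dbo.DimCustomer (CustomerId, City, Segment, StartDate, EndDate, IsCurrent)
SELECT s.CustomerId, s.City, s.Segment, GETDATE(), NULL, 1
FROM   stg.Customer AS s
LEFT JOIN dbo.DimCustomer AS d
       ON d.CustomerId = s.CustomerId AND d.IsCurrent = 1
WHERE  d.CustomerId IS NULL;
```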

Environment: SQL Developer, Oracle 10g, SQL Server 2005/2008, Pentaho BI Suite (Data Integration Designer, Report Designer, Dashboard Designer, Analysis View, Pentaho Analyzer, Design Studio, Mondrian Server), SQL Management Studio 2008, Windows Server 2008, JavaScript

Confidential, Washington, DC

ETL Developer

Responsibilities:

  • Worked with Business users and Business analysts to understand and analyze ETL and reporting requirements. Conducted metric analytics to come up with technical specifications documents.
  • Used Informatica PowerCenter components such as Mapping Designer to design mappings, Workflow Manager to create sessions and workflows to execute the mappings, and Workflow Monitor to view run properties and session logs and to perform recovery.
  • Worked with homogenous sources and heterogeneous sources and targets like relational tables, delimited and fixed width flat files and CSV format files while creating mappings.
  • Developed mappings using Expression, Source Qualifier, Filter, Router, Lookup and Joiner.
  • Created and used Views, Stored Procedures, Queries and tables while developing ETL mappings and reports.
  • Worked on building Xcelsius dashboards on top of universes built using the SAP Integration Kit.
  • Worked on Performance manager to create various metrics, Analytics and used Dashboard Manager to build corporate dashboards.
  • Developed reports by linking different data providers in a single report and resolved incompatible objects by merging the common objects.
  • Performed unit testing on both ETL mappings and reports to reconcile amount and quantity figures against the transactional system (a sample reconciliation query follows this list).
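
A minimal Oracle SQL sketch of the kind of reconciliation query used for that unit testing; the schemas, tables and columns are assumptions.

```sql
-- Illustrative only: compares monthly totals between an assumed OLTP source
-- and the warehouse fact table, listing months where the figures disagree.
SELECT NVL(src.order_month, tgt.order_month)           AS order_month,
       src.src_amount,
       tgt.dw_amount,
       NVL(src.src_amount, 0) - NVL(tgt.dw_amount, 0)  AS variance
FROM  (SELECT TRUNC(order_date, 'MM') AS order_month,
              SUM(order_amount)       AS src_amount
       FROM   oltp.orders
       GROUP BY TRUNC(order_date, 'MM')) src
FULL OUTER JOIN
      (SELECT TRUNC(order_date, 'MM') AS order_month,
              SUM(order_amount)       AS dw_amount
       FROM   dw.fact_orders
       GROUP BY TRUNC(order_date, 'MM')) tgt
  ON src.order_month = tgt.order_month
WHERE NVL(src.src_amount, 0) <> NVL(tgt.dw_amount, 0);
```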

Environment: Informatica 7.1, Business Objects XI R2/6.5, Oracle 10g/9i, InfoView, Xcelsius, SAP Integration Kit
