
Sr. Pentaho Developer Resume


St. Louis, Missouri

PROFESSIONAL SUMMARY:

  • Around 7 years of experience in ETL, with a very good understanding of Data Warehousing concepts, Data Analysis, web-based Data Warehouse applications, and design.
  • Experienced with open-source Business Intelligence tools, specifically Pentaho, as well as database management systems, development, and the complete Project Life Cycle in Data Warehousing and Client/Server technologies.
  • Experience in Data Warehouse development working with Data Migration, Data Conversion, and Extraction/Transformation/Loading using Pentaho Data Integration (Pentaho Kettle) with Oracle, SQL Server.
  • Experience in developing Data Warehouse architecture, ETL framework and BI Integration using Pentaho Reports and Pentaho Dashboards.
  • Highly experienced in Data Warehousing, ETL, and Business Intelligence using the Pentaho Suite (Pentaho Data Integration/Kettle, Pentaho BI Server, Pentaho Metadata, and Pentaho Analysis Tool & Mondrian OLAP).
  • Experience in designing, modeling, performance tuning, and analysis using Pentaho Data Integration/Kettle, Pentaho BI Server, and Pentaho Analysis Tool.
  • Experience with reporting tools such as Pentaho Report Designer, Spotfire, and AWS Kibana, as well as dashboard development and Report Analyzer.
  • Strong skills using SQL queries on Oracle, DB2, and SQL Server.
  • Experience in creating ETL transformations and jobs using Pentaho Kettle Spoon designer and Pentaho Data Integration Designer and scheduling them on Pentaho BI Server.
  • Hands-on experience with Data Warehouse Star Schema Modeling, Snowflake Modeling, Fact & Dimension Tables, and Physical and Logical Data Modeling.
  • Proficient in implementing the business logic to design/develop the Cubes, Aggregation, Measures using SQL Server Analysis Services (SSAS).
  • Experience with exception/error handling in PL/SQL (a brief sketch follows this list).
  • Experience in Java development working across multiple technologies.
  • Excellent programming skills in creating database objects like Tables, Views, Stored Procedures, Indexes, Triggers and Functions.
  • Proficient in transforming data from various sources (flat files, XML, Sybase, Oracle) to Data warehouse using ETL tools.
  • Outstanding communication and interpersonal skills, the ability to learn quickly, good analytical reasoning, and a strong aptitude for picking up modern technologies and tools.
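
A brief illustrative sketch of the PL/SQL exception/error-handling pattern referenced above; all object names (load_customer_dim, customer_dim, stg_customer, etl_error_log) are hypothetical placeholders rather than objects from any listed engagement.

```sql
-- Hypothetical PL/SQL sketch: a load procedure with explicit exception handling
-- that logs failures to an error table instead of failing silently.
CREATE OR REPLACE PROCEDURE load_customer_dim (p_batch_id IN NUMBER) AS
  v_rows NUMBER := 0;
BEGIN
  INSERT INTO customer_dim (customer_id, customer_name, load_batch_id)
  SELECT customer_id, customer_name, p_batch_id
  FROM   stg_customer;

  v_rows := SQL%ROWCOUNT;
  COMMIT;
  DBMS_OUTPUT.PUT_LINE('Loaded ' || v_rows || ' rows for batch ' || p_batch_id);
EXCEPTION
  WHEN DUP_VAL_ON_INDEX THEN
    ROLLBACK;
    INSERT INTO etl_error_log (batch_id, err_msg)
    VALUES (p_batch_id, 'Duplicate key during customer_dim load');
    COMMIT;
  WHEN OTHERS THEN
    ROLLBACK;
    INSERT INTO etl_error_log (batch_id, err_msg)
    VALUES (p_batch_id, SQLERRM);
    COMMIT;
    RAISE;  -- re-raise so the calling job still sees the failure
END load_customer_dim;
/
```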

TECHNICAL SKILLS:

Pentaho suite: Pentaho Data Integration (Kettle), Pentaho BI Server, Pentaho Analysis Tool, Pentaho Report Designer, Pentaho Mondrian, Pentaho Metadata Editor, Pentaho Design Studio, Mondrian OLAP Server.

Databases: SQL Server, Oracle 9i, MySQL, Microsoft Access.

Tools: Toad, Oracle SQL Developer, SQL Plus, Oracle Enterprise Manager, SQL Server Management Studio, Business Intelligence Development Studio (BIDS).

Other ETL tools: SQL Server Integration Services (SSIS), Data Transformation Services, BCP, Import Export Data, Bulk Insert.

Operating System: Windows, Unix and Linux.

Languages: SQL, PL/SQL, Java, HTML.

PROFESSIONAL EXPERIENCE:

Confidential, St. Louis, Missouri

Sr. Pentaho Developer

Responsibilities:

  • Designed the project through the SDLC process, applying best practices and current trends in BI and Data Warehousing to help shape the company's BI strategy and solution designs.
  • Utilized MySQL to pull data from various databases and created various dashboards, scorecards and reports to support business decision making.
  • Created an Enterprise Repository in Pentaho BI Server to store Pentaho jobs and reports.
  • Configured Pentaho BI Server for report deployment by creating database connections in Pentaho Enterprise Console for central use by the reports deployed in the repository.
  • Created user accounts in Pentaho Enterprise Console for end users/Business Analysts who would view the reports through the Pentaho User Console.
  • Implemented security in Pentaho reports by assigning permission to specific users to view the reports.
  • Deployed reports on Pentaho BI Server to give central web access to the users.
  • Used the Pentaho Import/Export utility to migrate Pentaho transformations and jobs from one environment to another.
  • Extracted data from various data sources and populated it into the BI data warehouse by developing ETL processes using Pentaho PDI.
  • Created all ETL transformations and jobs using Pentaho Data Integration Designer.
  • Used diverse types of input and output steps for various data sources including Tables, Access, Text File, Excel and CSV files.
  • Created single-value as well as multi-value drop-down and list-type parameters with cascading prompts in the reports (see the query sketch following this list).
  • Saved Pentaho jobs in the enterprise repository and scheduled them to run in production on a daily basis.
  • Created several dashboards in Pentaho using Pentaho Dashboard Designer.
  • Used Pentaho Schema Workbench to create Cubes, Dimensions and fact tables in Pentaho.
  • Performance tuning of slowly running transformations and jobs, SQL Queries & Stored procedures using SQL Profiler.
  • Involved in administration tasks such as creating and managing sites, setting permissions, managing ownership, and providing access to users via user filters and by adding them to the appropriate groups.
  • Created web-based reports and dashboards using Pentaho Report Designer.
  • Tested Pentaho ETL jobs, reports and database stored procedures being called in the reports.
  • Participated in Business Analysis & requirement collection. Interacted with clients directly before writing requirement specification documents.
  • Analyzed the source data coming from various heterogeneous data sources to be consumed in the ETL procedures to load data into data warehouse.
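
A minimal sketch of the cascading-prompt reports noted above; the table, column, and parameter names (dim_store, fact_sales, ${region}, ${store_id}) are hypothetical, and the exact expansion of a multi-value parameter inside IN (...) depends on the Pentaho Report Designer version in use.

```sql
-- Hypothetical report queries for cascading prompts in Pentaho Report Designer.
-- Query backing the dependent "store" prompt: its list is filtered by the value
-- chosen in the ${region} prompt.
SELECT store_id, store_name
FROM   dim_store
WHERE  region = ${region};

-- Main report query consuming both prompts; ${store_id} is a multi-value
-- selection used inside IN (...).
SELECT s.store_name,
       d.calendar_month,
       SUM(f.sales_amount) AS total_sales
FROM   fact_sales f
JOIN   dim_store  s ON s.store_key = f.store_key
JOIN   dim_date   d ON d.date_key  = f.date_key
WHERE  s.region = ${region}
  AND  s.store_id IN (${store_id})
GROUP  BY s.store_name, d.calendar_month;
```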

Environment: Pentaho BI Server, Pentaho Data Integration (PDI/Kettle), Pentaho Metadata Editor, Pentaho Design Studio, Pentaho Report Designer, Pentaho Dashboard Designer, MySQL, Windows.

Confidential, San Ramon, CA

Pentaho Developer

Responsibilities:

  • Designed Data warehouse including star schema design, DW capacity planning, performance and tuning.
  • Designed and developed complete BI infrastructure using Pentaho BI Suite.
  • Created Enterprise Repository in Pentaho BI Server to store Pentaho ETL jobs and Reports. Responsible to manage data coming from various sources.
  • Extensively worked with the business and data analysts in requirements gathering and to translate business requirements into technical specifications.
  • Used Pentaho Data Integration Designer to create all ETL transformations and jobs.
  • Identified and analyzed data discrepancies and data quality issues and worked to ensure data consistency and integrity.
  • Used Pentaho Design Studio for creating custom parameters as well as generating reports.
  • Implemented Slowly Changing Dimension Type 1 and Type 2 in ETL jobs for certain dimensions (the Type 2 pattern is sketched following this list).
  • Used Pentaho Report designer to create various reports having drill down functionality by creating Groups in the reports and drill through functionality by creating sub-reports within the main reports.
  • Created several types of chart reports in Pentaho Business Analytics having Pie Charts, 3D Pie Charts, Line Charts, Bar Charts, Stacked Bar Charts and Percentage Bar charts.
  • Created reports in Pentaho report designer using complex SQL queries and stored procedures against DW tables and added multiple levels of cascading prompt for both static and dynamic parameters.
  • Dealt with slowly changing dimensions type 2 and multi-level hierarchical dimensions.
  • Published cubes and reports onto Pentaho repository and refreshed Pentaho BI repository after uploading each object to be available for central use.
  • Deployed reports on Pentaho BI Server to give central web access to the users.
  • Created several dashboards in Pentaho using Pentaho Business Analytics Platform.
  • Used Pentaho Schema Workbench to create Cubes, Dimensions and fact tables in Pentaho.
  • Created various ad-hoc reports using Pentaho Analyzer to allow users to generate reports on the fly by dragging and dropping dimensions onto rows and/or columns and dropping one or more measures to get the results.
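
A minimal SQL sketch of the SCD Type 2 pattern referenced above, assuming hypothetical table and column names (dim_customer, stg_customer, with address as the tracked attribute); the resume describes this logic being implemented inside ETL jobs, and the SQL below only illustrates the intended effect.

```sql
-- Hypothetical SCD Type 2 sketch: expire the current dimension row when a
-- tracked attribute (here, address) changes, then insert a new current version.
UPDATE dim_customer
SET    effective_to = CURRENT_DATE,
       is_current   = 0
WHERE  is_current = 1
  AND  EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = dim_customer.customer_id
                 AND  s.address    <> dim_customer.address);

INSERT INTO dim_customer
       (customer_id, customer_name, address, effective_from, effective_to, is_current)
SELECT s.customer_id,
       s.customer_name,
       s.address,
       CURRENT_DATE,
       NULL,
       1
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.customer_id = s.customer_id
                     AND  d.is_current  = 1
                     AND  d.address     = s.address);
```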

Environment: Pentaho BI Server, Pentaho Report Designer, Pentaho Analyzer, Pentaho Data Integration/Kettle-Spoon, Pentaho Metadata Editor, Pentaho Design Studio, Pentaho Dashboard Designer, MySQL, UNIX shell scripts, Oracle11g, SQL, Toad, Erwin, Putty, MS-Excel, Windows, Oracle SQL Developer.

Confidential, Norfolk, VA

ETL/Pentaho Developer

Responsibilities:

  • Interacted with the Business Analysts to understand the process flow and the business.
  • Actively participated in the team's requirements gathering for this BI project and participated in the Physical and Logical design of the Data warehouse.
  • Created Data Flow Mappings to extract data from the source systems and load it into the target.
  • Configured and installed Pentaho Data Integration Server 4.1 hierarchically from development to production servers.
  • Integrated the Enterprise Repository to store the Pentaho ETL jobs and reports.
  • Used Pentaho Data Integration Designer 4.1 to create all ETL transformations and jobs.
  • Used Eclipse IDE to create java plugins to work with Pentaho jobs and transformations.
  • Used Pentaho Data Integration/Kettle to design all ETL processes to extract data from various sources including live system and external files, cleanse and then load the data into target data warehouse.
  • Used dimension lookup/update step to populate data into SCDs.
  • Designed Data warehouse including star schema design, DW capacity planning, MySQL performance and tuning. Implemented Orders and Points DW using star schema, Orders and Points Business domain using Pentaho Metadata.
  • Created and used various reusable tasks, workflows, worklets, mapplets, and transformations.
  • Created cubes using schema workbench on top of DW star schema.
  • Developed UNIX scripts for scheduling the delta loads and master loads using Autosys Scheduler.
  • Designed various SQL queries to extract data as per the project's requirements.
  • Configured error handling for transformation steps by creating Debug mode in Pentaho.
  • Created validations in Pentaho to generate reports on the percentage of mismatched data and to achieve data quality (see the validation-query sketch following this list).
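
The validation bullet above is illustrated by the hypothetical query below (stg_orders, fact_orders, and the column names are placeholders); a query of this shape can feed a Pentaho report showing the percentage of source rows that failed to match the warehouse after a load.

```sql
-- Hypothetical data-quality check: percentage of staged rows with no matching
-- warehouse row, grouped by load date.
SELECT s.load_date,
       COUNT(*)                                            AS source_rows,
       SUM(CASE WHEN f.order_id IS NULL THEN 1 ELSE 0 END) AS unmatched_rows,
       ROUND(100.0 * SUM(CASE WHEN f.order_id IS NULL THEN 1 ELSE 0 END)
             / COUNT(*), 2)                                AS pct_mismatch
FROM   stg_orders s
LEFT JOIN fact_orders f
       ON f.order_id = s.order_id
GROUP  BY s.load_date;
```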

Environment: Pentaho BI Server, Pentaho Data Integration (PDI/Kettle), Pentaho Design Studio, Pentaho Report Designer, Pentaho Dashboard Designer, Pentaho Business Analytics, Java, Oracle 11g, Oracle SQL Developer, SQL Profiler, Linux, HP automation server, Eclipse, RPM packaging, Unix.

Confidential

ETL Informatica Developer

Responsibilities:

  • Interacted actively with Business Analysts and Data Modelers on Mapping documents and Design process for various Sources and Targets.
  • Developed rules and mapplets commonly used across different mappings and used IDQ to perform data profiling and create mappings.
  • Involved in migration of the code from IDQ to Power Center.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Used various transformations like Address validator, parser, joiner, filter, matching to develop the mappings.
  • Created complex mappings in Power Center Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier, and Stored Procedure transformations.
  • Worked on Power Center tools including Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Implemented SCD1, SCD2 type mappings to capture new changes and to maintain the historic data.
  • Worked on extract and load type of mappings. Worked on SQL, Oracle, Teradata databases.
  • Designed workflows that use multiple sessions and command-line objects (which are used to run the UNIX scripts).
  • Worked with command line program pmcmd to interact with the server to start and stop sessions and batches, to stop the Informatica server and recover the sessions.
  • Involved in creating stored procedures and using them in Informatica.
  • Developed UNIX shell scripts to schedule Informatica sessions.
  • Responsible for Performance Tuning at the Mapping Level and Session level.
  • Used Debugger to troubleshoot the mappings.
  • Worked extensively on Unit Testing the ETL mappings and preparing efficient unit test documentation for the developed code to make sure the test results match with the client requirement.
  • Responsible for migration of the work from dev environment to testing environment.
  • Responsible for solving the testing issues. Created Triggers, Cursors using T-SQL.
  • Created documents that have the detail explanation of the mappings, test cases and the expected results and the actual results.
  • Involved in loading the partner flat files into the OLTP database. Provided Production Support.
  • Developed T-SQL stored procedures/functions to process this data and populate it into the appropriate destination tables (a sketch follows this list).
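
A minimal T-SQL sketch of the stored-procedure pattern referenced above; the procedure and table names (usp_LoadPartnerOrders, Stg_PartnerOrders, PartnerOrders, EtlErrorLog) are hypothetical placeholders, not the actual production objects.

```sql
-- Hypothetical T-SQL load procedure: move staged partner-file rows into the
-- destination table inside a transaction, logging any failure.
CREATE PROCEDURE dbo.usp_LoadPartnerOrders
    @BatchId INT
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        BEGIN TRANSACTION;

        INSERT INTO dbo.PartnerOrders (OrderId, PartnerCode, OrderAmount, BatchId)
        SELECT OrderId, PartnerCode, OrderAmount, @BatchId
        FROM   dbo.Stg_PartnerOrders
        WHERE  BatchId = @BatchId;

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;

        INSERT INTO dbo.EtlErrorLog (BatchId, ErrorMessage, LoggedAt)
        VALUES (@BatchId, ERROR_MESSAGE(), GETDATE());

        THROW;  -- re-raise so the calling session or workflow sees the failure
    END CATCH
END;
```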

Environment: Informatica Power Center 9.5.1, Informatica Data Quality 9.5.1, SQL Server 2012, DB2, UNIX, PL/SQL, Windows 7, ERWIN, SQL Developer 2005.

Confidential

Informatica Developer

Responsibilities:

  • Design, Development and Documentation of the ETL (Extract, Transformation & Load) strategy to populate the Data Warehouse from the various source systems.
  • Worked on Informatica 8.6.1 client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager and Workflow Monitor.
  • Involved in design and development of complex ETL mappings.
  • Implemented partitioning and bulk loads for loading large volume of data.
  • Based on the requirements, used various transformations like Source Qualifier, Normalizer, Expression, Filter, Router, Update strategy, Sorter, Lookup, Aggregator, Joiner and Sequence Generator in the mapping.
  • Developed Mapplets, Worklets and Reusable Transformations for reusability.
  • Identified performance bottlenecks and Involved in performance tuning of sources, targets, mappings, transformations and sessions to optimize session performance.
  • Identified bugs in existing mappings/workflows by analyzing the data flow and evaluating transformations.
  • Performance tuning by session partitions, dynamic cache memory, and index cache.
  • Developed Informatica SCD Type 1 and Type 2 mappings. Extensively used almost all Informatica transformations, including complex Lookups, Stored Procedures, Update Strategy, Mapplets, and others.
  • Implemented update strategies, incremental loads, data capture, and Incremental Aggregation (the incremental-load filter pattern is sketched following this list).
  • Created Stored Procedures in PL/SQL.
  • Created UNIX Shell scripts to automate the process.
  • Developed Documentation for all the routines (Mappings, Sessions and Workflows).
  • Involved in scheduling the workflows through Autosys Job scheduler using UNIX scripts.
  • Played a key role in all the testing phases and responsible for production support as well.
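
A small sketch of the incremental-load filter pattern mentioned above, assuming a hypothetical control table (etl_control) that records the high-water mark of the last successful extract; the source table and column names are placeholders.

```sql
-- Hypothetical incremental extract: pull only rows changed since the last
-- successful run, using a high-water mark kept in a control table.
SELECT o.order_id,
       o.customer_id,
       o.order_amount,
       o.last_update_ts
FROM   src_orders o
WHERE  o.last_update_ts > (SELECT c.last_extract_ts
                           FROM   etl_control c
                           WHERE  c.mapping_name = 'm_load_orders');
```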

Environment: Informatica Power Center 8.6.1/8.1.1, Oracle 10g, TOAD 10.1 for Oracle, DB2, Flat Files, PL/SQL, ERWIN 7.3, Windows 2000, UNIX PERL scripting, Autosys.
