
BI Architect Resume


New York City, NY

SUMMARY:

  • 18 years of IT experience in ETL and data warehousing, with specialization in Pentaho implementations.
  • Integration of complex enterprise-level dashboards.
  • Bridging the gap between business owners/drivers and technical teams.
  • Identifying bottlenecks in processes, tools and technologies.
  • Optimizing resource utilization in OS, ETL, Databases, Reporting & BI and Batch window availability.
  • Integration of multiple sources and targets using Pentaho.
  • Performance Testing, Grid Optimization and Performance Benchmarking.
  • Dynamic SQL scripting and data generation, Data partitioning and archival.
  • Shell scripting (basic-level user), typically used to troubleshoot performance issues, schedule scripts, catch up aborted jobs, etc.
  • Requirement Gathering: initial connects with business stakeholders to define points of contact, outcomes, sample data, templates and sign-off criteria.
  • Source System Analysis: source systems are analyzed in terms of data structure, granularity, cardinality, quality, latency and method of access.
  • Target Data Warehouse Design: the data warehouse is designed primarily as a star schema with minor snowflaking and global dimensions; a data partitioning and archival strategy is defined.
  • BI Architecture Design: designing reports for business users and evaluating their end usage; analyzing the database and OS footprint with continuous optimization.
  • Performance Benchmarking and Forecasting: analyzing OS logs, database stats and reporting logs to forecast issues before they hit the online reporting system.
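The dynamic SQL scripting and data generation mentioned above can be sketched as a metadata-driven generator that renders load statements instead of hand-writing each one. This is an illustrative sketch only; the table, column and function names below are hypothetical, not from the actual implementations.

```python
# Sketch of metadata-driven dynamic SQL: build an INSERT ... SELECT
# statement from a source-target mapping dictionary.
# All table and column names are hypothetical examples.

def build_load_sql(mapping):
    """Render an INSERT ... SELECT from a source-target mapping dict."""
    cols = ", ".join(m["target"] for m in mapping["columns"])
    exprs = ", ".join(m["source"] for m in mapping["columns"])
    return (
        f"INSERT INTO {mapping['target_table']} ({cols})\n"
        f"SELECT {exprs}\n"
        f"FROM {mapping['source_table']}"
    )

mapping = {
    "source_table": "stg_orders",
    "target_table": "fact_orders",
    "columns": [
        {"source": "order_id", "target": "order_key"},
        {"source": "ROUND(amount, 2)", "target": "order_amount"},
    ],
}

print(build_load_sql(mapping))
```

Because the SQL is derived from metadata, adding a column to the mapping changes the generated statement without touching ETL code.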

TECHNICAL SKILLS:

Pentaho BI Suite: Data integration and analysis. Being open source, its metadata is readily available and can be harnessed for dynamic development. Used extensively to create product data warehouses that respond dynamically to changing business requirements.

SQL: Expert in writing queries and processing data. Most ETL components are extensions of PL/SQL, and reporting tools are extensions of Spool.

PROFESSIONAL EXPERIENCE:

Confidential, New York City, NY

BI Architect

Responsibilities:

  • Requirement Gathering and data analysis for the new Clinical Cloud database
  • Creating an optimal Data Model with proper operational hierarchy.
  • Designing a dynamic ETL approach that scales as the data model grows and new hierarchies are added.
  • Creating scalable ETL processes from sources in Oracle, XML, APIs, external files and Confidential Docs.
  • Establishing proper dependencies and error processing.
  • Implementing dynamic metadata for ETL to ensure scalability and ease of troubleshooting.
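Establishing proper job dependencies, as described above, amounts to deriving a valid run order from a dependency graph. A minimal sketch using Python's standard library; the job names are hypothetical, not from the actual project.

```python
# Sketch: derive an ETL run order from job dependencies.
# Each key maps a job to the set of jobs it depends on.
# Job names are hypothetical examples.
from graphlib import TopologicalSorter

deps = {
    "load_fact_orders": {"load_dim_customer", "load_dim_product"},
    "load_dim_customer": {"stage_crm"},
    "load_dim_product": {"stage_catalog"},
}

# static_order() yields prerequisites before the jobs that need them,
# so staging jobs come first and the fact load comes last.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

A scheduler built this way also surfaces cycles (via `graphlib.CycleError`), which is one common source of aborted batch runs.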

Environment: AWS, Oracle, XML, APIs, Pentaho ETL

Confidential, New York City, NY

BI Architect

Responsibilities:

  • Requirement Gathering and data analysis.
  • Creating an optimal Data Model with proper naming conventions to facilitate automated validation.
  • Creating scalable ETL processes from sources in Dremel, BigQuery, external files and Confidential Docs.
  • Establishing proper dependencies and error processing.
  • Optimizing ETL performance and generating high-performance extracts to be consumed in Tableau.
  • Implementing dynamic metadata for ETL to ensure scalability and ease of troubleshooting.

Environment: AWS Redshift, Salesforce, Gainsight, Pentaho ETL, Tableau

Confidential, Mountain View, CA

BI Architect

Responsibilities:

  • Requirement Gathering and data analysis.
  • Creating an optimal Data Model with proper naming conventions to facilitate automated validation.
  • Creating scalable ETL processes from sources in Dremel, BigQuery, external files and Confidential Docs.
  • Establishing proper dependencies and error processing.
  • Optimizing ETL performance and generating high-performance extracts to be consumed in Tableau.
  • Implementing dynamic metadata for ETL to ensure scalability and ease of troubleshooting.

Environment: Dremel, BigQuery, Pentaho ETL, Pantheon, Tableau

Confidential, New York, NY

ETL Architect

Responsibilities:

  • Requirement Gathering from the Analytics team and Centers for Medicare & Medicaid Services (CMS)
  • Data warehouse modeling to create the source data model and star schema for the data warehouse.
  • Creating ETL architecture for large data volumes using Pentaho Data Integration.
  • Used FastJSON, Java and MySQL scripts to retrieve source data.

Environment: Pentaho BI Suite, MySQL, MongoDB

Confidential, New York, NY

Data Warehouse Architect/ETL Engineer

Responsibilities:

  • Requirement Gathering from the Business Users in Advertising and Reporting
  • Defining Source-target mappings for ETL in Pentaho, dependency matrix for ETLs and Reporting.
  • Incorporating reconciliation using Confidential Analytics and source target data profiling.
  • Dynamic ETL in Pentaho using SugarCRM metadata to keep pace with the numerous changes on the CRM analytics side, ensuring that ETL development is no longer a bottleneck for business users.
  • Incorporating data from Redshift and web logs in Hadoop.
  • Scripting in Pig to create rollups of weblogs for feeding data into the data warehouse.
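The rollup step above aggregates raw weblog hits into summary rows before warehouse loading. The actual implementation was in Pig on Hadoop; this is only an illustrative sketch of the same grouping idea in Python, with hypothetical field values.

```python
# Sketch of a weblog rollup: count raw hits per (date, page) so the
# warehouse stores daily summaries instead of raw events.
# Dates and page paths are hypothetical examples; the real rollup
# ran as a Pig script on Hadoop.
from collections import Counter

def rollup(hits):
    """hits: iterable of (date, page) tuples -> Counter of hit counts."""
    return Counter((date, page) for date, page in hits)

hits = [
    ("2015-06-01", "/home"),
    ("2015-06-01", "/home"),
    ("2015-06-01", "/pricing"),
    ("2015-06-02", "/home"),
]

for (date, page), count in sorted(rollup(hits).items()):
    print(date, page, count)
```

In Pig the same shape is a `GROUP BY (date, page)` followed by `COUNT`, emitted to a file the warehouse load then picks up.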

Environment: Pentaho BI Suite, Redshift, Oracle, Confidential Analytics.

Confidential

Product Technical Architect

Responsibilities:

  • Source System Analysis: source systems were mainly Oracle databases, along with flat-file or XML feeds from non-Finacle systems. The systems were analyzed to bring all data sources into a de-normalized format that could be processed generically and loaded into target dimensions and facts using pre-staging and staging databases. Defined source-target mappings for 200 targets.
  • Target Data Warehouse Design: the data warehouse was designed as a Ralph Kimball star schema with minor snowflaking and global dimensions. The data model was created in Erwin and tables were optimized for downstream reporting.
  • Reporting Design: designed 50 off-the-shelf reports as a product, created in Tableau format for easier deployment.
  • Performance Benchmarking: using bulk data generation, all sources were populated and the product was performance-tested in IBM Labs by analyzing Oracle AWR and nmon reports.
  • A key feature of this product was dynamic ETL: product metadata was used to create ETL and all objects dynamically through semantic layers.
  • Presenting Solutions to Clients and incorporating new features.

Environment: IBM Infosphere Information Server, Oracle, Tableau

Confidential

Data Warehouse Architect

Responsibilities:

  • Source System Analysis: source systems consisted of many relational databases, equipment-generated log files and XML files. Typically, any equipment connected to the network creates 2-5 types of data with different metadata. The systems were analyzed to bring all data sources into a pre-staging database.
  • Target Data Warehouse Design: the data warehouse was designed as a Ralph Kimball star schema with major snowflaking and global dimensions. Tables were optimized for downstream reporting.
  • ETL Architecture Design: implemented source-target mappings using transformations created in Pentaho, Perl and Python; implemented dependencies, SCDs and partitioning using ETL and Oracle.
  • Reporting Design: designed 100 off-the-shelf reports as a product in Tableau.
  • Shell scripting to control ETL steps and reporting refreshes, with monitoring and feedback to support teams.
  • KPI Incorporation: the telecom domain has thousands of key performance indicators, and not all of them can be shipped as part of the product. An application interface was created to provide a drag-and-drop framework that enabled KPIs to be created directly as ETL components. This allowed in-memory reporting in Tableau to connect directly to the data warehouse, bypassing the Cognos framework.
  • A key feature of this product was Excel source-target mappings being used directly to create ETL source-target mappings through semantic layers.
  • Big data processing of network logs using Pig & Hive.
  • Presenting Product Demo to Clients and incorporating new features.
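The Excel source-target-mapping feature above treats the mapping workbook itself as ETL metadata. A minimal sketch, assuming the workbook is exported to CSV; the column headers and table names are hypothetical, not the product's actual schema.

```python
# Sketch: parse a source-target mapping (exported from Excel to CSV)
# and emit one ETL column mapping per row.
# Headers and table/column names are hypothetical examples.
import csv
import io

STM_CSV = """source_table,source_column,target_table,target_column
stg_cdr,call_id,fact_calls,call_key
stg_cdr,duration_sec,fact_calls,call_duration
"""

def parse_stm(text):
    """Return a list of {'from': ..., 'to': ...} column mappings."""
    return [
        {
            "from": f"{row['source_table']}.{row['source_column']}",
            "to": f"{row['target_table']}.{row['target_column']}",
        }
        for row in csv.DictReader(io.StringIO(text))
    ]

for m in parse_stm(STM_CSV):
    print(m["from"], "->", m["to"])
```

A generator like this is what lets analysts change the spreadsheet and have the ETL layer regenerate mappings without a development cycle.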

Environment: IBM Infosphere Information Server, Oracle, Tableau

Confidential, Anchorage, Alaska

Data Warehouse Architect

Responsibilities:

  • Datamart Design: a couple of datamarts were designed as star schemas and optimized for ETL and reporting.
  • Dynamic SQL Generation: the generation process used metadata directly from ETL requirements, considerably reducing development effort.

Environment: Microsoft SSIS/SSRS, Unix, SQL Server, Erwin

Confidential, Austin, TX

Data Warehouse Architect

Responsibilities:

  • Requirement Analysis: analyzed site analytics and performance data, defined site performance metrics, and compared multiple ETL and reporting tools.
  • Data Warehouse Design: created the data warehouse star schema model and OLAP cubes for reporting.
  • ETL Design: created POC ETL components in Pentaho.
  • OLAP Cube Design: created OLAP cubes and a refresh strategy.
  • Tableau Reports and Dashboard Design: used in-memory reporting to design 20+ reports and 5 dashboards, including report bursting and email delivery.

Environment: Pentaho BI, Tableau

Confidential, Houston, TX

ETL Architect

Responsibilities:

  • Requirement Analysis and Database Design: reconciliation requirements were translated into database designs using data from flat files and ERCOT Oracle databases. Most reporting requirements were reconciliations of data between retail usage and ERCOT.
  • Two datamarts were created with 40-50 tables and around 50 PL/SQL packages.
  • Packages were tuned iteratively as data volumes increased beyond 10 million records/day.
  • DataStage ETL was used to connect to flat files and secondary data sources.

Environment: Erwin, IBM information server, Oracle, Unix

Confidential

Oracle DBA

Responsibilities:

  • Database Installation and Configuration for Manugistics
  • Database Support: post-installation monitoring and single-point reconciliation of all databases; troubleshooting issues using Enterprise Manager, Statspack and AWR reports.
  • Optimizing database processes such as indexing, partitioning, defragmentation, SQL tuning and DB tuning.
  • Critical Production Support

Environment: Oracle, PL/SQL, Enterprise Manager, Unix

Confidential

ETL Developer

Responsibilities:

  • Requirement Analysis: data sources in flat file, SAP IDoc, EDIFACT, SWIFT, XML, etc. The message broker service included configuring protocols such as FTP, SFTP and MQ Series. This data was loaded into a Sybase DB and finally delivered via other broker services.
  • ETL Development: runtime maps were created using the AMTrix EAI tool.

Environment: AMTrix EAI tool, Oracle, Unix

Confidential

Mainframes Developer

Responsibilities:

  • Programming/Testing: developing and maintaining COBOL programs and JCLs.
  • GUI Development: developing and maintaining IDMS-ADSO and DB2-CICS applications.

Environment: IBM Mainframes, CICS, ADSO, JCL, IDMS, DB2
