
SAP HANA Data Process Development Resume


Juno Beach, FL

SUMMARY:

  • Distinguished technology change leader with a proven 18-year history of delivering high-quality business intelligence and technical data management leadership across a diverse range of technologies.
  • Experienced in large-scale projects such as business process re-engineering, big data analytics, MDM architecture, business intelligence, data warehousing, and IT strategy. Over 12 years' experience managing with Agile project control methods to meet project objectives and deliverables predictably. Leadership style is "lead by example," developing team-builder and team-leader core objectives.
  • Expert with big data and BI tools such as BusinessObjects Enterprise 4 and SAP HANA data source processing, including HDFS 2.1, Falcon, YARN, and Oozie.
  • Designs data structures with metrics analysis of multi-dimensional data cubes. Performance-based HDFS data processing architecture derived from innovation in performance-metrics algorithm design, supported by data research. Data mining skills include research and development of foundational requirements plans aligned with use case methodologies.
  • Architecture design values are foundational for building quality operational, technical, and executive-level solutions. Expert results derived from proven organizational, communication, presentation, and facilitation skills.
  • Architected big data, ERP conversion, and BI systems for five corporations and Confidential. Managed and architected more than 15 projects as lead architect and technical program manager, leading as a team builder and by example.
  • Experience with Hadoop HDFS and with MapReduce, YARN, and Hive implementations in the cloud.
  • Performed requirements analysis, blueprinting, estimating, and project planning for three Hadoop (Hortonworks HDFS 2.2) deployments of Apache open-source big data tools in cloud environments for prototype testing and data migration, including cluster development, master data management, and data architecture design.
  • Foundational approach to data quality processes, with data analysis profile definitions and metrics developed to ensure the best data sources for Hortonworks and big data source-to-target mapping. Technical team leader for many large-scale big data projects. Managed and architected data migration requirements, including complete data analysis, data quality assurance, and data profiling for migration to new ERP systems for large corporations.
  • Architecture Style and Approach:
  • Strategy: Use detailed data quality and data profiling processes to provide clean data sets for source-to-target processing in the ETL stage, ensuring the logical and systematic conversion of business, customer, or product requirements into total systems solutions with attention to information management and technical architecture. Performs functional analysis, timeline analysis, trade studies, and benchmark and interface definition studies to formalize customer requirements into systems architecture specifications. Processing data in layers at scale is required to migrate data effectively from source to target systems in batches that can be cleansed and profiled correctly.
  • Agile Approach: Use of Agile stories and sprints (i.e., overlapping processes) to manage development teams and delivery.
  • Best Practices: Drives technology solutions and standards through the organization; interacts and communicates effectively with internal and external customers to clarify business, operational, or technical requirements. Serves as a mentor within the company, facilitating the transfer of technical knowledge, and as an evangelist for adopting new standards, techniques, products, and methodologies. Assesses industry standards for information models and, where appropriate, incorporates them into information architecture standards. Assists in defining the governance architecture necessary to allow data stewardship to be truly owned by business partners.
  • Project Leadership: Ensures that IT projects adhere to the principles, guidelines, and standards established by Enterprise Architecture by maintaining ongoing communication with project teams, verifying that what was designed was built, and adjusting the solution architecture as appropriate throughout the project.
  • Ensures the semantics of project-level definitions and schemas are mapped to the enterprise viewpoint, taking into account permissible variation across business units, addressing gaps, security, and regulatory compliance, and mitigating risk. Serves as an advisor for "in trouble" projects that need technical help, delivering pragmatic results that contribute actual and perceived value.
  • Continuous Learning: Deliberate in the process of learning new technologies and how they apply to the business. Actively participates in industry related technical communities and standards bodies. Capable of representing the technical interests of the company in external forums.
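The data quality and profiling strategy above can be sketched as a minimal, self-contained example (the batch, table, and column names are hypothetical; production profiling would run inside the ETL tool against real source tables):

```python
def profile_rows(rows):
    """Compute simple data-quality metrics per column for a batch of
    source rows: null count, distinct count, and fill rate."""
    columns = set().union(*(r.keys() for r in rows))
    total = len(rows)
    profile = {}
    for col in columns:
        values = [r.get(col) for r in rows]
        nulls = sum(1 for v in values if v is None)
        distinct = len({v for v in values if v is not None})
        profile[col] = {
            "nulls": nulls,
            "distinct": distinct,
            "fill_rate": (total - nulls) / total,
        }
    return profile

# Hypothetical customer batch with one missing email value.
batch = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": 2, "email": None},
    {"customer_id": 3, "email": "c@example.com"},
]
print(profile_rows(batch)["email"])
```

Metrics like these, computed per batch, are what let a source-to-target migration reject or remediate a layer of data before it is loaded.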

TECHNICAL SKILLS:

Hadoop tools: HDFS 2.1: YARN, Falcon, Flume, Sqoop, ZooKeeper, Storm, Spark, HBase, HCatalog, and Hive (1 to 3 yrs)

Database Designs: Distributed data cluster schemes with Hadoop, running HBase, Cassandra, and Hive. Also developed many traditional star schema DWs with Teradata, SAP BW, Netezza, and Oracle

Business Models and Data Modeling Development: Knowledge of Unified Modeling Language (UML) and Business Process Modeling Notation (BPMN); ERWIN Data Modeler (12 yrs), UML CASE tools, Visio diagrams, data dictionaries

Other DBs Integrated: SAP HANA, Oracle 11g, IBM DB2, Netezza, SAS, SAP BW

DB Structures: OLTP, ODS, star schema, snowflake, party, ROLAP, MOLAP, OLAP, Rapid Marts

Platforms: SAP HANA Studio, SQL Server 2008 R2 and MS SQL 2012, VMware for data marts, Hadoop 2.1, Cassandra, SAP BW 7.3, Teradata, Oracle 10g/11g (Oracle RAC included)

Code skills: NoSQL, MS SQL, Java class objects, JavaScript, MS Visual Studio 2010

PROFESSIONAL EXPERIENCE:

Confidential, Phoenix AZ

Responsibilities:

  • Introduced Apache Tez for faster Hive: Apache Tez reimagines the original MapReduce to provide interactive query capabilities for Apache Hive, the most widely used data access engine for Hadoop.
  • Stream Processing with Apache Storm: Apache Storm is a distributed real-time computation system for processing fast, large streams of data. Storm adds reliable real-time data processing capabilities to HDP 2.1, helping capture new business opportunities with low-latency dashboards, security alerts, and operational enhancements.
  • Data Governance with Apache Falcon: Apache Falcon is a framework for simplifying data management and pipeline processing in Apache Hadoop. It enables users to automate the movement and processing of datasets for ingest, pipeline, disaster recovery, and data retention use cases. Instead of hard-coding complex dataset and pipeline processing logic, users can rely on Falcon for these functions.
  • Operations with Apache Ambari: HDP 2.1 includes the latest version of Apache Ambari, which now supports Apache Storm, Apache Falcon, and Apache Tez, and provides extensibility, rolling restarts, and other significant operational improvements.
  • Search with Apache Solr: Apache Solr provides high-performance indexing and sub-second search times over billions of documents, along with powerful full-text search, hit highlighting, faceted search, near real-time indexing, dynamic clustering, database integration, management of rich documents (e.g., Word, PDF), and geospatial search.
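As context for the Tez and MapReduce work above, the classic map/shuffle/reduce flow that Tez generalizes into DAGs can be sketched in plain Python (a local word-count simulation for illustration, not Hadoop API code):

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit (word, 1) pairs, as a query plan's map tasks would."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    """Shuffle: group values by key. On a cluster this is the stage
    between jobs that Tez can pipeline instead of spilling to HDFS."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: aggregate each key's grouped values."""
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data big analytics", "big data"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # {'big': 3, 'data': 2, 'analytics': 1}
```

A multi-stage Hive query is many such map/shuffle/reduce rounds chained together; Tez's gain comes from fusing them into one DAG rather than materializing each round.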

Confidential, Phoenix AZ

Responsibilities:

  • Developed ELT, reporting metrics, and analysis for SAP CRM and operational data marts; designed the BO universe and data modeling for 4 sub-projects. Developed ELT models with BODS 4.1, an ETL POC, a universe POC, and data merge (CDC) delta; defined source and target data tables and joins for aggregate data structures to base reporting for CRM and operational data applications supporting residential and commercial electrical services. Reporting, universes, and metrics developed in SAP BOE 4.1 and WebI reports. Upgraded SAP BOE 3.3 to BOE 4.1, and tested reports and new IDT universes developed for operational data.
  • Reviewed data models, integration points, and analytics to support robust mortgage loan processing systems within Confidential CORE systems. Analyzed best practices for COBIT IT audit standards, risk analysis, and application of design patterns. BI reporting performance metrics collected in a data mart from various monitoring tools and developed into MicroStrategy Analytics Desktop 4.9 dashboards. Audit scores rated in Excel and added to dashboard reports.
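The change-data-capture (CDC) delta merge mentioned above follows a simple pattern: compare keyed source rows against the target and classify each as an insert, an update, or unchanged. A minimal sketch (keys and rows are hypothetical; tools like BODS express this declaratively):

```python
def cdc_delta(source, target, key="id"):
    """Classify source rows against the target by key:
    new keys are inserts, changed rows are updates, the rest unchanged."""
    target_by_key = {row[key]: row for row in target}
    inserts, updates, unchanged = [], [], []
    for row in source:
        existing = target_by_key.get(row[key])
        if existing is None:
            inserts.append(row)
        elif existing != row:
            updates.append(row)
        else:
            unchanged.append(row)
    return inserts, updates, unchanged

# Hypothetical rows: id 1 changed status, id 2 is new.
source = [{"id": 1, "status": "open"}, {"id": 2, "status": "closed"}]
target = [{"id": 1, "status": "pending"}]
ins, upd, same = cdc_delta(source, target)
print(len(ins), len(upd), len(same))  # 1 1 0
```

Only the insert and update sets need to be loaded, which is what keeps delta loads small relative to a full refresh.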

Confidential, Juno Beach FL

SAP HANA data process development

Responsibilities:

  • Manage technical and business rules issues for Big Data Greenplum design blueprint
  • Design, develop, and implement Greenplum with Aster, Hive, Pig, MapReduce, YARN, and Oozie
  • Setup Complex groups and user rights in BO CMS Admin Console for SSO and backend
  • Hadoop and Hive database modeling and ETL schemes developed and tested.
  • Configured BI Explorer in BO 4.5, Aster Teradata, Oracle databases, Indexes, Transports
  • Data sizing metrics developed for Aster and Greenplum analysis on OLTP data.
  • Development and administration ETL cube processes with Informatica Data Exchange
  • Sourced multiple data schemes, such as SAP BW, Teradata DB, and Greenplum DB, to be processed by the Aster data appliance using ETL and YARN.
  • ETL processes designed and deployed for Production, QA and Development landscapes
  • MDM Architecture Deployment Plan and Blueprint recommendations (SAP MDM)
  • BO 4.5 Upgrade and Installation Production Cluster Windows 2008 Server
  • Configure and debug SAP BOE 4 Promotion Management tools
  • Configure and Debug BI report Publisher Tools in BO 4
  • Setup Complex groups and user rights in BO CMS Admin Console
  • Configured BI Explorer in BO 4.5 with multiple databases, Indexes, Transports
  • Setup and Debug CTS+ BW Promotion Management Tools and Override Functions
  • Size and best Practices for MDM with Oracle, Teradata DW and SAP BW

Confidential, Mechanicsburg, PA

Big Data Solutions Architect

Responsibilities:

  • Stand up new Hadoop HDFS cluster environment for Navy Logistics program.
  • Deployed HIVE, PIG and H-catalog tools for data management on Hadoop HDFS clusters
  • Developed five data clusters (2.5 TB each) with multi-source databases
  • Design and test new MapReduce processes for data analysis feeds (Sqoop and Flume) into BI reporting tools BusinessObjects 4 and Tableau.
  • SAP HANA data architect with cube processing, and SAP BOE 4.0 administration and analysis development
  • BI Report designed with SAP BO Web-Intelligence Reporting, Sap BI Dashboards 4, and SSPS.
  • Dashboards 4 to Universe IDT integration. Data Services Configuration and installation
  • BusinessObjects 4 Installation and Configuration, Administration
  • Install and implement SAP MDM Tools and configure with Hadoop and Netezza DW.
  • Design Architecture plan for MDM: Multi-Sources DBS as listed:
  • Sqoop data import to HDFS: Data sources integrated from Oracle, Netezza, Teradata and feeding Hadoop clusters developed on Hive DW Structure.
  • Developed HIVE and Map-reduce tools to design and manage HDFS data blocks and data distribution methods.
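The Sqoop imports above follow one pattern: read a source table in key ranges and land each range as a delimited part file on HDFS. A local sketch of that pattern, with sqlite3 and a temp directory standing in for the real Oracle/Netezza sources and HDFS (table and column names are hypothetical):

```python
import os
import sqlite3
import tempfile

def import_table(conn, table, out_dir, num_splits=2, key="id"):
    """Split the table on its key column and write one delimited part
    file per split, mimicking the behaviour of Sqoop's --split-by."""
    lo, hi = conn.execute(
        f"SELECT MIN({key}), MAX({key}) FROM {table}").fetchone()
    step = max(1, (hi - lo + 1) // num_splits)
    paths = []
    for i in range(num_splits):
        start = lo + i * step
        end = hi + 1 if i == num_splits - 1 else start + step
        rows = conn.execute(
            f"SELECT * FROM {table} WHERE {key} >= ? AND {key} < ?",
            (start, end)).fetchall()
        path = os.path.join(out_dir, f"part-m-{i:05d}")
        with open(path, "w") as f:
            f.writelines("|".join(map(str, r)) + "\n" for r in rows)
        paths.append(path)
    return paths

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, item TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, "bolt"), (2, "nut"), (3, "washer"), (4, "screw")])
out_dir = tempfile.mkdtemp()
parts = import_table(conn, "orders", out_dir)
print([os.path.basename(p) for p in parts])
```

Each split runs as an independent map task in real Sqoop, and the resulting part files can be exposed to Hive as an external table over the import directory.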

Confidential, Boise ID

Sr. SAP Business Objects DW/BI Architect- Data Archive Teradata

Responsibilities:

  • Designed the 2012 SAP MDM and SAP BI strategy and architecture to move the client to big data BI analysis. Used Data Services to move data into Hive and Hadoop clusters.
  • Deployed SAP BI /Data Warehouse with Data archives and DW data models for BI DB design.
  • Developed Informatica PowerCenter and Data Exchange ETL architecture for data loads to the Teradata DW and an Oracle DW prototype, and for analytics DBs on distributed HDFS DB models.
  • Data Modeling for both Teradata and new DW
  • Designed new BI databases with Erwin v9: star schema, 3rd normal form
  • MDM POC for Hadoop architecture design and implement
  • Business Objects 4.0 Enterprise Web-Intelligence Reports developed for POC
  • Business Objects Data Services XI 4.2 admin, installation, development
  • Design BO analytics and KPIs, and BOE XI 4 Universes, facts, and dimensions tables:
  • Design BOE Universe methods to increase reporting speed
  • Design Analytics Database Models in ERWIN-OLAP / Star Scheme database Models for Analytics with Teradata methods
  • Installed and configured BOE XI 4 Premium, Crystal 2011, Dashboards 4, Data Federator, Explorer, and BOE XI 4 Mobile

Confidential, Jackson MI

Sr. Data Services Lead

Responsibilities:

  • Blueprint Roadmap for SAP MDM, BI and BW architecture
  • SAP HANA deployment and administration
  • Data Mapping and Blueprinting steps team effort (Team leader for 10 ETL developers)
  • Core project duties: designed SAP Data Services ETL processes and served as team lead to migrate the complete JBA ERP data source to SAP ECC 7 (with SAP All-in-One solution deployment of IDocs in Data Services 3.2).
  • Direct report to PMO team for corporate data migration (Project Falcon)
  • Content BI reporting: SAP SCM, SRM, Vertex, MM
  • Managed team of 6 DS developers
  • Report to director and PMO (Project Management Operations Board)
  • Manage technical and business rules issues
  • Architect JDA Legacy ERP to Sap ECC cut over and deployment processes
  • Business Objects 4.0 installed; sample reports developed in WebI (five reports)
  • Data Services 4.0 Platform installed and developed 4 Web Intelligence reports
  • Design Data Migration processes with SAP Data Services XI 4.0 tools and SAP BOE XI 4.0 Metadata Analysis, SAP BO Rapid Marts Sales, AR, MM
  • Develop MDM Plan for legacy Data and move cleansed data into new MDM structure
  • Data Cleansing Processes developed with Data Services QA Module
  • Data Mapping and Blueprinting steps for SAP IDOC for SCM, SRM, for SAP ECC 7
  • Data Services IDOCs input of legacy data to SAP ECC 6 and BW 7.3

Confidential, Glendale AZ

Data Metrics Analysis

Responsibilities:

  • Design of SAP BW InfoCubes, InfoObjects, and InfoSets, with InfoObject optimization in BW:
  • BI reporting Content: SAP DSOs, BW cube Aggregates, and Design Loading of Data Delta Points
  • Optimized SAP BW Info cubes and DSO for SAP Hana and SAP BusinessObjects 4
  • BWA appliance configuration with BO Explorer, migrated to SAP HANA landscape
  • Optimized SAP BOE Reports WEBI with BW info-cubes and SAP HANA appliance
  • Design BOE reporting queries, data aggregates and BEX Queries
  • Design BO analytics and KPIs, developed and modified for performance data processing
  • Design BOE Universe methods to increase reporting speed

Confidential, Tempe, AZ

SAP MDM Architect

Responsibilities:

  • SAP HANA deployment and configuration, POC for dimensional data analysis (Cubes)
  • MDM architecture for DBS sources: Teradata, Oracle 10g, MS SQL 2008 for HANA processing
  • BI Content Reporting SAP SCM, SRM, MM, AP/AR, FICO with Data Governance standards
  • SAP Basis admin support architect for security roles and transport configuration; new ECC upgrade from 5 to 6.
  • Other ETL Tools used Informatica Power Center 9 and SQL SSIS
  • BusinessObjects 4.0 Enterprise Web-Intelligence Reports developed for POC
  • BusinessObjects Data Services XI 3.3 (ETL tools) admin, installation, development
  • Data Services XI 3.2 admin (expert); LSMW (3 months)
  • SAP BW with BEX to Business Objects XI integration and architecture
  • BOE XI 3.3 SP2 and SAP BW Integration Kit with BEx info-objects and SAP cube data.
  • BOE XI 3.3 CMS on MS SQL 2008, BOE XI Reporting and Universe Design
  • Global BI architecture and design details confidential
  • Xcelsius 3.3 installation and reporting with BW BEX queries integration
  • Xcelsius/ Dashboards 4.0 Beta test with Universe, Install LCM v1
  • Design criteria for BusinessObjects Explorer Blade and BWA Blade Deployment Architecture
  • Integrated Data Services XI data migration with SAP ECC via the IDoc interface
  • Evaluate SAP BOE XI 3.2 Upgrades and Hardware to SAP BusinessObjects 4.0
  • Evaluate BOE LCM tool to move BO reports
  • Develop new BO reports in BO 4.0 IDT tools
  • Universe design data elements, metrics and hierarchies BI content reporting SAP
  • Create BOE XI / Web-Intelligence Reporting Development
  • BI Requirements score cards, detailed functional and technical documents
  • SAP BW data modeling for InfoObjects, InfoCubes, InfoSets, and InfoSpaces

Confidential, Atlanta GA

Sr. SAP BI and BW Data Architect

Responsibilities:

  • BOE XI 3.1 SP2 and SAP BW Integration Kit with BEx info-objects and SAP cube data. BI reporting content: SAP SCM/SRM/MM/ECC/FICO/CRM
  • BOE XI 3.1 CMS cluster with Oracle 11g RAC.
  • Global BI architecture and design details confidential
  • Teradata Data Source for reporting trends
  • Informatica PowerCenter 8 used to develop ETL processes for data migrations globally.
  • Xcelsius installation and reporting with BW BEX queries integration
  • SAP BusinessObjects DS 3.2 lead; implemented and designed ETL
  • Teradata 9 Data Migrations to Oracle and SAP BW
  • Informatica Power Center ETL Architect for BI data marts
  • SAP BO Web-Intelligence Implemented, SAP SSO implemented
  • SAP BO Explorer Blade Implemented and Architected Design Criteria
  • SAP BW data modeling for InfoObjects, InfoCubes, InfoSets, and InfoSpaces

Confidential

Business Objects Team Leader and Sr. BI Architect

Responsibilities:

  • QA process to fix, test, and deploy the latest Voyager 3.2 build
  • Architected many new BOE XI 3.1 systems and designed Universes
  • BusinessObjects Data Services (ETL) used for data migration schemes
  • BusinessObjects Upgrades from XIr2 to XI 3.2
  • BusinessObjects Universe Design and Report Design/ Development
  • Crystal Reports 2008 Development, Xcelsius 2008 Reports Development
  • SAP BusinessObjects BI 3.2 (like BOE XI 3.1) implemented
  • SAP Business Objects Data Services 3.2 Implemented
  • Integration to BW Data Sources ( Info sets)
  • Informatica Power Center 8 Implemented

Confidential - Phoenix, AZ

ISMS Server 2008 Architect, Admin, Data Admin, Systems Architect

Responsibilities:

  • Data Modeling Architect with ERWIN 7.2 Data Modeler on Teradata DW
  • Oracle 10g database migrated to Teradata data warehouse.
  • Developed Informatica 8 ETL processes to move data from Oracle 10g to the Teradata DW. BI content reporting: SAP SRM/MM/CRM/Sales
  • WEB portal logistics management applications.

Confidential, Columbus, Ohio

Sr. SOA and BI Architect and Developer

Responsibilities:

  • Architect Storage Capacity Management Enterprise Solution
  • ETL Processes Designed with Informatica Power Center
  • Collected statistical data from the Teradata DW and FalconStor/EMC DB clusters
  • Informatica ETL processes developed for multiple databases.
  • Reporting on 66 data centers and 150,000 VTL, EMC Cluster DBS
  • Business Objects reporting for forecasting database, utilization with data mapping
  • Teradata ETL and data warehouse models
