
Sr. SAP Platinum Consultant Resume


Wichita, KS

SUMMARY

  • SAP MDG (Master Data Governance) and MDM (Master Data Management), MDG 6.1 - 8.0: profiling, enrichment, harmonization, data cleansing, de-duplication, and 'single version of the truth', including dark data, SAP Information Steward, BODS (Data Services), and SAP SLO (System Landscape Optimization). Third-party tools such as IntelliCorp LiveModel and LiveCompare help expedite profiling. MDG 9.0* (SAP courseware: MDG100).
  • SAP ERP ECC ILM (Information Lifecycle Management) - data archiving enhanced beyond standard 'classical' SAP data archiving. The ILM enhancements to the ADK (Archive Development Kit) are superior: 1. ILM enablement of ADK archive objects and framework; 2. RM (Records Management) - defining business rules and legal holds; 3. DS (Decommissioning System) - for legacy data that is now part of the current or future infrastructure due to acquisitions, mergers, and divestitures and must be migrated, harmonized, cleansed, profiled, checked for consistency, and optimized via SAP SLO (System Landscape Optimization).
  • SAP DVM (Data Volume Management) - data archiving and SLO (System Landscape Optimization) integrated as part of overall data management in SAP ERP ECC, BW, CRM, SRM, and XI/PI (classical archiving) using standard SAP-delivered ADK (Archive Development Kit) programs, and for BW a DAP (Data Archiving Process) per BW object. Incorporates 'massive archiving' based on the 'Process and Time'™ mark/patent to reduce traditional row-based SAP ERP databases and non-SAP databases ahead of HANA migration. Reduction-ratio models average from a low of 20% up to 60+% overall database reduction per client; AI model output is converted to PowerPoint with 1 - 3 reduction options based on business requirements of data residency vs. retention, impact, timeline, and other factors that together determine the overall reduction ratio.
  • SAP AI (Artificial Intelligence) - algorithms including ML, DL, NN, and NLP for designing a deterministic set of database reduction ratios based on AI compression-algorithm models, optimizing the overall reduction pre/post HANA migration for both row-based and in-memory databases. Measurements range from 20% up to 60+% reduction of traditional row-based and columnar in-memory SAP ERP databases as well as non-SAP databases, using the 'Process and Time'™ mark/patent for structured, unstructured, semi-structured, and machine data in SQL/NoSQL databases.
  • EU GDPR (General Data Protection Regulation) 679 - expertise across all 99 articles. Use-case creation, design, and integration using SAP ILM plus the EU GDPR 'Block' functionality for 'sensitive' data. Performed pre/post DPIA (Data Protection Impact Assessment) 'risk assessments' on 6 successful GDPR integration/migration go-lives ahead of the May 25, 2018 GDPR compliance mandate, based on Article 35 among others, establishing the level of 'sensitivity' or 'risk association vs. cost/value of data' and then architecting the end-to-end integration/migration.
  • Blockchain - blockchain and cryptocurrency using Bitcoin, Emercoin, and smart contracts (Ethereum); block-coded fulfillment development and migration/integration.
  • HSM (Hardware Security Module) - Thales and Gemalto key-management integration using TDE (Transparent Data Encryption) for Oracle / MS SQL databases; integration and migration using HSM crypto key-store hardware appliances that hold the encrypted master key in a key vault outside the encrypted tablespaces, per the TDE protocol and standards i.e. FIPS 140-2/3, NIST, EU PII, GDPR 679, PCI SSC, and HIPAA, for data at rest and in motion using SSL/TLS, etc.

TECHNICAL SKILLS

Operating Systems: SUSE Linux (SLES), Oracle Linux/Solaris, WIN 2012 R2, REDHAT (RHEL) LINUX, HP-UX, CentOS, OS X, Unix SCO, WIN 10

Databases: ORACLE 7 - 12.2.0.1, ORACLE TDE (Transparent Data Encryption) 10gR1 - 12cR2 & AES-NI, MySQL 5.7.18, MICROSOFT SQL SERVER 2016, HANA SP1 - SP12, CASSANDRA 3.10, SAP SQL IQ, TERADATA, HBASE 1.2.5, HIVE 2.1.1, DB2 11.1, FileMaker 15.0.31

Data Modeling/ ETL: SAP Data Services ETL, SAP HANA SLT (SAP Landscape Transformation), SAP BO Predictive Analytics ETL 3.1.1, Informatica PowerCenter ETL, Erwin 2.5 - 5.x (IDEF1X), Rumbaugh (OMT), Booch

PM Deployment Methodologies: SAP ASAP (Accelerated SAP) Methodology, SAP RDS (Rapid Deployment Solution), Agile and Waterfall project management methodologies, IBM ALM (Application Lifecycle Management) Framework - deployment, S/4HANA - SAP 'Activate' time-to-value methodology

HADOOP Distributors: HDP (Hortonworks Data Platform), Cloudera & MapR

PROFESSIONAL EXPERIENCE

Confidential, Wichita, KS

Sr. SAP Platinum Consultant

Responsibilities:

  • Designed and architected a robust set of KPIs using 'emerging and disruptive' technologies, i.e. AI and predictive analytics models (classification, regression, and Boolean algorithms) in languages such as Python, R, and C/C++; extracted the top data structures along with all BASIS 'parametrization' down to codepage/Unicode from the SAP hardware and software to design a set of metrics and measurements for a 'massive' reduction of the database 'footprint' of up to 62.5% from the current 12.5 TB database, leaving a database of ~5.2 - 5.5 TB (reduction-ratio arithmetic is sketched after this list).
  • The reduction added value to the Oracle 12c database migration: SAP SUM/DMO cutover downtime was reduced by 50%, with further savings in SAP in-memory licensing, hardware appliances, data-center infrastructure support and space, and overall project spend thanks to the efficiencies of the 'massive reduction scalability' framework.
  • Created 2 use cases. The first covered enhancements and integration of SAP ILM with the existing OpenText Archive Server for rules-based definition of retention and statutory legal-hold compliance. The second covered the same SAP ILM product with the 'Block' functionality added in 2017 to support EU GDPR (General Data Protection Regulation) 679 and its 99 supporting articles. Created the DPIA (Data Protection Impact Assessment) whose scope, approach, and strategy were integrated into the cutover production migration go-live prior to the EU GDPR compliance date of May 25, 2018.
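A minimal, illustrative sketch of the reduction-ratio arithmetic behind the footprint metrics above (the percentages below are example options, not the proprietary 'Process and Time' algorithm):

    # Illustrative reduction-ratio arithmetic for a 12.5 TB source database.
    def reduced_size_tb(current_tb: float, reduction_pct: float) -> float:
        """Projected database size after applying a reduction percentage."""
        return current_tb * (1 - reduction_pct / 100.0)

    current_tb = 12.5                       # current database size quoted above
    for pct in (20, 40, 60):                # example reduction options presented to a client
        print(f"{pct:>3}% reduction -> {reduced_size_tb(current_tb, pct):.2f} TB remaining")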

Confidential

Sr. SAP & Non-SAP Platinum Integrator

Responsibilities:

  • Evaluated and profiled existing infrastructure intrusion threats and developed AI (Artificial Intelligence) / ML (Machine Learning) models to illustrate the various methods of 'penetration' in support of customers experiencing a 'breach' or 'intrusion', performing infrastructure and data profiling across areas such as ATP, DDoS, NAC, VPN, DNS, and threat intelligence.
  • Reviewed the types of data stores against methods of compliance, regulations, and policies internal to the client vs. industry standards in order to design a robust set of cryptographic stewardship compliance protocols, i.e. FIPS 140-2/3, NIST, and EU PII, plus end-to-end integration on (6) EU GDPR 679 engagements: establishing an accountability and governance framework, creating key roles within the client's organization, and building the project plan leading into the DPIA (Data Protection Impact Assessment) of scope, approach, strategy, integration, and migration through to successful go-lives.
  • Reviewed PCI SSC, HIPAA, and NSA encryption methods and key-declaration choices for end-to-end security of the encrypted 'data having the highest valuation', then integrated the industry's top V&T (Vulnerability & Threat) software, recommending enforcement of proper compliance based on 'value of data vs. risk association' factors that determined the scope, approach, strategy, and measures integrated to avert future vulnerabilities.
  • As a developer of encryption algorithms supporting various customers in the optimization space using the top HSMs (i.e. Thales and Gemalto) with the AES-NI crypto engine (AES-256) and newer CPU chipsets (Intel Haswell EP/EX, Ivy Bridge, Broadwell EP/EX E7): developed specialized C++ crypto code whose performance depends on the cipher used and on private/public key management, driven by the database level and the type of encryption (column and/or tablespace), as sketched below. Completed integration of Thales e-Security and Gemalto SafeNet/Luna HSM appliances with Vormetric installation, integration, and migration through successful go-lives, during early 2017, classified industry-wide as the year of malware/ransomware (i.e. WannaCry via Bitcoin, May 12, affecting 150 countries), where I was instrumental in reverse engineering and profiling and served as lead developer of the 2nd kill switch.
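A minimal, illustrative sketch of column-level AES-256 encryption with a locally generated data key (hypothetical field values; not the actual Thales/Gemalto or TDE implementation, where the key would be wrapped and held in the HSM key vault):

    # Illustrative AES-256-GCM column encryption; in production the data-encryption key
    # would be wrapped by a master key stored in an HSM, not generated locally like this.
    from os import urandom
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    data_key = AESGCM.generate_key(bit_length=256)   # stand-in for an HSM-wrapped DEK
    aesgcm = AESGCM(data_key)

    def encrypt_column(value: str) -> bytes:
        nonce = urandom(12)                          # unique nonce per value
        return nonce + aesgcm.encrypt(nonce, value.encode(), None)

    def decrypt_column(blob: bytes) -> str:
        nonce, ciphertext = blob[:12], blob[12:]
        return aesgcm.decrypt(nonce, ciphertext, None).decode()

    token = encrypt_column("4111-1111-1111-1111")    # hypothetical sensitive column value
    assert decrypt_column(token) == "4111-1111-1111-1111"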

Confidential, WI

Sr. SAP Solution Architect & Integration Platinum Consultant

Responsibilities:

  • As Sr. SAP Solution Architect and Integrator, re-architected Kohler's SAP data management strategy, specifically its scope and approach, creating a new DVM (Data Volume Management) set of robust, innovative standards and guidelines to reduce the overall database 'footprint' by as much as 20 - 60+%, based on AI algorithm metrics/measurements created during initial discovery of Kohler's 20-year-old production environment, where archiving first went live in 1996.
  • Re-designed the current data management scope in light of yearly database growth of 2.4+ TB and over 100 million documents left in 'open' status since the original 1996 go-live. Due to mergers, acquisitions, consolidation of companies, and explosive growth, the existing data management model had no real reduction impact or traction, so I redesigned the scope, approach, and strategy with metrics/measurements that would get Kohler below the 50% database baseline in preparation for the upcoming 'HANA readiness' migration. Created a set of 3 options developed via the 'Process and Time' AI algorithm, a series of metrics/measurements that improved and introduced new, innovative ways to reduce data, create data consistency, remove duplication, re-engineer process flows, enable complete object-based automation, add 35 - 50 new techno-functional objects, and reduce residence while increasing retention, following sign-off by Kohler leadership, business leads, and end users. Managed retention via ILM precepts along with NLS SQL IQ, HADOOP, and PBS-CL options, with architecture and logic integration in preparation for future 'HANA readiness'.
  • Introduced KByte-level volume metrics via integration; established GB/TB size-reduction percentages per FYR vs. CC vs. PP vs. open/closed vs. OIM vs. incomplete vs. in-error vs. 'phantom' data discovery; identified over 74 Z-table developments; performed ADK/REO statistical session purging; built a forensic business-process-flow checker process and algorithm; and realigned residence vs. retention vs. business rules vs. policy compliance vs. identification of 'null/phantom' data.
  • Created / architected MDM/MDG data management governance around the client's 'rules' and 'policies' for master data (i.e. material, financial, business partner), centered on a. de-duplication, b. data veracity, c. data consistencies/inconsistencies, and d. data volume as the primary drivers, building the foundational paradigm of central governance as a current/future project to establish a 'single version of the truth' with MDM/MDG techniques prior to future 'HANA readiness' (a de-duplication sketch follows this list).
  • Created / developed a retirement strategy for the DART 2.7 tax extraction tool, replacing it with TJC Software RJCAEC (Audit Extraction Cockpit) and TJCFRGL (France Tax Audit Tool) for international audit compliance on a single production instance.
  • Upgraded the ADK (Archive Development Kit), ARC-IS (Archive Information System), and AE (archive-enabled) SAP transaction codes to read data from the ADK via OSS and custom ABAP/4 development, along with PBS-CL release APIs (C++ binary libraries and connectors); designed a new EOL ('End of Life') model.
  • MDM / MDG - incorporated industry practices for master data management and governance around principles of data consistency, data cleanup, data de-duplication, and forensic data review of process flows for 'open business documents and open-item-managed G/L accounts', combined with data volume management per industry best practices; built the OIM object integration model for the material, customer, vendor, and business partner domains.
  • Participated in, led, and presented workshops on data-aging techniques for BW on HANA and classical archiving for CRM 7.0 and ECC 6.05.
  • Led design discussions on an integration strategy using PBS-CL (Content Link 'light' ECM) in parallel with several options in preparation for the future HANA migration, i.e. introduced options using NLS SQL IQ / SAP IQ, HADOOP via MapReduce, YARN, and HDFS (Hadoop Distributed File System) from the top 3 HADOOP distributors, i.e. HDP (Hortonworks Data Platform), Cloudera, and MapR, along with PBS NLS SQL (columnar) options.
  • Re-modified / recommended a global RRS (Records Retention Schedule) template and SLA (Service Level Agreement) for residence vs. retention vs. destruction-based data policy, replacing the antiquated RRS for SAP statutory business-document compliance and business requirements.
  • Re-designed the PBS-CL (Content Link) production edition for NLS (Nearline Storage), covering the 'life' of an SAP business document at the retention level and its 'EOL' beyond total retention, with documents classified as either a. prune/purge/destroy or b. re-store as Tier 3 'life documents' to be kept forever on 3rd-tier commodity servers that are SAP ArchiveLink certified per the SAP PAM (Product Availability Matrix), in case a view is requested after the retention 'life cycle'. This requirement was part of the overall design and integration, so the 'cradle to grave' concept applies at this client for the 'forever' document classification.
  • PMO/PM - project managed, reporting to Kohler leadership via dashboard updates, daily scrums, and weekly status; managed a team of 10+.
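A minimal, illustrative sketch of the kind of master-data de-duplication check described above (hypothetical sample records and threshold; not the MDG/IntelliCorp implementation itself):

    # Illustrative fuzzy de-duplication of material master descriptions.
    from difflib import SequenceMatcher
    from itertools import combinations

    materials = [
        ("MAT-001", "Hex bolt M8 x 40 zinc"),
        ("MAT-117", "Hex Bolt M8x40, zinc plated"),
        ("MAT-342", "Gasket, rubber, 50 mm"),
    ]

    def similarity(a: str, b: str) -> float:
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    # Flag candidate duplicates above an illustrative similarity threshold for steward review.
    for (id1, desc1), (id2, desc2) in combinations(materials, 2):
        score = similarity(desc1, desc2)
        if score >= 0.7:
            print(f"Possible duplicate: {id1} ~ {id2} (similarity {score:.2f})")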

Confidential, FL

Sr. SAP Solution Architect & Integration Platinum Consultant

Responsibilities:

  • As Sr. SAP Solution Architect and Integrator, created the DVM (Data Volume Management) strategy covering scope and approach, with innovative techniques and new SAP data management strategies for both 'cold-store' (aged, rarely viewed) and 'warm-store' (seldom viewed) data, integrated with EMC InfoArchive solutions. EMC InfoArchive was selected because it is compliant with SAP HANA and S/4HANA throughout ALM (Application Lifecycle Management); in view of future HANA 'readiness', created and designed a sustainable migration/integration path forward in anticipation of the S/4HANA in-memory database migration and architecture.
  • Created a robust set of data management industry best practices for master data consistency and transactional data volume growth, aging, and destruction, based on a tiered ILM (Information Lifecycle Management) approach for SAP ECC 6.05 structured and unstructured data, with full retention management and decommissioning of data and/or its destruction via EOL (End of Life) policies defined, created, and signed off by TBC leadership and the business using the HP-QC UAT toolset.
  • Designed metrics and measurements using AI ML (Machine Learning) models and algorithms based on data volume vs. growth projection vs. codepage/Unicode vs. KByte size of the header record, among many other KPIs, to build a robust data management strategy across data structures and types, i.e. data veracity vs. de-duplication vs. data quality, data governance, and data volume of structured vs. unstructured data for transactional, control, and master data constructs. FI (Financials) at 3.4 TB and SD (Sales & Distribution) at 1.925 TB were growing at the fastest rate; together with the other 8 core functional areas, the metric set produced a 58.9% overall reduction, from a 10.5 TB baseline to 4.8 TB, ahead of the future HANA / S/4HANA migration, based on the 'Process and Time' AI algorithm metrics created using custom C++ libraries, R, and Python models converted into XLS.* output. 3 - 5 options were provided to TBC leadership to choose from, depending on how aggressive they wanted to be in their overall data volume management prior to the S/4HANA migration; as consultant I recommended the most aggressive option to achieve the desired results, given new/additional plants going live and new business expansion.
  • Master data KPIs were established using the IntelliCorp LiveModel / LiveCompare tools to answer the fundamental question of high-level governance and stewardship of the rules, business requirements, and master data quality to be included as part of the overall reduction of the SAP ECC 6.05 database 'footprint'. The AI algorithm metrics/measurements had to include the master data domains, i.e. customer, material, and vendor, since a client's focus in a data management project is often centered on volume rather than on data quality. My job was to establish a level of rules around master data, for example to understand why this customer had over 700,000 materials older than 7 years that had never been modified, which I term 'stale data'. The architecture therefore included all data structure types in SAP ECC 6.05 (transactional and control as well as master data) in attaining the 'massive' baseline reduction percentage from the designed set of AI algorithm metrics/measurements.
  • Transactional, control, and master data in SAP ECC 6.05 (OLTP), along with imaged (i.e. BLOB, binary) attachments such as invoices attached to financial accounting documents, were included in the overall scope, strategy, and architecture to establish feasibility of sizing actual vs. projected growth of database size and volume.
  • AI algorithms were developed around several measurement paradigms and definitions captured at the a. DDIC table level, b. database record size (KBytes), c. Unicode, d. codepage, e. open SAP documents per process flow, f. OIM (open-item-managed) G/L accounts, g. data duplication factor(s), and h. dark data, defined as 'no-value data' that is either 'orphan' (having no owner or steward) and/or 'stale' (not modified for at least 7 years). Based on the profiling and enrichment produced via the IntelliCorp LiveModel and LiveCompare tools, I also found 'unrecognized records', defined as records in an 'error' or 'incomplete' state; these could stem from ETL loads, consolidation of acquisitions, mergers, divestitures, or other factors that introduced such dark-data records. Thus, all data types reviewed and considered 'live' in the SAP ECC 6.05 production system were included in the final set of AI algorithm measurements proposed to the client (a classification sketch follows this list).
  • Upgraded the ADK (Archive Development Kit), ARC-IS (Archive Information System), and AE (archive-enabled) SAP transaction codes to read data from the ADK via OSS; custom-coded 15 FICO reports in ABAP/4, along with custom InfoStructures across every core functional module.
  • Designed an ILM tiered architecture for the EOL ('End of Life') model applied once the retention 'life cycle' is reached. EOL documents are business documents that must remain for the life of the established legal 'entity'; many laws per industry, sector, and client requirement also dictate what constitutes 'life' documents that are excluded from destruction and/or purging.
  • PMO/PM - project managed, reporting to TBC leadership via dashboard updates, daily scrums, and weekly status; team of 30+.
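A minimal, illustrative sketch of the dark-data classification rules described above (hypothetical record structure and dates; not the actual IntelliCorp/AI implementation):

    # Illustrative dark-data classification: 'orphan' = no owner/steward,
    # 'stale' = not modified for at least 7 years, 'unrecognized' = error/incomplete state.
    from datetime import date

    STALE_YEARS = 7

    def classify(record: dict, today: date = date(2018, 1, 1)) -> list:
        """Return the dark-data labels that apply to a record."""
        labels = []
        if not record.get("owner"):
            labels.append("orphan")
        age_years = (today - record["last_modified"]).days / 365.25
        if age_years >= STALE_YEARS:
            labels.append("stale")
        if record.get("status") in ("error", "incomplete"):
            labels.append("unrecognized")
        return labels

    sample = {"id": "MAT-0042", "owner": None,
              "last_modified": date(2006, 5, 1), "status": "complete"}
    print(sample["id"], classify(sample))   # -> MAT-0042 ['orphan', 'stale']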

Confidential, San Diego CA

Sr. SAP Solution Architect & Integrator Platinum Consultant & Project Manager

Responsibilities:

  • Responsible as Sr. SAP Solution Architect, Integrator, and Project Manager for all 4 projects at Confidential, serving in a variety of roles and applying a diverse set of skillsets as outlined in each project.
  • On the 1st project I served mostly as Sr. Solution Architect and PMO/PM project manager/owner. On the 2nd project I served as lead solution architect and integrator for BW 7.4 on HANA SP12, performing NLS SQL IQ, DT (Dynamic Tiering), and displacement of 'aged' warm/cold data from the 'live' HANA database. On the 3rd project I was Sr. Solution Architect and integrator and project-managed the creation of a HADOOP multi-node cluster on WIN 2012 R2 using SPARK vs. HIVE on HDP (Hortonworks Data Platform) 2.2.
  • On the 4th and final project I served as Sr. Solution Architect and integrator and project-managed a multi-node cluster on RHEL (Red Hat Enterprise Linux) 6.6 using the SAP Spark Controller / SAP VORA 1.3 with the VORA modeler and designed predictive analytics models, along with HANA SDI, SDQ, and SLT. See details of each of the 4 projects below:

Confidential, San Diego CA

Sr. SAP Solution Architect & Integrator Platinum Consultant

Responsibilities:

  • As Sr. SAP Solution Architect and Integrator, led the BW on HANA 'data aging / archiving' project using NLS SQL IQ (cold-store) and DT (Dynamic Tiering, warm-store) displacement options to reduce the overall 'footprint' of the live HANA in-memory database by as much as 35.8%.
  • Hardware selection for the BW NLS SQL IQ optimization / data-aging project; setup, installation, configuration, and integration of 3 commodity servers' connectors from HANA SP09 to the SQL IQ (columnar) database for storing 'aged' (cold-store) data.
  • Development and execution of a. assessment/deep dive, b. project plan, c. blueprinting, d. realization phase of hardware acquisition for NLS (Nearline Storage) commodity servers (performed installation, configuration, and integration), e. EIM - SDQ for data profiling, cleansing, enrichment, de-duplication, and record match typing, f. creation and development of a prototype BW DAP (Data Archiving Process) with full integration to SAP HANA, g. UAT via HP-QC for business-user test script/case sign-off on reports, and h. pre/post cutover steps for the BW 7.4 go-live with NLS SQL IQ (nearline solution) along with DT (Dynamic Tiering), enabling full automation using process chains.
  • Architected the BW DAP (Data Archiving Process) across various InfoProviders, i.e. DSOs and cubes, demonstrating 'time slice' archiving with connectivity to NLS SQL IQ (Nearline Storage, considered cold-store), while BW objects that still need to be viewed 'as needed/requested' were placed via Dynamic Tiering configuration techniques into warm-store secondary disk storage in a SQL IQ database. Warm data can be 'pulled' back into the 'live' HANA (hot-store) in-memory database if 'activated' for viewing and processing.
  • Developed Dynamic Tiering using the data-aging concept of 'displacement', i.e. removal of data from the live 'hot' HANA database and placement in secondary storage via Dynamic Tiering displacement configuration options.
  • NLS data strategy and usage: since the OLAP/reporting layer can be controlled by the 'Near-Line Storage' settings found in BEx query properties, MultiProvider properties, and cube/DSO properties, I recommended activating NLS (nearline) viewing at the MP (MultiProvider) level by enabling the InfoProvider 'Nearline access switch on' setting, instead of at the BEx report or DSO/cube level.
  • Recommended / developed configuration sizing for the NLS (Nearline Storage) commodity server specification, with end-to-end integration options to reduce what is live in the HANA in-memory (hot-store) database, making sure Sempra remains well below the SAP-recommended 50% threshold of 'main memory' to avoid memory overflow issues and warnings (see the sizing sketch after this list).
  • Validated all BW DAP (Data Archiving Process) event steps, i.e. 10 - Initiate, 40 - Write, 50 - Verification Trigger, and 60 - Delete, including the final 'Restore' step in case the Sempra BW project team ever needs BW objects already archived in NLS SQL IQ brought back into the 'live' HANA in-memory database. This reload process was also demonstrated and integrated as part of the rollout at Sempra via the BW DAP framework under Archiving Tab > Status > Reload, once deletion has completed successfully.
  • Created technical and functional SBS (step-by-step) documentation on a. NLS SQL IQ installation, configuration, and integration, b. DT (Dynamic Tiering), c. displacement to secondary storage, d. creation/development of the TCSLA (Temperature Control Service Level Agreement), e. the NLS (nearline) MultiProvider strategy on defined InfoProviders at the MP level, f. lessons learned, and g. next steps for Phase 2 - new innovations/processes, i.e. partitioning BW objects, etc.
  • Project managed, reporting to Sempra sponsors/directors; ran daily scrums and handled budget, resourcing, and leadership, while also serving as lead solution architect and integrator.
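A minimal, illustrative sketch of the 50%-of-main-memory sizing check referenced above (hypothetical appliance figures; actual sizing came from the SAP sizing exercise):

    # Illustrative check: keep hot data below 50% of HANA main memory.
    MAIN_MEMORY_GB = 1024          # hypothetical appliance main memory
    THRESHOLD = 0.50               # SAP-recommended ceiling for data held in memory

    def headroom_gb(hot_data_gb: float) -> float:
        """GB that can still be loaded before crossing the threshold (negative = over)."""
        return MAIN_MEMORY_GB * THRESHOLD - hot_data_gb

    for hot_gb in (380, 512, 650):
        status = "OK" if headroom_gb(hot_gb) >= 0 else "OVER THRESHOLD"
        print(f"hot data {hot_gb} GB -> headroom {headroom_gb(hot_gb):+.0f} GB ({status})")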

Confidential, San Diego CA

Sr. SAP Solution Architect & Integrator / Project Manager

Responsibilities:

  • As Sr. Solution Architect and Integrator, installed, configured, and integrated an HDP (Hortonworks Data Platform) HADOOP 5-node multi-cluster architecture, configured via HDP setup 2.0.6.0 across 3 layers. First (core layer): HDFS (Hadoop Distributed File System), YARN, and MapReduce 2 (MR2). Second (essential HADOOP layer): Apache Pig, Hive, HCatalog, HBase, ZooKeeper, and WebHCat (Templeton). Third (supporting layer): Apache Oozie, Sqoop, Flume, Mahout, Knox, Storm, Phoenix, Tez, Falcon, and Ranger.
  • HADOOP Distribution Setup / Hardware Configuration
  • Software installation, configuration, and integration - environment variables: Python 2.7.11, Java JDK 1.7.0_79, Microsoft .NET Framework 4.5.1 & Visual C++ 2010, hadoop-2.2.0.2.0.6.0-0009, flume-1.4.0.2, hbase-0.96.0.2, \hdpdata\hdfs, hive-0.12.0.2, jdk1.7.0_79, mahout-0.8.0.2, oozie-4.0.0.2, pig-0.12.0, sqoop-1.4.4.2, zookeeper-3.4.5.2
  • Hardware setup/configuration of 5 VM (virtual machine) HADOOP servers: installed Hortonworks/Apache version 2.0.6.0/2 on each, on the WIN 2012 R2 OS; 5x data nodes, CPU E5-2693 v3 2.3 GHz, 64-bit OS, x64 processor, 16 cores / 64 virtual, 64 GB / 128 GB RAM, 1 TB each.
  • Environment setup on HANA SP09/12 hardware - HANA SP09 SBX (sandbox / copy of Dev): configured odbc.ini with DSN name HDB and driver path /hdbclient/libodbcHDB.so, plus /etc/hosts entries and customer.sh; host r3dbxxx.client.com / 10.192.xxx.xx, schema, Hive UID/password, and Spark ODBC driver path //sharkodbc/lib/64/libsimbasharkodbc.so.
  • Environment setup on the 5 VM (virtual machine) HADOOP servers, i.e. odbcinst.ini - ODBC.
  • Created SDA (Smart Data Access) Remote connection (Name: SPARK03) in HANA to SPARK
  • /usr/sap/ss1/home > isql HDB system Hanasbx
  • SQL> select * from BOBJ.BOBJ_USERS - note: the SQL IQ connection to the HANA SBX (sandbox) returned 21,406 records from the content schema/table BOBJ_USERS, validated in HANA Studio (see the connection sketch after this list).
  • Created / developed a workshop for the BW project team along with HADOOP developers/admins covering installation, configuration, and integration, plus ingestion and extraction using HADOOP tools, i.e. HANA SDA (Smart Data Access) creating VT (virtual tables) in HANA, Spark, Hive, Sqoop, Pig scripting, HDFS, MapReduce, etc.
  • Project managed, reporting to the Sempra project lead and Sr. BI analytics lead with weekly scrum/standup project status reports, while also serving as Sr. Solution Architect and primary integrator.
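A minimal, illustrative sketch of the ODBC validation query described above, assuming the HDB DSN from odbc.ini and the pyodbc package (credentials are placeholders):

    # Illustrative validation of the HDB DSN defined in odbc.ini (placeholder credentials).
    import pyodbc

    conn = pyodbc.connect("DSN=HDB;UID=SYSTEM;PWD=********")   # same DSN used by isql above
    cursor = conn.cursor()
    cursor.execute("SELECT COUNT(*) FROM BOBJ.BOBJ_USERS")     # content table validated in HANA Studio
    print("record count:", cursor.fetchone()[0])               # ~21,406 per the validation run above
    conn.close()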
