Cloud Azure S/4 HANA/Business Objects Data Services/Netezza/HANA Resume
PROFESSIONAL SUMMARY:
- 14+ years of IT experience in the analysis, design, development, implementation, and testing of enterprise-wide applications, data warehouses, client/server technologies, and web-based applications.
- Over 4 years of experience on the Azure cloud.
- Over 3 years of experience as a Snowflake Data Engineer.
- Over 10 years of cloud/data warehousing experience across Azure, Snowflake, and SAP Data Services 4.0/4.1/4.2 and 3.2, including migrating BODS landscapes to S/4 HANA cloud/machine learning landscapes and working with data quality tools such as Information Steward 4.0/4.1/4.2, with an emphasis on data migration and data quality initiatives for clients. Professional experience with SAP HANA and BO reporting tools such as Webi, Crystal Reports, and Dashboards, plus good exposure to BW/HANA modeling, ABAP, and BPC/MDM/MDG.
- My experience in data migration and analytics, combined with database design, integration, and data management skills, gives me the ability to manage multi-disciplinary programs and overcome technical and non-technical roadblocks.
- Expertise in ADF, Databricks, Machine Learning, and datasets, including:
- Microsoft Azure Storage Explorer
- Azure Data Studio
- Linked Services
- Function App
- Blob Storage, Datalake Storage Gen2, SQL Database
- ADF code repository setup (GitHub, Azure DevOps Git), GitHub repositories and branches, collaboration branches
- Azure DevOps Git
- Creation of Repositories, Test Plans
- Pipeline:
- Move, Transform (Copy Data, Data Flows)
- Azure Data Explorer
- Azure functions
- Batch Service
- Databricks
- Data lake Analytics
- HDInsight (Hive, MapReduce, Pig, Spark)
- Power Query
- Metadata functions
- Expertise in the ADF Ingest, Orchestrate, and Configure SSIS capabilities
- Experience in Azure migration, Azure load balancing, networking, backups, App Services, and Security Center for IaaS and PaaS.
- Set up GitHub code repositories for ADF and generated ARM templates.
- Migrated on-premises workloads to the Azure cloud based on requirements.
- Designed and deployed Azure Backup and other Confidential backup solutions for Azure.
- Designed and migrated on-premises servers/databases to Azure IaaS and PaaS.
- Extracted, transformed, and loaded data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL, and U-SQL (Azure Data Lake Analytics), and processed the data in Azure Databricks.
- Created pipelines in ADF using linked services, datasets, and pipelines to extract, transform, and load data from sources such as Azure SQL, Blob Storage, and Azure SQL Data Warehouse (see the watermark sketch after this list).
- Azure networking experience: VPN and ExpressRoute, Azure DNS, Traffic Manager, and load balancers.
- Implemented Azure AD B2C, custom UI and custom attributes, role-based access control (RBAC), and AD groups.
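A minimal sketch of the watermark pattern typically used behind such an incremental ADF copy, as referenced above. All names here (etl.watermark, src.orders, last_modified) are illustrative assumptions, not from an actual engagement:

```sql
-- Control table that stores the high-water mark per source table (T-SQL).
CREATE TABLE etl.watermark (
    table_name    sysname       NOT NULL PRIMARY KEY,
    last_modified datetime2(3)  NOT NULL
);

-- Lookup activity: read the previous watermark.
SELECT last_modified FROM etl.watermark WHERE table_name = 'src.orders';

-- Copy activity source query: only rows changed since the last run.
-- In ADF the two boundaries would come from pipeline expressions such as
-- @{activity('LookupWatermark').output.firstRow.last_modified} and
-- @{pipeline().TriggerTime}; literals are shown here for readability.
SELECT *
FROM   src.orders
WHERE  last_modified >  '2024-01-01T00:00:00'   -- old watermark (placeholder)
  AND  last_modified <= '2024-01-02T00:00:00';  -- trigger time (placeholder)

-- Stored-procedure activity: advance the watermark after a successful copy.
UPDATE etl.watermark
SET    last_modified = '2024-01-02T00:00:00'
WHERE  table_name = 'src.orders';
```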
TECHNICAL SKILLS:
Cloud Platform: Azure, Snowflake
BI Tools: Business Objects XI 3.1/R1/R2/R3, Web Intelligence, Crystal Reports
ETL Tools: Informatica PowerCenter 8.6, BODI Data Integrator (ETL), BODS XI 3.2/4.0/4.1
Platforms: Windows, UNIX, Linux, WebSphere, Apache/Tomcat
Databases: Oracle 9i/8i/7.3, MS SQL Server 2008/2012, DB2
Languages: C, SQL, PL/SQL, MS .NET, SAP ABAP
PROFESSIONAL EXPERIENCE:
Cloud Azure S/4 HANA/Business Objects Data Services/Netezza/HANA
Confidential
Responsibilities:
- Extract data from SAP HANA to Blob Storage with ADF and push it to Snowflake.
- Handle error rows in ADF mapping data flows.
- Incrementally copy new and changed data in Data Factory.
- Use ADF pipeline built-in functions to process files based on various conditions (size, number of columns).
- Deploy ADF changes via GitHub (master and collaboration branches) to QUAL, UAT, and PROD.
- Involved in migrating objects from various on-premises databases to Snowflake.
- Created Snowpipe for continuous data loads and used COPY INTO for bulk loads (see the load sketch after this list).
- Used SnowSQL to run batch job scripts and configured parameters and variables for batch jobs.
- Used SnowSQL to perform various DDL and DML operations on databases, tables, etc.
- Used the Kafka connector to ingest various log files into Snowflake via Snowpipe.
- Created internal and external stages and transformed data during load.
- Used the FLATTEN table function to produce lateral views of VARIANT, OBJECT, and ARRAY columns.
- Used temporary and transient tables for different datasets.
- Cloned production data for Dev/QA environments.
- Shared sample data with the customer for UAT by granting access.
- Used Time Travel with up to 60 days of retention to recover lost data (see the clone/Time Travel sketch after this list).
- Developed data warehouse models in Snowflake for various datasets.
- Ran POCs on Snowflake to determine the best-fit usage for different use cases.
- Migrated the BODS landscape to the S/4 HANA landscape.
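The load sketch referenced above, showing one plausible stage/COPY/Snowpipe/FLATTEN path; the stage URL and the object names (raw_stage, events_pipe, raw_events) are hypothetical:

```sql
-- Landing table for semi-structured data.
CREATE OR REPLACE TABLE raw_events (payload VARIANT);

-- External stage over the Blob container that ADF lands files in.
CREATE OR REPLACE STAGE raw_stage
  URL = 'azure://myaccount.blob.core.windows.net/landing'
  CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>')
  FILE_FORMAT = (TYPE = JSON);

-- One-off bulk load; ON_ERROR = 'CONTINUE' skips bad rows instead of failing.
COPY INTO raw_events FROM @raw_stage/events/ ON_ERROR = 'CONTINUE';

-- Continuous load: Snowpipe picks up new files as they arrive
-- (auto-ingest on Azure also requires an event notification integration).
CREATE OR REPLACE PIPE events_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_events FROM @raw_stage/events/;

-- Lateral view over a nested array with FLATTEN.
SELECT e.payload:order_id::NUMBER AS order_id,
       f.value:sku::STRING        AS sku,
       f.value:qty::NUMBER        AS qty
FROM   raw_events e,
       LATERAL FLATTEN(INPUT => e.payload:lines) f;
```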
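And the clone/Time Travel sketch referenced above; object names and the 60-day retention setting (anything beyond 1 day requires Enterprise edition or higher) are assumptions:

```sql
-- Transient table for intermediate data that does not need Fail-safe.
CREATE TRANSIENT TABLE stg_orders LIKE orders;

-- Zero-copy clone of production for Dev/QA.
CREATE DATABASE qa_db CLONE prod_db;

-- Raise retention so Time Travel can reach back 60 days.
ALTER TABLE orders SET DATA_RETENTION_TIME_IN_DAYS = 60;

-- Recover the table's state as it was 24 hours ago.
CREATE TABLE orders_recovered CLONE orders AT (OFFSET => -60*60*24);

-- Or query a point in time directly.
SELECT * FROM orders AT (TIMESTAMP => '2024-01-15 08:00:00'::TIMESTAMP_LTZ);
```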
Business Objects Data Services/BW/BI Developer
Confidential
Responsibilities:
- Fetched data from SAP systems using standard and custom extractors and loaded it into SQL Server via BODS 4.1 batch jobs.
- Performed data loads from SAP BI 7.0 to SQL Server 2012 through SAP BODS 4.1.
- Uploaded data from XML files to SQL Server 2012 through BODS 4.1 batch jobs.
- Debugged BODS batch job flows.
- Produced the standards, guidelines, and best-practices documentation, as well as the functional and technical design specification documents.
- Migrated all of the team's projects from DEV to TEST to PS to PROD and scheduled the jobs as part of administrator duties.
- Worked with the Data Quality and Address Cleanse transforms to cleanse the data.
- Provided technical guidance and knowledge transfer on using BODS 4.0/4.1 effectively.
- Created and changed BI objects in the RSA1 Administrator Workbench.
- Developed and modified start routines, end routines, and expert routines.
- Monitored daily SAP BI load jobs via transaction code RSPC.
- Troubleshot issues in existing BEx reports.
ABAP Developer
Confidential
Responsibilities:
- Created jobs, workflows, data flows, and scripts in the BODS 4.1 Designer to pull data from legacy systems and load it into the target SQL Server 2012 database. Wrote SQL statements in DS scripts to modify data. Created initial and delta loads (see the MERGE sketch after this list).
- Responsible for creating mapping and transformation specifications based on business requirements from various business teams and implementing them in Data Services jobs.
- Worked on Source to Target Mapping design.
- As part of administration work, created users and groups and assigned security in the Central Management Console.
- Installed the client tools on all developer machines, created their local repositories, and added them to the central repository. Involved in check-in, check-out, and get operations between local repositories and the central repository.
- As part of Microsoft SQL Server 2008 database administration, created users, mapped them to their respective local repositories, and granted access to their specified groups.
- Validated and executed batch jobs; tested data flows and scripts against sample and real data. Created initial and incremental load jobs.
- Involved in knowledge transfer to other developers; presented a demo on using the platform transforms, complex functions, and lookup functions.
- Involved in production support and interacted with reporting developers to change table code based on their reporting requirements.
- Worked on universe design and row-level security in Business Objects.
- Worked on custom SAP extractors to fetch data from SAP systems.
- Developed Custom Extraction Function Modules for Generic Data Sources
- Created Custom DDIC Structures for Generic Data Sources
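The MERGE sketch referenced above, showing one plausible shape of such a delta load in SQL Server 2012; dbo.dim_customer, stg.customer, and row_hash are hypothetical names:

```sql
-- Delta load: apply staged changes to the target (T-SQL MERGE).
-- Table and column names are illustrative assumptions.
MERGE dbo.dim_customer AS tgt
USING stg.customer AS src
   ON tgt.customer_id = src.customer_id
WHEN MATCHED AND tgt.row_hash <> src.row_hash THEN
    UPDATE SET tgt.name       = src.name,
               tgt.city       = src.city,
               tgt.row_hash   = src.row_hash,
               tgt.updated_at = SYSDATETIME()
WHEN NOT MATCHED BY TARGET THEN
    INSERT (customer_id, name, city, row_hash, updated_at)
    VALUES (src.customer_id, src.name, src.city, src.row_hash, SYSDATETIME());
```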
Project Engineer
Confidential
Responsibilities:
- Created jobs, workflows, data flows, and scripts in Data Integrator to pull data from legacy systems and load it into the target database. Wrote SQL statements in DI scripts to modify data. Created initial and delta loads.
- Responsible for creating mapping and transformation specifications based on business requirements from various business teams and implementing them in Data Integrator jobs.
- Validated and executed batch jobs; tested data flows and scripts against sample and real data. Created initial and incremental load jobs.
- Extracted data from relational databases including Oracle and Microsoft SQL Server and loaded it into different environments including BW, Oracle, SQL Server, and .txt/.csv files.
- Worked on DI scripting and prepared a standard incremental-load template that enables job failure/success notifications (see the control-table sketch after this list).
- Involved in knowledge transfer to users, explaining the functionality of different transforms (Merge, Validation, Key Generation, Query).
- Worked in the Management Console scheduling jobs, creating users, and granting security per roles and responsibilities.
- Involved in migrating all developers' jobs from DEV to TEST to PROD.
- Provided day-to-day support for all scheduled jobs in production.
- Created technical and functional design specification documents by interacting with business analysts and business users.
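The control-table sketch referenced above; etl.job_run, the job name, and the status values are illustrative assumptions rather than the actual template:

```sql
-- Job-run audit table the DI script writes to before and after each load.
CREATE TABLE etl.job_run (
    run_id      INT IDENTITY PRIMARY KEY,
    job_name    VARCHAR(128)  NOT NULL,
    started_at  DATETIME      NOT NULL DEFAULT GETDATE(),
    ended_at    DATETIME      NULL,
    status      VARCHAR(16)   NOT NULL DEFAULT 'RUNNING',  -- RUNNING / SUCCESS / FAILED
    rows_loaded INT           NULL,
    error_msg   VARCHAR(4000) NULL
);

-- Script at job start.
INSERT INTO etl.job_run (job_name) VALUES ('JOB_Load_Orders');

-- Script at job end: record the outcome; a notification step
-- then alerts on any row left in FAILED status.
UPDATE etl.job_run
SET    status      = 'SUCCESS',
       ended_at    = GETDATE(),
       rows_loaded = 0  -- placeholder; the template passes the real count
WHERE  run_id = (SELECT MAX(run_id) FROM etl.job_run
                 WHERE job_name = 'JOB_Load_Orders');
```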