
Hadoop Developer Resume

Rockledge, Florida

SUMMARY

  • An IT professional with a total of 9 years of experience spanning Siebel CRM, SAP ERP, and Hadoop development.
  • 8 years of ETL experience across different technologies.
  • Good knowledge of the Hadoop ecosystem: HDFS, MapReduce, Pig Latin, HiveQL, Sqoop, and Flume.
  • Over 2 years of experience installing, configuring, and using Hadoop ecosystem components such as Cloudera CDH3/4, MapReduce, HDFS, HBase, Hive, Sqoop, Pig, and Flume.
  • In-depth knowledge of JobTracker, TaskTracker, NameNode, DataNode, and MapReduce concepts.
  • Experienced in loading data from RDBMS systems into HDFS using Sqoop.
  • Loaded web log data into HDFS using Flume.
  • Analyzed data using Pig scripts and Hive queries and wrote custom UDFs for analysis (a minimal Hive sketch follows this list).
  • Defined and developed best practices in Hadoop.
  • Loaded datasets into Hive for analysis.
  • Knowledge of SQL and NoSQL databases, with hands-on experience in MySQL and HBase.
  • Worked on all phases of the data warehouse development lifecycle: ETL design and implementation, and support of new and existing applications.
  • Extensively used ETL methodology for data extraction, transformation, and load with the Cransoft ETL tool for SAP data migration.
  • Created functional mappings, active transformations, and reusable transformations.
  • Cleansed client source data per requirements and returned cleansing reports to the client before the actual transformations.
  • Experienced in analyzing business requirements and translating them into functional and technical design specifications.
  • Experience working with different data sources such as flat files, XML files, and databases.
  • Hands on experience in Data Migration from various databases and applications into SAP.
  • Created sample data for testing the developed mappings.
  • Designed complex SAP data migration mappings covering both legacy and target structures.
  • Created specification documents according to client requirements.
  • Excellent technical and analytical skills with clear understanding of ETL design and project architecture based on reporting requirements.
  • Delivered high-quality data before the actual migration in the form of reports and load files.
  • Worked closely with teams to ensure application go-lives stayed on schedule.
  • Excellent problem-solving, analytical, communication, and interpersonal skills.
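
As referenced above, a minimal HiveQL sketch of the Sqoop/Flume-to-Hive analysis pattern; the table and column names (web_logs, user_id, page, visit_ts) are illustrative assumptions rather than objects from any actual project.

    -- Hypothetical external table over web log data landed in HDFS by Flume/Sqoop
    CREATE EXTERNAL TABLE IF NOT EXISTS web_logs (
        user_id  STRING,
        page     STRING,
        visit_ts STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
    LOCATION '/data/raw/web_logs';

    -- Simple ad-hoc analysis query of the kind used for user-behaviour reporting
    SELECT page, COUNT(*) AS visits
    FROM web_logs
    GROUP BY page
    ORDER BY visits DESC
    LIMIT 10;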

TECHNICAL SKILLS

Big Data/ Hadoop: Apache Hadoop, HDFS, MapReduce (MRV1 and MRV2), Hive, Pig, HBASE, Sqoop, Flume, MRUnit, Cloudera CDH3/4.

ETL Tools: SAP BODS, Cransoft, DSP, Sqoop, Flume, IBM DataStage

Operating Systems: Windows XP, 7 and 8, Unix

Databases: MySQL, HBase, Oracle PL/SQL

Software Applications: TOAD, PuTTY, WinSCP, FileZilla, BMC Remedy, HP QC

Programming Languages: Core Java

PROFESSIONAL EXPERIENCE

Hadoop Developer

Confidential, Rockledge, Florida

Responsibilities:

  • Developed and supported MapReduce programs running on the cluster.
  • Participated in brainstorming sessions with the team to explore Hadoop features.
  • Developed Hive queries to process the data and generate data cubes for visualization (see the sketch after this list).
  • Worked extensively with Sqoop to import metadata from Oracle and used Pig extensively for data cleansing.
  • Imported test data into the HDFS cluster from MySQL using Sqoop import to test the ETL scripts in the new environment.
  • Executed the scripts in different modes, such as local mode and MapReduce mode.
  • Created Hive tables and worked on them using HiveQL.
  • Installed and configured Pig and wrote Pig Latin scripts for ad-hoc analysis.
  • Analyzed the data with Hive queries and Pig scripts to understand user behavior.
  • Involved in loading data from UNIX file system to HDFS.
  • Handled data imports from various sources and performed transformations using Hive and MapReduce.
  • Updated mappings, sessions, and workflows as part of ETL changes.
  • Modified existing ETL code and documented the changes.
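
A hedged HiveQL sketch of the data-cube generation mentioned above; the table clicks_cleansed and its columns are assumptions used for illustration only.

    -- Hypothetical table holding cleansed click data
    CREATE TABLE IF NOT EXISTS clicks_cleansed (
        region STRING,
        device STRING,
        clicks BIGINT
    );

    -- Aggregation across the reporting dimensions that feeds the visualization layer
    SELECT region, device, SUM(clicks) AS total_clicks
    FROM clicks_cleansed
    GROUP BY region, device;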

Environment: Hadoop Ecosystem (HDFS, MRV1, Pig Latin, Hive, HBase), MySQL, Core Java

Senior SAP Data Migration Analyst

Confidential, MA

Responsibilities:

  • Part of the STP (Source to Pay) team for the Canada SAP data migration.
  • Responsible for Vendor Master Creation and Deployment
  • Performed data-management-specific role mapping (as part of deployment activities) covering ownership, processes, and tools.
  • Drove the data quality function of the team, ensuring quality and integrity checks were developed and shared across data management functions to improve operational processes (a hedged SQL sketch follows this list).
  • Interacted with the client regularly to understand requirements, define the technical methodology, and implement the business logic.
  • Fixed defects in the remaining objects handled offshore and onsite.
  • Successfully completed end-to-end purchase order data migration, covering PO output conditions, open PO upload, PO inbound deliveries, and PO acknowledgements.
  • Developed and modified BAPI programs for PO loads and created SAP recordings to create/change data using SAP standard recording methods and the Cransoft BDC recording tool.
  • Used Cransoft ETL tools (DSW, Cranport, and the BDC recording tool) to extract, transform, and load data.
  • Used the Cransoft EzMap mapping tool for all mapping documentation at both the object design and value mapping level.
  • Provided coverage for loading Vendor Master and Purchase Info Records.
  • Single point of contact for all Vendor Master and PO related queries and solely responsible for the object loads.
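
A hedged SQL sketch of the kind of pre-load quality and integrity check referred to above; the staging table stg_VendorMaster and its columns are hypothetical, not the actual project objects.

    -- Flag vendor master staging records with missing mandatory fields
    -- before handing them to the load tool (all object names are illustrative)
    SELECT VendorNumber, VendorName, Country
    FROM stg_VendorMaster
    WHERE VendorName IS NULL
       OR Country IS NULL
       OR LEN(LTRIM(RTRIM(VendorNumber))) = 0;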

Environment: Cransoft DSW, EzMap, Data Garage, Cranport, MS SQL Server 2008, HP QC

SAP Data Migration Analyst

Confidential

Responsibilities:

  • Worked with Confidential on a successful end-to-end SAP data migration.
  • Gathered required information from functional consultants and the business to define the technical data transformations.
  • Responsible for loading data for the Poland data migration using Cransoft ETL tools (Data Stage Warehouse, Cranport, Data Garage, BDC Direct, EzMap) and Microsoft SQL Server.
  • Built transformation rules in MS SQL based on the functional spec and the EzMap field and value mapping details (see the sketch after this list).
  • Created Cranport packages to insert data from different databases, Excel, and text files, using filter rules to pull in only the required data.
  • Created load files and client reports using Microsoft SQL and the Cransoft reporting tool.
  • Wrote remediation rules to change data in reports per business requirements.
  • Designed a standard LSMW program to load vendor master data.
  • Created multiple recordings using SAP recording T-codes and the SAP LSMW recording method to load/change data as required.
  • Loaded Vendor Master Data, Purchase Contracts and Purchase Orders.
  • Single Point of contact for all vendor masters and PO related queries.
  • Developed multiple BDC recordings and LSMW programs for vendor master data tables LFA1, LFB1, LFBK, LFBW, and LFM1.
  • Worked with remediation and migration teams to ensure accurate and consistent master/ transaction data transformation.
  • Moved the projects from Dev to ST, ST to UAT and UAT to Production.
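
A minimal T-SQL sketch of a value-mapping transformation rule of the kind described above; the staging and cross-reference tables (stg_PurchaseOrders, xref_PaymentTerms) are hypothetical names used for illustration.

    -- Translate legacy payment terms to their SAP values using a mapping table
    -- (table and column names are illustrative, not the actual project objects)
    UPDATE po
    SET po.PaymentTerms = map.SapValue
    FROM stg_PurchaseOrders AS po
    JOIN xref_PaymentTerms AS map
      ON map.LegacyValue = po.LegacyPaymentTerms;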

Environment: Cransoft DSW, EzMap, Data Garage, Cranport, MS SQL Server 2008, HP QC

SAP Data Migration Analyst

Confidential

Responsibilities:

  • Worked with the FI team to process daily loads.
  • Worked on SAP IS-Utilities to help facilitate data migration.
  • Followed a stable process of loading the data from source to target.
  • Expertise in fixing production support issues related to multi-provider mapping, data inconsistency, and job failures.
  • Loaded cost center, profit center, company code, and GL account data into SAP using BDC.
  • Worked with Match Transforms, Address Cleanse and Data Cleanse Transforms for US Data Cleansing in Data Quality.
  • Developed rules and reports for STS team.
  • Loaded data into both Dev and QA servers on a daily basis.
  • Analyzed errors that occurred during loads (see the reconciliation sketch after this list).
  • Worked on EzMap to load the functional design for the HR process.
  • Assisted other teams such as STS, OTC, and P2P in day-to-day processes.
  • Participated in detailed report design and created design specifications documents at report level.
  • Sound understanding of the end-to-end data migration process, including extraction, data profiling, transformation, and generation of load-ready data files in LSMW format.
  • Able to interpret, configure, and implement data mapping/transformation business rules using Cransoft.
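
A hedged SQL sketch of a load-error reconciliation report of the sort used when analyzing load errors above; the audit table stg_LoadAudit and its status codes are assumptions for illustration.

    -- Count staged records versus records flagged as load errors, per object
    SELECT LoadObject,
           COUNT(*) AS StagedRecords,
           SUM(CASE WHEN LoadStatus = 'E' THEN 1 ELSE 0 END) AS ErrorRecords
    FROM stg_LoadAudit
    GROUP BY LoadObject
    ORDER BY ErrorRecords DESC;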

Environment: Cransoft DSW, EzMap, Data Garage, Cranport, MS SQL Server 2005

Sr. Software Engineer

Confidential

Responsibilities:

  • Supported the smooth functioning of the Crystal (Siebel) application.
  • Wrote transformation queries in PL/SQL and moved data from source to target tables in the Siebel application (a PL/SQL sketch follows this list).
  • Wrote ad-hoc reports in PL/SQL as required.
  • Used IBM DataStage to create ETL jobs with Oracle Database as the backend and PL/SQL for transformations.
  • Ensured all SRs were automated and completed in the Crystal application to provide smooth, uninterrupted service to IDEA customers.
  • Communicated with the IDEA client as needed via email or phone.
  • Created bulk SRs and cancelled and completed SRs using Siebel Business Service.
  • Coordinated with different teams as required for the proper inflow and outflow of data between the Crystal application and other applications.
  • Generated the daily business reports required by the business and resolved report issues as required.
  • Prepared daily and monthly dashboards, generated weekly reports, and sent them to IDEA clients on a timely basis.
  • Created and deleted users in the Crystal application, loaded daily IDEA plans into it, and automated these tasks at the application level.
  • Monitored SRs raised at different levels end to end, making sure response times were met to maintain SLAs.
  • Closed tickets related to automation, reports, and data loading using the BMC Remedy tool to provide timely resolution to the IDEA client and customers.
  • Trained IDEA call center employees by phone, email, or on-site visits whenever changes were made to the Crystal application.
  • Tested the Siebel application from an operational perspective for different circles during system go-live.
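
A hedged PL/SQL sketch of the source-to-target transformation pattern mentioned above; the staging and legacy tables (eim_contact_stg, legacy_contacts) and the batch number are illustrative assumptions, not the actual Crystal application objects.

    -- Move not-yet-migrated contacts from a legacy source table into an
    -- EIM-style staging table for the Siebel load (object names are illustrative)
    BEGIN
      INSERT INTO eim_contact_stg (row_id, batch_num, first_name, last_name)
      SELECT src.contact_id,
             1001,               -- batch number assumed for this load run
             src.fname,
             src.lname
      FROM   legacy_contacts src
      WHERE  src.migrated_flag = 'N';
      COMMIT;
    END;
    /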

Environment: Siebel 8.0, Siebel EIM, Siebel EAI, PL/SQL, IBM WebSphere, PuTTY, TOAD, BMC Remedy, DataStage, Oracle DB

Software Engineer

Confidential

Responsibilities:

  • Installed Argis, SQL Server, and ADT with the required patches.
  • Identified the schema of the legacy system used by the business.
  • Performed ETL (extract, transform, and load) using Advance Data Transfer (ADT).
  • Established connections to source systems using ODBC connectivity (flat files, SQL databases, Excel, and Access).
  • Identified the different roles a user could be associated with.
  • Created filters to further restrict the data a user could access.
  • Set up access rights with roles and filters for Argis Web.
  • Developed multiple transformers in ADT.
  • Analyzed the data sent by clients and prepared templates for loading data into Argis using automated scripts.
  • Wrote complex queries to filter and validate the data.
  • Automated the "cross check" script that validated the loaded data against the data present in the target system (see the sketch after this list).
  • Exported reports in CSV format per client requirements.
  • Created shared drives for easy access.
  • Worked on Asset Intelligence (AI) to integrate it with UAM and USD.
  • Developed complex queries to extract data from various tables in accordance with client requirements.
  • Automated the script that loads non-Confidential consumer finance individuals (per business requirements).
  • Tested Progress to develop Argis Reports
  • Worked extensively in MS Excel to filter and validate data.
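
A hedged SQL sketch of the "cross check" validation mentioned above; the source staging and Argis target tables (stg_SourceAssets, tgt_ArgisAssets) are hypothetical names used for illustration.

    -- Find asset records present in the source extract but missing from the
    -- target after the load (all object names are illustrative)
    SELECT s.AssetTag, s.SerialNumber
    FROM stg_SourceAssets AS s
    LEFT JOIN tgt_ArgisAssets AS t
      ON t.SerialNumber = s.SerialNumber
    WHERE t.SerialNumber IS NULL;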

Environment: Argis, ADT, SQL Server, MS Excel
