
Hadoop Developer Resume


Phoenix, AZ

SUMMARY

  • Around 8 years of experience in IT, including 2 years in Big Data with good knowledge of HDFS and the Hadoop ecosystem.
  • Hands-on experience installing, configuring, and using Hadoop ecosystem components like HDFS, HBase, Hive, Sqoop, Pig, and Spark.
  • Experience in installation, configuration, management, and deployment of Big Data solutions and the underlying infrastructure of a Hadoop cluster.
  • Extensive experience working with structured data using HiveQL, including join operations, writing custom UDFs, and optimizing Hive queries.
  • Strong experience analyzing large data sets by writing scripts and Hive queries.
  • Very good experience with partitioning and bucketing concepts in Hive; designed both managed and external tables in Hive to optimize performance.
  • Worked with different data formats such as Parquet, XML, and JSON.
  • Worked on HBase to load and retrieve data for real-time processing.
  • Involved in converting Hive/SQL queries into Spark transformations using Spark Context, Spark SQL, DataFrames, and Spark RDDs.
  • Worked on an AB Initio to Big Data migration project.
  • Expertise in ETL methodology for developing extraction, transformation, and load processes using AB Initio.
  • Involved in requirement gathering, design, development, code review, and implementation.
  • Worked on database components, multifiles, serial files, transform components (Aggregate, Rollup), XML, MQ Publish, MQ Subscribe, and JMS Publish.
  • Developed Unix shell scripts for ETL requirements.
  • Extensive experience with CSV, XML, and JSON formats.
  • Expertise in improving performance and troubleshooting AB Initio graphs and monitoring AB Initio runtime statistics.
  • Extensive experience writing Oracle PL/SQL packages, procedures, and SQL queries.
  • Good knowledge of AB Initio debugging strategies.
  • Good knowledge of the Enterprise Meta>Environment (EME).
  • Hands-on experience with the scheduling tools Control-M and Event Engine.
  • Good knowledge and working experience in Agile and Waterfall methodologies.
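The Hive partitioning and bucketing work described above can be sketched as a HiveQL DDL. This is a minimal illustration, not a table from any actual project; the table name, columns, bucket count, and HDFS path are all hypothetical:

```sql
-- Hypothetical external table: partitioned by load date, bucketed by customer
-- id, stored as Parquet. The LOCATION path is illustrative only.
CREATE EXTERNAL TABLE IF NOT EXISTS customer_txn (
  customer_id BIGINT,
  txn_amount  DECIMAL(12, 2),
  txn_type    STRING
)
PARTITIONED BY (load_dt STRING)
CLUSTERED BY (customer_id) INTO 32 BUCKETS
STORED AS PARQUET
LOCATION '/data/warehouse/customer_txn';
```

Because the table is EXTERNAL, dropping it removes only the metastore entry, not the underlying HDFS files, which is why external tables are the usual choice for shared data.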

TECHNICAL SKILLS

Hadoop: HDFS, Hive, HBase, Sqoop, Pig, Spark, Scala

DWH Tools: AB Initio 1.13/1.15/3.1.5, Co>Operating System 2.13/2.15/3.1.5

Operating Systems: UNIX, LINUX

Scripting: Shell, Bash scripts

Scheduling tools: Control-M, Event Engine

SQL Databases: Oracle, DB2

PROFESSIONAL EXPERIENCE

Confidential, Phoenix, AZ

Hadoop Developer

Responsibilities:

  • Evaluated the suitability of Hadoop and its ecosystem for the project and implemented various proof-of-concept (POC) applications to support adoption as part of the Big Data Hadoop initiative.
  • Designed and created Hive external tables using a shared metastore, with partitioning, dynamic partitioning, and buckets.
  • Developed and executed Hive queries for de-normalizing the data.
  • Wrote Hive UDFs in Java where the required functionality was too complex for built-in functions.
  • Involved in loading data from the Linux file system to HDFS.
  • Developed Hive queries for analysis and to categorize different items.
  • Created Spark DataFrames and RDDs.
  • Maintained system integrity of all sub-components (primarily HDFS, Hive, HBase, and Spark).
  • Automated workflows using shell scripts.
  • Performed performance tuning on Hive queries.
  • Monitored system health and logs and responded to any warning or failure conditions.
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and scripts.
  • Involved in unit testing, interface testing, system testing, and user acceptance testing of the workflow tool.
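The dynamic-partitioning and de-normalization steps above can be sketched in HiveQL. The table and column names are hypothetical; the two SET statements are the standard switches Hive requires before a dynamic-partition insert:

```sql
-- Enable dynamic partitioning (required before a dynamic-partition insert).
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

-- Hypothetical de-normalizing join: flatten orders with customer attributes.
-- The partition column (load_dt) must come last in the SELECT list so Hive
-- can derive the target partition for each row.
INSERT OVERWRITE TABLE orders_denorm PARTITION (load_dt)
SELECT o.order_id,
       o.amount,
       c.customer_name,
       c.region,
       o.load_dt
FROM orders o
JOIN customers c ON o.customer_id = c.customer_id;
```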

Environment: Hadoop, HDFS, Hive, HBase, Java, Spark, Linux, Scripts.

Confidential, Phoenix, AZ

Hadoop Developer

Responsibilities:

  • Worked on analyzing the Hadoop cluster using different big data analytics tools, including Hive and HBase.
  • Developed Hive queries for transformations, aggregations, and mappings on customer data.
  • Wrote user-defined functions (UDFs) in Hive.
  • Created HBase tables to store various data formats.
  • Implemented partitioning, dynamic partitions, and buckets in Hive.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data.
  • Implemented data integrity and data quality checks in Hadoop using Hive and Linux scripts.
  • Used the Parquet file format for serialization of data.
  • Automated the history and purge process.
  • Automated workflows using shell scripts.
  • Performed performance tuning on Hive queries.
  • Performance-tuned and optimized Hadoop clusters to achieve high performance.
  • Used Hadoop file system commands extensively for file-handling operations.
  • Worked on Tableau integration.
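A history-and-purge process like the one mentioned above is often implemented by dropping aged partitions. A hedged sketch follows; the table name, retention cutoff, and partition layout are assumptions, and comparison operators in a DROP PARTITION spec require a reasonably recent Hive release:

```sql
-- Purge history older than the retention window by dropping whole partitions.
-- For a managed table this also removes the underlying HDFS data; the
-- cutoff date here is illustrative only.
ALTER TABLE orders_denorm
DROP IF EXISTS PARTITION (load_dt < '2017-01-01');
```

Partition-level purging avoids row-by-row deletes, which classic Hive tables do not support efficiently.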

Environment: Hadoop, HDFS, Hive, Java, HBase, Linux, Scripts, Eclipse.

Confidential, Phoenix, AZ

AB Initio Developer

Responsibilities:

  • Performed project management for the respective modules and assisted the project manager with estimation.
  • Analyzed and gathered requirements for new implementations.
  • Generated high-level specifications and reviewed design documentation.
  • Coordinated and assisted the team to ensure that tasks were delivered on schedule.
  • Ensured adherence to quality standards on the deliverables made.
  • Estimated the effort required to complete activities.
  • Troubleshot and fixed defects in the delivered code.
  • Responded to queries from business users and provided support for the application.
  • Performed unit testing and integration testing.
  • Provided post-implementation support.

Environment: AB Initio, DB2, Unix, Linux, Control-M, ServiceNow, Toad.

Confidential, Phoenix, AZ

AB Initio Developer

Responsibilities:

  • Performed project management for the respective modules and assisted the project manager with estimation.
  • Analyzed and gathered requirements for new implementations.
  • Generated high-level specifications and reviewed design documentation.
  • Coordinated and assisted the team to ensure that tasks were delivered on schedule.
  • Ensured adherence to quality standards on the deliverables made.
  • Estimated the effort required to complete activities.
  • Troubleshot and fixed defects in the delivered code.
  • Responded to queries from business users and provided support for the application.
  • Performed unit testing and integration testing.
  • Provided post-implementation support.

Environment: AB Initio, DB2, Unix, Linux, Control-M, ServiceNow, Toad.

Confidential, Phoenix, AZ

AB Initio Developer

Responsibilities:

  • Gathered requirements and prepared required documents.
  • Coordinated with project managers and business analysts and analyzed requirements needed for development.
  • Involved in creating source-to-target mappings based on the technical design requirements.
  • Developed a number of AB Initio graphs and plans based on business requirements using various AB Initio components such as Reformat, Join, Sort, Rollup, Normalize, Assign Keys, and partition/departition components.
  • Responsible for writing unit test cases and assisting testers in capturing all these scenarios.
  • Involved in developing Unix shell wrappers to run various graphs and plans.
  • Created test data and test cases to make sure transformations were accurate and met the business requirements.
  • Involved in writing, testing, and loading Control-M jobs in the Prod and UAT environments for different streams.
  • Documented test scenarios and test cases.
  • Involved in migrating the code from Development to QA and supported the testing team in system testing and integration testing.
  • Responsible for resolving defects and conducting knowledge transfer sessions with the testing team in a timely manner.
  • Created detail-level design documents and production support guides for various projects.
  • Analyzed and resolved complex technical problems regarding application and job execution.
  • Configured and coordinated AB Initio installations, Control-M job scheduling, and Unix deployments.
  • Performed defect analysis and defect tracking.
  • Involved in migrating the Control-M jobs to the Prod environment.

Environment: AB Initio, DB2, Unix, Linux, Control-M, ServiceNow, Toad.

Confidential, Phoenix, AZ

AB Initio Developer

Responsibilities:

  • Gathered requirements and prepared required documents.
  • Coordinated with project managers and business analysts and analyzed requirements needed for development.
  • Involved in creating source-to-target mappings based on the technical design requirements.
  • Developed a number of AB Initio graphs and plans based on business requirements using various AB Initio components such as Reformat, Join, Sort, Rollup, Normalize, Assign Keys, and partition/departition components.
  • Responsible for writing unit test cases and assisting testers in capturing all these scenarios.
  • Involved in developing Unix shell wrappers to run various graphs and plans.
  • Created test data and test cases to make sure transformations were accurate and met the business requirements.
  • Involved in writing, testing, and loading Control-M jobs in the Prod and UAT environments for different streams.
  • Documented test scenarios and test cases.
  • Involved in migrating the code from Development to QA and supported the testing team in system testing and integration testing.
  • Responsible for resolving defects and conducting knowledge transfer sessions with the testing team in a timely manner.
  • Created detail-level design documents and production support guides for various projects.
  • Analyzed and resolved complex technical problems regarding application and job execution.
  • Configured and coordinated AB Initio installations, Control-M job scheduling, and Unix deployments.
  • Performed defect analysis and defect tracking.
  • Involved in migrating the Control-M jobs to the Prod environment.

Environment: AB Initio, DB2, Unix, Linux, Control-M, ServiceNow, Toad.
