Hadoop Developer Resume

Charlotte, NC

PROFESSIONAL SUMMARY:

  • Over 7 years of experience in IT, including 4 years of experience in the Hadoop ecosystem.
  • Good knowledge of Hadoop architecture and its components, including HDFS, JobTracker, TaskTracker, DataNode, NameNode, and MapReduce concepts.
  • Involved in setting up standards and processes for Hadoop-based application design and implementation.
  • Experience in installation, configuration and deployment of Big Data solutions.
  • Experience with the Hadoop ecosystem, including HDFS, Hive, Pig, HBase, Oozie, and Sqoop, and knowledge of the MapReduce framework.
  • Experience working with NoSQL databases, including MongoDB and HBase.
  • Experience developing against NoSQL databases using CRUD operations, sharding, indexing, and replication (see the MongoDB sketch after this list).
  • Worked on the Neo4j graph database, creating nodes and relationships between them.
  • Hands-on experience with Kafka, Cassandra and Camel.
  • Experience developing Pig scripts and Hive Query Language (HiveQL) queries.
  • Wrote Hive queries for data analysis and to prepare data for visualization.
  • Responsible for developing Pig Latin scripts.
  • Managed and scheduled batch jobs on a Hadoop cluster using Oozie.
  • Experience in managing and reviewing Hadoop Log files.
  • Used Zookeeper to provide coordination services to the cluster.
  • Experienced in using Sqoop to import data from RDBMSs into HDFS and to export it back.
  • Experience in requirement analysis, system design, development and testing of various software applications.
  • Handled structured and unstructured data and applied ETL processes.
  • Hands on experience in application development using Java, RDBMS and Linux Shell Scripting.
  • Proficient with Java IDEs such as Eclipse and Spring Tool Suite.
  • Detailed understanding of Software Development Life Cycle (SDLC) and sound knowledge of project implementation methodologies including Waterfall and Agile.
  • Experience in all phases of the software development lifecycle: concept, design, development, QA, rollout, and enhancements.
  • Ability to work independently and drive solutions in fast-paced, dynamic work environments.
  • Strong team building, conflict management, time management, and meeting management skills.
  • Excellent communication and leadership skills.
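
A minimal sketch of the MongoDB CRUD and indexing operations referenced in the list above, using the MongoDB Java driver; the host, database, collection, and field names are illustrative placeholders, not details from the original projects.

    import com.mongodb.MongoClient;
    import com.mongodb.client.MongoCollection;
    import com.mongodb.client.model.Filters;
    import com.mongodb.client.model.Indexes;
    import com.mongodb.client.model.Updates;
    import org.bson.Document;

    public class MongoCrudSketch {
        public static void main(String[] args) {
            // Connect to a local mongod instance (host and port are placeholders).
            MongoClient client = new MongoClient("localhost", 27017);
            try {
                MongoCollection<Document> users =
                        client.getDatabase("demo").getCollection("users");

                // Create: insert one document.
                users.insertOne(new Document("name", "alice").append("visits", 1));

                // Read: find the document by field value.
                Document found = users.find(Filters.eq("name", "alice")).first();
                System.out.println(found);

                // Update: increment a counter field.
                users.updateOne(Filters.eq("name", "alice"), Updates.inc("visits", 1));

                // Index: secondary index on "name" to speed up the read above;
                // in a sharded deployment the shard-key index is created the same way.
                users.createIndex(Indexes.ascending("name"));

                // Delete: remove the document.
                users.deleteOne(Filters.eq("name", "alice"));
            } finally {
                client.close();
            }
        }
    }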

TECHNICAL SKILLS:

Big Data Technologies: Hadoop, HDFS, Hive, Pig, Oozie, Sqoop, Map-Reduce, HBase, MongoDB, Zookeeper

Fault Management Tool: IBM Tivoli suite of products (OMNIbus, ITNM, Web-GUI, TBSM, TCR, Impact, ITM)

Database Technologies: PL/SQL, NoSQL, MongoDB, Neo4j

J2EE Technologies: Servlets, JSP, JDBC, JNDI, Hibernate

Programming Languages: C, C++, Java

Web Technologies: HTML, JavaScript, AngularJS

Operating Systems: Windows, Linux

Office Tools: MS Word, MS Excel, MS PowerPoint, MS Project

PROFESSIONAL EXPERIENCE:

Confidential, Charlotte, NC

Hadoop Developer

Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop.
  • Installed and configured Hive, Pig, Oozie, and Sqoop on Hadoop cluster.
  • Developed simple to complex MapReduce jobs in Java, with equivalent processing also implemented in Hive and Pig.
  • Supported MapReduce programs running on the cluster.
  • Performed cluster monitoring, maintenance, and troubleshooting.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop.
  • Analyzed the data by performing Hive queries (HiveQL) and running Pig Scripts (Pig Latin).
  • Assessed existing and available data warehousing technologies and methods to ensure the data warehouse/BI architecture, built on the Cassandra database, met the needs of the business unit and the enterprise and allowed for business growth.
  • Used Pig as an ETL tool to perform transformations and event joins, filter bot traffic, and do some pre-aggregations before storing the data in HDFS.
  • Installed Oozie workflow engine to run multiple Hive and Pig jobs.
  • Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Experienced in loading and transforming large sets of structured, semi-structured, and unstructured data.
  • Worked on NoSQL databases including MongoDB and HBase.
  • Developed against MongoDB using CRUD operations, indexing, replication, and sharding; used indexes to support sorted queries.
  • Analyzed large data sets to determine the optimal way to aggregate and report on them.
  • Wrote multiple MapReduce programs in Java for data extraction, transformation, and aggregation across multiple file formats, including XML, JSON, CSV, and various compressed formats (see the MapReduce sketch after this list).
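
A minimal sketch of the kind of Java MapReduce aggregation referenced in the last bullet, written against the Hadoop 2.x API listed in the environment below; the CSV layout, class names, and paths are illustrative assumptions, not the original job.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class CsvCountJob {

        // Mapper: emit (first CSV column, 1) for each input line.
        public static class FieldMapper extends Mapper<LongWritable, Text, Text, LongWritable> {
            private static final LongWritable ONE = new LongWritable(1);
            private final Text outKey = new Text();

            @Override
            protected void map(LongWritable offset, Text line, Context context)
                    throws IOException, InterruptedException {
                String[] fields = line.toString().split(",");
                if (fields.length > 0 && !fields[0].isEmpty()) {
                    outKey.set(fields[0]);
                    context.write(outKey, ONE);
                }
            }
        }

        // Reducer (also used as combiner): sum the counts for each key.
        public static class SumReducer extends Reducer<Text, LongWritable, Text, LongWritable> {
            @Override
            protected void reduce(Text key, Iterable<LongWritable> counts, Context context)
                    throws IOException, InterruptedException {
                long total = 0;
                for (LongWritable c : counts) {
                    total += c.get();
                }
                context.write(key, new LongWritable(total));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "csv-count");
            job.setJarByClass(CsvCountJob.class);
            job.setMapperClass(FieldMapper.class);
            job.setCombinerClass(SumReducer.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(LongWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
            FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }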

Environment: Java 7 (JDK 1.7), Hadoop 2.6.0, MapReduce, HDFS, Hive 0.13.0, Sqoop 1.4.4, HBase, Pig 0.12.0, Oozie, Kerberos, Linux, Shell Scripting, Oracle 11g, PL/SQL, SQL*Plus, HDInsight

Confidential, Phoenix, AZ

Hadoop Developer

Responsibilities:

  • Involved in installing and configuring the Hadoop ecosystem and Cloudera Manager using the CDH3 distribution.
  • Experienced in managing and reviewing Hadoop log files.
  • Experienced in running Hadoop streaming jobs to process terabytes of XML-format data.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data.
  • Supported MapReduce programs running on the cluster.
  • Installed and configured a fully distributed Hadoop cluster along with a Cassandra database.
  • Imported and exported data between RDBMS and HDFS using Sqoop.
  • Installed and configured Hive and wrote Hive UDFs (see the UDF sketch after this list).
  • Involved in creating Hive tables, loading data, and writing Hive queries that run internally as MapReduce jobs.
  • Developed ETL layer using Hive and Pig.
  • Worked with Python, MongoDB, Spark and Scala.
  • Wrote Hive queries to meet business requirements.
  • Analyzed data using Pig and wrote Pig scripts for grouping, joining, and sorting the data.
  • Hands-on experience with NoSQL databases.
  • Worked on MongoDB by using CRUD (Create, Read, Update and Delete), Indexing, Replication and Sharding features.
  • Participated in the requirement gathering and analysis phase of the project, documenting business requirements through workshops and meetings with various business users.
  • Actively participated in weekly meetings with the technical teams to review the code.
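
A minimal sketch of the kind of Hive UDF referenced above, using the classic org.apache.hadoop.hive.ql.exec.UDF base class available in Hive at the time; the class name and normalization logic are illustrative assumptions, not the original UDFs.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Trims and lower-cases a string column so joins and group-bys match consistently.
    // Registered in Hive with, for example:
    //   ADD JAR normalize-udf.jar;
    //   CREATE TEMPORARY FUNCTION normalize_str AS 'NormalizeString';
    //   SELECT normalize_str(customer_name) FROM customers;
    public final class NormalizeString extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null;  // propagate SQL NULLs unchanged
            }
            return new Text(input.toString().trim().toLowerCase());
        }
    }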

Environment: Hadoop, Hive, Linux, MapReduce, HDFS, Python, Pig, Sqoop, Cloudera, Cassandra, Spark, Shell Scripting, Java 6 (JDK 1.6), Oracle 10g, PL/SQL, SQL*Plus

Confidential, Atlanta, GA

Java Developer

Responsibilities:

  • Primary responsibilities included writing test cases, executing tests, and fixing defects.
  • Wrote test cases, prepared test data, and performed test execution.
  • Involved in Scrum daily stand-up meetings throughout the process of design and development.
  • Developed JUnit test cases for the backend services (see the JUnit sketch after this list).
  • Used Java/J2EE technologies in application development, including JDBC and the Struts 2.0 and ADF frameworks.
  • Used Log4j for logging and tracing messages.
  • Participated in code review sessions and design discussions; code reviews were performed with the Jupiter tool, and the system code was validated against security threats using the Fortify tool.
  • Generated reports using J2EE, JasperReports, and JDBC classes.
  • Developed services using the Spring framework.
  • Followed the Agile software development methodology for Spring framework development.
  • Used Hibernate POJOs to retrieve data from databases.
  • Developed SQL queries and stored procedures.
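
A minimal sketch of a JUnit 4 backend-service test of the kind referenced above; PriceCalculator and its discount logic are hypothetical stand-ins kept inline so the example compiles on its own, not the actual services under test.

    import static org.junit.Assert.assertEquals;

    import org.junit.Before;
    import org.junit.Test;

    public class PriceCalculatorTest {

        // Hypothetical backend service, stubbed inline so the sketch is self-contained.
        static class PriceCalculator {
            double discountedPrice(double price, double discountRate) {
                return price * (1.0 - discountRate);
            }
        }

        private PriceCalculator calculator;

        @Before
        public void setUp() {
            calculator = new PriceCalculator();
        }

        @Test
        public void appliesTenPercentDiscount() {
            // 200.00 with a 10% discount should come back as 180.00.
            assertEquals(180.00, calculator.discountedPrice(200.00, 0.10), 0.001);
        }
    }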

Confidential

Netcool Developer/Support

Responsibilities:

  • Monitored the CCR systems environment using the IBM suite of products.
  • Developed scripts to automate the monitoring of the network servers, identifying faults and reporting them in the form of e-mails/alerts.
  • Upgraded the installed IBM products in the CCR environment to the latest versions released by IBM, including installing fix packs and feature packs.
  • Worked on a web-based portal system that displays and consolidates web-enabled network management applications into a single view using the JARVIS tool.
  • Integrated the above GUI with the HP suite of products to display the network applications in a user-friendly view.
  • Responsible for fixing data errors using the application's GUI and SQL scripting.

Environment: IBM Tivoli suite of products, Linux (RHEL 5), MS SQL 2008, Windows NT

Confidential

Developer/Support

Responsibilities:

  • Developed architectural solutions in the areas of fault management and performance management using the IBM suite of products.
  • Created custom reports per the customer's requirements and scheduled them at regular intervals using IBM Tivoli Common Reporting.
  • Developed custom scripts and interfaces to several OSS systems providing functionality for the client that was not yet available from the vendor, to include a bi-directional Remedy-Oracle-ObjectServer gateway.
  • Developed Netcool Impact policies on the production server that identify faults/errors in a system and alert the respective team in the form of critical alerts.
  • Maintained and updated the MS SQL database.
  • Monitored daily activities and troubleshot issues.
