
Hadoop Developer Resume


San Jose, CA

Summary:

  • 8+ years of IT experience in analysing, designing, developing, implementing and testing software applications; currently working as a Hadoop Developer.
  • Highly experienced IT professional with a commitment to excellence and the implementation of best practices, specializing in Big Data (Hadoop ecosystem), Java and Mainframes.
  • Hands-on experience migrating Teradata workloads to the Hadoop platform.
  • Extensive knowledge of Hive and Spark SQL development.
  • Knowledge of optimization techniques for better performance.
  • Automated Spark SQL scripts using Unix shell scripts.
  • Hands-on experience with the major components of Big Data and the Hadoop ecosystem (HDFS, MapReduce, Pig, Hive, HBase, Sqoop, Flume, Oozie, ZooKeeper, Kafka, Spark (Scala), Storm, MongoDB, Cassandra).
  • Analyzed large data sets by running Hive queries and Pig scripts.
  • Ability to analyze different file formats.
  • Involved in Optimization of Hive Queries.
  • Excellent understanding of Hadoop architecture and the different components of a Hadoop cluster (Job Tracker, Task Tracker, Name Node and Data Node).
  • Involved in Data Ingestion to HDFS from various data sources.
  • Strong experience in collecting and storing streaming data like log data in HDFS using Apache Flume.
  • Extensively used Apache Sqoop for efficiently transferring bulk data between Apache Hadoop and relational databases (DB2).
  • Automated Sqoop, Hive and Pig jobs using Oozie scheduling.
  • Extensive knowledge of NoSQL databases such as HBase, MongoDB and Cassandra.
  • Good knowledge of writing and using user-defined functions in Hive, Pig and MapReduce.
  • Helped the business team by installing and configuring Hadoop ecosystem components along with the Hadoop admin.
  • Configured, deployed and maintained multi-node Dev and Test Kafka clusters.
  • Developed multiple Kafka Producers and Consumers from scratch as per the business requirements.
  • Responsible for creating, modifying and deleting topics (Kafka Queues) as and when required by the Business team.
  • Developed test cases and POCs to benchmark and verify data flow through the Kafka clusters.
  • Working on implementing the Spark and Storm frameworks.
  • Extensive exposure to all aspects of the Software Development Life Cycle (SDLC): requirements definition for customization, prototyping, coding (COBOL, DB2, Java) and testing.
  • Experience in resolving on-going production and maintenance issues and bug fixes.
  • Proficiency in developing SQL queries.
  • Exposure to Java development projects.
  • Excellent interpersonal and communication skills; experienced in working with senior-level managers, business people and developers across multiple disciplines.
  • Extensive management skills leading the GenO (Generation Open) framework of a Confidential initiative; led my community to rank #1 worldwide out of 126 communities consistently for 4 years.
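The Spark SQL automation mentioned above can be sketched as a small shell wrapper. This is a minimal illustration rather than the actual script from these projects: the SQL file name, log path, and `spark.sql.shuffle.partitions` setting are assumptions, and the real `spark-sql` invocation is echoed instead of executed so the wrapper can run without a cluster.

```shell
#!/bin/sh
# Minimal sketch of a wrapper used to automate Spark SQL jobs from cron.
# File names and settings are illustrative, not taken from the resume.

SQL_FILE="${1:-daily_load.sql}"               # Spark SQL script to run
LOG_FILE="/tmp/spark_sql_$(date +%Y%m%d).log" # one log per day

run_spark_sql() {
    # On a real cluster this would invoke:  spark-sql -f "$SQL_FILE" ...
    # Echoed here so the wrapper itself can be exercised anywhere.
    echo "spark-sql --conf spark.sql.shuffle.partitions=200 -f $SQL_FILE"
}

if run_spark_sql >> "$LOG_FILE" 2>&1; then
    echo "OK: $SQL_FILE"
else
    echo "FAILED: $SQL_FILE (see $LOG_FILE)" >&2
    exit 1
fi
```

In real use the echo inside `run_spark_sql` would be replaced by the actual `spark-sql` call; the success/failure branch is what lets a scheduler detect failed loads.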

TECHNICAL SKILLS

Hadoop Ecosystems: Hive, Pig, Sqoop, Flume, ZooKeeper, Oozie, MapReduce, HBase, Kafka, Storm, Spark (Spark SQL, Scala)
Scripting: Unix Shell Scripting
Java: Core Java, JDBC, JavaScript
Mainframe & Databases: DB2, SQL, COBOL, CICS, JCL, VSAM, ISPF, FILE-AID, SPUFI, IDCAMS, ENDEVOR

PROFESSIONAL EXPERIENCE

Confidential, San Jose, CA

Hadoop Developer

Responsibilities:

  • Worked with Data Ingestion techniques to move data from various sources to HDFS.
  • Analysed different formats of data.
  • Extensive knowledge of Spark.
  • Worked on redesigning and coding in Hive and Spark SQL.
  • Extensively worked with partitioned and bucketed tables in Hive and designed both managed and external tables.
  • Worked on optimization of Hive queries; tuned Spark SQL parameters for effective performance.
  • Worked on understanding Teradata and migrating it to Hadoop end to end.
  • Worked on developing Unix shell scripts to automate Spark SQL jobs.
  • Involved in all phases of the project.
  • Worked with huge volumes of data and migrated it effectively with no post-production defects.
  • Involved in requirement analysis.
  • Involved in giving KT to other team members.
  • Involved in preparing Project documentation.
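The partitioned and bucketed managed/external Hive tables described above follow a standard HiveQL DDL pattern. The sketch below is illustrative only (table, column and HDFS path names are hypothetical), wrapped in a shell function in keeping with the shell-scripted automation above; a real run would pipe the emitted DDL into `hive` or `beeline` instead of printing it.

```shell
#!/bin/sh
# Sketch of an external, partitioned, bucketed Hive table definition.
# All names and paths are placeholders, not from the resume.

emit_ddl() {
cat <<'EOF'
-- External table: Hive tracks metadata only; dropping the table leaves
-- the underlying HDFS files in place, which suits ingested source data.
-- (A managed table would omit EXTERNAL and LOCATION, letting Hive own
-- the data under its warehouse directory.)
CREATE EXTERNAL TABLE IF NOT EXISTS sales_raw (
  order_id BIGINT,
  amount   DOUBLE
)
PARTITIONED BY (load_date STRING)          -- prunes scans by date
CLUSTERED BY (order_id) INTO 32 BUCKETS    -- enables bucketed joins/sampling
STORED AS ORC
LOCATION '/data/raw/sales';
EOF
}

# Real run would be:  emit_ddl | hive
# Printed here so the script works without a Hive installation.
emit_ddl
```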

Confidential, Portland, OR

Hadoop Developer

Responsibilities:

  • Worked with Data Ingestion techniques to move data from various sources to HDFS.
  • Analysed different formats of Data.
  • Worked on writing MapReduce programs using Java.
  • Extensively worked with partitioned and bucketed tables in Hive and designed both managed and external tables.
  • Worked on optimization of Hive queries.
  • Created and worked with Sqoop jobs with full refresh and incremental load to populate Hive external tables.
  • Worked on Pig to do data transformations.
  • Developed UDFs in MapReduce, Hive and Pig.
  • Worked on HBase and its integration with Storm.
  • Worked on Apache Flume to ingest data from Twitter into HDFS.
  • Presently implementing Kafka.
  • Worked with MongoDB and Cassandra NoSQL databases.
  • Presently implementing Storm and Spark.
  • Designed and created Oozie workflows to schedule and manage Hadoop, Hive, Pig and Sqoop jobs.
  • Worked with RDBMS import and export to HDFS.
  • Involved in requirement analysis.
  • Involved in giving KT to other team members.
  • Involved in preparing Project documentation.
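The full-refresh versus incremental Sqoop loads mentioned above can be sketched as a saved Sqoop job using `--incremental append`. The connection string, table, target directory and check column below are placeholders, and the command is printed rather than executed so the sketch runs without a Hadoop edge node.

```shell
#!/bin/sh
# Sketch of a saved Sqoop job doing incremental (append) imports into the
# HDFS directory behind a Hive external table. All names are placeholders.

build_sqoop_cmd() {
    # --incremental append re-imports only rows whose check column exceeds
    # the stored last-value; a full refresh would drop these three flags
    # and re-pull the whole table.
    echo "sqoop job --create orders_incr -- import" \
         "--connect jdbc:db2://dbhost:50000/SALES" \
         "--table ORDERS" \
         "--target-dir /data/raw/orders" \
         "--incremental append --check-column ORDER_ID --last-value 0"
}

# Real run:  eval "$(build_sqoop_cmd)"  on an edge node with Sqoop installed,
# then:      sqoop job --exec orders_incr   (Sqoop updates last-value itself).
build_sqoop_cmd
```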

Confidential

Java Developer

Responsibilities:

  • Involved in design and development phases of Software Development Life Cycle (SDLC).
  • Implemented multithreading concepts in Java classes to avoid deadlocks.
  • Involved in high-level design and prepared the logical view of the application.
  • Involved in designing and developing object-oriented methodologies using UML; created use case, class and sequence diagrams, and participated in the complete development, testing and maintenance process of the application.
  • Created core Java interfaces and abstract classes for different functionalities.
  • Responsible for analysis, design, development and integration of UI components with the backend using J2EE technologies such as Servlets, JSP and JDBC.

Confidential

Application Developer

Responsibilities:

  • Worked on DB2 and SQL for FSDB.
  • Involved in source analysis and Inventory Phase of the Project.
  • Involved in Coding new modules, bug fixing, testing of Jobs and ABEND handling.
  • Involved in preparation of project report and took KT for new members of the team.
  • Involved in unit testing, system testing, UAT, integration testing, regression testing and deployments.
  • Involved in distribution & management of project work with other vendors like Accenture.

Confidential

Application Developer

Responsibilities:

  • Worked on DB2 and SQL.
  • Actively participated in giving demos to clients during their visits offshore.
  • As this project required converting Assembler to C, prepared understanding documents covering the entire logic of the Assembler program modules, along with approach documents.
  • Involved in converting mainframe CICS programs to TXSeries (UNIX).
  • Involved in all phases of the CICS conversion (both development and testing).
  • Involved in unit testing and bug fixing of CICS modules.
  • Gave KT to new team members.
  • Involved in analysis of all the CICS modules to derive test data, as no test data was available.
