Hadoop Developer Resume Profile
PROFESSIONAL SUMMARY:
- Over 7 years of professional IT experience, including experience with Big Data ecosystem technologies.
- 2 years of hands-on experience with Hadoop, HDFS, the MapReduce framework, and Hadoop ecosystem tools such as Hive, HBase, Sqoop, and Oozie.
- Excellent understanding of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
- Hands-on experience in installing, configuring, and using Hadoop components such as MapReduce, HDFS, HBase, Hive, Sqoop, Pig, ZooKeeper, and Flume.
- Good exposure to Hive, Pig scripting, distributed applications, and HDFS.
- In-depth understanding of data structures and algorithms.
- Experience in managing and reviewing Hadoop log files.
- Excellent understanding and knowledge of NoSQL databases such as MongoDB, HBase, and Cassandra.
- Involved in setting up standards and processes for Hadoop-based application design and implementation.
- Experience in importing and exporting data between HDFS and relational database systems using Sqoop.
- Experience in Object-Oriented Analysis and Design (OOAD) and software development using UML methodology; good knowledge of J2EE and core Java design patterns.
- Experience in managing Hadoop clusters using Cloudera Manager tool.
- Very good experience in the complete project life cycle (design, development, testing, and implementation) of client-server and web applications.
- Experience in administration, installation, configuration, troubleshooting, security, backup, performance monitoring, and fine-tuning of Red Hat Linux.
- Extensive experience working with Oracle, DB2, SQL Server, and MySQL databases.
- Hands-on experience with VPN, PuTTY, WinSCP, VNC Viewer, etc.
- Scripting to deploy monitors, checks, and critical system administration functions for automation.
- Hands-on experience in application development using Java, RDBMS, and Linux shell scripting.
- Experience in Java, JSP, Servlets, WebLogic, WebSphere, Hibernate, Spring, JBoss, JDBC, RMI, JavaScript, Ajax, jQuery, XML, and HTML.
- Ability to adapt to evolving technology, strong sense of responsibility and accomplishment.
TECHNICAL SKILLS:
Big Data Ecosystem: HDFS, HBase, Hadoop MapReduce, Zookeeper, Hive, Pig, Sqoop, Flume, Oozie, Cassandra
Languages: C, C++, Java, PHP, SQL, PL/SQL
Methodologies: Agile, Waterfall
Databases: Oracle 10g, DB2, MySQL, MongoDB, CouchDB, MS SQL Server, Amazon EC2
Web Tools: HTML, JavaScript, XML, ODBC, JDBC, JavaBeans, EJB, MVC, Ajax, JSP, Servlets, JavaMail, Struts, JUnit
IDE/Testing Tools: Eclipse
Operating Systems: Windows, UNIX, Linux
Scripts: JavaScript, Shell scripting
PROFESSIONAL EXPERIENCE:
Confidential
Hadoop Developer
Responsibilities:
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing (a representative sketch appears after this list).
- Installed and configured Pig and developed Pig Latin scripts.
- Involved in managing and reviewing Hadoop log files.
- Exported data from HDFS to Teradata using Sqoop on a regular basis (see the example command after this list).
- Developed scripts and batch jobs to schedule various Hadoop programs.
- Wrote Hive queries for data analysis to meet business requirements.
- Created Hive tables and worked on them using HiveQL (see the sketch after this list).
- Experienced in defining job flows.
- Gained good experience with NoSQL databases such as MongoDB.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Designed and implemented a MapReduce-based, large-scale parallel relation-learning system.
- Set up and benchmarked Hadoop clusters for internal use.
- Monitored the log flow from LM Proxy to ES-Head.
- Used secportal as the front end of Gracie, where we performed search operations.
- Wrote the MapReduce code for the flow from Hadoop Flume to ES-Head.
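The data-cleaning MapReduce work above can be illustrated with a minimal map-only job; the class name, delimiter, expected field count, and paths below are hypothetical, not details from the original project:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

/** Map-only job that drops malformed CSV records and trims whitespace. */
public class CleanRecordsJob {

    public static class CleanMapper
            extends Mapper<LongWritable, Text, NullWritable, Text> {

        private static final int EXPECTED_FIELDS = 5; // hypothetical record layout

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String line = value.toString().trim();
            // Skip blank lines and rows with the wrong column count.
            if (line.isEmpty() || line.split(",", -1).length != EXPECTED_FIELDS) {
                return;
            }
            context.write(NullWritable.get(), new Text(line));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = new Job(new Configuration(), "clean-records");
        job.setJarByClass(CleanRecordsJob.class);
        job.setMapperClass(CleanMapper.class);
        job.setNumReduceTasks(0); // map-only: cleaned rows go straight to HDFS
        job.setOutputKeyClass(NullWritable.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```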
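The recurring Sqoop export to Teradata would be driven by a command along these lines; the connection string, table, and directory are placeholders, since the real job parameters are not part of the resume:

```sh
sqoop export \
  --connect jdbc:teradata://td-host/DATABASE=sales \
  --driver com.teradata.jdbc.TeraDriver \
  --username etl_user -P \
  --table DAILY_ORDERS \
  --export-dir /user/hive/warehouse/daily_orders \
  --input-fields-terminated-by '\t' \
  -m 4
```

A command like this is typically wrapped in a shell script and scheduled (for example via cron or Oozie) to run on the regular cadence mentioned above.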
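The Hive table and query work follows the usual HiveQL pattern sketched below; the schema and file layout are illustrative only:

```sql
-- Hypothetical table over tab-delimited files in HDFS.
CREATE TABLE IF NOT EXISTS daily_orders (
  order_id   BIGINT,
  customer   STRING,
  amount     DOUBLE,
  order_date STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE;

LOAD DATA INPATH '/user/etl/cleaned/orders' INTO TABLE daily_orders;

-- Hive compiles an aggregation like this into MapReduce jobs on the cluster.
SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
FROM daily_orders
GROUP BY order_date;
```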
Environment: Hadoop, MapReduce, HDFS, Hive, Java 6, Cloudera Hadoop distribution, Pig, MongoDB, Linux, XML, MySQL, MySQL Workbench, Eclipse, PL/SQL, SQL connector, Subversion.
Confidential
Hadoop Developer
Responsibilities:
- Involved in review of functional and non-functional requirements.
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Installed and configured Pig and developed Pig Latin scripts.
- Involved in managing and reviewing Hadoop log files.
- Imported and exported data between Teradata and HDFS using Sqoop on a regular basis.
- Developed scripts and batch jobs to schedule various Hadoop programs.
- Prepared Avro schema files for generating Hive tables (see the table definition sketch after this list).
- Created Hive tables and worked on them using HiveQL.
- Experienced in defining job flows.
- Good exposure to the NoSQL database HBase.
- Developed custom UDFs in Pig (a sketch follows this list).
- Prepared shell scripts to run Hadoop commands in a single execution.
- Extracted feeds from social media sites such as Facebook and Twitter using Python scripts.
- Set up and benchmarked Hadoop/HBase clusters for internal use.
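Driving a Hive table from an Avro schema file, as mentioned above, typically looks like the following on CDH4-era Hive; the table name, location, and schema URL are hypothetical:

```sql
CREATE EXTERNAL TABLE user_events
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS
  INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
  OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION '/data/events/avro'
TBLPROPERTIES ('avro.schema.url'='hdfs:///schemas/user_events.avsc');
```

The column definitions come from the .avsc schema file rather than the DDL, so regenerating the schema file updates the table layout.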
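A custom Pig UDF of the kind mentioned above is a Java class extending EvalFunc; the name and behavior here are invented for illustration:

```java
package com.example.pig; // hypothetical package

import java.io.IOException;

import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

/**
 * Lower-cases and trims a chararray field. Called from Pig Latin roughly as:
 *   REGISTER my-udfs.jar;
 *   clean = FOREACH raw GENERATE com.example.pig.NormalizeText(comment);
 */
public class NormalizeText extends EvalFunc<String> {
    @Override
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0 || input.get(0) == null) {
            return null; // propagate nulls instead of failing the task
        }
        return input.get(0).toString().trim().toLowerCase();
    }
}
```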
Environment: Hadoop CDH4.1.1, Pig 0.9.1, Avro, Oozie 3.2.0, Sqoop, Hive, Java 6, Eclipse, Teradata.
Confidential
Sr. Java Developer
Responsibilities:
- Responsible for requirement gathering and analysis through interaction with end users.
- Involved in designing use-case diagrams, class diagrams, and interaction diagrams using UML with Rational Rose.
- Designed and developed the application using various design patterns, such as Session Facade, Business Delegate, and Service Locator.
- Worked on Maven build tool.
- Involved in developing JSP pages using Struts custom tags, jQuery, and the Tiles Framework.
- Used JavaScript for client-side validations and the Struts Validator Framework for server-side validation.
- Good experience in Mule development.
- Developed web applications with rich Internet application features using Java applets, Silverlight, and JavaFX.
- Involved in creating database SQL and PL/SQL queries and stored procedures.
- Implemented singleton classes for loading properties and static data from the database (a sketch follows this list).
- Debugged and developed applications using Rational Application Developer (RAD).
- Developed a Web service to communicate with the database using SOAP.
- Developed DAOs (data access objects) using Spring Framework 3 (see the sketch after this list).
- Deployed the components into WebSphere Application Server 7.
- Actively involved in backend tuning of SQL queries and DB scripts.
- Wrote commands using UNIX shell scripting.
- Involved in developing other subsystems' server-side components.
- Provided production support using IBM ClearQuest for fixing bugs.
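A singleton property loader of the sort described above commonly follows this shape; the file name and accessor are illustrative:

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

/** Loads configuration once on first class use and caches it. */
public final class AppProperties {

    private static final AppProperties INSTANCE = new AppProperties();

    private final Properties props = new Properties();

    private AppProperties() {
        InputStream in = AppProperties.class.getResourceAsStream("/app.properties");
        try {
            if (in != null) {
                props.load(in);
            }
        } catch (IOException e) {
            throw new IllegalStateException("Cannot load app.properties", e);
        } finally {
            if (in != null) {
                try { in.close(); } catch (IOException ignored) { }
            }
        }
    }

    public static AppProperties getInstance() {
        return INSTANCE;
    }

    public String get(String key) {
        return props.getProperty(key);
    }
}
```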
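A Spring 3 DAO as mentioned above is typically a thin wrapper over JdbcTemplate; the table and column names here are placeholders, not details from the original system:

```java
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.List;

import javax.sql.DataSource;

import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;

/** Hypothetical DAO built on Spring's JdbcTemplate. */
@Repository
public class AccountDao {

    private final JdbcTemplate jdbcTemplate;

    public AccountDao(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    public List<String> findAccountNames(String branch) {
        return jdbcTemplate.query(
                "SELECT name FROM accounts WHERE branch = ?",
                new RowMapper<String>() {
                    public String mapRow(ResultSet rs, int rowNum) throws SQLException {
                        return rs.getString("name");
                    }
                },
                branch);
    }
}
```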
Environment: Java EE 6, IBM WebSphere Application Server 7, Apache Struts 2.0, EJB 3, Spring 3.2, JSP 2.0, Web Services, jQuery 1.7, Servlet 3.0, Struts Validator, Struts Tiles, Tag Libraries, ANT 1.5, JDBC, Oracle 11g/SQL, JUnit 3.8, CVS 1.2, Rational ClearCase, Eclipse 4.2, JSTL, DHTML.
Confidential
Sr. Java/J2EE Interface Developer
Responsibilities:
- Created use-case diagrams, sequence diagrams, functional specifications, and user-interface diagrams using StarUML.
- Involved in complete requirement analysis, design, coding and testing phases of the project.
- Participated in JAD meetings to gather requirements and understand the end users' system.
- Developed user interfaces using JSP, HTML, XML and JavaScript.
- Generated XML schemas and used XMLBeans to parse XML files.
- Created stored procedures and functions; used JDBC to process database calls for DB2/AS400 and SQL Server databases.
- Developed code to create XML files and flat files from data retrieved from databases and XML files (a sketch follows this list).
- Created data sources and helper classes used by all the interfaces to access and manipulate data.
- Developed a web application called iHUB (integration hub) to initiate all the interface processes, using the Struts Framework, JSP, and HTML.
- Developed the interfaces using Eclipse 3.1.1 and JBoss 4.1; involved in integration testing, bug fixing, and production support.
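The XML file generation from database data described above boils down to a JDBC read plus a writer; the connection details, query, and element names below are invented for illustration (real code should also escape attribute values):

```java
import java.io.FileWriter;
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

/** Reads rows over JDBC and writes them out as a simple XML document. */
public class OrderXmlExport {
    public static void main(String[] args) throws Exception {
        Connection con = DriverManager.getConnection(
                "jdbc:db2://host:50000/ORDERS", "user", "password");
        Statement st = con.createStatement();
        ResultSet rs = st.executeQuery("SELECT id, status FROM orders");

        PrintWriter out = new PrintWriter(new FileWriter("orders.xml"));
        out.println("<?xml version=\"1.0\"?>");
        out.println("<orders>");
        while (rs.next()) {
            out.println("  <order id=\"" + rs.getLong("id")
                    + "\" status=\"" + rs.getString("status") + "\"/>");
        }
        out.println("</orders>");

        out.close();
        rs.close();
        st.close();
        con.close();
    }
}
```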
Environment: Java 1.3, Servlets, JSPs, JavaMail API, JavaScript, HTML, Spring Batch XML processing, MySQL 2.1, Swing, Java Web Server 2.0, JBoss 2.0, RMI, Rational Rose, Red Hat Linux 7.1.