Hadoop Developer Resume
Cupertino, CA
SUMMARY:
- 8 years of experience in software analysis, design, development, debugging, and implementation, including 3+ years in Big Data ecosystem technologies.
- In-depth knowledge of Hadoop architecture: HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
- Experience in installing, configuring, and using Hadoop ecosystem components such as Hadoop MapReduce, HDFS, HBase, Oozie, Hive, Sqoop, Pig, Flume, and Impala.
- Experience in managing and reviewing Hadoop log files and analyzing data using Pig Latin, HiveQL, HBase, and custom MapReduce programs in Java.
- Extended Hive and Pig core functionality by writing custom UDFs (a minimal Hive UDF sketch follows this list).
- Experience in designing, developing, and implementing connectivity products that allow efficient exchange of data between our core database engine and the Hadoop ecosystem.
- Experience in importing and exporting data between HDFS and relational database systems using Sqoop.
- Expertise in Oracle, NoSQL, MySQL, MS SQL Server, SQLite, Crypto API, HTML, DHTML, and other Internet technologies.
- Strong knowledge of Mahout machine learning and MongoDB; experience working with geographically dispersed teams.
- TCP/IP, sockets, multithreaded programming, and IPC-based client/server programming using C/C++.
- Proficient in OOP concepts such as polymorphism, inheritance, and encapsulation.
- Used design patterns such as MVC (Model-View-Controller), Singleton, and Factory.
- Experience using XML and XML parsers.
- A good team player with a strong ability to lead, pick up new skills, and complete the work at hand effectively.
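As a concrete illustration of the custom UDF work noted above, here is a minimal Hive UDF sketch using the classic org.apache.hadoop.hive.ql.exec.UDF API; the package, class, function, and JAR names are hypothetical.

```java
package com.example;

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical UDF that trims and upper-cases a string column.
// Registered in Hive with:
//   ADD JAR my-udfs.jar;
//   CREATE TEMPORARY FUNCTION clean_upper AS 'com.example.CleanUpperUDF';
public class CleanUpperUDF extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null; // preserve SQL NULL semantics
        }
        return new Text(input.toString().trim().toUpperCase());
    }
}
```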
TECHNICAL SKILLS:
Programming Languages: Java 1.4, C++, C, SQL, Pig Latin, and PL/SQL.
Java Technologies: JDBC
Frameworks: Jakarta Struts 1.1, JUnit, JTest, LDAP.
Databases: Oracle 8i/9i, NoSQL (HBase), MySQL, MS SQL Server.
IDEs & Utilities: Eclipse, JCreator, NetBeans
Web Dev. Technologies: HTML, XML.
Protocols: TCP/IP, HTTP and HTTPS.
Operating Systems: Linux, Mac OS, Windows 98/2000/NT/XP.
Hadoop Ecosystem: Hadoop MapReduce, Sqoop, Hive, Pig, HBase, HDFS, ZooKeeper, Lucene, Sun Grid Engine administration
WORK EXPERIENCE:
Confidential, Cupertino, CA
Hadoop Developer
Responsibilities:
- Worked on analyzing the Hadoop cluster and different big data analytic tools, including Pig, the HBase database, and Sqoop.
- Responsible for building scalable distributed data solutions using Hadoop.
- Implemented a nine-node CDH3 Hadoop cluster on Red Hat Linux.
- Involved in loading data from the Linux file system to HDFS.
- Worked on cluster installation, commissioning and decommissioning of DataNodes, NameNode recovery, capacity planning, and slot configuration.
- Created HBase tables to store PII data arriving in variable formats from different portfolios (a minimal table-creation sketch follows this job entry).
- Implemented a script to transmit sysprin information from Oracle to HBase using Sqoop.
- Implemented best income logic using Pig scripts and UDFs.
- Implemented test scripts to support test driven development and continuous integration.
- Worked on tuning the performance of Pig queries.
- Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
- Responsible for managing data coming from different sources.
- Involved in loading data from the UNIX file system to HDFS.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
- Provided cluster coordination services through ZooKeeper.
- Managed and reviewed Hadoop log files.
- Managed jobs using the Fair Scheduler.
- Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
- Responsible for cluster maintenance: adding and removing cluster nodes, monitoring and troubleshooting, and managing and reviewing data backups and Hadoop log files.
- Installed the Oozie workflow engine to run multiple Hive and Pig jobs.
- Analyzed large amounts of data sets to determine optimal way to aggregate and report on it.
- Supported setting up the QA environment and updating configurations for implementing scripts with Pig and Sqoop.
Environment: Java, J2EE, Ruby on Rails, Android, iOS, Hadoop, HBase, Hive, Sqoop, MySQL, HTML5, NDK.
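As an illustration of the HBase table work above, here is a minimal sketch against the HBase 0.90-era Java client API that shipped with CDH3; the table name, column family, and row/column values are hypothetical.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class PiiTableSetup {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();

        // Create a table with one column family for profile data
        // (hypothetical names).
        HBaseAdmin admin = new HBaseAdmin(conf);
        if (!admin.tableExists("pii_records")) {
            HTableDescriptor desc = new HTableDescriptor("pii_records");
            desc.addFamily(new HColumnDescriptor("profile"));
            admin.createTable(desc);
        }

        // Write one row; the row key and column are illustrative only.
        HTable table = new HTable(conf, "pii_records");
        Put put = new Put(Bytes.toBytes("portfolioA#cust-0001"));
        put.add(Bytes.toBytes("profile"), Bytes.toBytes("name"),
                Bytes.toBytes("Jane Doe"));
        table.put(put);
        table.close();
    }
}
```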
Confidential, San Francisco, CA
Hadoop Developer
Responsibilities:
- Built scalable distributed data solutions using Hadoop.
- Worked on analyzing the Hadoop cluster and different big data analytic tools, including Pig, the HBase database, and Sqoop.
- Installed and configured Flume, Hive, Pig, Sqoop, and HBase on the Hadoop cluster.
- Managed and scheduled jobs on a Hadoop cluster.
- Implemented a nine-node CDH3 Hadoop cluster on Red Hat Linux.
- Worked on cluster installation, commissioning and decommissioning of DataNodes, NameNode recovery, capacity planning, and slot configuration.
- Set up a Hadoop cluster on Amazon EC2 using Whirr for a POC.
- Managed Hadoop cluster resources, including adding/removing cluster nodes for maintenance and capacity needs.
- Installed and configured Hive and wrote Hive UDFs.
- Experienced in loading and transforming large sets of structured, semi-structured, and unstructured data (a minimal MapReduce sketch follows this job entry).
- Provided cluster coordination services through ZooKeeper.
Environment: Hadoop, HDFS, Hive, Flume, HBase, Sqoop, Pig, Java (JDK 1.6), Eclipse, MySQL, Ubuntu, ZooKeeper, Amazon EC2
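A load-and-transform step like the one above might look like the following map-only MapReduce sketch (Hadoop 0.20-era API, matching the JDK 1.6/CDH3 stack listed above); the input/output paths and the normalization rule are hypothetical.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Map-only job that trims and lower-cases raw text records before
// they are loaded into Hive/HBase.
public class NormalizeRecords {

    public static class NormalizeMapper
            extends Mapper<LongWritable, Text, NullWritable, Text> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String line = value.toString().trim();
            if (!line.isEmpty()) { // drop blank records
                context.write(NullWritable.get(),
                              new Text(line.toLowerCase()));
            }
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = new Job(new Configuration(), "normalize-records");
        job.setJarByClass(NormalizeRecords.class);
        job.setMapperClass(NormalizeMapper.class);
        job.setNumReduceTasks(0); // map-only: no shuffle, output goes straight to HDFS
        job.setOutputKeyClass(NullWritable.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```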
Confidential, Atlanta, GA
Senior Core Java/J2EE Developer
Responsibilities:
- Installed WebLogic software.
- Developed modules using the Casper Framework.
- Customized enhancements to the GEPS Energy application.
- Built web-based interfaces using Java and JSP.
- Created databases.
- Ran DB scripts such as functions, procedures, and SQL queries.
- Designed and developed the system using Java, MVC, XML, and GUI design.
- Designed the XML format used to exchange data between the front end, middleware, and back end; a DTD was written to define the XML data (a minimal parsing sketch follows this job entry).
- Developed algorithms and coded programs in Core Java.
- Fixed system test defects; involved in setting up integration and testing environments.
Environment: Core Java, Struts 2, polymorphism, threads and multithreading, operator/function overloading, STL, virtual functions, XSLT, XML, Oracle, SQL, PL/SQL.
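A minimal sketch of the XML exchange parsing described above, using the standard JAXP DOM API; the file name, element name, and attribute are hypothetical.

```java
import java.io.File;

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;

import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Parses a front-end/back-end exchange document.
public class ExchangeDocReader {
    public static void main(String[] args) throws Exception {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        factory.setValidating(true); // validate against the DTD declared in the file
        DocumentBuilder builder = factory.newDocumentBuilder();
        Document doc = builder.parse(new File("exchange.xml"));

        // Read each <record> element and print its "id" attribute.
        NodeList records = doc.getElementsByTagName("record");
        for (int i = 0; i < records.getLength(); i++) {
            Element record = (Element) records.item(i);
            System.out.println(record.getAttribute("id"));
        }
    }
}
```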
Confidential, Atlanta, GA
Senior Core Java/J2EE Developer
Responsibilities:
- Involved in requirement gathering, functional and technical specifications.
- Monitored and fine-tuned IDM performance.
- Enhanced the self-registration process.
- Fixed existing bugs in various releases.
- Handled global deployment of the application and coordination between the client, the development team, and the end users.
- Set up users via reconciliation, bulk load, and bulk link in all environments.
- Wrote requirements and detailed design documents, designed architecture for data collection.
- Developed the OMSA UI using MVC architecture, Core Java, Java Collections, JSP, JDBC, servlets, Ant, and XML within Windows and UNIX environments.
- Used Java collection classes such as ArrayList, Vector, HashMap, and Hashtable.
- Used the MVC, Singleton, Factory, and Abstract Factory design patterns (a minimal Singleton sketch follows this job entry).
Environment: Core Java, Struts 2, Waveset Lighthouse 6.x (Sun IDM), polymorphism, threads and multithreading, operator/function overloading, STL, virtual functions, Rational Rose, design patterns, Oracle, SQL, PL/SQL.
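For illustration, a minimal thread-safe Singleton along the lines used above; the class name and the guarded resource are hypothetical.

```java
// Classic lazily initialized Singleton for a shared service.
public final class ConfigRegistry {
    private static ConfigRegistry instance;

    private ConfigRegistry() {
        // load shared configuration here
    }

    // synchronized keeps lazy creation safe under the
    // multithreading noted in the environment above.
    public static synchronized ConfigRegistry getInstance() {
        if (instance == null) {
            instance = new ConfigRegistry();
        }
        return instance;
    }
}
```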
Confidential
Java/J2EE Interface Developer
Responsibilities:
- Created use case diagrams, sequence diagrams, functional specifications, and user interface diagrams using StarUML.
- Developed user interfaces using JSP, HTML, XML and JavaScript.
- Generated XML schemas and used XMLBeans to parse XML files.
- Created stored procedures and functions; used JDBC to process database calls for DB2/AS400 and SQL Server databases (a minimal JDBC sketch follows this job entry).
- Developed code that creates XML files and flat files from data retrieved from databases and XML files.
- Created data sources and helper classes used by all interfaces to access and manipulate data.
- Developed a web application called iHUB (integration hub) to initiate all interface processes using the Struts framework, JSP, and HTML.
Environment: Java 1.3, servlets, JSPs, JavaMail API, JavaScript, HTML, MySQL 2.1, Rational Rose, Red Hat Linux
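A minimal sketch of the JDBC call pattern described above; the AS/400 driver class is the standard JTOpen one, but the connection URL, credentials, table, and columns are hypothetical.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// Looks up orders for a customer over JDBC (illustrative schema).
public class OrderLookup {
    public static void main(String[] args) throws Exception {
        Class.forName("com.ibm.as400.access.AS400JDBCDriver"); // register driver
        Connection conn = DriverManager.getConnection(
                "jdbc:as400://host/lib", "user", "password");
        try {
            PreparedStatement ps = conn.prepareStatement(
                    "SELECT order_id, status FROM orders WHERE customer_id = ?");
            ps.setString(1, "C-1001");
            ResultSet rs = ps.executeQuery();
            while (rs.next()) {
                System.out.println(rs.getString("order_id")
                        + " -> " + rs.getString("status"));
            }
            rs.close();
            ps.close();
        } finally {
            conn.close(); // always release the connection
        }
    }
}
```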