
Hadoop Developer/ Administrator Resume


Memphis, TN

SUMMARY

  • 7+ years of experience with emphasis on Big Data technologies and the development, administration, and design of Java-based enterprise applications.
  • More than two years of experience in Hadoop development/administration and five years of Java application development.
  • Experience in installation, configuration, supporting and managing Hadoop clusters.
  • Involved in setting up standards and processes for Hadoop-based application design and implementation.
  • Responsible for writing MapReduce programs.
  • Logical implementation of, and interaction with, HBase.
  • Developed MapReduce jobs to automate transfer of data from HBase.
  • Performed data analysis using Hive and Pig.
  • Loaded log data into HDFS using Flume.
  • Gained good knowledge of creating strategies for handling risky transactions.
  • Successfully loaded files to Hive and HDFS from MongoDB.
  • Worked in multiple environments on installation and configuration.
  • Documented and explained implemented processes and configurations during upgrades.
  • Supported development, testing, and operations teams during new system deployments.
  • Evaluated and proposed new tools and technologies to meet the needs of the organization.
  • Experience in using Sqoop, ZooKeeper and Cloudera Manager.
  • Good knowledge of Hadoop cluster architecture and of monitoring the cluster.
  • Experience in administering, installing, configuring, troubleshooting, securing, backing up, performance monitoring, and fine-tuning Red Hat Linux.
  • Worked with debugging tools such as DTrace, truss, and top. Expert in setting up SSH, SCP, and SFTP connectivity between UNIX hosts.
  • An excellent team player and self-starter with good communication skills and proven abilities to finish tasks before target deadlines.
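The SSH/SCP/SFTP connectivity mentioned above is typically set up with a key pair and `ssh-copy-id`; a minimal sketch, assuming OpenSSH, with the key path and remote host as illustrative placeholders:

```shell
#!/bin/sh
# Generate a fresh Ed25519 key pair with no passphrase
# (the key path below is illustrative).
rm -f /tmp/demo_key /tmp/demo_key.pub
ssh-keygen -t ed25519 -N '' -f /tmp/demo_key -q

# Install the public key on the remote host (commented out because
# "user@remote-host" is a placeholder, not a real machine):
# ssh-copy-id -i /tmp/demo_key.pub user@remote-host

# Once the key is installed, scp and sftp reuse it unchanged:
# scp -i /tmp/demo_key report.txt user@remote-host:/tmp/
ls /tmp/demo_key /tmp/demo_key.pub
```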

TECHNICAL SKILLS

Programming Languages: Java 1.4, C++, C, SQL, Pig, PL/SQL.

Java Technologies: JDBC.

Frameworks: Jakarta Struts 1.1, JUnit, JTest, LDAP.

Databases: Oracle 8i/9i, NoSQL (HBase), MySQL, MS SQL Server.

IDEs &amp; Utilities: Eclipse, JCreator, NetBeans.

Web Dev. Technologies: HTML, XML.

Protocols: TCP/IP, HTTP and HTTPS.

Operating Systems: Linux, Mac OS, Windows 98/2000/NT/XP.

Hadoop ecosystem: Hadoop, MapReduce, Sqoop, Hive, Pig, HBase, HDFS, ZooKeeper, Lucene, Sun Grid Engine administration

PROFESSIONAL EXPERIENCE

Confidential, Memphis, TN

Hadoop Developer/ Administrator

Responsibilities:

  • Worked on analyzing the Hadoop cluster and different big data analytic tools, including Pig, HBase, and Sqoop.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Implemented a nine-node CDH3 Hadoop cluster on Red Hat Linux.
  • Involved in loading data from the Linux file system to HDFS.
  • Created HBase tables to store various data formats of PII data coming from different portfolios.
  • Implemented test scripts to support test driven development and continuous integration.
  • Worked on tuning the performance of Pig queries.
  • Worked with application teams to install operating system, Hadoop updates, patches, version upgrades as required.
  • Responsible for managing data coming from different sources.
  • Involved in loading data from UNIX file system to HDFS.
  • Managed cluster coordination services through ZooKeeper.
  • Experience in managing and reviewing Hadoop log files.
  • Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Installed the Oozie workflow engine to run multiple Hive and Pig jobs.
  • Analyzed large amounts of data sets to determine optimal way to aggregate and report on it.
  • Supported in setting up QA environment and updating configurations for implementing scripts with Pig and Sqoop.
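Sqoop exports of the kind described above (analyzed HDFS data pushed to a relational database for the BI team) usually reduce to a single invocation; a sketch, with the connection string, credentials, table, and export directory all illustrative placeholders:

```shell
# Export an HDFS directory into a relational table for BI reporting
# (all host, database, and table names below are placeholders).
sqoop export \
  --connect jdbc:mysql://dbhost:3306/reporting \
  --username bi_user -P \
  --table daily_summary \
  --export-dir /user/hive/warehouse/daily_summary \
  --input-fields-terminated-by '\001'
```

This is a configuration fragment rather than a runnable script: it needs a live Hadoop cluster and database to execute.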

Environment: Hadoop, HDFS, Pig, Sqoop, HBase, Shell Scripting, Ubuntu, Linux Red Hat.

Confidential, Milwaukee, WI

Hadoop Developer/ Administrator

Responsibilities:

  • Involved in review of functional and non-functional requirements.
  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and pre-processing.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Experienced in defining job flows.
  • Experienced in managing and reviewing Hadoop log files.
  • Extracted files from CouchDB through Sqoop, placed them in HDFS, and processed them.
  • Experienced in running Hadoop streaming jobs to process terabytes of XML-format data.
  • Loaded and transformed large sets of structured, semi-structured and unstructured data.
  • Responsible for managing data coming from different sources.
  • Gained good experience with NoSQL databases.
  • Involved in loading data from UNIX file system to HDFS.
  • Installed and configured Hive and wrote Hive UDFs.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • Gained very good business knowledge of health insurance, claim processing, fraud suspect identification, the appeals process, etc.
  • Developed a custom file system plug-in for Hadoop so it can access files on the Data Platform.
  • This plug-in allows Hadoop MapReduce programs, HBase, Pig and Hive to work unmodified and access files directly.
  • Designed and implemented a MapReduce-based large-scale parallel relation-learning system.
  • Extracted feeds from social media sites such as Facebook and Twitter using Python scripts.
  • Set up and benchmarked Hadoop/HBase clusters for internal use.
  • Set up a Hadoop cluster on Amazon EC2 using Whirr for a POC.
  • Wrote a recommendation engine using Mahout.

Environment: Java 6 (JDK 1.6), Eclipse, Subversion, Hadoop (Hortonworks and Cloudera distributions), MapReduce, HDFS, Hive, HBase, DataStax, IBM DataStage 8.1, Oracle 11g/10g, PL/SQL, SQL*Plus, Toad 9.6, Windows NT, Linux, UNIX shell scripting.

Confidential, MA

Hadoop and Java Developer

Responsibilities:

  • Worked with several clients with day to day requests and responsibilities.
  • Installed/configured/maintained Apache Hadoop clusters for application development, along with Hadoop tools such as Hive, Pig, HBase, ZooKeeper and Sqoop.
  • Involved in analyzing system failures, identifying root causes, and recommending courses of action.
  • Worked on Hive for exposing data for further analysis and for generating and transforming files from different analytical formats to text files.
  • Wrote shell scripts to monitor the health of Hadoop daemon services and respond to any warning or failure conditions.
  • Managed and scheduled jobs on the Hadoop cluster.
  • Utilized Java and MySQL daily to debug and fix issues with client processes.
  • Developed, tested, and implemented a financial-services application to bring multiple clients into a standard database format.
  • Assisted in designing, building, and maintaining a database to analyze the life cycle of checking and debit transactions.
  • Excellent Java/J2EE application development skills with strong experience in object-oriented analysis; extensively involved throughout the Software Development Life Cycle (SDLC).
  • Strong experience with J2SE, XML, Web Services, WSDL, SOAP, UDDI, and TCP/IP.
  • Strong experience in software and system development using JSP, Servlets, JavaServer Faces, EJB, JDBC, JNDI, Struts, Maven, Trac, Subversion, JUnit, and SQL.
  • Rich experience in database design and hands-on experience with large database systems: Oracle 8i/9i, DB2, and PL/SQL.
  • Hands-on experience with Sun ONE Application Server, WebLogic Application Server, WebSphere Application Server, WebSphere Portal Server, and J2EE application deployment technology.
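The daemon health check mentioned in the responsibilities above can be sketched as a small shell script; the service names are illustrative, pgrep stands in for a jps-based check, and the "respond" step is left as an echo placeholder:

```shell
#!/bin/sh
# Check that each expected Hadoop daemon has a running process;
# print a warning for any that are missing (service names illustrative).
for svc in NameNode DataNode ResourceManager; do
    if pgrep -f "$svc" > /dev/null 2>&1; then
        echo "OK: $svc is running"
    else
        echo "WARNING: $svc is not running"
    fi
done
```

In practice a script like this would run from cron and mail or page on the warning branch rather than just echoing.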

Environment: Hive, Pig, HBase, ZooKeeper, Sqoop, Java, JDBC, JNDI, Struts, Maven, Trac, Subversion, SQL, Spring, Hibernate, JUnit, Oracle, XML, Altova XMLSpy, PuTTY and Eclipse.

Confidential, New York, NY

Java/JEE Architect/developer

Responsibilities:

  • Architected a 24x7 web application based on JSF, WebSphere, Oracle, Spring, and Hibernate.
  • Built an end-to-end vertical slice for a JEE-based billing application using popular frameworks such as Spring, Hibernate, JSF, Facelets, XHTML, Maven2, and Ajax, applying OO design concepts, JEE &amp; GoF design patterns, and best practices.
  • Integrated other sub-systems, such as the loans application, the equity markets online application system, and the documentation system, with the structured products application through JMS, WebSphere MQ, SOAP-based Web services, and XML.
  • Designed the logical and physical data model, generated DDL scripts, and wrote DML scripts for Oracle 9i database.
  • Tuned SQL statements, Hibernate mappings, and the WebSphere application server to improve performance and consequently meet the SLAs.
  • Gathered business requirements and wrote functional specifications and detailed design documents.
  • Improved the build process by migrating it from Ant to Maven2.
  • Built and deployed Java applications into multiple Unix based environments and produced both unit and functional test results along with release notes.

Environment: Java 1.5, JSF Sun RI, Facelets, Ajax4JSF, RichFaces, Spring, XML, XSL, XSD, XHTML, Hibernate, Oracle 9i, PL/SQL, MINA, Spring-WS, SOAP Web services, WebSphere, JMX, Ant, Maven2, Continuum, JUnit, SVN, TDD, and XP.

Confidential

Java/J2EE developer

Responsibilities:

  • Designed and developed a Struts-like MVC 2 web framework using the front-controller design pattern, which is used successfully in a number of production systems.
  • Spearheaded the “Quick Wins” project by working very closely with the business and end users to improve the current website’s ranking from 23rd to 6th in just 3 months.
  • Normalized the Oracle database, conforming to design concepts and best practices.
  • Resolved product complications at customer sites and funneled the insights to the development and deployment teams to shape a long-term product development strategy with minimal roadblocks.
  • Convinced business users and analysts of alternative solutions that are more robust and simpler to implement from a technical perspective while satisfying the functional requirements from the business perspective.
  • Applied design patterns and OO design concepts to improve the existing Java/JEE-based code base.
  • Identified and fixed transactional issues caused by incorrect exception handling and concurrency issues caused by unsynchronized blocks of code.
Environment: Java 1.2/1.3, Swing, Applets, Servlets, JSP, custom tags, JNDI, JDBC, XML, XSL, DTD, HTML, CSS, JavaScript, Oracle, DB2, PL/SQL, WebLogic, JUnit, Log4j and CVS.
