Hadoop Developer Resume
TX
EXPERIENCE SUMMARY
- Close to 6 years of experience in the design and development of enterprise solutions. Proficient in analyzing, conceptualizing and implementing web applications using Java/JEE and Big Data technologies. In-depth exposure to web application frameworks, web services and the big data stack, including Hadoop. Excellent analytical and communication skills.
- Proficient in Java, with a good understanding of Object-Oriented concepts and the development of multi-tier enterprise web applications.
- Exposure to all phases of the Software Development Life Cycle, including requirement gathering, design, development, testing and support.
- Hands-on experience with all the major components of the Hadoop ecosystem, such as HDFS, YARN, Hive, Flume, Sqoop, Oozie and other related components.
- Installed, configured and monitored Hadoop Clusters using Cloudera (CDH5) distribution.
- Excellent understanding of Hadoop architecture and Spark.
- Experience in ingesting RDBMS data to HDFS using Sqoop and unstructured data from message queues to HDFS using Flume.
- Developed MapReduce programs and wrote Hive and Pig scripts for data cleansing, preprocessing and transformation (a simplified sketch follows this summary).
- Experienced in managing and scheduling workflows for Sqoop, shell, Hive and MapReduce jobs using Oozie.
- Extensive experience in developing web applications using various Java and Java EE technologies, including JSP, Struts 2.x, Web Services (REST), Spring 3.x, Hibernate 3.x, ANT 1.7, Maven, log4j, XML binding (JAXB) and jQuery.
- Expertise in implementing MVC architecture using the Struts2 framework for designing, developing and deploying web applications in conjunction with Spring IoC and AOP.
- Experience working with Hibernate 3.2 as the back-end Object-Relational Mapping (ORM) tool for mapping Java classes to database tables using HQL (Hibernate Query Language).
- Commendable experience working with rapid application development tools like Eclipse, and hands-on experience in building front ends using HTML, CSS, JavaScript, jQuery and Ajax.
- Strong skills in developing REST web services using the Jersey implementation and Spring.
- Designed and developed test cases to test business logic using unit testing tools such as JUnit and Mockito.
- Hands-on experience working with databases like Oracle, MySQL and Vertica. Hands-on experience in writing SQL queries and stored procedures.
- Hands-on experience in integrating applications using Mule Enterprise Service Bus and RabbitMQ.
- Experienced with popular version control systems such as SVN, CVS and Git, and build tools like ANT and Maven.
- Experienced in working with both Agile and Waterfall models.
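To make the MapReduce and data-cleansing bullets above concrete, here is a minimal, illustrative map-only cleansing job in Java. It is a sketch only: the class name, the tab-delimited five-field layout and the counter names are assumptions for illustration, not code from the projects described below.

```java
import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

/**
 * Minimal map-only cleansing step: drops malformed records and normalizes
 * whitespace and case before the data is loaded into Hive.
 * The tab-delimited, five-field record layout is assumed for illustration.
 */
public class CleanseRecordMapper
        extends Mapper<LongWritable, Text, NullWritable, Text> {

    private static final int EXPECTED_FIELDS = 5; // assumed layout

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] fields = value.toString().split("\t", -1);

        // Skip records that do not match the expected layout and count them.
        if (fields.length != EXPECTED_FIELDS) {
            context.getCounter("cleansing", "malformed").increment(1);
            return;
        }

        // Trim and lower-case each field as a simple normalization step.
        StringBuilder cleaned = new StringBuilder();
        for (int i = 0; i < fields.length; i++) {
            if (i > 0) {
                cleaned.append('\t');
            }
            cleaned.append(fields[i].trim().toLowerCase());
        }
        context.write(NullWritable.get(), new Text(cleaned.toString()));
    }
}
```

A job of this shape would typically run before the cleaned output is loaded into Hive tables for downstream querying.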
PROFESSIONAL EXPERIENCE
Confidential, TX
Hadoop Developer
Responsibilities:
- Gather requirements by frequently collaborating with the researchers and analytics team.
- Installed a Cloudera-based Hadoop cluster (CDH 5.3) and data visualization tools like Tableau.
- Configured the Hadoop components including Hive, Pig, Sqoop, Oozie, Hue and Spark in the client environment.
- Configured High Availability on the cluster.
- Created Hive tables and loaded healthcare data from Oracle into Hive using Sqoop.
- Developed a custom Flume component to ingest data from Clinic Station (a medical records system) into HDFS via RabbitMQ (a simplified sketch follows this list).
- Wrote MapReduce programs and Pig and Hive scripts to parse, clean and filter clinical data on the cluster.
- Worked on defining the job flows using Oozie and scheduled them in the client environment.
- Involved in creating POCs to ingest and process streaming data using Spark and HDFS.
- Designed and developed REST-based services using Spring and Mule ESB to feed structured data from Vertica (see the REST sketch at the end of this section).
- Coordinated with the offshore team and cross-functional teams to ensure that applications were properly tested, configured and deployed.
- Analyzed large clinical data sets, including genomic, comorbidity, patient demographics and NLP data, to determine the optimal way to aggregate and process them.
- Followed the Agile model and participated in the daily scrum meetings.
- Involved in requirements and design analysis to determine the best approach to be adopted.
- Designed and developed applications using Java, J2EE, JSF, Struts, Spring and Hibernate.
- Implemented MVC architecture using the Struts2 framework.
- Extensively used Spring DI, Hibernate and core Java features such as exceptions and collections.
- Developed the presentation layer code using jQuery, CSS, Ajax and HTML.
- Used JavaScript for validation of page data in the HTML pages.
- Implemented PDF download functionality using the BFO PDF library.
- Extensively involved in application development, production support and issue resolution.
- Wrote unit test cases to test the business logic using JUnit and Mockito.
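As a hedged illustration of what a custom Flume component looks like: the production component consumed Clinic Station records from RabbitMQ, whereas the sketch below only shows the shape of a Flume plugin, using a simple interceptor that drops empty events and tags the rest. The class name, header key and default value are invented for illustration.

```java
import java.util.ArrayList;
import java.util.List;

import org.apache.flume.Context;
import org.apache.flume.Event;
import org.apache.flume.interceptor.Interceptor;

/**
 * Minimal custom Flume interceptor: drops empty events and tags the rest
 * with a source-system header. Header name and default value are assumed
 * for illustration only.
 */
public class SourceTagInterceptor implements Interceptor {

    private final String sourceSystem;

    private SourceTagInterceptor(String sourceSystem) {
        this.sourceSystem = sourceSystem;
    }

    @Override
    public void initialize() {
        // No state to set up for this simple example.
    }

    @Override
    public Event intercept(Event event) {
        byte[] body = event.getBody();
        if (body == null || body.length == 0) {
            return null; // returning null drops the event
        }
        event.getHeaders().put("source.system", sourceSystem);
        return event;
    }

    @Override
    public List<Event> intercept(List<Event> events) {
        List<Event> kept = new ArrayList<Event>(events.size());
        for (Event event : events) {
            Event intercepted = intercept(event);
            if (intercepted != null) {
                kept.add(intercepted);
            }
        }
        return kept;
    }

    @Override
    public void close() {
        // Nothing to release.
    }

    /** Builder referenced from the Flume agent configuration. */
    public static class Builder implements Interceptor.Builder {
        private String sourceSystem = "clinicstation"; // assumed default

        @Override
        public void configure(Context context) {
            sourceSystem = context.getString("sourceSystem", sourceSystem);
        }

        @Override
        public Interceptor build() {
            return new SourceTagInterceptor(sourceSystem);
        }
    }
}
```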
Skills: Core Java, JEE, JDBC, XML, JSP, Servlets, Hibernate, Struts2, JSON, Spring, JavaScript, Ant, RAD, WebSphere Portal, WebSphere Application Server, SQL, Oracle 10g.
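The REST services above were built with Spring and Mule ESB against Vertica; the sketch below only illustrates the JAX-RS resource style used with Jersey, exposing a single GET endpoint. The path, payload type and in-memory lookup are assumptions standing in for the real Spring-wired, Vertica-backed implementation.

```java
import java.util.HashMap;
import java.util.Map;

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

/**
 * Minimal JAX-RS resource sketch. Path and field names are assumed for
 * illustration; the real services read from Vertica rather than an
 * in-memory map and were wired through Spring and Mule ESB.
 */
@Path("/patients")
@Produces(MediaType.APPLICATION_JSON)
public class PatientResource {

    /** Simple payload type standing in for the real domain object. */
    public static class PatientSummary {
        public String id;
        public String cohort;
    }

    // In-memory stand-in for the Vertica-backed lookup.
    private static final Map<String, PatientSummary> DATA =
            new HashMap<String, PatientSummary>();

    @GET
    @Path("/{id}")
    public Response getPatient(@PathParam("id") String id) {
        PatientSummary summary = DATA.get(id);
        if (summary == null) {
            return Response.status(Response.Status.NOT_FOUND).build();
        }
        return Response.ok(summary).build();
    }
}
```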
Confidential
Hadoop Developer
Responsibilities:
- Set up and configured the Big Data stack, including Hadoop, Hive, Oozie, Pig and Flume.
- Configured Flume agents to collect the log data.
- Developed MapReduce jobs to parse the content from the log files.
- Wrote Hive and Pig scripts to filter, group and export the extracted data, and scheduled the jobs using Oozie.
- Involved in requirements and design analysis for various domains such as Inventory Management, Asset Management, Land and Estate Management and Building Plan Approval.
- Designed and developed JEE applications using Java, J2EE, Struts2, Spring and Hibernate.
- Extensively used Spring DI, Hibernate and core Java features such as exceptions and collections.
- Developed the presentation layer for cross-browser mobile platforms using jQuery and jQuery Mobile.
- Involved in manual testing and automated testing using Watir.
- Wrote unit test cases to test the business logic using JUnit and Mockito (a minimal sketch follows).
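A minimal sketch of the JUnit 4 and Mockito style referenced in the last bullet. The AssetRepository and AssetService types are invented for illustration and defined inline so the example is self-contained; they are not classes from the actual project.

```java
import static org.junit.Assert.assertTrue;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import org.junit.Test;

/**
 * Minimal JUnit 4 + Mockito sketch: the repository collaborator is mocked
 * so the service logic can be tested in isolation.
 */
public class AssetServiceTest {

    /** Collaborator to be mocked (hypothetical, for illustration). */
    interface AssetRepository {
        int countByLocation(String location);
    }

    /** Simple service under test, defined inline to keep the sketch self-contained. */
    static class AssetService {
        private final AssetRepository repository;

        AssetService(AssetRepository repository) {
            this.repository = repository;
        }

        boolean hasAssets(String location) {
            return repository.countByLocation(location) > 0;
        }
    }

    @Test
    public void hasAssetsReturnsTrueWhenRepositoryFindsRecords() {
        AssetRepository repository = mock(AssetRepository.class);
        when(repository.countByLocation("warehouse-1")).thenReturn(3);

        AssetService service = new AssetService(repository);

        assertTrue(service.hasAssets("warehouse-1"));
        verify(repository).countByLocation("warehouse-1");
    }
}
```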