Java Developer Resume
Herndon, VA
SUMMARY
- Over 7 years of experience in Java/J2EE technologies, working through all phases of the Software Development Life Cycle.
- About 2 years of experience in Big Data analytics, with hands-on experience writing MapReduce jobs on the Hadoop ecosystem, including Hive, Pig, HBase, Flume, and Oozie.
- Experience with distributed systems, large-scale non-relational data stores, MapReduce systems, data modeling, and big data systems.
- Professional experience in installing, configuring, and using Apache Hadoop ecosystem components such as Hadoop MapReduce, HDFS, HBase, ZooKeeper, Oozie, Hive, Sqoop, Pig, and Flume.
- Experience with Big Data analysis tools such as Pig and Hive, and a working understanding of Sqoop and Puppet.
- Experience in developing custom UDFs in Java to extend Hive and Pig Latin functionality (a Hive UDF sketch follows this list).
- Strong understanding of NoSQL databases such as MongoDB and HBase.
- Good knowledge of ETL concepts and hands-on ETL experience.
- Extensive experience in middle-tier development using J2EE technologies such as JDBC, JNDI, JSP, Servlets, JSF, Struts, Spring, Hibernate, and EJB.
- Experience with web-based UI development using jQuery UI, jQuery, ExtJS, CSS, HTML, HTML5, XHTML, and JavaScript.
- Worked with Agile methodology and SOA on many applications.
- Experience using XML, XSD and XSLT.
- Good knowledge of Log4j for error logging.
- Developed stored procedures and queries using PL/SQL.
- Expertise in RDBMS like Oracle, MS SQL Server, MySQL and DB2.
- Highly motivated team player with excellent communication, presentation, and interpersonal skills and a zeal to learn new technologies.
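To illustrate the UDF bullet above, here is a minimal sketch of a custom Hive UDF in Java; the class name and the trim/lower-case normalization logic are hypothetical examples rather than code from a specific project:

```java
package com.example;

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Minimal Hive UDF sketch: trims and lower-cases a string column.
// Registered in Hive with, for example:
//   ADD JAR my-udfs.jar;
//   CREATE TEMPORARY FUNCTION normalize AS 'com.example.NormalizeUDF';
public class NormalizeUDF extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null; // pass NULLs through, as Hive expects
        }
        return new Text(input.toString().trim().toLowerCase());
    }
}
```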
TECHNICAL SKILLS
Hadoop/Big Data: HDFS, MapReduce, HBase, Pig, Hive, Sqoop, Flume, MongoDB, Cassandra, PowerPivot, Puppet, Oozie, ZooKeeper
Java & J2EE Technologies: Core Java, Servlets, JSP, JDBC, JNDI, Java Beans
IDEs: Eclipse, NetBeans
Big data Analytics: Datameer 2.0.5
Frameworks: MVC, Struts, Hibernate, Spring
Programming languages: C, C++, Java, Python, Ant scripts, Linux shell scripts
Databases: Oracle 11g/10g/9i, MySQL, DB2, MS SQL Server
Web Servers: WebLogic, WebSphere, Apache Tomcat
Web Technologies: HTML, XML, JavaScript, AJAX, SOAP, WSDL
Network Protocols: TCP/IP, UDP, HTTP, DNS, DHCP
ETL Tools: Informatica, Pentaho
Testing: WinRunner, LoadRunner, QTP
PROFESSIONAL EXPERIENCE
Confidential, Richmond, VA
Hadoop Developer
Environment: Cloudera Hadoop, MapReduce, HDFS, Hive, Java (JDK 1.7), Pig, Linux, XML, HBase, ZooKeeper, Sqoop.
Responsibilities:
- Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded the results into HDFS, and extracted data from MySQL into HDFS using Sqoop.
- Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
- Managed and scheduled jobs on a Hadoop cluster.
- Developed simple to complex MapReduce jobs using Hive.
- Analyzed the data with Hive queries and Pig scripts to understand user behavior.
- Optimized MapReduce jobs to use HDFS efficiently through various compression mechanisms (see the driver sketch after this list).
- Implemented partitioning and bucketing in Hive.
- Mentored analysts and the test team in writing Hive queries.
- Managed and reviewed Hadoop log files.
- Extensively used Pig for data cleansing.
- Configured Flume to extract the data from the web server output files to load into HDFS.
- Developed Pig UDFs to pre-process the data for analysis.
- Developed workflows in Oozie to automate loading data into HDFS and pre-processing it with Pig.
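In the spirit of the compression bullet above, a minimal sketch of a MapReduce driver that enables Snappy compression for both the intermediate (shuffle) output and the final job output; the word-count-style mapper/reducer pair and the job name are hypothetical:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.SnappyCodec;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Word-count-style job with Snappy compression enabled for both the
// intermediate map output (shuffle) and the final output on HDFS.
public class CompressedJobDriver {

    public static class TokenMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                word.set(token);
                ctx.write(word, ONE);
            }
        }
    }

    public static class CountReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            ctx.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Compress intermediate map output to cut shuffle I/O.
        conf.setBoolean("mapreduce.map.output.compress", true);
        conf.setClass("mapreduce.map.output.compress.codec",
                SnappyCodec.class, CompressionCodec.class);

        Job job = Job.getInstance(conf, "compressed-analytics-job");
        job.setJarByClass(CompressedJobDriver.class);
        job.setMapperClass(TokenMapper.class);
        job.setReducerClass(CountReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // Compress the final output written to HDFS as well.
        FileOutputFormat.setCompressOutput(job, true);
        FileOutputFormat.setOutputCompressorClass(job, SnappyCodec.class);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```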
Confidential, Herndon, VA
Hadoop Developer
Environment: Cloudera Hadoop, Linux, HDFS, Hive, Sqoop, Flume, ZooKeeper, HBase
Responsibilities:
- Installed and configured a multi-node, fully distributed Hadoop cluster.
- Involved in installing Hadoop Ecosystem components.
- Responsible for managing data coming from different sources.
- Involved in Hadoop cluster administration, including adding and removing cluster nodes.
- Supported MapReduce programs running on the cluster.
- Involved in HDFS maintenance, administering it through the Hadoop Java API (see the sketch after this list).
- Configured Fair Scheduler to provide service-level agreements for multiple users of a cluster.
- Maintained and monitored clusters; loaded data into the cluster from dynamically generated files using Flume and from relational database management systems using Sqoop.
- Managed Hadoop cluster node connectivity and security.
- Resolved configuration issues with Apache add-on tools.
- Used Pig as an ETL tool to perform transformations, event joins, traffic filtering, and pre-aggregations before storing the data in HDFS.
- Wrote Flume and Hive scripts to extract, transform, and load the data into the database, and was involved in cluster capacity planning, performance tuning, cluster monitoring, and troubleshooting.
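A minimal sketch of the kind of HDFS maintenance done through the Hadoop Java API, as mentioned above; the landing directory and the zero-byte-file cleanup rule are hypothetical:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// HDFS housekeeping sketch using the FileSystem API: lists a landing
// directory and deletes zero-byte files left behind by failed loads.
public class HdfsHousekeeping {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration(); // reads core-site.xml/hdfs-site.xml
        FileSystem fs = FileSystem.get(conf);

        Path landing = new Path("/data/landing"); // hypothetical directory
        for (FileStatus status : fs.listStatus(landing)) {
            if (status.isFile() && status.getLen() == 0) {
                fs.delete(status.getPath(), false); // non-recursive delete
            }
        }
        fs.close();
    }
}
```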
Confidential, MD
Java/J2EE Developer
Environment: Java 1.4, Struts, JSP, Servlets API, HTML, JDBC, WebSphere 5.1, MQSeries, MS SQL Server, XSLT, XML, EJB, EditPlus, JUnit, CSS, JMS, Hibernate, Eclipse, and WSAD
Responsibilities:
- Responsible for the design and development of the framework; the system was designed using J2EE technologies based on the MVC architecture.
- Developed Session Beans using J2EE Design Patterns.
- Implemented J2EE Design patterns like Data Access Objects, Business Objects, and Java Design Patterns like Singleton.
- Extensively used MQSeries (see the JMS sketch after this list).
- Made extensive use of the Struts framework.
- Used JSP and Servlets, EJBs on server side.
- Implemented Home Interface, Remote Interface, and Bean Implementation class.
- Implemented business logic at server side using Session Bean.
- Wrote PL/SQL queries to access data from the Oracle database.
- Set up the WebSphere application server and used Ant to build and deploy the application on WebSphere.
- Developed the application using WSAD.
- Prepared test plans and wrote test cases.
- Worked on Hibernate.
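MQSeries is typically driven through the standard JMS API; here is a minimal point-to-point sender sketch under that assumption, with hypothetical JNDI names that would normally be bound in the application server:

```java
import javax.jms.Queue;
import javax.jms.QueueConnection;
import javax.jms.QueueConnectionFactory;
import javax.jms.QueueSender;
import javax.jms.QueueSession;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.InitialContext;

// JMS point-to-point sender sketch: looks up the MQ-backed connection
// factory and queue from JNDI and sends a single text message.
public class OrderMessageSender {
    public void send(String payload) throws Exception {
        InitialContext ctx = new InitialContext();
        QueueConnectionFactory factory =
                (QueueConnectionFactory) ctx.lookup("jms/QCF"); // hypothetical JNDI name
        Queue queue = (Queue) ctx.lookup("jms/OrderQueue");     // hypothetical JNDI name

        QueueConnection connection = factory.createQueueConnection();
        try {
            QueueSession session =
                    connection.createQueueSession(false, Session.AUTO_ACKNOWLEDGE);
            QueueSender sender = session.createSender(queue);
            TextMessage message = session.createTextMessage(payload);
            sender.send(message);
        } finally {
            connection.close(); // also closes the session and sender
        }
    }
}
```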
Confidential
Java Developer
Environment: Oracle 11g, Java 1.5, Struts, Servlets, HTML, XML, SQL, J2EE, JUnit, Tomcat 6.
Responsibilities:
- Actively involved in analysis, design, implementation, and deployment across the full Software Development Life Cycle (SDLC) of the project.
- Designed and developed user interface using JSP, HTML and JavaScript.
- Developed Struts action classes and action forms, performed action mapping using the Struts framework, and performed data validation in form beans and action classes (see the action sketch after this list).
- Extensively used Struts framework as the controller to handle subsequent client requests and invoke the model based upon user requests.
- Defined search criteria to retrieve customer records from the database, made the required changes, and saved the updated records back to the database.
- Validated the fields of user registration screen and login screen by writing JavaScript validations.
- Developed build and deployment scripts using Apache Ant to customize WAR and EAR files.
- Used DAO and JDBC for database access.
- Developed stored procedures and triggers using PL/SQL to calculate and update tables implementing business logic.
- Designed and developed XML processing components for dynamic menus in the application.
- Involved in post-production support and maintenance of the application.
- Involved in the analysis, design, implementation, and testing of the project.
- Implemented the presentation layer with HTML, XHTML and JavaScript.
- Developed web components using JSP, Servlets and JDBC.
- Implemented the database using SQL Server.
- Designed tables and indexes.
- Wrote complex SQL and stored procedures.
- Involved in fixing bugs and unit testing with test cases using JUnit.
- Developed user and technical documentation.
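A minimal sketch of a Struts 1 action class and its form bean, of the kind described in the bullets above; the class, property, and forward names are hypothetical:

```java
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.struts.action.Action;
import org.apache.struts.action.ActionForm;
import org.apache.struts.action.ActionForward;
import org.apache.struts.action.ActionMapping;

// Struts 1 action sketch: reads the form bean populated from the search
// JSP and forwards to a result page defined in struts-config.xml.
public class CustomerSearchAction extends Action {
    @Override
    public ActionForward execute(ActionMapping mapping, ActionForm form,
                                 HttpServletRequest request,
                                 HttpServletResponse response) throws Exception {
        CustomerSearchForm searchForm = (CustomerSearchForm) form;
        // Hand the criteria to the view; a real action would call a DAO here.
        request.setAttribute("customerId", searchForm.getCustomerId());
        return mapping.findForward("success"); // mapped to a JSP in struts-config.xml
    }
}

// Matching form bean; Struts populates it from the request parameters.
class CustomerSearchForm extends ActionForm {
    private String customerId;

    public String getCustomerId() { return customerId; }
    public void setCustomerId(String customerId) { this.customerId = customerId; }
}
```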