Senior Java Developer Resume
Owings Mills, MD
SUMMARY
- 8+ years of programming experience with skills in analysis, design, development, and deployment of large-scale distributed data processing using Hadoop, Pig, and Java, and of various other software applications, with emphasis on Object-Oriented programming.
- Hands-on experience installing, configuring, and using Hadoop ecosystem components like Hadoop MapReduce, HDFS, HBase, Hive, Sqoop, Pig, ZooKeeper, Oozie, and Flume.
- Good exposure to MapReduce programming, Pig scripting, distributed applications, and HDFS.
- In-depth understanding of data structures and algorithms, and of the maintenance and implementation of commercial software. Extensive work experience with Java/J2EE technologies such as Servlets, JSP, EJB, JDBC, JSF, Struts, Spring, SOA, Core Java, AJAX, XML/XSL, DOJO, Web Services (REST, SOAP), UML, design patterns, and XML Schemas.
- Good knowledge and skills implementing Web/Enterprise architectures and open-source frameworks like Struts, Hibernate, and the Spring Framework (aspect-oriented programming and inversion of control).
- Experience in Cassandra data model design for high-volume IoT telemetry data.
- Experience with Hive and SQL, including developing UDFs.
- Experience with one or more of the following: Hadoop Core, Pig, Hive, Kafka, Oozie, HBase, Spark, and Storm.
- Experience in Apache Spark with Spark Core, Spark SQL.
- Expertise in Hadoop Security and Hive Security.
- Extensively worked with the Netezza database to implement data cleanup and performance tuning techniques.
- Experience in managing Hadoop cluster using Cloudera Manager.
- Expertise in Hive Query Language and debugging Hive issues.
- Very good experience in complete project life cycle (SDLC) of Client Server and Web applications.
- Experience in administering, installing, configuring, troubleshooting, securing, backing up, performance monitoring, and fine-tuning Red Hat Linux.
- Good experience with Python, Pig, Sqoop, Oozie, Hadoop Streaming and Hive.
- Experience developing with Spark and Impala.
- Administration of production Cassandra clusters.
- Strong SQL and database skills.
- Experience with Oracle, MS SQL Server, and Teradata RDBMSs.
- Extensive experience working with Oracle, DB2, SQL Server, and MySQL databases.
- Scripting to deploy monitors and checks and to automate critical system administration functions.
- Hands on experience in application development using Java, RDBMS, and Linux Shell scripting.
- Strong experience as a senior Java developer in Web/intranet and client/server technologies using Java, J2EE, Servlets, JSP, JSF, EJB, JDBC, and SQL.
- Extensive experience with Linux, shell scripting, SQL, and Python.
- Used JPA with Hibernate for database transactions.
- Designed, coded, and configured server-side J2EE components like JSP, Servlets, JavaBeans, JNDI, JTS, the JavaMail API, and XML.
- Performed client-side and server-side data validation using JavaScript.
- Involved in implementing Object Relational Mapping using Hibernate 4.0.
- Used the MyEclipse IDE and the WebLogic application server in development.
- Used Hibernate Query Language and Hibernate Criteria queries for database operations.
- XML transformations were done using XML, XSL, XSLT, and XPath.
- Used Log4j for application debugging and JUnit for unit testing.
- Experience performing problem-solving in a big data arena.
- Big Data/Hadoop solution architecture and design.
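The XSLT work mentioned above can be sketched with the JDK's built-in `javax.xml.transform` API; the stylesheet and element names here are hypothetical illustrations, not the original project code.

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class XslDemo {
    // Hypothetical stylesheet: flattens each <name> element into plain text.
    static final String XSLT =
        "<xsl:stylesheet version=\"1.0\" xmlns:xsl=\"http://www.w3.org/1999/XSL/Transform\">"
      + "<xsl:output method=\"text\"/>"
      + "<xsl:template match=\"/users\">"
      + "<xsl:for-each select=\"user\"><xsl:value-of select=\"name\"/>;</xsl:for-each>"
      + "</xsl:template>"
      + "</xsl:stylesheet>";

    // Applies the stylesheet to an XML string and returns the transformed text.
    public static String transform(String xml) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(XSLT)));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(xml)), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        String xml = "<users><user><name>Ana</name></user><user><name>Raj</name></user></users>";
        System.out.println(transform(xml)); // Ana;Raj;
    }
}
```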
TECHNICAL SKILLS
Big Data Technologies: HDFS, MapReduce, YARN, Spark, Pig, Hive, Sqoop, Flume
Java Technologies: J2SE, J2EE (JSP, Servlets, JNDI, JDBC, JSTL, EJB, JUnit, JPA, RMI, JMS), Hibernate
Methodologies: Agile, V-model
Databases / Cloud: MySQL, Teradata, Oracle, Amazon EC2
Web Services: SOAP, REST
IDEs: Eclipse, NetBeans, JDeveloper, TOAD
Version Control Systems: Git, CVS, SVN
Tools: Maven, Ant, JUnit, TestNG, Log4J
PROFESSIONAL EXPERIENCE
Confidential, Mount Laurel, NJ
Sr. Hadoop Developer
Responsibilities:
- Understood the existing system to come up with a migration plan to Hadoop.
- Designed, developed, tested, and deployed the new system in the Hadoop environment.
- Involved in ETL operations on large data sets.
- Worked with NZLoad to load flat-file data into Netezza tables.
- Used Python to develop a working and efficient network within the company.
- Installed, tuned, and operated Apache Spark and related technologies such as Spark SQL and Spark Streaming.
- Used messaging and collection frameworks like Flume and Storm.
- Detailed understanding of data warehouse concepts and applications.
- Understood all aspects of Yahoo's Big Data stack (Hadoop, Storm, Spark) and learned select components in detail.
- Created sessions, configured workflows to extract data from various sources, transformed data, and loaded into data warehouse.
- Sound knowledge of Netezza SQL.
- Conducted ETL development in the Netezza environment using standard design methodologies.
- Used the Spark API over Hortonworks Hadoop to load data from HP Vertica DB into Oracle DB.
- Utilized Python to handle all hits on Django, Redis, and other applications.
- Worked on Unix shell scripts to run jobs.
- Worked on the delivery of both internal- and external-facing tools for interacting with our data processing platform.
- Loaded and transformed large sets of structured and semi-structured data using Hive and Impala.
- Wrote shell and Perl scripts on AIX, along with Python scripts, to deploy custom code and push loads.
- Loaded data into the target data warehouse from 12 different sources.
- Used the UC4 job scheduler to schedule jobs.
- Performed research regarding Python Programming and its uses and efficiency.
- Implemented the logging mechanism using log4j framework.
- Focused on the big picture when problem-solving.
- Used Apache Hive, a data warehouse system for Hadoop that uses a SQL-like language called Hive Query Language (HQL).
- Big Data strategic planning, technology roadmaps, talent acquisition, and team mentoring for competitiveness in cutting-edge technologies such as Hadoop, Hive, and HBase.
- Used Maven to clean, compile, and build, and Ant to deploy, the jars in HDFS.
- Assisted the team members with daily tasks. Attended project meetings and status meetings.
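The Hive warehousing work above typically pairs partitioned tables with HQL queries; the table, column, and path names below are hypothetical, a minimal sketch rather than the actual project schema.

```sql
-- Hypothetical partitioned Hive table for daily event data.
CREATE TABLE IF NOT EXISTS page_views (
  user_id  BIGINT,
  url      STRING,
  view_ts  TIMESTAMP
)
PARTITIONED BY (view_date STRING)
STORED AS ORC;

-- Load one day's staged data into its partition, then aggregate per user.
LOAD DATA INPATH '/staging/page_views/2016-03-01'
  INTO TABLE page_views PARTITION (view_date = '2016-03-01');

SELECT user_id, COUNT(*) AS views
FROM page_views
WHERE view_date = '2016-03-01'
GROUP BY user_id;
```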
Environment: Java 8, Hortonworks Hadoop, Apache Spark, Oracle 11g, HP Vertica, SQL, PL/SQL, NetBeans, Unix scripting, PuTTY, UC4, ETL, Flume, Sqoop, Pig, JDK 1.6, Eclipse, MySQL, Cassandra, CoffeeScript, CouchDB, CSS, DSL, HTML, HTTP, Ubuntu, ZooKeeper, CDH, SQL Server, shell scripting, Big Data NoSQL/HBase, Storm, Spark.
Confidential, Weehawken, NJ
Hadoop Developer
Responsibilities:
- Analyzed, tested, debugged, documented, and implemented moderately complex software applications.
- Implemented the project by using Spring Web MVC module.
- Devised or modified procedures to solve complex problems, considering computer equipment capacity and limitations, operating time, and the form of desired results.
- Used SVN for version control.
- Expertise in loading data into databases on Linux.
- Implemented, integrated, and maintained client/server and web applications.
- Implemented the project using Ajax, JavaScript, and HTML as UI components.
- Implemented the project using JAXB to retrieve data from XML documents.
- Implemented the project using Web Services to communicate with other systems.
- Used the Spring framework for the business layer, with the Spring Core, DAO, Spring ORM, and Spring Web MVC modules.
- Assessed the Netezza environment for implementation of the ETL solutions
- Expertise in developing SQL scripts and in performance tuning.
- Involved in the integration of Spark jobs with UC4 scheduler.
- Worked on Sqoop to transfer data between HDFS and Oracle DB for a POC.
- Implemented the project using the Hibernate framework to communicate with the database.
- Implemented Singleton, Factory, and DAO design patterns based on the application requirements.
- Provide consulting to customers in identifying Big Data use cases and then guiding them towards implementation of use cases.
- Developed test cases using the JUnit testing framework, and used Log4j for logging and audit trails.
- Deployed the application using JBoss as the application server.
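The Singleton, Factory, and DAO patterns mentioned above can be sketched in plain Java; every name here is hypothetical, and an in-memory map stands in for the real JDBC-backed DAO.

```java
import java.util.HashMap;
import java.util.Map;

// DAO interface: callers depend on this, never on a concrete implementation.
interface AccountDao {
    String findOwner(int accountId);
}

// In-memory stand-in for a database-backed DAO implementation.
class InMemoryAccountDao implements AccountDao {
    private final Map<Integer, String> rows = new HashMap<>();
    InMemoryAccountDao() { rows.put(1, "alice"); }
    public String findOwner(int accountId) { return rows.get(accountId); }
}

// Factory with Singleton: one factory instance hands out DAOs by type.
class DaoFactory {
    private static final DaoFactory INSTANCE = new DaoFactory();
    private DaoFactory() {}                         // no outside instantiation
    public static DaoFactory getInstance() { return INSTANCE; }
    public AccountDao accountDao() { return new InMemoryAccountDao(); }
}

public class DaoDemo {
    public static void main(String[] args) {
        AccountDao dao = DaoFactory.getInstance().accountDao();
        System.out.println(dao.findOwner(1)); // alice
    }
}
```

Swapping `InMemoryAccountDao` for a JDBC- or Hibernate-backed class changes nothing for callers, which is the point of combining the factory with the DAO interface.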
Environment: Java 1.6, Servlets, JSP, JBoss 6.1, Spring MVC, Hibernate, XML, JAXB, HTML, Ajax, JavaScript, JNDI, Web Services, WSDL, SOAP, SQL, PL/SQL, Oracle 11g, Eclipse, SQL Developer, Log4j, JUnit, Maven, SVN.
Confidential, Owings Mills, MD
Senior Java Developer
Responsibilities:
- Prepared detailed technical requirements, technical specifications, and communication setup.
- Worked on Informatica tools: Source Analyzer, Data Warehousing Designer, Mapping Designer, and Transformation Developer.
- Used Source Analyzer and Warehouse Designer to import the source and target database schemas, and Mapping Designer to map sources to targets.
- Worked on Oracle BI Answers to build Interactive Dashboards with drill-down capabilities.
- Mapped the configurations to accommodate the territorial hierarchy exposed via chart drill-downs.
- Implemented dynamic dashboard prompts to zoom into particular segments of the business in a performance-optimized manner.
- Administered security and created alerts.
- Created templates for presenting results and analytics, and modified the OBIEE dashboard using cascading style sheets.
- Developed different kinds of Reports (pivots, charts, tabular) using global and local Filters.
- Performed unit testing.
- Performed performance tuning and worked on HBase.
- Used Oracle caching techniques to improve site performance.
- Responsible for developing the checkout process.
- Read the data from the customer profile and completed the checkout process.
- Created a one-page checkout where customers can finish checkout without navigating to different screens.
- System study, interaction with users and management, analysis, design, coding, testing, and implementation of the system.
- Development of product management code.
- Interacted with the client and with various internal teams such as UI, backend, and integration.
Environment: Java 1.6, J2EE, Spring MVC, JSF, JPA, Servlets, JSP, XML, REST Web Services, JSON, UML, JUnit, CSS, HTML, jQuery, JavaScript, Maven, Linux, Oracle 10g, JDeveloper, WebLogic Application Server, Log4j, Git Flow, YARN, SVN, CSV.
Confidential, Birmingham, AL
Java/J2EE Developer
Responsibilities:
- Involved in gathering and analyzing business requirements and converting them into technical specifications.
- Developed user-friendly web-based GUI using JSP, JavaScript, CSS, HTML, and DHTML.
- Worked on developing the backend part of the application involving Spring 2, JPA, Hibernate 3.2, and Web Services.
- Involved in designing and implementing persistence layer using JPA with Hibernate following the Generic Data Access pattern.
- Used JPA and Hibernate annotations for defining object relational metadata.
- Developed graphical user interfaces using Struts, Tiles, and JavaScript.
- Used JSP, JavaScript, and JDBC to create web servlets.
- Extensive use of JAXB to convert XML schemas into objects for use in the application.
- Used various features of Spring 2 such as XML configuration extensions for declarative transaction management, Java generics support, and annotation-driven dependency injection.
- Agile methodology was adopted in the development, including daily Scrum meetings.
Environment: Java 1.6, Spring Framework 3.0, Struts 2.0, Hibernate 3.5, RAD 7.5, WebSphere Application Server.
Confidential, Lancaster, PA
Java Developer
Responsibilities:
- Coded server-side Enterprise JavaBeans using session and message-driven beans.
- Created files and set up the paths and properties for the WebSphere application server.
- Developed Hibernate HQL queries and Hibernate mappings, and created DAO mappings in Hibernate.
- Developed server-side common utilities for the application, and the front-end dynamic web pages using JSP, JavaScript, and HTML.
- Developed EJB components encapsulating business logic.
- Created navigation component that reads the next page details from an XML config file.
- Involved in O/R mapping using Hibernate.
- Designed the reference table process, which primarily involves caching the dropdown data for all pages.
- Major components designed: reference table, navigation, custom tags, and the logout process.
- Responsible for unit testing with JUnit and for integration testing of the software.
Environment: Enterprise JavaBeans, WebSphere Application Server, Hibernate, JSP, JavaScript, XML, JUnit
Confidential
Java Developer
Responsibilities:
- Developed web components using JSP, Servlets, and JDBC.
- Implemented database using MySQL.
- Used JUnit for unit testing and involved in fixing defects.
- Involved in writing user and technical documentation.
- Made extensive use of the Java Naming and Directory Interface (JNDI) for looking up enterprise beans.
- Involved in developing stored procedures and triggers in PL/SQL.
- Developed the application using MVC architecture and deployed it using the WebLogic server.
- Involved in database design; wrote stored procedures and triggers, session and entity beans, JMS clients and message-driven beans to receive and process JMS messages, and JSPs and Servlets.
- Responsible for parsing XML data using an XML parser, and for testing, bug fixing, and code modifications.
Environment: Java, JSP, Servlets, JDBC, JNDI, JMS, Enterprise JavaBeans, MySQL, PL/SQL, WebLogic, JUnit, XML