Full Stack Developer / Hadoop Developer Resume
SUMMARY
- Over 8 years of professional IT experience in software development positions, covering core and enterprise software design and deployment using Java/Java EE (J2EE) with JDK 6, 7 and 8, along with Big Data and open-source technologies.
- 3+ years of hands-on experience with Hadoop ecosystem components such as MapReduce, HDFS, HBase, Oozie, Hive, Sqoop, Pig, ZooKeeper, Flume, NiFi, Spark and Kafka, including their installation and configuration.
- Expertise in developing web applications using Core Java, Servlets, JSP, EJB (Session Beans, Entity Beans, JMS/MQ Series), JDBC, XML, XSD, XSLT, JNDI, XML parsers (DOM and SAX), JAXP, JAXB and JavaBeans.
- Strong hands-on experience with Spring modules such as Spring MVC, Spring Boot, Spring IoC, Spring ORM, Spring JDBC, Spring Web Services, Spring JMS and Spring AOP.
- Expertise in building microservices using Spring Boot 1.5 with Tomcat 8, Couchbase 5.0, CouchDB 2.1, Kafka and Memcached.
- Proficient with J2EE frameworks and technologies such as Struts, SOAP web services, RESTful web services and Hibernate.
- Proficient in producing and consuming web services using JAX-RS (REST) and JAX-WS (SOAP).
- Hands-on experience integrating with Ant, JUnit and Log4j.
- Expertise in using Java IDEs such as Rational Application Developer (RAD), WebSphere Studio Application Developer (WSAD), Eclipse 3.0/3.2, WebLogic Workshop, RSA 7.0 and NetBeans to develop dynamic web applications.
- Good exposure to Java web services development using SOAP, REST and WSDL.
- Extensive experience with J2EE design patterns such as Singleton, Factory, Builder, MVC, Chain of Responsibility and Prototype.
- Expertise in developing XML documents with XSD validation, using SAX and DOM parsers to parse the data held in XML documents.
- Strong working knowledge of database programming, writing and maintaining SQL, PL/SQL, triggers and stored procedures on database servers such as Oracle, Sybase and MS SQL Server.
- Experienced with J2EE Application Servers like IBM WebSphere, BEA WebLogic, JBoss and Tomcat.
- Experience integrating web services using Mule ESB.
- Experience in enterprise integration development using the Apache Camel framework and Apache ActiveMQ.
- Strong experience in developing and deploying applications on WebSphere, WebLogic, JBoss, Apache Tomcat and Apache HTTP Server.
- Good understanding of implementing web applications using the AngularJS framework.
- Created RESTful web services with Node.js.
- Experience with version control tools such as CVS and SVN, and build tools such as Ant and Maven.
- Used Bugzilla, JIRA and HP Quality Center for bug reporting.
- Managed risk analysis and mitigation plans, status/defect reports, and client presentations.
- Proficient in all phases of SDLC (analysis, design, development, testing and deployment) and highly competent in gathering user requirements and converting them into software requirement specifications.
- Experience with Hadoop on AWS and with the Hortonworks and Cloudera distributions.
- In-depth knowledge of Hadoop architecture and components such as HDFS, JobTracker, NameNode, DataNode, MapReduce and YARN.
- Experience in writing Hive Queries for analyzing data in Hive warehouse using Hive Query Language (HQL).
- Experience handling and processing both schema-oriented and schema-less data using Pig.
- Designed and developed Sqoop scripts for datasets transfer between Hadoop and RDBMS.
- Experience extending Hive and Pig core functionality by writing custom UDFs, UDAFs and UDTFs (see the Hive UDF sketch after this summary).
- Experienced in data modeling with Hive.
- Experience using NiFi processors, processor groups and process flow management concepts.
- Knowledge of workflow scheduling and coordination tools such as Oozie and ZooKeeper.
- Experience in using Flume to collect weblogs.
- Developed MapReduce jobs to automate data transfer from HBase.
- Experience handling different file formats, including Parquet, Protocol Buffers, Avro, SequenceFile, JSON, XML and flat files.
- Experience working on Kafka cluster. Also have experience in working on Spark and Spark Streaming.
- Good knowledge of creating event-processing data pipelines using Kafka and Spark Streaming (see the streaming sketch after this summary).
- Configured and maintained different topologies in the Storm cluster and deployed them on a regular basis.
- Experience working with Apache Spark to implement advanced procedures such as text analytics and processing, using its in-memory computing capabilities with Scala.
- Experience using HCatalog with Hive and Pig.
- Involved in ETL processes using the Ab Initio tool to set up data extraction from several databases.
- Loaded data into Elasticsearch from the data lake using Spark and Hive.
- Knowledge of NoSQL databases such as HBase, MongoDB, Cassandra, CouchDB and Couchbase 5.0.
- Involved in data modeling and in sharding and replication strategies in MongoDB.
- Experience in creating custom Lucene/Solr Query components.
- Utilized Kafka for loading streaming data and performed initial processing and real-time analysis using Storm.
- Excellent programming skills with experience in Java, C, SQL and Python Programming.
- A deep-seated desire to keep up with technology and learn new skills.
- Ability to evaluate, suggest, and implement various technologies and techniques based on the specific task at hand.
- Implement Best Practices and uphold technical requirements throughout the full SDLC.
- Problem solver with the ability to rapidly analyze challenges, applying strategic thinking to tactical concerns with a results-oriented attitude.
- Research new technical topics, available third party and open source libraries, and other tools and techniques which may refine processes or enhance software performance.
- Goal-oriented team player, quick learner and self-starter with effective communication, motivation and organizational skills, combined with attention to detail and a focus on business process improvement.
- Coordinated with multiple teams on common issue fixes, communicating effectively with the client and cross-functional teams to achieve project priorities and deadlines.
- Provided leadership and mentorship to development team members, conducted code reviews, and coordinated communication and effort between business stakeholders and technical teams (development, quality assurance and systems administration).
- Excellent communication skills, leadership abilities, strong architectural skills; hardworking and a very good team player.
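A minimal sketch of the kind of custom Hive UDF noted above, using Hive's simple UDF API. The package, class name and cleansing behavior are illustrative, not taken from any particular project:

```java
package com.example.hive.udf; // hypothetical package

import org.apache.hadoop.hive.ql.exec.Description;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Trims and lower-cases a string column - a typical data-cleansing UDF.
@Description(name = "normalize_str", value = "_FUNC_(str) - trims and lower-cases str")
public final class NormalizeString extends UDF {
    public Text evaluate(final Text input) {
        if (input == null) {
            return null; // preserve SQL NULL semantics
        }
        return new Text(input.toString().trim().toLowerCase());
    }
}
```

Packaged into a JAR, such a function would typically be registered in Hive with ADD JAR followed by CREATE TEMPORARY FUNCTION normalize_str AS 'com.example.hive.udf.NormalizeString'.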
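A sketch of an event-processing pipeline of the kind mentioned above, assuming Spark's spark-streaming-kafka-0-10 integration and the Java API. The broker address, topic name and consumer group are placeholders:

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka010.ConsumerStrategies;
import org.apache.spark.streaming.kafka010.KafkaUtils;
import org.apache.spark.streaming.kafka010.LocationStrategies;

public class KafkaEventPipeline {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf().setAppName("KafkaEventPipeline");
        // 10-second micro-batches.
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(10));

        Map<String, Object> kafkaParams = new HashMap<>();
        kafkaParams.put("bootstrap.servers", "broker1:9092"); // placeholder broker
        kafkaParams.put("key.deserializer", StringDeserializer.class);
        kafkaParams.put("value.deserializer", StringDeserializer.class);
        kafkaParams.put("group.id", "event-pipeline");        // illustrative consumer group
        kafkaParams.put("auto.offset.reset", "latest");

        JavaInputDStream<ConsumerRecord<String, String>> stream =
                KafkaUtils.createDirectStream(
                        jssc,
                        LocationStrategies.PreferConsistent(),
                        ConsumerStrategies.<String, String>Subscribe(
                                Collections.singletonList("events"), kafkaParams));

        // Count records per micro-batch as a stand-in for real event processing.
        stream.map(ConsumerRecord::value)
              .count()
              .print();

        jssc.start();
        jssc.awaitTermination();
    }
}
```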
TECHNICAL SKILLS
Hadoop Core Services: HDFS, MapReduce, Spark, Yarn
Hadoop Distribution: Hortonworks, Cloudera
NoSQL Databases: HBase, Cassandra, MongoDB, CouchDB, Couchbase 5.0
Hadoop Data Services: Hive, Pig, Impala, Sqoop, Flume, NiFi, Kafka, Storm
Hadoop Operational Services: Zookeeper, Oozie
Programming Languages: Java 1.6/1.7/1.8, Scala, Python; Servlets, Struts, Spring Boot 1.5, RESTful Web Services
Databases: Oracle, MySQL, MS SQL Server, Sybase
Application Servers: WebLogic, WebSphere, JBoss, Tomcat
Development Tools: NetBeans, Eclipse, IBM Rational Application Developer 9.0, Python IDE, IntelliJ
Client Side Scripting: JavaScript, AngularJS, jQuery, Ajax, XML, JSON
Java/J2EE Technologies: Core Java, Collections, Exception Handling, IO, Multi-Threading, Annotations, Applets, Swing, JDBC, JMS, Groovy, EJB
Server Side Scripting: JSP, Java
Version Control/Debugging: CVS, SVN, GitHub, GitLab, Firebug, Dev Tools
Web Services: RESTful, SOAP and WSDL
XML Technologies: XML, DTD, XSD
Web Technologies: HTML, DHTML, CSS, JSP, HTML5, JSF
Operating Systems: Windows, UNIX/Linux, Mac OS
PROFESSIONAL EXPERIENCE
Confidential
Full Stack Developer / Hadoop Developer
Responsibilities:
- Worked on systems- and infrastructure-level concepts related to software.
- Involved in Requirements gathering, Development, Testing and delivery of application.
- Followed Agile methodology and Scrum.
- Involved in preparation of functional definition documents.
- Involved in the discussions with business users, testing team to finalize the technical design documents.
- Involved in all phases of Software Development Life Cycle (SDLC) such as Analysis, Design, Development, Testing and Implementation.
- Provided better estimates while spending less time creating them, keeping the project schedule and state under control.
- Performed user interface design and coding using Java, the Spring framework and web technologies.
- Performed systems analysis, including defining technical requirements and carrying out high-level design for complex solutions.
- Worked with internal groups and external vendors to roll out a high-quality product.
- Developed microservices using Spring Boot, Tomcat, CouchDB, Couchbase, Kafka and Memcached.
- Worked with the deployment team to deploy applications in Docker containers with Tomcat, automating deployments with GitLab and Jenkins.
- Experience using GitHub, GitLab, Subversion, Jenkins and Selenium.
- Developed the applications using JSP and Struts, implementing an MVC architecture.
- Designed UI pages using JSP, HTML5, AngularJS, JavaScript, CSS, AJAX, jQuery, JSON and tag libraries.
- Implemented the business tier using Spring IoC, AOP and MVC.
- Used core Java concepts such as Collections, Exception Handling, Multi-Threading and Serialization, along with Java 1.6 and 1.7 features.
- Developed and debugged Servlets and EJBs on WebSphere Application Server 8.5.
- Developed Web Services for data transfer using RESTful Web Services.
- Used JAXB for XML binding when calling web services described by WSDL.
- Involved in peer code reviews.
- Investigated TEST, QA, UAT and Production issues and supported business users.
- Worked with the middleware team and DBAs on server and database issues and code deployments.
- Followed coding guidelines, ran PMD and FindBugs, and ensured 100% code coverage to maintain code quality.
- Used the IBM tools ISAM and ISIM to resolve security access issues at the application level for business and dealer users.
- Prepared Change Request business documents and involved in change and release management.
- Implemented an agent-server messaging dialog using Camel and JMS (ActiveMQ implementation); see the route sketch after this list.
- Developed simple and complex MapReduce programs in Java for Data Analysis on different data formats.
- Developed MapReduce programs that filter out bad and unnecessary records and find unique records based on different criteria.
- Developed a secondary-sorting implementation to get sorted values at the reduce side and improve MapReduce performance (see the sketch after this list).
- Implemented custom Writables, InputFormats, RecordReaders, OutputFormats and RecordWriters for MapReduce computations to handle custom business requirements.
- Implemented MapReduce programs to classify data records into different classifications based on record type.
- Created fan-in and fan-out multiplexing flows with Flume.
- Created ETL jobs to load JSON and server data into MongoDB and to move data from MongoDB into the data warehouse.
- Developed and designed POCs using Scala, deployed them on the YARN cluster, and compared the performance of Spark with Hive and SQL/Teradata.
- Created Ab Initio graphs that transfer data from various sources like Oracle, Flat Files and CSV files to the Teradata database and Flat Files.
- Worked on Sequence files, RC files, Map side joins, bucketing, partitioning for hive performance enhancement and storage improvement.
- Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs and Scala.
- Implemented daily Oozie coordinator jobs that automate parallel tasks of loading data into HDFS and pre-processing it with Pig.
- Performed advanced procedures like text analytics and processing, using the in-memory computing capabilities of Spark using Scala.
- Responsible for performing extensive data summarization using Hive.
- Importing the data into Spark from Kafka Consumer group using Spark Streaming APIs.
- Developed Pig UDFs in Java or Python to pre-process the data for analysis.
- Worked with Sqoop import and export functionalities to handle large data set transfer between Oracle database and HDFS.
- Derived and modeled facts, dimensions and aggregated facts in Ab Initio from the data warehouse star schema for billing.
- Knowledge on the real-time message processing systems (Storm, S4).
- Worked on tuning Hive and Pig scripts to improve performance.
- Involved in migrating Hive queries into Spark transformations using DataFrames, Spark SQL, SQLContext and Scala (see the Spark SQL sketch after this list).
- Involved in submitting and tracking MapReduce jobs using the JobTracker.
- Implemented business logic by writing Pig UDFs in Java and used various UDFs from Piggybanks and other sources.
- Performed schema design and data modeling on HBase.
- Developed a data pipeline using Kafka and Storm to store data into HDFS.
- Knowledge of handling Hive queries using Spark SQL integrated with the Spark environment.
- Implemented test scripts to support test driven development and continuous integration.
- Implemented unit tests with MRUnit and PigUnit.
- Configured build scripts for multi-module projects with Maven and Jenkins CI/CD.
- Involved in story-driven agile development methodology and actively participated in daily scrum meetings.
- Handled implementation concerns such as build and release management, configuration management, clustering, load balancing, and systems/network monitoring.
- Familiarity with systems administration of Linux and Windows servers.
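A minimal sketch of a Camel route over ActiveMQ like the agent-server messaging dialog described above, assuming Camel 2.x with the activemq-camel component. The broker URL and queue names are placeholders:

```java
import org.apache.activemq.camel.component.ActiveMQComponent;
import org.apache.camel.CamelContext;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;

public class AgentServerRoute {
    public static void main(String[] args) throws Exception {
        CamelContext context = new DefaultCamelContext();
        context.addComponent("activemq",
                ActiveMQComponent.activeMQComponent("tcp://localhost:61616"));

        context.addRoutes(new RouteBuilder() {
            @Override
            public void configure() {
                // Consume agent requests from one queue, acknowledge them,
                // and publish replies to another.
                from("activemq:queue:agent.requests")
                        .log("Received: ${body}")
                        .transform(simple("ACK: ${body}"))
                        .to("activemq:queue:agent.replies");
            }
        });

        context.start();
        Thread.sleep(60_000); // keep the route alive briefly for the demo
        context.stop();
    }
}
```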
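A sketch of the secondary-sort pattern referenced above: a composite key carries both the natural key and the value to sort by, while a custom partitioner and grouping comparator keep partitioning and grouping on the natural key alone. Class names are illustrative:

```java
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.WritableComparable;
import org.apache.hadoop.io.WritableComparator;
import org.apache.hadoop.mapreduce.Partitioner;

// Composite key: natural key plus the value the reducer should see sorted.
public class CompositeKey implements WritableComparable<CompositeKey> {
    private final Text naturalKey = new Text();
    private final LongWritable sortValue = new LongWritable();

    public void set(String key, long value) {
        naturalKey.set(key);
        sortValue.set(value);
    }

    public Text getNaturalKey() { return naturalKey; }

    @Override public void write(DataOutput out) throws IOException {
        naturalKey.write(out);
        sortValue.write(out);
    }

    @Override public void readFields(DataInput in) throws IOException {
        naturalKey.readFields(in);
        sortValue.readFields(in);
    }

    // Sort by natural key first, then by value - this ordering is what
    // delivers sorted values at the reduce side.
    @Override public int compareTo(CompositeKey other) {
        int cmp = naturalKey.compareTo(other.naturalKey);
        return cmp != 0 ? cmp : sortValue.compareTo(other.sortValue);
    }
}

// Partition on the natural key only, so every record for a key
// reaches the same reducer.
class NaturalKeyPartitioner extends Partitioner<CompositeKey, Text> {
    @Override public int getPartition(CompositeKey key, Text value, int numPartitions) {
        return (key.getNaturalKey().hashCode() & Integer.MAX_VALUE) % numPartitions;
    }
}

// Group on the natural key only, so one reduce() call sees all values
// for a key, already ordered by the composite comparison above.
class NaturalKeyGroupingComparator extends WritableComparator {
    protected NaturalKeyGroupingComparator() { super(CompositeKey.class, true); }

    @Override public int compare(WritableComparable a, WritableComparable b) {
        return ((CompositeKey) a).getNaturalKey()
                .compareTo(((CompositeKey) b).getNaturalKey());
    }
}
```

The job wiring would register these with job.setPartitionerClass(NaturalKeyPartitioner.class) and job.setGroupingComparatorClass(NaturalKeyGroupingComparator.class).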
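A sketch of migrating a Hive query to Spark SQL and the DataFrame API as described above, shown with Spark 2.x's SparkSession (which subsumes the older SQLContext/HiveContext); the warehouse.events table and its columns are hypothetical:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class HiveToSparkMigration {
    public static void main(String[] args) {
        // Hive support lets Spark read tables straight from the Hive metastore.
        SparkSession spark = SparkSession.builder()
                .appName("HiveToSparkMigration")
                .enableHiveSupport()
                .getOrCreate();

        // The original Hive query, run unchanged through Spark SQL.
        Dataset<Row> daily = spark.sql(
                "SELECT event_date, COUNT(*) AS cnt "
                + "FROM warehouse.events GROUP BY event_date");

        // The same aggregation expressed with the DataFrame API.
        Dataset<Row> dailyApi = spark.table("warehouse.events")
                .groupBy("event_date")
                .count();

        daily.show();
        dailyApi.show();
        spark.stop();
    }
}
```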
Environment: Java/J2EE, JSP 2.2, HTML5, XML, XSLT, CSS, jQuery, AJAX, AngularJS, JAX-WS, JAX-RS (REST), Twitter Bootstrap, Node.js, Spring 3.2 and 5.1, Hadoop, CDH4, MapReduce, HDFS, Pig, Hive, Impala, Oozie, Kafka, Storm, Linux, Maven, Oracle 11g/10g, SVN, MongoDB.
Confidential
Sr. Java Developer / Hadoop Developer
Responsibilities:
- Extensively used Java OOP concepts, Java 1.8 features, Java/J2EE, JSF, jQuery and JavaScript to develop ARDHS web applications.
- Implemented Arkansas Works as part of Arkansas DHS EEF Project using RESTful Web Services, React.js and Node.js.
- Implemented client side functionality using JavaScript, HTML 5, CSS 3, Bootstrap and jQuery.
- Developed the web application using Ajax, Facelets, JSON and Git.
- Extensively involved in Installation and configuration of Cloudera distribution, NameNode, Secondary NameNode, Job Tracker, Task Trackers and DataNodes.
- Installed and configured Hadoop ecosystem like HBase, Flume, Pig and Sqoop.
- Performed Hadoop cluster tasks such as adding and removing nodes without affecting running jobs or data.
- Managed and reviewed Hadoop Log files.
- Loaded log data into HDFS using Flume; worked extensively on creating MapReduce jobs to power data for search and aggregation.
- Worked extensively with Sqoop for importing metadata from Oracle.
- Designed a data warehouse using Hive.
- Created partitioned tables in Hive.
- Mentored analysts and the test team in writing Hive queries.
- Extensively used Pig for data cleansing.
- Developed Pig Latin scripts to extract data from the web server output files and load it into HDFS.
- Developed Pig UDFs to pre-process the data for analysis (see the UDF sketch after this list).
- Developed workflow in Oozie to automate the tasks of loading the data into HDFS and pre-processing with Pig.
- Trained in Curam basics and fundamentals, including but not limited to Curam server- and client-side development and customization.
- Built web-based applications using Spring and JSF technologies.
- Actively involved in daily scrum meetings to discuss the status of assigned tasks.
- Extensively worked on the front-end, business and persistence tiers using the Struts framework.
- Designed and developed the service layer using the Struts framework.
- Developed the presentation layer using Struts tag libraries such as logic, html and bean in JSP pages.
- Implemented SOA-based web services; designed and built a SOAP web service interface implemented with SOAP and Apache Axis.
- Made proficient use of WSDL files and used SoapUI for testing the web services.
- Developed a unit test framework using JUnit and wrote JUnit test cases.
- Built single-page applications with JavaScript on a custom-built framework.
- Leveraged AngularJS and JavaScript to build presentation layer.
- Developed and maintained critical components using React components.
- Experience in Data Serialization formats like JSON, XML.
- Used DB2 server for database operations.
- Developed the modeling part of the application using various features provided by RSA.
- Built the code using ANT scripts.
- Developed new UI functionality for multithreaded user-facing application running on touchscreen devices using AngularJS.
- Worked on the implementation of Server code.
- Understanding functional specifications and documenting technical design documents for the Project.
- Involved in Maintenance activities of Arkansas Project.
- Contributed to Arkansas EEF Project Releases.
- Certified in IBM Curam Application Developer Exam with 80%.
- Delivered the assigned tasks within planned timelines and quality standards.
- Involved in defect fixes for the real-world Arkansas EEF Project.
- Fixed issues in various areas of Curam like Rules, Evidence, IEG, Batch, Account Transfer, etc.
- Actively involved in meetings to discuss issues and resolutions.
- Used Jira to update the status on the assigned tasks.
- Developed client-side files such as UIM files using XML.
- Configured legacy code for open-source communication.
- Reformatted DOM Level 2 Core to extend architectural applications.
- Performed database operations using DBeaver and Toad.
- Used property files to inject values into the XML files.
- Created the project startup UI, i.e. generic XHTML Facelets templates, CSS, generic JSF validators and converters, backing beans, reference beans and phase listeners.
- Used GIT repository for code check-in and check-out.
- Created custom JSF Facelet tags for error handling, data tables, true/false markers etc.
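A minimal sketch of a Pig UDF of the kind described above, using Pig's EvalFunc API; the class name and cleansing logic are illustrative:

```java
import java.io.IOException;
import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

// Pig UDF that trims and lower-cases a chararray field before analysis.
public class CleanField extends EvalFunc<String> {
    @Override
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0 || input.get(0) == null) {
            return null; // pass nulls through untouched
        }
        return input.get(0).toString().trim().toLowerCase();
    }
}
```

In Pig Latin such a function would typically be registered with REGISTER and applied inside a FOREACH ... GENERATE statement.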
Environment: Java 1.8, H2 Server, Curam, Struts, DB2, XML, RSA, Oracle, REST, HTML, JSF, JSON, Facelets, Spring, Hibernate, CSS, JavaScript, Tomcat, React.js, Node.js, jQuery, Git, AJAX, AngularJS, Bootstrap, SOAP web services, Hadoop, CDH4, MapReduce, HDFS, Pig, Hive, Impala, Oozie, Kafka.
Confidential, Waltham, MA.
Java Developer
Responsibilities:
- Involved in designing, coding, testing, debugging, documenting and maintaining software solutions.
- Implemented technical solutions to improve the applications for user experience and perform feasibility studies.
- Developed web application following User Experience (UX) principles.
- Developed software according to Confidential Standards.
- Used the Eclipse IDE to develop the application.
- Fixed defects and tracked them using QC; provided support, maintenance and customization.
- Proposed viable technical solutions to Product Management and users for validation.
- Developed customized reports and performed unit testing using JUnit (see the test sketch after this list).
- Conducted unit, package and performance tests of the software, ensuring a level of quality in line with the Confidential guidelines.
- Participated in the validation/acceptance phase of the product cycle, ensuring the fine-tuning necessary to finalize the product.
- Ran PMD and FindBugs and addressed reported defects.
- Dealt with business processes and Updated Project documents.
- Involved in fixing IST, QA, UAT & Production defects.
- Followed coding guidelines and maintained code quality.
- Built the code and deployed it on the server.
- Supported the end user in the Production phase by debugging existing software solutions in response to Problem Tracking Records (PTR) and Change Requests (CR) issued from Product Management or Product Definition.
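A minimal JUnit 4 sketch of the kind of unit test mentioned above; the helper under test is a hypothetical stand-in, since the original report code isn't shown:

```java
import static org.junit.Assert.assertEquals;

import java.util.Locale;
import org.junit.Test;

public class ReportFormatterTest {

    // Hypothetical stand-in for the report-formatting code under test.
    static String formatAmount(double amount) {
        return String.format(Locale.US, "%.2f", amount);
    }

    @Test
    public void formatsAmountToTwoDecimals() {
        assertEquals("12.50", formatAmount(12.5));
        assertEquals("0.00", formatAmount(0));
    }
}
```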
Environment: Core Java, J2EE, JSP, Apache Tomcat, Oracle, Spring, JMS, XML, HTML, DHTML, JavaScript, AngularJS, CSS, Bootstrap, AJAX, CVS, Struts.