
Java/kafka Developer Resume


AZ

SUMMARY

  • Over 7 years of progressively responsible experience in testing, documentation, production support, and application development for Banking/Financial Services, Insurance, Retail, and e-Commerce clients using J2EE/Java architecture and Kafka.
  • Expertise across all phases of the Software Development Life Cycle (SDLC) in developing web applications using Java, J2EE, EJB, Web services, SOA, SOAP, RESTful services, etc.
  • Extensive hands-on experience with core expertise in design, development, and deployment of N-Tier enterprise applications for the J2EE platform using Core Java, Servlets, Struts, Spring, EJB, JSP, Web services, JPA, JNDI, JMS, JAXP, JUnit, JMeter.
  • Hands-on experience implementing Core Java SE 8 concepts such as the Streams API, lambda expressions, Generics, the Time API, functional interfaces, multithreading, transaction management, exception handling, and the Collections API.
  • Expertise in the React JS framework to develop SPAs.
  • Used Java 8 features in developing the code like Lambda expressions, creating resource classes, fetching documents from database.
  • Hands on experience in Front-End technologies like Angular Js, Bootstrap 3.1, HTML5, JavaScript, CSS3, jQuery, Google Web Toolkit (GWT), Tag Libraries, Custom Tags, Ajax and Node Js, Ext Js.
  • Experienced in React JS and the React Flux architecture, using complex object-oriented concepts to improve website performance.
  • Expert in various Agile methodologies such as SCRUM, Test-Driven Development, incremental and iterative methodology, and pair programming, as well as the Waterfall model. Used JIRA to track the progress of the Agile process.
  • Experience in working with open source frameworks like Spring, Struts and ORM frameworks like Hibernate and Spring JPA.
  • Experienced using JSF, Servlets, JSP, JDBC, JMS, JSTL and JNDI.
  • Extensive hands-on development experience with various Spring components such as Spring MVC, AOP, Spring IoC, Spring JDBC, Spring JPA, Spring Security, Spring Batch, Spring Boot, Spring microservices, and Swagger APIs.
  • Experience writing Splunk queries and working with Splunk dashboards to check service health metrics.
  • Good understanding of Hadoop YARN architecture and its daemons such as Resource Manager, Node Manager, Application Master, Job Tracker, Task Tracker, Containers, Name Node, Data Node, and MapReduce v1 & v2 concepts.
  • Developed MapReduce jobs in Java for processing data in different File Formats (Avro, RCFile, JSON etc.) and Compression Formats (LZO, GZIP, Snappy etc.).
  • Experience gathering data from different sources and implementing optimized Joins using MapReduce programs.
  • Experience in using various Amazon Web Services (AWS) Components like EC2 for virtual servers, S3.
  • Good working knowledge of NoSQL Database, MongoDB.
  • Good experience in developing SOA (Service Oriented Architecture) and Microservices architecture using J2EE Web Services based on Restful (JAX-RS), SOAP (JAX-WS), JAX-RPC and JAXR (Java API for XML Registries) using frameworks like Apache CXF, Jersey, Axis and Spring framework, NodeJS.
  • Good exposure to Tableau 7 and Tableau 8.
  • Experience working with Spring modules like Spring core, Spring MVC, AOP and ORM integrations.
  • Experienced in writing SQL, procedures, functions, triggers and integrate with ORM tools like Hibernate.
  • Hands-on experience working with the databases MySQL, Oracle 10g/11g, SQL Server, DB2, PostgreSQL, and NoSQL (MongoDB, Cassandra, DynamoDB).
  • Experience in writing SQL queries, Stored Procedures, Triggers, views with the major databases like Oracle 9i to 11g, SQL, PL/SQL, MS Access.
  • Experience in unit testing the applications using JUnit, TDD Framework.
  • Experience in generating Reports and Dashboards on Dynatrace (APM tool) and Splunk.
  • Experience with Apache Kafka and Zookeeper apart from JMS as messaging service.
  • Experience in using JSP and Servlets to create web interfaces and in integrating Apache tools like Kafka.
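The Java 8 features mentioned above (Streams API, lambda expressions, functional interfaces, method references) can be illustrated with a minimal, self-contained sketch; the class and method names here are illustrative, not from any specific project:

```java
import java.util.List;
import java.util.function.Predicate;
import java.util.stream.Collectors;

public class StreamDemo {
    // Filter names by a predicate and upper-case them using the Streams API.
    // Predicate<String> is a built-in functional interface; a lambda supplies its test().
    public static List<String> filterAndUpperCase(List<String> names, Predicate<String> keep) {
        return names.stream()
                .filter(keep)
                .map(String::toUpperCase)       // method reference instead of s -> s.toUpperCase()
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> result = filterAndUpperCase(
                List.of("kafka", "spring", "junit"),
                name -> name.startsWith("k"));  // lambda expression
        System.out.println(result);             // [KAFKA]
    }
}
```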

TECHNICAL SKILLS

Programming Languages: Java/J2EE, SQL, PL/SQL, JavaScript, Perl, XML, jQuery

Database skills: Oracle, PL/SQL, SQL, Oracle 11g, MySQL, MS SQL Server 2008, SQL Developer, Toad, SQLite3, Microsoft Access, EBX, PostgreSQL

Web Technologies: HTML5, XML, CSS, XSL, AJAX 2.0, .Net, XSL, XHTML

Web/Application Servers: JBOSS, Apache Tomcat7.1, WebLogic, IBM WebSphere

Scripting languages: JavaScript, Perl, Shell scripting

IDE’s: Eclipse, STS, NetBeans, IntelliJ, Sublime, RAD.

Version control tools: CVS, SVN, Git.

Frameworks: Spring 1.2/1.3/2.0/3.2, MVC, Hibernate 2.0/3.0.

Build/Testing tools: Ant, Maven, Jenkins, Hudson, JUnit, Mockito, Power Mockito, Log4j, SOAP UI, GITHUB, Postman.

Operating system: LINUX, UNIX, Windows 7, Mac, Vista.

PROFESSIONAL EXPERIENCE

Confidential, AZ

Java/Kafka Developer

Responsibilities:

  • Developed REST API's using Spring Boot. Developed the API endpoint based on the RESTful Webservices.
  • Used Postman to test the REST API end points.
  • Utilized Spring Boot framework to develop the controller and service classes for interacting with the database.
  • Used Spring Framework for Dependency injection and integration with other layers: Service objects, DAO etc.
  • Experience working with the Confluent Kafka event streaming platform, versions 4.x and 5.x.
  • Experience in utilizing and implementing Confluent Schema Registry with Kafka.
  • Implemented Spring boot microservices to process the messages into the Kafka cluster setup.
  • Experience in using build/deploy tools such as Jenkins and Docker for Continuous Integration & Deployment of microservices.
  • Used Spring Config Server for centralized configuration and Splunk for centralized logging. Used Concourse and Jenkins for microservices deployment.
  • Experience with stream processing using Kafka and Kafka Connect.
  • Implemented reprocessing of failed messages in Kafka using offset IDs.
  • Used Spring Kafka API calls to process the messages on Kafka Cluster setup.
  • Implemented Kafka Stream to retry error topic record.
  • Implemented Kafka Consumer Rebalancer to save the offset.
  • Implemented latest Kafka consumer incremental rebalancing.
  • Used Kafka stream state store for aggregations.
  • Configured RocksDB to avoid unbounded memory growth.
  • Used Kafka interactive query to query the state store.
  • Used multithreading to implement multiple consumers.
  • Replaced custom code with Kafka Connect wherever possible to avoid complexity.
  • Implemented manual commit in Kafka Consumer.
  • Implemented retry logic with fixed delay before processing.
  • Implemented Kafka log compaction mechanism for restoring state in case of system failure.
  • Performed log compaction and deduped mechanism to remove duplicate messages.
  • Implemented KSQL for easy transformation of data within the pipeline.
  • Experience building Kafka Connector for publishing and consuming Kafka messages.
  • Experienced in schema evolution, including forward and backward compatibility.
  • Familiar with KSQL windowing functions such as hopping and tumbling windows.
  • Worked on partition of Kafka messages and setting up the replication factors in Kafka Cluster.
  • Experience in designing high availability, scalable, fault-tolerant AWS Cloud platform.
  • Performed export and import of data into Amazon AWS S3 from multiple data sources.
  • Experience in writing and retrieving files to and from AWS S3 bucket for UI to render data faster that involves complex and time-consuming server-side logic.
  • Created Stored Procedures and other SQL scripts using PL-SQL.
  • Provide postproduction support for the project.
  • Developed API for using AWS Lambda to manage the servers and run to code in the AWS.
  • Knowledge of Kafka API.
  • Designed and configured topics in the new Kafka cluster across all environments.
  • Exposure to and knowledge of managing streaming platforms on cloud providers (Azure, AWS & EMC).
  • Worked efficiently with tools including, but not limited to, Kafka, Zookeeper, Console Producer, Console Consumer, Kafka Tool, Filebeat, Metricbeat, Elasticsearch, Logstash, Kibana, Spring Tool Suite, and Apache Tomcat Server.
  • Operations - Worked on Enabling JMX metrics.
  • Operations - Involved with data cleanup for JSON and XML responses that were generated.
  • Secured the Kafka cluster with Kerberos, and implemented Kafka security features using SSL in environments without Kerberos. For more fine-grained security, set up Kerberos users and groups to enable more advanced security features.
  • Integrated Apache Kafka for data ingestion.
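Several bullets above describe retrying failed messages with a fixed delay, routing exhausted records to an error topic, and committing offsets manually. A minimal, library-independent sketch of that retry/dead-letter logic follows; all names are hypothetical, and a real implementation would use the Kafka consumer/producer APIs rather than an in-memory list:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class RetryProcessor {
    private final int maxAttempts;
    private final long fixedDelayMs;
    // Stands in for an error/dead-letter topic; a real implementation would produce to Kafka.
    private final List<String> errorTopic = new ArrayList<>();

    public RetryProcessor(int maxAttempts, long fixedDelayMs) {
        this.maxAttempts = maxAttempts;
        this.fixedDelayMs = fixedDelayMs;
    }

    // Attempt to process a record, retrying with a fixed delay between attempts.
    // Returns true if processing eventually succeeded, false if the record was
    // routed to the error topic after exhausting all attempts.
    public boolean process(String record, Consumer<String> handler) {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                handler.accept(record);
                return true;                     // success: a real consumer would commit the offset here
            } catch (RuntimeException e) {
                if (attempt == maxAttempts) {
                    errorTopic.add(record);      // exhausted: hand off for later reprocessing
                    return false;
                }
                try {
                    Thread.sleep(fixedDelayMs);  // fixed delay before the next attempt
                } catch (InterruptedException ie) {
                    Thread.currentThread().interrupt();
                    errorTopic.add(record);
                    return false;
                }
            }
        }
        return false;
    }

    public List<String> errorTopic() {
        return errorTopic;
    }
}
```

In a real consumer, auto-commit would be disabled (`enable.auto.commit=false`) and `commitSync()` called only after `process(...)` returns true, matching the manual-commit bullet above.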

Environment: J2EE, Java 1.8, Agile (SCRUM), REST API, Spring DI/IOC, Spring Boot, STS, Spring JDBC, Spring MVC, Hibernate, JSP, Web Services, SQL Server, Microservices, XML, jQuery, JavaScript, Apache Maven, Apache Cassandra, MongoDB, JUnit, Html, HTML/DHTML, JENKINS, Kafka, POSTMAN, Log4J, GIT, Oracle, UNIX.

Confidential, NJ

Java/Kafka Developer

Responsibilities:

  • Actively involved in Analysis, Design, Development, System Testing and User Acceptance Testing. Successfully followed agile methodology in the Scrum Cycle model.
  • Designed and developed Microservices using REST framework and Spring Boot.
  • Extensively used MVC, Factory, Delegate and Singleton design patterns.
  • Decomposed the existing monolithic code base into Spring Boot microservices. Developed new features and provided support for all microservices.
  • Performed Real time event processing of data from multiple servers in the organization using Apache Storm by integrating with Apache Kafka.
  • Installed Kafka on an AWS cluster and wrote the producer and consumer code in Java. Loaded data from various data sources into AWS S3 using Kafka.
  • Experience in developing Spark streaming code in Scala to pull data from Kafka topics.
  • Performed comparative analysis of the Hive vs Impala.
  • Implemented ingestion services to connect to RDBMS using Kafka producer API’s / Kafka Connect.
  • Developed Spring boot application with Microservices and deployed it into AWS using EC2 instances.
  • Used Cassandra as the backend database to retrieve Data Access Objects.
  • Involved in setting up the application to run on AWS cloud environment on multiple EC2 instances.
  • Experience on working several AWS services like EC2, S3, ELB, SNS, ALB, and ECS.
  • Experience in implementing APIs in a Java multi-threaded environment.
  • Wrote unit test cases using JUnit as well as integration test cases, and integrated them with Jenkins jobs.
  • Designed services to store and retrieve user data using Mongo DB database and communicated with remote servers using REST enabled Web Services on Jersey framework.
  • Worked with the iBATIS ORM tool to connect the Java code to the database (Oracle).
  • Experience in integration tools like Spring Integration, Apache Kafka and Apache Camel to integrate the enterprise application technologies with existing JVM environment.
  • Involved in writing application level code to interact with APIs, Web Services using AJAX, JSON and XML.
  • Wrote JSPs for user interfaces; the JSPs use JavaBeans objects to produce responses.
  • Used JUnit framework for Unit testing of application.
  • Involved in testing the XML files and checked whether data is parsed and loaded to staging tables.
  • Configured and built Spring MVC application on Tomcat web server.
  • Worked on UNIX Shell scripts and Cron jobs.
  • Created and built project using Maven.
  • Involved in designing the page layouts using Wire Frames templates.
  • Used extensively Eclipse in development and debugging the application.
  • Validating the Application by Deploying and testing on JBoss Server.
  • Analyzed large data sets by running Hive queries and Shell scripts.
  • Experience in importing and exporting data from HBase using Spark.
  • Exposed the endpoint for Swagger and developed APIs for documenting RESTful Web services.
  • Finally, handled backup creation, adding clients, configs, patching, and monitoring.
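The partitioning work mentioned above (assigning Kafka messages to partitions and setting replication factors) can be illustrated with a simplified key-based partitioner. This is a sketch only: Kafka's actual default partitioner uses murmur2 hashing rather than `String.hashCode()`, and handles null keys differently.

```java
public class SimplePartitioner {
    private final int numPartitions;

    public SimplePartitioner(int numPartitions) {
        if (numPartitions <= 0) {
            throw new IllegalArgumentException("numPartitions must be positive");
        }
        this.numPartitions = numPartitions;
    }

    // Map a record key to a partition. Records with the same key always land on
    // the same partition, which is what preserves per-key ordering in Kafka.
    public int partitionFor(String key) {
        // floorMod avoids a negative result when hashCode() is negative
        return Math.floorMod(key.hashCode(), numPartitions);
    }
}
```

Replication factor is a separate, per-topic setting configured at topic creation time; it controls how many brokers hold a copy of each partition, not how records map to partitions.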

Environment: J2EE, Java 1.8, Spring framework, Spring MVC, Hibernate, JSP, AJAX, SOA, Web Services, SQL Server, Microservices, XML, Ext JS, NodeJS, jQuery, JavaScript, WebSphere 8.5, Clear Case, Apache Maven, Apache Cassandra, MongoDB, JUnit, Html, Unit, XSLT, HTML/DHTML, JENKINS, Spark, Kafka.

Confidential, NY

Java Developer

Responsibilities:

  • Involved in various phases of Software Development Life Cycle (SDLC) of the application like Requirement gathering, Analysis, Design and Code development.
  • Developed business components using Java objects, Core Java, multithreading, and Collections.
  • Developed an end-to-end application on the Spring Boot framework (REST API application / Spring JPA using CrudRepository).
  • Created reproducible infrastructure environments for the microservices architecture (Ansible, AWS CloudFormation).
  • Used Maven build script for building and deploying the application and Designing new database tables for the enhancements.
  • Created user-defined exception classes using Java.
  • Used Core, Context, and Security, Bean modules of Spring Framework.
  • Designed and implemented the backend layer using Hibernate.
  • Involved in multi-tiered J2EE design utilizing Spring Inversion of Control (IOC) architecture and Hibernate.
  • Applied design patterns including MVC Pattern, Façade Pattern, Abstract Factory Pattern, DAO Pattern and Singleton.
  • Implemented Hibernate for data persistence in Workflow screens. Used RAD6, WSAD as IDE for development of the application. Worked with WebLogic, Tomcat as the application servers and JMS as the message Server.
  • Performed Clear Quest defects, Database change requests logging using Clear Quest. Used Perforce as versioning system.
  • Worked on Code design and develop the code in Java /J2EE components including Core Java, JavaScript, JSP/Servlets, Building Restful Web services, with SQL, Sybase DB.
  • Implemented and tested the enterprise application with jQuery, Spring MVC.
  • Extensive usage of BOOTSTRAP and LESS CSS for Responsive design.
  • Developed SQL scripts for data migration.
  • Implemented MVC, Singleton, Factory, DAO, Value Object, session, Facade, Data Access Object, Business Object paradigm.
  • Configured the XML files for Hibernate 3 and applied its Object Relational Mapping ORM to enable the data transactions between POJO and Oracle Database using caching on Session Factory as well as caching on query.
  • Strong experience in implementing Service-Oriented Architecture and Web Services using SOAP and RESTful APIs.
  • Wrote the Spring configuration file to define beans, the data source, and Hibernate properties.
  • Extensively used Hibernate in the data access layer for database access and updates.
  • Understood user change requirements, analyzed the source systems, and developed UI pages using JSP and the Spring MVC framework.
  • Performed validations using the Spring MVC validation framework and JavaScript.
  • Designed, developed and implemented unit tests and product features.
  • Maintained, structured, and surveyed documents within the NoSQL, MongoDB database; ensuring data integrity, correcting anomalies, and increasing the overall maintainability of the database.
  • Responsible for building/deploying consistently repeatable build/deployments to company non-production environments using JENKINS & BUILD Pipelines.
  • Managed object creation and wiring using Spring IoC.
  • Followed Scrum/Agile Methodology during the software development life cycle.
  • Extensively worked with XML while using Maven, Dispatcher Servlet etc.
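Two of the design patterns listed above, Singleton and DAO, can be sketched together in a few lines of plain Java. The in-memory map stands in for the Hibernate/Oracle persistence layer, and all names are illustrative:

```java
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

// DAO pattern: data access hidden behind a narrow interface,
// combined with a thread-safe Singleton for the shared instance.
public class UserDao {
    // Singleton via eager static initialization (thread-safe by class-loading rules).
    private static final UserDao INSTANCE = new UserDao();

    // In-memory store standing in for a real database session (e.g. Hibernate).
    private final Map<Long, String> usersById = new ConcurrentHashMap<>();

    private UserDao() { }   // private constructor prevents outside instantiation

    public static UserDao getInstance() {
        return INSTANCE;
    }

    public void save(long id, String name) {
        usersById.put(id, name);
    }

    public Optional<String> findById(long id) {
        return Optional.ofNullable(usersById.get(id));
    }
}
```

Callers depend only on `save`/`findById`, so the storage mechanism can be swapped (JDBC, Hibernate, a mock for JUnit tests) without touching business code; that separation is the point of the DAO pattern.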

Environment: Java SE, JDK 1.7 and 1.8, Hibernate 3.0, Spring, Groovy, JSP, HTML, CSS, Angular, jQuery, XML, XSLT, SQL Server, Maven, Apache Tomcat 8, Eclipse 6.0, SVN, Jenkins, Spring Boot, JUnit, Mongo DB, Docker.
