Kafka Developer Resume
Dallas, TX
SUMMARY:
- Results-driven IT professional with 11+ years of experience in software design, development, and requirement analysis, including 2 recent years with Big Data ecosystems: ingestion, storage, querying, processing, and analysis of large datasets.
- Started career as a Java developer; moved into Big Data/Hadoop technologies and have worked as a Big Data developer on the last two projects.
- Hands-on software development experience with Java/J2EE technologies across varied domains: media, financial services, real estate, telecom, manufacturing, state agencies, and e-commerce.
- Excellent understanding of Hadoop architecture and core components such as NameNode, DataNode, ResourceManager, and NodeManager, and of other distributed components in the Hadoop platform.
- Experience with the Spark framework for both batch and real-time data processing.
- Skilled in developing Hadoop integrations for data ingestion, data mapping, and data processing.
- Working knowledge of DevOps tools such as Git, Jenkins, and Docker.
- Working knowledge of NoSQL databases such as Cassandra, HBase, and MongoDB.
- Excellent skills in implementing client- and server-side components using Java/J2EE technologies (JSP, Servlets, JDBC, EJB, and JSF).
- Extensive experience in the design, development, and implementation of Model-View-Controller (MVC1, MVC2) applications using Struts 2.x, Spring, and JSF.
- Expertise in development tools such as Eclipse, MyEclipse, and STS.
- Expertise in designing applications using J2EE design patterns such as Singleton, Value Object, Data Access Object, Factory, Session Facade, Business Delegate, and Service Locator.
- Extensive experience in using and configuring application servers such as BEA WebLogic, JBoss, and Tomcat.
- Extensive experience in the development and implementation of ORM frameworks such as Hibernate.
- Good knowledge of AJAX, SOAP, Web Services, and WebLogic Portal.
- Solid experience with RDBMSs such as Oracle: creating and maintaining SQL databases, JDBC, T-SQL, tables, stored procedures, views, functions, triggers, and PL/SQL.
- Extensive use of web technologies such as JavaScript, HTML, CSS, ExtJS, and XML.
- Expertise in SQL scripting, with exposure to stored procedures and the database design process.
- Experience with source control systems such as ClearCase, CVS, Perforce, and SVN, and with database tools such as TOAD.
- Expertise in developing enterprise applications using N-tier architecture.
- Worked under CMMI-level standards, ensuring the company's software development standards and methodology were followed.
- Excellent interpersonal, verbal, and written communication skills.
PROFESSIONAL EXPERIENCE:
Confidential, Dallas, TX
Kafka Developer
Responsibilities:
- Implemented Spring Boot microservices to produce messages into the Kafka cluster.
- Worked as onshore lead to gather business requirements and guided the offshore team in a timely fashion.
- Worked closely with the Kafka admin team to set up Kafka clusters in the QA and production environments.
- Used Kibana and Elasticsearch to identify Kafka message-failure scenarios.
- Implemented reprocessing of failed Kafka messages using their offset IDs.
- Implemented Kafka producer and consumer applications on a Kafka cluster coordinated by ZooKeeper.
- Used the Spring Kafka API to process messages reliably on the Kafka cluster.
- Wrote external API calls integrating Sequentra with LeaseAccelerator (LA) and with the Leverton integration system.
- Used the Spark API to generate PairRDDs in Java.
- Knowledge of Kafka message partitioning and of setting replication factors in a Kafka cluster.
- Worked on Big Data integration and analytics based on Hadoop, Solr, Spark, Kafka, Storm, and webMethods.
- Improved performance and optimized existing Hadoop algorithms with Spark, using SparkContext, Spark SQL, DataFrames, PairRDDs, and Spark on YARN.
- Imported data from various sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from SQL Server into HDFS using Sqoop.
Environment: Java, Python, Sqoop, Spring Boot, Microservices, jQuery, JSON, Git, Jenkins, Docker, Maven, Apache Kafka, Apache Spark, SQL Server, Kibana, and Elasticsearch
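The offset-based replay of failed messages described above can be sketched without a broker. In the real consumer this maps to Kafka's seek(partition, offset) call; the topic contents and record payloads below are hypothetical, and the function only simulates what a consumer re-reads after seeking back.

```python
# Framework-free sketch of offset-based reprocessing (a simulation, not the
# Kafka client API): given a partition's log of (offset, record) pairs and
# the offset where processing failed, return everything from that offset on,
# i.e. the records a consumer would re-read after seeking back to it.

def replay_from_offset(partition_log, failed_offset):
    """Records at or after `failed_offset`, in offset order."""
    return [record for offset, record in sorted(partition_log)
            if offset >= failed_offset]

# Hypothetical partition contents.
lease_events = [(0, "created"), (1, "amended"), (2, "terminated"), (3, "renewed")]

# Processing failed at offset 2 -> offsets 2 and 3 are reprocessed.
print(replay_from_offset(lease_events, 2))  # ['terminated', 'renewed']
```

In production the same idea is one call on the consumer (seek to the recorded failure offset and poll again); the sketch just makes the replay semantics explicit.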
Confidential, Albany NY
Bigdata Developer
Responsibilities:
- Analyzed the Hadoop cluster using big data tools including Flume, Pig, Hive, HBase, Oozie, ZooKeeper, Sqoop, Spark, and Kafka.
- Developed Spark code using Scala and Spark SQL/Streaming for faster testing and processing of data.
- Used the Spark API over Cloudera Hadoop YARN to perform analytics on data in Hive.
- As a Big Data developer, implemented solutions for ingesting data from various sources and processing data at rest using Big Data technologies such as Hadoop, MapReduce, MongoDB, Hive, Oozie, Flume, Sqoop, and Talend.
- Developed a job server (REST API, Spring Boot, Oracle DB) and job shell for job submission, job-profile storage, and job-data (HDFS) query/monitoring.
- Improved performance and optimized existing Hadoop algorithms with Spark, using SparkContext, Spark SQL, DataFrames, PairRDDs, and Spark on YARN.
- Imported data from various sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from SQL into HDFS using Sqoop.
- Developed analytical components using Scala, Spark, Apache Mesos, and Spark Streaming.
- Installed Hadoop, MapReduce, and HDFS, and developed multiple MapReduce jobs in Pig and Hive for data cleaning and pre-processing.
- Worked on Big Data integration and analytics based on Hadoop, Solr, Spark, Kafka, Storm, and webMethods.
- Managed Hadoop jobs using the Oozie workflow scheduler for MapReduce, Hive, Pig, and Spark actions.
- Performed transformations, cleaning, and filtering on imported data using Hive and MapReduce, and loaded the final data into HDFS.
Environment: Hadoop, J2EE, JavaScript, HDFS, Apache Spark, Sqoop, Apache Kafka, HBase, Scala, Python, Java, SQL scripting, Talend, Linux shell scripting, Cassandra, ZooKeeper, Cloudera, Cloudera Manager, Oracle.
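The PairRDD pattern mentioned in the bullets above (map records to key/value pairs, then reduce by key) can be sketched without a Spark cluster; in PySpark the equivalent is rdd.map(...).reduceByKey(add). The click-stream records and field layout here are hypothetical.

```python
# Spark-free sketch of the map -> reduceByKey aggregation pattern.
from operator import add

def map_pairs(records, key_fn, value_fn):
    """Equivalent of rdd.map(lambda r: (key_fn(r), value_fn(r)))."""
    return [(key_fn(r), value_fn(r)) for r in records]

def reduce_by_key(pairs, reducer):
    """Equivalent of pair_rdd.reduceByKey(reducer): fold values per key."""
    acc = {}
    for key, value in pairs:
        acc[key] = reducer(acc[key], value) if key in acc else value
    return acc

# Hypothetical click-stream records: (page, hit count).
events = [("home", 1), ("search", 1), ("home", 1), ("cart", 1), ("home", 1)]
pairs = map_pairs(events, key_fn=lambda e: e[0], value_fn=lambda e: e[1])
print(reduce_by_key(pairs, add))  # {'home': 3, 'search': 1, 'cart': 1}
```

The reducer must be associative (like addition) for this to match Spark's behavior, since Spark applies it in arbitrary order across partitions.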
Confidential, Albany NY
Senior Java Lead
Responsibilities:
- Collected business requirements from the client and business analyst, and developed the Functional Specification Document (FSD).
- Developed the High-Level Design Document and Low-Level Design Document.
- Created the SDD (System Design Document) from scratch based on the FSD.
- Developed a geocoding RESTful web service that provides the longitude and latitude of crime locations for plotting on Google Maps.
- Designed the UI using the JSF framework and configured the UI for all global access servers.
- Implemented business rules such as calculating the top charge among multiple crimes for an incident number, separating non-Part 1 crimes, and finding the UCR code of each crime.
- Performed coding and unit testing, and configured the applet for remote sessions.
- Wrote a standalone cron job scheduler to keep Part 1 crimes loaded with accurate, up-to-date geocoding.
- Wrote similar scheduler programs to load DIR, UTT, and FI data.
- Developed major use cases such as Geography Search and Radius Search from scratch.
- Contributed to project execution, working collectively to achieve project goals.
- Performed unit testing and bug fixing.
- Used Log4j for logging and debugging.
- Used SVN as the version control system and PuTTY to deploy the modifications.
- Actively participated in the complete life cycle of the project, from the requirements phase to production phase.
Environment: Java, J2EE, JavaScript, JSF, Web Services, Log4j 1.2.8, Maven, WebSphere, JPA, RAD, Tomcat 6.2, PostgreSQL, WinSCP, SVN, and PuTTY.
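The Radius Search use case above reduces to a great-circle distance filter over geocoded records. A minimal sketch using the haversine formula follows; the incident records, coordinates, and field names are hypothetical.

```python
# Sketch of a Radius Search: keep records whose geocoded (lat, lon) lies
# within `radius_miles` of a center point, via the haversine formula.
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_MILES = 3958.8

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two lat/lon points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

def radius_search(records, center, radius_miles):
    """Filter geocoded records to those within radius_miles of center."""
    clat, clon = center
    return [r for r in records
            if haversine_miles(clat, clon, r["lat"], r["lon"]) <= radius_miles]

# Hypothetical geocoded incidents near Albany, NY.
incidents = [
    {"id": 1, "lat": 42.6526, "lon": -73.7562},  # downtown Albany
    {"id": 2, "lat": 42.8142, "lon": -73.9396},  # Schenectady, ~15 mi away
]
print([r["id"] for r in radius_search(incidents, (42.6526, -73.7562), 5)])  # [1]
```

A production version would push the distance predicate into the database (e.g. a spatial index) rather than scanning every row, but the filtering logic is the same.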
Confidential, Albany NY
Senior Java Lead
Responsibilities:
- Used RUP as the software development lifecycle for building the application.
- Prepared UML diagrams (use case, class, and sequence) using Rational Rose for ActionScript classes and Flex components.
- Worked with the manager to analyze appropriate services for the present system.
- Performed analysis, coding, testing, and development of J2EE and ArcGIS components.
- Configured the project environment for Flex-Spring Java communication using BlazeDS remoting.
- Used Ajax/JSON POST calls to access Spring methods.
- Created configuration files such as remoting-config.xml and services-config.xml.
- Wrote the Ant build file to compile and deploy Flex code.
- Developed, tested, and deployed on Tomcat and WebSphere 6 application servers.
- Used MyEclipse Blue 9 as the IDE with the Flash Builder plugin.
- Worked on integrating the Struts framework, Spring MVC, and Hibernate into the application.
- Analyzed the business requirements and designed the part of the system being upgraded.
- Used Oracle 10g as the database and TOAD to write stored procedures.
- Worked with JSP, Servlets, AJAX, JavaScript, JUnit, jQuery, and JDBC.
- Used asynchronous web services to send messages and continue processing.
- Analyzed and designed the required classes and database.
- Extensively worked with JavaScript, AJAX, and jQuery for front-end validations.
- Maintained source code versions in a CVS repository.
- Used a bug-tracking tool to coordinate updates with the testing team.
- Used ArcSDE for spatial queries and the ArcGIS Flex API.
- Used Tomcat as the test deployment server and WebSphere for production.
- Wrote backend connectivity code, including JDBC/SQL/PL-SQL/stored procedures to interact with the Oracle database, and code for interfacing with external systems.
- Actively participated in the complete life cycle of the project, from the requirements phase to production phase.
Environment: WebSphere 6, MyEclipse 8.6 & 9, Tomcat 7, J2EE, SQL, Spring 2.5, BlazeDS, Struts 2.1.8, ArcGIS Flex API, JavaScript, Bootstrap, Backbone, AngularJS, Oracle 10g, AJAX, CVS, JSP, Servlets, JUnit 3.8, jQuery, Hibernate, EJB 3, Log4j 1.2.14, LDAP, ArcIMS, ArcGIS 10, Web Services, SOAP, CSS, XML.
Confidential, Southborough, MA
Sr Java Programmer
Responsibilities:
- Collected business requirements from the client and business analyst, and developed the Functional Specification Document (FSD).
- Developed the High-Level Design Document and Low-Level Design Document.
- Created the SDD (System Design Document) based on the FSD.
- Used Spring annotations as well as XML configuration for dependency injection, and Spring Batch for running batch jobs.
- Designed the UI using the JSF framework and configured the UI for all global access servers.
- Extensively worked on JavaScript and Java code in action classes, EJBs, and DAOs.
- Performed coding and unit testing, and configured the applet for remote sessions.
- Contributed to project execution, working collectively to achieve project goals.
- Integrated the SR system in compliance with Axeda- and Confidential-defined metrics, coding standards, and best practices.
- Implemented the application for other brands.
- Performed unit testing and bug fixing.
- Used Log4j for logging and debugging.
- Participated in review, status, and conference meetings.
- Used SVN and Perforce as version control systems and PuTTY to deploy the modifications.
- Actively participated in the complete life cycle of the project, from the requirements phase to production phase.
Environment: Java, J2EE, JavaScript, JSF, Spring, Web Services, Log4j 1.2.8, Maven, Groovy, UML, Hibernate, Raptor, WebLogic 10.1, Oracle 9i, WinSCP, Perforce, SVN, and PuTTY.
Confidential, Iowa City, IA
Sr Java Programmer
Responsibilities:
- Collected business requirements from the client and business analyst, and developed the Functional Specification Document (FSD).
- Developed the High-Level Design Document and Low-Level Design Document.
- Created the SDD (System Design Document) based on the FSD.
- Used IBM Rational XDE to create use case, activity, class, and sequence diagrams for the requirements, and actively participated in design meetings, identifying scenarios for converting use-case requirements into stub-enabled code.
- Used the Struts framework to write beans and action classes.
- Extensively worked on JavaScript and Java code in action classes, EJBs, and DAOs.
- Used Ajax to retrieve application data from the database based on user profile information; form data is retrieved from web pages and a URL is built to connect to the server.
- Implemented cross-platform integration using HTTP, XML web services, and SOAP.
- Analyzed validation rules and implemented client-side validation using JavaScript.
- Played a key role in taking the system live and stabilizing it.
- Developed the Make a Journal Entry flow, the heart of the Fee Processing module.
- Developed nearly 10 use cases, including Distribute Existing Funds, Scholarship Distribution, Super Utility, Approval, and Third Party Payer Account Setup.
- Developed various journal entry reports, such as Cash Payment, Customer Payment, and TPP UDA Distribution.
- Wrote stored procedures and functions as part of Fee Processing.
- Worked on the Cybersource reconciliation report for credit card and eCheck payments.
- Implemented a POST URL in cybersecure to resolve timeout issues with Cybersource's HOP with respect to NBRIC.
- Worked on Search TPP and Search Customer Payment.
- Used SVN and Perforce as version control systems and PuTTY to deploy the modifications.
- Used Bugzilla as the bug-tracking system.
- Actively participated in the complete life cycle of the project, from the requirements phase to production phase.
Environment: Java, J2EE, Struts 2.0, JavaScript, Ajax, Web Services, Log4j 1.2.8, Ant, CSS, UML, Hibernate, Raptor, WebLogic 10.1, Oracle 9i, RapidSVN, WinSCP, Perforce, SVN, and PuTTY.
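The reconciliation report mentioned above boils down to matching gateway transactions against internal journal entries by transaction ID and amount. A minimal sketch follows; the record layout, field names, and transaction IDs are hypothetical, not the actual Cybersource feed format.

```python
# Sketch of payment reconciliation: split gateway transactions into
# matched, amount-mismatched, and missing-from-journal buckets.

def reconcile(gateway_txns, journal_entries):
    """Compare each gateway transaction against the journal by txn_id."""
    journal = {j["txn_id"]: j["amount"] for j in journal_entries}
    matched, mismatched, missing = [], [], []
    for txn in gateway_txns:
        if txn["txn_id"] not in journal:
            missing.append(txn["txn_id"])        # gateway has it, journal doesn't
        elif journal[txn["txn_id"]] != txn["amount"]:
            mismatched.append(txn["txn_id"])     # amounts disagree
        else:
            matched.append(txn["txn_id"])        # fully reconciled
    return matched, mismatched, missing

# Hypothetical credit-card (CC) and eCheck (EC) transactions.
gateway = [{"txn_id": "CC-1", "amount": 250.00},
           {"txn_id": "EC-2", "amount": 75.50},
           {"txn_id": "CC-3", "amount": 40.00}]
journal = [{"txn_id": "CC-1", "amount": 250.00},
           {"txn_id": "EC-2", "amount": 75.00}]
print(reconcile(gateway, journal))  # (['CC-1'], ['EC-2'], ['CC-3'])
```

A real report would also flag journal entries absent from the gateway feed and use exact decimal types for money; the bucketing logic stays the same.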
Confidential
Sr. Java Developer
Responsibilities:
- Collected business requirements by interacting with the business analyst.
- Participated in the kickoff meeting to discuss the collected business requirements.
- Created the FSD (Functional Specification Document) based on the business requirements.
- Created the SDD (System Design Document) based on the FSD.
- Created use case, activity, class, and sequence diagrams for the requirements using IBM Rational XDE.
- Used the Spring framework to write beans and action classes.
- Extensively worked on JSPs, JavaScript, and Java code in action classes, EJB session beans, and DAOs.
- Involved in requirements gathering, coding, and unit test cases.
- Involved in web services for the schedule-installation and validate-config services.
- Involved in testing SFDC data against the application.
- Used a Hibernate persistence layer in the DAO layer to store application data in the database; implemented complex mappings according to project requirements.
- Analyzed validation rules with the business analyst and implemented client-side validation using JavaScript.
- Worked on web services to obtain the coordinates for a given input address.
- Actively participated in the complete life cycle of the project, from the requirements phase to production phase.
Environment: Java, J2EE, JSP, WebLogic Workshop, Spring 2.0, Hibernate 3.1, EJB, JSF, BEA WebLogic 7.0, IBM Rational XDE, Oracle 9i, TOAD, Log4j, XML, Web Services, JUnit, SVN
Confidential
Software Developer
Responsibilities:
- Participated in discussions with the business analyst while preparing the specifications.
- Involved in the design, development, and testing of ESU parity devices.
- Involved in system integration and integration testing.
- Involved in service creation, discovery, and audit between the ESU devices.
- Involved in fixing change requests (CRs) for ESU 1800/ESU 1850 devices.
Environment: Core Java, JSP, Servlets, Tomcat 5.0, Eclipse, Oracle, HTML, Log4j, CVS, Team Track