Kafka Developer Resume
SUMMARY:
- 11 years of experience designing and developing enterprise-grade, secure, and scalable solutions based on enterprise design patterns and industry best practices.
- Extensive experience with Core Java, including multithreading and the Collections Framework
- Strong working knowledge of Core Java and J2EE technologies, including JDK 1.8 features such as functional interfaces, default and static methods in interfaces, lambda expressions, streams, and functional programming.
- Deep understanding and working knowledge of Confluent Kafka and Apache Kafka components.
- Good working knowledge of Kafka integrations with other systems such as AWS, PostgreSQL, GCS, IBM MQ, and Hadoop using Kafka Connect, microservices, and streaming libraries.
- Strong application development experience with Kafka
- Strong experience building stateless and stateful event processors with Kafka
- Familiarity with Kafka stream processors and the Kafka Streams API
- Development and deployment experience in cloud environments such as AWS and GCP
- Extensive experience with Apache Kafka, Confluent Kafka, and Kafka Streams using KStream, KTable, KGroupedStream, and KSQL
- Extensive experience with Kafka Connect, Schema Registry, and REST Proxy servers
- Good working knowledge of Control Center and Kafka monitoring
- Good working knowledge of implementing SSL and SASL authentication between brokers, ZooKeeper, and clients
- Extensive experience with Spring Boot microservices, Spring Cloud components, and cloud deployments.
- Extensive experience with Confluent Kafka, Kafka components, and real-time messaging systems.
- Good understanding and working knowledge of building data pipelines using Kafka and other systems
- Good understanding and working knowledge of real-time stream processing using Kafka Streams and KSQL
- Expertise in developing applications on Java Platform using enterprise design patterns, architecture principles and popular open-source frameworks (Spring Boot, Hibernate, Kafka, Spark, Ansible).
- Developing scalable data platforms for IoT sensor data
- Good experience with time-series/metrics databases
- Experience with Postgres and DynamoDB
- Familiar with AWS serverless application components
- Familiarity deploying Java microservices with Kubernetes
- Good experience with Docker and Container deployments
- Strong knowledge of DevOps best practices such as continuous integration and continuous delivery, and related tools (Ansible, OpenShift).
- Good familiarity with secure coding best practices and principles, tools for static and dynamic security code analysis, and risk assessment methodologies.
- Ability to contribute to each step of the product development process from ideation to implementation to release, including rapid prototyping and driving solution approach.
- Work in cross-functional teams to drive the effective creation of Minimum Viable Products that bring business value faster.
- Ability to learn new technologies and influence the team and leadership to constantly implement modern solutions.
- Experience with Agile methodology and Scrum implementation
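A minimal, self-contained sketch of the JDK 1.8 features listed above (functional interfaces, default and static methods in interfaces, lambda expressions, and streams); class and method names are illustrative, not from any specific project:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class Java8FeaturesDemo {

    // A functional interface with a default method and a static method (JDK 1.8 features)
    @FunctionalInterface
    interface Transformer {
        String apply(String input);

        // Default method: compose this transformer with an uppercase step
        default Transformer andThenUpper() {
            return s -> apply(s).toUpperCase();
        }

        // Static method declared on an interface
        static Transformer identity() {
            return s -> s;
        }
    }

    // Trims and uppercases each element using a method reference, a lambda, and the Streams API
    static List<String> clean(List<String> raw) {
        Transformer trimUpper = ((Transformer) String::trim).andThenUpper();
        return raw.stream()
                  .map(trimUpper::apply)
                  .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(clean(Arrays.asList(" kafka ", " streams ")));
        // prints [KAFKA, STREAMS]
    }
}
```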
TECHNICAL SKILLS:
Operating Systems: Ubuntu, Red Hat, Windows 10
Languages: Java, C++, Ruby, Python, JavaScript, HTML 5, CSS, SQL, Unix Scripting
Databases: Oracle, MySQL, DB2, PostgreSQL, SQL Server, Hive, MongoDB
Servers: WebLogic, WebSphere, JBoss, Tomcat, HA Proxy, Nginx
Frameworks: Java EE, Spring, Spring Boot, Hibernate, Jersey, Lucene, Avro, JUnit, Mockito, TestNG, Selenium
Services: Confluent Kafka, Apache Kafka, ZooKeeper, Confluent Schema Registry, REST Proxy, Kafka Connect, Control Center, MongoDB, RabbitMQ, IBM MQ, Hadoop, OpenStack, AWS Cloud
DevOps: Git, Jenkins, SonarQube, Chef, Ansible, Docker, Kubernetes
Monitoring/Logging: New Relic, Zabbix, ELK (Elasticsearch, Logstash, Kibana), JMX
Tools: IntelliJ, Eclipse, Jira, Git, Gerrit, Maven, Ant, Fiddler, Fortify, Coverity, Hailstorm, Clover, SoapUI, Jenkins, Bamboo, Rally, Blueprint, Vagrant, VirtualBox, VMware, AWS Cloud
PROFESSIONAL EXPERIENCE:
Confidential
Kafka Developer
Responsibilities:
- Set up, supported, and enhanced a complete Confluent Kafka cluster on Kubernetes
- Implemented authorization, authentication, and encryption using SSL and SASL protocols between ZooKeeper, brokers, and clients
- Enabled role-based access control (RBAC)
- Integrated Kafka with Google Cloud Storage for Tiered Storage
- Integrated Kafka with Cloud SQL databases using Kafka Connect
- Deployed a CDC source connector to move data into Kafka
- Wrote Kafka Streams applications using the Kafka Streams library and KSQL queries
- Monitored the Kafka platform
- Created topics and role bindings
- Worked on stream/table joins and aggregations using time windows
- Wrote REST clients to read schemas from Schema Registry servers
- Wrote producer and consumer programs using the native Kafka client and Spring Boot microservices
- Deployed connectors to move data between Kafka and other systems
- Implemented error-handling mechanisms for producers and consumers
- Implemented SSL/SASL for component communication
- Wrote Avro producer and consumer programs using the native client and Spring Boot
- Deployed various Schema Registry applications
- Wrote Kafka Streams applications using KStream, KTable, and KGroupedStream to process events in real time
- Used Kafka state stores for processing stateful streams and events
- Wrote streaming applications using KSQL
- Processed and enriched events in real time through Kafka
- Wrote code using Java 8 functional programming features
- Wrote Spring Boot microservices to produce and consume messages from the Kafka messaging system
- Wrote a Spring Boot microservice to publish schemas to Schema Registry
- Wrote Kafka producer and consumer modules
- Participated in Kafka cluster setup for brokers and ZooKeeper
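The SSL/SASL broker setup described above is typically driven by properties along these lines (a hedged sketch using standard Kafka property names: the hostname, paths, and passwords are placeholders, and PLAIN is just one of several SASL mechanisms that could have been used):

```properties
# server.properties — SASL_SSL listener (illustrative values only)
listeners=SASL_SSL://0.0.0.0:9093
advertised.listeners=SASL_SSL://broker1.example.com:9093
security.inter.broker.protocol=SASL_SSL
sasl.mechanism.inter.broker.protocol=PLAIN
sasl.enabled.mechanisms=PLAIN

# Keystore/truststore for TLS; clients must also present certificates
ssl.keystore.location=/etc/kafka/secrets/kafka.broker.keystore.jks
ssl.keystore.password=changeit
ssl.key.password=changeit
ssl.truststore.location=/etc/kafka/secrets/kafka.broker.truststore.jks
ssl.truststore.password=changeit
ssl.client.auth=required
```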
Confidential
Responsibilities:
- Wrote Kafka Streams applications using KStream, KTable, and KGroupedStream to process events in real time
- Used Kafka state stores for processing stateful streams and events
- Processed and enriched events in real time through Kafka
- Wrote code using Java 8 functional programming features
- Wrote Spring Boot microservices to produce and consume messages from the Kafka messaging system
- Wrote a Spring Boot microservice to publish schemas to Schema Registry
- Wrote Kafka producer and consumer modules
- Participated in Kafka cluster setup for brokers and ZooKeeper
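The KSQL-style stream/table enrichment mentioned above generally looks like the following (a sketch only: the stream, table, topic, and column names are hypothetical):

```sql
-- Declare a stream over a Kafka topic (hypothetical topic and schema)
CREATE STREAM orders (order_id VARCHAR, customer_id VARCHAR, amount DOUBLE)
  WITH (KAFKA_TOPIC='orders', VALUE_FORMAT='AVRO');

-- Declare a table over a compacted reference-data topic
CREATE TABLE customers (customer_id VARCHAR PRIMARY KEY, region VARCHAR)
  WITH (KAFKA_TOPIC='customers', VALUE_FORMAT='AVRO');

-- Enrich the stream in real time by joining it against the table
CREATE STREAM enriched_orders AS
  SELECT o.order_id, o.amount, c.region
  FROM orders o
  JOIN customers c ON o.customer_id = c.customer_id;
```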
Confidential
Responsibilities:
- Modernized the Citi Alerts platform to support customer growth and enhance user experience
- Implemented a framework to handle Change Data Capture / delta / incremental data load types (kill-and-fill load type)
- Worked with the architecture team to identify design options and ran PoCs for the multi-datacenter replication design
- Implemented a job execution table to write metrics to RDS, including job execution time, start time, end time, record counts before and after, exceptions for job failures, job run ID, and schema
- Implemented a hybrid integration platform using AWS services: SNS, SQS, Lambda, Step Functions, Glue, Athena, Redshift, DynamoDB, and Elasticsearch
- Developed AWS Glue jobs (PySpark) to migrate data from on-prem to the AWS cloud, performed transformations based on the data model, and loaded data into Elasticsearch, one of the consumers of the APIs
- Worked with DevOps and production teams to operationalize Kafka and provided support in troubleshooting production issues
- Built a real-time streaming and ingestion platform for publishing and subscribing to events
Confidential
Responsibilities:
- Involved in sprint planning and story grooming sessions with business stakeholders, analyzing requirements and breaking them down into sizeable user stories for development in each sprint cycle
- Responsible for the design and architecture of the overall application based on functional and non-functional requirements; also established coding and unit-test guidelines that served as the basis for implementing components and developing unit/integration test cases
- Worked with the development team to reproduce, debug, and troubleshoot critical and complex production issues
- Worked with the environment configuration team to set up development, QA, and UAT environments
- Provided the team with a working understanding of the project's technologies and conducted regular knowledge-sharing sessions so every team member could debug and resolve issues independently
Confidential
Responsibilities:
- Developed proofs of concept as part of a technology evaluation exercise to determine the technology "right fit" for the project; documented the results along with recommendations for stakeholders' final production selection
- Part of the core development team responsible for framework components extensively reused across the platform
- Developed the Central Authentication and Signing Service component, which encapsulates the features of the Entrust Security Toolkit (PKI-based cryptographic operations) for authentication and document signing
- Worked closely with enterprise architects to establish coding and testing guidelines, review code, raise potential design concerns, proactively identify areas of improvement, and discuss design alternatives and solution approaches
- Designed and developed Spring controllers for each request
- Developed core functionality using multithreaded programming
- Designed and developed the purchase order and shipment modules using the Spring and Hibernate frameworks, including creation of Hibernate mapping files and generation of the database schema