Confluent Developer Skills for Building Apache Kafka®

Explain the value of a *Distributed Event Streaming Platform*

Explain how the “log” abstraction enables a distributed event streaming platform

Explain the basic concepts of:

  • Brokers, Topics, Partitions, and Segments

  • Records (a.k.a. Messages, Events)

  • Retention Policies

  • Producers, Consumers, and Serialization

  • Replication

  • Kafka Connect
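The "log" abstraction above can be sketched in a few lines of plain Java, with no Kafka dependency: a partition is an append-only sequence, and a record's offset is simply its position in that sequence. The class and method names here are illustrative, not part of the Kafka API.

```java
import java.util.ArrayList;
import java.util.List;

// A toy model of one Kafka partition: an append-only log where the
// offset of a record is its index in the sequence.
public class PartitionLog {
    private final List<String> records = new ArrayList<>();

    // Appending returns the offset assigned to the new record.
    public long append(String record) {
        records.add(record);
        return records.size() - 1;
    }

    // Consumers read from a given offset onward without removing anything;
    // retention (not modeled here) is what eventually deletes old segments.
    public List<String> readFrom(long offset) {
        return records.subList((int) offset, records.size());
    }

    public static void main(String[] args) {
        PartitionLog log = new PartitionLog();
        long first = log.append("driver-1 at (52.3, 4.9)");
        long second = log.append("driver-2 at (48.8, 2.3)");
        System.out.println(first + "," + second + "," + log.readFrom(1).size());
    }
}
```

Because reads never delete data, many independent consumers can replay the same partition from different offsets — which is what makes the log a good foundation for a distributed event streaming platform.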

More Information
Special Product: Yes

Course Features:

  • Enterprise Reporting

  • Lifetime Access

  • CloudLabs

  • 24x7 Support

  • Real-time code analysis and feedback

Course Description

• Illustrate how consumer groups and partitions provide scalability and fault tolerance

• Tune consumers to avoid excessive rebalances

• Explain the difference between “range” and “round robin” partition assignment strategies

• Create a custom `` file

• Use the Consumer API to manage offsets

• Tune fetch requests

• Create a consumer with Confluent REST Proxy
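The tuning knobs listed above all surface as consumer configuration. The sketch below builds such a configuration with plain `java.util.Properties`; the keys are standard Kafka consumer configs, but the values and the group name are illustrative choices, not recommendations for every workload.

```java
import java.util.Properties;

// Illustrative Kafka consumer configuration touching the tuning knobs
// discussed in the outline. Values here are example choices only.
public class ConsumerConfigSketch {
    public static Properties build() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // assumed local broker
        props.put("group.id", "driver-location-app");      // hypothetical group

        // Assignment strategy: round robin instead of the default range assignor.
        props.put("partition.assignment.strategy",
                  "org.apache.kafka.clients.consumer.RoundRobinAssignor");

        // Rebalance tuning: allow longer processing between polls so the
        // group coordinator does not evict a slow consumer and trigger
        // an unnecessary rebalance.
        props.put("max.poll.interval.ms", "300000");
        props.put("max.poll.records", "200");

        // Fetch tuning: wait briefly for more data per request to reduce
        // the number of small fetch requests.
        props.put("fetch.min.bytes", "1024");
        props.put("fetch.max.wait.ms", "500");

        // Manage offsets manually through the Consumer API
        // (commitSync/commitAsync) instead of auto-committing.
        props.put("enable.auto.commit", "false");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(build().getProperty("partition.assignment.strategy"));
    }
}
```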

Target Audience

Application developers and architects who want to write applications that interact with Apache Kafka®. The course treats Java as a first-class citizen, but students will derive value even if Java is not their primary programming language. C# and Python clients will also be used.


Attendees should be familiar with developing professional apps in Java (preferred), C#, or Python. Additionally, a working knowledge of the Apache Kafka® architecture is required for this course.

Key Objectives

During this hands-on course, you will:

• Write Producers and Consumers to send data to and read data from Kafka

• Integrate Kafka with external systems using Kafka Connect

• Write streaming applications with Kafka Streams & ksqlDB

• Integrate a Kafka client application with Confluent Cloud
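The last objective — connecting a client to Confluent Cloud — mostly comes down to client configuration. The sketch below shows the standard TLS + SASL/PLAIN settings a producer needs; the bootstrap endpoint and API key pair are placeholders you would take from your own Confluent Cloud console, not real values.

```java
import java.util.Properties;

// A sketch of the client configuration needed to point a producer at
// Confluent Cloud. The keys are standard Kafka/SASL client configs; the
// endpoint and credentials are placeholders.
public class CloudProducerConfig {
    public static Properties build(String bootstrap, String apiKey, String apiSecret) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrap);
        props.put("key.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("acks", "all"); // wait for full replication before acknowledging

        // Confluent Cloud requires TLS plus SASL/PLAIN with an API key pair.
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                  "org.apache.kafka.common.security.plain.PlainLoginModule required "
                  + "username=\"" + apiKey + "\" password=\"" + apiSecret + "\";");
        return props;
    }

    public static void main(String[] args) {
        Properties p = build("pkc-xxxxx.region.provider.confluent.cloud:9092",
                             "MY_API_KEY", "MY_API_SECRET");
        System.out.println(p.getProperty("security.protocol"));
    }
}
```

The same security settings apply to consumers, Kafka Streams applications, and Connect workers, which is why the course can reuse one cluster across all of its exercises.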

Get a Peek at Our Success Stories

Featured Review



One of the best I have encountered in my life. Giving every question a candid and courageous response, with full freedom to interact, is not an easy task for trainers, and they did it exceptionally well.

Chun Ngee


The course is well structured. The timing is also right. The trainer, Mr Raj, is professional, and he answered all my questions and doubts.

Sarbojit Bose


Training FAQ

Course Outline

The hands-on lab exercises in the course follow the coherent story of building and upgrading a driver location app.

This gives the course a throughline where concepts are applied directly to a working application. Exercises are available in Java, C#, and Python. Exercises include:

• Working with Kafka command line tools

• Producing driver location data to Kafka and consuming that data in real-time

• Refactoring the application to use Avro and Schema Registry

• Creating a Kafka Streams application to do real-time distance aggregation

• Extracting a table from an external database into Kafka using Kafka Connect

• Creating a full event streaming application using ksqlDB that enriches driver location data with driver profile data

• Experimenting with semantic partitioning
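Semantic partitioning means deriving the partition from a field of the record — here, a driver ID — so that all events for one driver land on the same partition and stay ordered. The sketch below illustrates the idea in plain Java; note that Kafka's default partitioner actually uses a murmur2 hash, and this dependency-free stand-in substitutes `String.hashCode()`.

```java
// Illustrative key-based partition selection. All records for a given
// driver ID map to the same partition, preserving per-driver ordering.
public class SemanticPartitioning {
    public static int partitionFor(String driverId, int numPartitions) {
        // Mask off the sign bit so the result is non-negative, mirroring
        // how a Kafka client maps a hash to a partition index.
        return (driverId.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        int p1 = partitionFor("driver-42", 6);
        int p2 = partitionFor("driver-42", 6);
        // The same key always maps to the same partition.
        System.out.println(p1 == p2);
    }
}
```

One design consequence worth noticing in the lab: because the mapping depends on `numPartitions`, adding partitions to a topic later changes which partition each key hashes to, which is why partition counts are usually chosen up front for keyed topics.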