Course features:
  • Lifetime Access

  • CloudLabs

  • 24x7 Support

  • Real-time code analysis and feedback

• Sketch the high-level architecture of a Kafka producer

• Illustrate key-based partitioning

• Explain the difference between `acks=0`, `acks=1`, and `acks=all`
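The key-based partitioning objective can be pictured with a short sketch: records with the same key always map to the same partition, which preserves per-key ordering. The real Java client hashes the key bytes with murmur2; the `partitionFor` helper below is an illustrative stand-in that uses a plain deterministic hash instead.

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

// Simplified sketch of key-based partitioning: the same key always lands
// on the same partition, so ordering per key is preserved.
// (The actual Java client uses murmur2 on the key bytes; the hash here
// is a stand-in purely for illustration.)
public class KeyPartitioner {
    static int partitionFor(String key, int numPartitions) {
        byte[] keyBytes = key.getBytes(StandardCharsets.UTF_8);
        // Mask off the sign bit so the result is always non-negative,
        // then take the remainder to pick a partition.
        return (Arrays.hashCode(keyBytes) & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        int first = partitionFor("driver-42", 6);
        int second = partitionFor("driver-42", 6);
        System.out.println("driver-42 -> partition " + first
                + " (stable: " + (first == second) + ")");
    }
}
```

Because the partition is a pure function of the key, all events for one driver stay in one partition, which is exactly what makes per-driver ordering guarantees possible.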

• Configure `` to control retry behavior

• Create a custom `` file

• Tune throughput and latency using batching

• Create a producer with Confluent REST Proxy
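The retry and batching objectives above come down to producer configuration. The following is an illustrative sketch of commonly used producer settings, not the course's lab configuration; the values shown are examples, not recommendations.

```properties
# Durability: wait for all in-sync replicas to acknowledge each write
acks=all

# Retry behavior: total time budget for a send, including all retries
delivery.timeout.ms=120000
retries=2147483647

# Preserve ordering while retrying
enable.idempotence=true
max.in.flight.requests.per.connection=5

# Throughput vs. latency: wait up to 20 ms to fill a 32 KB batch
linger.ms=20
batch.size=32768
compression.type=lz4
```

Raising `linger.ms` and `batch.size` trades a little latency for fewer, larger requests and higher throughput; lowering them does the reverse.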

• Illustrate how consumer groups and partitions provide scalability and fault tolerance

• Tune consumers to avoid excessive rebalances

• Explain the difference between “range” and “round robin” partition assignment strategies

• Create a custom `` file
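The difference between the two assignment strategies can be shown with a simplified, single-topic sketch (the real assignors handle multiple topics and subscription metadata; the method names below are illustrative): range hands each consumer a contiguous block of partitions, while round robin deals partitions out one at a time.

```java
import java.util.*;

// Simplified sketch of consumer-group partition assignment for ONE topic.
// Range: each consumer gets a contiguous block; the first
// (numPartitions % numConsumers) consumers get one extra partition.
// Round robin: partitions are dealt out one at a time in consumer order.
public class AssignmentSketch {
    static Map<String, List<Integer>> assignRange(List<String> consumers, int numPartitions) {
        Map<String, List<Integer>> assignment = new LinkedHashMap<>();
        int per = numPartitions / consumers.size();
        int extra = numPartitions % consumers.size();
        int next = 0;
        for (int i = 0; i < consumers.size(); i++) {
            int count = per + (i < extra ? 1 : 0);
            List<Integer> block = new ArrayList<>();
            for (int j = 0; j < count; j++) block.add(next++);
            assignment.put(consumers.get(i), block);
        }
        return assignment;
    }

    static Map<String, List<Integer>> assignRoundRobin(List<String> consumers, int numPartitions) {
        Map<String, List<Integer>> assignment = new LinkedHashMap<>();
        for (String c : consumers) assignment.put(c, new ArrayList<>());
        for (int p = 0; p < numPartitions; p++)
            assignment.get(consumers.get(p % consumers.size())).add(p);
        return assignment;
    }

    public static void main(String[] args) {
        List<String> group = List.of("c0", "c1");
        System.out.println("range:       " + assignRange(group, 5));
        System.out.println("round robin: " + assignRoundRobin(group, 5));
    }
}
```

With 5 partitions and 2 consumers, range yields `c0=[0, 1, 2], c1=[3, 4]` while round robin yields `c0=[0, 2, 4], c1=[1, 3]` — same partition counts here, but different shapes, which matters when consumers subscribe to multiple topics.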

• Use the Consumer API to manage offsets

• Tune fetch requests

• Create a consumer with Confluent REST Proxy
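The consumer objectives above likewise map to configuration. The following is an illustrative sketch of common consumer settings touching group membership, rebalance behavior, fetch tuning, and manual offset management; it is not the course's lab file, and the values are examples only.

```properties
# Consumers sharing a group.id divide the topic's partitions among themselves
group.id=driver-location-app
partition.assignment.strategy=org.apache.kafka.clients.consumer.RoundRobinAssignor

# Avoid unnecessary rebalances: allow enough time for processing between polls
max.poll.records=500
max.poll.interval.ms=300000
session.timeout.ms=45000

# Fetch tuning: larger minimums mean fewer, bigger fetches at higher latency
fetch.min.bytes=1
fetch.max.wait.ms=500

# Manage offsets manually with commitSync()/commitAsync() instead of auto-commit
enable.auto.commit=false
auto.offset.reset=earliest
```

Disabling auto-commit and committing after records are fully processed is the usual way to get at-least-once delivery from the Consumer API.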

During this hands-on course, you will:

• Write Producers and Consumers to send data to and read data from Kafka

• Integrate Kafka with external systems using Kafka Connect

• Write streaming applications with Kafka Streams & ksqlDB

• Integrate a Kafka client application with Confluent Cloud

Audience

Application developers and architects who want to write applications that interact with Apache Kafka®. The course treats Java as a first-class citizen, but students will derive value even if Java is not their primary programming language. C# and Python clients will also be used.

Prerequisites

Attendees should be familiar with developing professional applications in Java (preferred), C#, or Python. A working knowledge of the Apache Kafka® architecture is also required for this course.

Course Outline

The hands-on lab exercises in the course follow the coherent story of building and upgrading a driver location app.

This provides a through line for the course, with concepts applied directly to a working application. Exercises are available in Java, C#, and Python. Exercises include:

• Working with Kafka command line tools

• Producing driver location data to Kafka and consuming that data in real-time

• Refactoring the application to use Avro and Schema Registry

• Creating a Kafka Streams application to do real-time distance aggregation

• Extracting a table from an external database into Kafka using Kafka Connect

• Creating a full event streaming application using ksqlDB that enriches driver location data with driver profile data

• Experimenting with semantic partitioning
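The ksqlDB enrichment exercise can be pictured as a stream–table join: location events stream in continuously, while driver profiles act as a table holding the latest value per key. The stream, table, topic, and column names below are illustrative, not the course's actual lab code.

```sql
-- Driver profiles as a table (latest value per driver_id)
CREATE TABLE driver_profiles (
    driver_id VARCHAR PRIMARY KEY,
    name      VARCHAR,
    vehicle   VARCHAR
) WITH (KAFKA_TOPIC='driver-profiles', VALUE_FORMAT='AVRO');

-- Driver locations as a continuously updating stream of events
CREATE STREAM driver_locations (
    driver_id VARCHAR KEY,
    latitude  DOUBLE,
    longitude DOUBLE
) WITH (KAFKA_TOPIC='driver-locations', VALUE_FORMAT='AVRO');

-- Enrich each location event with the driver's profile data
CREATE STREAM enriched_locations AS
    SELECT l.driver_id, p.name, p.vehicle, l.latitude, l.longitude
    FROM driver_locations l
    JOIN driver_profiles p ON l.driver_id = p.driver_id
    EMIT CHANGES;
```

Each incoming location event is joined against the current profile row for that driver, producing a continuously enriched output stream.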