Apache Kafka: Hands-On Training

Course 1266

  • Duration: 2 days
  • Labs: Yes
  • Language: English
  • Level: Foundation

This Apache Kafka course provides hands-on experience with Kafka, both from the command line and through programmatic access using Java. Attendees will gain a full appreciation of the publish/subscribe idiom for asynchronous communication; the course covers producers and consumers as well as reliable data delivery. Attendees will also learn how to build data pipelines and how to process streaming data.

Kafka is an event streaming platform. Businesses and organizations often must capture data in real time from event sources like databases, sensors, mobile devices, cloud services, social media, and software applications in the form of streams of events. They need to store these event streams durably for later retrieval. They must also manipulate, process, and react to the event streams both in real time and retrospectively. The event streams must be made available to different destination technologies as needed. Kafka's event streaming implementation ensures a continuous flow and interpretation of data, so that the right information is in the right place at the right time.

Train your whole team by bringing this Apache Kafka course to your facility.

  • Team training is available online and in-person.

Apache Kafka: Hands-On Training Course Information

In this training, you will learn how to:

  • Install a Kafka broker for development.
  • Write Kafka producers and consumers.
  • Leverage Kafka features for reliable data delivery.
  • Build Kafka data pipelines.
  • Receive and process streaming data using Kafka.

Prerequisites

Java to the level of Learning Tree course 471, Introduction to Java Programming Training.

Apache Kafka: Hands-On Training Outline

Chapter 1: Apache Kafka (42 slides)

  • Publish/Subscribe Messaging
  • Apache Kafka
  • Installing Kafka
  • Hands-On Exercise 1.1: Installing Kafka

Chapter 2: Producers and Consumers (51 slides)

  • Constructing a Kafka producer
  • Synchronous and asynchronous messages
  • Configuring producers
  • Serializers and partitions
  • Hands-On Exercise 2.1: Creating a Kafka producer
  • Consumers and consumer groups
  • Creating a Kafka consumer
  • Subscribing to Kafka topics
  • Consuming messages
  • Deserializers
  • Hands-On Exercise 2.2: Creating a Kafka consumer
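As a taste of what Chapter 2 covers, a Kafka producer in Java is driven by a small set of configuration properties. The sketch below builds that configuration with the standard `java.util.Properties` class; the property names are standard Kafka client settings, but the broker address is a placeholder assumption, and constructing the actual `KafkaProducer` (which requires the kafka-clients library) is left to the labs.

```java
import java.util.Properties;

public class ProducerConfigSketch {
    // Builds the minimal configuration a Kafka producer needs:
    // where to find the cluster and how to turn keys and values into bytes.
    static Properties producerConfig() {
        Properties props = new Properties();
        // Assumed local development broker address.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // Wait for all in-sync replicas to acknowledge each write.
        props.put("acks", "all");
        return props;
    }

    public static void main(String[] args) {
        Properties props = producerConfig();
        // With kafka-clients on the classpath, the next step would be:
        //   new KafkaProducer<String, String>(props)
        System.out.println("acks=" + props.getProperty("acks")); // prints acks=all
    }
}
```

The serializer settings are what Chapter 2's "Serializers and partitions" topic elaborates on: the producer itself only ever ships bytes, and the serializer classes define how your key and value types become bytes.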

Chapter 3: Reliable Data Delivery (15 slides)

  • Reliability Guarantees
  • Broker configuration
  • Producers in a reliable system
  • Consumers in a reliable system
  • Hands-On Exercise 3.1: Reliable data delivery
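Reliability in Kafka is a joint contract between broker, producer, and consumer settings, which is why Chapter 3 treats them together. As an illustrative sketch only (the values are assumptions for a three-broker cluster, not recommendations), a broker configuration tightened for durability might look like:

```properties
# Each new topic's partitions are stored on three brokers (illustrative value).
default.replication.factor=3
# With a producer using acks=all, a write succeeds only once at least
# two in-sync copies exist.
min.insync.replicas=2
# Never promote an out-of-sync replica to partition leader,
# trading availability for no data loss.
unclean.leader.election.enable=false
```

On the client side this pairs with producer settings such as `acks=all` and consumer-side manual offset commits, the combinations examined in "Producers in a reliable system" and "Consumers in a reliable system".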

Chapter 4: Building Data Pipelines (21 slides)

  • Considerations when building data pipelines
  • Kafka Connect
  • Hands-On Exercise 4.1: Building a data pipeline
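Kafka Connect, the focus of Chapter 4, moves data between Kafka and external systems through connectors defined purely by configuration. As a minimal sketch, the file source connector that ships with Kafka can be configured like this (the file path and topic name are placeholder assumptions):

```properties
# Connector name and implementation; FileStreamSource ships with Kafka.
name=local-file-source
connector.class=FileStreamSource
tasks.max=1
# Source file and destination topic are placeholder assumptions.
file=/tmp/pipeline-input.txt
topic=pipeline-demo
```

With a standalone worker, such a connector is launched with the distribution's own scripts, e.g. `bin/connect-standalone.sh config/connect-standalone.properties file-source.properties`, and each line appended to the file becomes a record on the topic.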

Chapter 5: Stream Processing (51 slides)

  • Stream processing concepts
  • Examples of Kafka Streams
  • Kafka Streams architecture
  • Hands-On Exercise 5.1: Stream processing Twitter data
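Kafka Streams, covered in Chapter 5, expresses processing as a topology of operations such as map, filter, group, and count applied to an unbounded stream of records. As a plain-Java analogy only (this is `java.util.stream`, not the Kafka Streams API, which lives in the kafka-streams library), the classic word-count has the same map/group/count shape:

```java
import java.util.Arrays;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class WordCountSketch {
    // Counts word occurrences in a finite stream of messages.
    // Kafka Streams applies the same shape to an unbounded stream,
    // keeping the running counts in fault-tolerant state stores.
    static Map<String, Long> wordCount(Stream<String> messages) {
        return messages
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .collect(Collectors.groupingBy(Function.identity(),
                                               Collectors.counting()));
    }

    public static void main(String[] args) {
        Map<String, Long> counts =
                wordCount(Stream.of("hello kafka", "hello streams"));
        System.out.println(counts.get("hello")); // prints 2
    }
}
```

The essential difference explored in the chapter is that a Kafka Streams application never "finishes": results are continuously updated as new events arrive, which is what makes state management and the Streams architecture worth a chapter of their own.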

Need Help Finding The Right Training Solution?

Our training advisors are here for you.

Apache Kafka Course FAQs

Approximately 40% of the course is devoted to hands-on exercises, allowing you to gain extensive experience with Kafka. Exercises include:

  • Installing a Kafka broker for development
  • Writing Kafka producers and consumers
  • Leveraging Kafka features for reliable data delivery
  • Building Kafka data pipelines
  • Receiving and processing streaming data using Kafka

This course assumes a background in Java programming, to the level of Learning Tree Introduction to Java Programming Training.

Anyone developing Java or Python applications with core Java SE or Python skills who wishes to capitalize on Kafka to program in the publish/subscribe idiom and manage fast, streaming data.
