Denis Washington & Olli Salonen

Kafka Streams Microservices

Can you build a microservices platform entirely on Kafka Streams? Discover the power and pitfalls of using Kafka as your single source of truth.

#1 · about 3 minutes

Core concepts of Kafka and Kafka Streams

Kafka is a distributed event streaming platform using topics, partitions, producers, and consumers for scalable data processing.
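To make the four abstractions concrete, here is a minimal in-memory model of them, a pattern illustration only and not the real Kafka client API (the class names `Topic` and `Consumer` are assumptions for this sketch): a topic holds ordered partitions, a producer hashes each record key to one partition, and a consumer tracks its own read offset per partition.

```python
class Topic:
    """A topic is a named set of ordered, append-only partitions."""

    def __init__(self, name, num_partitions=3):
        self.name = name
        self.partitions = [[] for _ in range(num_partitions)]

    def produce(self, key, value):
        # Records with the same key always land in the same partition,
        # which is what gives Kafka its per-key ordering guarantee.
        p = hash(key) % len(self.partitions)
        self.partitions[p].append((key, value))
        return p


class Consumer:
    """A consumer reads each partition sequentially from its own offset."""

    def __init__(self, topic):
        self.topic = topic
        self.offsets = [0] * len(topic.partitions)

    def poll(self):
        records = []
        for p, partition in enumerate(self.topic.partitions):
            records.extend(partition[self.offsets[p]:])
            self.offsets[p] = len(partition)
        return records


orders = Topic("orders")
orders.produce("customer-1", "order placed")
orders.produce("customer-1", "order shipped")

consumer = Consumer(orders)
print(consumer.poll())   # both records for customer-1, in produce order
print(consumer.poll())   # [] -- nothing new since the last poll
```

Because offsets live on the consumer side, many independent consumers can read the same topic at their own pace, which is what the later chapters build on.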

#2 · about 6 minutes

Evolving from classic microservices to event-driven design

The architecture evolved from traditional request-response microservices to an event-driven model using Kafka as the single source of truth to improve decoupling and extensibility.

#3 · about 3 minutes

Understanding the system topology and failure scenarios

The system uses an API service with materialized views for robust reads and command processing topologies that can recover from failures by replaying input topics.
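The recovery property rests on one idea, sketched here in pattern form (not the Kafka Streams API, and `build_view` is an illustrative name): a materialized view is a deterministic fold over the input topic, so a replacement instance can rebuild identical state by replaying the topic from offset 0.

```python
def build_view(events):
    """Fold an event log into a key -> running-total view."""
    view = {}
    for key, delta in events:
        view[key] = view.get(key, 0) + delta
    return view


# The durable input topic survives processor crashes.
event_log = [("sku-1", +5), ("sku-2", +3), ("sku-1", -2)]

view = build_view(event_log)        # normal operation
recovered = build_view(event_log)   # a "failed" instance replays the log

assert recovered == view == {"sku-1": 3, "sku-2": 3}
```

As long as the fold is deterministic and the topic is retained, state stores are disposable: losing one costs replay time, not data.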

#4 · about 4 minutes

Building a searchable product catalog pipeline

A data pipeline cleans, deduplicates, and amends product data from various sources, then streams it to Elasticsearch to create a searchable materialized view.
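The three pipeline stages can be sketched as plain functions; this is an illustration of the shape of the pipeline, not the talk's code, and the sample records plus the dict standing in for the Elasticsearch index are assumptions.

```python
raw_products = [
    {"id": "p1", "name": "  Laptop ", "source": "feed-a"},
    {"id": "p1", "name": "Laptop",    "source": "feed-b"},   # duplicate id
    {"id": "p2", "name": "Phone",     "source": "feed-a"},
]


def clean(p):
    # Normalize messy source data.
    return {**p, "name": p["name"].strip()}


def dedupe(products):
    # Last write per id wins, like a compacted Kafka topic.
    seen = {}
    for p in products:
        seen[p["id"]] = p
    return list(seen.values())


def amend(p):
    # Enrich the record with fields the sources do not provide.
    return {**p, "searchable": True}


search_index = {}   # stands in for the Elasticsearch index

for product in (amend(p) for p in dedupe([clean(p) for p in raw_products])):
    search_index[product["id"]] = product   # "index" the document

print(sorted(search_index))   # ['p1', 'p2']
```

In the streaming version each stage is a topology step and the final write is a sink to Elasticsearch, so the search index stays a continuously updated materialized view of the source topics.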

#5 · about 2 minutes

Implementing inventory management using a CQRS pattern

A command processing pipeline implements the CQRS pattern by separating write operations from read models, using an event topic as the source of truth for inventory data.
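A minimal sketch of the CQRS split, assuming an in-memory list as the event topic (pattern illustration only, with invented command shapes): the write side validates commands and appends events, while the read side is derived purely from those events.

```python
events = []   # the event topic: the source of truth for inventory


def current_stock(sku):
    return sum(e["qty"] for e in events if e["sku"] == sku)


def handle_command(cmd):
    # Write side: validate against current state, then record what
    # happened as an immutable event.
    if cmd["type"] == "reserve" and current_stock(cmd["sku"]) < cmd["qty"]:
        return "rejected"
    qty = cmd["qty"] if cmd["type"] == "restock" else -cmd["qty"]
    events.append({"sku": cmd["sku"], "qty": qty})
    return "accepted"


def read_model():
    # Read side: a materialized view folded from the event topic.
    view = {}
    for e in events:
        view[e["sku"]] = view.get(e["sku"], 0) + e["qty"]
    return view


handle_command({"type": "restock", "sku": "sku-1", "qty": 10})
handle_command({"type": "reserve", "sku": "sku-1", "qty": 4})
print(read_model())   # {'sku-1': 6}
print(handle_command({"type": "reserve", "sku": "sku-1", "qty": 99}))  # rejected
```

Because reads never touch the write path, the read model can be reshaped or rebuilt (by replaying the event topic) without changing how commands are processed.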

#6 · about 7 minutes

Solving uniqueness constraints and race conditions

Race conditions caused by eventual consistency are solved by using manually updated state stores and repartitioning command streams to ensure data locality for validation.
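The repartitioning trick can be sketched as follows, assuming a device serial number as the attribute that must be unique (an invented example; the real topology re-keys a Kafka command stream): routing every command for the same value to the same partition means the uniqueness check runs against one local state store and cannot race with a duplicate processed elsewhere.

```python
NUM_PARTITIONS = 4
state_stores = [set() for _ in range(NUM_PARTITIONS)]   # one store per partition


def partition_for(serial):
    # Re-keying by the unique attribute pins it to one partition.
    return hash(serial) % NUM_PARTITIONS


def register_device(serial):
    # All commands for this serial are handled by the same partition's
    # processor, so the local check is race-free by construction.
    store = state_stores[partition_for(serial)]
    if serial in store:
        return "rejected: duplicate serial"
    store.add(serial)
    return "registered"


print(register_device("SN-123"))   # registered
print(register_device("SN-123"))   # rejected: duplicate serial
```

The insight is that the constraint is not enforced globally at all; re-keying turns a global uniqueness problem into a per-partition one that a single-threaded processor can decide locally.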

#7 · about 3 minutes

Opportunistic data consumption for new features

New features like automatic warranty extensions can be added by deploying new services that consume existing data streams without modifying the original producers.
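The point can be shown with two independent readers of the same event log (an illustrative sketch with invented purchase events, not the talk's services): the new warranty consumer derives its output from events the existing producers already publish, so nothing upstream changes.

```python
purchase_events = [
    {"order": "o1", "product": "laptop", "customer": "c1"},
    {"order": "o2", "product": "phone",  "customer": "c2"},
]

# Existing consumer: billing, completely unchanged.
invoices = [e["order"] for e in purchase_events]

# New consumer deployed later: warranty extensions, derived from the
# very same events with zero producer-side modifications.
warranties = [
    {"order": e["order"], "warranty_months": 24} for e in purchase_events
]

print(invoices)        # ['o1', 'o2']
print(warranties[0])   # {'order': 'o1', 'warranty_months': 24}
```

Since every consumer keeps its own offsets, the new service can even start by replaying the topic from the beginning and retroactively process historical purchases.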

#8 · about 5 minutes

Key challenges and lessons from a pure Kafka approach

A pure Kafka Streams architecture poses challenges around development complexity, stateful operations, careful transaction configuration, and operational tooling.

#9 · about 12 minutes

Evolving the architecture with a hybrid database approach

The architecture can be evolved by integrating traditional databases to simplify complex stateful logic, while using connectors to publish all state changes back to Kafka.
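One common way to keep Kafka seeing every state change in such a hybrid is a transactional outbox, sketched here in pattern form (an assumption for illustration; in practice a connector such as Kafka Connect or a CDC tool would ship the changes to a topic):

```python
database = {}   # the traditional database now owns the complex state
outbox = []     # change events recorded alongside each write


def update_inventory(sku, qty):
    # One "transaction": mutate the table and record the change event,
    # so the write and its outbox entry succeed or fail together.
    database[sku] = database.get(sku, 0) + qty
    outbox.append({"sku": sku, "new_total": database[sku]})


def connector_publish():
    # A connector drains the outbox into a Kafka topic, keeping the
    # event stream a complete record of all state changes.
    published, outbox[:] = list(outbox), []
    return published


update_inventory("sku-1", 10)
update_inventory("sku-1", -3)
print(connector_publish())   # two change events, in write order
print(outbox)                # [] -- drained
```

The database simplifies the stateful logic, while downstream stream processors keep the decoupling benefits because every change still flows through Kafka.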
