Policy Enforcement and Data Quality for Apache Kafka with Schema Registry
Kai Waehner Blog » Microservices
by Kai Waehner
6M ago
Good data quality is one of the most critical requirements in decoupled architectures such as microservices or a data mesh. Apache Kafka has become the de facto standard for these architectures, but Kafka itself is a dumb broker that only stores byte arrays; the Schema Registry enforces message structures. This blog post looks at enhancements that leverage data contracts for policies and rules to enforce good data quality at the field level, plus advanced use cases such as routing malicious messages to a dead letter queue. …
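The field-level rules and dead-letter routing described above can be sketched in a few lines. This is a hand-rolled illustration of the idea, not the Confluent data-contract API: the rule set, field names, and in-memory "topics" below are hypothetical.

```python
# Minimal sketch of field-level data-quality rules with dead-letter routing.
# The rules, field names, and list-based "topics" are illustrative only.
from typing import Any, Callable

# A rule maps a field name to a predicate the field's value must satisfy.
RULES: dict[str, Callable[[Any], bool]] = {
    "email": lambda v: isinstance(v, str) and "@" in v,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def route(message: dict, topic: list, dead_letter_queue: list) -> None:
    """Append the message to the topic if all rules pass, else to the DLQ."""
    violations = [
        field for field, check in RULES.items()
        if field not in message or not check(message[field])
    ]
    if violations:
        # Enrich the dead-letter record with the reason for rejection,
        # so downstream tooling can inspect or replay it.
        dead_letter_queue.append({"message": message, "violations": violations})
    else:
        topic.append(message)

topic, dlq = [], []
route({"email": "alice@example.com", "amount": 42.0}, topic, dlq)
route({"email": "not-an-email", "amount": -1}, topic, dlq)
print(len(topic), len(dlq))  # 1 1
```

In a real deployment the predicates would live in the data contract alongside the schema, and the dead letter queue would be a separate Kafka topic rather than a list.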
Apache Kafka for Data Consistency (and Real-Time Data Streaming)
1y ago
Real-time data beats slow data in almost all use cases. But data consistency across all systems, including non-real-time legacy systems and modern request-response APIs, is just as essential. Apache Kafka’s most underestimated feature is its storage component, based on the append-only commit log. It enables loose coupling for domain-driven design with microservices and independent data products in a data mesh. This blog post explores how Kafka enables data consistency with a real-world case study from financial services. …
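The append-only commit log mentioned above is what lets real-time consumers and slow legacy systems stay consistent: each reads the same immutable record sequence at its own pace. A minimal sketch of that mechanism, with illustrative names rather than Kafka's actual API:

```python
# Minimal sketch of an append-only commit log with per-consumer offsets,
# illustrating how Kafka's storage layer decouples producers and consumers.
# Class and method names here are illustrative, not Kafka's API.

class CommitLog:
    def __init__(self) -> None:
        self._records: list[bytes] = []  # append-only; records are never mutated

    def append(self, record: bytes) -> int:
        """Append a record and return its offset."""
        self._records.append(record)
        return len(self._records) - 1

    def read(self, offset: int, max_records: int = 10) -> list[bytes]:
        """Read from a given offset; each consumer tracks its own position."""
        return self._records[offset:offset + max_records]

log = CommitLog()
for payload in (b"order-1", b"order-2", b"order-3"):
    log.append(payload)

# Two independent consumers at different positions in the same log:
batch_offset, realtime_offset = 0, 3   # one replaying history, one caught up
print(log.read(batch_offset))          # the legacy system replays from the start
print(log.read(realtime_offset))      # nothing new yet for the live consumer
```

Because the log is immutable and every consumer owns its offset, a nightly batch job and a real-time microservice can process the same events without coordinating with each other or with the producer.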
Decentralized Data Mesh with Data Streaming in Financial Services
1y ago
Digital transformation requires agility and fast time to market as critical success factors for any enterprise. Decentralization with a data mesh separates applications and business units into independent domains. Real-time data sharing with data streaming helps provide information in the proper context to the correct application at the right time. This blog post explores a case study from the financial services sector where a data mesh was built across countries for loosely coupled data sharing with standardized enterprise-wide data governance. …
Streaming Data Exchange with Kafka and a Data Mesh in Motion
1y ago
Data Mesh is a new architecture paradigm that generates a lot of buzz these days. Every data and platform vendor describes how to build the best Data Mesh with their platform. The Data Mesh story includes cloud providers like AWS, data analytics vendors like Databricks and Snowflake, and event streaming solutions like Confluent. This blog post takes a deeper look at this principle to explore why no single technology is the perfect fit to build a Data Mesh. Examples show why an open and scalable decentralized real-time platform like Apache Kafka is often the heart of the Data Mesh infrastructure, …
Mainframe Integration, Offloading and Replacement with Apache Kafka
1y ago
Time to get more innovative, even with the mainframe! This blog post covers the steps I have seen in projects where enterprises started offloading data from the mainframe to Apache Kafka, with the final goal of replacing the old legacy systems. “Mainframes are still hard at work, processing over 70 percent of the world’s most important computing transactions every day. Organizations like banks, credit card companies, medical facilities, stock brokerages, and others that absolutely cannot afford downtime and errors depend on the mainframe to get the job done. …”
Event Streaming and Apache Kafka in Telco Industry
1y ago
Event streaming is a hot topic in the telco industry. In the last few months, I have seen various projects leveraging Apache Kafka and its ecosystem to implement scalable real-time infrastructure in OSS and BSS scenarios. This blog post covers the reasons for this trend and ends with a whiteboard video recording that explores the different use cases for event streaming in telcos in detail. …
The Rise Of Event Streaming – Why Apache Kafka Changes Everything
1y ago
I had the pleasure of delivering the keynote at OOP 2020 in Munich, Germany. This is a well-known international conference covering topics like agility, architecture, security, programming languages, and soft skills. My keynote had the title “The Rise Of Event Streaming – Why Apache Kafka Changes Everything”. Here I share some impressions and details of the talk. Abstract of the keynote presentation: Business digitalization covers trends like microservices, the Internet of Things, and machine learning. This is driving the need to process events at a whole new scale, speed, and efficiency. …
Apache Kafka as Digital Twin for Open, Scalable, Reliable Industrial IoT (IIoT)
1y ago
This blog post discusses the benefits of a digital twin in Industrial IoT (IIoT) and its relation to Apache Kafka. Kafka is often used as a central event streaming platform to build a scalable and reliable digital twin for real-time streaming sensor data. In November 2019, I attended the SPS Conference in Nuremberg, one of the most important events about Industrial IoT (IIoT). Vendors and attendees from all over the world fly in to do business and discuss new products. Hotel prices in this region go up from the usual 80-100€ to over 300€ per night. …
Apache Kafka in the Automotive Industry
1y ago
In November 2019, I had the pleasure of visiting “Motor City” Detroit. I met with several automotive companies, suppliers, startups, and cloud providers to discuss use cases and architectures around Apache Kafka. I have worked with companies related to the German automotive industry for many years, and it was great to see the ideas and current status of projects running overseas in the US. I am really excited about the role of Apache Kafka and its ecosystem in the automotive industry. Kafka has become the central nervous system of many applications in various areas of the automotive industry. …
