
Confluent Community » KSQL DB
38 FOLLOWERS
The Confluent Community is for discussions related to ksqlDB, the event streaming database purpose-built for stream processing applications. It's available as a managed service in Confluent Cloud, or you can run it yourself. Discuss issues with complex data processing and learn how to fix errors on startup.
Confluent Community » KSQL DB
1d ago
I want to run a Kafka cluster locally to test some code
I ran confluent local services start and I got the following error:
The local commands are intended for a single-node development environment only, NOT for production usage. See more: https://docs.confluent.io/current/cli/index.html
As of Confluent Platform 8.0, Java 8 will no longer be supported.
Using CONFLUENT_CURRENT: /var/folders/5x/1xdth1ss0mjf_qw0frhwhw4c0000gq/T/confluent.390395
ZooKeeper is [UP]
Kafka is [UP]
Schema Registry is [UP]
Kafka REST is [UP]
Connect is [UP]
Starting ksqlDB Server
Error: ksqlDB Server failed to start ..read more
Confluent Community » KSQL DB
1w ago
Hi,
I created a topic and registered Avro schemas for the key and the value; both are objects. I also verified that the schemas are registered in Schema Registry. Then I created a source table in ksqlDB with the query below to pull the data into .NET. I have to specify the schema IDs because I need the property names to stay in CamelCase, as defined in the topic.
CREATE SOURCE TABLE ExampleTopic WITH (KAFKA_TOPIC='ExampleTopic', KEY_FORMAT='Avro', VALUE_FORMAT='Avro', KEY_SCHEMA_ID='1', VALUE_SCHEMA_ID='2');
After I push a message to the topic, I get the serialization error below when I try
s ..read more
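One way to sanity-check what ksqlDB actually reads from that topic (and whether the registered key/value schemas deserialize at all) is ksqlDB's PRINT statement. This is only a debugging sketch: the topic name comes from the statement above, and the LIMIT is arbitrary.
-- Show raw records plus the key/value formats ksqlDB detects;
-- deserialization problems surface here before any table is involved.
PRINT 'ExampleTopic' FROM BEGINNING LIMIT 5;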
Confluent Community » KSQL DB
1M ago
Hi guys,
I get the above error when running confluent local services. Any thoughts on this? Thanks in advance.
5 posts - 2 participants
Read full topic ..read more
Confluent Community » KSQL DB
2M ago
Hello.
I have the following Kafka record key: mqtt/temp/5
and the Kafka record value:
{
  "message_id": "19ce932c-cc1f-4409-97f7-0e45a9a29c02",
  "temperature": 86.82796014582067,
  "timestamp": "2025-01-24T17:35:50.756009600"
}
Can I use ksqlDB to extract that record key and add it to the record value, so that the processed stream has the following?
{
  "producer_id": "mqtt/temp/5",
  "message_id": "19ce932c-cc1f-4409-97f7-0e45a9a29c02",
  "temperature": 86.82796014582067,
  "timestamp": "2025-01-24T17:35:50.756009600"
}
I currently have this implementation. What should I modify if possible?
CREATE STREAM temp_stream ..read more
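The original post is cut off above, but a common pattern for this is AS_VALUE(): declare the record key as a key column on the source stream, then project a copy of it into the value in a derived stream. A minimal sketch follows; the topic name, formats, and column names/types are assumptions, not taken from the post.
-- Hypothetical source stream: the Kafka record key is declared as a VARCHAR key column.
CREATE STREAM temp_stream (
  producer_id VARCHAR KEY,
  message_id VARCHAR,
  temperature DOUBLE,
  `timestamp` VARCHAR
) WITH (KAFKA_TOPIC='mqtt-temp', KEY_FORMAT='KAFKA', VALUE_FORMAT='JSON');

-- Derived stream: AS_VALUE() copies the key column into the value schema,
-- so the serialized value carries the producer id alongside the payload.
CREATE STREAM temp_with_producer AS
  SELECT
    producer_id,                               -- still the Kafka record key
    AS_VALUE(producer_id) AS producer_id_copy, -- same string, now in the value
    message_id,
    temperature,
    `timestamp`
  FROM temp_stream
  EMIT CHANGES;
The copied column needs a different alias than the key column; if the value field must be called exactly producer_id, alias the key column to something else instead.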
Confluent Community » KSQL DB
3M ago
Hi all,
I’m looking for some guidance on an issue I’m having with my hopping windows.
I have a pipeline set up which accepts nested JSON messages into a stream and pushes messages into a hopping window if certain criteria are met.
The hopping window will then aggregate these messages and produce an alert onto a topic if the calculations within the windows hit defined thresholds.
The issue I am having is that individual messages are being included in multiple alerts when they shouldn’t be. For example, if I have it set up so 3 messages of a certain type are required to trigger an alert then ..read more
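For context, here is a hedged illustration of why this can happen; the stream, columns, and thresholds below are made up, only the windowing pattern is the point. With a hopping window, every record belongs to SIZE/ADVANCE overlapping windows, so the same message can satisfy the alert condition in several of them; a tumbling window assigns each record to exactly one window.
-- Hypothetical aggregation: with SIZE 10 MINUTES and ADVANCE BY 2 MINUTES,
-- each record falls into 5 overlapping windows and can feed several alert rows.
CREATE TABLE alert_counts AS
  SELECT device_id,
         COUNT(*) AS msg_count
  FROM input_stream
  WINDOW HOPPING (SIZE 10 MINUTES, ADVANCE BY 2 MINUTES)
  GROUP BY device_id
  HAVING COUNT(*) >= 3
  EMIT CHANGES;

-- With a tumbling window each record is counted in exactly one window:
--   WINDOW TUMBLING (SIZE 10 MINUTES)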
Confluent Community » KSQL DB
3M ago
Hi guys, I’m working with the Confluent platform and experimenting with KSQL DB. I’d like to clarify the data retention periods for KSQL Tables and Streams. Here are my scenarios -
If I have a source topic with a retention period of 7 days and I create a KTable from that topic, will the data in the KTable also expire after 7 days?
When I join a Stream and a Table to create a persistent table, when does the data from that table get evicted?
Thanks
1 post - 1 participant
Read full topic ..read more
Confluent Community » KSQL DB
4M ago
In this post, I would like to share a way to rename fields in the TOPK output structure.
As you know, the function TOPK(sort_col, col0..., k) returns an array of STRUCT<sort_col, col0, col1, etc.>. These field names are not always convenient to work with downstream.
But using the TRANSFORM function, it is possible to rename the fields of the structure as follows:
TRANSFORM(
  TOPK(
    x,
    y,
    z,
    5
  ),
  (res) => STRUCT(
    x := res->"sort_col",
    y := res->"col0",
    z := res->"col1"
  )
) AS top_xyz_array ..read more
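To show where this expression sits, here is a sketch of a full aggregation query around it; the stream, columns, grouping key, and k are assumptions, only the TRANSFORM/TOPK renaming pattern comes from the post above.
-- Hypothetical query: top 5 temperature readings per sensor, with the
-- struct fields renamed from sort_col/col0/col1 via TRANSFORM.
CREATE TABLE top_readings AS
  SELECT sensor_id,
         TRANSFORM(
           TOPK(temperature, reading_id, room, 5),
           (res) => STRUCT(
             temperature := res->"sort_col",
             reading_id  := res->"col0",
             room        := res->"col1"
           )
         ) AS top_by_temperature
  FROM readings_stream
  GROUP BY sensor_id
  EMIT CHANGES;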
Confluent Community » KSQL DB
4M ago
Hey all,
I have been using Kafka Streams for a problem I am working on, and I started to think about implementing the same functionality with ksqlDB to leverage the existing aggregations and functions, as well as the flexibility of writing ad hoc queries rather than running new instances. I am having a hard time wrapping my head around it, though; maybe somebody here can help me with this. If all works out, I am planning to write my own UDFs and UDAFs to extend the available vocabulary for my processing needs.
I have a Kafka topic where I am receiving events with polymorphic schemas that have the same k ..read more
Confluent Community » KSQL DB
4M ago
I notice the ksqldb.io domain now redirects to the Confluent site, with the standalone deploy option gone and a mention that ksqlDB is a commercial component of the Confluent Platform. However, I've seen that in the Docker components of the Confluent Platform, ksql-server continues to be a Community version. Is the package confluent-ksqldb still maintained? Has it changed to cp-ksqldb-server, and do we need a key to use it?
1 post - 1 participant
Read full topic ..read more