Confluent Community » KSQL DB
35 FOLLOWERS
The Confluent Community is for discussions related to ksqlDB, the event streaming database purpose-built for stream processing applications. It's available as a managed service in Confluent Cloud, or you can run it yourself. Talk about issues with complex data processing, and learn how to fix errors on startup.
1w ago
example code
CREATE STREAM USERS_DATA_PRITE010 (KEY_STRUCT STRUCT<USERID VARCHAR> KEY, USERNAME VARCHAR, AGES INTEGER, EMAIL VARCHAR)
WITH (kafka_topic='USERS_DATA_PRITE010', value_format='AVRO', key_format='AVRO', partitions=3);
INSERT INTO USERS_DATA_PRITE010 (KEY_STRUCT, USERNAME, AGES, EMAIL) VALUES (STRUCT(USERID := 'A2'), 'USER-A-1', 1, 'userA1@abc.com');
INSERT INTO USERS_DATA_PRITE010 (KEY_STRUCT, USERNAME, AGES, EMAIL) VALUES (STRUCT(USERID := 'A2'), 'USER-A-2', 2, 'userA2@abc.com');
INSERT INTO USERS_DATA_PRITE010 (KEY_STRUCT, USERNAME, AGES, EMAIL) VALUES (STRUCT(USERID := 'A2'), 'USER-A-3', 3, 'use ..read more
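For reference, the likely cause of errors with statements like these is the key declaration: a STRUCT key needs its field list spelled out. A hedged sketch of a working version, assuming the key struct holds a single USERID VARCHAR field (the field name is inferred from the INSERT values):

```sql
-- A STRUCT key must declare its field types; USERID VARCHAR
-- is an assumption based on the INSERTs above.
CREATE STREAM USERS_DATA_PRITE010 (
  KEY_STRUCT STRUCT<USERID VARCHAR> KEY,
  USERNAME VARCHAR, AGES INTEGER, EMAIL VARCHAR
) WITH (
  KAFKA_TOPIC = 'USERS_DATA_PRITE010',
  VALUE_FORMAT = 'AVRO', KEY_FORMAT = 'AVRO', PARTITIONS = 3
);

INSERT INTO USERS_DATA_PRITE010 (KEY_STRUCT, USERNAME, AGES, EMAIL)
  VALUES (STRUCT(USERID := 'A2'), 'USER-A-1', 1, 'userA1@abc.com');
```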
1w ago
Hi Team,
As part of my testing, I have created a free Confluent Cloud account with connectors and ksqlDB.
I want to access that ksqlDB cluster from Azure build pipelines (with a basic command-line task).
I used a curl command to download the Confluent Platform,
but I am still unable to log in to Confluent Cloud or connect to ksqlDB.
Please help me set this up and connect to my ksqlDB cluster.
Command line script:
curl -O http://packages.confluent.io/archive/7.0/confluent-7.0.1.tar.gz
tar -xzf confluent-7.0.1.tar.gz
rm confluent-7.0.1.tar.gz
echo "extracted package"
export CONFLUENT_HOME=./confluent-7.0.1 ..read more
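For a Confluent Cloud ksqlDB cluster, the self-managed Confluent Platform tarball isn't actually needed: the cluster is reachable over its REST API with a ksqlDB API key. A hedged sketch of an Azure pipeline step (the endpoint hostname and the KSQLDB_API_KEY/KSQLDB_API_SECRET variables are placeholders you would define as pipeline secrets):

```yaml
steps:
  - script: |
      # Query the cloud ksqlDB cluster directly over its REST API;
      # the API key/secret pair is passed via HTTP basic auth.
      curl -s -u "$(KSQLDB_API_KEY):$(KSQLDB_API_SECRET)" \
        "https://<your-ksqldb-endpoint>.confluent.cloud:443/ksql" \
        -H "Content-Type: application/vnd.ksql.v1+json" \
        -d '{"ksql": "SHOW STREAMS;", "streamsProperties": {}}'
    displayName: 'List ksqlDB streams'
```

This avoids installing anything beyond curl on the build agent.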
1M ago
Is it possible to connect a single ksqlDB cluster to multiple Kafka clusters?
In a hybrid setup, data originates in on-prem Kafka cluster C1, topic T1, and a data pipeline transforms, processes, and pushes it to a different Kafka cluster C2, topic T2, in the cloud. Is it possible for one ksqlDB cluster K1 to connect to both the C1 and C2 Kafka clusters to read data from T1 and T2, join them, and run streaming queries over the joined data?
Two old threads in this forum suggest this is not possible in Kafka Streams, implying ksqlDB may not be able to do it either, but I am still checking whether newer versions ..read more
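A ksqlDB cluster is bound to exactly one Kafka cluster through its server configuration, so the usual pattern is to mirror the remote topic into the cluster ksqlDB is attached to. A sketch of the relevant config (broker hostnames are placeholders):

```properties
# ksql-server.properties: a ksqlDB cluster talks to one Kafka cluster
bootstrap.servers=c2-broker1:9092,c2-broker2:9092

# To join T1 with T2, first mirror T1 from C1 into C2
# (e.g. with Cluster Linking or MirrorMaker 2); both topics
# are then readable from the single bound cluster C2.
```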
1M ago
Hi
Is the MAP operator deprecated?
docs.ksqldb.io: Data Types Overview - ksqlDB Documentation
Why am I getting this error:
Can't find any functions with the name 'MAP'
select explode(VARIANTOPTIONS) as original ,
map(json_records(explode(VARIANTOPTIONS))) as json_records
from PRODUCTINSTANCE_VARIANT_OPTIONS_STREAM EMIT CHANGES;
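MAP is not deprecated, but it is a constructor rather than a one-argument function: each entry must be written as key := value, which is why a call like MAP(expr) fails to resolve. A hedged sketch of the constructor syntax (column and stream names reused from the query above; the literal keys are illustrative):

```sql
-- MAP builds a map literal from explicit key := value pairs
SELECT EXPLODE(VARIANTOPTIONS) AS original,
       MAP('source' := 'web', 'region' := 'eu') AS attrs
FROM PRODUCTINSTANCE_VARIANT_OPTIONS_STREAM EMIT CHANGES;
```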
1M ago
I have a Kafka topic with messages generated from a data generator. I built a stream from the topic, and I see messages in the stream.
I created another stream reading from the first stream, but I don't see messages there.
Is cascading streams allowed in ksqlDB?
And what is the benefit of cascading streams at all, versus having multiple streams all reading from the source topic and only building tables afterwards?
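Cascading is supported: a CREATE STREAM AS SELECT reads from the backing topic of another stream. A common reason for seeing no rows is that new queries start from the latest offset by default. A hedged sketch (stream names are placeholders):

```sql
-- Process pre-existing messages, not just new arrivals
SET 'auto.offset.reset' = 'earliest';

-- A derived (cascaded) stream over another stream
CREATE STREAM second_stream AS
  SELECT * FROM first_stream EMIT CHANGES;
```

The benefit of cascading is reusing a cleaned or enriched intermediate stream across several downstream queries instead of repeating that logic against the raw topic each time.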
1M ago
I have an Avro Kafka topic. Can I make a JSON stream from it immediately, or do I need to create an Avro stream first and then a second stream as JSON via SELECT * FROM stream1?
I have created a stream over the Avro topic, but a SELECT doesn't return anything from it.
I am wondering whether the structure I put in the CREATE STREAM is correct or not.
CREATE STREAM TcProductVariant_stream_AVRO (
_comment string ,
data ARRAY<STRUCT<
productInstanceUuid string ,
productInstanceId string,
productInstanceDescription string ,
productInstanceRevision string ,
productVariants ARRAY<STRUCT ..read more
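With an Avro topic, the usual pattern is to declare the stream as AVRO, letting ksqlDB pull the columns from Schema Registry so there is no hand-written column list to get wrong, and then derive a JSON copy in one step. A hedged sketch (topic and stream names are assumptions):

```sql
-- Columns are read from Schema Registry for Avro topics,
-- so no column list is needed here
CREATE STREAM products_avro WITH (
  KAFKA_TOPIC = 'TcProductVariant',
  VALUE_FORMAT = 'AVRO'
);

-- A derived stream can re-serialize the same rows as JSON
CREATE STREAM products_json WITH (VALUE_FORMAT = 'JSON') AS
  SELECT * FROM products_avro EMIT CHANGES;
```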
2M ago
I have an existing topic with events that have a null key and JSON values.
Example event:
{
  "consumer": "abc",
  "src_event_id": "abc123",
  "sc_type": "cip",
  "sc_ven": "eq1",
  "sc_res": "PASS",
  "other_attr": "efgh"
}
I am looking to create a KSQL table with a composite key consisting of src_event_id, sc_type and sc_ven and expose all other attributes of the json as columns. The objective is to allow users to run pull queries from the table based on src_event_id, even though uniqueness of the record is defined by the composite key.
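A table needs its key in the Kafka record key, so null-keyed JSON has to be re-keyed; grouping by the three columns produces a multi-column primary key. A hedged sketch (topic and stream names are assumptions; columns come from the event above):

```sql
CREATE STREAM events_raw (
  consumer VARCHAR, src_event_id VARCHAR, sc_type VARCHAR,
  sc_ven VARCHAR, sc_res VARCHAR, other_attr VARCHAR
) WITH (KAFKA_TOPIC = 'events', VALUE_FORMAT = 'JSON');

-- Re-key into a table keyed by the composite key; remaining
-- attributes are exposed via LATEST_BY_OFFSET
CREATE TABLE events_by_key AS
  SELECT src_event_id, sc_type, sc_ven,
         LATEST_BY_OFFSET(consumer)   AS consumer,
         LATEST_BY_OFFSET(sc_res)     AS sc_res,
         LATEST_BY_OFFSET(other_attr) AS other_attr
  FROM events_raw
  GROUP BY src_event_id, sc_type, sc_ven;
```

Note that a pull query against this table must constrain all three key columns; filtering on src_event_id alone would not be a keyed lookup.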
2M ago
Hello,
we are running confluentinc/cp-ksqldb-server:7.6.0 and are facing the following problem with daily windowing due to UTC timestamps.
We are running aggregations on data that is essentially similar to the following:
{
  "id": "<uuid>",
  "timestamp": "2024-02-27T02:56:00.000Z",
  "quantity": 3.25
}
I want to create a table that groups data with the same id using windowing, as follows:
CREATE OR REPLACE TABLE aggregated WITH (KEY_FORMAT='AVRO', VALUE_FORMAT='AVRO')
AS SELECT
ID, AS_VALUE(ID) AS "dataId",
SUM(QUANTITY) AS "sumQuantity",
AS_VALUE(windowstart) as "day ..read more
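Tumbling windows are aligned to the Unix epoch, so daily buckets always break at 00:00 UTC. A common workaround, sketched here under the assumption of a fixed local offset (UTC+2 in this example; stream names are placeholders), is to shift the event timestamp in a derived stream and window on the shifted column:

```sql
-- Shift event-time by the local UTC offset so epoch-aligned
-- 1-day windows break at local midnight instead of 00:00 UTC
CREATE STREAM shifted WITH (TIMESTAMP = 'SHIFTED_TS') AS
  SELECT *, ROWTIME + 2 * 3600 * 1000 AS SHIFTED_TS
  FROM source_stream EMIT CHANGES;
```

The aggregation then runs over `shifted` instead of the source; window boundaries reported by WINDOWSTART are shifted by the same offset and need to be subtracted back for display.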
2M ago
Hi team,
I am trying to migrate ksqlDB queries from DC1 to DC2 on Confluent Platform clusters set up with Cluster Linking, and I need help with a design for failing over the ksqlDB queries.
The ksqlDB queries: a transaction stream joined with a master-data table.
CP details: set up as active-passive; normally DC1 is active and DC2 is passive.
My question:
When promoting DC2 as the new active, we already sync topics, schemas, and consumer offsets from DC1 to DC2 with Cluster Linking.
But for the ksqlDB queries, how do we mirror all queries and all state from DC1 to DC2 so they continue processing data exactly where they left off without ..read more
2M ago
My objective is to create an alert stream for the condition where speed is non-zero but geospatial data is not updating. I want to do this by comparing two consecutive events. I understand that we can't do that directly, so I wanted to derive a stream from the main stream containing both the current record and the last record (i.e. the last two records) for further analysis.
CREATE STREAM STRGEOALERT_01
WITH (VALUE_FORMAT = 'JSON', RETENTION_MS = 3600000)
AS
SELECT deviceid,
LATEST_BY_OFFSET(speed, 2) AS speed_,
LATEST_BY_OFFSET(uniqueid, 2) AS uniqueid_,
LATEST_BY_OFFSET(devicedatetime,2 ..read more
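One likely issue with the statement above: LATEST_BY_OFFSET is an aggregate function, so it belongs in a CREATE TABLE with a GROUP BY rather than a plain CREATE STREAM. A hedged sketch of the last-two-records pattern (table, source, and geo column names are assumptions):

```sql
-- LATEST_BY_OFFSET(col, 2) keeps the last two values per key
-- as an array, enabling current-vs-previous comparisons
CREATE TABLE last_two_per_device AS
  SELECT deviceid,
         LATEST_BY_OFFSET(speed, 2)     AS last_speeds,
         LATEST_BY_OFFSET(latitude, 2)  AS last_lats,
         LATEST_BY_OFFSET(longitude, 2) AS last_lons
  FROM device_events
  GROUP BY deviceid
  EMIT CHANGES;
```

A downstream query can then compare the two array entries, e.g. flag rows where the speeds differ but the coordinates are identical.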