Message Key use cases and examples
A Kafka message consists of a key, a value, and a timestamp. The key serves several purposes: it drives log compaction, controls how messages are distributed across partitions, and lets downstream applications build queries based on the key portion of the message payload.
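As a rough illustration of the log-compaction role mentioned above, the sketch below (plain Python, not a Striim or Kafka API) shows how compaction retains only the most recent value for each key:

```python
def compact(log):
    """Simulate Kafka log compaction: keep only the most recent
    value for each key, in order of first appearance of the key."""
    latest = {}
    for key, value in log:
        latest[key] = value  # a later record for the same key overwrites the earlier one
    return list(latest.items())

log = [("user1", "v1"), ("user2", "v1"), ("user1", "v2")]
print(compact(log))  # -> [('user1', 'v2'), ('user2', 'v1')]
```

Records without a key cannot participate in this per-key retention, which is why the choice of message key matters when a topic's cleanup.policy is compact.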
Kafka Writer provides three configurations that define how the Kafka message key is constructed. These configurations apply uniformly across messages written to different topic-partitions.
Message Key options
None
Messages will be published without a Kafka message key.
Default setting for Kafka Writer.
Events Supported:
Applies to JSONNodeEvent, AvroEvent, WAEvent and TypedEvent.
If partitioning based on the message key is selected, all data is written to partition "0" of the mapped topic.
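A minimal sketch of this routing behavior (plain Python; the hash function is an assumption for illustration, real Kafka producers use murmur2):

```python
import zlib

def assign_partition(key, num_partitions):
    """Route a message to a partition based on its key. With no key,
    everything lands in partition 0, matching the behavior when the
    None option is combined with key-based partitioning."""
    if key is None:
        return 0
    # Deterministic hash of the key bytes (illustrative stand-in for murmur2)
    return zlib.crc32(key.encode("utf-8")) % num_partitions

print(assign_partition(None, 6))        # -> 0
print(assign_partition("order-42", 6))  # same key always maps to the same partition
```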
Custom
Users can define the message key to have dynamic values for each Kafka Message by referencing fields from payload, metadata, or user data.
Alternatively, a static message key can be added to each Kafka message.
When the Custom option is selected, the field to be used as the Kafka message key must be specified through the "Custom Message Key" property.
If multiple message key fields are required, add the values through the respective UI widget or provide them as a semicolon-separated list in TQL.
Events Supported:
Applies to JSONNodeEvent, AvroEvent, WAEvent and TypedEvent. For CDC sources, the message key can be automatically constructed using primary key column values from the source table.
Examples

Static value:
MessageKey: Custom
CustomMessageKey: CName="Striim"
(A static value has to be enclosed in double quotes.)

Dynamic value, for the WAEvent of OLTP sources:
MessageKey: Custom
CustomMessageKey: Table=@metadata(TableName)

Dynamic value, for a TypedEvent of the SalesTransaction type:
MessageKey: Custom
CustomMessageKey: TransactionID=transactionId

Mix of static and dynamic values, for the WAEvent of file-based sources:
MessageKey: Custom
CustomMessageKey: CName="Striim";Table=@metadata(TableName);Operation=@userdata(OperationName)
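The semicolon-separated CustomMessageKey can be thought of as building one composite key string per event. The sketch below is a hypothetical illustration in plain Python; the event dictionaries stand in for @metadata and @userdata and are not Striim APIs:

```python
def build_custom_key(parts):
    """Join name=value pairs into a single semicolon-separated
    message key, mirroring the CustomMessageKey examples above."""
    return ";".join(f"{name}={value}" for name, value in parts)

event_metadata = {"TableName": "SALES.ORDERS"}  # stand-in for @metadata (assumed values)
event_userdata = {"OperationName": "INSERT"}    # stand-in for @userdata (assumed values)

key = build_custom_key([
    ("CName", "Striim"),                            # static value
    ("Table", event_metadata["TableName"]),         # dynamic, from metadata
    ("Operation", event_userdata["OperationName"]), # dynamic, from user data
])
print(key)  # -> CName=Striim;Table=SALES.ORDERS;Operation=INSERT
```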
Using Primary Key value of the source OLTP table
The message key is automatically constructed using primary key column values from the source table. This is supported only if the event source is an OLTP database or a data warehouse.
If no primary key columns are defined in the source, all columns will be added to the message key.
Source columns may carry a NOT NULL constraint yet still contain repeating values, or may allow NULLs (meaning the source does not enforce a unique value per row); such non-unique keys can affect which messages are retained in Kafka when cleanup.policy is set to compact.
Changes to primary key definitions due to DDL changes are reflected in the message keys too if Schema Evolution is set to Auto. If PersistSchema is ON, the Kafka message containing the DDL event will have the names of the primary keys with a NULL value.
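The fallback described above (primary-key columns when defined, otherwise every column) can be sketched as follows; the row and column representation is an assumption for illustration, not Striim's internal format:

```python
def build_message_key(row, pk_columns):
    """Build a message key from primary-key column values, falling
    back to all columns when the source defines no primary key."""
    columns = pk_columns if pk_columns else list(row.keys())
    return ";".join(f"{c}={row[c]}" for c in columns)

row = {"ID": 7, "NAME": "widget", "QTY": 3}
print(build_message_key(row, ["ID"]))  # -> ID=7
print(build_message_key(row, []))      # -> ID=7;NAME=widget;QTY=3  (no PK: all columns)
```

Using all columns as the key, as the second call shows, produces a much larger key and makes every distinct row its own compaction key, which is the scenario the note below warns against.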
Events Supported:
WAEvent - supported only with OLTP sources.
JSONNodeEvent, AvroEvent, TypedEvent:
File-based sources and any sources without a structured schema do not support primary key definitions. Selecting the Primary Key option for these sources will cause the application to HALT.
Note: Selecting Primary Key as the message key when the source table has no primary keys defined increases the size of the message key and may affect which messages are retained during log compaction; it is therefore not recommended.
The structure of the message key varies depending on the formatter used.