Serializer use cases and examples
Kafka Writer can work with the DSV, JSON, XML, and Avro formatters. The type and format of the Kafka message key depend on the selected formatter, as summarized in the table below and illustrated by the consumer sketch that follows it.
| Formatter | Kafka Message Key | Notes |
|---|---|---|
| DSV | `1001` | Message keys are the values of the fields configured in Message Key, of type String and separated by commas. |
| JSON | `{"EMP_ID":"1001"}` | Message keys are JSON key-value pairs: the JSON keys are the field names specified in the Message Key configuration and the JSON values are the corresponding field values, of type String. |
| XML | `<?xml version="1.0" encoding="UTF-8"?> <messageKey> <EMP_ID>1001</EMP_ID> </messageKey>` | Message keys are the configured Message Key fields wrapped in a `messageKey` element. |
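To see how these keys look to a downstream application, the sketch below reads records with a plain Java Kafka consumer and a `StringDeserializer`, which is sufficient for DSV, JSON, and XML keys since they arrive as strings. The topic name `employee_topic` and the field `EMP_ID` are illustrative assumptions, not part of the product configuration.

```java
// Minimal sketch of a consumer that inspects DSV- or JSON-formatted message keys.
// Topic name "employee_topic" and field EMP_ID are illustrative assumptions.
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class MessageKeyReader {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "key-inspector");
        // DSV, JSON, and XML keys are delivered as plain strings.
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("employee_topic"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                String key = record.key();
                // DSV formatter: the key is a comma-separated list of the configured field values.
                String[] dsvValues = key.split(",");
                // JSON formatter: the key is a JSON object such as {"EMP_ID":"1001"};
                // parse it with a JSON library instead of splitting on commas.
                System.out.printf("key=%s, first DSV value=%s%n", key, dsvValues[0]);
            }
        }
    }
}
```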
Message key serialization with Avro Formatter
Message keys will be Avro records whose fields are key-value pairs: the keys are the field names specified in the Message Key configuration and the values are the corresponding field values. For OLTP sources, each field's type matches the type of the corresponding field in the incoming event. For non-OLTP sources, all fields are of type String.
For OLTP sources, if the source schema was migrated using a wizard, a Database Reader application, or a schema conversion tool, the case of the field names and the types of the fields will closely match the source table definition.
A separate schema will be registered under the subject "<Subject Naming Strategy>-key".

If Message Key is set to the "primary key" option, Schema Evolution is Auto, and Persist Schema is ON, then when a DDL event is received a new schema will be registered under the subject "<Subject Naming Strategy>-DDLKey".
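To verify what was registered, you can query the schema registry for the latest version of the key subject. This is a minimal sketch using only the JDK's built-in HTTP client and the standard `/subjects/<subject>/versions/latest` REST endpoint of a Confluent-compatible registry; the registry URL and the subject name `employee_topic-key` (a topic-name subject strategy) are illustrative assumptions.

```java
// Hedged sketch: fetch the latest registered key schema from a Confluent-compatible
// schema registry. The URL and subject name are illustrative assumptions.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class KeySchemaLookup {
    public static void main(String[] args) throws Exception {
        // With a topic-name subject strategy, the key schema is registered under "<topic>-key";
        // DDL-driven key schemas would appear under "<subject>-DDLKey".
        String subject = "employee_topic-key";
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8081/subjects/" + subject + "/versions/latest"))
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // The response body is JSON containing the subject, version, schema id, and schema text.
        System.out.println(response.body());
    }
}
```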
With the Avro formatter, serialization defaults to the Striim wire format, but you can change this setting.
| Kafka Message Key - Avro Schema | Kafka Message Key - Avro Record |
|---|---|
| `{ "type": "record", "name": "EMP", "namespace": "Sch", "fields": [ { "name": "EMP_ID", "type": "bytes", "logicalType": "", "precision": 38, "scale": 0 } ] }` | |
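For reference, a downstream consumer could decode such a key with the Apache Avro library and rebuild the numeric value from the bytes-typed field using the declared scale. This is a minimal sketch that assumes the key payload is plain Avro binary (any wire-format header, such as a schema-ID prefix, has already been stripped); the class and method names are illustrative.

```java
// Minimal sketch: decode an Avro-encoded message key with the schema above and
// convert the bytes-typed EMP_ID field (precision 38, scale 0) to a BigDecimal.
// Assumes the payload is plain Avro binary with any wire-format header already removed.
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DecoderFactory;

import java.math.BigDecimal;
import java.math.BigInteger;
import java.nio.ByteBuffer;

public class AvroKeyDecoder {
    // Key schema as in the table above; precision and scale are carried as extra
    // field attributes and the decimal interpretation is applied manually below.
    private static final String KEY_SCHEMA_JSON =
            "{\"type\":\"record\",\"name\":\"EMP\",\"namespace\":\"Sch\",\"fields\":["
            + "{\"name\":\"EMP_ID\",\"type\":\"bytes\",\"precision\":38,\"scale\":0}]}";

    // Decodes the raw Avro key bytes and returns EMP_ID as a BigDecimal with scale 0.
    public static BigDecimal decodeEmpId(byte[] avroKeyBytes) throws Exception {
        Schema schema = new Schema.Parser().parse(KEY_SCHEMA_JSON);
        GenericDatumReader<GenericRecord> reader = new GenericDatumReader<>(schema);
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(avroKeyBytes, null);
        GenericRecord key = reader.read(null, decoder);

        // The bytes hold a two's-complement unscaled integer, the usual encoding for
        // decimal values with a declared precision and scale.
        ByteBuffer buf = (ByteBuffer) key.get("EMP_ID");
        byte[] unscaled = new byte[buf.remaining()];
        buf.get(unscaled);
        return new BigDecimal(new BigInteger(unscaled), 0); // scale 0 per the schema
    }
}
```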