I would like to be able to include fields from the Kafka record header into a table.
Looking at the code, it seems the headers would need to be processed fairly deep in the KafkaStreamPublisher. Could it make sense to create a fall-back solution that simply consumed the raw Kafka record objects into a single-column (blink) table? Then, as a user, I could always do whatever deserialization/processing I wanted as part of query logic, for any case where that logic was incompatible with the Deephaven Kafka-to-Table wrapping.
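To illustrate the user-side half of that fall-back idea, here is a minimal plain-Python sketch of what the per-record query logic might look like. The `RawRecord` class and `extract_header_fields` helper are hypothetical stand-ins, not Deephaven or Kafka-client APIs; the headers are modeled as `(key, bytes)` pairs, which is the shape common Kafka clients expose.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class RawRecord:
    """Hypothetical stand-in for a raw Kafka record landed in a
    single-column blink table."""
    key: bytes
    value: bytes
    headers: List[Tuple[str, bytes]]  # (header key, header value) pairs


def extract_header_fields(record: RawRecord, wanted: List[str]) -> Dict[str, str]:
    """User-side query logic: pull selected header fields out of a raw
    record and decode them into plain string columns.

    Missing headers become empty strings; if a header key repeats,
    the last value wins (because dict() keeps the last duplicate).
    """
    hdrs = dict(record.headers)
    return {k: hdrs.get(k, b"").decode("utf-8") for k in wanted}
```

In a Deephaven query this kind of function could be applied to the raw-record column (e.g. via an `update` over the blink table) to project header fields into ordinary columns, without the Kafka ingestion layer needing to know about them.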