
Pass a large number of messages via a file in a docker volume #21

Description

@stevehu

When the backend API produces or consumes messages, a large number of messages are passed between the sidecar and the backend API in HTTP requests. HTTP is not designed to carry a large body of data efficiently, and a large body takes a long time for the JSON parser to convert the string into a JSON array. Depending on the business logic in the backend API, the communication between the sidecar and the backend can become the bottleneck. To increase throughput, we can put multiple MBs of data into a text file and pass only the file's location to the other party for processing or consumption.

Because the messages are loaded line by line from the file, this is faster than parsing the entire file into JSON first. We need to add this feature to address the initial-load performance issue for one of our customers, who is trying to load several decades of data from a mainframe into Kafka.
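A minimal sketch of the handoff described above, assuming the two sides share a mounted docker volume and exchange messages as newline-delimited JSON (the `FileHandoff` class name, the temp-directory stand-in for the volume mount, and the `.ndjson` layout are illustrative assumptions, not part of the issue):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class FileHandoff {

    // Producer side: write each message on its own line (newline-delimited
    // JSON) instead of building one large JSON array in memory.
    static Path writeMessages(Path volumeDir, List<String> messages) throws IOException {
        Path file = Files.createTempFile(volumeDir, "batch-", ".ndjson");
        Files.write(file, messages, StandardCharsets.UTF_8);
        return file; // only this path travels over HTTP
    }

    // Consumer side: stream the file line by line, so each message can be
    // parsed individually and no whole-array JSON parse is ever needed.
    static long consumeMessages(Path file) throws IOException {
        try (var lines = Files.lines(file, StandardCharsets.UTF_8)) {
            return lines.filter(line -> !line.isBlank()).count();
        }
    }

    public static void main(String[] args) throws IOException {
        // A temp directory stands in for the shared docker volume here.
        Path volumeDir = Files.createTempDirectory("shared-volume");
        Path file = writeMessages(volumeDir,
                List.of("{\"id\":1}", "{\"id\":2}", "{\"id\":3}"));
        System.out.println(consumeMessages(file)); // prints 3
    }
}
```

In a real deployment both containers would mount the same volume, and the consumer would delete or archive the file once it has been drained, so the volume does not grow without bound.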
