### Describe the feature

Data Firehose supports the following convenient conversions as processors: decompression of incoming records, CloudWatch Logs message extraction, Lambda-based record transformation, and appending a delimiter to each record.
### Use Case

- To use Firehose data format conversion (to Parquet or ORC) or dynamic partitioning, decompression must be enabled first.
- These features let us avoid explicitly decompressing and extracting log entries ourselves when analyzing collected logs.
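To make the use case concrete, this is the kind of boilerplate consumers currently have to write themselves before analyzing CloudWatch Logs records delivered through Firehose: gunzip each record and pull out the `message` field of every log event. A minimal sketch using Node's built-in `zlib` (the payload shape follows the CloudWatch Logs subscription format; `extractMessages` is an illustrative helper, not a real API):

```typescript
import * as zlib from 'zlib';

// CloudWatch Logs delivers gzip-compressed JSON envelopes; the useful
// data is the `message` field of each log event.
function extractMessages(gzipped: Buffer): string[] {
  const envelope = JSON.parse(zlib.gunzipSync(gzipped).toString('utf8'));
  return envelope.logEvents.map((e: { message: string }) => e.message);
}

// A fake CloudWatch Logs payload, for demonstration only.
const payload = zlib.gzipSync(JSON.stringify({
  messageType: 'DATA_MESSAGE',
  logEvents: [{ id: '1', timestamp: 0, message: 'hello' }],
}));
console.log(extractMessages(payload)); // [ 'hello' ]
```

With the Decompression and CloudWatchLogProcessing processors enabled on the delivery stream, Firehose performs both steps before the data lands in S3.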
### Proposed Solution

These conversions are configured as Firehose processors, so each one is exposed as a processor class.
PoC code:

```ts
import * as s3 from 'aws-cdk-lib/aws-s3';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import * as firehose from 'aws-cdk-lib/aws-kinesisfirehose';

declare const bucket: s3.Bucket;
declare const func: lambda.Function;

new firehose.DeliveryStream(this, 'DeliveryStream', {
  destination: new firehose.S3Bucket(bucket, {
    // DecompressionProcessor, CloudWatchProcessingProcessor, and
    // AppendDelimiterToRecordProcessor are the proposed new classes.
    processors: [
      new firehose.DecompressionProcessor(),
      new firehose.CloudWatchProcessingProcessor(),
      new firehose.LambdaFunctionProcessor(func),
      new firehose.AppendDelimiterToRecordProcessor(),
    ],
  }),
});
```
will generate a CloudFormation template like:
```json
{
  "ExtendedS3DestinationConfiguration": {
    // other configurations
    "ProcessingConfiguration": {
      "Enabled": true,
      "Processors": [
        {
          "Type": "Decompression",
          "Parameters": [
            { "ParameterName": "CompressionFormat", "ParameterValue": "GZIP" }
          ]
        },
        {
          "Type": "CloudWatchLogProcessing",
          "Parameters": [
            { "ParameterName": "DataMessageExtraction", "ParameterValue": "true" }
          ]
        },
        {
          "Type": "Lambda",
          "Parameters": [
            { "ParameterName": "LambdaArn", "ParameterValue": "arn:aws:lambda:..." },
            { "ParameterName": "RoleArn", "ParameterValue": "arn:aws:iam:..." }
          ]
        },
        {
          "Type": "AppendDelimiterToRecord",
          "Parameters": []
        }
      ]
    }
  }
}
```
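As a sketch of how one of the proposed classes could map to the template above, here is one possible shape for `DecompressionProcessor` (the names `DataProcessorConfig` and `renderProcessor` are illustrative assumptions, not the actual aws-cdk-lib API):

```typescript
// Hypothetical sketch only: one way the proposed DecompressionProcessor
// could render the CloudFormation processor entry shown above.
interface DataProcessorConfig {
  type: string;
  parameters: { parameterName: string; parameterValue: string }[];
}

class DecompressionProcessor {
  // 'GZIP' matches the template above; other formats are not assumed here.
  constructor(private readonly compressionFormat: string = 'GZIP') {}

  // Called by the destination when it builds ProcessingConfiguration.
  public renderProcessor(): DataProcessorConfig {
    return {
      type: 'Decompression',
      parameters: [
        { parameterName: 'CompressionFormat', parameterValue: this.compressionFormat },
      ],
    };
  }
}

const p = new DecompressionProcessor().renderProcessor();
console.log(JSON.stringify(p));
```

Each of the other proposed classes would follow the same pattern, rendering its own `Type` and `Parameters` entries.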
### Other Information

Related issues and PRs:

- `FirehoseDestination` or `DataFirehoseDestination` #32038

### Acknowledgements

### CDK version used

2.181.0

### Environment details (OS name and version, etc.)

Linux