
Commit 9af3061

Author: GitHub Actions (committed)
Configuring log aggregation and observability for sonataflow
1 parent 319b85f commit 9af3061

File tree

3 files changed (+13, -16 lines)


assemblies/extend_orchestrator-in-rhdh/assembly-configure-log-aggregation-and-observability-for-sonataflow.adoc

Lines changed: 1 addition & 1 deletion
@@ -8,7 +8,7 @@ ifdef::context[:parent-context: {context}]
 :context: configure-log-aggregation-and-observability-for-sonataflow
 
 [role="_abstract"]
-You must implement a robust observability strategy to ensure your serverless workflows are production-ready. By configuring structured JSON logging and integrating OpenTelemetry, you enable automated log aggregation, process-instance correlation, and distributed tracing.
+You must implement an observability strategy to make sure your serverless workflows are production-ready. By configuring structured JSON logging and integrating OpenTelemetry, you enable automated log aggregation, process-instance correlation, and distributed tracing.
 
 // Enable structured JSON logging for SonataFlow workflows
 include::../modules/extend_orchestrator-in-rhdh/proc-enable-structured-json-logging-for-sonataflow-workflows.adoc[leveloffset=+1]

modules/extend_orchestrator-in-rhdh/proc-configure-telemetry-exporters.adoc

Lines changed: 2 additions & 2 deletions
@@ -19,7 +19,7 @@ Choose an export strategy that matches your observability platform requirements:
 
 ** OTLP exporter with batch processing (Recommended)
 +
-For production environments, use an OTLP exporter with batch processing to reduce network overhead and improve performance.
+For production environments, use an OTLP exporter with batch processing to reduce network overhead and improve performance:
 +
 [source,bash]
 ----
@@ -37,7 +37,7 @@ quarkus.otel.bsp.max.queue.size=2048
 
 ** Direct export to an external platform
 +
-For development or simple integrations, use a direct export configuration.
+For development or simple integrations, use a direct export configuration:
 +
 [source,bash]
 ----
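The diff view truncates the body of the batch-processing configuration; only the context line `quarkus.otel.bsp.max.queue.size=2048` is visible. A sketch of what such a configuration typically looks like with the Quarkus OpenTelemetry extension follows. The endpoint URL and the exact set of `quarkus.otel.bsp.*` values are illustrative assumptions, not taken from the commit:

```properties
# Hypothetical reconstruction -- only max.queue.size appears in the commit's context line.
# OTLP endpoint of your collector (illustrative value):
quarkus.otel.exporter.otlp.endpoint=http://otel-collector:4317
# Batch span processor (bsp) tuning -- batches spans before export:
quarkus.otel.bsp.schedule.delay=5s
quarkus.otel.bsp.max.queue.size=2048
quarkus.otel.bsp.max.export.batch.size=512
```

Larger queue and batch sizes trade memory for fewer network round trips, which is why the batch exporter is recommended for production over direct per-span export.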

modules/extend_orchestrator-in-rhdh/proc-enable-structured-json-logging-for-sonataflow-workflows.adoc

Lines changed: 10 additions & 13 deletions
@@ -4,7 +4,7 @@
 = Enable structured JSON logging for SonataFlow workflows
 
 [role="_abstract"]
-Configure your SonataFlow workflows to emit logs in structured JSON format. This enables machine processing and allows you to correlate log entries with specific process instances across your log aggregation stack.
+Configure your SonataFlow workflows to emit logs in structured JSON format. Structured logging enables machine processing and correlates log entries with specific process instances across your log aggregation stack.
 
 SonataFlow workflows support structured JSON logging with automatic process instance correlation through:
 
@@ -16,15 +16,13 @@ SonataFlow workflows support structured JSON logging with automatic process inst
 
 * You have deployed SonataFlow workflow using the SonataFlow Operator on {platform-generic} or Kubernetes.
 
-* The `io.quarkus:quarkus-logging-json` extension is included in your workflow `QUARKUS_EXTENSIONS` environment variable during the image build.
+* You have included the `io.quarkus:quarkus-logging-json` extension in your workflow `QUARKUS_EXTENSIONS` environment variable.
 
-* You have cluster admin permissions for deploying log aggregation stack.
-
-* You have knowledge of JSON logging and log aggregation tools.
+* You have `cluster-admin` permissions for deploying the log aggregation stack.
 
 .Procedure
 
-. Update your workflow build configuration to include the required JSON logging extension:
+. Update your workflow build configuration to include the JSON logging extension:
 +
 [source,bash,subs=+attributes]
 ----
@@ -56,7 +54,7 @@ quarkus.log.category."org.kie.kogito.services.context".level=DEBUG
 
 . Save the ConfigMap and restart the workflow pod.
 +
-The following is an example of a workflow ConfigMap with JSON logging enabled:
+The following example shows a workflow ConfigMap with JSON logging enabled:
 +
 [source,yaml]
 ----
@@ -79,14 +77,14 @@ data:
 
 .Verification
 
-* Check the pod logs to ensure they are in JSON format and contain the `processInstanceId`:
+* Check the pod logs to verify the JSON format and the presence of the `processInstanceId`:
 +
 [source,bash,subs="+attributes,+quotes"]
 ----
 oc logs <workflow_pod_name> | grep processInstanceId
 ----
 +
-Example of expected output:
+.Example of expected output
 +
 [source,bash]
 ----
@@ -95,9 +93,8 @@ Example of expected output:
 
 [NOTE]
 ====
-If the MDC fields are empty or missing, ensure that the JSON logging is active and verify the process context. Missing fields might indicate one of the following conditions:
-
-* The workflow has not processed any instances.
+If the Mapped Diagnostic Context (MDC) fields are empty, verify the following:
 
-* The SonataFlow version requires custom MDC configuration.
+* The workflow has processed at least one instance.
+* The SonataFlow version matches the required configuration for MDC propagation.
 ====